First Impression of Usabilla
We have Usabilla listed in our ‘Conduct a User Test’ category along with Loop11 and Usertesting.com. In earlier posts, Ben and I have reviewed Usabilla and concluded it would be a valuable resource for simple usability tests for websites. Basically, if you are a business owner with a website, then Usabilla is a good place to start with user testing. It’s simple. It has a usable free service plan and support resources to help you get started today.
Website testing, in its simplest form, is observing people using your site with the intent of making improvements. Usabilla offers a way to watch users complete tasks on the website. You give a task. The user completes the task with a click and/or a note. You can see where they clicked, what they wrote, and how long it took.
What is the experience of using Usabilla?
Before I started with my own test, I studied the Usabilla features. I first heard of Usabilla on a review-style blog post where the founder and CEO of the company, Paul Veugen, commented and addressed issues uncovered in the review. That shows engagement and interest. This isn’t a dead tool or a side project for someone. It makes me trust these folks. Points for Usabilla, right off the bat.
My trust in the company is further supported by their blog. The posts are relevant to the small business owner interested in user testing a website. ‘Five Things You Can Test Under Five Minutes’ and ‘Guerilla Usability Testing Tools Improve Conversion Rate, Satisfaction’ are two insightful and practical posts I read.
Additionally, I liked that I could test live sites [using the screen-capture tool] and/or upload images for my tests. This allows me to follow the advice of Steve Krug and test the napkin sketch of my designs [Maxim: Start earlier than you think makes sense]. Collecting users for the test was made easier by a JavaScript widget you embed on the site to get actual site visitors to perform the test. And, lastly, I want to give a shout-out to Usabilla Support. I had a question about the widget placement on the page [it was being covered by the content]. I followed the ‘Chat Now’ link on the dashboard to quickly get in touch with Paul v. A., who looked at my site and instructed me how to fix my problem. Try to find that kind of support in another free tool!
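For context, embed widgets of this kind generally boil down to a short loader script you paste into your page. The sketch below shows that general pattern only; it is not Usabilla’s actual embed code, and the function name and URL are placeholders I made up. The real snippet [and its script URL] comes from the Usabilla dashboard.

```javascript
// Hypothetical sketch of a widget-loader pattern -- NOT Usabilla's real snippet.
// The actual embed code comes from your Usabilla dashboard.
function loadWidget(doc, url) {
  var s = doc.createElement('script'); // create a <script> element
  s.async = true;                      // load without blocking page render
  s.src = url;                         // the widget script's URL
  doc.body.appendChild(s);             // the widget renders itself once loaded
  return s;
}

// In a real page you would call something like:
// loadWidget(document, '<script URL from your dashboard>');
```

The async flag matters here: a blocking script tag would slow down the very pages you are trying to test.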
The Process of setting up the test
- Sign up
- Create a test
  - give it a name
  - add a custom logo [not sure where this was used]
- Add a page [the screenshot generated from a URL didn’t work on one page – I had to use my own screenshot]
- Choose a task, or create a new one
- Preview the test
- Activate the test
- Invite participants [via URL or widget]
- Pause the test and edit
- Analyze the results
The experience of using the tools was smooth and went as expected. No frustration or undue cognitive load. The real hard part is deciding what to test and how to test it. Thankfully the blog was very helpful, even if the standard questions were not. Meaning, I think those standard questions won’t reveal actionable insights. More on this later.
How to get the most value from remote user testing with Usabilla
Just because you have access to a powerful tool doesn’t mean you can produce powerful results. The tool is only as good as its user. If this discourages you, don’t let it. Everyone has to begin somewhere [and we've chosen Usabilla as that tool]. Here are some of my thoughts based on my first experiences with remote user testing like this.
First and foremost, what do you test? More specifically, what questions do you ask the user about your design in order to get actionable / profitable results? I’ve talked about this before in my post, “It’s the Questions, Stoopid”. Basically, everyone has impressions, and they are generally unique and varied. That makes them both hard to test and [mostly] un-actionable.
The tool, powerful as it is, allows you to test whatever you like. You can use the standard questions; IMHO, this is likely to return superficial, un-actionable results. Or you can dig a bit deeper into your design decisions and ask targeted questions that will lead to concrete site changes, more conversions, and better [smoother? more elegant? less confusing?] critical paths for your site visitors.
Developing this talent for question creation is the real challenge for UX experts. It requires a scientific eye for separating data from noise and the creativity to bend the tools into revealing actionable insights. It’s hard – try it yourself and you’ll see. Of course, you could be the ‘chosen one’. For the rest of us, I suggest these test questions as food for thought and a good starting point (borrowed from the Usabilla Blog post mentioned earlier).
“Where would you click to start to use this product?”
A simple test of whether [and how quickly] the user notices the call to action and follows the critical path of the user experience. If they get it ‘wrong’ [or, I should say, click where you don't expect], or it takes a long time for them to click [a nice feature of Usabilla measures this time], then you can review and revise your design and retest. Usabilla makes it easy to do this.
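To make the time-to-click idea concrete, here is a tiny sketch of the kind of data a click test records: how long the participant hesitated, and where they clicked. The function name and shape are hypothetical illustrations of the concept, not Usabilla’s implementation.

```javascript
// Hypothetical sketch of a click-test data point -- not Usabilla's code.
// Records how long the participant took to click, and where.
function recordFirstClick(taskShownAt, clickedAt, x, y) {
  return {
    elapsedMs: clickedAt - taskShownAt, // hesitation time in milliseconds
    x: x,                               // click position on the screenshot
    y: y
  };
}

// A participant who clicks 2.5 seconds after the task appears:
var point = recordFirstClick(1000, 3500, 320, 88); // point.elapsedMs === 2500
```

A long elapsed time is itself a finding: even a ‘correct’ click that took ten seconds suggests the call to action isn’t obvious.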
“Which elements make you trust this website, and why?”
Trust is paramount on the internet. If I don’t trust a website, then I’m not likely to convert – buy stuff, give my info, sign up for anything. Could be just me… I believe good designers build trust-building elements into sites: certification logos, personal [not stock] pictures, quality content, well-known brands or imagery, etc. The question will reveal: Are users picking this up? Do they click on the elements you expect? And, thanks to a nice contextual note tool, the user can leave a very specific note about the design. Yep, that “…and, why?” is powerful.
A few final thoughts:
Where you get / recruit your test participants matters, I feel. If you are simply hoping to increase the efficiency of the ‘around the watercooler’ test, sure, go ahead and use Facebook and social media to get participants. They are returning users. And, I guess, if your site is built around returning users, then this would be fine and good. But if you need first-time users, then use the widget, or recruit via some means other than existing contact lists.
I saw a banner ad for Usabilla with the tagline ‘We give designers quantitative ammunition to go with their qualitative insight’. I thought this was perfectly phrased. We test our insights. We test the assumptions that lead to our design decisions. That’s why we test. We ask ourselves: Will making this button red attract more attention? Will adding a recognizable logo increase users’ sense of trust? Until ‘the scientists’ develop a brain-scanning and opinion-deciphering device, we’ll have to be creative with the user-testing tools available to uncover the effect of our designs on the user.
I’m glad to add Usabilla to our tools and recommend it to anyone looking to quickly and easily start testing today.