The Nuts and Bolts of How to Design a User Survey


First Things First

Today’s post is all about creating a survey. If you have a few spare minutes, would you mind taking the survey before continuing to read? You’d be doing us a big solid, and you’ll have a leg up on understanding this article over all of the bourgeois skimmers. Thanks in advance. 🙂

Why Should You Use a Survey?

I could write my own section here but this article from Elizabeth Ferrall-Nunge at Design Staff does a great job summing things up.

When to use a survey

Before designing your survey, think about the questions you want answered, and decide if surveys are the right tool for the job.

Surveys are great when you want to…

  • Track changes over time — See what changes before and after a feature launch.
  • Quantify issues seen in user studies — We know [x] is a problem for some users, but how many?
  • Measure attitudes, intents, or task success.

But surveys are not very good at…

  • Discovering underlying user motivations and needs — Interviewing users is better suited for this.
  • Understanding whether people can successfully use your product — Again, user studies are better for this.
  • Uncovering actual user behavior and habits — Since people are bad at self-reporting, logs analysis is better suited for this.

The Backstory

Long time readers will recognize the name LessAccounting. They’re an online bookkeeping app and I’ve been writing about them for the past few months as we’ve gone through a redesign of the brochure part of their website.

Where it stands now is that we have the current site and a new version of the site set up in InVision as a high-fidelity wireframe. This lets us click around the design as if it were a functioning site without having to do any actual coding.

Before we move past the design phase of the redesign, it’s time to get feedback on both designs.

Now, arguably, I should have done this initially, before the redesign even happened. But two things contributed to holding off to this point:

1. After conducting a professional review of the website, which included my own analysis and a look at the web analytics, I was convinced that the problems were large enough that testing would only confirm my suspicions and, as such, wasn’t crucial.

2. Allan (head honcho at LessAccounting) didn’t think it was really necessary. He was happy to jump head first into a new design without testing.

In retrospect, that’s not the smart way to do it. It’s entirely possible that these evaluations will turn up data that goes against our assumptions, forcing us to change some things in the new design – essentially making us do the same work twice.

So for all you new UXers out there, here’s an object lesson:

1. Your client is, most likely, going to play down the need for testing, even if they hired you for it specifically. It’s a weird thing, but in my experience, also true.

2. UX professionals develop a good intuition about what works in a website. That doesn’t mean that intuition should be a substitute for proper testing. In this case, better late than never, but I should have done this at the beginning of the process.

Now that we’re through with that little confessional, let’s get down to brass tacks.

How to develop your questions

Originally this post was going to be about my experience with Loop11 in creating a survey. But after digging in and creating my own survey, it makes more sense to talk about this process first and then how to integrate it into a survey tool: Usabilla, UserZoom, Loop11, etc.

There’s a weird paralysis that sets in almost immediately when setting out to create a survey. One question reigns supreme: what should I ask these people?

That’s a really good question to ask. And the answer is a couple more questions: What do you hope to get out of it? What do you expect to do with the results?

For me, I want them to tell me who is taking my test, how easy the website is to use, how easy it is to understand, and what they think about their interactions with it. The first tells me how much weight I should place on their opinions, and the rest will help me figure out how to design a better website.

Speaking of web design, all designs are born out of many design decisions. And those design decisions are based on a group of assumptions. These assumptions went into creating the current version of the website and more assumptions were made when going through the redesign.

It seems important to get to the bottom of all of that.

1. You need to test your assumptions

The one thing you know for sure is that you have an opinion about how things are now. That’s a good place to start. Develop some baseline assumptions. For LessAccounting I asked myself to complete a series of statements:

1. People use online bookkeeping software because….
2. As a business tool, business owners want…
3. Things people must feel about this website (generically)…
4. Things the app must do…

For me, I completed those statements thusly:

1. People use online bookkeeping software because….
a. it’s convenient
b. it’s easy
c. it’s smart (reminders, etc.)
d. it fits into my workflow
e. it helps with taxes

2. As a business tool, business owners want…
a. easy
b. fits into their workflow
c. inexpensive

3. Things people must feel about this website (generically)…
a. secure
b. trust
c. credibility/authority
d. support

4. Things the app must do…
a. save you time
b. make taxes easy
c. integrate into the user’s workflow
d. help customers make better decisions*

* This is a bit of a stretch but it’s meant to address the Reports part of the app. The better the reports, the better the decisions are that are based on those reports.

5. In addition to the questions above, Allan and I have had conversations about the current site and the redesign that we’ve already completed and we also have some anecdotal information from UserZoom and ThreeQuestionTest that suggests that one design has a better look-and-feel while the other is easier to understand and navigate. I also want to test these assumptions.

6. A month ago I wrote an article about the 6 things your home page must have to keep it from sucking. I want to test the designs against those attributes as well.

2. Know your user

The main customers for LessAccounting are small businesses with less than $10 million in annual sales and 20 or fewer employees.

It’s important that we vet the survey takers and give the target market a higher weight when assessing the results. It’s good to know how people in general react to the website, but at the same time, having a lot of non-customers giving you feedback isn’t helpful when trying to understand the motivations of your users.

It’s important to construct appropriate demographic questions to be able to filter the data.

The Test Questions

When I was developing my survey, I started with the demographic questions. They were the easy ones to write, and were a good way to work into the other test questions. Here’s what I came up with:

1. Are you male or female?
2. What is your age?
3. Do you own your own business?
4. Are you an accountant or are you responsible for keeping the books for a business?
5. How many people work at your place of employment?
6. Have you ever used or do you currently use accounting software? If so, what?
7. Do you bank online?

These seven questions tell us how closely a respondent matches our target user. I added the last question to get a baseline for their comfort with handling their finances online.
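To make the vetting and weighting concrete, here’s a minimal Python sketch of how the demographic answers could drive the analysis. The field names and the weight values are my own invention for illustration; your survey tool’s export will look different:

```python
def is_target_user(resp):
    """True if a respondent matches the target market: someone who owns a
    business or keeps its books, at a company with 20 or fewer employees."""
    runs_books = resp["owns_business"] or resp["keeps_books"]
    return runs_books and resp["employee_count"] <= 20

def weight(resp):
    # Full weight for in-target respondents; down-weight everyone else.
    return 1.0 if is_target_user(resp) else 0.25

responses = [
    {"owns_business": True,  "keeps_books": False, "employee_count": 8},
    {"owns_business": False, "keeps_books": False, "employee_count": 500},
]
weights = [weight(r) for r in responses]  # → [1.0, 0.25]
```

The exact down-weight (0.25 here) is a judgment call; the point is simply that out-of-target opinions inform but don’t dominate the results.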

Then it was on to testing my assumptions. These questions were more difficult to design, made more challenging by trying to fit them within the limitations of the Loop11 software. I don’t want to bash them because I believe they offer a good product, but after having been exposed to what UserZoom offers, it makes me wonder why some of those ideas (branching questions, predefining users, a wider variety of test question types) aren’t in Loop11. So there were some workarounds.

But in the end, I believe the questions are solid. They are:

8. (From memory) What does this website do?
9. (From memory) Who should use this website?
10. What are the main reasons you would consider using an online bookkeeping app?
11. Please rank the following features of a bookkeeping app in order of their importance to you.

– Attractive “look and feel”
– Proposals/Invoicing
– Connects directly to your bank
– Great customer service
– Security
– Cost
– Time tracking
– Makes taxes easy

12. (From memory) What features does LessAccounting have that would appeal to small business owners?
13. (From memory) What publications have written articles or reviews about LessAccounting?
14. Task: Your task is to navigate to the page where you’d find the name of the person in charge of customer support. Click “Task Complete” once you have found their name.
15. What is the name of the head of customer service?
16. Task: Navigate to the page where you would find information on how long you can use the free trial.
17. How long is the free trial?
18. Task: Please locate the page that tells you how much LessAccounting costs.
19. How much does LessAccounting cost?
20. On a scale of 1-7 (7 is high), how secure do you feel LessAccounting is with your data?
21. On a scale of 1-7 (7 is high), how trustworthy is LessAccounting?
22. On a scale of 1-7 (7 is high), how would you rate the “look and feel” of the website?
23. On a scale of 1-7 (7 is high), overall, how easy was it to find the answers for the previous questions and tasks?
24. Is there anything you would change about the website?
25. Is there anything that you love and absolutely would not change about the website?
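Once the responses are in, the ranking question (#11) needs aggregating. A common, simple approach is average rank, where a lower number means more important overall. A sketch with made-up rankings (the real data would come from the survey tool’s export, and would include all eight features):

```python
from collections import defaultdict

# Each respondent's ranking: first item = most important.
rankings = [
    ["Security", "Cost", "Makes taxes easy", "Connects directly to your bank"],
    ["Cost", "Security", "Connects directly to your bank", "Makes taxes easy"],
]

positions = defaultdict(list)
for ranking in rankings:
    for position, feature in enumerate(ranking, start=1):
        positions[feature].append(position)

# Lower average rank = more important to respondents.
avg_rank = {f: sum(p) / len(p) for f, p in positions.items()}
ordered = sorted(avg_rank, key=avg_rank.get)
```

Average rank hides ties in importance, which is one reason some survey designers prefer rating each feature on a scale instead of forcing a strict ranking.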

And that’s it.

Truthfully, it feels just a bit long. 25 questions, including demographic questions. But it can easily be completed in 10-15 minutes, so I’m not going to worry about it too much just yet.

Any of the questions prefaced with (from memory) are written that way because the user is shown the LessAccounting website and asked to look at one or more pages before clicking “next”. When they proceed to the next page, they are asked several questions about the content of the website. The point here is to get at memorability. My presumption is that if the content doesn’t stick in your head, then you didn’t really understand what you were being shown. That disconnect is likely to lead to a higher bounce rate. Memorability is key.

Several of the questions attempt to understand the mindset of the user. Others ask them to find specific important information. Finally, we want to understand how users feel about the site in general, including what they’d keep and what they’d change.

I also addressed the looks good/works good issue that Allan and I have with the current and new designs.

In short, the questions are all drawn directly from the initial assumptions.

Further Thoughts about Writing Survey Questions

1. In the actual survey (which you can take here, if you didn’t take it at the top of the article), I mix up the demographic questions with the other questions. I’ve never seen this done before and I don’t honestly know if it’s going to impact the test results. I never ask a demographic question that’s leading – for instance asking them if they’re a business owner before asking them who LessAccounting is for. That would be a no-no.

For me, it felt more natural and less stilted to break up the question blocks. It makes the survey easier to digest and, in my opinion, easier to get through. It loses a bit of its clinical feel. But if anybody has data or research on whether this is a good or bad practice, please tell us in the comments.

2. The particular wording of the questions was tricky. For one thing, it had to work within the Loop11 framework. This led to the “look at this and then answer some questions based on what you just saw” method of testing. I’m not sure that’s the best way to approach memorability or discoverability, but it’s the best I could do with the tool. For another thing, I found it necessary to be particular with my wording.

For example, I originally had a question that said “On a scale of 1-7, how confusing is the website?”. But after some consideration I decided that this was a leading question and invited dumping on the website. It didn’t get at any underlying truth. I ended up rephrasing the question as, “On a scale of 1-7, overall, how easy was it to find the answers for the previous questions and tasks?” This, rather than asking a general question about how confusing the website is, gets at the user’s true feelings about their interactions with the website. It might seem like a subtle difference, but I find the second question to be much better.
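Once the responses come in, scale questions like this one reduce to simple summary statistics. A quick sketch with invented scores; the “top-two-box” share (the fraction of 6s and 7s) is a common way to report scale questions:

```python
from statistics import mean, median

# Hypothetical 1-7 responses to one scale question.
scores = [5, 6, 4, 7, 5, 6]

summary = {
    "mean": round(mean(scores), 2),
    "median": median(scores),
    # Share of respondents answering 6 or 7.
    "top_two_box": sum(s >= 6 for s in scores) / len(scores),
}
```

Reporting the median and top-two-box share alongside the mean helps when a few very low or very high scores would otherwise skew the picture.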

3. I wanted to design a survey that could be used for both the current and the new design. This led me to ignore parts of the website that were designed with a specific purpose in mind. For instance, I do not ask any questions in the survey where LessAccounting is compared to a competitor, even though on the current site that is a big deal.

In my new design, I ignored the competitors on the basis that saying their name on your website is free publicity. It’s arguable that I should have challenged this assumption but since there’s no sign of this information in the new design, I opted to leave it out altogether.

The closest I get to dealing with competitors’ products is in the demographic data. When I ask what programs they’ve used or currently use, they can choose from a multiple choice list. This will help us understand what percentage of the target demographic has previous experience with other online software, and can give us a clue as to how important it is to craft a message that juxtaposes LessAccounting against its more well-known competitors.
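As a sketch of how that multiple-choice data could be tabulated, assuming invented answer strings (“None” standing in for no prior experience):

```python
from collections import Counter

# Hypothetical answers to "Have you ever used or do you currently use
# accounting software? If so, what?"
answers = ["QuickBooks", "None", "FreshBooks", "QuickBooks", "None", "Xero"]

counts = Counter(answers)                       # tally per product
experienced = len(answers) - counts["None"]     # respondents with prior experience
share_experienced = experienced / len(answers)  # fraction of the sample
```

From `counts` you can also see which specific competitors your respondents already know, which feeds directly into the messaging question above.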

Final Thoughts

All told, it took me about four hours to craft this survey. I will be sure to share the results in a future post. If you haven’t yet taken it… seriously, help a brother out.

When it comes to working on your own survey, your assumptions will vary depending on the website you want to test. However, knowing your demographic and testing your assumptions is the best way to get actionable results. After all, assumptions that are confirmed should be kept; those that aren’t should be dumped. And that, in a nutshell, is the iteration process. And iteration is what it’s all about.

Comments on “The Nuts and Bolts of How to Design a User Survey”


  2. Thanks for this article, Ben. It’s a useful point you make about testing against your intuition, and it was interesting to read how you actually decided what questions to ask the user.

    One piece of feedback about your question “Please rank the following features of a bookkeeping app in order of their importance to you”: as a user of online accounting software, I would have preferred to rate each one of those on a scale of 1 to 7 rather than put them into priority order, because some of them are of equal importance to me. For example, my online bookkeeping package MUST be affordable, do invoicing, AND be secure. I wouldn’t rank one over the other, so my answer to this question felt wrong.

