Impressions of

Recently, my buddy, whom I profiled in an earlier blog post, sent me a link to a user test. Actually, it was a Facebook update that looked like this:

Facebook post with user test link

Here's the user test. Give it a go.

The link takes you to a start page shown below:

Start page for the test

If I had read this, things would have been different.

Because I’m interested in karma points, I became a member. I clicked on Start Here and saw:


Final instructions to the user before the test.

Ah ha! The test tricked me. I clicked on “Show Image” and was shown a screenshot in my browser window. But the screenshot was too big, and I had to scroll back and forth to see it. I wasted my 5 seconds and blew the test. Then the screenshot went away, and I was asked, “What is the purpose of the website?” and given a text input area.

The prompt was: “Take a look at the website screenshot, and look at the design/colors/text.”  The questions were:

  • What is the purpose of the website?
  • Do you like the color scheme?
  • Is it too cluttered?
  • Is there too much text on the left of the iPhone image?
  • Is the navigation bar easy to understand?

Yep, I didn’t read the instructions. If I had, I would have known to concentrate and study the screenshot for 5 seconds. Duh, it’s called a five-second test! Sometimes, I’m late to the party.

My immediate reaction was, “How can you ask these questions based on viewing the screenshot for 5 seconds?” What questions would you ask after giving the user only 5 seconds to view the site? Can you get reliable data from this type of test? How do the social aspect and the repetitive nature of the tests affect the results?
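As I understand the mechanic — image shown briefly, hidden again, then a question form — it can be sketched as a toy script. To be clear, the function and parameter names below are my own, not anything from the site:

```python
import time

def run_five_second_test(questions, ask, view_seconds=5):
    """Toy sketch of the test flow: timed reveal, then a question form.

    `ask` stands in for the UI's text-input step; in the real test the
    screenshot is rendered in the browser and hidden when the timer fires.
    """
    print(f"(image visible for {view_seconds} seconds...)")
    time.sleep(view_seconds)      # the screenshot disappears here
    print("(image hidden)")
    return {q: ask(q) for q in questions}

# Example run with a canned answer instead of real user input:
answers = run_five_second_test(
    ["What is the purpose of the website?", "Do you like the color scheme?"],
    ask=lambda q: "no idea -- I was still scrolling",
    view_seconds=1,               # shortened so the demo runs quickly
)
print(answers)
```

Note that nothing stops a tester from typing garbage into `ask` — the timer only limits viewing, not answering, which is exactly the reliability worry above.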

Now I was interested, so I decided to do another.

The next test reiterated the 5-second rule in its prompt: “You have 5 seconds to view the image. Afterwards, you’ll be asked a few short questions.” This test asked:

  • What does this company do?
  • What did you think of the layout and design?
  • What stood out most to you?
  • If you were looking for a website, would you continue to go further into the site to find out more?
  • Does the website look professional?

Okay, I’m getting the hang of it now.

I did another one. This time the prompt was, “You are a young woman who regularly purchases beauty products (skin care/cosmetics)”. And the questions:

  • How would you rate this page on a scale of 0 (really bad) to 10 (really great)?
  • Was it clear what the product was?
  • What would you do next?
  • What did you like most on the page?
  • What did you like least on the page?

I saw my buddy on Facebook and we chatted about it.  He said, “You get a lot of BS answers because people really don’t have incentive to answer thoughtfully.”

He also gave me an idea of how it works using the ‘community option’ service level: you get 30 responses free for signing up, and then for every test that you take, your own test can be given to another tester. Paid plans start at $20 a month (100 test responses) and go up to $200 a month (1,000 responses).
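A quick back-of-the-envelope check on that pricing — tier labels here are mine; only the dollar figures come from the post:

```python
# (label, dollars per month, responses per month) -- labels are made up,
# prices and response counts are as quoted in the post.
plans = [("free signup", 0, 30), ("entry plan", 20, 100), ("top plan", 200, 1000)]

for label, dollars, responses in plans:
    if dollars:
        print(f"{label}: ${dollars / responses:.2f} per response")
```

Both paid tiers work out to $0.20 per response, so the bigger plan buys volume rather than a discount.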

To summarize my opinions: I feel that this type of test could be part of a good user-testing plan, but I was not impressed with the questions in the three tests that I took. Following the G.I.G.O. (garbage in, garbage out) principle, bad questions will yield bad answers. I doubt that these questions and answers will yield any actionable information beyond confirming pre-existing assumptions about the sites. However, I feel that a well-designed question that takes the nature of the test into account could produce actionable results. Questions should be general and open-ended. They should deal with impressions and feelings, rather than specifics.

Finally, taking my opinions into account, here are my top five questions from the 15 I saw:

  • Does the website look professional?
  • What stood out most to you?
  • What did you like most on the page?
  • What did you like least on the page?
  • Was it clear what the product was?

4 comments

  1. Why are those your “top 5” questions? Are those the questions you thought were the most likely of the questions that you saw to result in actionable information? What kind of feedback would you expect to give?

    Did you give thoughtful responses to the questions? Do you plan on uploading a design to get feedback?

  2. Great questions, Ben.

    These five questions best met my criteria: producing actionable results, general (not specific), open-ended, and dealing with impressions. Questions should not deal with specifics because of the short time the user has to focus.

    With a 5-second test, I would only expect to get general impressions and feelings. I am sure that there are better questions than these.

    The real question is, “Will I use and recommend it?” To be honest, I’m lukewarm on a recommendation. Perhaps after running my own tests, I may feel differently. I do think it’s worth 20 bucks to test it out on 100 testers.

  3. Pingback: The Ben and Newman Show Podcast #004 « Better User Experience

  4. Pingback: Breaking News: User Experience Testing Can Be Better « Better User Experience
