UX Killer Tool: Get Statistically Valid Data With Google Consumer Surveys


One of the biggest problems that UX professionals have is getting enough users to create a statistically valid sample.

We’ve argued before that statistical validity isn’t necessary most of the time. And when the goal is to make your site better, rather than to prove a point, that’s true. Still, there are times when you need to be sure of your data.

Two cases:

1. Your firm wants to launch a new product and wants to test the idea before launching fully.
2. You need data to convince the business owner that an approach other than his own is the right way to go.

There are, of course, other reasons why you’d want statistically valid data to back up your decisions but you get the idea.

There’s a Tool for That

It’s called Google Consumer Surveys.

Google is the first company I’ve seen to offer a survey solution that will generate statistically valid results. Even better, they’ll do it for only $150. (And if you know of others, hit me up in the comments.)

“Only $150? That’s almost 10 gallons of gas, man. Why so expensive?”

In context, it won’t seem expensive at all. For this cost, you can create a 1-question survey and get 1,500 responses. Once the data has been collected, Google gives you a really impressive dashboard where you can do complex marketing analysis.

Let’s Take a Closer Look

Google Consumer Surveys had been under development for 18 months before it came online in early September.

It offers users a unique chance to get insights about their brands and marketing campaigns.

Setting up a survey is easy. This video explains the whole thing in 76 seconds.

If you can’t be bothered with the video, the basics are these:

1. Start by creating a survey from scratch or use a template.

2. Pick your audience. You have three choices.

(1) The general Internet population
(2) A specific demographic
(3) A specific set of people based on a screener question

A fair note of warning: Pricing is affected by your selection. If you choose the general Internet population, pricing is $0.10 per respondent. If you choose a specific demographic or want users screened, that price jumps to $0.50 per respondent.

3. Write your question.

This is a multi-part step. The first thing you do is pick your question type.

Once you have picked your question format, you are asked to fill out the details of the question.

For example, with the first box above, which is called the “single answer” format, the details look like this:

It’s a really straightforward affair.

4. Choose the number of respondents and pay

Google offers three choices: 1,000, 1,500, and 2,500.

The cost is based on which group you chose in step 2. But for 1,500 responses, which would achieve statistical validity, it’s going to run you either $150 (for one question at $0.10 per response) or $750 (for one question at $0.50 per response).
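To make the billing math concrete, here’s a tiny sketch of how the totals above work out. The `survey_cost` helper is mine, not part of any Google API — it just captures that you pay per response, per question:

```python
def survey_cost(responses, rate_per_response, questions=1):
    """Total cost: each question is billed once per response."""
    return responses * rate_per_response * questions

# General Internet population at $0.10 per response:
print(survey_cost(1500, 0.10))               # 150.0
# Screened or demographic-targeted at $0.50 per response:
print(survey_cost(1500, 0.50))               # 750.0
# Two questions double the bill:
print(survey_cost(1500, 0.10, questions=2))  # 300.0
```

The same arithmetic explains why the 2-question limit matters less than it first appears — splitting a longer survey into several 1-question surveys costs the same as asking those questions in one survey would.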

This is what Google has to say on how many responses you should buy:

The number of responses you should purchase depends on how confident you want to be in your results. A top-line margin of error of 10% or more is acceptable for some business decisions, but others require more accuracy. For general internet audience questions, we recommend 1,500 responses to ensure that top-line results will be within a 3-5% margin of error and that results segmented by one or two dimensions (e.g. age, gender, etc.) will be within a 10% margin of error. The number of responses recommended in order to receive statistically significant responses can be different for targeted surveys to a custom audience.
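Google’s recommendation lines up with the textbook margin-of-error formula for a proportion at 95% confidence. Here’s a quick back-of-the-envelope check — my own sketch, not Google’s methodology, and the ~150-respondent segment size is an assumption for a single demographic slice:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    using the worst case p = 0.5 unless told otherwise."""
    return z * math.sqrt(p * (1 - p) / n)

# Top-line result from 1,500 responses:
print(round(margin_of_error(1500) * 100, 1))  # ~2.5 percentage points
# A segment of roughly 150 respondents (e.g. one age bracket):
print(round(margin_of_error(150) * 100, 1))   # ~8.0 percentage points
```

So 1,500 responses comfortably lands the top-line number inside Google’s stated 3–5% band, and even a tenth-sized segment stays under the 10% they quote.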

5. That’s all there is to it. Next, you wait. Google will begin to collect the responses.

A Word About the People Answering the Questions

How does Google get their users for these surveys?

Google has a payment exchange system where users can get access to, say, a newspaper article in return for answering a question. Google does all of this through the DoubleClick publisher network, which it owns.

For the most part, it’s a win-win-win all around. The user gets access to the content, the marketer gets valid data for their question, and Google takes some money in the middle.

The Results

The results dashboard, in a word, is DOPE. It does a good job of making you feel like the money you just spent to know the answer to one question was totally worth it.

See those demographic buttons on the left? Click those and you’ll get data on just that market segment.

But don’t take my word for it, click around yourself. Here’s one that Google created as an example.

For business owners who have never had access to data in this form before, this feels very powerful indeed.

Are there any downsides?

As with everything, there are a number of limitations to the service which may affect its usefulness to you.

1. National sample only: You can’t geo-target your questions. You can get regional data in the dashboard (segmented by state, region, or country) but you can’t get all of the responses from one particular region or state.

2. Surveys are limited to 2 questions: This seems like a problem, but when you think about it, it has more to do with how they group the results than anything else. It’s true that they limit you to 2 questions per survey. BUT, and this is crucial: you’re paying on a per-response basis. So if you ask 1 question, it’s $150, and if you ask 2 questions, it’s $300. All of your surveys are grouped in the same place, so this seems like an artificial limit to me. If you had a 10-question survey, you could make up ten 1-question surveys, and all 10 sets of responses would be reported in the same place. This effectively solves the problem. However, the way this is presented is a bit confusing, which is why it’s worth mentioning.

3. Pricing can get expensive: It’s true, $750 for 1,500 qualified responses sounds expensive. It’s just a matter of whether knowing the answer to that question is going to either save or make you at least that money. For those that say “no”, this will be too expensive. For the rest, it’s reasonable taken in context. However, that being said: don’t ask bad questions or you’ll definitely waste money.

4. It’s impossible to ask open-ended questions: Personally, I found this a bit limiting. But I understand that Google wants to make things as automated as possible, so some limits had to be drawn. My solution was to ask the question in a different way.

It’s also worth noting that I’ve always had this problem with every user-testing or survey solution I try — their technology forces me to ask my questions in a way that their software can handle. So this is no different. It’s still a little irritating but it’s also really common.


I believe great websites come from understanding the website’s message and goals. Underlying that message and underlying the website’s goals are a set of assumptions about what the user wants from your product and/or service. If the assumptions are wrong, then everything built on top of it will be wrong for your market. In these instances, I think it makes a tremendous amount of sense to have data to back up your assumptions. If you have to burn a little capital to have valid data, I think it’s worth it.

Now, should you run a test every time you want to change your homepage? Absolutely not. That would be a waste of cash.

So do you need Google Consumer Surveys? For some of you, yeah, you will. You’re going to be involved in bringing a product to market or involved in brand messaging and this tool can help nail down what those ideas should be.

It’s a way to get feedback from a relatively large sample (especially compared to a usertesting.com- or Usabilla-type survey tool) for cheap.

Personally, I’ve used it with some of my clients and we’ve felt like we’ve gotten valuable feedback. So give it a whirl.

Google Consumer Surveys. That’s wassup.

One comment on “UX Killer Tool: Get Statistically Valid Data With Google Consumer Surveys”

  1. If you want usability feedback on a large scale to generate quantitative statistics and metrics, then Loop11 is the only tool I’m aware of that can do this.
    Typically, surveys are a very poor way of sourcing usability feedback.
