Category Archives: Polling Point

Pssssst: It’s Still a Matrix

Even though I can make cool patterns with it, it’s still a matrix, and in the end, it’s still a bit boring.


I still think this is an innovative approach, and one with promise — but I think it would be much stronger with logos or product images in place of the text. Take advantage of the unique opportunities you get in web-based research.


Filed under matrixes make me cry, Polling Point, the web is a visual medium

How Do You Tell if Your Survey Design Has Problems?

Harris often ends its surveys by asking respondents to click a radio button in response to something along the lines of “How much do you agree with this statement: I noticed mistakes in this survey,” with answer choices ranging from strongly agree to strongly disagree. (I keep neglecting to get a screencap of that, but it’s along those general lines.)

Here’s NPD’s similar approach:


Polling Point has asked, at various times, something along the same basic lines; I remember because there was something they kept getting wrong, and I kept complaining about it, to no avail. (It was something minor related to the answer choices they were giving about which news program you watched most often, but I can’t remember the details.)

All of these approaches are, on the one hand, commendable: respondents should be able to provide feedback, and asking directly in the research instrument itself routes that feedback straight to the survey designers, instead of forcing anyone with an issue to go through the front office of a giant research company with a complaint that will never be properly routed.

But here’s the other side of the coin:

Wouldn’t it be better to extensively test our research before it goes into the field?


Bring in a consultant to edit your questionnaires in the short term and, in the long term, train your staff in writing better questions and designing better studies. Me, for instance.

Alternate suggestion:

Offer special incentives to especially active panelists who have proven themselves in past feedback to be critical of errors, in exchange for them “previewing” certain surveys. You can probably get them to do a decent job of quality assurance testing/user acceptance testing/proofreading/whatever you want to call it for nothing more than an additional entry in the sweepstakes or an extra $2 in their prize money account. (Yes, it occurs to me they could be doing this right now, just without the special incentives, and I’m doing their work for free.)

The thing is, finding errors and other usability problems in complicated market research instruments really isn’t the sort of thing you want to be doing in real-time on a live survey after people work their way through it. Whether you’re exposing the whole panel or just the selected subset of early reviewers to the unedited project, you’re still trying to do triage on something that should have been right before it launched.

Get it right the first time, guys.


Filed under bad user experiences, Harris, Market Research, NPD, Polling Point, web research

Why We Fight

I think I need the occasional reminder that the point of this isn’t so much to point and laugh at Greenfield and other worthy targets, but instead that bad research needs to be eliminated, because it actually hurts us all.

What’s the long-term effect of Greenfield getting respondents to take survey after survey, in a never-ending chain of sweepstakes entries?

What’s the long-term effect of Polling Point asking respondents to mark screen after screen of company names as green or red?

What’s the long-term effect of untrained telemarketers asking respondents who they’ll vote for — and mispronouncing the name of each candidate?

What’s the long-term effect of Zogby’s willingness to get in bed with just about any client and take on research about just about any topic?

As the pool of willing cooperators dwindles, how much longer can we rely on the remaining respondents to fill in the gaps?

Short-term, there are ways to compensate, but at some point, we’re going to regret that we let this go on as long as we did.


Filed under bad user experiences, election polling, Greenfield, Market Research, Polling Point, Public Opinion Polling, Zogby

Don’t Fear “Not Sure.”

I think Polling Point does some interesting work. They’ve got the thing where they show about 16 buttons, each labeled with the name of a company, and ask you to click to select the ones you like, which turn green as you do; then they put up the same list and ask you to click the ones you don’t like, which turn red. Clever, though I think I’d try to integrate the corporate logos into the buttons as well.

They also have a cool way of asking follow-ups: the follow-up question appears in a pop-up balloon connected to your initially selected choice, speech-bubble style.

But then they ask me this:

The first choice — Limited — well, I’ve been in Limited a slew of times in my life; I’ve stood around and watched my wife buy various things. I’ve definitely been there, but I’ve never personally purchased anything there.

I guess “Did not purchase anything here in the last week” is the most appropriate option, and maybe it’s the obvious option to other people, but to me, that option implies I’ve bought things there occasionally.

I think the best solution is a “not sure” choice. “Not sure” on almost EVERY question gives everyone who doesn’t fit one of the other categories a place to put themselves. I’d definitely choose “not sure” on Limited and on Catherine’s Plus Store: Limited because, as I’ve said, I’ve been there plenty but never bought anything; Catherine’s because the name is vaguely familiar, but I’m sure I’ve never been, so I can’t say authoritatively whether it’s in my area or not.

Typically, when I offer “not sure” on a piece of research, the percentage of respondents selecting it is under 5%. If it’s much larger than that, one of two things is happening: either I’ve asked people to rate something they’re unfamiliar with, or I’ve designed a defective piece of research.

Either way, we’ve learned something valuable. “Not sure” is a useful tool: it keeps respondents from feeling “trapped” by a question with options that don’t correctly apply to them, and it serves as a self-diagnosis tool for you.
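The 5% rule of thumb above is easy to automate as a post-field check over tabulated responses. Here’s a minimal sketch; the question names, response labels, and data are all invented for illustration, not anything Polling Point actually runs:

```python
# Flag questions whose "Not sure" share exceeds a threshold, as a rough
# self-diagnosis of unfamiliar items or defective question design.
from collections import Counter

NOT_SURE_THRESHOLD = 0.05  # the ~5% rule of thumb

def not_sure_rate(responses):
    """Fraction of responses that are exactly 'Not sure'."""
    counts = Counter(responses)
    return counts["Not sure"] / len(responses)

def flag_questions(survey_data, threshold=NOT_SURE_THRESHOLD):
    """Return question IDs whose 'Not sure' share exceeds the threshold."""
    return [q for q, responses in survey_data.items()
            if not_sure_rate(responses) > threshold]

# Invented example data: two questions, 20 respondents each.
survey_data = {
    "q1_limited": ["Purchased"] * 12 + ["Did not purchase"] * 7 + ["Not sure"],
    "q2_catherines": ["Purchased"] * 8 + ["Did not purchase"] * 6
                     + ["Not sure"] * 6,
}

print(flag_questions(survey_data))  # q2's 30% "Not sure" share gets flagged
```

A flagged question doesn’t tell you *which* problem you have, only that you should go look, so pair it with a review of the question wording and the familiarity of the items being rated.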


Filed under answer choices, Polling Point