So, I dipped my toe back into web research this morning. Since I generally have decent experiences with Harris polls, I decided to give them a shot. As usual with them, it was a fairly painless experience; nothing glaringly wrong or obnoxious, so I won't dwell too long on either of the two things that did catch my eye.
First, from a pure design standpoint, I don’t understand the point of these massively over-wide columns. If you’re going to answer true for some and false for some, that’s a lot of left-right mouse or trackpad motion — enough that it created a minor annoyance for me. In a 3-question true/false setup like this, it’s really not terrible, but in a longer series of questions, it might drive me to drink:
Wouldn’t this shopped version be easier to use?
Like I said, pretty minor. Which brings us to my second and final observation on this poll:
Who the heck are the numbers for?
So, yes: all pretty minor.
So for a couple of weeks now, I’ve been getting emails from the Los Angeles Times about how my email newsletter subscriptions are about to end. I’ve been ignoring them, because I don’t think I actually get any emails from the Los Angeles Times. I suppose I must have registered with a real email address on their site to read a story once, years ago, before BugMeNot and their Firefox extension made such things unnecessary. In any case, I don’t care, fine, whatever, stop sending me those newsletters you’re not actually sending me, I’ll find a way to survive, despite the longing I shall forever feel in my heart.
Just now, though, I got this brilliant piece of email from them:
“Why have we stopped sending you emails?” WHAT DO YOU THINK THIS THING IS? IT’S AN EMAIL! THAT YOU’RE SENDING ME! ABOUT HOW YOU’VE STOPPED SENDING ME EMAILS WHICH IN ACTUALITY YOU NEVER WERE SENDING ME IN THE FIRST PLACE!
It boggles the mind.
This is probably a nitpick:
It’s bad enough when you ask me which brand of orange juice I trust most. I can’t imagine why any nationally known brand of juice would be more or less “trustworthy” than any other; it’s not like I have reason to believe Tropicana’s short-lived carton redesign is indicative of some sort of quality control problem, or that they’ve been filling the cartons half a glass short. But what exactly is there to trust or distrust about a razor manufacturer? What am I not getting here? It’s a company that makes razors and razor blades, not a candidate for high office.
Most of the items in this particular matrix are actually relevant; to me, that makes “brand I trust more” stand out like a sore thumb.
Of the online polls I see regularly, Harris Interactive seems to be the best of the lot. They actually did something I thought was particularly good in a piece of research I saw today, but of course I’m going to post on the thing they did today that I didn’t like as much.
First, I was asked this:
Nothing horrible there; it’s a matrix, but it’s not too huge, and the answer choices are fine. But then they ask this as the very next screen:
I check my bank statements all the time, so I started clicking the right-most column, just like I’d done with the Facebook questions — and then I realized that the right-most column wasn’t “very likely” anymore, but had mutated into “not applicable.” I suppose you can make an argument that the column is necessary on this page — the Facebook questions were asked only of Facebook users, and it’s possible some people answering this screen might not have a 401(k), or might not have any credit cards — but I think it would have been better either to let the “not at all likely” column take care of those folks or to add “not applicable” to the Facebook questions as well.
The more you can keep answer choices identical, the more your respondents can glide effortlessly through the research. Note that what Harris did here is really quite minor, and not nearly as aggravating as what I’ve seen done elsewhere from time to time, where a painfully over-sized matrix runs across multiple pages with the answer choices randomly shifting to make sure the respondents are paying attention. What I’m talking about here is nitpicking compared to that sort of thing — but the common element is that your respondent is going to click the wrong thing because they expect their answer choice to be somewhere else.
This happens to be Greenfield’s work, but I’m fairly sure Harris does it as well.
How, exactly, would you attempt to enter a range of zip codes in answer to this question, and why would you want to? Especially since the field is constrained to five characters. Was there a time when respondents were insisting on claiming their zip code was “00001-99999” or something?
Thanks for the tip. I was going to attempt to select three or four different states, but since you’ve helpfully told me to only select one, I’ll see what I can do.