It was an intriguing question: “How do I find out about statistically significant usability testing?”. I’m sure it’s one that you’ve encountered, and maybe your reaction was the same as mine: “That’s the wrong question”. Then I realised that if… Continue reading: Statistically significant usability testing
It’s been a while since I ranted on about response rates on surveys. In that article, I took the view that “2% is a terrible response rate” and had a few reasons why and tips for doing better. Recently, I’ve… Continue reading: Surveys – what is an acceptable response rate
In the 1950s, a well-designed survey could often achieve over 90% response rates. Since then, response rates have consistently declined. But I was still a bit shocked the other day when a post on a usability discussion group quoted a… Continue reading: Survey response rates? 2% is not good enough
Questionnaires often ask us to rate something or other. Recently, I’ve been asked about: ♦ my satisfaction with a huge website ♦ the effectiveness of a selection of ways to maintain or increase charge-out rates ♦ the cleanliness of a… Continue reading: Piggy in the middle? Why people choose the midpoint in rating questions on questionnaires
A few years ago, I realised that when we’re testing products with the general public, we’re actually doing a type of market research. So I joined the Market Research Society in the hope of making connections with other market researchers… Continue reading: The Market Research Society Conference, or “usability? what’s that?”
I am a usability consultant and I believe, and find in practice, that usability evaluation is the best way to find out whether a document works for its users. However, I have frequently been in a position where my clients… Continue reading: Market Research or Usability Evaluation?