#1 – Only survey those you want to survey.
- I recently got a survey asking me to rate my satisfaction with one of a website’s products even though I had never used it; I had only read about the product on their site.
- In a user satisfaction survey, survey only those who’ve used your product, or at least offer a “Not applicable” response option. Better yet, survey those who’ve used your product several times, so their opinions are more informed.
- Sounds obvious, but the website that made this mistake is arguably the most well-known website in the world. Sometimes it’s the obvious that gets overlooked.
I really enjoyed today’s blog post from the famous Googler and entrepreneur Avinash Kaushik on tips for data presentation, particularly the importance of moving discussion quickly off data and onto insights and actions.
> But if the occasion is a strategic discussion, any occasion about taking action on data, then you need to get off data as fast as you can.
No skill is more in demand than the ability to communicate the “so what?” when it comes to any occupation that deals with data. Just go to a job search website and search through job postings using words like “research” or “data” or “analyst.” You’ll see the ability to uncover strategic insight and recommendations from data as a requirement in virtually every single job posting.
I haven’t watched American Idol in years, but I still remember the cringe-worthy auditions of those who claim to be the next Idol yet can’t carry a tune to save their lives. For some, it’s hard to objectively evaluate their own talent when there is so much at stake.
Psychologists have coined the term “motivated reasoning,” a tendency for people to reason in ways that allow them to form or maintain desirable beliefs (e.g. that they can sing). They may readily accept information that supports their beliefs as valid but question information that challenges their beliefs (remember how angry those contestants were at the Idol judges?).
In a similar vein, research should not be conducted by those who have a stake in how the research findings turn out. This may seem obvious, but I’m surprised by how often it still happens.
There’s a recent term: “Google Statisticians.” No, they are not statisticians who work for Google; they are people who do statistical analyses by googling phrases like “how to do significance testing” or “how to calculate a p-value.”
As biostatistician Jeff Leek pointed out, most analyses are no longer performed by statisticians, as data are now abundant and cheap to collect. Long gone are the days of door-to-door surveys, and phone surveys are almost a thing of the past. Online surveys are everywhere thanks to platforms like LimeSurvey and the powerful Google Consumer Surveys, which make it easy to collect and analyze survey data. Log file data is free and overwhelming in size. There’s even software geared towards non-statisticians that automates statistical analyses.
My main recommendation: include only users impacted by the change in your analysis, and exclude those who were not.
- Let’s say you have an e-commerce site. You want to test whether certain changes to your checkout page would increase conversion (% of users purchasing).
- You want to run a 2 × 2 multivariate experiment with 1 control and 3 treatment groups.
- Your current conversion rate is 5%, and you want to detect relative changes as small as 10% (i.e., from 5% to 5.5%), with the conventional 80% probability of detection and a 95% confidence level.
- According to this table in my blog post, you would need 30,400 users in each group, or 30,400 × 4 = 121,600 users in total visiting your site. (That’s a lot!)
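As a rough sanity check on numbers like these, the per-group sample size can be approximated with the standard two-proportion z-test formula. This is a sketch, not the exact method behind the table (the function name is mine); it lands near 31,000 per group, in the same ballpark as the table’s 30,400, with small differences depending on which approximation a given table uses.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(p_base, relative_lift, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided, two-proportion z-test."""
    p1 = p_base
    p2 = p_base * (1 + relative_lift)              # e.g. 5% -> 5.5%
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)       # sum of per-group variances
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

n = sample_size_per_group(0.05, 0.10)  # ~31,000 users per group
total = n * 4                          # 1 control + 3 treatment groups
```

Note how sensitive the result is to the effect size: because the required n scales with the inverse square of the difference in proportions, detecting a 20% relative lift instead of 10% would cut the sample size roughly by a factor of four.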