Crossing the ‘Say-Think’ Chasm with Human-Centred Survey Design
When I was a child, my mother would always tell me, ‘mean what you say, say what you mean’. As a market researcher, this line has stayed with me, given how closely I work with human responses day to day. Of course, the vast majority of responses are accurate and valid, but there’s often a small percentage which (intentionally or otherwise) obscure the ‘truth’.
“There’s the truth… and then there’s the truth.”
Understandably, working out the real ‘truth’ in people’s answers can be a problem for research. So how, then, can we ensure that the responses we’re receiving are true and accurate? Do we just have to trust them and hope for the best? Not necessarily.
As we learn more about the (il)logic behind human behaviour, we learn more about ways to outsmart ourselves and bring this truth to light. One of these ways is what we call ‘unconscious testing’.
Put simply, unconscious testing tries to untangle the truth by looking at automatic, effortless behaviours. As research shows, the more automatic a behaviour is, the less filtered it is, and the more faithfully it reflects our real thoughts and feelings.
Unconscious tests come in all shapes and sizes. Arguably the most popular are implicit association tests (Freud, anybody?), but with advances in technology there has been a boom in biometric measures – eye-tracking, heart-rate monitoring, skin-conductance (sweat) recording – which help us as researchers cut through the fluff and get to the crux of the issue.
While the above are all fantastic ways of uncovering the truth within, they’re not so practical for survey research (imagine being asked about your childhood experiences in the middle of a taste-testing task). But the thinking is along the right path…
Recently, the Square Holes team were lucky enough to attend the 2022 Human Insights Conference, where a researcher by the name of X presented some compelling research on how to do unconscious survey testing well – in essence, using a simple, everyday gesture as the survey response instrument.
This got me thinking. Recently, there has been a growing body of work showing the link between ‘gestures’ (the act of moving one’s hands and body) and implicit knowledge. One study in particular showed that children struggling with a maths problem, when explaining their working, often unconsciously revealed new and correct problem-solving strategies – expressed only in gesture. There seems to be a strong link between physical movement and implicit thinking, and the example above leverages it by using quite a common gesture – tapping one’s phone – as the survey response instrument.
Another element of this technique is time-based monitoring. Research has shown that the time taken to answer can affect the honesty, or truthfulness, of a response. As an example, a popular implicit test, the Implicit Response Test (IRT), treats any response below a threshold of roughly 500-600ms as an ‘unconscious’ response, and any response beyond that as conscious and therefore susceptible to influence. However, it is necessary to first establish a ‘true north’ baseline of response times – an idea of how quickly a respondent normally proceeds through the survey – as everyone is different and will respond at a different pace.
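To make that concrete, here is a minimal sketch of how one might operationalise the idea in practice. It is not the IRT’s actual scoring method; the 500-600ms figure comes from the paragraph above, while the warm-up-question baseline, the scaling step and all names (FAST_THRESHOLD_MS, baseline_ms, classify_response) are illustrative assumptions.

```python
# A rough sketch (not a published standard) of flagging 'fast' vs 'deliberate'
# responses, adjusted by a per-respondent 'true north' baseline.

from statistics import median

FAST_THRESHOLD_MS = 550  # assumed midpoint of the 500-600ms range cited above


def baseline_ms(warm_up_times_ms: list[float]) -> float:
    """Estimate a respondent's normal pace from neutral warm-up questions.

    The median is used so one distracted answer doesn't skew the baseline.
    """
    return median(warm_up_times_ms)


def classify_response(response_ms: float, respondent_baseline_ms: float,
                      typical_baseline_ms: float = 1000.0) -> str:
    """Label a response as 'fast (likely unconscious)' or 'deliberate'.

    The raw response time is rescaled by how quick this respondent is
    relative to an assumed typical respondent, then compared to the threshold:
    a naturally quick person's 400ms is not the same as a slow person's 400ms.
    """
    adjusted_ms = response_ms * (typical_baseline_ms / respondent_baseline_ms)
    return "fast (likely unconscious)" if adjusted_ms <= FAST_THRESHOLD_MS else "deliberate"


if __name__ == "__main__":
    warm_up = [820, 760, 900, 850]  # ms taken on neutral warm-up items
    base = baseline_ms(warm_up)
    for rt in [430, 620, 1500]:
        print(f"{rt}ms -> {classify_response(rt, base)}")
```

The design choice worth noting is the baseline step: rather than applying the 500-600ms cut-off to everyone equally, each response is judged relative to how quickly that respondent normally moves through the survey.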
In sum, the value of this technique for research is clear. Surveys are inherently based on people and their responses. Although this sometimes brings challenges, particularly when there is an incentive (social, political, economic) to respond in a less-than-truthful manner, it is that same human element which makes research so valuable. What we as researchers must do, however, is explore, trial and utilise different kinds of smart tech and thinking to try to outsmart our clever brains.