Clear Reporting & Critical Thinking: Why User Experience Needs to Remember its Roots in Psychology

“We are going to become increasingly disappointed with our progress if we just keep doing all of this back slapping that we’ve become so accustomed to.” – Whitney Hess (IA Summit 10 Keynote)

There was a time, back in the early 1990s, when almost everyone involved with UX research had a background in Psychology. Back in those days, the term “User Experience” didn’t really exist, and the nearest discipline was Human-Computer Interaction (HCI). Back then, you learned about new developments in the field by reading long, boring-looking photocopies of journal articles and conference proceedings. Things were different. There was a massive gulf between the theory-laden academic researchers and hands-on web designers.

I don’t for a minute want to go back to those days – the quality of design on the web was far lower. Most websites were ugly and hard to use. But, for all its flaws, the HCI community knew how to share research findings effectively. That knowledge has been massively diluted in the transition from academia to industry.

When you read about UX research on the web these days, most reports have more in common with a press release than a piece of Psychology research. If it’s on a fashionable topic or if it’s reported by a famous figure, it’s reblogged and retweeted a vast number of times, without critical evaluation. This is bad.

In “proper” Psychology research, you always…

  • Ensure the study is reproducible: you should report your study in such detail that anyone, anywhere can reproduce it and independently analyze it for its strengths and shortcomings. This is the cornerstone of all good research, and I can’t overstate its importance.
  • Clearly explain all the shortcomings: you should never gloss over the weaknesses in your study design. In fact, you should focus in on them. It doesn’t matter if your study itself doesn’t deliver interesting patterns, so long as you explain what went wrong and the weaknesses in your experimental design. “Failed” studies are still very useful to read about, as your peers can learn from your mistakes.
  • Define your variables: even industry-standard terms like bounce rate are defined differently, depending on who you speak to (see the sketch after this list).
  • Share the data: interview transcripts, data logs, everything goes into an appendix or somewhere online where people can critique it. Never hide behind percentages, and never, ever dress up small-scale qualitative usability studies as more than they really are. Even if you use an eye tracker.
  • Never cite a secondary source without reading the primary source: for example, if you want to refer to the famous jam-buying study in Barry Schwartz’s The Paradox of Choice, you go read the primary source (Iyengar & Lepper, 2000), critically evaluate it yourself, and then reference both primary and secondary sources in your write-up. It’s your responsibility to do this, to avoid misunderstandings being amplified and rebroadcast.
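
To make the “define your variables” point concrete, here is a minimal, hypothetical Python sketch showing how two equally common definitions of bounce rate give different numbers from the same data. The session records, field names and time threshold are illustrative assumptions, not taken from any particular analytics tool.

```python
# Hypothetical sketch: two common but different definitions of "bounce rate".
# The session data, field names and threshold below are illustrative
# assumptions, not the output of any specific analytics package.

sessions = [
    {"pages_viewed": 1, "seconds_on_site": 4},
    {"pages_viewed": 1, "seconds_on_site": 95},
    {"pages_viewed": 3, "seconds_on_site": 180},
    {"pages_viewed": 1, "seconds_on_site": 12},
]

def bounce_rate_single_page(sessions):
    """Definition A: a bounce is any session that viewed exactly one page."""
    bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
    return bounces / len(sessions)

def bounce_rate_time_threshold(sessions, min_seconds=10):
    """Definition B: a bounce is a single-page session shorter than a time threshold."""
    bounces = sum(
        1 for s in sessions
        if s["pages_viewed"] == 1 and s["seconds_on_site"] < min_seconds
    )
    return bounces / len(sessions)

# Same data, different numbers: a report that just says "the bounce rate
# was X%" without stating the definition is ambiguous.
print(f"Definition A: {bounce_rate_single_page(sessions):.0%}")    # 75%
print(f"Definition B: {bounce_rate_time_threshold(sessions):.0%}")  # 25%
```

The point isn’t which definition is “right”; it’s that a write-up which states its definition can be checked and reproduced, and one that doesn’t can’t.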

The fact is, UX researchers in industry are never going to follow all of these maxims, because they can’t. They’re NDAed or financially motivated to keep at least some of the details secret. They’re probably not even given the time to write up their work in this level of detail for internal use. This is the reality of industrial research, and I’m not arguing that it needs to change.

My point is that you need to be highly aware of the shortcomings of the information that’s shared in our industry. Many of us are living on a diet of press releases and ‘top ten tips’ articles. The “UX Myths” meme has gained some traction lately (mainly thanks to Zoltán Gócza & Zoltán Kollin), which is good, but you shouldn’t rely on soundbite articles to tell you why other soundbite articles are wrong. If you do this, you’ll always be on the back foot.

You may not consider yourself an Applied Psychologist, but if you’ve ever designed something while thinking about user behaviour, then you are, by definition, an Applied Psychologist. You’re just not necessarily a very good one. Yet.
