A composite image of a usability test on a television set. Image credit: Shutterstock

Playback

Often the best way to fix your faults is to look directly at them

4 minute read

Regularly reviewing your performance and providing self-critique can help you improve your skills as a usability test moderator.

The camera and microphone are powerful implements in a usability tester’s toolkit. With them, you can capture the user’s actions and facial expressions and gain all kinds of insight into their behaviors, thoughts and expectations when using a product. But these tools can be equally useful in monitoring your own performance as a test moderator.

The Camera Never Lies

One of the pitfalls of running moderated usability tests is that you have to put another human being in the room with the test participant. And humans are, you know — flawed.

As such, that human moderator might bring their own biases into a test and provide subtle cues to the participant that they wouldn’t receive in unmoderated testing. The human moderator might go on a talking jag, not realizing that they are stifling the participant’s impetus to express their own thoughts and insights. That human moderator may inadvertently spill the beans about a task’s solution through their line of questioning or conversation with the participant.

That human moderator is going to be you at some point.

How do you avoid these pitfalls? While there’s no guarantee that you can avoid them altogether, you can certainly mitigate the problems by putting the best possible moderator in the room with the participant. You know: You.

Just as recorded sessions provide a wealth of information about the participant in a study, they also reveal a lot about you as a moderator. You should occasionally review your studies from a self-critique perspective to make sure that you’re not violating any of the rules or best practices of test moderation.

Who is that guy?

Having recently run a usability test to evaluate the utility of a popular pizza website, I decided to review my own performance as a moderator. I watched the playback of the 20-minute test (which I thought went very well at the time) and came away with a different view of the session. I also found some specific areas where I could focus and improve.

To Be? Or Not To Be?

As the test started, I read from a script that went something along the lines of:

“Hi, Test Candidate. My name is Andrew, and I’m going to be walking you through this session today.”

But it sounded like:

“HI TEST CANDIDATE. MY NAME IS ANDREW AND I AM TOTALLY NOT READING THIS SCRIPT FROM A SHEET OF PAPER. THIS IS MY NORMAL MANNER OF SPEAKING.”

In other words: Stiff. And as I watched the participant’s reaction to this Shakespearean monologue, I realized they noticed it, too.

One of your objectives as a moderator is to make the participant feel comfortable. One way you can do that is to act naturally and relaxed. Reading your test script word-for-word is probably not going to serve you as well as paraphrasing and speaking casually with the candidate.

So how could I improve on this? Better preparation and lots of practice will help alleviate the problem for future sessions.

In Other Words…

Another thing I caught myself doing during the session was paraphrasing what the candidate said and repeating it back to them in my own words.

Candidate: “Wow. I didn’t think that would happen.”

Me: “So, in other words, you expected to be taken to the menu page when you clicked on the link?”

Wrong. This is textbook leading the participant. While my intention was just to make sure I understood the user’s context and expectation, I was planting a subtle suggestion into that seemingly innocent question.

Better to ask an open-ended question that gets the user talking about what’s on their mind, not mine:

Me: “What did you think would happen when you clicked that link?”

If you need clarification “in other words,” make sure the words you receive are the test participant’s.

It’s okay to bail out

The test in question called for the candidate to complete three tasks. While the first and third went smoothly, the second task had them stumped. Unfortunately, we spent a lot of time “in the weeds” looking for something that just wasn’t on the path the user had chosen. I should have bailed when this became apparent.

Users are going to fail tasks. It’s part of the process. But it’s also important to keep in mind that failure — even when you’ve explained that mistakes are your fault and never theirs — is disheartening. Spending a lot of time watching the user bang their head against a task you know they aren’t going to successfully complete is not productive.

Once it becomes obvious that the participant is not going to successfully complete a task, you need to politely move on.

Test, Observe, Tweak, Repeat

Developing your UX skills is not that much different from developing the digital products you work on. In this Agile-loving world we’re living in, it pays to spend some time treating your own process the way you treat your products. Seek to iteratively improve your skills as a moderator by routinely reviewing your performance, looking for areas in which you can improve, making the necessary adjustments, and repeating the process.

Until next time…

*Thanks to Leona for unwittingly providing my faux usability participant in this week’s header image.

What are your thoughts? Join me in the conversation over on Threads, Bluesky Social, or Mastodon.

Originally published October 2, 2016
File under: ux  research  usability  techniques