
Moderated Usability Testing: A Tactical Guide [2024]

Kritika Oberoi
January 12, 2024

Moderated usability tests are the cheapest, fastest way to iterate on your product. You learn what not to build before a single line of code is written, saving your team weeks of engineering time. If you already have a product up and running, these tests show you why your conversion rate is dropping, where users are rage-clicking, and more.

As a bonus, you get to see users react in the most unexpected ways to your designs — confirming that people are not so predictable after all.

But as with most things in business, decisions often come down to 💰 at the end of the day. So here are some examples of how moderated usability tests, or the lack thereof, have impacted real products and companies:

  1. SoundCloud found over 150 usability issues by testing their mobile app before release, meanwhile…
  2. Citibank accidentally transferred $900 million instead of $8 million to a client because they relied on software with poor usability

Despite the obvious value of moderated usability tests, many teams struggle to conduct them because they don’t know how to do them well. One of the most common statements I hear from PMs and Designers is, “I’m not sure which questions to ask”.

This article will give you an overview of:

  • The basics of moderated usability tests
  • Quick tips to improve your moderated usability tests
  • Real-life usability testing scripts used by expert researchers and designers at companies like Growens, PandaDoc, 15Five, and Obvious
  • A checklist that you can stick to right before every moderated usability testing call

The contents of this article are equally applicable to remote usability tests and in-person usability tests.

Moderated Usability Tests: The Basics

First things first — here are some basic guidelines to help you run a successful usability test.

What is a moderated Usability Test?

A moderated usability test gauges—you guessed it—the usability of your product.

You conduct a moderated usability test by giving participants a prototype or a product and asking them to complete one or more "tasks" on it. These tasks represent user flows you'd like your users to be able to complete without external intervention.

During the usability test you will observe your users and identify areas where they get confused or frustrated with your product. This helps you identify ways to improve the usability of your product.

What makes a Usability Test successful?

Realistic scenarios are a must

Give your user a task that is representative of what they might actually be doing on your software. For example, when we were running usability tests for a time-stamped note-taking feature at Looppanel, we asked users to take notes during a mock user interview (a user interview inside the real user interview — meta, I know!).

Ask users to Think Out Loud

With a usability test you’re trying to simulate how your user would feel the first time they interact with your product. What you want to know is — what’s going on in this person’s mind? How are they reacting to the product the first time they see it? Are they able to navigate through it successfully?

Since we don’t live in a world where you can read someone’s mind (thank god!), we do the next best thing with moderated usability tests — ask users to Think Out Loud. It is exactly what it sounds like — users stream-of-consciousness share all their thoughts, feelings, questions as they’re navigating through the prototype.

Did they look at a button on the right-hand-side? Why? What did they expect it to do? You really want All. The. Details.

For a detailed look at how the Think Out Loud (aka Think Aloud) Protocol works, check out this article.

Ask the "5 Whys"

You’re running a moderated usability test so you can understand the root of your user’s confusion and what their true needs and expectations are. You won’t get to the root the first time you ask a question, so keep probing with "Why?" until you reach the real reason.

For example, when I asked users about emojis we were testing in our note-taking feature, many participants responded that they found them “confusing”. Although that’s insightful, if I don’t understand why they found them confusing, I don’t know whether to replace those emojis with new ones, with text, or remove them altogether.

By probing with ‘Why’ we found that the emojis were open to interpretation — users weren’t sure what ⭐ or ✅ should mean. Given the rapid, time-constrained nature of taking notes during a call, they often chose to skip the emojis altogether because it simply took too long to figure out what to do with them.

Knowing why they were confused allowed us to step back and remove emojis from the note-taking view altogether, replacing them with a simple bookmark to time-stamp key moments that didn’t require too much hemming and hawing to figure out.

Closely observe the users

A moderated usability test gives you a wealth of information — some in the user’s facial expressions, others in the movements of their mouse, and still some more in the hesitant tone of their voice when they’re talking out loud.
You’ll want to try to pay attention to all of the above. It does get pretty challenging, which is why it helps to:

  1. Record the session in case you need to go back to it,
  2. Have a note-taker on the call with you to notice anything you missed!

Why do Usability Tests fail?

Leading or Close-Ended Questions

“Do you like Feature X?” is a bad question to ask for 2 reasons. First of all, your user’s likely response to a question like that will be “Yes, I like this” or “No, I don’t” because of the close-ended nature of the question. As we covered above, what you really want to know is why! Second, your question isn’t neutral (you’ve embedded the word “like”) and because people often default into responding politely, they’re more likely to say “Yes, sure I do!”

A better way to phrase the same question is, “How do you feel about X feature and why?” You’re not leading the user in any particular direction, and you’re more likely to get lots of amazing insight, beyond just a simple “Yes” or “No”.

Answering Every Question

The point of a usability test is to see when your users are confused, which they will often be. When they are confused, they’ll default into asking questions — “What does this icon mean?”, “Where’s the right tab?”, and on and on. These questions are actually amazing data points — they point you to where your users are really, truly confused and where their expectations from your product were not met.

What you do not want to do is start answering all these questions during the test itself — that’ll take you off track from the task you’re testing for and may give the user information mid-task that wouldn’t otherwise be available in a natural setting.

Instead what I often do is let users know upfront that I may not answer all their questions (but they should keep them coming anyway), note their questions down and then answer them at the end of the task. That way I can dig into the reason these questions came up at all (more why follow-up questions!).

Caveat: Answer your user’s questions if they are blocking their progress on the task! (e.g., if the user is struggling with a prototype limitation, you can jump in to answer their question)

Telling Users What To Do

Usability tests can be one of the biggest challenges of self-control (think Marshmallow test) because your job is to watch your user struggle through your design — and you can’t say anything. At All.

You’re trying to see what your user would do if you weren’t around, so you have to simulate that experience. Sadly that means zipping it and silently watching them struggle through your prototypes 😭

Tips for better Usability Tests

  1. Honesty is the Best Policy! Because people are often trying to be nice rather than honest, it’s helpful to explicitly clarify to them upfront — it’s okay to be brutally honest! I won’t be hurt, I’ll actually appreciate your candid feedback.
  2. Bring Your Own Note-taker: Get someone on your team to observe and take notes during the call. It’s nearly impossible to talk to users, observe what they’re doing, take notes, and think at the same time. But having good notes will save you SO much time in the analysis phase when you’re trying to find patterns across calls. Whenever possible, cajole, bribe or otherwise convince someone on your team to join your usability tests and take notes or bookmark key moments so it’s easy for you to collate your findings later.
  3. Always Be Recording: You never know what moment you’ll want to go back to, or who you’ll want to send a clip of a user rage clicking on the prototype — so always be recording!

Scripting your Moderated Usability Tests

Now that the basics are covered, it’s time to get into the actual moderated usability test interview scripts.

A script is very important for any user research interview, but with a usability test it’s even more crucial since you need to guide users through very specific tasks.

4 core parts of Moderated Usability Test scripts

  1. 📝 Introduction: This is where you explain to the users what a usability test is, what the expectations from them are, as well as ask for permission to record the call and confirm that any legal documentation your company may require is signed! You can use a handy tool like Looppanel to record, transcribe & take notes for the session
  2. 🔥 Warm Up Question: It’s always best to let your user “warm up” to the interview, getting them comfortable with some contextual questions that can help you evaluate their responses as well. Typically you’ll want to ask a bit about who your users are and their experience with your product or competitors, if any.
  3. 🧩 Usability Tasks: This is the meat of the matter. This is where you’ll set up specific realistic “tasks” for your users (e.g., set up a new Google Doc and invite your teammate to collaborate on it). You’ll watch them navigate through your prototype, noticing when they click the wrong part of the screen, when their faces scrunch up in confusion, and even timing them on how long it took to complete the task.
  4. 🎁 Wrap Up Questions: The Outro! Ask your final follow-up questions and allow your participants to ask any they might have been holding onto as well.
    One of my favorite questions for this part of the call is, “Is there anything I didn’t ask you that you think I should be aware of?”. Some really magical insights come up with just that one single open ended question.

How to moderate a Usability Test

Moderation in a usability test is all about setting the stage and observing your users.

During the introduction and warm up phase of the study, focus on making sure your participant is feeling comfortable and relaxed. If you are conducting the usability test in person, watch out for the participant's body language as well.

Once you've given your task to the user, your job is to observe. Pay attention to where your users look frustrated, pause for a long time, or otherwise show signs of confusion.

Do not interrupt, help your users, or jump in as they talk through the task. It's hard, I know! But if you interrupt your user or start answering their questions in the middle of the task, you are endangering the validity of the test.

In real life your users won't have someone to answer their questions. So you can't answer them during the task either.

Once a task is over, feel free to answer your user's questions before guiding them to their next task.

Usability Testing Guides from successful User Researchers

Many incredible teams like PandaDoc, Growens, 15Five, and Obvious have already created moderated usability testing scripts that their design & research teams can leverage every time a new study comes up.

To help you get kicked off faster, these expert teams were gracious enough to let us share their knowledge & resources with you:

  1. PandaDoc’s Usability Testing Script [Click to View]
    Contributed by Shannon Morgan, User Researcher

  2. Growens’ Usability Testing Script [Click to View]
    Contributed by Chiara Scesa, Group Design Lead & Dalila Bonomi, Design Researcher & Service Designer

  3. 15Five’s Usability Testing Script [Click to View]
    Contributed by Abbigail Rose Christensen, Senior Product Designer

  4. Obvious’ Usability Testing Script [Click to View]
    Contributed by Aarti Bhatnagar, Design Researcher

Checklist for Moderated Usability Calls

As one final parting gift from me to you, I wanted to share a checklist I usually keep handy during usability tests with reminders of the key tasks I need to complete. Pull this out before every call to make sure you haven’t missed anything!

Before the Call

  • Reminders: Send a reminder email to your participant to confirm that:
    1) They can still attend the call at the agreed time
    2) They will have reliable internet access
    3) They will be able to join the call from their computer and share their screen
  • Note-takers: Ensure that your note-taker:
    1) Is added to the call invite
    2) Has a link to a document or app (like Looppanel!) where they can take notes during the call
  • Prototype: Keep your prototype link ready to share with the participant

During the Call

  • Record the call: Ask the participant for permission and start recording the call

After the Call

  • Debrief: Clean up your notes and check in with your note-taker for a 5-minute debrief. During the debrief, cover these topics:
    1) Key takeaways and observations that you both may have had
    2) Anything that didn’t work in the script or usability test set up itself (e.g., did the prototype not load correctly?)
    3) Questions you’d like to add or remove from the script based on your findings so far
  • Analysis of Findings (this is a whole other article!)

