How to conduct usability testing
Looking for tips on how to conduct usability testing? You’ve come to the right place. In this chapter, we walk you through the process step-by-step. By following these steps, you’ll be well on your way to conducting effective usability tests and gathering valuable insights.
Top tip: Run a pilot usability test
Running a pilot usability test before you conduct the full-scale test can be a helpful way to identify and address any potential issues or limitations. You can think of this as a dress rehearsal before the big day.
The pilot can be run with a colleague, but it’s a good idea to choose someone who hasn’t interacted with the design or product before. That way, they’ll have the same level of understanding as an external test participant.
After running the pilot test session, conduct a debriefing with your colleague. Ask open-ended questions to gather their thoughts, opinions, and suggestions on where improvements could be made. This process helps you identify and rectify potential issues early on, ensuring a more effective and successful test.
1. Brief your usability testing participants
Briefing your test participants will vary depending on the type of testing being conducted, whether it’s in-person, remote, moderated, or unmoderated.
No matter the test, it’s important to provide your participants with clear and concise instructions, address any questions or concerns they may have, and ensure they understand the testing process and expectations.
If you’re conducting moderated testing, you might want to write a usability testing script ahead of time. A well-structured and thorough briefing can help your participants feel more comfortable and confident during the testing session, which can lead to more reliable and valuable feedback.
Here are some general guidelines to follow when briefing your participants.
Briefing moderated in-person usability test participants
Start with an overview: Begin by providing an overview of the testing process, including the goals of the study, the tasks you’re asking participants to perform, and the type of feedback or data you’re looking to gather.
Explain the testing process: Provide a step-by-step explanation of how the testing session will be conducted, including any equipment or tools participants will be using, and the order they’ll complete the tasks.
Clarify expectations: Clearly explain what’s expected from participants, including how they should approach the tasks and anything specific they should keep in mind while testing, like explaining their thought process out loud.
Address questions and concerns: Allow participants to ask questions and address any concerns they have about the testing process.
Briefing moderated remote usability test participants
Send a pretest briefing email: Before the session, send your participants a briefing email that includes information about the purpose of the test, what they’re expected to do, any specific instructions they need to follow, and what you’re hoping to find out.
Introduce the test: At the start of the test, reiterate the purpose of the test and what the tasks involve, clarify expectations, and answer any questions.
Address technical requirements: Make sure participants are aware of any technical requirements, such as software or hardware they need to have installed for the session.
Provide clear written instructions: Clearly outline the tasks, instructions, and expectations. The usability testing platform you’re using will likely have automated instructions that you can use or edit to suit your needs.
2. Set up the usability testing environment
During the planning stage, you’ll have decided on the type of testing environment you’re using, whether it’s an in-person or remote setting. Now it’s time to prepare the environment to ensure that everything is ready to go.
Here are some top tips for setting up your testing environment for in-person and remote settings.
You can skip this step if you’re running an unmoderated remote usability test.
Choose a quiet and comfortable space: It should be free from distractions and have minimal background noise.
Make sure you have access to the tools you need: For example, a computer or mobile device and the ability to record. Check that everything is working and that you have the test loaded onto the device you’re using.
Clear permissions ahead of time: If you need participants to sign NDAs or consent forms, share them ahead of time, or have them available on the day.
Greet your participants: For in-person sessions, arrange to have someone greet participants when they arrive, and have a colleague with you to take notes during the test.
Choose a video conferencing tool: For remote sessions, choose a reliable video conferencing tool that allows for screen sharing (and remote control of the participant’s computer, if necessary). Zoom, Microsoft Teams, and Google Meet are all popular options. Before the test, you might want to run a test call with a colleague to make sure that everything is set up correctly and that the audio and video quality are working well.
Check your internet connection: This probably goes without saying, but make sure you have a stable internet connection with enough bandwidth to support video. In case of any technical difficulties, have a backup plan. For example, have access to a different video conferencing tool or a way to contact the participant via email or phone if the video call suddenly drops.
Read more: Curious to know which video conferencing tools user researchers prefer? Check out the ReOps Toolbox Project for a list of favored moderated study tools.
3. Conduct the usability test
Your test participants have been briefed and everything is ready to roll! In this section, we share useful advice for conducting effective usability tests.
Observation and note-taking
When you run a moderated study, observation and note-taking allow you to capture valuable insights and feedback from your participants. Here are some tips for effective observation and note-taking during a usability test:
Be attentive and focused: Pay close attention to how participants interact with the product or prototype. Stay focused on their actions, behaviors, and verbal cues to capture relevant information.
Use multiple senses: Observe not only what participants do but also what they say, their facial expressions, body language, and emotions. Take note of any frustration, confusion, or satisfaction they express.
Take detailed notes: This is why you might want a dedicated note-taker to join the session, and, if possible, record the session too so you have something to refer back to. Record your observations in detail, including positive and negative feedback, specific actions taken by the participant, and any issues or usability problems identified. Use descriptive language to capture the essence of their experience. Consider using a structured approach to note-taking, such as a predefined template or form with specific categories like task completion, navigation, error rate, and feedback.
Capture quotes: Try to capture direct quotes from participants, especially when they express their thoughts, opinions, or feedback verbally. These quotes can provide valuable insights and can be used as supporting evidence in your findings.
Stay objective: Aim to be as objective as you can and avoid making assumptions or interpretations based on your own biases. Stick to the facts and avoid jumping to conclusions or making subjective judgments.
Prioritize critical issues: Identify and prioritize critical issues or usability problems that multiple participants encounter or that impact their experience. These notes should be highlighted for further analysis and action.
Collaborate with a team: If you’re working as part of a team, collaborate throughout the observation and note-taking process. Use a shared document or communication tools to work in real-time and capture multiple perspectives.
Review and summarize: After each usability test, review and summarize your notes while they’re still fresh in your mind. This can help you consolidate any findings, identify patterns or trends, and prepare for analysis and reporting.
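Once your notes are consolidated, prioritizing issues (as mentioned above) often comes down to counting how many participants hit each problem and how severe it was. Here's a minimal sketch in Python of that idea, assuming you've logged each observation as a (participant, issue, severity) record; the field names, example issues, and the 1–3 severity scale are illustrative, not a standard:

```python
from collections import defaultdict

# Each note: (participant_id, issue, severity 1-3, where 3 = blocks task completion).
# These example observations are hypothetical.
observations = [
    ("P1", "checkout button hard to find", 3),
    ("P2", "checkout button hard to find", 3),
    ("P3", "checkout button hard to find", 2),
    ("P1", "jargon in form labels", 1),
    ("P2", "search results load slowly", 2),
]

def prioritize(notes):
    """Rank issues by how many participants hit them, then by worst severity."""
    stats = defaultdict(lambda: {"participants": set(), "max_severity": 0})
    for participant, issue, severity in notes:
        stats[issue]["participants"].add(participant)
        stats[issue]["max_severity"] = max(stats[issue]["max_severity"], severity)
    return sorted(
        stats.items(),
        key=lambda kv: (len(kv[1]["participants"]), kv[1]["max_severity"]),
        reverse=True,
    )

for issue, s in prioritize(observations):
    print(f"{issue}: {len(s['participants'])} participant(s), max severity {s['max_severity']}")
```

A spreadsheet works just as well for this; the point is simply that issues seen by several participants, or that block task completion, should rise to the top of your analysis.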
Read more: Usability/UX consultant Steve Krug shares some handy downloadable resources on his website, including instructions for note-takers and test observers on what to expect.
Collecting user feedback
Gathering feedback from your participants during a usability test is important for understanding their thoughts, opinions, and experiences when interacting with your product. Here are some tips for how to collect feedback during usability testing:
Ask open-ended questions: Ask questions that encourage participants to provide detailed and descriptive feedback. Avoid leading or biased questions that can influence their responses. For example, instead of asking “Did you find the navigation confusing?”, ask “Tell me about your experience with the navigation.”
Encourage thinking aloud: Ask participants to verbalize their thoughts and feelings as they interact with the product or prototype. This can provide valuable insights into their decision-making process and uncover potential usability issues.
Use probing techniques: Use probing techniques, such as asking "why," "how," or "tell me more," to gain deeper insights and encourage participants to elaborate. This can help you uncover motivations, preferences, or pain points.
Create a non-judgmental environment: Create a non-judgmental and welcoming environment that encourages participants to share their honest feedback – both positive and negative – without feeling judged or criticized. Assure your participants that their feedback is valuable and will be used to improve the product.
Use Likert scales: Consider using Likert scales to gather feedback about specific aspects of your product or prototype, such as usability, satisfaction, or preference. This can provide quantitative data that can be analyzed for trends or patterns. This is especially useful for unmoderated testing.
Follow up on unclear feedback: If participants provide unclear feedback, follow up with clarifying questions to better understand their perspective. Avoid making assumptions or guesses about what they meant.
Summarize and confirm feedback: Summarize and confirm feedback at the end of the usability test to ensure accuracy and understanding. This can help you validate feedback and gather any additional insights.
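If you do collect Likert-scale responses, two common summary figures are the mean score and the share of "top-two-box" answers (4s and 5s on a five-point scale). Here's a minimal sketch of that calculation in Python; the question wording and response values are illustrative:

```python
from statistics import mean

# 1-5 Likert responses to a question like "The navigation was easy to use"
# (1 = strongly disagree, 5 = strongly agree); values are hypothetical.
responses = [4, 5, 3, 4, 2, 5, 4]

def summarize_likert(scores):
    """Return the mean score and the share of 4-5 ('top-two-box') answers."""
    top_two = sum(1 for s in scores if s >= 4) / len(scores)
    return round(mean(scores), 2), round(top_two, 2)

avg, top_two_box = summarize_likert(responses)
print(f"Mean: {avg}, top-two-box: {top_two_box:.0%}")
```

Most unmoderated testing platforms compute summaries like this for you; the sketch just shows what those numbers mean so you can sanity-check them against trends across sessions.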
4. Debrief and thank your usability test participants
At the end of the session, don’t forget to debrief and thank your participants for taking part in your usability test.
Highlighting areas where they provided valuable insights can help participants understand their role in the testing process and feel appreciated for their contribution. If they had any concerns or issues during the test, now’s the time to address them and answer any questions.
The debrief is also a good opportunity to ask your participants how they felt about the testing process itself. This can help you identify any areas for improvement and make changes for future usability testing. You can do this in unmoderated tests too by including an open-ended question at the end of the test asking participants about their test experience, or a Likert scale asking how easy or difficult the task was to complete.
If participants were offered incentives for their participation, make sure they receive them promptly. This can help maintain a positive relationship and show appreciation for their time and effort. And finally, remember to thank your participants. Let them know their participation was valuable and important for the development of the product.
Conducting effective usability testing
In this chapter, we’ve covered each step of the usability testing process.
To recap, this involves:
Briefing participants: Whether your usability test is in-person or remote, providing clear and concise instructions, addressing questions and concerns, and setting expectations are vital for making participants feel comfortable.
Setting up the environment: Carefully prepare your testing environment, whether in-person or remote, to minimize distractions and ensure all necessary tools are ready. For remote sessions, choose a reliable video conferencing tool and have a backup plan for technical issues.
Conducting the test: During the usability test, effective observation and note-taking are essential for capturing valuable insights. Be attentive, use multiple senses to observe participants, and prioritize critical issues for further analysis.
Collecting user feedback: Encourage participants to provide detailed feedback using open-ended questions, thinking aloud, and probing techniques. Create a non-judgmental environment to ensure honest feedback.
Debriefing and thanking participants: At the end of the session, debrief participants, highlight their valuable insights, address concerns, and gather feedback on the testing process itself. Promptly provide any promised incentives and express your gratitude for their participation.
Additionally, consider running a pilot usability test as a "dress rehearsal" to identify and address potential issues early on.
Following these steps will help you conduct effective usability tests and gather valuable insights to improve your product.