17 Apr 2026 | 18 min read

Usability test script

Write a usability test script that keeps sessions consistent and reduces bias. Includes a step-by-step guide, a template, and moderated vs unmoderated examples.

A usability test script is the backbone of any effective moderated usability testing session. It keeps your research structured, ensuring consistent results, unbiased questions, and the kind of insights that move your product forward.

Whether you're running your first usability test or looking to improve your existing process, having a well-crafted script ensures every session follows the same structure. This consistency is what transforms scattered observations into actionable insights that actually improve your product.

In this guide, we'll walk you through everything you need to know about usability test scripts: what they are, why they matter, what to include, and how to write one that delivers reliable results. We've also included a ready-to-use template you can copy and adapt for your own research.

Key takeaways

  • A usability test script is a structured document that guides facilitators through each session, ensuring consistency and reducing bias across all participants.

  • Scripts are different from test plans. Scripts focus on what you say and do during sessions, while plans cover the broader research strategy.

  • Every script should include an introduction, background questions, realistic tasks, follow-up questions, and a wrap-up section.

  • Pilot testing your script with colleagues helps catch confusing phrasing or leading questions before you run real sessions.

  • Keep sessions to 30 minutes maximum so participants stay engaged and your results stay reliable.

  • Lyssna supports both moderated and unmoderated usability testing, so you can run scripted sessions and analyze results from a single platform.

What is a usability test script?

A usability test script is a structured document that outlines exactly what a facilitator should say and do during a moderated usability testing session. It includes the introduction, task instructions, questions to ask, and how to wrap up – all written in advance to ensure consistency across every participant.

Why scripts matter for consistency and unbiased testing

Scripts matter because they remove guesswork from your sessions. When you're facilitating a test, it's easy to accidentally phrase a question differently for each participant, give subtle hints, or skip important steps when you're pressed for time. A script keeps you on track and ensures every participant gets the same experience.

This consistency is crucial for gathering reliable data. If you ask one participant "How easy was that?" and another "What did you think of that process?", you're measuring different things. Scripts ensure your phrasing stays neutral and your questions produce comparable responses.

Difference between a test script and a test plan

It's worth clarifying the distinction between these two documents:

| Document | Purpose | Contents |
| --- | --- | --- |
| Test plan | Defines the overall research strategy | Research goals, participant criteria, timeline, success metrics, methodology |
| Test script | Guides the actual session | Introduction wording, task instructions, questions, moderator notes |

Think of the test plan as your "what and why" document, while the script is your "how" document. You need both, but they serve different purposes. Your usability test plan sets the strategy; your script executes it.

Run better usability tests

Plan, run, and analyze moderated and unmoderated usability tests – all in one place.

Why you need a usability test script

Creating a script might feel like extra work upfront, but it pays dividends throughout your research. According to Forrester's 2024 CX Index, even minor CX quality improvements can add tens of millions in revenue. Here's why it's worth the investment.

Reduces facilitator bias

Even experienced researchers carry cognitive biases that can surface through tone, word choice, or body language, subtly influencing participants. A script provides neutral phrasing for each phase of the test, reducing the chance of bias creeping into your questions or line of inquiry.

Keeps sessions consistent

When you're running multiple sessions over days or weeks, it's easy for your approach to drift. A script keeps every session on track, ensuring each participant receives identical instructions and questions.

Produces quality insights

Consistent sessions produce comparable data. When every participant completes the same tasks with the same instructions, you can confidently identify patterns, build a clear usability testing report, and make data-driven decisions about your product.

Makes testing repeatable and scalable

A well-documented script becomes invaluable for iterative product development. It captures institutional knowledge, making it easy to onboard new team members, repeat studies for benchmarking, or hand off research to colleagues.

Pro tip: Scripts are most useful for moderated usability testing, where a human moderator provides prompts and instructions. For unmoderated usability testing, you can adapt your script into written prompts integrated into your testing tool.

What to include in a usability test script

A comprehensive usability test script covers five key sections:

  • Introduction and consent: Set the tone and get recording permission

  • Background questions: Gather participant context

  • Tasks and scenarios: Observe real behavior on key flows

  • Follow-up questions: Understand the experience after each task

  • Wrap-up and debrief: Gather overall impressions and close

Let's break down what each should contain.

Introduction and consent

Your introduction sets the tone for the entire session. Make sure it covers the following:

  • Welcome the participant and thank them for their time

  • Explain the purpose of the session (testing the product, not them)

  • Clarify that there are no "right" or "wrong" answers

  • Request consent for recording

  • Explain how their data will be used

Here's essential language to include: "Remember that we're testing the app, not testing you. There are no 'right' or 'wrong' answers." This simple statement helps participants relax and behave more naturally.

Background questions

Before diving into tasks, gather context about your participant's experience level and relevant behaviors. These warm-up questions serve two purposes: they help you understand the participant's perspective and ease them into the session.

Here are some example background questions:

  • How familiar are you with [product category]?

  • What tools do you currently use for [relevant task]?

  • How often do you [relevant behavior]?

Keep this section brief, around five minutes maximum. You're gathering context, not conducting a full user interview.

Tasks and scenarios

This is the heart of your usability test. Select 3–5 tasks for the user to perform, sequenced logically. For example, if testing an ecommerce site: 1) sign up for an account, 2) search for a specific type of product, 3) add it to the cart, 4) complete the purchase.

When writing tasks, avoid using terms that appear directly in the user interface. Instead, give scenarios. Rather than saying "pick out a laptop with a large screen," say "You're looking for a new computer you can use to watch movies on when traveling for work."

For each task, document:

  • Success criteria: What does successful completion look like?

  • Key observations: What specific behaviors or reactions should you watch for?

  • Follow-up questions: What clarifying questions might you ask?

Our video goes into more detail on how to write usability testing tasks and scenarios.

Follow-up questions

After each task, ask usability testing questions that help you understand the participant's experience:

  • "How easy or difficult was it to complete this task?"

  • "What aspects of the interface design helped or hindered you?"

  • "Did you encounter any unexpected or confusing elements?"

  • "What would you change to make this task easier?"

These questions should be open-ended and neutral, avoiding anything that leads participants toward a particular answer.

Wrap-up and debrief

End your session by gathering overall impressions and giving participants space to share anything you might have missed:

  • "What did you find most frustrating about this experience?"

  • "How does this compare to other [similar products/websites] you've used?"

  • "If you could change one thing about this experience, what would it be?"

  • "Is there anything else you'd like to share about your experience today?"

  • "What did you find most useful or helpful?"

Pro tip: End on a positive question. Asking about what was helpful last ensures participants leave feeling good about their contribution.

How to write a usability test script (step by step)

Ready to create your own script? Follow these steps to build one that delivers reliable insights.

Step 1: Define research goals

Before writing a single word, clarify what you're trying to learn. Are you testing whether users can complete a specific flow? Comparing two design approaches? Identifying pain points in an existing product? Your goals shape every other decision.

Step 2: Choose moderated or unmoderated testing

Your testing approach affects how you write your script. Moderated usability testing allows for real-time follow-up questions and observation, while unmoderated testing requires more detailed written instructions since participants complete tasks independently. Most teams today conduct both formats as remote usability testing, which further shapes how you write your script.

Step 3: Write neutral introductions

Craft your introduction to put participants at ease without biasing their responses. Avoid language that suggests what you're hoping to find or how you expect them to perform.

Step 4: Create realistic tasks

Design tasks that reflect real user goals, not feature demonstrations. Think about what users actually try to accomplish with your product and frame tasks around those objectives.

Step 5: Avoid leading questions

Review every question for bias. Instead of "Did you find this feature helpful?", ask "How was your experience using this feature?" The first assumes helpfulness; the second invites honest feedback.

Step 6: Pilot test your script

Test your script with colleagues beforehand. A quick run-through helps catch confusing phrasing or subtle wording that could nudge participants toward a particular path, before it affects real data.

Practitioner insight: "Lyssna enables me to design the test I need, get it reviewed by peers and stakeholders, and publish to a pool of prequalified testers very quickly."
– Benjamin B., G2 review

Step 7: Refine before running sessions

Based on pilot feedback, adjust timing, clarify confusing instructions, and tighten your question wording. Then you're ready to run your first real session.

Pro tip: Keep the entire test to 30 minutes maximum. Longer sessions can cause mental fatigue that skews results.

Usability test script template

Use this template as a starting point for your moderated usability testing sessions. Customize the tasks, questions, and scenarios to fit your specific research goals. You can also grab our ready-made Notion template for moderated usability testing scripts.

Pre-session checklist

  • Recording equipment tested and ready

  • Backup recording method available

  • Participant's audio/video connection tested

  • Prototype/website loaded and functional

  • Consent forms prepared

  • Note-taking materials ready

  • Timer available

Introduction (5 minutes)

"Hi [Name], thank you so much for taking the time to speak with me today. My name is [Your Name], and I'll be guiding you through this session.

Today, we're going to look at [product/website/app]. I'm interested in understanding how people interact with it, so I'll ask you to complete a few tasks and share your thoughts as you go.

Before we start, I want to emphasize that we're testing the product, not testing you. There are no 'right' or 'wrong' answers. Your honest feedback, positive or negative, is incredibly valuable.

I'll be recording this session so I can review it later. The recording will only be used for research purposes and won't be shared publicly. Is that okay with you?

Do you have any questions before we begin?"

Warm-up questions (5 minutes)

  • "Can you tell me a little about yourself and what you do?"

  • "How familiar are you with [product category]?"

  • "What tools or websites do you currently use for [relevant task]?"

Task instructions

"Now I'm going to ask you to complete a few tasks. As you work through them, please think aloud — share what you're looking at, what you're thinking, and what you're trying to do. There's no rush, and remember, we're testing the product, not you."

Task 1: [Task Name]

"Imagine [scenario]. Starting from this page, please [specific goal]."

Moderator notes:

  • Success criteria: [Define what completion looks like]

  • Key observations: [What behaviors to watch for]

  • Follow-up questions: "What made you choose that approach?" / "What were you expecting to happen?"

Task 2: [Task Name]

"Now imagine [scenario]. Please [specific goal]."

Moderator notes:

  • Success criteria: [Define what completion looks like]

  • Key observations: [What behaviors to watch for]

  • Follow-up questions: [Relevant clarifying questions]

Task 3: [Task Name]

"For this last task, [scenario]. Please [specific goal]."

Moderator notes:

  • Success criteria: [Define what completion looks like]

  • Key observations: [What behaviors to watch for]

  • Follow-up questions: [Relevant clarifying questions]

Think-aloud prompts (use as needed)

Stay neutral and avoid leading language. Let participants discover and react naturally.

  • "What are you thinking right now?"

  • "What are you looking for?"

  • "Tell me about what you're seeing here."

  • "What would you expect to happen if you clicked that?"

  • "What would you do next?"

  • "How does this compare to what you expected?"

Wrap-up questions (10 minutes)

  • "What did you find most frustrating about this experience?"

  • "How does this compare to other [similar products] you've used?"

  • "If you could change one thing about this experience, what would it be?"

  • "What would make this easier or more helpful for someone like you?"

  • "Is there anything else you'd like to share about your experience today?"

  • "What did you find most useful or helpful?"

Closing (2 minutes)

"Thank you so much for your time and feedback today. Your insights are incredibly valuable and will help us improve the product.

[Explain next steps regarding incentive/compensation if applicable]

Do you have any final questions for me?"

Moderated vs unmoderated usability test scripts

The type of testing you're running significantly impacts how you structure your script.

| Aspect | Moderated script | Unmoderated script |
| --- | --- | --- |
| Facilitation | Researcher guides participant in real time | Participant completes tasks independently |
| Script length | Can be more flexible; follow-ups happen naturally | Must be comprehensive; no opportunity for clarification |
| Question style | Conversational; can probe deeper | Written prompts; must anticipate all scenarios |
| Task instructions | Can clarify if participant is confused | Must be crystal clear from the start |
| Best for | Complex tasks, exploratory research, prototype testing | Large sample sizes, collecting user feedback at scale, validation testing |

For unmoderated tests, your script essentially becomes a set of written instructions embedded in your testing tool. Every scenario must be self-explanatory, and you'll rely on screen recordings and written responses rather than real-time observation.
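One lightweight way to keep an unmoderated script self-contained is to treat it as structured data that you (or a simple checker) can validate before launch, so no task ships without a scenario, success criteria, or follow-ups. The sketch below is purely illustrative: the field names are hypothetical and do not correspond to any testing platform's real import format.

```python
# Hypothetical script-as-data sketch; field names are illustrative,
# not any testing platform's actual schema.
script = {
    "introduction": "We're testing the product, not you. Please think aloud.",
    "tasks": [
        {
            "name": "Find and book a hotel",
            "scenario": "You need a hotel near Times Square for three nights.",
            "success_criteria": "Participant completes a hotel booking.",
            "follow_ups": [
                "How easy or difficult was it to complete this task?",
                "What would you change to make this task easier?",
            ],
        },
    ],
}

def check_script(s):
    """Flag gaps that a live moderator would normally fill in on the fly."""
    problems = []
    if not s.get("introduction"):
        problems.append("missing introduction")
    for t in s.get("tasks", []):
        for field in ("scenario", "success_criteria", "follow_ups"):
            if not t.get(field):
                problems.append(f"task '{t.get('name', '?')}' missing {field}")
    return problems

print(check_script(script))  # [] when every task is fully specified
```

A check like this is most useful right before launch, when last-minute edits to one task can silently drop a follow-up question that every other task still has.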

Practitioner insight: "I love that Lyssna lets me be lightweight and fast. As a UX practice of one in a large enterprise, my stakeholders won't slow down for research. Lyssna lets me provide insights at speed."
– Kevin Boulier, Lead UX Strategist & Designer at ManpowerGroup

Example usability test script

Let's look at a practical example: testing the hotel booking flow for a travel app.

Context: You're a UX researcher at a travel company. After improving the flight booking experience, you want to test whether users can successfully book a hotel for their trip.

Introduction excerpt:

"Today we'll be looking at our travel booking app. I'm particularly interested in how you'd go about planning accommodation for a trip. As you work through the tasks, please share your thoughts out loud."

Task 1: Find and book a hotel

"Imagine you've just booked a flight from San Francisco to New York for a business trip next month. You need to find a hotel near Times Square for three nights. Starting from the home screen, please find and book a suitable hotel."

Success criteria: Participant successfully completes a hotel booking

Key observations:

  • Does the user find the hotel search easily?

  • How do they filter results?

  • Do they understand the pricing information?

  • Any hesitation at checkout?

Follow-up questions:

  • "What information was most important to you when choosing a hotel?"

  • "Was there anything you expected to see that wasn't there?"

  • "How confident do you feel about this booking?"

Task 2: Modify the booking

"Now imagine your trip dates have changed. You need to extend your stay by one night. Please update your reservation."

Success criteria: Participant successfully modifies the booking dates

Key observations:

  • Can they find the modification option?

  • Is the process intuitive?

  • Do they understand any price changes?

Potential insights from this test:

  • Whether the hotel search is discoverable from the main navigation

  • How users interpret filter options and pricing displays

  • Pain points in the modification flow

  • Confidence levels throughout the booking process

Common mistakes when writing usability test scripts

Even experienced researchers make these errors. Here's what to avoid:

Leading questions

Instead of: "Did you find the checkout process easy?"
Write: "How would you describe the checkout process?"

Leading questions suggest the answer you're hoping for and bias your results.

Giving hints or instructions

Instead of: "Click the menu icon in the top right to find settings."
Write: "Please find where you would change your account settings."

Your script should test whether users can discover features, not guide them to the answer.

Too many tasks

Stick to 3–5 tasks maximum. Overloading participants leads to fatigue, rushed responses, and unreliable data. Quality insights come from focused sessions, not exhaustive ones.

Unclear success criteria

Before running sessions, define what "completion" looks like for each task. Without clear criteria, you'll struggle to analyze results consistently.

Testing opinions instead of behavior

Instead of: "Do you like this design?"
Write: "Please complete [specific task]," then observe their behavior.

User testing is most valuable when you observe what people do, not just what they say they'd do.

Pro tip: After writing your script, ask a colleague to read each question and tell you what answer they think you're hoping for. If they can guess, the question is probably leading.

How to adapt usability test scripts for different stages

Your script should evolve based on where you are in the product development process.

Early concept testing

At the concept testing stage, you're testing ideas rather than finished designs. Scripts should focus on understanding user reactions to concepts and gathering feedback on direction.

  • Tasks might involve reviewing mockups or sketches

  • Questions focus on comprehension and appeal

  • Success criteria are more qualitative

Prototype testing

When testing prototypes, your script balances task completion with understanding user expectations.

  • Tasks test specific flows and interactions

  • Include questions about what users expect to happen next

  • Keep track of where the prototype's limitations shape the experience

Practitioner insight: "The ability for us to design a quick mockup, run it on Lyssna and receive feedback within an hour has helped us reach definitive design decisions much sooner than before."

– Chris Taylor, Senior UX/UI Designer at Canstar

Live website testing

Testing a live website allows for more realistic scenarios and comprehensive task flows.

  • Tasks can span multiple features and sessions

  • Include competitive comparison questions

  • Focus on efficiency and satisfaction metrics

Benchmark usability testing

This stage, often called summative usability testing, requires your script to remain consistent over time.

  • Use identical task wording across studies

  • Include standardized usability metrics like task success rate and time on task

  • Document any changes to enable accurate comparisons
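The two standardized metrics named above are straightforward to compute once you log per-participant results. Here is a minimal Python sketch, assuming a hypothetical flat export with `task`, `completed`, and `seconds` fields (not any specific tool's format); median time on task is computed over successful attempts only, a common convention since failed attempts often hit a timeout.

```python
from statistics import median

# Hypothetical benchmark export: one record per participant per task.
sessions = [
    {"task": "book_hotel", "completed": True, "seconds": 142},
    {"task": "book_hotel", "completed": True, "seconds": 98},
    {"task": "book_hotel", "completed": False, "seconds": 300},
    {"task": "modify_booking", "completed": True, "seconds": 75},
    {"task": "modify_booking", "completed": False, "seconds": 210},
]

def task_success_rate(records, task):
    """Share of participants who completed the task (0.0 to 1.0)."""
    attempts = [r for r in records if r["task"] == task]
    return sum(r["completed"] for r in attempts) / len(attempts)

def median_time_on_task(records, task):
    """Median seconds spent, counting successful attempts only."""
    times = [r["seconds"] for r in records
             if r["task"] == task and r["completed"]]
    return median(times)

print(round(task_success_rate(sessions, "book_hotel"), 2))  # 0.67
print(median_time_on_task(sessions, "book_hotel"))          # 120.0
```

Because benchmark studies are compared across time, keep the computation (not just the task wording) identical between rounds; a quiet change from mean to median time, for example, can masquerade as a usability improvement.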

How Lyssna helps teams run better usability tests

Creating a great script is just the first step. You also need the right tools to execute your research effectively.

With Lyssna, you can run both moderated and unmoderated usability tests from a single platform. Our user interviews feature lets you facilitate live sessions, capture participant insights through recordings and transcriptions, and share findings with your team, all in one place.

For unmoderated testing, Lyssna's usability testing tools let you set up tasks and questions that participants complete on their own time. You can capture task success, gather feedback through follow-up questions, and analyze results faster with built-in reporting.

Need participants? Our research panel gives you access to over 690,000 vetted participants with precise demographic targeting, so you can recruit the right users and start testing within hours, not weeks.

Practitioner insight: "With the feature additions of interviews and screeners, we've been able to reduce the number of research tools needed to support our work and are able to conduct research more efficiently."

– Jenn Wolf, Senior Director of CX at Nav

Start testing with real users

Recruit from 690,000+ vetted participants and run your first usability test today.

FAQs about usability test scripts

How long should a usability test script be?

Can I use the same script for moderated and unmoderated testing?

Should I memorize my usability test script?

How many tasks should I include in a usability test?

What's the difference between a usability test script and a discussion guide?
Diane Leyman

Senior Content Marketing Manager

Diane Leyman is the Senior Content Marketing Manager at Lyssna. She brings extensive experience in content strategy and management within the SaaS industry, along with editorial and content roles in publishing and the not-for-profit sector.

Try for free today

Join 320,000+ marketers, designers, researchers, and product leaders who use Lyssna to make data-driven decisions.

No credit card required

4.5/5 rating