26 Mar 2026 | 21 min read

Google HEART Framework

The Google HEART framework helps teams measure UX using happiness, engagement, adoption, retention, and task success. Learn how it works and how to apply it.


The Google HEART framework offers UX teams a structured approach to measuring what actually matters about user experience – moving beyond vanity metrics to capture the full picture of how people interact with your product. Developed by researchers at Google, this framework has become a trusted method for teams who want to connect user experience improvements to measurable business outcomes.

If you've ever struggled to prove the value of UX work to stakeholders, or found yourself drowning in analytics data without clear direction, HEART provides a practical solution. It helps you focus on the metrics that reveal whether users are genuinely satisfied, engaged, and successful with your product.

This guide covers what the HEART framework is, why it matters, and how to implement it effectively with your team.

Key takeaways

  • The Google HEART framework measures user experience across five categories: happiness, engagement, adoption, retention, and task success. Together, they give UX teams a complete picture of how users interact with a product.

  • The Goals-Signals-Metrics (GSM) model connects each HEART category to specific product goals, observable signals, and measurable metrics, keeping your research focused and actionable.

  • You don't need to track all five categories at once. Start with the two or three most relevant to your current product goals and expand as your measurement practice matures.

  • HEART works best when you combine quantitative data (analytics, surveys, task completion rates) with qualitative research (user interviews, usability testing, open-ended feedback) to understand both what's happening and why.

  • When metrics conflict, return to your goals. Qualitative research helps you understand tradeoffs from the user's perspective and decide which outcome to prioritize.

  • Connecting HEART metrics to business outcomes like reduced support costs, increased retention, or higher customer lifetime value makes it easier to get stakeholder buy-in and justify UX investment.

  • User research platforms like Lyssna bring together the research methods you need to measure HEART metrics in one platform, from surveys and usability tests to user interviews, helping teams gather actionable insights quickly and iterate faster.


What is the Google HEART framework?

The HEART framework is a user-centered metrics system developed by Kerry Rodden, Hilary Hutchinson, and Xin Fu at Google. It emerged from a practical challenge: how do you measure user experience quality at scale in a way that's meaningful and actionable?

Before HEART, many teams relied heavily on business metrics like conversion rates or page views. While these numbers matter, they tell you little about the actual user experience. A high conversion rate doesn't reveal whether users found the process frustrating or delightful – it just tells you they completed it.

The framework organizes user experience metrics into five distinct categories:

| Category | What it measures | Example question |
| --- | --- | --- |
| Happiness | User satisfaction and sentiment | How satisfied are users with this feature? |
| Engagement | Depth and frequency of interaction | How often do users return to use this feature? |
| Adoption | New user uptake | How many users are trying this feature for the first time? |
| Retention | Continued use over time | Are users still using this feature after 30 days? |
| Task success | Efficiency and effectiveness | Can users complete key tasks without errors? |

What sets HEART apart is its holistic view of user experience. It balances user satisfaction and behavioral metrics instead of focusing solely on business outcomes or specific design elements. This makes it particularly valuable for teams who want to understand not just whether users convert, but whether they're having a good experience along the way.

The framework remains highly relevant today. Forrester's 2025 CX Index found that only 6% of brands globally improved their CX quality, a sign that the fundamental challenge hasn't changed: teams still need to demonstrate the value of UX work, prioritize improvements based on evidence, and track whether changes actually help users. HEART provides a common language for these conversations.

Why the HEART framework matters for UX teams

The HEART framework addresses three core challenges that UX teams consistently face: proving impact, focusing measurement, and making evidence-based decisions.

Aligning UX with business outcomes

One of the biggest challenges UX teams face is connecting their work to outcomes that stakeholders care about. When you can show that improving task success rates led to reduced support tickets, or that higher happiness scores correlate with increased retention, you're speaking a language that resonates across the organization. McKinsey research shows that CX-focused companies achieve more than double the revenue growth of their peers.

HEART helps bridge this gap by providing metrics that matter to both users and the business. As Kerry Rodden, co-creator of the HEART framework, points out, "Metrics are not useful unless they are aligned closely with the team's high-level goals." For each metric, ask yourself: "What would success look like, and how will I know if I've achieved it?"

This clarity keeps your experience research focused and makes it easier to communicate results to stakeholders.

Measuring what truly matters to users

Traditional analytics can tell you that users clicked a button or visited a page, but they don't reveal whether users felt confident, frustrated, or confused during the process. HEART encourages you to measure the things that actually indicate experience quality.

For example, a product team tracking only page views might see healthy traffic numbers while missing that users are struggling to complete key tasks. By adding happiness and task success metrics – through satisfaction surveys and usability testing – the team can uncover friction points that raw analytics would never reveal.

Supporting data-driven design decisions

When you have clear metrics tied to specific goals, design decisions become less about opinions and more about evidence. Instead of debating whether a new onboarding flow is "better," you can measure whether it improves adoption rates and task success.

This doesn't mean ignoring intuition or qualitative insights – quite the opposite. HEART works best when you combine quantitative and qualitative research to understand both what's happening and why.


The five HEART metrics explained

Each HEART category captures a different dimension of user experience. Here's what they measure, how to track them, and what they look like in practice.

Happiness

Happiness measures user satisfaction, sentiment, and perception of your product. It's the most subjective of the HEART categories, capturing how users feel about their experience rather than just what they do.

What happiness reveals:

  • Overall satisfaction with your product or feature

  • Perceived ease of use

  • Likelihood to recommend to others

  • Emotional response to the experience

Common happiness metrics:

  • Net Promoter Score (NPS)

  • Customer Satisfaction Score (CSAT)

  • System Usability Scale (SUS) scores

  • Sentiment analysis from open-ended feedback

Example: A SaaS company might track NPS after users complete their first project, asking "How likely are you to recommend this tool to a colleague?" A score increase from 32 to 45 after a redesign indicates improved user happiness.
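The arithmetic behind an NPS score is simple enough to sketch. This hypothetical helper (an illustration, not part of any survey tool mentioned here) scores a list of 0–10 "likelihood to recommend" responses:

```python
def nps(scores):
    """Compute Net Promoter Score from 0-10 ratings.

    Promoters score 9-10, detractors 0-6, passives 7-8.
    NPS = % promoters minus % detractors (range -100 to +100).
    """
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters, 3 passives, 2 detractors out of 10 responses
print(nps([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))  # -> 30
```

Because passives count toward the denominator but not the numerator, two products with identical average ratings can have very different NPS.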


Pro tip: Go beyond the numbers. Combine quantitative happiness scores with qualitative data from user interviews to build a deeper understanding of how your users feel.

Engagement

Engagement measures the frequency and depth of user interaction with your product. It goes beyond simple usage counts to understand how meaningfully users are interacting with what you've built.

What engagement reveals:

  • How often users return to your product

  • How deeply they explore features

  • Which features drive the most interaction

  • Whether users are getting value from their time

Common engagement metrics:

  • Session duration

  • Feature usage frequency

  • Actions per session

  • Return visit rate

Example: A productivity app might track how many tasks users create per week. If users who create 5+ tasks in their first week have 3x higher retention, that engagement signal becomes a key indicator of product-market fit.

Adoption

Product adoption measures how many users start using a feature or product. It's particularly important when launching new functionality or trying to grow your user base.

What adoption reveals:

  • Whether users discover new features

  • How effectively you're communicating value

  • Barriers to initial use

  • Success of onboarding experiences

Common adoption metrics:

  • New user signups

  • Feature activation rate

  • Percentage of users who complete onboarding

  • Time to first key action

Example: After launching a new collaboration feature, a SaaS team tracks feature activation rate and discovers only 15% of users have tried it. Usability testing reveals the feature is buried in a submenu. Moving it to the main navigation increases adoption to 40% within two weeks.

Retention

Retention measures whether users continue to return over time. It's one of the strongest indicators of product value. Research by Bain & Company found that increasing retention by just 5% can boost profits by 25% to 95%.

What retention reveals:

  • Long-term product value

  • Whether initial engagement translates to habit formation

  • Churn risk indicators

  • Impact of product changes on loyalty

Common retention metrics:

  • Day 7, Day 30, Day 90 retention rates

  • Monthly active users (MAU)

  • Churn rate

  • Customer lifetime value

Example: A mobile app might track what percentage of users who signed up in January are still active in April. If retention drops significantly after a major update, that's a signal to investigate what changed in the user experience.
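A cohort check like the one described above can be sketched in a few lines. This is a minimal illustration with made-up data structures, not a production analytics query:

```python
from datetime import date, timedelta

def day_n_retention(signups, activity, n, window=3):
    """Fraction of a signup cohort still active around day N.

    signups:  {user_id: signup_date}
    activity: {user_id: set of dates the user was active}
    A user counts as retained if active within +/- `window` days of
    day N after signup (a common smoothing choice; exact-day
    retention is stricter).
    """
    retained = 0
    for user, signed_up in signups.items():
        target = signed_up + timedelta(days=n)
        days = activity.get(user, set())
        if any(abs((d - target).days) <= window for d in days):
            retained += 1
    return retained / len(signups)

signups = {"ana": date(2026, 1, 1), "ben": date(2026, 1, 1)}
activity = {"ana": {date(2026, 1, 30)}, "ben": {date(2026, 1, 5)}}
print(day_n_retention(signups, activity, 30))  # 0.5: only ana is active near day 30
```

Comparing this number across monthly cohorts is what reveals whether a major update helped or hurt loyalty.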

Task success

Task success measures how effectively users complete key tasks within your product. It's the most directly actionable category, revealing specific usability issues that can be addressed through design improvements.

What task success reveals:

  • Whether users can accomplish their goals

  • Where friction exists in key workflows

  • Efficiency of task completion

  • Error rates and recovery patterns

Common task success metrics:

  • Task completion rate

  • Time on task

  • Error rate

  • Number of steps to completion

Example: An ecommerce team measures checkout completion rate and finds 30% of users abandon at the payment step. After simplifying the form from five fields to three, completion rate improves by 18% and error rates drop significantly.
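The checkout metrics in this example reduce to a few aggregates. A minimal sketch, assuming each usability-test session is recorded as a completed flag, a duration, and an error count (the field names are illustrative):

```python
def task_success_summary(sessions):
    """Aggregate task-based test results into core task success metrics."""
    n = len(sessions)
    done = [s for s in sessions if s["completed"]]
    return {
        "completion_rate": len(done) / n,
        # Time on task is usually reported for completed attempts only.
        "mean_time_on_task": sum(s["seconds"] for s in done) / len(done) if done else None,
        "errors_per_session": sum(s["errors"] for s in sessions) / n,
    }

sessions = [
    {"completed": True, "seconds": 62, "errors": 0},
    {"completed": True, "seconds": 88, "errors": 1},
    {"completed": False, "seconds": 120, "errors": 3},
]
summary = task_success_summary(sessions)
print(summary["completion_rate"])  # 2 of 3 sessions completed
```

Tracking these three numbers before and after a design change is what turns "the new form feels better" into evidence.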

| Category | Key metrics | How to measure |
| --- | --- | --- |
| Happiness | NPS, CSAT, SUS scores | Surveys, open-ended feedback |
| Engagement | Session duration, feature usage frequency, actions per session | Product analytics, behavioral data |
| Adoption | Feature activation rate, onboarding completion, time to first action | Product analytics, onboarding funnels |
| Retention | Day 7/30/90 retention rates, churn rate, customer lifetime value | Cohort analysis, product analytics |
| Task success | Task completion rate, time on task, error rate | Usability testing, task-based testing |


HEART framework metrics vs traditional UX metrics

Understanding how HEART differs from other measurement approaches helps you choose the right metrics for your situation.

HEART vs conversion-only metrics

Conversion metrics tell you whether users completed a desired action, but they don't reveal the quality of the experience. A user might convert despite a frustrating process, or abandon a well-designed flow due to external factors.

| Conversion metrics | HEART approach |
| --- | --- |
| Focus on end outcomes | Measures experience quality throughout |
| Binary (converted or didn't) | Captures degrees of success |
| Business-centric | User-centric with business alignment |
| Doesn't explain why | Combines with qualitative insights |

HEART vs analytics-only KPIs

Pure analytics data shows you what users do, but not how they feel about it. Page views, click rates, and session duration are useful signals, but they need context to be meaningful.

HEART encourages you to pair behavioral data with attitudinal measures. High engagement numbers mean little if happiness scores are low. Users might be spending time on your product because they're confused, not because they're delighted.

Qualitative vs quantitative measurement

The most effective HEART implementations combine both approaches:

  • Quantitative data (analytics, surveys with rating scales) tells you what's happening at scale

  • Qualitative data (interviews, open-ended feedback, usability testing) tells you why it's happening


Pro tip: Your HEART focus should evolve with your product. A product in its growth phase might prioritize adoption and engagement, while a mature product benefits from focusing on retention and task success. Revisit your metrics regularly to stay aligned with what matters most.


How to apply the Google HEART framework (step by step)

The HEART framework works best when paired with the Goals-Signals-Metrics (GSM) model. This structured approach ensures your metrics connect directly to what you're trying to achieve.

Step 1: Define product goals

Start by clarifying what success looks like for your product or feature. Goals should be specific and tied to user outcomes, not just business metrics.

Good goal examples:

  • "Users should be able to complete their first project within 10 minutes of signing up"

  • "Users should feel confident navigating the dashboard without help documentation"

  • "New users should understand the core value proposition during onboarding"

Questions to ask:

  • What user behavior indicates success?

  • What experience do we want users to have?

  • What problems are we trying to solve?

Step 2: Map goals to HEART metrics

Once you have clear goals, identify which HEART categories are most relevant. You don't need to measure all five. Focus on what matters most for your current objectives.

| Goal | HEART categories |
| --- | --- |
| Improve onboarding satisfaction | Happiness + task success |
| Increase feature discovery | Adoption + engagement |
| Reduce user churn | Retention + happiness |

Step 3: Select signals and metrics

For each HEART category you're tracking, identify:

  • Signals: Observable behaviors or attitudes that indicate progress toward your goal

  • Metrics: Specific, measurable ways to quantify those signals

Example for task success:

  • Goal: Users complete checkout without errors

  • Signal: Users successfully place orders on first attempt

  • Metric: Checkout completion rate, error rate per session
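One lightweight way to keep a GSM plan honest is to write it down as structured data the team can review together. A sketch, using the checkout example above (the structure and field names are illustrative, not part of the original GSM model):

```python
from dataclasses import dataclass

@dataclass
class GSMEntry:
    category: str  # HEART category this entry belongs to
    goal: str      # what success looks like
    signal: str    # observable behavior or attitude
    metric: str    # how the signal is quantified

plan = [
    GSMEntry(
        category="Task success",
        goal="Users complete checkout without errors",
        signal="Users successfully place orders on first attempt",
        metric="Checkout completion rate; error rate per session",
    ),
]

# Every metric should trace back to a goal and a signal;
# orphaned metrics are exactly the noise HEART tries to avoid.
assert all(e.goal and e.signal and e.metric for e in plan)
```

Reviewing a table like this each quarter makes it obvious when a metric no longer maps to any current goal.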

Step 4: Collect qualitative and quantitative data

Gather data from multiple sources to build a complete picture:

Quantitative sources:

  • Product analytics (Google Analytics, Mixpanel)

  • In-app surveys with rating scales

  • A/B test results

  • Support ticket volume

Qualitative sources:

  • User interviews

  • Usability testing sessions

  • Open-ended survey responses

  • Customer support conversations

Tools like Lyssna can help you gather both quantitative and qualitative data in one place, from satisfaction surveys and usability tests to user interviews with participants from a research panel.


Practitioner insight: "A full-blown research project can take a lot of time and energy, but you can have meaningful early results from Lyssna in a single day. I think that's one of the best benefits I've seen: faster and better iteration."
– Alan Dennis, Product Design Manager at YNAB

Step 5: Analyze results and iterate

Review your metrics regularly and look for patterns:

  • Which metrics are improving? Which are declining?

  • Do quantitative trends match qualitative feedback?

  • Are there unexpected correlations between categories?

Iteration approaches:

  • Small-scale tests: Roll out changes to a subset of users to see how they perform before a wider release

  • Continuous monitoring: Keep tracking the same metrics to measure the long-term impact of your changes


Pro tip: Assign a team member to own specific metrics. This ensures someone is always tracking progress and initiating discussions around improvements.


Examples of the HEART framework in practice

These examples show how different teams have applied HEART metrics to solve real product challenges.

SaaS onboarding optimization

Situation: A project management tool noticed high signup rates but low activation. Users weren't creating their first project.

HEART focus: Adoption + task success

Approach:

  • Measured time to first project creation (task success)

  • Tracked percentage of users completing onboarding steps (adoption)

  • Surveyed users who abandoned onboarding (happiness)

Outcome: Discovered users were overwhelmed by options. Simplified onboarding to focus on one key action, increasing activation by 35%.

Feature rollout measurement

Situation: A mobile app launched a new collaboration feature and needed to understand whether users were finding and using it.

HEART focus: Adoption + engagement

The team tracked feature discovery rate and measured collaboration actions per user, then conducted interviews with early adopters to understand their experience. They found that users loved the feature but couldn't find it. Adding contextual prompts doubled adoption within two weeks.

This kind of rapid testing and iteration is exactly where unmoderated research shines. User research platforms like Lyssna let you run preference tests or first click tests to validate feature placement before committing to a full release.


Practitioner insight: "The ability for us to design a quick mockup, run it on Lyssna and receive feedback within an hour has helped us reach definitive design decisions much sooner than before."
– Chris Taylor, Senior UX/UI Designer at Canstar

Website redesign validation

Situation: An ecommerce site redesigned their product pages and needed to validate the changes.

HEART focus: Task success + happiness

Approach:

  • Compared task completion rates before and after (task success)

  • Measured time to find product information (task success)

  • Collected satisfaction ratings via post-purchase surveys (happiness)

Outcome: Task success improved, but happiness scores dropped slightly. Qualitative research revealed users missed a feature that was removed. The team restored it with an improved design.

This example highlights one of HEART's biggest strengths: surfacing conflicts between metrics. Without tracking both task success and happiness, the team might have considered the redesign a straightforward win.

Product usability improvements

Situation: A healthcare app received complaints about confusing navigation, and the team noticed a steady decline in monthly active users.

HEART focus: Task success + retention

The team ran usability tests to identify navigation pain points and measured error rates on key workflows. They also tracked 30-day retention rates to understand whether navigation issues were driving users away. The results confirmed a clear link: users who encountered errors on core tasks were 3x more likely to churn within the first month.

After addressing three major navigation issues, error rates dropped 60% and 30-day retention improved by 15%.


Common mistakes when using the HEART framework

The HEART framework is flexible by design, but there are a few common pitfalls to watch out for when putting it into practice.

Measuring too many metrics

It's tempting to track everything, but too many metrics create noise and make it harder to identify what matters. Start with two to three HEART categories most relevant to your current goals, then expand as needed.

Focus on the metrics that directly connect to your product goals. If you're launching a new feature, prioritize adoption and task success. If you're working to reduce churn, focus on retention and happiness.

Ignoring qualitative feedback

Numbers alone don't tell the full story. If your happiness scores are dropping because of a single frustrating feature, start by addressing that specific issue, then revisit the bigger picture.

Qualitative research helps you understand the "why" behind your metrics. A drop in engagement could mean users are bored, confused, or simply busy. Only by talking to them will you find out which.

Treating HEART as analytics-only

HEART is most powerful when it drives conversations about user experience and informs design decisions, not when it lives on a dashboard that nobody acts on.

Signs your HEART implementation needs attention:

  • Metrics are reviewed but rarely discussed

  • There's no connection between metrics and design changes

  • Qualitative research isn't part of the process

  • Stakeholders see reports but don't act on them

Not aligning with business goals

HEART metrics should connect to outcomes your organization cares about. If you can't explain why a metric matters for the business, stakeholders are unlikely to prioritize acting on it.

For each metric, articulate the business impact. "Improving task success reduces support tickets" or "higher retention increases customer lifetime value" makes the connection clear.

How UX research supports HEART metrics

As the original HEART research paper notes, the framework was designed to "complement, not replace, existing user experience research methods." In other words, HEART metrics are only as strong as the research feeding them. Here's how to match your research approach to the metrics that matter most.

| Research method | HEART categories | Example metrics |
| --- | --- | --- |
| User interviews | Happiness, retention, engagement | Satisfaction drivers, churn reasons, motivation patterns |
| Usability testing | Task success | Task completion observations, friction points, error recovery |
| Surveys | Happiness, engagement, adoption | NPS, CSAT, satisfaction ratings, self-reported usage |
| First click testing | Task success, adoption | Feature discoverability, navigation success rate |
| Task-based testing | Task success | Completion rate, time on task, error rate |

User interviews

User interviews provide deep qualitative insights into user satisfaction, frustrations, and motivations. They're particularly valuable for understanding happiness and identifying the reasons behind engagement or retention patterns.

Best for: happiness, understanding context behind any metric

Usability testing

Usability testing directly measures task success by observing users attempting to complete specific tasks. It reveals where users encounter friction, where they succeed, and what causes them to abandon a workflow.

Best for: task success, identifying usability issues affecting other metrics

Surveys

Surveys scale your ability to measure happiness through satisfaction ratings, NPS, and open-ended feedback. They can also capture self-reported engagement and adoption data.

Best for: happiness, scaled feedback on any category

First click testing

First click testing reveals whether users can find what they're looking for, directly supporting task success measurement. It's particularly useful for evaluating navigation and information architecture.

Best for: task success, adoption (feature discovery)

Task-based testing

Structured task-based tests measure completion rates, time on task, and error rates – the core usability metrics of task success. They provide quantitative data that complements qualitative observations from usability testing sessions.

Best for: task success, validating design improvements


Practitioner insight: "Lyssna is an excellent unmoderated, quantitative research tool and is building strong capabilities to support qualitative research as well with a quality participant panel at an affordable cost."
– Jenn Wolf, Senior Director of CX at Nav

How Lyssna helps teams measure HEART metrics

Lyssna brings together the research methods you need to measure HEART metrics in one platform – from surveys and usability tests to user interviews and participant recruitment.

Collect happiness data with surveys

Use Lyssna's survey tools to gather satisfaction ratings, NPS scores, and open-ended feedback. Target specific user segments through the Lyssna research panel to understand how happiness varies across your user base.

Measure task success with usability tests

Run unmoderated usability tests to measure task completion rates, time on task, and error rates. With Lyssna's research panel, you can recruit participants and get results quickly – often within hours – to inform design iterations.

Understand engagement through user interviews

Combine quantitative engagement data from your analytics tools with qualitative insights from user interviews on Lyssna to understand what drives users to engage with specific features.

Track adoption with first click and preference tests

Use first click testing and preference tests to evaluate whether users can discover new features and understand their value. These lightweight tests are ideal for validating feature placement and onboarding flows before a full rollout.

Support continuous measurement

Because Lyssna supports both quantitative and qualitative research methods, you can track HEART metrics like happiness, task success, and retention over time. Run regular pulse surveys, schedule recurring usability tests, and use the research panel to maintain a steady stream of user feedback.


Practitioner insight: "We used to spend days collecting the data we can now get in an hour with Lyssna. We're able to get a sneak preview of our campaigns' performance before they even go live."
– Aaron Shishler, Copywriter Team Lead at monday.com

FAQs about the Google HEART framework

Do I need to measure all five HEART categories?

How often should I review HEART metrics?

Can HEART work for small teams without dedicated researchers?

How do I get stakeholder buy-in for HEART metrics?

What's the difference between HEART and other UX frameworks like AARRR?

How do I handle conflicting metrics?

Start measuring what matters

Pete Martin

Content writer

Pete Martin is a content writer for a host of B2B SaaS companies, as well as being a contributing writer for Scalerrs, a SaaS SEO agency. Away from the keyboard, he’s an avid reader (history, psychology, biography, and fiction), and a long-suffering Newcastle United fan.
