17 Apr 2026 · 8 min read
Growth UXR: What it means and why it matters
Zoe Glas, UX research manager at Google, shares her perspective on what it takes to do research that the business actually listens to.

There's a version of UX research that feels rigorous, thorough, and completely invisible to the people who matter. Researchers gather the data, write up the findings, present the deck – and then watch as nothing changes. The work was good. So why didn't it land?
Zoe Glas has spent a lot of time thinking about this gap. As a UX research manager at Google overseeing a team of privacy and security researchers, her work is as technically complex as it gets. But the challenge she keeps coming back to isn't complexity – it's connection. And it's the challenge that growth UXR is designed to solve.
"The number one thing that will always protect researchers is if they make money," she says. The clearer you can draw a line between your research and the decisions that drive the business forward, the more irreplaceable you become.
Read on for more, or watch the full conversation with Zoe below.
What is growth UXR?
Growth UXR is the practice of directly connecting UX research to business outcomes – driving growth for the business by tying research work to what gets built, what gets launched, and what succeeds.
It's not a different kind of research so much as a different orientation. The methods might look the same. What changes is how you position the work, how you communicate it, and how clearly you can connect it to a business decision someone actually made.
In Zoe's words, growth is about "how do we not only conduct the right research to get to those answers, but how do we position ourselves and how do we discuss what we're doing such that those answers are adopted" – so that stakeholders point back to your team and say that was the research that made the difference.
This matters more now than it ever has. As AI takes on more of the mechanical work of research – transcription, recruiting, note synthesis – the parts that can't be automated are business acumen and user psychology. Growth UXR is how researchers make themselves central to both.
Adapting your communication to your audience
One of the most practical shifts Zoe describes is learning to adapt how you communicate research depending on who's in the room. The question isn't what they need to know – it's what decisions they need to make.
For a VP or senior stakeholder, the conversation should be about strategy at a high level: does this direction make sense, and what would need to be true for it to work? "You should be recommending specifically what that outcome is," she says. "Not the data says this and therefore, but 'we should do this.' Then a little bit of the data is included there."
That's a very different conversation from the one you'd have with a product manager or designer who works with you every day. With them, you can and should go deep – into edge cases, risks, and the nuances of what the research revealed. Zoe's example: the fact that a download button turns blue when it shouldn't might matter a great deal to your PM and have real implications for how users experience the product. Your VP doesn't need that level of detail, and presenting it to them doesn't serve the decision they're trying to make.
This isn't about simplifying your work – it's about meeting people at the level of decision they're actually responsible for.
The difference between data and findings in UX research
One of the most common mistakes Zoe sees – especially from researchers who are earlier in their careers – is presenting data when what the stakeholder needed was findings.
"Data is something that you collect," she says. "So and so said this. This is where it's going." Analysis gets you one step further: what does this mean for this user? But findings are something else – the synthesis of what that evidence means for the direction the team is heading, where the risks are, and what needs to happen next.
"The second that I see in a deck, six out of 12 people said this, I'm like, this deck is worthless. You've given me data. You haven't made yourself part of that strategic conversation."
When research doesn't get acted on, it's rarely because stakeholders are dismissive. It's usually because they didn't know how to take the findings and build them into the conversations already happening. That's not their failure – it's a communication gap the researcher can close.
Minimum viable research: Adapting to a faster world
AI isn't just changing what researchers do – it's changing the pace at which the teams around them work. Designers can prototype overnight. Engineers can ship faster. If research is still running on a traditional, drawn-out process, it becomes a bottleneck rather than an asset.
Zoe's answer is minimum viable research: returning to hypothesis testing and asking, what's the smallest amount of research needed to validate whether something works? "What are we actually trying to accomplish and what is the minimum viable amount of research I need to see if it successfully addresses that. And then only do that piece."
This doesn't mean foundational research disappears. If anything, Zoe argues it becomes more important – particularly at the start and end of a project. But in the middle ground, where teams are iterating quickly, a slow and traditional research process can get in the way of exactly the business impact growth UXR is trying to create.
What researchers often get wrong about empathy
One of the sharpest things Zoe says is also one of the most counterintuitive: empathy, as it's typically framed in research, has been oversold.
Not because it isn't valuable – it is, as a tool for communicating and landing findings. But when researchers are told that empathy is their job, or that they're responsible for how others feel about users, it sets them up to fail. "It is not fair to be responsible for someone else's emotions," she says.
There's a practical problem too. When researchers lean on empathy without understanding the business context, they can end up inadvertently dismissing the role of everyone else on the team. The PM who launched a feature you didn't like isn't unempathetic – they had different metrics, different pressures, and a different set of tradeoffs. Empathy without that context doesn't land as insight. It lands as friction.
The takeaway is to take responsibility for what's reasonably within your control – and to make sure the strategic case for your findings is strong enough to stand on its own.
What skills will matter most for UX researchers as AI takes over?
The picture Zoe paints of the near future is one where AI handles more of the work researchers used to spend most of their time on. Transcription, recruiting, note-taking – all of it getting faster, more automated, more assisted.
What that leaves is time. And the question is what you do with it.
"If you're not spending 10 hours transcribing your notes, if you're not spending 20 hours handling your recruiting, that is time that you now have." She's not worried about the profession shrinking as a result. She's more interested in what researchers do with the space that opens up – whether they use it to get closer to the business, to develop sharper judgment, and to make the human parts of the work count more.
Business acumen. User psychology. Strategic communication. Those are the skills that won't be automated. And they're the skills that growth UXR is built on.

Diane Leyman
Senior Content Marketing Manager
Diane Leyman is the Senior Content Marketing Manager at Lyssna. She brings extensive experience in content strategy and management within the SaaS industry, along with editorial and content roles in publishing and the not-for-profit sector.