Research synthesis should be the exciting part – that moment when scattered data points suddenly connect into clear, actionable insights. But for many teams, it’s become a slog through spreadsheets, sticky notes, and endless hours of manual organization.

Data from our 2025 Research Synthesis Report, surveying 300 practitioners globally, reveals that 60.3% cite time-consuming manual work as their biggest synthesis frustration. It’s not just a minor inconvenience – it's the top barrier preventing teams from getting maximum value from their research efforts.

But here’s what’s interesting: while most teams complete their synthesis in 1–5 days (65.3%), those days are filled with manual tasks that could be streamlined. The teams that have figured this out aren’t working longer – they're working smarter.

The manual work bottleneck: What's actually eating your time

When we asked practitioners which synthesis activities take the most time, the results painted a clear picture of where the bottlenecks are.

Reading through data and responses tops the list, with 59% of practitioners reporting this as time-intensive. Close behind is organizing findings into a coherent structure (57.3%) and identifying patterns and themes (55%).

What’s revealing is the distinction between data processing and output creation. While nearly 6 in 10 researchers struggle with the initial data review, fewer find the final presentation phase burdensome. This suggests the real efficiency opportunity lies in the early analytical stages rather than in creating the final deliverable.

The consistency of these findings across our sample suggests these aren’t isolated team issues – they represent fundamental challenges in how research synthesis typically works.

Who feels the pain most

Not all roles experience manual work frustrations equally. Our data reveals some interesting patterns in who struggles most with time-consuming synthesis tasks.

Designers are disproportionately affected by manual work challenges. While they represent 40.3% of our sample, they make up 42% of those citing manual work as a frustration. This makes sense given that designers often find themselves conducting research as part of broader product development work, despite not necessarily having specialized research training or dedicated time to spend on research.

Marketing and customer insights professionals show a different synthesis pattern. A striking 35% spend more than five days on synthesis, which is higher than typical patterns. This extended timeline likely reflects the different types of data they work with, often combining user research with market analysis, customer insights, and business metrics.

Even dedicated UX researchers aren’t immune to manual work frustrations, representing 9.4% of those citing this pain point despite making up only 8.3% of our sample. This suggests that specialized training alone doesn’t solve the manual work challenge – the issue runs deeper into available tools and processes.

The ripple effects of manual-heavy workflows

When teams get bogged down in manual work, the impacts extend far beyond just taking longer to complete synthesis. Our research indicates several downstream effects.

Quality concerns emerge when teams feel rushed through manual processes. A third of practitioners (33%) worry about ensuring objectivity and reducing personal bias in their synthesis – something that no doubt becomes more acute when you’re pressed for time.

The volume challenge adds another layer of complexity. Nearly half of participants (46.3%) struggle with synthesizing large volumes of data, and manual processing likely intensifies this cognitive burden.

Perhaps most critically, 39.3% of practitioners find it difficult to translate insights into actionable recommendations. When teams exhaust their mental energy on tasks like data processing, less remains for the strategic thinking needed to transform findings into business impact.

How successful teams are streamlining manual work

Despite these challenges, some teams have found ways to work more efficiently. Three distinct approaches are emerging among practitioners who have managed to reduce manual synthesis burdens.

AI-assisted processing gains momentum

The most striking development is the rapid adoption of AI-assisted analysis, now used by 54.7% of practitioners in our sample. This isn’t unquestioning trust in automation – teams are being strategic about where they apply AI assistance.

Among AI users, 82.9% rely on it for generating summaries of key findings, effectively automating the most time-consuming manual task. Pattern identification follows at 61%, showing growing trust in AI’s ability to spot themes across large datasets.

However, teams maintain human oversight for strategic interpretation. Only 47.6% trust AI to translate insights into recommendations, suggesting a clear boundary: AI handles data processing, humans make meaning from the data.

Collaborative efficiency approaches

Team collaboration remains equally popular (55% use team debriefs), often working alongside AI assistance rather than competing with it. The most common approach is small team collaboration (41%), which allows for focused discussion without the coordination overhead of larger groups.

Many teams are adopting mixed approaches (18%) that combine individual AI-assisted analysis with team validation sessions. This hybrid model appears to offer the best of both worlds: efficiency from automation and quality assurance from human collaboration.

Strategic tool integration 

Beyond AI and collaboration, teams are being selective about their overall tool usage. Our data shows that practitioners are moving away from trying to do everything manually, but they’re not adopting tools indiscriminately either.

The focus appears to be on tools that address the specific bottlenecks we identified – data processing, organization, and pattern recognition – rather than on trying to automate the entire synthesis process.

Quick wins to reduce manual synthesis work

Based on patterns in our sample, several practical approaches to reducing the burden of manual synthesis work emerge.

Start with summary generation

Since 82.9% of AI users trust automated summaries, this represents the lowest-risk, highest-impact entry point for reducing manual work. 

Identify patterns collaboratively

Rather than struggling to identify themes on your own, run focused group sessions to quickly identify patterns across data sources.

This is something UX research leader Odette Jansen is doing – and it's having an impact. She brings stakeholders, designers, and product people together to sort through findings and make sense of them together, mapping themes, identifying opportunities, linking insights to impact, and brainstorming solutions.

Create reusable organizational templates

Since 57.3% of practitioners find organizing findings into structure time-consuming, develop standardized templates for common research types. This can help reduce repetitive organizational work.

Focus on human interpretation

Let automation and templates handle the mechanical work, freeing up your time to make sense of findings and turn them into actionable next steps.

The confidence payoff

Here's the encouraging news: despite frustrations with manual work, teams maintain strong confidence in their synthesis processes. An impressive 97% express at least moderate confidence that their process captures important insights, with 52% feeling very confident.

This suggests the issue isn’t capability – the challenge is efficiency. The frustration with manual work doesn’t stem from the inability to do good synthesis, but from the time-consuming nature of the manual tasks involved.

This points to a clear opportunity: if teams can reduce the manual burden, they can focus more of their expertise on the interpretation that creates real business value.

Looking ahead: Working smarter, not longer

When we asked practitioners what would most improve their synthesis process, the response was telling. Better tools topped the list at 31%, while more time for synthesis ranked much lower at 17%.

This reveals an important insight: teams don’t want to spend more time on synthesis – they want to spend their existing time more effectively.

The practitioners we surveyed consistently emphasize that the future lies in hybrid approaches. AI and automation handle the mechanical aspects of data processing, while humans focus on interpretation, context, and strategic application.

As one participant aptly put it: 

“I think we will need to show the value of a human doing this with the HELP of AI and not as something AI can solely do on its own.”

The path forward

Manual work will always be part of research synthesis – the human element of interpreting user needs and business context remains irreplaceable. But the mechanical aspects of data processing don’t have to dominate the process.

The teams already succeeding with streamlined synthesis aren’t working longer hours (we hope!) or cutting corners on quality. They’re being thoughtful about where they apply human expertise versus where they leverage automation and structured approaches.

Whether you’re drowning in spreadsheets or looking to optimize an already-functional process, the data indicates that small changes in how you handle manual work can create significant improvements in both efficiency and outcomes.

Ready to experience synthesis that focuses on insights, not manual work? Lyssna’s AI-powered Synthesize feature automatically generates summaries and identifies themes from your research data, letting you spend more time on strategic interpretation and less time on mechanical processing. Try it free and join the 54.7% of teams already using AI to enhance their research practice.
