How Ooma uses the Lyssna research panel to get valuable feedback for making design decisions

We speak with Kevin Dyck and Jennifer Livaudais at Ooma.

Summary

See how Ooma harnessed Lyssna's research panel to recruit quality participants and gain the quick, objective insights that shape their design decisions.

Ooma, a telecommunications company, offers a range of communications solutions for both residential and business customers via its smart, cloud-based software service platform.

With a focus on VoIP phone systems, internet security, and smart home devices, Ooma delivers reliable and affordable services tailored to small to mid-market businesses, enterprises, and home users.


Ooma’s lean, fast-moving design team is dedicated to ensuring users have a seamless and intuitive experience with all of their products and services. They work in two-week sprints to stay in sync with the development team, focusing on gathering feedback and making quick iterations.

We had the opportunity to meet with Kevin Dyck, Manager of UX & Product Design, and Jennifer Livaudais, Senior Product Designer and UX Researcher, to discuss the critical role that a user research platform can play in gathering quick results to validate design decisions.

Challenge

When discussing the challenges they faced, Kevin and Jennifer highlighted three key factors they sought in a UX research platform.

Fast, objective insights

“We're not looking for subjective opinions or thought processes. What I want is, when people see an interface, what is their gut reaction to how to solve a navigation problem?” shared Kevin. 

A top priority when evaluating testing platforms, Kevin explained, is the ability to validate design decisions with objective data: it helps settle internal debates, aids in stakeholder management, and removes subjective opinions from problem-solving.

“Fast objective data is what I want out of a tool”

Kevin Dyck

Manager of UX & Product Design at Ooma


Assess platform intuitiveness 

Another key factor that Kevin highlighted was the ability to assess their platform’s intuitiveness: “Where a user is presented with something that they've never seen before. How do they intuitively understand what that thing does and where they should click?

“And if they have to think about it for more than a couple of seconds, then we didn't do it right."

High-quality participants

Jennifer noted that the previous tool the Ooma design team used unfortunately yielded "really low-quality respondents who seemed to have low technical proficiency or understanding." 

The concern about panelists lacking basic software knowledge had her questioning the panel screening process and quality measures. This was the catalyst to start searching for a better alternative.


Solution

Having used Lyssna at a previous company, Kevin and the Ooma design team made the decision to transition away from Useberry and adopt Lyssna.

This shift allowed the team to obtain fast and objective insights, aligning with the design team’s two-week sprint schedule. Kevin and his team aim to conduct at least one test every sprint and have found that Lyssna has made it easier to meet that cadence.

“From the time that we get the [internal] request to formulating a proper test, recruiting, and running the test, the turnaround time using Lyssna is really, really quick. That's the biggest benefit to me,” shares Kevin.

“Lyssna is an easy tool for quick user tests”

Jennifer Livaudais

Senior Product Designer and UX Researcher at Ooma


In addition to the fast turnaround times, Jennifer values the high-quality participants she recruits through the Lyssna panel. The assurance that Lyssna's QA team checks and grades each response provides confidence that their results accurately reflect how the target audience feels.

We offer a quality guarantee on our research panel. If a response appears low quality, you can simply delete it and you'll receive a new response free of charge. Reported responses are closely reviewed by our QA team, and we take appropriate action with the panelist, which can include removal from the panel after a warning.

In fact, these quality measures start from the very beginning, with each Lyssna panelist undergoing a manual review upon sign-up and receiving a quality score. Panelists who receive multiple poor scores are barred from further testing, ensuring the integrity of the panel. 

Jennifer also shares that they've used most of the methodologies available in Lyssna, though they typically favor prototype and preference testing. She calls out the user-friendly functionality of the prototype test.

“With platforms like Userlytics, you have to guide the user during the prototype test. For example, 'when you're done, click next'. But with Lyssna, you can have different start and goal screens so that they [the participants] know when they've reached the end. And a lot of platforms don't do that.”

“When I have a complex or multi-part prototype, being able to customize what I want users to look at and when is really helpful.”

Jennifer Livaudais

Senior Product Designer and UX Researcher at Ooma

Results

In addition to Lyssna's user-friendly interface and rapid results, Kevin and Jennifer like the way the results are formatted. Jennifer explains that it allows them to “Go back to the team, present the results, and show what our users said across various criteria. Then, based on the results, make a recommendation.”

“We use metrics like time on task, heatmaps, and success-fail metrics. I find the heatmaps to be really interesting.”

Kevin Dyck

Manager of UX & Product Design at Ooma


The adoption of Lyssna has yielded many insights for Ooma's team. One example that Kevin shared involved settling an internal debate within the team regarding the choice of colors for their user interface. 

Initially, they considered sticking closely to their original brand colors, but decided to test alternative options. Kevin adds, “We were brainstorming a bunch of different ideas and the option I thought would come in third place won by a fantastic margin.”


“It was preferred more than all of the other color options combined, which was really, really illuminating.

“We would have never been able to make the case for that particular design choice without the results we got from Lyssna.”

Try for free today

Join the 320,000+ marketers, designers, researchers, and product leaders who use Lyssna to make data-driven decisions.

No credit card required