[Image: tree roots, representing the root causes of data quality issues in the market research industry]

Not all panels are created equal. Are we dealing with the symptoms of poor data quality in the industry rather than its root causes?

Chris Atkins, Managing Director

As I celebrate my 10-year anniversary at Yonder Data Solutions, I’ve been reflecting on my journey from my early days in business development to my current role as Managing Director, and on the evolution of the market research industry over the past decade. When I first joined, I was somewhat disillusioned by the online research and panel landscape. Yet, I firmly believed that online research could still be done right – that quality data could be delivered by focusing on the respondent experience and maintaining a commitment to excellence in both talent and service. From day one, Yonder Data Solutions has embodied this ethos and an unwavering dedication to data quality, and it continues to drive our success today.

Challenging today’s focus

As I look across the research industry today, data quality has never been higher on the agenda. Tales of unreliable data and fraudulent responses are all too common, pushing many in the industry to invest heavily in sophisticated data cleaning technologies and manual review processes. Whilst these tools and processes are absolutely necessary (and ones we employ ourselves), I can’t help but question if the research industry is focusing too much on managing the symptoms of poor data quality rather than addressing its root causes. The real issue lies deeper: the industry’s neglect of the participant and the survey experience, and the subsequent impact on data integrity.

Y Live: Delivering excellence from within

We believe that not all panels are created equal. That’s why we do things differently with our UK panel, Y Live. We place our panel at the heart of our business and take great pride in fostering a panel that truly feels valued and engaged. For us, first-class data begins with respect for respondents. We ensure that every panel member is compensated fairly – above minimum wage – for completing our surveys, which are designed to be both meaningful and considerate of participants’ time and opinions. Our transparent monetary incentives, based on survey duration, provide panellists with clarity on exactly how they are rewarded. And unlike anonymous vouchers or reward cards, we pay our respondents via bank transfer or cheques, verifying that our panel is made up of real people based in the UK.

We know our panel approach works. In a recent UK survey we carried out amongst 18-30 year olds, we found a considerably lower rate of quality removals across our automated and manual quality checks, compared to samples from our trusted partners on the same study.

Panel              % quality removals
Y Live panel       6%
Panel partners     32%

Client and respondent feedback as a testament to quality

One of the strongest indicators of our panel’s quality is the consistent positive feedback we receive from clients regarding the engagement levels of our panel members, particularly in their responses to open-ended questions. While we use AI technology to probe and enrich the data from open-ended questions within our surveys, technology alone isn’t enough. If you’re working with an already disengaged panel, no amount of AI probing will improve the quality of the responses.

"We received a lot of positive feedback, and I want to highlight how valuable the open-ended contributions were in adding depth and clarity to the rest of the data. There was also a logical progression in responses as people moved through brand funnels, validated hypotheses, and revealed clear and consistent patterns between subgroups, giving us a greater degree of confidence in the accuracy of what we were working with."

Simon Thompson, Managing Director, Relish

Our members genuinely enjoy being part of our panel, which is reflected in our high retention and engagement rates – our recontact rates can be as high as 84%! We’re also rated ‘Excellent’ on Trustpilot, and our members’ feedback speaks for itself:

"The surveys are interesting and varied. I find they come at a reasonable frequency, so you are not inundated with them. There is, of course, the added bonus of getting paid for your responses."

Patricia, Y Live member

Beyond technology: Revisiting foundations

The allure of new technologies can sometimes overshadow the fundamental principles of good research. While the latest technological breakthroughs often make headlines, we don’t always give enough attention to crucial elements like high-quality questionnaire design or minimising biases. At Yonder, our investment extends beyond technology. We invest heavily in our skilled team of data collection experts who understand that the best results stem from applying the best practices with rigour and robustness.

Today’s app-based and online tools often overlook important aspects of research, such as representative samples or the inclusion of unheard voices. For example, we’ve found that many of our older respondents prefer to receive payment via cheque rather than bank transfer. So, as an industry, are we fully considering the potential exclusion of certain demographics with app-only panels?

Examining the ecosystem: A call for simplification and transparency

Much of today’s sample ecosystem seems to be shrouded in secrecy, with unclear accountability, ownership, and transparency around the panels being used. The proliferation of routers, apps, and algorithms means that if respondents don’t qualify for one survey, they’re quickly redirected to another. While this may sound efficient, the actual experience is far from ideal. Respondents are often bounced around, filling out multiple sets of screening and demographic questions before they even get the chance to participate in a survey. With rewards as low as 200 points (equivalent to £0.25) for 15 to 20 mins of their time, it’s hardly surprising that respondents aren’t fully engaged by the time they’re completing an actual survey. I’m not suggesting that blending panel sources is the root cause of data quality issues, but it’s worth considering why high-quality respondents don’t make up a higher proportion of the sample ecosystem.

At Y Live, we’ve opted for a simpler approach. Instead of relying on routers, we prioritise human expertise. We understand that getting screened out or failing a quota is frustrating for respondents, so our survey managers work diligently to manage sampling and fieldwork, ensuring respondents’ time isn’t wasted. By actively monitoring quotas and adjusting invitation mailouts to meet quota requirements, we achieve representative samples – we see sampling as an underappreciated skill in today’s market.

It seems to me that too much focus is placed on managing the symptoms of poor data quality rather than tackling the root causes. If we dedicated more attention and focus to building engaged panels, rewarding people fairly, and enhancing the respondent experience, we could see a substantial improvement in data quality.

Leading by example

At Yonder Data Solutions, our commitment to delivering high-quality, reliable data is why leading organisations entrust us with their fieldwork and data collection needs. They trust not only in our processes and technology, but believe in our core philosophy – that respecting and valuing respondents leads to better, more reliable insights. In short, good data in = good data out. This focus on the participant experience, combined with our continuous pursuit of innovative methodologies, allows us to address the root causes of data quality issues, not just their symptoms.

If the research industry is to evolve, we must turn our attention back to the people behind the data – the respondents. This shift might mean clients need to pay more for higher-quality samples, but the resulting improvement in quality will undoubtedly be clear. While technology will continue to advance, the human aspect of data collection should never be overlooked.