
Data Quality in Market Research: Lessons from the UK Benchmark

Over the past two years, data quality in market research has shifted from background concern to industry-wide priority.

Reports of fake respondents, disengaged panellists and AI-generated survey entries have exposed just how fragile data integrity can be when quality checks fail.

In response, the Market Research Society (MRS), the Association for Qualitative Research (AQR), ESOMAR, and the Insights Association joined forces to launch the Global Data Quality Initiative (GDQ) – a coordinated effort to set shared benchmarks and governance frameworks that strengthen trust in research data worldwide.

The initiative’s first audit, Wave 0, examined the quality of online quantitative surveys conducted in the United States and found that nearly 40 percent of responses showed signs of inattention, disengagement or potential fraud.

Building on that foundation, Wave 1 expanded the study to more than 70 countries – including the UK, Canada, Australia, Japan, and key European markets – to benchmark how data quality varies by region, methodology, and supplier type.

With the publication of the UK Wave 1 Benchmark, researchers can now see where the UK sits in the global picture and which parts of the process are most vulnerable to quality issues.

What the Numbers Actually Show

Wave 1 analysed over 280,000 UK survey records collected in the first half of 2025.
It compared data gathered directly by research agencies with data supplied through independent or third-party sample suppliers (organisations that provide respondent access for online or hybrid studies).
While the dataset included results from more than twenty markets, the UK’s performance stood out for several reasons.


Headline UK results:

  • Overall removals – Research agencies excluded 4.5% of respondents for quality or fraud concerns, compared with 11.1% among suppliers.*
  • Post-survey clean-outs – Around 5–6% of completed responses were removed for inattentive or low-quality behaviour.*
  • B2C vs B2B – Consumer (B2C) samples showed removal rates well below global averages (6.6% vs 13.1%), whereas business (B2B) samples aligned more closely with international norms.*
  • Encryption use – 99% of UK agencies employ link encryption or server-to-server validation, one of the highest rates globally.*
  • Abandonment – Only 6.7% of agency respondents dropped out mid-survey, compared with 14.9% for suppliers.

*(Insights Association, UK Data Quality Benchmarking Report, Wave 1, 2025)
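Link encryption and server-to-server validation, as cited above, typically mean that completion redirects between supplier and agency carry a cryptographic signature so that respondents cannot forge a "complete". As a rough illustration only (the report does not specify any implementation; the URL, parameter names and shared secret below are invented for the sketch), an HMAC-signed redirect might look like this:

```python
import hashlib
import hmac

# Hypothetical shared secret agreed between agency and sample supplier.
SECRET = b"example-shared-secret"

def sign_link(respondent_id: str, status: str) -> str:
    """Build a redirect URL whose query string carries an HMAC signature."""
    payload = f"rid={respondent_id}&status={status}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"https://agency.example/callback?{payload}&sig={sig}"

def verify_link(payload: str, sig: str) -> bool:
    """Recompute the signature server-side; reject tampered completes."""
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)
```

Because the signature covers the respondent ID and status, a respondent who edits the URL to claim a complete they never finished fails verification and is excluded before they reach the dataset.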

On the surface, that paints a positive picture: UK agencies appear to be collecting cleaner, more secure data than the global average.

But the global benchmark itself – an aggregation of results from North America, Europe and Asia – still shows significant variation in detection methods and thresholds, so direct comparisons should be treated cautiously.

Beneath the Surface: Optimism or Oversight?

Lower removal rates can indicate genuinely better quality, but they can also signal less aggressive detection of poor behaviour.

Because every agency defines “fraudulent” or “low-quality” slightly differently, benchmarks depend heavily on internal processes and tools.

In Wave 1, supplier data showed more than twice the removal rate of agency data.

That suggests a continuing gap in how quality controls are applied along the supply chain, especially where multi-layer outsourcing is involved.

Suppliers often aggregate respondents from numerous sources, and each hand-off introduces risk if vetting standards differ.

Three key interpretations emerge:

  1. Structural strength – The UK’s strong encryption standards and use of fraud-detection technology are paying off.
    Angelfish, for instance, follows MRS and AQR quality frameworks, applying multi-stage participant checks before any recruit reaches a session.
  2. Supplier disparity – The higher removal rate among suppliers points to ongoing vulnerabilities in outsourced sample provision. Clearer accountability and auditability are needed throughout the recruitment chain.
  3. Soft fraud risk – Even when overt fraud is removed, subtle disengagement – participants clicking through too quickly, providing shallow answers, or failing attention checks – can still erode data quality. These behaviours often evade automated detection and require human oversight to catch.

As a recent Research Live podcast put it, “Governance, not goodwill, is what ensures data integrity.”
Wave 1 shows that UK infrastructure is strong, but the governance of supplier networks still needs tightening.

How Brands Are Responding

Across the industry, concern has evolved into expectation.

In Research Live’s feature “What Do Brands Think About Data Quality?” (2025), client-side researchers reported that they now expect proof of quality, not just promises.

An open letter published by MRS and industry leaders echoed that call, urging agencies and suppliers to collaborate rather than compete on quality standards (Research Live, 2025).

Another article, “Data Quality Issues on the Rise”, noted that governance gaps persist even among seasoned suppliers, a warning that policy alone is not enough.

For brands, data quality is no longer a differentiator; it is the minimum requirement. For agencies, credibility now depends on being transparent about how integrity is achieved from recruitment to analysis.


The Qualitative Challenge: Where “Good Enough” Isn’t

While Wave 1 primarily assessed quantitative data, its findings carry clear implications for qualitative research.
Authenticity – the tone, emotion and depth that define qualitative insight – cannot be guaranteed by automation alone.

Four pressures continue to shape data quality in qualitative fieldwork:

  1. Price pressure and procurement-first thinking – When recruitment is treated as a commodity, timelines and budgets squeeze validation efforts.
  2. Over-reliance on automation – Fraud-detection software helps, but AI-generated or semi-automated responses are becoming increasingly sophisticated.
  3. Participant fatigue – Over-surveyed audiences, low incentives and impersonal onboarding reduce engagement before fieldwork even begins.
  4. Opaque recruitment chains – Multi-tier supplier models can make it difficult to trace where a participant originated or how they were screened (MRS Data Integrity Report, 2024).

At Angelfish Fieldwork, we see these factors every day, and we know that the solution lies in human-centred recruitment.

Personal communication, active validation and transparent sourcing create the trust that automated systems alone can’t replicate.

Why the UK Benchmark Matters

The UK has long been recognised for its rigorous research standards.
MRS and ESOMAR codes of conduct are widely adopted, and Wave 1 confirms that UK agencies outperform many markets on encryption and validation.
But benchmarks are not just a scorecard; they are a call to action.

If low removal rates genuinely reflect higher-quality respondents, the industry should double down on what works: transparent sourcing, early participant engagement and stronger screening design.

If, however, those figures hint at under-detection, researchers need to invest in behavioural and linguistic validation, for example, by:

  • Monitoring response speed and consistency during screeners or interviews,
  • Analysing open-ended language for signs of repetition or non-human phrasing,
  • Combining digital fingerprinting with manual review where anomalies appear.
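The first two checks above can be prototyped in a few lines. The sketch below is illustrative only: the 40%-of-median speeder threshold and the word-repetition score are assumptions for the example, not Wave 1 methodology or an industry standard.

```python
import statistics
from collections import Counter

def flag_speeders(durations_sec, threshold=0.4):
    """Return indices of respondents whose completion time falls below a
    fraction of the median duration. threshold=0.4 is an illustrative cut-off."""
    median = statistics.median(durations_sec)
    return [i for i, d in enumerate(durations_sec) if d < threshold * median]

def repetition_score(text):
    """Share of the most frequent word among all words in an open-ended answer;
    values near 1.0 suggest copy-paste or low-effort responses."""
    words = text.lower().split()
    if not words:
        return 1.0
    most_common_count = Counter(words).most_common(1)[0][1]
    return most_common_count / len(words)
```

Flags like these are only a first pass: anything they surface would still go to manual review, which is exactly the human oversight the soft-fraud point above calls for.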

Either way, Wave 1 gives the UK a chance to define what good data really looks like, backed by evidence rather than assumption.


Angelfish Fieldwork’s Perspective: Beyond Compliance

At Angelfish Fieldwork, quality begins with people, not platforms.
We follow a simple principle: every great project starts with the right participants.

We:

  • Verify each participant’s identity and suitability using a blend of manual and digital checks before fieldwork begins.
  • Maintain transparency in every recruitment chain so clients always know the source of their respondents.
  • Engage participants personally to confirm motivation and relevance, ensuring genuine interest in the topic.
  • Gather feedback after each project to refine and improve our recruitment approach.

In practice, that means data integrity isn’t an isolated checkpoint; it’s woven through every stage of recruitment.

“Wave 1 gave us the benchmark; now it’s up to agencies like ours to raise it.”
– Lisa Boughton, Director, Angelfish Fieldwork

What the Industry Should Take Away

Although Wave 1 focused mainly on quantitative data, its insights are invaluable for qualitative specialists.
They show where structural strengths exist and where governance, engagement and supplier transparency must improve.

Four priorities emerge for the UK research community:

  1. Recognise that quality is structural. It must be built into recruitment and design, not audited afterwards.
  2. Strengthen governance. MRS, AQR and ESOMAR are driving frameworks, but responsibility also sits with every agency commissioning or supplying respondents.
  3. Prioritise participant experience. Engaged participants deliver richer, more accurate insights.
  4. Collaborate on standards. Sharing best practice openly benefits the entire research ecosystem.

Looking Ahead

The Global Data Quality Initiative has indicated that a Wave 2 benchmark is planned to track progress and extend testing into qualitative and hybrid methodologies (Insights Association, 2025).
Whether or not those timelines hold, the direction is clear: data quality measurement will become continual, not occasional.

For qualitative recruitment, this is the moment to prove that human-centred approaches deliver not only more meaningful stories but also cleaner, more dependable data.

Because if we can’t trust the people behind our data, we can’t trust the strategies built on it.

Want to learn more about data quality in market research?

Explore our Market Research Data Quality page or revisit our earlier blog, How Fieldwork Agencies Can Strengthen Data Quality.

Looking for a partner that takes respondent integrity seriously?

Discover how we recruit authentically on our Participant Recruitment page.

