Why 92% of marketers are missing out on incrementality testing (and what it's costing them)


You're in that budget meeting again. Leadership wants proof that your marketing drives real results and not just pretty charts showing correlation. You've got dashboards full of attribution data, but everyone's thinking the same thing: "Would customers have bought anyway?"

Supermetrics’ 2025 Marketing Data Report found that 41% of marketers can't measure ROI effectively, while only 8% use incrementality testing, one of the best ways to prove that marketing actually works.

That leaves 92% of teams without a reliable method to show their true impact.

The measurement crisis that’s costing marketers credibility

The problem isn't a lack of data. The problem is proving your marketing actually caused the results you're measuring. This creates a cascade of issues that most marketing teams are dealing with. 

Here's what we found when we surveyed marketers:

  • 41% struggle to measure ROI accurately
  • 63% consider ROI their most important metric
  • 30% say proving ROI is marketing's main job

The 2025 Marketing Data Report

Read this guide to learn the trends, challenges, and opportunities for marketing measurement.


Download the free report → 

The attribution trap that’s misleading teams

Most teams rely on marketing attribution models that show correlation, not causation. They can tell you what happened after someone saw your ad, but not whether your ad made it happen.

With typical attribution, someone searches for your brand, clicks your ad, and buys. Your dashboard credits the ad with the full conversion.

But that person was already looking for you. They probably found your website anyway, with or without the ad.

This attribution bias creates serious problems that impact every aspect of your marketing:

Immediate decision-making problems

  • You think ads are more effective than they actually are
  • You optimize based on misleading data
  • You put more budget toward channels that aren't driving new business

Organizational consequences

  • Budget discussions become battles: Without proof of causation, every budget meeting turns into a negotiation where you're defending spend based on correlation
  • Team confidence drops: When you can't show what's definitely working, your team starts questioning whether their work matters
  • Leadership trust erodes: Executives who can't see clear evidence of ROI start viewing marketing as a cost center rather than a growth driver, leading to budget cuts and reduced strategic influence

Why Marketing Mix Modeling (MMM) isn’t practical for most budgets

47% of marketing leaders plan to invest in MMM next year. That’s more than any other measurement method.

However, MMM has major limitations for smaller marketing budgets. It requires substantial data volume and complexity that most teams don't have. 

Zach Bricker, our Lead Solutions Engineer, is direct about this: "If you're spending $50,000 a year on marketing, you have no business doing an MMM. None whatsoever. You don't have the granularity and volume of data required for the model to perform accurately."

MMM works when you have:

  • Sufficient budget scale, with at least $2 million in annual ad spend for meaningful analysis
  • Extensive, detailed historical performance data
  • A diverse channel mix, with multiple marketing channels running simultaneously
  • Data richness, with granular data available across different channels, not just the same digital channels year after year
  • Resource investment for significant implementation and ongoing management costs

What marketing incrementality testing shows you

Incrementality testing is a methodology for isolating a marketing campaign's true causal impact. It answers the fundamental question: "How many incremental sales do individual channels or activities bring?"

Here’s how it works in practice. 

You split your audience into two groups: 

  • Test group - sees your marketing
  • Control group - sees nothing

The difference in behavior between groups shows what your marketing actually accomplished. It's essentially an A/B test in which group B gets zero marketing. 

Think of incrementality testing like a medical study. Just like doctors use placebos to prove a treatment works, you need a control group to prove your marketing works. Without this comparison, you're measuring correlation, not causation.

Take the Barbie movie example. Millions of people saw it after massive marketing campaigns. But did they go because of the ads, or because they already loved Barbie and planned to see it regardless?

Traditional attribution would credit the marketing for all those ticket sales. Incrementality testing would only count the additional sales that happened because of the marketing — the truly incremental impact.
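The arithmetic behind that test-versus-control comparison is simple. Here's a minimal sketch in Python, with hypothetical conversion numbers rather than real campaign data:

```python
# A minimal sketch of measuring incremental lift from a holdout test.
# All numbers below are hypothetical illustrations, not Supermetrics data.

def incremental_lift(test_conversions, test_size, control_conversions, control_size):
    """Return (incremental conversion rate, relative lift vs. control)."""
    test_rate = test_conversions / test_size
    control_rate = control_conversions / control_size
    incremental_rate = test_rate - control_rate      # conversions caused by marketing
    relative_lift = incremental_rate / control_rate  # lift vs. doing nothing
    return incremental_rate, relative_lift

# Example: 5% of the exposed group converts vs. 4% of the holdout group.
inc, lift = incremental_lift(500, 10_000, 400, 10_000)
print(f"Incremental rate: {inc:.1%}, relative lift: {lift:.0%}")
# Attribution would credit all 500 conversions to the ads; incrementality
# says only the ~100 above the control baseline were caused by them.
```

The point of the sketch: the holdout group supplies the baseline that attribution models never see, so only conversions above that baseline count as marketing's impact.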

The benefits and realities of incrementality testing

Incrementality testing has earned its reputation as the "gold standard of causality." However, it isn't perfect.

Benefits of incrementality testing

  • Gold standard of causality: Unlike attribution, it proves what actually caused conversions
  • Validates assumptions: Tests whether your marketing beliefs align with reality
  • Drives confident decision-making: Removes guesswork from budget allocation

Limitations to understand

  • Requires a significant amount of data: You need enough volume for statistically meaningful results
  • Difficult to scale: Can't test everything simultaneously without interfering with results
  • Point-in-time limited: Results apply to specific conditions, markets, and timeframes

Understanding these limitations helps you test strategically. 

For example, results from testing TikTok ads in the UK during September provide strong directional insights, but may not apply universally to other markets or seasons.
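The data-volume caveat can be checked before you trust a result: a quick two-proportion z-test tells you whether an observed lift could plausibly be noise. A rough sketch using only Python's standard library (the conversion counts are hypothetical):

```python
import math

def lift_z_score(conv_t, n_t, conv_c, n_c):
    """Two-proportion z-test: is the test group's conversion rate
    significantly higher than the control group's?"""
    p_t, p_c = conv_t / n_t, conv_c / n_c
    p_pool = (conv_t + conv_c) / (n_t + n_c)          # pooled rate under "no lift"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_t + 1 / n_c))
    return (p_t - p_c) / se

# The same 1-percentage-point lift at two sample sizes:
print(round(lift_z_score(500, 10_000, 400, 10_000), 2))  # large test
print(round(lift_z_score(50, 1_000, 40, 1_000), 2))      # 10x smaller test
# A |z| above ~1.96 is significant at the 95% level; the smaller test
# shows the same lift but lacks the volume to rule out noise.
```

This is why "requires a significant amount of data" matters in practice: identical lifts can be conclusive at one sample size and indistinguishable from chance at another.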

How to start incrementality testing

You don't need a massive budget or complex setup to begin testing incremental impact. Here's a practical approach:

Get your team on board first

Most teams avoid incrementality testing because they don't understand it. Build support by:

  • Showing how it's different from attribution tracking
  • Sharing examples of insights other methods miss
  • Running small pilot tests to demonstrate value
  • Getting leadership to support the approach

Pick your first test carefully

Start with a channel where you suspect people might convert anyway. Good options include:

  • Brand search campaigns
  • Retargeting to existing customers
  • Campaigns for well-known products
  • Markets where you already have strong brand awareness

You may also include channels you haven't tested yet, whose incrementality is still unknown:

  • New advertising platforms you're considering
  • Untested audience segments
  • Different creative formats or messaging approaches
  • Geographic markets you haven't entered

The first category helps you identify where attribution might be overstating impact. The second helps you discover new growth opportunities where incrementality could be high.

Decide what success looks like

Be clear about what incremental lift means for your business:

  • Higher sales in the test group vs. the control group
  • More new customers acquired
  • Increased average order values
  • Better long-term customer value

Plan your testing schedule

Don't test everything simultaneously. Build a plan for the next 30-90 days:

  • Start with your biggest spending channels
  • Focus on campaigns you're not sure about
  • Test initiatives where you need to prove ROI to leadership

How to sell incrementality testing to leadership

When you're proposing this to executives, focus on business outcomes they care about:

  • "We'll finally prove marketing's real ROI." Position it as solving the attribution problem that's been making budget discussions difficult.
  • "We'll spend our budget more effectively." Explain how understanding incremental impact helps you spend money on channels that drive new business.
  • "We'll make decisions faster and with more confidence." Show how clearer measurement speeds up optimization and reduces second-guessing on strategy.
  • "We'll have an advantage over competitors." Point out that most companies aren't measuring incrementality, so you'll have insights they don't.

Why this matters now

The 8% of marketers using incrementality testing have something the other 92% don't: confidence in their measurement data. They know what's actually working and can prove it to leadership.

They're not guessing which channels drive incremental value or defending their budget based on correlation. They're making optimization decisions based on what truly moves the business forward.

The question isn't whether you can afford to start incrementality testing. It's whether you can continue making decisions without knowing what your marketing actually accomplishes.


Get the complete picture with The 2025 Marketing Data Report

This post covers one insight from our comprehensive research into marketing measurement challenges. Download the full report to see what else we discovered about data strategy, ROI measurement, and how top-performing teams approach marketing analytics.

Download the free report →
