Analytics Layer — See the Full Picture

Last-click attribution is lying about which campaigns work.

A buyer sees your YouTube ad on Monday, clicks a Shopping ad on Wednesday, abandons cart, returns via a retargeting Display ad on Friday, and purchases through a brand Search ad on Saturday. Under last-click attribution, Brand Search gets 100% of the credit. YouTube, Shopping, and Display get zero. That means your reporting says YouTube and Display are worthless — when they actually started the entire journey.

2.7 — average touchpoints before an e-commerce purchase

40% — revenue value missed by last-click on assist channels

DDA — data-driven attribution, now the default in Google Ads

GA4 — cross-channel path analysis for the full journey

The core problem

Last-click attribution rewards closers and punishes starters.

E-commerce buyers rarely purchase on their first interaction. They discover a product through one channel, research it through another, and buy through a third. Attribution determines which channel gets credit for the sale โ€” and that credit directly impacts where you allocate budget. Get attribution wrong and you starve the campaigns that initiate demand while over-investing in the ones that merely capture it.

A typical Indian e-commerce buyer journey

Mon — YouTube Ad (0% credit): Discovers product in a video ad. Watches 70%. Does not click.

Wed — Shopping Ad (0% credit): Searches "buy cotton kurta online". Clicks Shopping ad. Browses 3 products. Leaves.

Fri — Display Retarget (0% credit): Sees retargeting ad while reading news. Clicks. Adds to cart. Abandons.

Sat — Brand Search (100% credit): Searches your brand name. Clicks brand ad. Completes purchase. ₹2,800.

Under last-click: Brand Search gets ₹2,800 credit. YouTube, Shopping, and Display retargeting get ₹0.

You cut top-of-funnel

YouTube and Display show zero direct conversions under last-click. They get budget-cut first. But they were creating the demand that Shopping and Brand Search captured. Remove them and the whole funnel shrinks — but the cause is invisible in the report.

Brand Search looks artificially strong

Brand Search claims most conversions at the lowest CPA. It looks like your best campaign. In reality, it is capturing demand created elsewhere. If you doubled Brand Search budget, you would not double conversions — because the bottleneck is upstream, not at the brand query level.

PMax ROAS is inflated

PMax touches multiple surfaces. Under last-click, it absorbs conversions from Display views and YouTube impressions that other campaigns initiated. The "12× ROAS" on PMax might be 5× when credit is distributed fairly across the journey.

The models

Attribution models: what each one does and when to use it.

Google deprecated most rule-based models (first-click, linear, time-decay, position-based) in 2023 and made data-driven attribution (DDA) the default. That was the correct decision for most accounts — but understanding how the models differ helps you interpret reports and make smarter budget calls.

Data-Driven Attribution (DDA)

Recommended

DDA uses machine learning to analyse all conversion paths in your account and distributes credit based on which touchpoints actually influenced the purchase decision. It assigns fractional credit — so a Shopping click that appears in 80% of conversion paths gets more credit than a Display impression that appears in 20%.

When to use

Any account with enough conversion data (300+ conversions/month is ideal, but Google applies it with less). This is the default in Google Ads and the one I recommend for all e-commerce accounts.

Limitation

It is a black box — Google does not publish the model weights. You can see the credit distribution in reports but not the logic behind it. Cross-channel (e.g., Meta → Google) attribution is not included.
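
Because the real weights are unpublished, the idea of fractional credit is easiest to see in a toy model: split each sale's revenue across the channels in its path, weighted by how often each channel appears in converting paths overall. This is a sketch of the concept, not Google's actual algorithm; the paths and order values below are hypothetical (the ₹2,800 journey from the example above plus two shorter paths).

```python
from collections import Counter

def fractional_credit(paths, revenue_per_path):
    # Count, for each channel, how many converting paths it appears in.
    appearances = Counter()
    for path in paths:
        for channel in set(path):
            appearances[channel] += 1
    # Split each path's revenue across its channels in proportion
    # to those appearance counts. Toy model, not Google's DDA.
    credit = Counter()
    for path, revenue in zip(paths, revenue_per_path):
        channels = set(path)
        total_weight = sum(appearances[c] for c in channels)
        for c in channels:
            credit[c] += revenue * appearances[c] / total_weight
    return dict(credit)

# Hypothetical converting paths, all at a ₹2,800 order value.
paths = [
    ["YouTube", "Shopping", "Display", "Brand Search"],
    ["Shopping", "Brand Search"],
    ["Brand Search"],
]
credit = fractional_credit(paths, revenue_per_path=[2800, 2800, 2800])
for channel, value in sorted(credit.items(), key=lambda kv: -kv[1]):
    print(f"{channel:<14} ₹{value:,.0f}")
```

Even in this crude version, Brand Search no longer takes all ₹8,400 of revenue credit; YouTube, Shopping, and Display each receive a fractional share of the paths they touched.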

Last-Click

100% of credit goes to the final click before purchase. Simple to understand but systematically biased toward bottom-of-funnel campaigns (Brand Search, retargeting) and against top-of-funnel (YouTube, Display, non-brand Shopping). Use it only as a secondary comparison view, never as the primary model for budget decisions.

Status: Still available as a reporting comparison. Not recommended for bid optimisation.

First-Click, Linear, Time-Decay, Position-Based

Google deprecated these rule-based models in mid-2023. They were useful for manual analysis — particularly position-based (40% first, 40% last, 20% middle) — but they applied arbitrary fixed weights that did not reflect actual buyer behaviour. Existing conversion actions using these models were automatically migrated to DDA. If you see references to them in older reports or agency proposals, they are no longer applicable.

How I use attribution

Attribution is not a setting you choose once. It is a framework for every budget decision.

Selecting DDA in your conversion action settings is the first step, not the last. Real attribution work means using multiple data views to understand what each campaign contributes to the full purchase journey — and making budget decisions that account for assisted value, not just last-touch credit.

01

Assisted Conversions Report — Monthly Review

The Assisted Conversions report in Google Ads shows how many conversions each campaign assisted (appeared in the path) versus how many it directly converted (was the last click). A campaign with high assist volume and low direct conversions is a demand initiator — cutting its budget will reduce conversions across other campaigns downstream.

Example from a real e-commerce account:

Campaign               Direct Conv   Assisted Conv   Assist Ratio
Brand Search               142            18             0.13
Shopping — Tier 1           89            67             0.75
PMax                        74           112             1.51
Display Retargeting         12            95             7.92
Non-brand Search            31            54             1.74

Display Retargeting has an assist ratio of 7.92 — it participates in 8× more conversion paths than it directly closes. Judging it by last-click alone would make it look like a failure.
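
The assist ratio is a simple division over the figures in the table above. A short script makes the classification explicit; the "ratio above 1 means demand initiator" threshold is my rule of thumb, not a Google Ads metric.

```python
# Assist ratio = assisted conversions / direct (last-click) conversions.
# Figures from the account table above.
campaigns = {
    "Brand Search":        {"direct": 142, "assisted": 18},
    "Shopping - Tier 1":   {"direct": 89,  "assisted": 67},
    "PMax":                {"direct": 74,  "assisted": 112},
    "Display Retargeting": {"direct": 12,  "assisted": 95},
    "Non-brand Search":    {"direct": 31,  "assisted": 54},
}

for name, c in campaigns.items():
    ratio = c["assisted"] / c["direct"]
    role = "demand initiator" if ratio > 1 else "closer"
    print(f"{name:<20} assist ratio {ratio:.2f}  ({role})")
```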

02

GA4 Conversion Paths — Understand the Actual Journey

GA4's Advertising → Conversion Paths report shows the actual sequence of channels a buyer touched before purchasing. This is the only place you can see the full cross-channel journey — including Meta Ads, organic search, direct, email, and Google Ads together. I review this monthly to identify which channel combinations produce the highest conversion rates and AOV. If "Meta Ads → Google Shopping → Brand Search" produces 2× higher AOV than "Google Shopping → Brand Search" alone, that tells me Meta is driving higher-intent buyers into the Google funnel.

03

Incrementality Testing — The Budget Experiment

The ultimate attribution question is: "If I turn this campaign off, how many of its conversions disappear entirely versus shift to another campaign?" This is incrementality testing. I run it on campaigns with questionable attribution — particularly Brand Search (would those buyers have come through organic anyway?) and PMax (how much of its ROAS is genuinely incremental?). The method: pause the campaign for 2–4 weeks in a controlled test, measure the impact on total account conversions, and compare against the credit the campaign was claiming. The gap between claimed credit and actual incremental impact tells you the campaign's true value.
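
The pause-test arithmetic can be sketched as follows. The baseline and paused totals are hypothetical numbers for illustration, and a production test also needs a control period or geo split to separate the pause effect from seasonality.

```python
def incrementality(claimed, baseline_total, paused_total):
    """Compare a campaign's attributed conversions against the drop in
    total account conversions observed while it was paused.
    Illustrative arithmetic only."""
    actual_lift = baseline_total - paused_total
    return actual_lift, actual_lift / claimed

# Hypothetical month: Brand Search claimed 142 conversions, and pausing
# it dropped the account from 380 to 352 total conversions.
lift, share = incrementality(claimed=142, baseline_total=380, paused_total=352)
print(f"Incremental conversions: {lift} ({share:.0%} of claimed credit)")
# Only 28 of the 142 claimed conversions disappeared; the rest shifted
# to other channels, so most of that claimed credit was not incremental.
```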

04

Blended ROAS vs Channel ROAS — Two Different Numbers

I report both numbers every month. Channel ROAS is the ROAS per campaign type (Shopping ROAS, PMax ROAS, etc.) — useful for campaign-level optimisation. Blended ROAS is total revenue divided by total ad spend across all campaigns — this is the number that tells you whether your overall Google Ads investment is profitable. A campaign can have a "bad" channel ROAS but improve blended ROAS by feeding conversions into other campaigns. Display retargeting at 2× ROAS looks weak in isolation — but if removing it drops blended ROAS from 5.8× to 4.2×, it is clearly pulling its weight.
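
The two calculations differ only in where the division happens. A minimal sketch, with illustrative campaign figures that are not from any real account:

```python
# Illustrative (revenue, spend) pairs per campaign type.
campaigns = {
    "Shopping":         (420_000, 70_000),
    "PMax":             (360_000, 60_000),
    "Display Retarget": (40_000,  20_000),
    "Brand Search":     (280_000, 25_000),
}

# Channel ROAS: divide per campaign.
for name, (revenue, spend) in campaigns.items():
    print(f"{name:<16} channel ROAS {revenue / spend:.1f}x")

# Blended ROAS: divide total revenue by total spend across everything.
blended = (sum(r for r, _ in campaigns.values())
           / sum(s for _, s in campaigns.values()))
print(f"Blended ROAS {blended:.1f}x")
```

Note that the per-campaign numbers cannot show cross-campaign feeding (the Display Retarget case in the paragraph above); that is what the incrementality tests are for.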

What goes wrong

6 Attribution Mistakes That Lead to Bad Budget Decisions

01

Cutting Display because it shows zero last-click conversions

Display retargeting rarely gets the last click — buyers see the ad, remember the product, and return through Search or direct. Its value is almost entirely in assists. Cutting it removes a critical re-engagement touchpoint and causes Shopping and PMax ROAS to drop 2–4 weeks later — but by then, the cause is no longer obvious.

02

Doubling Brand Search budget because it has the highest ROAS

Brand Search ROAS is high because it captures demand created elsewhere. Doubling its budget does not double demand — it just increases impression share on queries you were already winning. The incremental return on extra Brand Search spend is near zero in most accounts.

03

Comparing Google Ads ROAS against Meta ROAS in isolation

Google Ads and Meta have separate attribution systems, different conversion windows, and different counting methodologies. A 5× ROAS in Google Ads and a 3× ROAS in Meta do not mean Google is better — Meta may be initiating the journey that Google closes. The only fair comparison is in GA4, where both channels appear in the same conversion path.

04

Ignoring view-through conversions entirely

View-through conversions — where someone saw but did not click your ad and later purchased — are over-counted by default. But dismissing them entirely also misses real impact. The right approach: report them separately, look at the view-through-to-click ratio, and discount view-through value by 50–80% rather than counting it at full value or zero.
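
As arithmetic, the discounting approach looks like this. The 65% default is a hypothetical midpoint of the 50–80% range, and the revenue figures are invented for illustration.

```python
def blended_conversion_value(click_value, view_through_value, discount=0.65):
    """Count view-through conversion value at a fraction of face value
    instead of full value or zero. The 0.65 default is a hypothetical
    midpoint of the 50-80% discount range; tune per account using the
    view-through-to-click ratio."""
    return click_value + view_through_value * (1 - discount)

# Hypothetical month: ₹500,000 click-attributed, ₹120,000 view-through.
print(blended_conversion_value(click_value=500_000, view_through_value=120_000))
```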

05

Using different attribution models in Google Ads and GA4

Google Ads and GA4 both default to data-driven attribution, but they are different models running on different data sets and scopes. If you report from both platforms without noting that difference, you will get conflicting numbers for the same campaigns. I standardise reporting to use Google Ads DDA for campaign-level decisions and GA4 for cross-channel analysis.

06

Never testing incrementality

Attribution models estimate credit distribution. Incrementality tests measure actual impact. Without ever running a pause test or geo-lift experiment, you are making six-figure budget decisions based entirely on modelled estimates. I run at least one incrementality test per quarter on the campaign with the most questionable attribution — usually Brand Search or PMax.

What is included

Attribution analysis is part of every management engagement.

DDA setup & verification

Confirm all conversion actions use DDA. Set appropriate conversion windows based on your actual purchase cycle length.

Monthly attribution report

Assisted conversions, conversion path analysis, blended vs channel ROAS, and assist ratio trends per campaign type.

GA4 cross-channel paths

Monthly review of top conversion paths including Meta, organic, direct, and email alongside Google Ads campaigns.

Quarterly incrementality tests

Pause tests or geo-lift experiments on campaigns with the most questionable attribution — typically Brand Search or PMax.

Looker Studio attribution dashboard

Live dashboard with blended ROAS, channel ROAS, assist ratios, and conversion lag data — updated daily.

Budget reallocation guidance

Attribution data translated into specific budget recommendations — which campaigns to scale, reduce, or test further.

Common questions

Attribution FAQ

My Google Ads already uses data-driven attribution. Do I need to do anything else?

DDA in the conversion settings is the starting point, not the finish line. You also need to review the Assisted Conversions report monthly (most people never open it), analyse GA4 conversion paths for cross-channel insights, set conversion windows that match your buying cycle (the default 30-day window may not fit your business), and run incrementality tests quarterly to validate what the model tells you. The attribution model distributes credit — you still need to interpret what the distribution means for budget decisions.

How do I know if a campaign is truly driving incremental sales?

The only way to truly know is an incrementality test. Pause the campaign for 2–4 weeks and measure the impact on total account conversions. If conversions drop by the number that campaign was claiming, it is genuinely incremental. If total conversions barely change, those conversions were going to happen anyway through other channels. I run this quarterly — typically on Brand Search first (the most common culprit of over-attribution) and then on PMax.

Should I look at Google Ads attribution or GA4 attribution?

Both — they serve different purposes. Use Google Ads DDA for campaign-level bidding and budget decisions within the Google Ads platform. Use GA4 for cross-channel analysis — understanding how Google Ads, Meta Ads, organic search, email, and direct traffic work together in conversion paths. The numbers will not match exactly because the models and data sources differ. That is normal. The insight comes from understanding both views, not from trying to reconcile them to a single number.

My Meta Ads show higher ROAS than Google Ads. Does that mean Meta is better?

Not necessarily. Meta and Google use different attribution systems with different counting windows and methodologies. Meta defaults to a 7-day click / 1-day view window and claims view-through conversions aggressively. Google Ads DDA distributes credit across the click path. Comparing them directly is like comparing different rulers — the measurements are not in the same units. The fair comparison happens in GA4, where both channels appear in the same conversion path report with the same attribution model applied. That is where I determine the actual relative contribution of each channel.

How does attribution affect my PMax ROAS numbers?

Significantly. PMax touches multiple surfaces — Shopping, Display, YouTube, Search, Gmail, Discover — and absorbs credit from all of them. Under DDA, PMax gets fractional credit from every path it participated in, which tends to be generous because PMax impressions show up in a large percentage of conversion paths. Add brand query cannibalisation (if brand exclusions are not active) and view-through conversions (included by default), and PMax ROAS can be inflated by 30–60%. The first thing I do with PMax attribution is separate brand-excluded ROAS, click-only ROAS, and assisted ROAS — three different views that together show the real picture.
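
The first two of those views reduce to simple subtractions before the divide; assisted ROAS needs path-level data and is not shown. All figures below are hypothetical splits for illustration.

```python
def pmax_roas_views(total_revenue, spend, brand_revenue, view_through_revenue):
    """Three views of PMax ROAS: headline, brand-excluded, and click-only
    (brand and view-through revenue both removed). Inputs here are
    hypothetical; pull real splits from brand-exclusion and view-through
    segments in your reports."""
    headline = total_revenue / spend
    brand_excluded = (total_revenue - brand_revenue) / spend
    click_only = (total_revenue - brand_revenue - view_through_revenue) / spend
    return headline, brand_excluded, click_only

h, b, c = pmax_roas_views(total_revenue=1_200_000, spend=100_000,
                          brand_revenue=400_000, view_through_revenue=200_000)
print(f"Headline {h:.0f}x | brand-excluded {b:.0f}x | click-only {c:.0f}x")
```

A headline 12× collapsing to 6× once brand and view-through revenue are stripped out is exactly the kind of inflation the paragraph above describes.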

Free attribution check

Are You Making Budget Decisions on Bad Attribution Data?

I will review your attribution setup, assisted conversion patterns, and PMax credit distribution — and tell you which campaigns are over-credited and which are under-valued.