Performance Max is powerful.
Without governance, it is expensive.
PMax gives Google's automation access to every ad surface at once – Search, Shopping, Display, YouTube, Gmail, Discover, Maps. For e-commerce, that scale is genuinely valuable. The problem is that without proper structure, it also gives Google permission to spend your budget anywhere, on anything, with minimal transparency.
6.4×
Avg PMax ROAS (brand-excluded)
100%
Accounts get brand exclusions
38%
Avg wasted PMax spend recovered
Weekly
Search term & placement review
Results vary by category, product margins, and competition.
The transparency problem
Google designed PMax to be a black box. I design governance around it.
Performance Max is the only campaign type in Google Ads where you cannot see which keywords triggered your ads, cannot choose which ad surfaces your budget goes to, and cannot control whether Google spends on Search versus Display versus YouTube. That opacity is by design – Google wants full algorithmic control. My job is to add structure and monitoring around PMax so you get the benefits of automation without the blind spots.
Brand cannibalisation
PMax absorbs branded search queries by default – inflating its own ROAS while making your Brand Search campaign look weak. Without exclusions, you are paying PMax CPCs for queries that should cost ₹2–₹5 in Brand Search.
Generic asset groups
One asset group with generic headlines and one audience signal for 500+ products. Google has no clarity on which product category maps to which creative or audience – so it guesses. Poorly.
No placement visibility
By default, PMax can spend your budget on children's gaming apps, low-quality Display Network sites, and Gmail placements with near-zero conversion probability. Without exclusions, you will never know.
Weak audience signals
Google says audience signals are "hints, not hard targeting." True – but weak hints (broad interest categories, no Customer Match data) mean Google's starting point is almost random. Strong signals cut the learning phase in half.
Inflated conversion data
PMax includes view-through conversions by default – someone saw a Display ad and bought later. That conversion may have happened anyway. Without isolating view-through from click-through, ROAS numbers are inflated by 15–30% in most accounts.
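The adjustment is simple arithmetic. A minimal sketch with illustrative numbers – the 25% view-through share below is an assumption for the example, not a benchmark:

```python
def click_only_roas(conv_value_total, spend, view_through_share):
    """ROAS after stripping view-through conversion value.

    view_through_share: fraction of reported conversion value that
    came from view-through conversions (e.g. 0.25 for 25%).
    """
    click_value = conv_value_total * (1 - view_through_share)
    return click_value / spend

# Reported: Rs 8,00,000 conversion value on Rs 1,00,000 spend = 8.0x ROAS.
# If 25% of that value is view-through, the click-only picture is:
print(round(click_only_roas(800_000, 100_000, 0.25), 1))  # 6.0
```

The same function works per-campaign or account-wide; the hard part is pulling the view-through split out of the segmented conversion report, not the maths.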
No margin-based targeting
PMax optimises for total conversion value – not profit. It will happily push budget toward high-revenue, low-margin products because the algorithm cannot see your margins unless you segment by product tier.
The governance framework
How I structure Performance Max for e-commerce clients.
Governed PMax means every decision Google's automation makes is informed by your product data, constrained by your brand rules, and measurable against your actual margin targets. Here are the five structural pillars I implement on every account.
Brand Exclusions – Non-negotiable on every account
Brand exclusions prevent PMax from serving ads on your own brand-name searches. Without them, PMax takes credit for conversions that would have come through your Brand Search campaign at a fraction of the CPC. I have audited accounts where 35–50% of PMax conversions were brand queries – once excluded, the "real" PMax ROAS dropped from 8× to 3.5×.
Google introduced account-level brand exclusions for PMax in 2023. I apply them on day one. This single change is the fastest way to get an honest performance picture from any PMax campaign.
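The honest-ROAS calculation behind that audit finding can be sketched as follows. The value and spend shares are illustrative assumptions: brand queries typically contribute a large share of conversion value but only a small share of spend, which is exactly why they flatter the headline number.

```python
def brand_excluded_roas(total_value, brand_value_share, spend, brand_spend_share):
    """ROAS on non-brand traffic only.

    brand_value_share: fraction of conversion value from brand queries.
    brand_spend_share: fraction of spend consumed by brand queries
    (usually small, because brand CPCs are cheap).
    """
    non_brand_value = total_value * (1 - brand_value_share)
    non_brand_spend = spend * (1 - brand_spend_share)
    return non_brand_value / non_brand_spend

# Headline: Rs 8,00,000 value on Rs 1,00,000 spend = 8.0x.
# Assume 45% of value but only 5% of spend is brand:
print(round(brand_excluded_roas(800_000, 0.45, 100_000, 0.05), 1))  # 4.6
```

Even a rough split estimated from the search-category report is enough to show whether the headline ROAS survives brand exclusion.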
Asset Group Architecture – One group per product category
Each asset group contains headlines, descriptions, images, and audience signals that are specific to one product category. A women's ethnic wear brand has separate asset groups for sarees, kurtis, and lehengas – not one generic asset group that says "Shop our collection" across everything.
Headlines
Category-specific. "Banarasi Silk Sarees" not "Shop Now." 15 headlines per group, testing long-tail product terms.
Images
Product photography from that category – not generic lifestyle shots. Min 5 images per group, landscape + square ratios.
Listing Groups
Product feed filtered to only include SKUs from that category. Prevents mixing product types within an asset group.
Audience Signals – First-party data, not Google's guesses
Audience signals tell Google's algorithm where to start looking for buyers. The stronger your signals, the faster PMax finds profitable audiences instead of spending weeks testing random Display placements.
I layer three signal types on every PMax campaign:
Customer Match lists
Email lists of past purchasers, segmented by purchase recency and order value. This is the highest-quality signal available – Google builds lookalike audiences from your actual customers.
Remarketing audiences
Site visitors segmented by behaviour: product page viewers, cart abandoners, checkout starters. Each segment has different purchase intent and different ROAS expectations.
Custom intent segments
Search terms and competitor URLs that define your ideal buyer's intent. "Buy organic cotton kurta online" or visitors to competitor product pages. These cold-audience signals expand reach beyond your existing customer base.
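The segmentation logic behind those Customer Match lists can be sketched like this. The 90-day recency window and ₹5,000 value cut-off are illustrative tiers, not fixed rules – the right thresholds depend on your repurchase cycle and AOV.

```python
from datetime import date

def segment_customers(purchases, today):
    """Bucket past purchasers into Customer Match upload lists.

    purchases: list of dicts with 'email', 'last_order_date', 'total_value'.
    Tiers (90 days, Rs 5,000) are illustrative assumptions.
    """
    segments = {"recent_high_value": [], "recent": [], "lapsed": []}
    for p in purchases:
        days_since = (today - p["last_order_date"]).days
        if days_since <= 90 and p["total_value"] >= 5000:
            segments["recent_high_value"].append(p["email"])
        elif days_since <= 90:
            segments["recent"].append(p["email"])
        else:
            segments["lapsed"].append(p["email"])
    return segments

today = date(2024, 6, 1)
purchases = [
    {"email": "a@x.com", "last_order_date": date(2024, 5, 20), "total_value": 8000},
    {"email": "b@x.com", "last_order_date": date(2024, 5, 1), "total_value": 1200},
    {"email": "c@x.com", "last_order_date": date(2023, 11, 10), "total_value": 9000},
]
print(segment_customers(purchases, today))
# {'recent_high_value': ['a@x.com'], 'recent': ['b@x.com'], 'lapsed': ['c@x.com']}
```

Each bucket becomes a separate Customer Match list, so the campaign can carry different signals (and different expectations) per segment.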
Placement & Content Exclusions
PMax can serve your ads on any Google-owned property and its entire Display partner network. That includes children's apps, gaming apps, parked domains, and low-quality sites. I apply account-level placement exclusions before PMax launches: mobile gaming apps, children's content, sensitive content categories, and any placement that historical data shows has zero conversion probability. This typically eliminates 10–20% of Display spend that was generating impressions but no revenue.
Search Term & Insights Extraction
Google's PMax search term report is limited – it shows search categories rather than individual terms, and only surfaces high-volume queries. But it is still usable. I review it monthly to surface irrelevant query patterns and add them as account-level negative keywords. Combined with the Insights tab (which shows rising search trends and audience shifts), this gives me enough signal to steer the campaign even without full keyword-level transparency.
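The monthly review reduces to a simple filter: flag any search category that both matches an irrelevant pattern and consumes more than a set share of spend. A sketch with made-up category data – the category names, patterns, and 5% threshold are all illustrative:

```python
def flag_wasted_categories(category_spend, irrelevant_patterns, threshold=0.05):
    """Return (category, spend_share) pairs worth adding as negatives.

    category_spend: dict of PMax search category -> spend.
    irrelevant_patterns: substrings that mark a category as off-target.
    """
    total = sum(category_spend.values())
    flagged = []
    for category, spend in category_spend.items():
        share = spend / total
        if share > threshold and any(p in category.lower() for p in irrelevant_patterns):
            flagged.append((category, round(share, 2)))
    return flagged

spend = {
    "banarasi silk sarees": 42_000,   # relevant, high spend: fine
    "free saree images": 8_000,       # irrelevant AND over threshold
    "kurti designs diy": 3_000,       # irrelevant but under threshold
    "cotton kurtis online": 27_000,   # relevant
}
print(flag_wasted_categories(spend, ["free", "diy", "images"]))
# [('free saree images', 0.1)]
```

Categories that match a pattern but sit under the threshold stay on a watch list rather than triggering an immediate negative.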
PMax vs Standard Shopping
When to use PMax and when to use Standard Shopping.
This is not an either/or decision. Most well-managed e-commerce accounts run both – the question is which products go where, and how budget splits between them.
| Factor | Performance Max | Standard Shopping |
|---|---|---|
| Best for | Broad discovery + cross-channel reach | Granular control + query-level visibility |
| Search term visibility | Limited – categories, not individual terms | Full – every query visible in reports |
| Ad surfaces | All – Search, Shopping, Display, YouTube, Gmail, Discover | Shopping tab + Search results only |
| Negative keywords | Account-level only (cannot add at campaign level) | Full control – campaign + ad group level |
| Data requirement | Needs 30+ conversions/month to learn effectively | Works well with lower conversion volumes |
| Budget threshold | ₹1L+/month for meaningful signal | ₹50K+/month viable |
| My recommendation | Add after Shopping has conversion history | Start here – build data, then layer PMax |
What I check in every PMax audit
The 12-point PMax audit I run before changing anything.
Brand exclusions active?
If not, PMax ROAS is inflated by brand queries.
Asset group segmentation
One generic group or category-specific groups?
Audience signal quality
Customer Match lists, remarketing, or just broad interests?
Listing group structure
All products in one group or filtered by category/margin?
Search term category report
Any irrelevant categories consuming >5% of spend?
Placement report review
Gaming apps, kids content, or junk Display sites in the mix?
View-through vs click conversions
What percentage of PMax conversions are view-through?
Asset performance ratings
How many assets are "Low" and need replacing?
Conversion value accuracy
Do PMax purchase values match GA4 and payment gateway?
Budget vs conversion volume
Is PMax getting enough conversions to exit learning?
PMax vs Shopping overlap
Are both campaign types bidding on the same products?
tROAS target vs actual margin
Is the tROAS set from margin data or from guesswork?
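That last audit point has a concrete formula behind it. Breakeven ROAS is the reciprocal of gross margin; the tROAS target then depends on how much of gross profit you are willing to hand to ads. A sketch – the 50% profit-share cap is an illustrative assumption:

```python
def margin_based_troas(gross_margin, max_ad_share_of_profit):
    """Derive tROAS from margin data instead of guesswork.

    gross_margin: e.g. 0.40 for a 40%-margin product tier.
    max_ad_share_of_profit: fraction of gross profit you allow
    ads to consume (0.5 = ads may eat half of gross profit).
    """
    breakeven = 1 / gross_margin  # ROAS at which ads eat ALL profit
    target = 1 / (gross_margin * max_ad_share_of_profit)
    return breakeven, target

breakeven, target = margin_based_troas(0.40, 0.5)
print(breakeven, target)  # 2.5 5.0
```

So a 40%-margin tier breaks even at 2.5× and needs a 5.0× tROAS to keep ad spend at half of gross profit – a guessed tROAS of 3× on that tier would quietly burn margin.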
What goes wrong
7 Performance Max Mistakes I Fix in Almost Every Audit
Launching PMax as the first campaign
PMax needs conversion data to optimise. On a fresh account with zero history, it has no signal to learn from. Start with Standard Shopping, build 30+ conversions per month, then layer PMax on top.
Using Maximise Conversions instead of tROAS
Maximise Conversions tells Google to get the most purchases regardless of value. It will happily drive 100 orders worth ₹200 each instead of 30 orders worth ₹3,000. For e-commerce, tROAS is almost always the correct bid strategy.
Not uploading Customer Match lists
Customer Match is the strongest audience signal available. A list of 1,000+ past purchasers gives Google a clear picture of who your buyer looks like. Without it, PMax spends weeks running Display impressions on people who will never buy.
Leaving all products in one listing group
One listing group means your hero 55%-margin product and your 12%-margin clearance item compete for the same budget equally. Segment by custom label (margin tier) so PMax pushes spend toward profitable products, not just popular ones.
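The custom-label assignment is a one-line margin calculation per SKU. A sketch, with the tier thresholds as illustrative assumptions – the label values themselves are arbitrary strings that you then use to split listing groups:

```python
def margin_tier_label(cost, price):
    """Assign a product-feed custom label by margin tier.

    Tier cut-offs (45% / 25%) are illustrative assumptions;
    set them from your own margin distribution.
    """
    margin = (price - cost) / price
    if margin >= 0.45:
        return "margin_high"
    if margin >= 0.25:
        return "margin_mid"
    return "margin_low"

print(margin_tier_label(cost=1350, price=3000))  # hero product, 55% margin -> margin_high
print(margin_tier_label(cost=2640, price=3000))  # clearance item, 12% margin -> margin_low
```

With the label populated in the feed (e.g. via a supplemental feed), each margin tier gets its own listing group and its own tROAS instead of competing for the same budget.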
Ignoring asset performance ratings
Google rates every asset as "Best," "Good," or "Low." Assets rated "Low" for 30+ days are actively hurting performance – they limit the ad combinations Google can serve and drag down Ad Strength. I review and replace low-performing assets monthly.
Setting tROAS too high on launch
An aggressive tROAS on a new PMax campaign restricts spend so much that it cannot exit the learning phase. I start 15–20% below the target, let PMax accumulate data for 2–3 weeks, then gradually tighten the target toward the real margin-based goal.
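That ramp can be written down as a simple schedule. A sketch, assuming a margin-based final target of 5.0×, a 20% launch discount, and three review steps – all illustrative parameters:

```python
def troas_ramp(final_target, launch_discount=0.20, steps=3):
    """Launch below the real tROAS target, then tighten in equal steps.

    launch_discount: start 15-20% below the margin-based target so
    the campaign can spend enough to exit the learning phase.
    Returns the tROAS to set at launch and at each review.
    """
    start = final_target * (1 - launch_discount)
    step = (final_target - start) / steps
    return [round(start + i * step, 2) for i in range(steps + 1)]

# Margin-based target of 5.0x, launched 20% lower, tightened over 3 reviews:
print(troas_ramp(5.0))  # [4.0, 4.33, 4.67, 5.0]
```

Each step change should wait for a stable window (2–3 weeks) so the tightening itself does not keep resetting learning.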
Judging PMax performance before the learning phase ends
PMax needs 6–8 weeks of stable data to fully optimise. Changing budgets, tROAS targets, or asset groups during weeks 1–3 resets the learning phase. I set expectations with every client before launch: the first 3 weeks are data collection, not performance evaluation. Real optimisation decisions start in week 4.
Go deeper
Related E-commerce Ads Guides
E-commerce Google Ads Overview
Full-funnel management – Shopping, PMax, and Search built around your actual margins.
Campaign Strategy
Structure, budget allocation, margin-based ROAS targets, and scaling roadmap.
Search for E-commerce
When Search campaigns add value alongside Shopping – category and competitor targeting.
Measurement & Tracking
GA4, GTM, enhanced conversions – verifying your data foundation.
Attribution Models
Why last-click misleads e-commerce decisions and which models work better.
Full Campaign Management
Done-for-you Google Ads – setup, weekly optimisation, and transparent reporting.
Common questions
Performance Max FAQ
Should I replace all my Shopping campaigns with PMax?
No – and I actively advise against it. Running Standard Shopping alongside PMax gives you a control group with full search term visibility. If PMax performance drops, you have a fallback that keeps revenue flowing while you diagnose the issue. My typical split is 30–40% of budget in Standard Shopping and 25–35% in PMax, with the rest across Brand Search, non-brand Search, and retargeting.
How long does PMax take to optimise?
Google officially says 2 weeks for the initial learning phase, but in practice, PMax needs 6–8 weeks of stable data to fully optimise. The first 2–3 weeks are signal collection – ROAS during this period is unreliable. Weeks 3–6 show early trends. By weeks 6–8, you have enough data to make meaningful optimisation decisions. I tell every client: do not judge PMax performance before week 4, and do not change budgets or tROAS during weeks 1–3.
Can you see what keywords PMax is bidding on?
Not individual keywords – Google shows search categories and themes rather than specific terms. However, using third-party scripts and the PMax Insights tab, I can extract enough data to identify broad patterns: which product categories are driving queries, what percentage of traffic is brand vs non-brand (critical for measuring true PMax value), and which search themes are converting. It is not full transparency, but it is enough to make informed negative keyword decisions at account level.
My PMax ROAS is 8ร โ why should I change anything?
Two questions to ask: Are brand exclusions active? And what percentage of conversions are view-through? If brand exclusions are off, PMax is counting cheap brand queries as its own wins. If 30% of conversions are view-through, a significant chunk may have happened anyway without PMax. I have audited multiple accounts where 8× "PMax ROAS" dropped to 3.5–4× once brand queries were excluded and view-through conversions were separated. That 4× number is the real one – and it might still be excellent. But you need to know the actual number to set correct tROAS targets.
What minimum budget does PMax need to work for e-commerce?
PMax needs 30+ conversions per month per campaign to exit the learning phase and deliver stable results. For most Indian e-commerce stores with average order values between ₹1,000 and ₹5,000, that means PMax needs at least ₹1,00,000/month in dedicated budget (not shared with other campaign types). Below ₹1L, Smart Bidding data is too thin and PMax will oscillate between learning phases without ever stabilising. At lower budgets, Standard Shopping is a better use of the money.
Free for e-commerce stores
Get a Free PMax Audit
I will check your brand exclusions, asset group structure, audience signals, placement quality, and conversion accuracy – and tell you exactly what your PMax ROAS looks like with the noise removed.