The Number That Means Nothing
Open any Amazon advertising dashboard and you will see a single ACOS (Advertising Cost of Sales: ad spend divided by ad-attributed revenue) number across all your campaigns. Sponsored Products, Sponsored Brands, Sponsored Display, all rolled into one figure. It looks authoritative. It is not.
Reporting a blended ACOS across SP, SB, and SD is like averaging the speed of a bicycle, a car, and an airplane. You will get a number. That number will be mathematically correct. And it will tell you absolutely nothing useful about how any of those vehicles is performing.
SP, SB, and SD have different attribution windows, different strategic purposes, different benchmark ranges, and different definitions of what counts as a "sale." When you blend them, you get a metric that cannot be compared to any individual benchmark and masks problems in every channel simultaneously.
This post breaks down each ad type, explains why their metrics are not interchangeable, and identifies the five most common analysis errors sellers make by ignoring the differences.
Sponsored Products: The Conversion Engine
Sponsored Products is where most Amazon ad dollars go, and for good reason. SP ads appear in search results and on product detail pages, targeted by keywords or ASINs. They reach shoppers at the moment of highest purchase intent: the search.
Attribution: 14-Day Click Only
This is the critical detail. A sale is attributed to an SP ad only if the customer clicked the ad and then purchased within 14 days. No view-through credit. No passive exposure. If the shopper saw your ad but did not click, SP records nothing.
This makes SP attribution the most conservative and most trustworthy measurement Amazon offers. When SP says a sale happened, a customer actually clicked your ad and then bought your product. There is a direct, traceable chain from ad impression to click to purchase.
Primary goal: conversion efficiency. SP is the workhorse. Most mature accounts allocate 60-80% of their total ad budget to Sponsored Products because it directly captures purchase intent.
Benchmarks to Know
- CTR: 0.2-0.5% for typical search placements. Below 0.2% means your main image, title, or price is not competitive on the results page.
- CVR: 8-15% for most categories. Below 8% indicates your product detail page is not closing the sale. Above 15% means your targeting is bringing highly qualified traffic.
- ACOS target: At or below your gross margin percentage; margin is your breakeven point. A 35% ACOS is excellent at 50% margins and catastrophic at 20% margins.
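The breakeven arithmetic is worth making explicit. A minimal sketch (the spend, sales, and margin figures are hypothetical, chosen to match the 35% ACOS example above):

```python
def acos(ad_spend: float, ad_sales: float) -> float:
    """ACOS: ad spend as a fraction of ad-attributed revenue."""
    return ad_spend / ad_sales

def ad_profit(ad_sales: float, gross_margin: float, ad_spend: float) -> float:
    """Profit on ad-attributed sales after ad cost.
    Positive while ACOS is below gross margin; zero at breakeven."""
    return ad_sales * gross_margin - ad_spend

spend, sales = 350.0, 1000.0            # ACOS = 350 / 1000 = 35%
print(acos(spend, sales))               # 0.35
print(ad_profit(sales, 0.50, spend))    # 150.0: profitable at 50% margin
print(ad_profit(sales, 0.20, spend))    # about -150: losing money at 20% margin
```

The same 35% ACOS flips from a $150 profit to a $150 loss purely on margin, which is why the target must be set per product, not per account.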
What SP Cannot Do
It cannot reach customers who are not actively searching. It cannot show video in search results (that is Sponsored Brands Video). And it does not report New-to-Brand metrics in standard campaign reports.
SP is the foundation. Everything else builds on top of it.
Sponsored Brands: Brand Building and New Customer Acquisition
Sponsored Brands ads appear at the top of search results, above all SP placements. They show a brand logo, a custom headline, and multiple products or a link to your Amazon Store. SB requires Brand Registry -- you cannot run these campaigns without it.
The dominant position matters. SB is the first thing a shopper sees when they search. It introduces your brand as a whole, not just a single product.
Attribution: 14-Day Click + 14-Day View-Through
Here is where the comparison to SP breaks down. SB attribution includes a view-through component: if a customer is shown your SB ad, does not click it, but purchases within 14 days, that sale can be attributed to the SB campaign.
The customer may have seen your banner at the top of search, scrolled past it, and bought your product a week later through an organic search. SB claims that sale. SP would not.
New-to-Brand Metrics: The Real SB Measurement
New-to-Brand (NTB) metrics are the most important SB measurement, and they are only available on SB and SD campaigns (not SP in standard reporting). NTB tracks what percentage of orders came from customers who had not purchased from your brand in the past 12 months.
NTB is what justifies SB's naturally higher ACOS. If you are spending more per sale on SB but 60% of those sales are from brand-new customers, you are paying for growth. That is a fundamentally different investment than SP, which mostly captures existing demand.
When NTB% drops below 20% on SB campaigns, your brand ads are mostly recapturing existing customers. You are paying SB premiums -- higher CPCs, prime banner placement -- to reach people who already know your brand and would likely have found you through organic search or SP anyway. At that point, SB is an expensive way to do what SP does more cheaply.
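One way to make the NTB argument concrete is to price each new customer rather than each order. A small sketch with hypothetical spend and order counts (the helper function is illustrative, not an Amazon-reported metric):

```python
def cost_per_ntb_order(spend: float, orders: int, ntb_rate: float) -> float:
    """Ad spend divided by new-to-brand orders only.
    ntb_rate is assumed to be a fraction (0.60 = 60% NTB)."""
    return spend / (orders * ntb_rate)

# Same spend and order volume, different NTB mix:
print(cost_per_ntb_order(500.0, 100, 0.60))  # about $8.33 per new customer
print(cost_per_ntb_order(500.0, 100, 0.15))  # about $33.33 per new customer
```

Identical ACOS on both campaigns, but the second one pays four times as much for each genuinely new customer.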
SB Benchmarks Compared to SP
- ACOS runs naturally higher than SP on the same keywords. Do not penalize SB for this.
- CTR is typically lower than SP (banner format versus in-line search result).
- NTB% above 50% is a strong signal that SB is doing its job.
Sponsored Brands Video: A Separate Animal
Sponsored Brands Video (SBV) lives under the SB umbrella in Amazon's campaign structure, but it deserves entirely separate tracking. Treating SBV the same as static SB is one of the five analysis errors covered below.
Format and Attribution
Format: Autoplay video (muted by default) that appears in the search results feed, in the same row as SP ads. It is not a banner. It is an in-feed video that captures attention through motion as shoppers scroll.
Attribution: 14-day click, with minimal view-through. This is a critical difference from static SB. SBV attribution behaves much more like SP than like its parent campaign type. When you see SBV performance numbers, they are closer to "real" click-driven attribution than the static SB numbers sitting next to them in the same report.
Amazon Science research found that brands running SBV alongside static Sponsored ads saw +25% higher CTR and +10% higher year-over-year sales growth. The video format earns attention that static banners do not.
Billing and Best Practice
Billing: CPC only. You pay when someone clicks, not when they view. This is different from display video formats that charge on impression.
Best practice: Product demos, problem-solution narratives, and lifestyle content work best. The autoplay format rewards visual motion that interrupts the scroll. Static product shots in a video ad waste the format's advantage.
The bottom line: SBV reports as Sponsored Brands in your campaign data, but its attribution window, CTR profile, and conversion behavior are closer to SP. If you lump SBV and static SB together, you are averaging two very different performers and learning nothing about either.
Sponsored Display: Retargeting and Audience Reach
Sponsored Display is the most misunderstood ad type on Amazon, and the one most likely to produce misleading metrics if you do not understand its attribution.
SD is not a search ad. It is closer to programmatic display advertising. SD ads appear on product detail pages (yours and competitors'), in some search placements, and off-Amazon entirely through the Amazon Publisher Network. Instead of targeting keywords, SD targets audiences: in-market shoppers, lifestyle segments, retargeting pools, or specific product pages.
Attribution: 14-Day Click + 14-Day View-Through (Most Aggressive)
SD's view-through attribution is the most aggressive in the Amazon Ads ecosystem. A customer who was served an SD ad anywhere -- even on a third-party website -- and later purchases within 14 days can have that purchase attributed to SD, even without ever clicking the ad.
A shopper visits a recipe blog. Your SD ad loads in a sidebar widget. The shopper never notices it, never clicks it. A week later, they search for your product on Amazon and buy it through an organic search result. SD claims that sale.
Why an 8x SD ROAS Is Probably Not What It Looks Like
SD dashboards often show impressive ROAS numbers. An 8x ROAS sounds extraordinary. But that number almost certainly includes a large share of view-through attributed sales -- purchases that would have happened anyway, made by customers who were already in your buying cycle and happened to be served an SD impression somewhere along the way.
When someone shows you an SD ROAS figure, your first question should be: "Does that include view-through?"
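If you can separate click-attributed revenue from total attributed revenue, the gap is trivial to quantify. A sketch with hypothetical figures (an $8,000 total-attributed campaign where only $2,000 followed a click):

```python
def roas(revenue: float, spend: float) -> float:
    """ROAS: attributed revenue per dollar of ad spend."""
    return revenue / spend

spend = 1000.0
total_revenue = 8000.0   # includes view-through attributed sales
click_revenue = 2000.0   # sales that actually followed a click

print(roas(total_revenue, spend))  # 8.0 -- the dashboard number
print(roas(click_revenue, spend))  # 2.0 -- click-attributed only
```

The dashboard shows 8x; the click-driven reality in this example is 2x. The other 6x is view-through credit of unknown incrementality.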
SD Benchmarks (Fundamentally Different From SP)
- CTR: 0.05-0.15%. This is not a problem to fix. Display ads are not intent-based; lower CTR is structural to the format.
- CVR at click: Lower than SP because the traffic is less intent-driven.
- ROAS: Looks inflated due to view-through. Interpret with significant caution.
- Retargeting SD (reaching people who viewed your product page but did not buy) typically outperforms prospecting SD by a wide margin.
Best Use Cases for SD
- Retargeting your product page viewers who did not purchase. These shoppers already expressed interest. This is the highest-quality SD audience.
- Competitor product page conquesting. Show your ad on rival listings. Expect lower CVR than retargeting.
- Cross-selling. Show complementary products to recent buyers of related items.
- Category awareness. Reach in-market audiences before they search your specific keywords.
SD should never be evaluated against the same ACOS target as SP. It is a different tool for a different job.
The Attribution Comparison: Why These Numbers Cannot Be Blended
Here is the core of the problem in one table:
| Ad Type | Click Attribution | View-Through Attribution | Data Settled After |
|---|---|---|---|
| Sponsored Products | 14 days | None | 14 days |
| Sponsored Brands (static) | 14 days | 14 days (some placements) | 14 days |
| Sponsored Brands Video | 14 days | Minimal | 14 days |
| Sponsored Display | 14 days | 14 days (most aggressive) | 14 days |
SP counts only sales that followed a click. SB counts some sales where the customer just saw the ad. SD counts sales where the customer was passively served an impression they may never have noticed.
When you average ACOS across all three, you are combining a strict click-based metric with two progressively looser attribution models. The resulting number cannot be benchmarked against SP ranges (too generous), SB ranges (wrong mix), or SD ranges (too strict). It sits in a no-man's-land that is meaningless for decision-making.
Budget Allocation: Where the Money Should Go
A starting framework for a mature account:
| Ad Type | Budget Share | Primary Goal |
|---|---|---|
| Sponsored Products | 60-80% | Conversion efficiency, sales volume |
| Sponsored Brands | 10-25% | Brand awareness, NTB acquisition, search dominance |
| Sponsored Display | 5-15% | Retargeting, competitor conquesting, cross-sell |
For new accounts or new product launches: Concentrate 85-95% of budget in SP until conversion rates and keyword performance are established. SB and SD require a baseline of SP performance data to be meaningful. You cannot optimize brand awareness campaigns if you do not yet know which keywords convert. You cannot retarget product page viewers if you do not yet have enough traffic to build a retargeting pool.
Once SP performance is stable and you have at least 30-60 days of keyword and conversion data, begin layering in SB (starting with your highest-volume keywords) and SD (starting with retargeting your own product page viewers).
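The allocation framework above can be sketched as a simple splitter. The specific shares below are illustrative picks from within the table's ranges, not prescriptions:

```python
def budget_split(total: float, phase: str) -> dict:
    """Split a total ad budget by phase.
    Shares are hypothetical examples within the ranges discussed above."""
    shares = {
        "launch": {"SP": 0.90, "SB": 0.07, "SD": 0.03},  # SP-heavy start
        "mature": {"SP": 0.70, "SB": 0.18, "SD": 0.12},  # layered account
    }
    return {ad_type: round(total * share, 2)
            for ad_type, share in shares[phase].items()}

print(budget_split(10000.0, "mature"))  # SP gets $7,000 of a $10,000 budget
```

The point is not the exact percentages but the ordering: SP funds the data that makes SB and SD targeting decisions possible.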
Five Analysis Errors That Ruin Your Reporting
1. Reporting Blended ACOS as "The ACOS"
A single blended number across SP, SB, and SD cannot be compared to any benchmark. SP, SB, and SD all have different natural efficiency ranges. A blended ACOS of 28% could mean SP is at 22% (healthy), SB is at 40% (possibly healthy depending on NTB%), and SD is at 55% (normal for display). Or it could mean all three are struggling equally. The blended number tells you nothing about which interpretation is correct.
Fix: Always report ACOS by ad type. If you must report a blended number, label it explicitly as blended and include the per-type breakdown alongside it.
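To see how the blend hides the breakdown, here is the scenario above in code, with hypothetical spend and sales figures chosen to produce roughly those per-type numbers:

```python
# Hypothetical per-type spend and ad-attributed sales:
campaigns = {
    "SP": {"spend": 2200.0, "sales": 10000.0},  # ACOS 22% -- healthy
    "SB": {"spend": 800.0,  "sales": 2000.0},   # ACOS 40% -- depends on NTB%
    "SD": {"spend": 550.0,  "sales": 1000.0},   # ACOS 55% -- normal for display
}

for name, c in campaigns.items():
    print(f"{name} ACOS: {c['spend'] / c['sales']:.1%}")

blended = (sum(c["spend"] for c in campaigns.values())
           / sum(c["sales"] for c in campaigns.values()))
print(f"Blended ACOS: {blended:.1%}")  # about 27.3% -- matches no benchmark
```

Three channels with three very different stories collapse into one number that describes none of them.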
2. Flagging SD "High ACOS" as a Problem
SD is not a conversion channel. Evaluating it on ACOS is like grading a billboard on how many people walked into the store while looking at it. SD's job is awareness, retargeting, and audience reach. A 60% ACOS on SD retargeting is not necessarily bad if the campaign is recovering otherwise-lost shoppers.
Fix: Evaluate SD on click-attributed ROAS (not total ROAS), retargeting conversion rate, and incremental sales lift. Compare SD to its own benchmarks, not to SP benchmarks.
3. Crediting SB With Strong ROAS Without Noting View-Through
An SB campaign showing a 5x ROAS looks strong. But if a meaningful share of that revenue comes from view-through attribution -- customers who saw the banner and bought later without clicking -- the actual ad-driven ROAS is lower. You may be crediting SB with sales that organic search, SP campaigns, or direct navigation actually drove.
Fix: Always note whether SB metrics include view-through attributed sales. Where possible, compare click-only ROAS to total ROAS to understand the gap.
4. Not Separating SBV From Static SB
Sponsored Brands Video and static Sponsored Brands have different CTR expectations, different conversion behavior, different cost structures, and different attribution profiles. SBV attribution is closer to SP (click-dominant), while static SB includes more view-through. Blending them produces averages that misrepresent both formats.
Fix: Track SBV campaigns separately from static SB. In naming conventions, use distinct prefixes (e.g., SBV_ versus SB_). Report their metrics in separate rows.
5. Ignoring NTB% on SB Campaigns
An SB campaign with a 15% New-to-Brand rate is mostly recapturing existing customers. You are paying the premium of top-of-search brand placement to reach people who already buy from you. SP does this more cheaply. If NTB% is consistently below 20%, your SB campaigns are not delivering on their primary strategic value: acquiring new customers.
Fix: Track NTB% on every SB campaign. Set a minimum NTB% threshold (50% is a good target for growth-oriented campaigns). If NTB% falls below your threshold, examine targeting -- you may be over-indexing on branded keywords or too-narrow audiences.
The ad types in Amazon Ads exist for different reasons, measure success differently, and operate on different attribution models. Treating them as interchangeable and collapsing their metrics into a single number is the most common analytical mistake in Amazon advertising.
- Compare SP to SP benchmarks. Click-only attribution makes it the most trustworthy efficiency metric.
- Compare SB to SB benchmarks with NTB context. Higher ACOS is justified when NTB% is strong.
- Compare SD to SD benchmarks with view-through noted. Click-attributed ROAS is the only reliable efficiency number.
- Track SBV separately from static SB. They behave like different ad types despite sharing a campaign category.
- Always report by ad type first. Blended numbers are acceptable only as a supplement with full per-type breakdowns alongside.
The sellers who get this right do not necessarily spend more. They spend with clarity about what each dollar is doing, which channel is doing it, and whether the measurement system is even counting the same thing. That clarity is the difference between optimizing and guessing.