
Thread Transfer

ASC vs Manual Campaigns: A $100k Test Breakdown

Everyone has opinions on ASC vs manual. We have data. $100k across 90 days revealed surprising patterns about when each approach wins.

Jorgo Bardho

Founder, Meta Ads Audit

May 11, 2025 · 15 min read

meta ads · ASC · manual campaigns · campaign comparison · ad testing · ROAS
[Image: Side-by-side performance comparison of ASC vs manual campaigns]

Everyone has opinions about Advantage+ Shopping Campaigns vs manual campaigns. Agency owners swear by ASC. Performance marketers defend manual control. Meta's reps push automation. But opinions aren't data. So we ran a controlled test: $100,000 in ad spend over 90 days, split between ASC and manual campaigns with identical products, creative, and objectives. Here's exactly what happened.

The Test Setup

We designed this test to eliminate as many variables as possible:

Test Parameters

| Parameter | Value |
| --- | --- |
| Total spend | $100,000 |
| Duration | 90 days |
| Budget split | 50/50 ($50k each) |
| Account | E-commerce (fashion), $80 AOV |
| Historic performance | 500+ purchases/month, mature pixel |
| Creative | Identical assets across both campaign types |
| Products | Same catalog, same product sets |
| Geography | United States only |

ASC Configuration

  • Existing customer cap: 20%
  • All Advantage+ features enabled (placements, creative, audience)
  • 10 creative concepts uploaded
  • Dynamic catalog ads enabled
  • Optimization: Purchases

Manual Campaign Configuration

  • 5 ad sets: Lookalike 1%, Lookalike 3%, Broad, Interest-based, Retargeting
  • Manual placements: Feed + Stories only
  • Advantage+ Creative disabled
  • Same 10 creative concepts
  • Dynamic catalog ads enabled
  • Optimization: Purchases

Overall Results

After 90 days and $100k in spend, here's how the two approaches performed:

| Metric | ASC | Manual | Winner | Difference |
| --- | --- | --- | --- | --- |
| Total Spend | $50,000 | $50,000 | - | - |
| Purchases | 1,847 | 1,623 | ASC | +13.8% |
| CPA | $27.07 | $30.81 | ASC | -12.1% |
| ROAS | 3.42x | 3.18x | ASC | +7.5% |
| Revenue | $171,000 | $159,000 | ASC | +7.5% |
| CPM | $9.82 | $12.45 | ASC | -21.1% |
| CTR | 1.24% | 1.51% | Manual | +21.8% |
| Conversion Rate | 2.31% | 2.67% | Manual | +15.6% |

Top-line result: ASC won on efficiency metrics (CPA, ROAS, volume). Manual won on quality metrics (CTR, conversion rate). ASC delivered 224 more purchases at $3.74 lower CPA, generating $12,000 more revenue on identical spend.
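The percentage deltas in the table follow directly from the raw counts. A quick sketch of the arithmetic, using the figures reported above:

```python
# Recompute the headline metrics from the raw test numbers in the results table.
def cpa(spend, purchases):
    return spend / purchases

def roas(revenue, spend):
    return revenue / spend

asc_cpa = cpa(50_000, 1_847)         # ~$27.07
manual_cpa = cpa(50_000, 1_623)      # ~$30.81
asc_roas = roas(171_000, 50_000)     # 3.42x
manual_roas = roas(159_000, 50_000)  # 3.18x

# Relative differences quoted in the table
purchase_lift = (1_847 - 1_623) / 1_623          # ~+13.8%
cpa_delta = (asc_cpa - manual_cpa) / manual_cpa  # ~-12.1%
roas_lift = (asc_roas - manual_roas) / manual_roas  # ~+7.5%
```

Running the same arithmetic on your own account's numbers is the fastest way to sanity-check a platform-reported comparison.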

Week-by-Week Performance

The aggregate numbers don't tell the whole story. Performance evolved significantly over the 90-day period:

Weeks 1-2: Manual Led

Manual campaigns started stronger. With established audiences and proven creative, manual delivered $26.40 CPA vs ASC's $34.80. ASC was in learning phase, exploring broadly and burning budget on low-intent users.

| Weeks 1-2 | ASC | Manual |
| --- | --- | --- |
| Spend | $11,100 | $11,100 |
| Purchases | 319 | 420 |
| CPA | $34.80 | $26.40 |

Weeks 3-6: ASC Caught Up

ASC exited learning phase around day 18. Performance improved dramatically as the algorithm found its footing. By week 6, ASC matched manual efficiency while delivering more volume.

| Weeks 3-6 | ASC | Manual |
| --- | --- | --- |
| Spend | $22,200 | $22,200 |
| Purchases | 756 | 698 |
| CPA | $29.37 | $31.80 |

Weeks 7-12: ASC Pulled Ahead

In the final six weeks, ASC's advantage compounded. The algorithm had learned the account thoroughly, finding pockets of efficiency that manual targeting couldn't match. Manual campaigns showed signs of audience fatigue (rising frequency, declining CTR).

| Weeks 7-12 | ASC | Manual |
| --- | --- | --- |
| Spend | $16,700 | $16,700 |
| Purchases | 772 | 505 |
| CPA | $21.63 | $33.07 |
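The three periods reconcile with the 90-day totals, and the per-period CPAs make the crossover explicit. A short sketch using the period tables above:

```python
# Per-period figures from the week-by-week tables (spend is per campaign type).
periods = {
    "weeks 1-2":  {"spend": 11_100, "asc": 319, "manual": 420},
    "weeks 3-6":  {"spend": 22_200, "asc": 756, "manual": 698},
    "weeks 7-12": {"spend": 16_700, "asc": 772, "manual": 505},
}

# Periods should sum to the 90-day totals ($50k and 1,847 / 1,623 purchases)
total_spend = sum(p["spend"] for p in periods.values())
asc_total = sum(p["asc"] for p in periods.values())
manual_total = sum(p["manual"] for p in periods.values())

# Period CPAs show the trajectory: manual leads early, ASC wins late
asc_cpa_by_period = {k: p["spend"] / p["asc"] for k, p in periods.items()}
manual_cpa_by_period = {k: p["spend"] / p["manual"] for k, p in periods.items()}
```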

New vs Existing Customer Analysis

We tracked whether conversions came from new or existing customers:

ASC Customer Breakdown

  • New customers: 78% of purchases
  • Existing customers: 22% of purchases
  • New customer CPA: $29.85
  • Existing customer CPA: $17.32

Manual Customer Breakdown

  • New customers: 71% of purchases
  • Existing customers: 29% of purchases
  • New customer CPA: $34.40
  • Existing customer CPA: $21.55

The existing customer cap in ASC worked. ASC held existing-customer spend near its 20% cap, while manual campaigns, which had no restriction, spent 29% of budget on existing customers.

Key insight: ASC delivered more new customers at a lower cost than manual campaigns. The $29.85 new customer CPA beat manual's $34.40—a 13% advantage.
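The overall CPA is just a weighted average of the new- and existing-customer CPAs, which is why the customer-type breakdown matters when comparing blended numbers. A sketch using the figures above (small gaps versus the reported overall CPAs come from rounding in the published shares):

```python
def blended_cpa(new_share, new_cpa, existing_cpa):
    """Overall CPA as a mix of new- and existing-customer CPAs."""
    return new_share * new_cpa + (1 - new_share) * existing_cpa

asc_blend = blended_cpa(0.78, 29.85, 17.32)     # ~$27.09, near the ~$27.07 overall
manual_blend = blended_cpa(0.71, 34.40, 21.55)  # ~$30.67, near the ~$30.81 overall
```

This decomposition also shows why unrestricted retargeting flatters blended ROAS: cheap existing-customer conversions pull the average down without adding new buyers.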

Creative Performance Comparison

We uploaded identical creative to both campaign types. Here's how the same assets performed:

Top-Performing Creative

The best-performing creative concept (lifestyle video showing product in use) performed as follows:

| Metric | ASC | Manual |
| --- | --- | --- |
| Spend | $8,400 | $6,200 |
| CPA | $22.10 | $24.80 |
| Share of total spend | 16.8% | 12.4% |

ASC allocated more budget to the winner faster. The algorithm identified the top performer and scaled spend aggressively, while manual campaigns distributed budget more evenly across creative.

Creative Variation Testing

ASC with Advantage+ Creative enabled generated AI variations of our uploads. Several variations outperformed originals:

  • Text overlay variation: 18% lower CPA than original
  • Aspect ratio adaptation (9:16 for Reels): 12% lower CPA in Stories/Reels
  • Auto-brightness adjustment: 8% higher CTR

Manual campaigns used only our original creative with no AI variations. This partly explains ASC's efficiency advantage—it found optimizations we didn't create.

Placement Distribution

ASC distributed budget across all placements automatically. Manual campaigns used only Feed and Stories:

ASC Placement Breakdown

| Placement | Spend | CPA |
| --- | --- | --- |
| Facebook Feed | 32% | $28.40 |
| Instagram Feed | 28% | $25.60 |
| Instagram Stories | 18% | $27.20 |
| Instagram Reels | 12% | $24.80 |
| Facebook Reels | 6% | $31.40 |
| Audience Network | 4% | $34.60 |

ASC kept Audience Network spend low (4%) despite having access. Instagram Reels at 12% delivered the best CPA—a placement we excluded from manual campaigns.
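One useful check on a placement table like this is the spend-weighted CPA, which should land close to the campaign's overall CPA (here ~$27.40 versus the reported $27.07; the gap is rounding in the published spend shares). A sketch:

```python
# (spend share, CPA) per placement, from the ASC placement table
placements = {
    "Facebook Feed":     (0.32, 28.40),
    "Instagram Feed":    (0.28, 25.60),
    "Instagram Stories": (0.18, 27.20),
    "Instagram Reels":   (0.12, 24.80),
    "Facebook Reels":    (0.06, 31.40),
    "Audience Network":  (0.04, 34.60),
}

# Shares should cover the whole budget
share_total = sum(share for share, _ in placements.values())

# Spend-weighted CPA across placements
blended = sum(share * cpa for share, cpa in placements.values())

# Cheapest placement by CPA
best = min(placements, key=lambda name: placements[name][1])
```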

What We Learned

Learning 1: ASC Needs Time

Manual campaigns started stronger because they didn't need to learn. If we'd judged ASC after 2 weeks, we'd have concluded it failed. The 90-day timeframe revealed ASC's true potential. Advertisers who give up on ASC after 2-3 weeks miss the long-term advantage.

Learning 2: ASC Scales Better

In weeks 7-12, manual campaigns showed fatigue. Frequency increased, CTR decreased, CPA rose. ASC continued finding fresh audiences. For ongoing campaigns, ASC's exploration advantage compounds over time.

Learning 3: Existing Customer Cap Works

We worried ASC would over-index on existing customers. The 20% cap held. Manual campaigns, which had no cap at all, actually spent more on existing customers (29%). The cap is an effective guardrail.

Learning 4: Creative Automation Adds Value

Advantage+ Creative's variations contributed meaningfully to ASC's advantage. The text overlay and aspect ratio adaptations improved performance in ways we wouldn't have tested manually.

Learning 5: ASC Isn't Universally Better

This test used a fashion e-commerce account with broad appeal, mature pixel data, and strong creative library. These conditions favor ASC. A niche B2B account or new advertiser without conversion history might see different results.

When to Use ASC vs Manual

Based on this test and our broader experience, here's our recommendation:

Use ASC When:

  • You have 50+ weekly conversions (mature pixel data)
  • Your product appeals broadly across demographics
  • You have diverse creative assets (5+ concepts)
  • You're running for 4+ weeks (time to exit learning)
  • You want to reduce manual optimization overhead
  • You're hitting scale ceilings with manual campaigns

Use Manual When:

  • You're a new account without conversion history
  • You're targeting niche B2B audiences
  • You need specific placement control (brand safety, etc.)
  • You have limited creative (1-2 assets)
  • You're running short-term campaigns (under 2 weeks)
  • You need granular testing with controlled variables

The Hybrid Approach

For many advertisers, the optimal setup combines both:

  • ASC (60-70% of budget): Broad prospecting, scale, and ongoing optimization
  • Manual (30-40% of budget): Specific audience tests, retargeting, and experiments

Use audience exclusions to prevent overlap. ASC handles the bulk of prospecting while manual campaigns test hypotheses and reach segments ASC might miss.
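As a trivial worked example of the split above (the 65% midpoint is an assumption, not a magic number):

```python
def hybrid_split(monthly_budget, asc_share=0.65):
    """Split a budget between ASC and manual per the 60-70% / 30-40% guideline.
    asc_share=0.65 is the midpoint of the recommended range."""
    asc = monthly_budget * asc_share
    return {"asc": asc, "manual": monthly_budget - asc}

split = hybrid_split(10_000)  # {'asc': 6500.0, 'manual': 3500.0}
```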

Replication Guide

If you want to run your own ASC vs manual test, here's how:

Test Design

  1. Commit to at least 4 weeks (8+ weeks is better for statistical significance)
  2. Allocate equal budget to each approach
  3. Use identical creative across both campaign types
  4. Set a meaningful existing customer cap in ASC (15-25%)
  5. Track both in-platform metrics and downstream metrics (LTV, returns)
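One simple way to check whether a purchase-count gap like ours is real rather than noise, assuming equal spend per arm: under the null hypothesis that both arms convert at the same rate, each purchase is equally likely to come from either arm, so the count difference can be tested with a normal approximation. This is a sketch, not a substitute for a proper power analysis:

```python
import math

def purchase_count_z(n_a, n_b):
    """Z-score for equal-spend arms: under H0, n_a ~ Binomial(n_a + n_b, 0.5),
    so (n_a - n_b) / sqrt(n_a + n_b) is approximately standard normal."""
    return (n_a - n_b) / math.sqrt(n_a + n_b)

# Our test: 1,847 ASC purchases vs 1,623 manual purchases
z = purchase_count_z(1_847, 1_623)  # ~3.8, well past the ~1.96 cutoff for p < 0.05
```

With a z-score near 3.8, the 224-purchase gap in our test is very unlikely to be chance; at a few hundred dollars of spend, the same gap in percentage terms usually isn't significant.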

What to Measure

  • CPA (overall and by customer type)
  • ROAS (in-platform and actual)
  • New customer acquisition rate
  • Creative performance by variant
  • Placement distribution and efficiency
  • Performance trajectory over time

Common Pitfalls

  • Ending test too early (before ASC exits learning)
  • Using different creative across campaign types
  • Ignoring new vs existing customer breakdown
  • Comparing blended ROAS without accounting for customer type

Key Takeaways

  • ASC outperformed manual campaigns: 12% lower CPA, 14% more conversions, 8% higher ROAS
  • Manual won early (weeks 1-2), ASC caught up and pulled ahead (weeks 3+)
  • ASC delivered better new customer acquisition at lower cost
  • Existing customer cap (20%) effectively limited retargeting spend
  • Advantage+ Creative variations contributed meaningfully to performance
  • Results favor accounts with mature data, broad appeal, and diverse creative
  • The hybrid approach—ASC for scale, manual for testing—works for many advertisers

FAQ

Would results differ for a smaller budget?

Possibly. ASC needs conversion volume to learn effectively. With $500/month instead of $50k/month, learning takes longer and results may be less conclusive. The principles hold, but timeline extends.

Should I switch entirely to ASC?

Not necessarily. Our hybrid recommendation (60-70% ASC, 30-40% manual) gives you ASC's efficiency while maintaining manual campaigns for specific tests and segments. Complete consolidation may leave you blind to opportunities ASC doesn't pursue.

How do I know if ASC isn't working for my account?

Give it 4+ weeks and 100+ conversions. If CPA is still 20%+ above your manual baseline after the learning phase, ASC may not fit your account. Before concluding, check whether the problem is existing-customer over-spend (adjust the cap) or poor placement distribution (which ASC won't let you override).

Does this apply to lead gen, not just e-commerce?

ASC is designed for e-commerce (Sales objective). For lead gen, you're using standard campaigns with Advantage+ features, not ASC specifically. The Advantage+ Audience and Creative learnings apply, but the full ASC automation is commerce-focused.

What's the minimum budget for ASC?

Budget enough for 50+ weekly conversions at your target CPA. If your target CPA is $30, that's $1,500/week minimum ($214/day). Below this threshold, learning phase extends indefinitely and data is too noisy for meaningful optimization.
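The minimum-budget arithmetic above generalizes to any target CPA; a tiny sketch:

```python
def min_weekly_budget(target_cpa, min_weekly_conversions=50):
    """Weekly and daily spend needed to clear the ~50-conversion/week
    learning threshold at a given target CPA."""
    weekly = target_cpa * min_weekly_conversions
    return {"weekly": weekly, "daily": round(weekly / 7)}

budget = min_weekly_budget(30)  # {'weekly': 1500, 'daily': 214}
```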