Yerba Mate Ecomm – Meta Ads


ROAS: 1.4x
Revenue: $154,000
Ad spend: $110,000
Period: 1 month

Background

An e-commerce brand selling yerba mate products positioned as premium.

Product pricing was significantly higher than competitors', without a clear functional or quality advantage to justify the price difference.

The project was short-term and intensive:

  • 1-month collaboration
  • very high daily budgets
  • focus on aggressive creative and media testing

Business Context & Challenge

Key challenges were business-side, not technical:

  • product price ~3× higher than competitors'
  • low repeat purchase potential
  • weak product-market fit at scale
  • profitability already below target before takeover

At this point, paid ads were expected to solve a business problem, not just scale demand.

Budget & Scale

  • Monthly ad spend: ~$110,000
  • Daily spend: ~$3,000–3,500
  • Period: 1 month

This required fast decision-making and rapid testing, not slow optimization cycles.

Creative Strategy (Main Focus of the Project)

Because of scale and price resistance, the strategy focused on creative volume and angle testing.

  • ~120 static ads prepared at launch
  • multiple video and animated formats
  • heavy rotation to avoid creative fatigue

This case was designed as a creative stress test, not a “find one winner” setup.

Creative Angles Tested

Five main messaging angles were tested in parallel:

  1. Weight loss
  2. Energy boost
  3. Health & wellbeing
  4. Yerba mate vs coffee
  5. Lifestyle / daily habit

Each angle was treated as a separate hypothesis.

Campaign Structure


Main Creative Testing Campaign

  • Single large testing campaign
  • Broad targeting
  • Majority of daily budget allocated here

For each creative angle:

  • separate ad sets
  • two main formats tested:
    • 4:5 format with direct link to website
    • 1:1 format opening the post first, then linking to the website from the caption

This allowed testing not only the creatives themselves but also click behavior and the quality of user intent.
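As a rough illustration of the resulting test matrix, the sketch below crosses the five angles with the two formats; the names mirror the case, but the data layout itself is only an assumption about how the ad sets were organized, not the actual account structure.

```python
# Illustrative sketch only: the angle x format test matrix described above.
# Names mirror the case; the data layout itself is an assumption.
angles = [
    "weight loss",
    "energy boost",
    "health & wellbeing",
    "yerba mate vs coffee",
    "lifestyle / daily habit",
]
formats = [
    "4:5, direct link to website",
    "1:1, post-first, link in caption",
]

test_cells = [{"angle": a, "format": f} for a in angles for f in formats]
print(len(test_cells))  # 10 angle/format combinations inside one testing campaign
```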

Additional Campaigns (Attribution & Bidding Tests)

Two additional campaigns were run in parallel using existing, proven creatives (a configuration sketch follows this list):

  • Attribution window changed to 7-day click only (no 1-day view)
  • Hypothesis: reduce frequency and push delivery toward new audiences
  • Bidding strategy tests:
    • Target ROAS
    • Highest volume
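For reference, the sketch below shows roughly how these settings map onto ad set fields in Meta's Marketing API via the facebook_business Python SDK. The token and ad set ID are placeholders, field names can shift between API versions, and the actual tests were configured through Ads Manager, so treat this as an assumption-laden illustration rather than the real setup.

```python
# Rough sketch (facebook_business SDK assumed; token and ad set ID are
# placeholders; field names may vary by API version). Shows how the
# 7-day-click attribution window and the tested bid strategies map onto
# ad set fields.
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adset import AdSet

FacebookAdsApi.init(access_token="<ACCESS_TOKEN>")

ad_set = AdSet("<AD_SET_ID>")

# Attribution test: 7-day click only, no 1-day view.
ad_set.api_update(params={
    AdSet.Field.attribution_spec: [
        {"event_type": "CLICK_THROUGH", "window_days": 7},
    ],
})

# Bidding tests, using the Ads Manager names from the list above:
# "Highest volume" maps to LOWEST_COST_WITHOUT_CAP; "Target ROAS" maps to
# LOWEST_COST_WITH_MIN_ROAS and additionally requires a ROAS floor constraint.
ad_set.api_update(params={
    AdSet.Field.bid_strategy: "LOWEST_COST_WITHOUT_CAP",
})
```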

These setups were tested to counter:

  • rising frequency
  • declining ROAS caused by audience saturation

Results

  • Ad spend: $110,000
  • Revenue: $154,000
  • ROAS: 1.4x (calculation sketched below)
  • Performance remained similar to previous setups
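
The reported ROAS follows directly from the two figures above; the snippet below is just that arithmetic, with no additional data assumed.

```python
# ROAS = revenue / ad spend, using the reported figures
revenue = 154_000   # USD
ad_spend = 110_000  # USD
roas = revenue / ad_spend
print(f"ROAS = {roas:.2f}x")  # -> ROAS = 1.40x
```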

Despite extensive testing:

  • ROAS did not improve meaningfully
  • performance stayed in line with historical results and with competitor benchmarks

Why It Didn’t Scale

This case clearly showed that:

  • creative volume cannot compensate for weak pricing
  • high frequency hurts expensive, low-repeat products
  • media buying cannot fix poor unit economics

At scale, the bottleneck was product and pricing, not ads.

Takeaway

This case demonstrates large-scale creative testing, structured and hypothesis-driven media buying, and the ability to identify when performance limitations come from the business model rather than from advertising execution.