Key Takeaways
- Every active ad on Meta is publicly visible in the Ad Library. A free MCP server connects Claude directly to this data so you can search, analyze, and score any brand's ads automatically.
- 5 public signals reveal which ads are printing money even though Meta hides performance data: runtime (30+ days = profitable), variant count, geographic expansion, survival rate across testing rounds, and format diversity.
- The remix system keeps the psychological trigger from winning competitor ads but swaps the product context, claims, and visuals to create original ads for your brand. 5 variations per winner, all distinct.
- A weekly scheduled task automates the entire cycle: scan competitors Monday morning, identify new winners, compare to last week, and deliver fresh angles before coffee.
- Total cost: $65/month (Claude Pro $20 + HeyOz $45). The MCP server, ScrapeCreators API, and Gemini API all have free tiers. Compare to $249/month for Foreplay alone or $3,000-5,000/month for an agency.
Introduction
I stopped brainstorming ads 3 months ago. Now I steal winning angles from the Meta Ad Library — legally, because every ad on the platform is public data.
Most people browse the Ad Library, get overwhelmed by thousands of ads, and go back to guessing. I built a system that finds the winners automatically, analyzes why they work, remixes the angles for my products, and produces finished ads. All for $65/month.
This guide gives you the complete system: the MCP server setup, the winner-detection criteria, the analysis prompts, the remix prompts, the weekly automation, and the production workflow. Focus is on static ads — they are faster to analyze, remix, and produce than video.
How Does This System Work?
Six steps running on a weekly cycle:
- Step 1 — Connect Claude to the Meta Ad Library via a free MCP server that gives Claude direct access to search and analyze any brand's active ads.
- Step 2 — Define what winning looks like using 5 public signals that reveal profitability.
- Step 3 — Let Claude scrape and categorize every active competitor ad, grouped by hook type and ranked by the winner criteria.
- Step 4 — Remix each winning angle: Claude keeps the psychological trigger but swaps the context, generating 5 variations per winner.
- Step 5 — Automate it with a scheduled task that runs every Monday morning.
- Step 6 — Produce the ads by pasting the remixed angles into HeyOz or your design tool.
What Do You Need Before Starting?
- Claude Pro ($20/month) at claude.ai.
- Claude Desktop app from claude.ai/download.
- Python 3 from python.org (the MCP server installs with pip; the setup steps use a virtual environment).
- A ScrapeCreators API key (free tier at scrapecreators.com — used by the MCP to access Ad Library data).
- A Google Gemini API key (free tier at aistudio.google.com — used for AI image analysis of ad creatives).
- HeyOz ($44.99/month at heyoz.com) for producing finished static ads, or use Canva/Figma.
Total: $65/month with the free API tiers.
How Do You Install the Meta Ad Library MCP Server?
The MCP server is TryPeggy's facebook-ads-library-mcp, an open-source tool at github.com/trypeggy/facebook-ads-library-mcp. It connects Claude directly to Meta's public Ad Library.
Get your API keys: Go to scrapecreators.com, create a free account, and copy your API key from the dashboard. Then go to aistudio.google.com, sign in with Google, click Get API Key, and create one.
Install the server: Open your terminal and run git clone https://github.com/trypeggy/facebook-ads-library-mcp.git, then cd into the folder, create a virtual environment with python3 -m venv venv, activate it, and run pip install -r requirements.txt. Create a .env file in the repo root containing your SCRAPECREATORS_API_KEY and GOOGLE_GEMINI_API_KEY.
Connect to Claude Desktop: Edit the config file at ~/Library/Application Support/Claude/claude_desktop_config.json (Mac) or %APPDATA%\Claude\claude_desktop_config.json (Windows). Add an mcpServers block with the facebook-ads-library server pointing to your cloned repository path, with your API keys as environment variables. Save and restart Claude Desktop.
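The exact server entry depends on the repo's README — the entry point filename, paths, and key names below are illustrative assumptions, not verified values, so check them against the repository before saving. A plausible mcpServers block looks like this:

```json
{
  "mcpServers": {
    "facebook-ads-library": {
      "command": "/path/to/facebook-ads-library-mcp/venv/bin/python",
      "args": ["/path/to/facebook-ads-library-mcp/server.py"],
      "env": {
        "SCRAPECREATORS_API_KEY": "your_scrapecreators_key",
        "GOOGLE_GEMINI_API_KEY": "your_gemini_key"
      }
    }
  }
}
```

Replace both paths with the absolute path to your cloned repo, and the placeholder values with your real keys.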
Test by asking Claude: Search the Meta Ad Library for ads from [any brand] that are currently active. If it returns results, the MCP is working.
Alternative if you do not want to install the MCP: browse facebook.com/ads/library manually, screenshot competitor ads, and paste them into Claude for analysis. This works but does not scale or automate.
How Do You Identify Winning Ads From Public Signals?
Meta hides CTR, ROAS, and conversion data. But 5 public signals reveal which ads are printing money.
Signal 1 — Runtime (Longevity): Brands do not pay to run losing ads for a month, so longevity is the strongest single signal. 30+ days = likely profitable. 60+ days = proven winner. 90+ days = evergreen money printer.
Signal 2 — Variant Count: When a brand creates 5-10 variations of the same angle, they found something that works and are optimizing it. 3+ variants = they believe in it. 5+ = actively scaling. 10+ = top performer.
Signal 3 — Geographic Expansion: An ad in 1 country is being tested. The same ad in 5+ countries has been proven. Geographic spread indicates confidence.
Signal 4 — Kill Rate (Survival): If a brand ran 20 ads last month and only 5 survive, those 5 beat the other 15. Survivors from multiple testing rounds are battle-tested.
Signal 5 — Format Investment: When a brand produces the same angle as static, video, AND carousel, they are investing significant production resources. 3+ formats for one angle = proven winner.
Scoring: each signal scores 1-3 points. Maximum 15. Any ad scoring 9 or above is flagged as a WINNER worth remixing.
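The rubric is mechanical enough to sketch in code. This is a minimal illustration of the 1-3-per-signal scoring, not part of the MCP server — function names and thresholds simply encode the bands described above (runtime in days, variant count, countries, weeks survived, format count):

```python
# Illustrative 5-signal winner score. Each signal contributes 1-3 points
# (max 15); any ad scoring 9+ is flagged as a WINNER worth remixing.

def score_signal(value, low, high):
    """1 point below `low`, 2 points from `low` up to `high`, 3 at `high` or above."""
    if value >= high:
        return 3
    if value >= low:
        return 2
    return 1

def score_ad(runtime_days, variants, countries, weeks_survived, formats):
    total = (
        score_signal(runtime_days, 14, 30)    # Signal 1: runtime
        + score_signal(variants, 2, 5)        # Signal 2: variant count
        + score_signal(countries, 2, 5)       # Signal 3: geographic reach
        + score_signal(weeks_survived, 2, 4)  # Signal 4: survival
        + score_signal(formats, 2, 3)         # Signal 5: format diversity
    )
    return total, total >= 9  # (score, is_winner)

# A 60-day ad with 6 variants in 5 countries, 8 weeks old, in 2 formats:
print(score_ad(60, 6, 5, 8, 2))  # (14, True)
```

A brand-new single-country ad with one variant scores 5 and is correctly ignored.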
What Is the Competitor Analysis Prompt?
Copy-paste this into Claude Desktop with your MCP connected:
Prompt:
I want to analyze the active Meta ads from these competitors. Search the Ad Library for each one and give me a complete breakdown.

Competitors:
1. [BRAND NAME 1]
2. [BRAND NAME 2]
3. [BRAND NAME 3]

For each competitor, find all currently active ads. For each ad, extract: the headline/hook (first line of primary text or text overlay), full primary text, CTA, format (static, video, carousel), visual description, estimated runtime, number of variants with similar messaging, and geographic reach.

Categorize each hook into: Problem-Aware, Benefit-Led, Social Proof, Direct Offer, Curiosity, or Comparison.

Score each ad on the 5 winner signals (1-3 per signal, max 15):
- Runtime: under 14 days = 1, 14-30 = 2, 30+ = 3
- Variant count: 1 = 1, 2-4 = 2, 5+ = 3
- Geography: 1 country = 1, 2-4 = 2, 5+ = 3
- Survival: new this week = 1, 2+ weeks = 2, 4+ weeks = 3
- Format diversity: 1 format = 1, 2 = 2, 3+ = 3

Output as a ranked table: Brand, Hook, Category, Runtime, Variants, Geo, Survival, Formats, Total Score. Flag every ad scoring 9+ as WINNER.

Summary: total ads per competitor, distribution across hook categories, top 5 winning angles, common patterns across winners, and gaps (hook categories no competitor is using heavily).
Replace the bracketed names with your actual competitors.
What Is the Angle Remix Prompt?
Once you have ranked winners, use this to remix them for your product:
Prompt:
Here are the top winning competitor ad angles (scored 9+ on winner signals):
[PASTE TOP 5-10 WINNERS with hook, primary text, category, visual description]

Now remix each for MY product:
- Product: [NAME]
- URL: [URL]
- Price: [PRICE]
- Key Benefits: [1, 2, 3]
- Target Audience: [WHO]
- Brand Voice: [TONE]

For each winning angle, create 5 remixed variations. Rules:
- KEEP the psychological trigger (the underlying emotion that makes the original work)
- KEEP the structural format (question stays question, number stays number)
- SWAP the product context (replace their product with mine)
- SWAP specific claims (use my real benefits and numbers)
- MAKE IT DISTINCT (it should not look like a copy)

For each remix provide: a headline under 40 characters, primary text with a first line under 125 characters (full text up to 300), a CTA, visual direction for a static ad (composition and mood, specific enough for a designer), which psychological trigger it preserves, and why this remix works for my audience.

Flag the top 10 across all remixes.
How Do You Automate This Weekly?
Set up a Claude scheduled task at claude.ai/code/scheduled. Click New scheduled task, set to Weekly Monday 7 AM, and paste this prompt:
Prompt:
Run my weekly competitor ad intelligence scan.

Step 1: Search the Meta Ad Library for all active ads from [COMPETITOR 1], [COMPETITOR 2], [COMPETITOR 3].

Step 2: Score each on the 5 winner signals (runtime, variants, geography, survival, format diversity). Flag any scoring 9+ as WINNER.

Step 3: Compare to last week. Identify NEW winners not seen last week, DISAPPEARED ads that got killed, and SURVIVING winners running 2+ weeks (reinforced confidence).

Step 4: For NEW winners, analyze the hook, category, visual approach, and psychological trigger.

Step 5: Generate a This Week's Intelligence Brief: total active ads analyzed, new winners identified, confirmed survivors, top 3 new angles to remix, and competitive moves to watch. Keep it under 500 words. Lead with actionable insights.
Click Create, then Run Now to test. The task runs automatically every Monday after that.
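The week-over-week comparison at the heart of the scan is just a set difference. A minimal sketch — ad identifiers and how you persist last week's list are up to you; the ID strings below are illustrative:

```python
# Illustrative week-over-week diff for the scheduled scan.
# Ads are identified here by Ad Library ID strings; persistence
# (file, sheet, database) is whatever your setup uses.

def compare_weeks(last_week, this_week):
    """Split this week's winners into new / disappeared / surviving groups."""
    last, this = set(last_week), set(this_week)
    return {
        "new": this - last,          # winners not seen last week
        "disappeared": last - this,  # ads that got killed
        "surviving": last & this,    # winners running 2+ weeks
    }

diff = compare_weeks({"ad_a", "ad_b", "ad_c"}, {"ad_b", "ad_c", "ad_d"})
print(sorted(diff["new"]), sorted(diff["surviving"]))  # ['ad_d'] ['ad_b', 'ad_c']
```

Claude does this comparison in natural language inside the scheduled task; the code just shows why "new", "disappeared", and "surviving" are exhaustive and non-overlapping.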
How Do You Produce the Static Ads?
Using HeyOz (fastest): Go to heyoz.com, add your product (paste the URL to import images and brand colors), and select the Static Ad format. For each remixed angle: select a template, paste the headline and primary text, choose the product image matching the visual direction, generate, and export at 1080x1080 (feed) or 1080x1350 (4:5 vertical).
Meta specs for static ads: Feed 1080x1080 (1:1) or 1080x1350 (4:5, often outperforms square). Stories/Reels 1080x1920 (9:16). Link ads 1200x628 (1.91:1). Produce your top 10 in feed format, expand top 5 to Stories/Reels.
Testing: one campaign, one ad set targeting core audience, 10-20 creatives as individual ads, $5-10 per ad per day for 3-4 days. Kill below 1% CTR with zero conversions, scale top 3-5. Feed winners back into next week's remix for compounding improvement.
What Does This Cost?
Claude Pro: $20/month. HeyOz Basic: $44.99/month. Facebook Ads Library MCP: free (open source at github.com/trypeggy/facebook-ads-library-mcp). ScrapeCreators API: free tier. Google Gemini API: free tier. Total: $65/month.
Compare to Foreplay (ad intelligence tool alone) at $249/month, a creative agency at $3,000-5,000/month, or a freelance strategist at $150-300/hour. This system gives you competitive intelligence, angle generation, and ad production for $65/month.
Frequently Asked Questions
Do I need to code to install the MCP?
The installation requires running a few terminal commands (git clone, pip install, editing a JSON file). They are copy-paste commands in the guide. Alternatively, open Claude Code and ask it to help you install step by step. If you cannot install the MCP, use the manual approach: browse facebook.com/ads/library, screenshot ads, and paste into Claude.
Is this legal?
Yes. The Meta Ad Library is public data — Meta requires transparency and makes every active ad visible to anyone. The goal is understanding WHY ads work (psychological triggers, hook structures) and applying that to your product with your brand. Do not copy creative assets, exact text, or trademarks.
How do I know the winner signals actually work?
No single signal is perfect. But across all 5, false positives are rare. An ad running 60+ days, with 5+ variants, in multiple countries, across multiple formats is almost certainly profitable. A total score of 9+ has a very high hit rate.
How many competitors should I monitor?
Start with 3. More than 5 becomes overwhelming. Pick direct competitors — brands at similar scale targeting the same audience with comparable products.
Can I use this for video ads too?
The winner signals and remix process work identically for video. This guide focuses on static ads because they are faster to produce. Upgrade your best static winners to video format once you have proven angles.
Does this work for non-e-commerce brands?
Yes. The framework works for any industry with active Meta advertisers: SaaS, apps, services, lead gen, local businesses. The Ad Library contains all active ads across all categories.
About the author
Ahad Shams
Ahad Shams is the Founder of HeyOz, an all-in-one ads and content platform built for founders and small teams. He has worked across consumer goods and technology, with experience spanning Fortune 100 companies such as Reckitt Benckiser and Apple. Ahad is a third-time founder; his previous ventures include a WebXR game engine and Moemate, a consumer AI startup that scaled to over 6 million users. HeyOz was born from firsthand experience scaling consumer products and the need for a unified, execution-focused marketing platform.

