Workflow · AEO · ~10 min run · GA4 · MCP

Track AI referral traffic in GA4.
ChatGPT, Claude, Perplexity, Grok.

A copy-paste Claude prompt that surfaces every session in your GA4 property that arrived from an AI assistant — volume, session quality, top landing pages, conversion rate. The closest thing today to measuring whether your content is getting cited by AI.

6 sources
AI assistants tracked
2–4×
Conv rate vs cold organic
10 min
First-run time
15–25%
Avg MoM growth in 2026
01 The Problem in 60 Seconds

GA4 sees the traffic.
Nobody's looking at it.

When a VP of Engineering asks ChatGPT which API monitoring tool to use and ChatGPT cites your site, that buyer clicks through. GA4 logs the session with chatgpt.com as the referrer. But nobody on the team knows to look — and GA4 doesn't surface it as an "AI" source in any default report.

Across B2B SaaS accounts we track, AI referral traffic is growing 15–25% month-over-month and converting at 2–4x the rate of cold organic search. That's the single fastest-growing and highest-intent traffic source most B2B teams own — and most of them can't see it.

This workflow fixes that in 10 minutes. Claude queries GA4 via the free Growthspree MCP, groups six AI assistant domains into one cohort, and returns volume, session quality, conversion rate, and top landing pages broken down by AI source. Run it monthly to track the trend. The number that grows fastest is the one the board is going to ask about next quarter.

02 The Prompt

Copy this prompt into
Claude Desktop.

The bracketed variables — GA4 property ID, time window, and the AI domain list — are the parts you typically edit. The domain list is pre-loaded with the six major AI assistants as of 2026.

claude_desktop — ai_referral_prompt.md
Role
You are analyzing AI referral traffic to my website from AI assistants (ChatGPT, Claude, Perplexity, Grok, Gemini, Copilot). This traffic is my closest signal of whether AI assistants are citing my content — and for most B2B SaaS sites, it's the single fastest-growing traffic source. GA4 captures it but doesn't surface it by default.

Parameters
GA4 property: [PROPERTY_ID]
Time window: [last 30 days]
Compare against: [previous 30 days]
AI assistant domains to treat as the AI cohort:
• chatgpt.com, chat.openai.com (ChatGPT)
• claude.ai (Claude)
• perplexity.ai, www.perplexity.ai (Perplexity)
• grok.com, x.ai (Grok)
• gemini.google.com (Google Gemini)
• copilot.microsoft.com (Microsoft Copilot)

Task
1. Pull sessions from the GA4 property via the growthspree-mcp ga4 connector for the current window. Filter to sessions where session source matches any domain in the AI cohort above.
2. For each AI source, return:
   • Total sessions
   • New users
   • Engagement rate
   • Average session duration
   • Conversion count (against the primary conversion event)
   • Conversion rate
3. Cross-reference against organic search traffic for the same window. Report:
   • AI sessions as a % of organic search sessions
   • AI conversion rate vs organic conversion rate (the ratio matters more than the absolute number)
4. Return the top 10 landing pages for AI traffic, with sessions per page and which AI sources sent the traffic.
5. Compare the current window to the previous window. Return month-over-month change in:
   • Total AI sessions
   • AI sessions from ChatGPT specifically (usually the dominant source)
   • AI sessions from Perplexity specifically (usually the fastest-growing)

Output format
1. One-line headline: total AI sessions, MoM change, and AI-to-organic ratio
2. AI source breakdown table: sessions, engagement rate, conversion rate per source
3. Top 10 landing pages receiving AI traffic
4. Month-over-month trend for the top 3 AI sources
5. One recommendation: which landing page or content type to double down on based on AI traffic patterns

// If a specific AI domain returns zero sessions, include it in the output with "0 sessions" — the absence is signal too.
// If total AI traffic is below 50 sessions, flag that statistics are unreliable and recommend extending the window to 60 or 90 days.
// Do not confuse Google AI (gemini.google.com) with Google search — they are separate sources.
One connector required. This prompt uses only the GA4 connector from the free Growthspree MCP. If you already have the MCP installed, setup is skipped — jump straight to the prompt.
03 What Claude Returns

A source breakdown.
A page leaderboard.

Sample 30-day output from a B2B SaaS site running ~40,000 organic sessions per month. Real structure, illustrative numbers. Notice where ChatGPT and Perplexity sit relative to each other.

Claude · via Growthspree MCP

Pulling the data now.

growthspree.ga4.run_report { metrics: [sessions, users, engagement, conversions], dims: [sessionSource, landingPage], window: 30d }
growthspree.ga4.run_report { dims: [sessionSource], window: previous_30d }

1,847 AI sessions over the last 30 days — up 22% from the previous period. That's 4.6% of your organic search volume (40,142 sessions) but converting at 2.2x the rate.

AI Source Breakdown · Last 30 Days · 1,847 sessions · 54 conversions

| Source            | Sessions | Engagement | Conv Rate  | MoM  |
|-------------------|----------|------------|------------|------|
| ChatGPT           | 1,024    | 68%        | 2.8%       | +18% |
| Perplexity        | 412      | 71%        | 3.4%       | +47% |
| Claude            | 247      | 74%        | 3.2%       | +31% |
| Google Gemini     | 98       | 61%        | 2.0%       | +12% |
| Microsoft Copilot | 44       | 58%        | 2.3%       | +4%  |
| Grok              | 22       | 54%        | low sample | +8%  |
| Total AI cohort   | 1,847    | 67%        | 2.9%       | +22% |

Benchmark: Organic search conversion rate for the same window is 1.3%. AI cohort converts at 2.2x that rate.

Top 10 Landing Pages Receiving AI Traffic

| Landing page                             | Sessions | AI sources                    |
|------------------------------------------|----------|-------------------------------|
| /comparison/alternatives-to-[competitor] | 387      | ChatGPT · Perplexity          |
| /blog/b2b-saas-metrics-benchmarks-2026   | 294      | ChatGPT · Claude              |
| /guides/api-monitoring-for-saas          | 231      | ChatGPT · Perplexity · Claude |
| /blog/what-is-[category]-definition      | 189      | ChatGPT · Gemini              |
| /platform/pricing                        | 142      | ChatGPT · Perplexity          |
| /integrations/[popular-tool]             | 118      | ChatGPT · Claude              |
| /blog/how-to-set-up-[process]            | 97       | Perplexity · ChatGPT          |
| /docs/getting-started                    | 76       | Claude · ChatGPT              |
| /case-studies/[flagship-case]            | 63       | ChatGPT · Perplexity          |
| /blog/[category]-vs-[category]           | 54       | Perplexity · ChatGPT          |
Recommendation. Two patterns in the data. First: comparison and alternative pages are the biggest AI citation magnets — your /alternatives-to-[competitor] page alone drew 387 sessions. Double down on the comparison format — build 5 more for the competitors you don't yet have alternatives pages for. Second: Perplexity is growing 47% MoM, nearly 3x the ChatGPT rate. It's smaller today but on current trajectory it will pass ChatGPT for B2B SaaS traffic by mid-2027. Add a Perplexity-specific AEO check to your monthly cadence. Want me to draft a Perplexity visibility audit prompt?
TIME ELAPSED: 47 SECONDS   ·   GA4 DASHBOARD WITH CUSTOM CHANNEL GROUP: 2-3 HOURS
04 Setup

Four steps. Under ten minutes.

First run only. Every run after that takes under 2 minutes.

01
Install · 3 min

Install the free Growthspree MCP

Head to growthspreeofficial.com/mcp. Click install and authorize GA4 access through the OAuth flow. Read-only permissions — the workflow only queries data, never modifies your GA4 property.

Install now →
02
Verify · 1 min

Confirm GA4 is connected

Open Claude Desktop. Click the tools icon. You should see growthspree-mcp with ga4 showing green. If red, re-run the OAuth flow and make sure the GA4 account you authorize has access to the property you want to query.

03
Configure · 1 min

Grab your GA4 property ID

In GA4 admin, under Property Settings, copy the Property ID (a 9-digit number). Paste it into the prompt. Claude needs this to know which property to query.

04
Run & act · 5 min

Paste, run, review

Copy the prompt from section 02. Paste into Claude. Claude returns the source breakdown, landing page leaderboard, and MoM trend in under 60 seconds. If total AI traffic is below 50 sessions, extend the window to 60 or 90 days — small samples aren't reliable. Save the output and re-run monthly to track the trend.

05 Prompt Variations

Three ways to deepen the analysis.

Same data source, different question. Pick the one that matches what you're trying to decide.

01 / GSC + GA4 cross-check

See which pages AI cites and whether organic confirms it

Add the Google Search Console connector to see which AI-referring pages also rank well in organic. When a page wins AI traffic without organic rank, it's a pure AEO win — the content is getting cited without Google necessarily ranking it.

Tweak · Add: "For the top 10 AI landing pages, also pull their GSC average position and impressions. Flag pages with high AI traffic but poor GSC position — those are AEO-only wins."
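The flagging rule Claude applies can be expressed in a few lines. A sketch of the cross-check with illustrative sample data — the thresholds (50 AI sessions, GSC position worse than 20) are assumptions to tune for your site, and the paths below are placeholders, not real pages:

```python
# Flag "AEO-only wins": pages drawing meaningful AI referral traffic
# while ranking poorly in organic search (higher GSC position = worse).
def aeo_only_wins(pages, min_ai_sessions=50, worst_ok_position=20.0):
    """pages: list of dicts with 'path', 'ai_sessions', 'gsc_position'."""
    return [
        p["path"]
        for p in pages
        if p["ai_sessions"] >= min_ai_sessions
        and p["gsc_position"] > worst_ok_position
    ]

pages = [
    {"path": "/comparison/alternatives", "ai_sessions": 387, "gsc_position": 34.2},
    {"path": "/guides/api-monitoring",   "ai_sessions": 231, "gsc_position": 4.1},
    {"path": "/docs/getting-started",    "ai_sessions": 12,  "gsc_position": 55.0},
]
print(aeo_only_wins(pages))  # → ['/comparison/alternatives']
```

A page that clears both thresholds is getting cited by assistants without Google sending it traffic — exactly the pattern this variation is meant to surface.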
02 / Brand search lift

Measure if AI exposure drives brand searches

AI referrals are one signal. The bigger signal is whether people who see your brand cited by AI later search for you on Google. Pull branded search query volume from GSC alongside AI sessions to see the lift.

Tweak · Add: "Pull branded search impressions and clicks from GSC for queries containing our brand name. Correlate MoM changes with AI referral volume. Is AI exposure driving brand search lift?"
03 / Monthly AI visibility report

Auto-draft the board-ready AI visibility memo

Instead of raw data, have Claude write a one-pager suitable for a marketing ops monthly or a board update. Headline number, trend, top pages, one recommendation — the format busy execs actually read.

Tweak · Append: "Write a one-page executive summary with the headline AI session number, MoM trend, top landing page by source, and one recommendation in plain English."
06 Frequently Asked

Quick answers on the workflow.

How do ChatGPT referrals show up in GA4?

ChatGPT referrals appear in GA4 as traffic from chatgpt.com, openai.com, or chat.openai.com in the Session source / medium report. GA4 does not automatically label them as "AI" — they appear as referral traffic. This workflow uses Claude to query GA4 via the Growthspree MCP and group all known AI assistant domains (ChatGPT, Claude, Perplexity, Grok, Google Gemini, Copilot) into a single "AI sources" cohort so you can see the full picture in one report.
Which AI assistant domains does the prompt track?

The prompt includes the major AI assistants sending referral traffic as of 2026: chatgpt.com and chat.openai.com (ChatGPT), claude.ai (Claude), perplexity.ai (Perplexity), grok.com and x.ai (Grok), gemini.google.com (Google Gemini), and copilot.microsoft.com (Microsoft Copilot). You can add or remove domains depending on what's relevant to your audience — the prompt is designed to be easy to edit.
Why track AI referral traffic separately?

AI assistants increasingly act as the first research surface for B2B buyers — especially for technical products. When a VP of Engineering asks ChatGPT which API monitoring tool to use, and ChatGPT cites your site, that buyer often lands with higher purchase intent than a cold Google search visitor. Tracking AI referral traffic separately tells you whether your content is being cited, which assistants cite you most, and whether that traffic converts. It is the closest thing today to measuring AEO (Answer Engine Optimization) performance directly.
Can I do this with a GA4 custom channel group instead?

Yes — and for long-term tracking, you should. Set up a custom channel group in GA4 admin with a referral-source condition matching the AI assistant domains listed in this workflow. The prompt is the fast path when you want the answer right now, without waiting for a GA4 admin change to propagate or for enough data to accumulate in a new channel group. Run the prompt for the analysis, configure the custom channel group for the ongoing dashboard.
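Channel group conditions accept a regex on session source, so one pattern can cover the whole cohort. A candidate regex built from this workflow's domain list — verify it against the actual source values in your own Session source / medium report before saving, since naming can drift; a quick sanity check in Python:

```python
import re

# One pattern covering the six-assistant cohort from this workflow.
# Anchored via fullmatch so "notchatgpt.com" and similar don't slip in.
AI_SOURCE_RE = re.compile(
    r"chatgpt\.com|chat\.openai\.com|claude\.ai|(?:www\.)?perplexity\.ai"
    r"|grok\.com|x\.ai|gemini\.google\.com|copilot\.microsoft\.com"
)

def is_ai_source(source: str) -> bool:
    """True if a GA4 session source belongs to the AI cohort."""
    return AI_SOURCE_RE.fullmatch(source.strip().lower()) is not None
```

Full-match anchoring is a deliberate choice: a partial match would also catch look-alike domains, which would quietly inflate the cohort.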
Is AI referral traffic the same as AI citations?

No. AI referral traffic is clicks that actually arrived at your site from an AI assistant — a user clicked a link in a ChatGPT or Perplexity response. AI citations are mentions of your domain or brand inside an AI response, which may not result in a click. AI referral traffic is a subset of citations. Both matter — but referral traffic is the one GA4 can see directly. Citations require separate tooling.
How does AI referral traffic convert compared to organic search?

For B2B SaaS, AI referral traffic typically converts at 2–4x the rate of cold organic search traffic, because the visitor arrives with higher context and clearer intent — they already read a relevant summary before clicking. However, volume is usually 10–20x smaller than organic search. The math still favors watching AI sources closely: a smaller-but-hotter traffic stream that's growing 15–25% month-over-month at most B2B SaaS accounts we see.
How often should I run this workflow?

Monthly for most B2B SaaS teams — frequent enough to track the trend, rare enough that variance doesn't obscure the signal. Weekly is only useful if you're actively optimizing content for AEO and want to see whether a specific piece started generating AI citations. For pure visibility tracking, monthly is the right cadence.

Run this workflow
this afternoon.

Install the free Growthspree MCP, paste the prompt, see your AI referral traffic broken down by source in under 10 minutes. Or have senior GrowthSpree operators run it monthly as part of your AEO reporting cadence.

300+ accounts on MCP · 4.9/5 on G2 · Google & HubSpot Partner · Month-to-month