
Why LLMs have high conversion rates, and how to tap into them

Clicks are harder to earn today, but each one is more valuable.

Nick Haigler · Oct 6 · 8 min read

The rise in LLM usage has been the biggest shift in search since Google first reached 90% market share in 2009. Of course, today’s internet is bigger and noisier, so a raw Google vs. AI search comparison isn’t apples-to-apples. But the AI search growth curve is hard to ignore. ChatGPT alone receives 5.8 billion visits a month.


Seer Interactive is an AI-driven marketing agency with decades of search experience, and we’ve been analyzing the impact of LLMs on search since ChatGPT’s launch in 2022. And once we started looking at how LLM-driven users behave on-site, it became clear this isn’t just a volume story—it’s a conversion story. 


What we found: Across 50 Seer clients from January 2025 to July 2025, the average conversion rate from AI-referred visitors was 13.8%. Meanwhile, the average organic conversion rate was 9.3%.


Worth noting: The scale is small for now. In that same dataset, organic accounted for 44% of sessions while AI referrals made up just 0.08%. 


What matters: AI-driven sessions grew +333% year-over-year vs. +3.9% for organic. AI-driven conversions grew +9,718% vs. +121% for organic.


Traffic share isn’t the point. All indicators suggest there’s more signal than noise here. 




So, why are LLM visitors converting at a higher rate? 


We think it comes down to three factors:


  1. How people use LLMs 

  2. How LLMs move people down the funnel 

  3. Click friction as an intent filter 



01. How people use LLMs 


“The best data I’ve seen on how individuals use LLMs was from two studies released in the last 30 days,” says Alisa Scharf, VP AI & Innovation at Seer Interactive. “First, Datos and Sparktoro analyzed hundreds of billions of clickstream data events and found that when users start using ChatGPT, they use Google more, not less.”


The second study is OpenAI’s latest consumer report. “It’s a must read (or at least must skim) for anyone responsible for acquiring customers through digital channels,” Scharf says.


In the paper, OpenAI says that the fastest-growing use case for LLMs is ‘information seeking.’ Their research shows that this category grew from 14% to 24% of all usage between July 2024 and July 2025. By that math, nearly 87 billion annual prompts are now people asking questions and looking for answers. 


Within this group sits a particularly valuable subset: “Purchasable Products” conversations. These are queries like “What’s the best laptop under $1,000?,” where users are explicitly asking for recommendations. As this category expands, visibility in LLM responses will only grow in importance. With visibility will eventually come the need for attribution.


Google still handles ~373× more “search-like” queries than ChatGPT, showing that users aren’t replacing search but layering LLMs on top of it. In practice, people use tools like ChatGPT to supercharge the act of finding information online. They're not abandoning Google. 



02. How LLMs move people down the funnel 


In a traditional search flow, people jump between a dozen tabs, piecing together definitions, comparisons, and reviews. With AI, that messy process gets compressed into the conversation itself. 


The basics are handled instantly: definitions, side-by-side comparisons, summaries. Context and tradeoffs are often added too, making the answer more tailored to what the user actually cares about. (This is why bottom-of-funnel content is so important.)


That means by the time someone does click through, they’re not still in the early research phase. They’ve already moved closer to making a decision. 



03. Click friction as an intent filter 


Unlike Google, LLMs don’t show 10 blue links with obvious off-ramps. There are very few prompts inside the interface that push people to click away. 


That makes each click harder to earn—but also more valuable. If someone chooses to leave the conversation and land on a site, they’ve already overcome friction. They’re signaling stronger intent, which is why we see higher engagement and conversion once they arrive. 



How to make the most of rising AI adoption 



01. Set up an AI channel grouping & dig into your own data 


The first step is visibility. Start tracking traffic coming from LLMs, if you’re not already doing so. You may be getting conversions without realizing it.


At Seer, we use a regex like this one to bucket AI referrals in GA4 into their own channel grouping, making it easier to see how these visitors behave on-site: 


.*chatgpt.*|.*poe\.com.*|.*copilot.*|.*bard.*|.*gemini.*|.*perplexity.*|.*openai.*|.*claude.*
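
If you want to sanity-check the pattern before saving it as a channel grouping, a short script like the one below can confirm which referrers it buckets as AI. This is a minimal sketch in Python; the sample referrer hostnames are just placeholders standing in for a GA4 export.

import re

# The same pattern used for the GA4 channel grouping above
AI_REFERRAL_PATTERN = re.compile(
    r".*chatgpt.*|.*poe\.com.*|.*copilot.*|.*bard.*|.*gemini.*"
    r"|.*perplexity.*|.*openai.*|.*claude.*",
    re.IGNORECASE,
)

# Placeholder referrer hostnames, e.g. pulled from a session export
sample_referrers = [
    "chatgpt.com",
    "perplexity.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
    "news.ycombinator.com",  # should fall into "Other"
]

for referrer in sample_referrers:
    bucket = "AI-Driven" if AI_REFERRAL_PATTERN.match(referrer) else "Other"
    print(f"{referrer:30} -> {bucket}")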


Once that’s in place, you can see not just traffic share but also how conversion rates stack up against other channels. 


[Chart: conversion rate by channel. AI-Driven 27.77% (highest), Organic Video 24.67%, Organic Social 19.8%, Organic Search 13.41%, Direct 10.59%.]


Industry studies are useful, but they won’t tell you how your own audience interacts with AI referrals. Start by comparing conversion rates across your AI grouping and other channels. Across our client set, we’ve seen LLM traffic convert at 13.8% on average vs. 9.3% for organic. 


But on a client-by-client level, it varies widely. One client, for example, saw ChatGPT referrals convert at nearly 16% compared to just 1.7% for organic. 


[Chart: conversion rates by source for one client. ChatGPT 15.9%, Perplexity 10.5%, Claude 5%, Gemini 3%, Google Organic 1.76%.]

Look at your own data:


  • Which pages are AI-referred visitors landing on? 

  • What events or actions are they completing? 

  • How do these patterns differ from organic or paid traffic? 


With audience research hard to come by in AI search, this is your best way to understand what matters to LLM-driven users. 



02. Check your site log files 


Your logs can show which pages LLM crawlers are hitting most often, even if those pages aren’t driving traffic directly. (Related: How to know which AI crawlers are visiting your site)


The ChatGPT-User bot, for example, is tied to real-time user requests. If it consistently hits a certain feature page, that’s a clue the model is pulling content from it into live answers. 
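
If you want a quick read on this from raw logs, a script along these lines can tally AI crawler hits by page. It's a rough sketch: the log file path is hypothetical, and the user-agent tokens listed are the commonly documented ones, so check each provider's documentation for the current list.

import re
from collections import Counter

# Commonly documented AI crawler user-agent tokens; these change over time
AI_BOTS = ["GPTBot", "ChatGPT-User", "OAI-SearchBot", "PerplexityBot", "ClaudeBot"]

# Pulls the request path and the user agent (last quoted field) from a
# combined-format access log line
LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) [^"]*".*"(?P<agent>[^"]*)"$')

hits = Counter()
with open("access.log") as log:  # hypothetical log file path
    for line in log:
        match = LOG_LINE.search(line)
        if not match:
            continue
        agent = match.group("agent")
        for bot in AI_BOTS:
            if bot in agent:
                hits[(bot, match.group("path"))] += 1

# Pages each AI bot requests most often
for (bot, path), count in hits.most_common(20):
    print(f"{bot:15} {count:5}  {path}")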


This is where content strategy comes in. If feature pages are showing up often, double down on building more of them. If industry pages are being hit, expand those sections with audience-specific details. By shaping what LLMs can pull from, you’re shaping how your brand is represented in AI search. 


[Screenshot: Wix Analytics reports for "AI Bot Traffic over Time" and "AI Bot Visits by Page." See which bots are crawling your site in Wix Analytics.]

03. Use prompts as real-time audience research 


LLMs themselves can serve as a lightweight research tool. Think about the prompts your audience might use, then test them directly.


[Screenshot: ChatGPT's laptop buying guide under $1,000, with CPU, RAM, storage, display, battery life, and ports outlined in a table.]


For example, when we asked ChatGPT for laptop recommendations, it highlighted features like portability, battery life, display quality, and refresh rate. Students, remote workers, and gamers all have different priorities, and the model surfaced them clearly. Brands can use this insight to adapt their content strategy for LLMs.


If LLMs are emphasizing these attributes, make sure your product pages do too. 


Creating prompts to monitor your brand doesn’t need to be complicated. Start by tracking how often your brand shows up in the prompts you care about. From there, you can adjust for funnel stage, persona, and other factors.


Here’s a prompt you can use to generate prompts related to your brand: 


Please create a list of 100 questions that result in generating a list of companies. These questions should include modifiers such as: Best, Which, Where, or Top.


For example, an insurance agency might use the query, 'What are the best options for renter's insurance?' 


Our primary products and services are: [FILL IN THIS PART] 


Do not repeat any questions. 


Take your time. 


Do you understand? 


It’s worth noting that, given the predictive nature of LLMs and the level of personalization in each output, we focus on casting as wide a net as possible when monitoring prompt themes. This helps us identify the common attributes LLMs tend to highlight when prompted.
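
To cast that net programmatically, you can loop your prompt list through an LLM API and count how often your brand appears. Here's a minimal sketch using the OpenAI Python client; the model name, prompt list, and brand terms are placeholders to swap for your own.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

BRAND_TERMS = ["Acme Insurance", "acmeinsurance.com"]  # placeholder brand terms
prompts = [
    "What are the best options for renter's insurance?",
    "Which companies offer the top home insurance bundles?",
    # ...the rest of your generated prompt list
]

mentions = 0
for prompt in prompts:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": prompt}],
    )
    answer = response.choices[0].message.content or ""
    if any(term.lower() in answer.lower() for term in BRAND_TERMS):
        mentions += 1

print(f"Brand mentioned in {mentions} of {len(prompts)} prompts")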



04. Track which sources LLMs cite 


Not all content gets surfaced equally. Some formats, sections, and content types are more likely to be cited. 


By analyzing LLM outputs at scale, you can spot patterns. Maybe they favor structured lists. Maybe they pull statistics. Maybe they reference competitor FAQs more than your own. 
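
One lightweight way to surface those patterns is to save the answers from your prompt monitoring along with their cited URLs, then tally which domains come up. A rough sketch, assuming you've exported the answers to a JSON file of your own design:

import json
from collections import Counter
from urllib.parse import urlparse

# Hypothetical export: a list of {"prompt": ..., "cited_urls": [...]} records
with open("llm_answers.json") as f:
    answers = json.load(f)

domains = Counter()
for answer in answers:
    for url in answer.get("cited_urls", []):
        domains[urlparse(url).netloc.removeprefix("www.")] += 1

# Which domains (yours vs. competitors) LLMs cite most for your prompt set
for domain, count in domains.most_common(15):
    print(f"{count:4}  {domain}")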


This analysis can guide strategy in two ways: 


  • Page optimization: Add clear headings, structured lists, and hard data to boost the odds of being cited.


  • Gap identification: If competitors are providing content formats that you’re not—detailed stats, FAQs, or benchmarks—that’s an opportunity. 


And remember: LLMs don’t need top-of-funnel fluff, like definitions. They can generate that themselves. Focus on creating unique content that LLMs haven’t already seen to increase your chance of being cited. If the information is widely available elsewhere, the LLM has less reason to credit you.


What they surface are the differentiated aspects of your brand. For example, when we updated Seer’s site footer to highlight our enterprise clients and retention rate, ChatGPT began referencing those details almost immediately. Small tweaks can shape what LLMs repeat back to users. 


[Screenshot: Seer's site footer sections reading "Remote-first, Philadelphia-founded" and "130+ clients, 97% retention rate," annotated to show how those details feed what AI tools say about the brand.]


05. Start testing with clear hypotheses


Here’s the reality: No one has a perfect blueprint for generative engine optimization (GEO). Results vary by industry, and the only way to know what works is to test. The key is to start simple. Begin with a hypothesis, then identify the data that would prove or disprove it. 


One of our first tests focused on content recency. Our hypothesis: AI bots prefer fresher content than organic search does. We noticed LLM citations often came from pages updated within the past six months. When we mapped AI-driven traffic against publish dates, the pattern held. From there, refreshing older pages became a structured test, and the results validated the hypothesis.
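
If you want to run a similar mapping on your own data, something like this works. It's a rough sketch with pandas; the CSV exports and column names are hypothetical stand-ins for a GA4 export and a list of page update dates from your CMS.

import pandas as pd

# Hypothetical exports: AI-referred sessions by landing page, and each
# page's last-updated date
sessions = pd.read_csv("ai_sessions_by_page.csv")  # columns: page, ai_sessions
pages = pd.read_csv("page_update_dates.csv", parse_dates=["last_updated"])  # columns: page, last_updated

df = sessions.merge(pages, on="page")
df["months_since_update"] = (pd.Timestamp.now() - df["last_updated"]).dt.days / 30

# Compare AI-referred traffic for fresh (< 6 months) vs. older pages
df["freshness"] = df["months_since_update"].lt(6).map({True: "fresh", False: "stale"})
print(df.groupby("freshness")["ai_sessions"].agg(["count", "sum", "mean"]))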


Some hypotheses worth testing: 


  • Freshness: How quickly does updated content get cited? 

  • Page elements: Do visuals, stats, or detailed FAQs drive more citations? 

  • Trust signals: Do reviews or testimonials boost visibility? 

  • Schema markup: Which structured data types align best with how LLMs source content? 


The point isn’t to guess. It’s to test with purpose. Each insight builds toward a repeatable playbook tailored to your brand. 



06. Adjust your KPIs


Adjust your KPIs to align with the AI-search landscape.


“The number one thing a CMO can do is help their team get new KPIs,” says Wil Reynolds, CEO and founder of Seer Interactive. “So many of the leading indicators of search are vanishing, but the KPIs aren't keeping up, which then leaves people trying to optimize for metrics that are no longer as well correlated to business success.”


He continues: “This is where leaders need to lead and recognize we are moving into a world with less and less trackable data, so our KPIs might need to be less channel-specific (how many leads from AI) and more customer-centric (how many leads did we get from organic marketing activities).”


Reynolds recommends tracking the leads trendline, then overlaying channels to see which ones align best with the trend of your lead flow or sales. “We're finding that AI search + organic social are now more correlated to our revenue trends than organic search, for instance, which used to be a good predictor.” (Related: 6 ways to get executive buy-in for your AI search plan)
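
One simple way to build that overlay is to check how closely each channel's trendline tracks your lead (or revenue) trendline. A rough sketch, assuming a weekly export with total leads and sessions by channel; the column names are placeholders.

import pandas as pd

# Hypothetical weekly export: total leads plus sessions by channel
df = pd.read_csv("weekly_channels.csv", parse_dates=["week"])
# columns: week, leads, ai_search, organic_search, organic_social, paid_search

channels = ["ai_search", "organic_search", "organic_social", "paid_search"]

# Correlation of each channel's weekly sessions with weekly leads
correlations = df[channels].corrwith(df["leads"]).sort_values(ascending=False)
print(correlations)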



Take advantage of the opportunity


LLM-driven traffic may still be a small slice of overall sessions, but the growth and conversion rates are already outpacing traditional channels. The bigger opportunity isn’t just clicks; it’s shaping how your brand shows up in the answers people see before they ever land on your site.


“We need to prepare to see less web traffic as more users get an understanding of a business directly on search platforms,” says Scharf. “But the majority of brands shouldn’t expect fewer customers.”

So, take a look at your own AI traffic. You may already be getting real value from these visits without even realizing it. By tracking referrals, watching crawler behavior, surfacing your differentiators, and running small tests, you can start building a playbook that scales with the channel.


This is still early, but that’s exactly why it matters. The brands testing now will learn faster, shape best practices, and understand influence factors while competitors are still on the sidelines.

 
 

Related articles

6 ways to get executive buy-in for your AI search plan
BY KIERA CARTER

How to know which AI crawlers are visiting your site (and why it matters)
BY EINAT HOOBIAN-SEYBOLD

AI is changing eCommerce. 5 ways brands can prepare.
BY PAULA XIMENA MEJIA

