ChatGPT’s chats are being indexed by Google

Vadim Kravcenko
Aug 02, 2025 · 5 min read

TL;DR: ChatGPT's shared conversations showed up in Google search results, then vanished within 24 hours. Here's what happened, what it means for SEO, and what we saw in our own monitoring data when it unfolded.

Less than 24 hours ago, savvy SEOs were sharing a clever discovery: ChatGPT's public /share conversations were fully indexable, and some were already showing up in Google's top 20 for long-tail queries. The find felt like digital alchemy — instant, authoritative content you didn't have to write. Screenshots hit Twitter, blog posts popped up, and a few opportunists even started scraping the chats for quick-fire affiliate pages.

Then the hammer dropped.

By the next morning every /share result had disappeared from Google's index. Type site:chatgpt.com/share today and you'll see zero results. OpenAI quietly pushed three changes in rapid succession — <meta name="robots" content="noindex">, a site-wide canonical to the homepage, and (most likely) a bulk request via Google's URL Removal Tool. "ChatGPT share URLs" became a live case study in lightning-fast Google deindexing.

What We Saw in Our Own Data

We were monitoring this in real time at SEOJuice. When the first reports surfaced on Twitter, I ran a quick check across our client sites to see if any /share URLs were showing up as competing pages. Here's what we found:

  • 3 of our client domains had ChatGPT /share pages appearing in the same SERPs for long-tail queries. In one case, a shared ChatGPT conversation about "best CRM for real estate agents" was ranking #14 for a query where our client's blog post sat at #11. That's close enough to be concerning.
  • The content quality was uneven. Some shared conversations were genuinely thorough (a user had asked ChatGPT detailed follow-up questions and the resulting thread read like a well-structured article). Others were garbled half-conversations that shouldn't have ranked for anything.
  • After the deindexing: The three competing /share pages disappeared, but our clients' positions didn't improve immediately. The SERP reshuffled over the following 48 hours, with other pages filling the slots. That's a reminder that removing a competitor from the SERP doesn't automatically promote you — Google re-evaluates all candidates.
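The quick check described above boils down to scanning each tracked keyword's top-20 result URLs for chatgpt.com /share pages and noting where they sit relative to the client's own domain. A minimal sketch of that scan, fed with illustrative data rather than our actual rank-tracker output (the function name, sample URLs, and domains are all hypothetical):

```python
from urllib.parse import urlparse

def flag_share_competitors(serp_urls, our_domain):
    """Return (our_position, [(position, url), ...]) for ChatGPT /share pages.

    serp_urls is an ordered list of result URLs; positions are 1-based.
    """
    ours, shares = None, []
    for pos, url in enumerate(serp_urls, start=1):
        parsed = urlparse(url)
        if parsed.netloc.endswith(our_domain) and ours is None:
            ours = pos  # first result on our domain
        elif parsed.netloc == "chatgpt.com" and parsed.path.startswith("/share/"):
            shares.append((pos, url))
    return ours, shares

# Illustrative SERP slice -- not real ranking data.
serp = [
    "https://example-competitor.com/crm-guide",
    "https://ourclient.com/blog/best-crm-real-estate",
    "https://chatgpt.com/share/abc123",
]
print(flag_share_competitors(serp, "ourclient.com"))
# (2, [(3, 'https://chatgpt.com/share/abc123')])
```

Plugging this into a daily rank-tracking export is enough to catch the "AI chat page two spots below us" situation before it becomes a real competitor.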

The episode was brief enough that no one suffered lasting damage. But it raised a question I keep coming back to: what happens when the next AI company doesn't react as quickly as OpenAI did?

A snap poll of 225 founders captured the mood swing:

  • Yes — worth the risk (28.9 %): Nearly a third would roll the dice on black-hat shortcuts, even after seeing sites nuked overnight.
  • No — I need SEO traffic (40.4 %): Pragmatists who know organic is their lifeline.
  • Wait… nuke my SEO? (24.9 %): Shocked newcomers learning what "deindexed" really means.
  • What are backlinks? (5.8 %): The blissfully unaware — until it's their turn.

The stakes couldn't be clearer:

  • Citations lost: Any AI assistant or news outlet that quoted your /share chat loses the link equity once Google erases the page.

  • AI-visibility gap: LLMs trained on fresh web snapshots count Google's index as a trust signal. No index, no citation.

  • Organic traffic cliff: If Google can flick you off the SERP in a single crawl cycle, your content pipeline is only as strong as your compliance discipline.

Yesterday's growth "hack" became today's cautionary tale — proof that when you rely on loopholes instead of durable SEO fundamentals, the distance from ranking to vanishing is just one Google refresh away.

How /share Pages Got Indexed in the First Place

This is the part I find most interesting from a technical SEO perspective, because it reveals how Google discovers and indexes content even without traditional link signals:

  1. Robots.txt Left the Door Wide Open
    When ChatGPT launched the public "Share" feature, its robots.txt file explicitly allowed crawling of /share/ under User-agent: *. For Googlebot that's a green light to fetch, render, and consider each shared conversation as a normal HTML page. This was likely an oversight rather than a deliberate choice — OpenAI was focused on the sharing feature, not on its SEO implications. (I've made similar mistakes. Our staging environment was indexable for three weeks before someone noticed. It happens.)

  2. Google's Hidden-URL Discovery Arsenal
    Even if no site linked to those pages, Google can still surface them through passive data pipes the SEO community calls "Google side-channels."

    • Chrome URL hints — when millions of users paste a /share link into the omnibox or click it inside ChatGPT, Chrome telemetry feeds anonymized URL samples to Google's crawl scheduler.

    • Android Link Resolver — any tap on a /share URL inside an Android app fires an intent logged by Play-services diagnostics.

    • Gmail & Workspace Scans — shared chats emailed between colleagues get scanned for phishing; URLs deemed benign join the crawl queue.

    • Public DNS & QUIC heuristics — high-volume DNS look-ups for the same sub-directory signal "this path matters."

    The net result: no inbound links does not mean no discovery. Google doesn't need a hyperlink graph when user behaviour itself points to new URLs. This has implications beyond ChatGPT — if any user-generated content on your site is publicly accessible, Google is probably finding it through channels you haven't considered.

  3. AI-Generated Content Looks Fresh & Unique
    Each /share page held novel text that wasn't duplicated elsewhere, so Google's freshness classifier assigned it immediate value. The combination of allowed crawling and unique content fast-tracked the pages into the live index — some within hours of first being shared.
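The permissive state described in step 1 can be reproduced with Python's standard urllib.robotparser. The rules below are a simplified reconstruction for illustration, not the actual file OpenAI served:

```python
from urllib.robotparser import RobotFileParser

# Simplified reconstruction of a permissive robots.txt --
# not OpenAI's actual file.
rules = """
User-agent: *
Allow: /share/
Disallow: /api/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard group, so /share/ pages are fetchable.
print(parser.can_fetch("Googlebot", "https://chatgpt.com/share/abc123"))  # True
print(parser.can_fetch("Googlebot", "https://chatgpt.com/api/convo"))     # False
```

Running your own robots.txt through a check like this before launching a new public-URL feature is a cheap way to catch the exact oversight described above.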


OpenAI's Rapid Clean-Up: The Four-Pronged Fix

What makes this episode instructive for anyone managing a large site is the speed and precision of the response. Here's the technical playbook OpenAI used:

  1. Add <meta name="robots" content="noindex">. Tells Googlebot to keep crawling but drop the page from the index. The tag is respected on the very next crawl, often within 12 hours.
  2. Set <link rel="canonical" href="https://chatgpt.com">. Consolidates any residual ranking signals to the homepage and prevents canonicalised duplicates from reappearing later.
  3. Bulk-submit to Google's URL Removal Tool. Hides URLs from results immediately for roughly six months while permanent deindexing proceeds; this bypasses crawl latency and acts within minutes.
  4. (Expected) Update robots.txt to Disallow /share/. Stops crawl requests entirely, reducing bandwidth and log clutter, and ensures new share links never re-enter the queue.

This four-step playbook — noindex + canonical + URL removal + robots.txt — is worth bookmarking. If you ever need to deindex a large section of your site quickly (after a staging environment leak, an accidental publish, or a user-generated content explosion), this is the fastest approach available. We've used a similar playbook for three client emergencies in the past year, and it consistently clears indexed URLs within 24-48 hours.
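After deploying steps 1 and 2, you still need to verify the directives are actually in the served HTML. A minimal checker using only the standard library's html.parser (the class name and sample markup are illustrative; in practice you would fetch each URL and feed the response body in):

```python
from html.parser import HTMLParser

class DirectiveChecker(HTMLParser):
    """Collects the robots meta and canonical link from a page's <head>."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content", "")
        elif tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href", "")

# Sample page mirroring the two directives described above.
sample = (
    '<html><head>'
    '<meta name="robots" content="noindex">'
    '<link rel="canonical" href="https://chatgpt.com">'
    '</head><body></body></html>'
)

checker = DirectiveChecker()
checker.feed(sample)
print("noindex" in checker.robots)  # True
print(checker.canonical)            # https://chatgpt.com
```

Looping a checker like this over the affected URL list turns "we deployed the fix" into "we confirmed the fix is live," which matters when every crawl cycle counts.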

Why Google Could React Within 24 Hours

  • Big-brand priority: High-authority domains get crawled more frequently, so directive changes propagate faster. When chatgpt.com tells Google something, Google listens quickly.

  • Manual nudge: OpenAI almost certainly used Search Console's URL Inspection tool (the successor to "Fetch as Google") to force-refresh critical pages after the new tags went live.

  • Automated Penalty Avoidance: Google's spam systems penalise thin or user-generated content that scales unchecked; OpenAI had strong incentive to neutralise the risk before a site-wide demotion kicked in.

Bing's One-Million-URL Hangover

OpenAI's clean-up playbook stopped at Google Search Console. As a result, Bing still shows ~1 million /share pages in its results — a digital ghost town of ChatGPT conversations that are now invisible on Google.

This is where the story gets interesting from a multi-engine SEO perspective. We checked the same three client queries that had competing /share pages in Google and found that in Bing, those pages were still ranking a full week later. The disparity highlights three structural differences between the engines:

  1. Crawl-to-Index Latency — Googlebot revisits high-authority domains in hours; Bingbot often needs days. When OpenAI injected noindex and canonicals, Google recrawled quickly and obeyed. Bing simply hadn't cycled through its backlog yet.

  2. Absent BWT Intervention — All signs indicate OpenAI skipped Bing Webmaster Tools, meaning Bingbot was still following the original "Allowed" directive until its natural cadence caught the changes.

  3. Historical Lag Pattern — This isn't new. In 2021 Bing continued serving WordPress favicon URLs weeks after they were purged from Google, and last year it indexed a leaked font-CSS directory that Google ignored. Bing's smaller bot fleet and conservative update window make it prone to indexing hangovers whenever a high-profile site flips directives suddenly.

Practical takeaway: If you rely on Bing traffic — or on ChatGPT citations that lean on Bing's index — run dual dashboards. Submit removal or recrawl requests in both Search Console and Bing Webmaster Tools. We now include this as a standard step in our emergency deindexing playbook after this incident taught us the hard way that "fixed in Google" does not mean "fixed everywhere."

Why Non-English /share Results Dominate in Bing

An odd by-product of Bing's lag: the surviving /share pages are overwhelmingly non-English, non-Latin alphabet results — Japanese, Russian, Arabic, Thai. We noticed this because one of our clients has a Japanese-language subdomain and was seeing more competing /share pages in Bing JP than Bing US. Three factors explain the bias:

  1. Regional Index Slices Update Slower — Bing partitions its index by locale. High-traffic US-EN slices refresh fastest; peripheral language shards may wait a week or more before pruning noindex pages.

  2. Duplicate-Cluster Prioritisation — Bing's de-duplication algorithm keeps one URL per canonical cluster. When the English versions vanished from Google and lost interlink equity, Bing shifted weight to unique non-English variants that still carried user-engagement signals.

  3. Serving vs. Indexing Disparity — Bing may mark a URL as "deindexed" internally but continue to serve it in low-competition locales until the next full deployment cycle.

Optimization Insight: For multilingual sites, staggered directive rollouts (e.g., first EN, then JP) can create unintended duplicate-content windows. The safer approach is to deploy noindex and canonical updates globally, then verify removal in every locale-specific data center using VPN-based SERP checks. We've added this to our post-deployment checklist.
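The per-locale verification step can be sketched as a simple diff over locale-specific result snapshots. How you collect each locale's results (rank-tracking API, VPN-routed SERP checks) is left out here; the function name and snapshot data are illustrative, not real post-deployment data:

```python
def locales_still_serving(results_by_locale, path_prefix="/share/"):
    """Return locales whose result lists still contain URLs under path_prefix."""
    lagging = []
    for locale, urls in sorted(results_by_locale.items()):
        if any(path_prefix in url for url in urls):
            lagging.append(locale)
    return lagging

# Illustrative post-deployment snapshot -- not real SERP data.
snapshot = {
    "en-US": ["https://ourclient.com/blog/post"],
    "ja-JP": ["https://chatgpt.com/share/xyz", "https://ourclient.jp/blog"],
    "ru-RU": ["https://chatgpt.com/share/def"],
}
print(locales_still_serving(snapshot))  # ['ja-JP', 'ru-RU']
```

Re-running the check daily until the lagging list is empty gives you a concrete "removal verified in every locale" signal instead of assuming the global rollout propagated.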

Related reading: