Why Your Pages Stop Ranking After 6 Months (And the Fix)

Vadim Kravcenko
Mar 24, 2026 · 16 min read

Updated March 2026

TL;DR: Most pages peak at 3-6 months then decline. Here's how to catch it, what to do about it, and when to let a page die.

The Post That Lost 80% of Its Traffic

[Image] Google Search Console — your first line of defense against content decay. Source: Google

In March 2025, our internal linking guide was getting around 400 clicks per month from Google. It had been climbing steadily since we published it. Good keyword targeting, decent backlinks, comprehensive coverage. The kind of post you write and forget about because it's working.

By July 2025, it was getting 80 clicks.

I didn't notice for two months. That's the thing about content decay — it doesn't send you a notification. There's no alert that says "hey, that post you spent 30 hours on is now irrelevant." You just stop getting traffic, and if you're not watching, you don't find out until it's too late to recover cleanly.

When I finally looked at the Search Console data, the pattern was textbook. Impressions held steady through April, then started dropping in May. Clicks followed two weeks later. By June, the average position had slipped from page one to page two, and once you're on page two, you might as well not exist.

What happened wasn't complicated. Two competitors published better, more current guides in April. One of them included interactive examples. Ours suddenly looked dated — not wrong, just not the best result anymore. Google noticed before we did.

We refreshed the post in August. Updated the examples, added new data, restructured two sections that had gotten bloated. It took about 12 hours of work. By October, we'd recovered to around 300 clicks per month — not the original 400, but enough. (I should note: we also lost the featured snippet we'd had, and never got it back. Partial recovery is the realistic outcome.)

That experience is why I built content decay detection into SEOJuice. Not because it's a sexy feature — it isn't — but because I kept finding posts that had been quietly dying for months while I was focused on writing new ones.

This guide is everything I've learned about content decay since then. Some of it from our own data, some from published research, some from painful trial and error. I'll tell you what works, what the recovery rates actually look like, and which of our pages I couldn't save despite trying.

The 6-Month Cliff (And Why It Happens)

Here's the pattern I see repeatedly: you publish a post, it indexes, it climbs for 2-4 months as Google tests it in various positions, it peaks somewhere around month 3-6, and then it starts sliding. Sometimes slowly, sometimes off a cliff.

I'm basing the 6-month timeline on our own data and what I've seen across the sites we monitor. Your mileage will vary — some niches move faster, some slower. But the general arc is consistent enough that Ross Hudgens of Siege Media calls content "a depreciating asset," and their research found that 65.8% of content marketers expect traffic to decline over time without active maintenance.

Three things drive this decline, and they usually work together rather than in isolation.

The first is competition. Your post ranks #3 today because it's among the best results Google can find. But other people are also writing about the same topic, and some of them will do it better — more recent data, better examples, cleaner structure. You don't lose rankings because your content got worse. You lose them because relative quality shifted.

The second is staleness. Information ages. Statistics from 2023 feel old in 2025. Screenshots of tools show outdated interfaces. Recommendations based on last year's algorithm update become incomplete when the next update lands. Andy Crestodina of Orbit Media has been tracking this for years, and his research consistently shows that content updaters are 2.8x more likely to report strong results from their content marketing than people who only publish new pieces (from Orbit Media's annual blogging survey, which has tracked this metric since 2014).

The third — and this is the one people miss — is intent shift. The query stays the same, but what people mean when they search it changes. "Best project management tool" in 2023 meant Asana and Monday.com comparisons. In 2025, people searching that phrase increasingly want AI-native tools. Google's results shift to match, and your article about the old guard drops.

HubSpot found that 92% of their monthly blog leads came from old posts, which tells you two things: old content is incredibly valuable when maintained, and incredibly fragile when neglected. They also documented a 106% traffic increase from systematically refreshing their existing content — not by writing anything new, just by updating what they already had. (These are from HubSpot's widely-cited 2015 analysis — old data, but the pattern still holds.)

(Side note: I think AI Overviews are accelerating all three of these causes simultaneously. More on that later.)

How to Catch Decay Before It Kills Your Page

[Image] SEOJuice automated monitoring catches decay signals before they become emergencies. Source: SEOJuice

This is the section that matters most. Everything else in this article is context — this is the part you can act on today.

The core principle is simple: you need to compare a page's current performance against its own historical baseline. Not against some abstract benchmark, not against your other pages. Against itself, 3-6 months ago.

The thresholds we use at SEOJuice: a 30% drop in clicks over a rolling 3-month comparison window, with a minimum baseline of 20 clicks per month. The minimum baseline matters — a page going from 5 clicks to 2 clicks isn't meaningful decay, it's noise. You need enough data to distinguish signal from variance.

(The 30% threshold is what we use. Some people use 20%. There's no magic number. But 30% is aggressive enough to catch real problems without flooding you with false positives from seasonal fluctuation.)
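As a sketch, that threshold check is only a few lines of Python. This is my own illustration of the logic described above, not SEOJuice's actual implementation; the function name and signature are made up:

```python
def is_decaying(recent_clicks, baseline_clicks,
                drop_threshold=0.30, min_baseline=20):
    """Flag a page as decaying when its clicks fall more than
    drop_threshold against its own baseline from 3-6 months ago.
    Pages under min_baseline clicks/month are skipped: too little
    data to separate signal from variance."""
    if baseline_clicks < min_baseline:
        return False
    drop = (baseline_clicks - recent_clicks) / baseline_clicks
    return drop >= drop_threshold
```

A page that fell from 400 to 80 clicks a month gets flagged; one that fell from 5 to 2 does not, because it never cleared the 20-click floor.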

Here are three ways to detect decay, ordered by effort level. I've used all three; here's a summary:

  • GSC date comparison: free, no setup, catches decay in months (depends on when you check), low false-positive rate (manual filtering).
  • Analytics alerts: free, 1-2 hours of setup, catches decay in weeks, high false-positive rate (needs tuning).
  • Automated monitoring: paid, 15 minutes of setup, catches decay in days to weeks, low false-positive rate (intelligent baselines).

Method 1: Google Search Console Date Comparison (Free, Manual)

Open Search Console. Go to Performance. Click "Date" and select "Compare." Set it to compare the last 3 months against the previous 3 months. Filter by page. Sort by biggest click decline.

This takes about 15 minutes and gives you a clear picture. The limitation is that it's manual — you have to remember to do it, and most people don't. I did this monthly for about four months before I stopped because I got busy, which is exactly the problem.

What to look for: pages where clicks dropped more than 30% AND impressions also dropped (not just CTR changes). If impressions are stable but clicks dropped, that's a CTR issue, not a decay issue — different problem, different solution.

(Pro tip: export the data as a CSV and sort by absolute click change, not percentage change. A page dropping from 10 clicks to 3 is a 70% decline, but it's probably not worth your time. A page dropping from 500 to 350 is only 30%, but that's 150 real clicks you're losing every month.)
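If you want to script that export step, here's a rough sketch. The column names ("Top pages", "Clicks", "Previous clicks") are my guess at a typical GSC comparison export; check your actual CSV headers and adjust:

```python
import csv

def top_absolute_losers(path, limit=10):
    """Rank pages from a Search Console comparison export by absolute
    click loss rather than percentage. Column names are assumptions
    about a GSC 'Pages' export with a date comparison applied."""
    losers = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            lost = int(row["Previous clicks"]) - int(row["Clicks"])
            losers.append((lost, row["Top pages"]))
    losers.sort(reverse=True)  # biggest absolute loss first
    return losers[:limit]
```

Run against the earlier example, a page dropping from 500 to 350 clicks (150 lost) outranks one dropping from 10 to 3 (7 lost), despite the smaller percentage decline.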

Method 2: Analytics Trend Monitoring (Automated, Free)

If you're using GA4 or any analytics platform, you can set up custom alerts for traffic drops. Create an alert that fires when organic sessions to a specific page group drop below a threshold compared to the previous period.

This is better than manual checking because it's automatic, but the setup is annoying and the alerts tend to be noisy. You'll get false positives from holidays, from seasonal content, from one-time traffic spikes that made the baseline artificially high. Expect to spend time tuning the thresholds for your specific site.

The advantage over Search Console is that you can track post-click behavior too. A page that's losing rankings AND has a rising bounce rate is a stronger decay signal than one that's just losing rankings. If someone lands on your page and immediately bounces back to the SERP, that's Google's quality signal working in real time against you.

(I should be honest: I found GA4's custom alerts frustrating to configure. The interface has changed three times since I set them up. If you're more patient with Google's UX than I am, this works.)

(I should note: these thresholds have worked for us, but every site is different. A news site might see 30% drops routinely from seasonal patterns. An evergreen SaaS blog shouldn't. Calibrate to your own baseline.)

Method 3: Automated Monitoring Tools

This is what we built into SEOJuice, and what several other tools offer in various forms. The idea is to automate the Search Console comparison, apply intelligent baselines, and surface only the pages that actually need attention.

Our approach connects to your Google Search Console data, calculates rolling baselines per page, and flags pages that cross the 30% decline threshold with at least 20 clicks of baseline traffic. It also factors in position changes and impression trends to distinguish between "this page is decaying" and "this page had a weird month."

The honest advantage of automated monitoring: it catches decay in weeks rather than months. Our internal linking guide would have been flagged in May instead of July if we'd had the detection running. Those two months matter — the longer decay goes unaddressed, the harder recovery becomes. A page that's lost 20% of its traffic is much easier to recover than one that's lost 80%.

The honest limitation: no tool can tell you WHY a page is decaying. It can tell you that it IS decaying, and it can show you the metrics, but the diagnosis still requires a human looking at the competitive landscape, checking for intent shifts, and reading the actual content with fresh eyes.

Avoiding False Positives

One thing I learned the hard way: not every traffic drop is decay. Here's what to rule out before you start a content refresh:

  • Seasonal patterns. If you wrote a "tax season checklist" in January and traffic drops in May, that's not decay. Compare year-over-year, not month-over-month, for seasonal content.
  • Technical issues. Check if the page got accidentally noindexed, if the sitemap broke, or if a CMS update changed the URL. I've seen traffic drops that looked like decay but were actually a trailing slash that got added to the URL after a WordPress update.
  • Algorithm updates. If 30 pages all dropped traffic in the same week, it's probably a core update, not content decay. The response is different — individual page refreshes won't fix a site-wide quality reassessment.
  • Viral traffic fading. If a page got shared on Hacker News or Reddit and spiked temporarily, the "decline" back to normal levels isn't decay. Check referral traffic to confirm.
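A crude way to automate the first of those checks is a year-over-year comparison: measure the month in question against the same month a year earlier. This is my own illustrative sketch, and the 20% tolerance is an arbitrary assumption, not a standard:

```python
def looks_seasonal(clicks_by_month, month):
    """Year-over-year sanity check before calling a drop 'decay':
    if this month's clicks are within ~20% of the same month last
    year, the decline is probably the normal seasonal cycle.
    clicks_by_month maps 'YYYY-MM' strings to monthly click counts;
    the 20% tolerance is an illustrative choice."""
    year, mm = month.split("-")
    last_year = f"{int(year) - 1}-{mm}"
    if clicks_by_month.get(last_year, 0) == 0:
        return False  # not enough history to tell
    now, then = clicks_by_month[month], clicks_by_month[last_year]
    return abs(now - then) / then <= 0.20
```

A "tax season checklist" at 90 clicks this May versus 100 last May is cycling, not decaying; 40 versus 100 would fail the check and deserve a closer look.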

What to Do When You Spot Decay

Once you've identified a decaying page, don't just start editing. First, figure out what changed:

  • Search the target keyword and look at what's ranking now. Has the SERP composition changed? Are the top results fundamentally different from what was ranking 6 months ago?
  • Check if your page's content is factually outdated. Old statistics, deprecated tools, changed best practices.
  • Look at the "People Also Ask" boxes. Have the related questions shifted? That signals intent change.
  • Check whether you lost backlinks to the page. Use your backlink tool of choice.

The diagnosis determines the response. Which brings us to the decision framework.

The Decision: Refresh, Rewrite, Redirect, or Let It Die

Not every decaying page deserves to be saved. This was a hard lesson for me — I spent 15 hours rewriting a post about SEO audit checklists that never recovered because the entire query had shifted to people wanting automated tools, not manual checklists. That time would have been better spent on new content.

Four options, and one of them — the one that most content marketing advice never mentions — is doing nothing.

  • Refresh: use when the content is structurally sound but has outdated data, old screenshots, or stale examples. Effort: 4-8 hours. Expected recovery: 60-80% of peak traffic.
  • Rewrite: use when the topic is still valuable but your angle, structure, or depth is no longer competitive. Effort: 15-25 hours. Expected recovery: 80-120% of peak (sometimes exceeds the original).
  • Redirect: use when you have a better, newer page on the same topic, or you're consolidating thin content. Effort: 1-2 hours. Transfers 60-80% of link equity to the target page.
  • Let it die: use when the query intent has fundamentally shifted away from what you offer, or the topic no longer serves your business. Effort: zero. Reclaim the time for better investments.

The diagnostic questions, in order:

  1. Is the topic still relevant to your business? If you pivoted away from the subject, let it die or redirect to something current.
  2. Has search intent changed? If the SERP now shows a completely different type of content (tools instead of guides, videos instead of articles), a refresh won't help. You either rewrite to match the new intent or let it go.
  3. Is the core structure still sound? If yes, refresh. If the entire argument or framework needs rebuilding, rewrite.
  4. Do you have another page that covers this better? If yes, redirect and consolidate.
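Those four questions collapse into a short decision chain. A sketch (the answer strings are mine; note I evaluate question 4 before question 3, since an existing better page supersedes the refresh-versus-rewrite choice):

```python
def decay_decision(relevant_to_business, intent_changed,
                   structure_sound, better_page_exists):
    """The four diagnostic questions as a decision chain. Question 4
    (a better page exists) is checked before question 3, because a
    redirect supersedes the refresh-vs-rewrite choice."""
    if not relevant_to_business:
        return "let it die (or redirect)"
    if intent_changed:
        return "rewrite to match new intent, or let it go"
    if better_page_exists:
        return "redirect and consolidate"
    return "refresh" if structure_sound else "rewrite"
```

The diagnosis itself, of course, still requires a human reading the SERP; this just keeps the order of operations straight.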

One critical warning from Roxana Stingu, an independent SEO consultant who previously led indexing strategy at Alamy: don't just update the publication date without making real content changes. Google can detect this, and it erodes trust signals. If you're going to update a post, actually update it — new data, revised recommendations, improved examples. The date change should reflect genuine work, not a cosmetic trick.

The 90-Day Recovery Playbook

Recovery isn't instant. People underestimate this consistently — they update a post on Monday and expect results by Friday. Here's the realistic timeline based on the 12 blog posts we've refreshed over the past year and the recovery patterns I've seen across sites we monitor. (Twelve isn't a huge sample. I'm giving you the patterns I see, not statistically rigorous data.)

  • Week 1: Diagnose the cause. Check SERPs, review competing content, verify technical issues (indexing, crawl errors). Make the refresh/rewrite/redirect/die decision. Expect no traffic change; this is research time.
  • Weeks 2-4: Execute the changes. Update content, add new sections, refresh data and examples. Request re-indexing in Search Console. Google recrawls within days; ranking shifts start appearing in weeks 3-4, usually erratically — you'll bounce between positions.
  • Weeks 5-8: Monitor. Don't touch the page; let Google settle. Track impressions and average position (clicks lag behind position changes). Impressions should stabilize or start climbing, and position should settle within 5-10 spots of where it will land.
  • Weeks 8-12: Assess. Has the page recovered to at least 60% of peak traffic? If yes, monitor quarterly. If no, consider whether a more aggressive action (full rewrite, redirect) is warranted. Final results become clear; recovery typically plateaus by weeks 10-12.

The recovery data from published studies is encouraging but needs context. HubSpot's 106% traffic increase from content refreshes was across their entire blog portfolio with a dedicated team — they have resources most of us don't. Backlinko documented a case study showing a 260.7% traffic increase from updating a single post — but that's an outlier, and I wish people would stop citing it as if it's typical. Don't expect that. Median recovery is more like 40-60%. A Single Grain case study published by Eric Siu reported a 96% traffic increase in 6 months from a systematic refresh program, which is more in line with what I'd call realistic-optimistic.

The number I trust most is this: across the refreshes we've done and the ones I've tracked from other sources, 70-80% of refreshed pages recover meaningful traffic within 90 days. "Meaningful" meaning at least 50% of their peak.

(I should note: 70-80% recovery rate means 20-30% of your refreshes won't work. Budget for that. Some pages just don't come back, no matter what you do. I'll talk about those next.)

The Pages I Couldn't Save

I want to be honest about the failures because every content decay article I've read makes it sound like refreshing always works. It doesn't.

We had a post about SEO audit checklists that was getting around 250 clicks per month at its peak. Classic decay pattern — slow decline starting around month 5. I did a full rewrite. New checklist, updated for current best practices, added downloadable template. Spent about 15 hours on it. After 90 days, it was at 60 clicks. Never recovered further. My best guess is that the query intent shifted from "give me a checklist" to "give me a tool that does the audit for me," and no amount of content improvement could compete with automated solutions ranking for the same keyword.

Another one: a comparison post about two specific tools. Both tools changed their pricing and feature sets significantly after we published. We updated the comparison, but by then, three newer comparison posts had been published by sites with more authority. We eventually redirected it to a broader tools overview page.

A third — our Shopify SEO checklist — partially recovered but plateaued at about 60% of its peak. Sometimes partial recovery is all you get.

I don't fully understand why some pages recover and others don't, even when the effort and approach seem similar. My best guess is that intent shift is the killer — when the underlying thing people want when they type that query has genuinely changed, no amount of content quality can overcome the mismatch. But I'm not sure. It might also be that some competitive landscapes are just harder to re-enter once you've lost position.

The practical takeaway: treat content refreshes as investments with uncertain returns, not guaranteed fixes. The 70-80% success rate is a portfolio statistic — it means most will work, but any individual refresh might not.

How AI Is Making Decay Faster

This section comes with a caveat: the data on AI's impact on content decay is still emerging. I'm less confident about these numbers than anything else in this article.

That said, here's what I'm seeing. AI Overviews are reducing click-through rates for informational queries by pulling answers directly into the search results. If your page ranked #2 for a question and Google now answers that question with an AI-generated summary at the top of the page, your clicks drop even though your ranking didn't change. This shows up as a CTR decline in Search Console, and it looks a lot like traditional decay even though the mechanism is different.

Metehan Yesilyurt's research on ChatGPT's behavior, published in his 2025 LinkedIn analysis of ChatGPT's ranking signals, found that ChatGPT assigns a URL freshness score that influences whether it recommends a source. Older content gets penalized in AI recommendations even when the information is still accurate. This creates a new decay vector that didn't exist two years ago — your content can decay in AI citation rankings independently of Google rankings.

What this means practically: the refresh cycle is getting shorter. Content that used to stay competitive for 12-18 months may now start losing ground at 6-9 months, because AI systems are biased toward recency in ways that traditional search wasn't. (I'm not sure yet whether this applies uniformly or only in certain niches — our data is mostly B2B SaaS and marketing content. E-commerce product pages might behave differently. But the direction is clear.)

If you're producing content that answers questions — the type most vulnerable to AI Overviews — monitoring for decay is more important now than it was even a year ago. The window between "peak performance" and "needs attention" is shrinking.

Frequently Asked Questions

How quickly does content decay typically happen?

Most content peaks between 3-6 months after publication, then begins a gradual decline. The speed varies by niche — fast-moving topics like AI or social media marketing can decay in weeks, while evergreen topics like basic business concepts may hold for 12-18 months. In our experience, the average page starts showing measurable decline around month 6-8.

Should I update the publication date when I refresh a post?

Only if you've made substantial content changes. Roxana Stingu has warned specifically about updating dates without real content modifications — Google can detect this pattern, and it undermines trust. If you've updated statistics, added new sections, and revised recommendations, absolutely update the date. If you fixed two typos, don't.

What's the minimum traffic a page needs before I should worry about decay?

We use a baseline of 20 clicks per month. Below that, the data is too noisy to distinguish real decay from normal variance. If a page is getting 5 clicks per month and drops to 2, that might just be a slow week. Focus your decay monitoring on pages that actually drive meaningful traffic.

Can refreshing content hurt my rankings?

It's rare, but possible. Major structural changes — especially changing the URL, removing sections that were attracting backlinks, or fundamentally shifting the topic — can cause temporary or permanent ranking drops. The safest approach is to keep the URL identical, preserve the core structure, and add or update rather than remove. If you're doing a full rewrite, keep the URL and set up redirect tracking so you can monitor the impact.

How do I prioritize which decaying pages to fix first?

Start with the pages that have the most to recover — high peak traffic combined with significant decline. A page that went from 500 clicks to 150 is a better candidate than one that went from 50 to 15, even though the percentage decline is similar. Also factor in business value: a page that drives conversions is worth more refresh effort than one that drives only informational traffic. We cover prioritization frameworks in our content refresh strategy guide.
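One way to turn that advice into a sortable score is absolute clicks lost, weighted up for pages that convert. A sketch (the 2x conversion weight is an illustrative assumption, not something from our data):

```python
def refresh_priority(peak_clicks, current_clicks, converts=False):
    """Score a decaying page for refresh priority by absolute clicks
    lost, doubled for pages that drive conversions. The 2x weight is
    an illustrative assumption, not a measured value."""
    lost = max(peak_clicks - current_clicks, 0)
    return lost * (2 if converts else 1)
```

By this score, the 500-to-150 page (350 lost clicks) far outranks the 50-to-15 page (35 lost clicks), even though both declined by 70%.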



Content decay isn't optional to deal with — it's happening to your pages right now, whether you're monitoring for it or not. The question is whether you catch it at 20% decline when recovery is straightforward, or at 80% decline when it might be too late. SEOJuice monitors your content automatically so you don't have to remember to check Search Console every month. But even if you use a spreadsheet and a calendar reminder, the important thing is that you're watching.
