SEO for Next.js, React, and Nuxt

Vadim Kravcenko
Mar 25, 2026 · 18 min read

TL;DR: JavaScript frameworks break SEO by default. Client-side rendering means search engines see an empty shell — and AI crawlers don't execute JavaScript at all. Next.js (with Server Components and the Metadata API) is the best option for SEO right now. Nuxt is close behind for Vue shops. Plain React SPAs should never be used for pages you want indexed. This guide has the exact code, the exact mistakes, and the exact fixes. No hand-waving.

The Fundamental Problem: JavaScript and Search Engines Don't Get Along

Googlebot JavaScript processing pipeline showing crawling, rendering, and indexing phases
How Googlebot processes JavaScript: pages go through separate crawling, rendering, and indexing phases, which can delay content discovery. Source: Google Search Central

The rule: If a URL needs to rank in search or be cited by AI, it must deliver complete HTML in the initial response. That's it. "But Google can render JavaScript" — yes, eventually, unreliably, after a queue that could last days, on infrastructure that Google operates at enormous cost specifically because so many developers ship empty HTML shells and expect the search engine to do the rendering work for them. And AI crawlers won't even try.

ISR is... actually, let me back up. I used to think ISR was a clear winner over SSR for most cases. After watching revalidation bugs cause stale content to persist for weeks on a few client sites, I'm less certain. On one client's e-commerce site, ISR pages were serving stale prices for 6 hours because the revalidation silently failed — no error in the logs, no alert, just wrong prices shown to customers and crawlers alike. ISR is excellent when it works. When it doesn't, debugging stale cache issues is genuinely painful. SSR is more predictable, even if it's slower.
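One cheap guard against this failure mode is to bake a generated-at timestamp into each ISR page (a meta tag or a response header) and alert when its age exceeds the revalidation window. Here's a minimal sketch of the staleness check — the grace-factor threshold is my assumption, not anything Next.js ships:

```typescript
// Hypothetical staleness check for ISR pages: compare the timestamp
// baked into the page at build/revalidate time against "now".
// A page revalidating every hour should never be much older than that.
export function isStale(
  generatedAt: Date,
  now: Date,
  revalidateSeconds: number,
  graceFactor = 2 // allow one missed cycle before alerting
): boolean {
  const ageSeconds = (now.getTime() - generatedAt.getTime()) / 1000
  return ageSeconds > revalidateSeconds * graceFactor
}

// Example: an ISR page with revalidate = 3600 that is 8 hours old
// has silently stopped revalidating.
const generated = new Date('2026-03-25T00:00:00Z')
const now = new Date('2026-03-25T08:00:00Z')
console.log(isStale(generated, now, 3600)) // true -- alert
```

Wire this into whatever monitoring you already have. The point is to make silent revalidation failures loud.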

Next.js SEO: The Deep Guide

Next.js gets the longest section because it deserves it. It's the most popular React framework, and since the App Router became the default in Next.js 13+, it's become the best JavaScript framework for SEO. Not close — meaningfully better than the alternatives.

If you're building a new project in 2026 and SEO matters, use Next.js with the App Router. That's the recommendation. The rest of this section explains why.

App Router vs Pages Router — SEO Implications

Next.js has two routing systems: the App Router (introduced in Next.js 13, stable since 13.4) and the legacy Pages Router. Both can produce good SEO outcomes, but the App Router is better in almost every way that matters for search engines:

  • Server Components by default — components render on the server unless you explicitly add 'use client'. Less JavaScript shipped means faster pages and complete HTML for crawlers.
  • Streaming SSR — the server starts sending HTML immediately, before all data is fetched. TTFB improves. Googlebot gets content faster.
  • Built-in Metadata API — type-safe metadata generation per route. No more third-party packages for meta tags.
  • Nested layouts — shared UI persists across navigations without re-rendering, improving Core Web Vitals.

The Pages Router still works fine. If you have a large existing codebase on it, don't panic-migrate.

But for new projects? App Router, no question.

The Metadata API (This Is the Big One)

Before the App Router, Next.js SEO meant installing next-seo and manually wiring up meta tags. The Metadata API — which I actually think is one of the best things about the App Router, despite my complaints about the migration pain — handles this cleanly as a first-class framework feature.

Static metadata for a page:

// app/about/page.tsx
import { Metadata } from 'next'

export const metadata: Metadata = {
  title: 'About Us | Your Company',
  description: 'We build tools that make SEO automatic.',
  openGraph: {
    title: 'About Us',
    description: 'We build tools that make SEO automatic.',
    type: 'website',
  },
  alternates: {
    canonical: 'https://example.com/about',
  },
  robots: {
    index: true,
    follow: true,
  },
}

Dynamic metadata for pages with parameters (blog posts, products, etc.):

// app/blog/[slug]/page.tsx
import { Metadata } from 'next'

type Props = { params: { slug: string } }

export async function generateMetadata({ params }: Props): Promise<Metadata> {
  const post = await getPost(params.slug)

  return {
    title: `${post.title} | Your Blog`,
    description: post.excerpt,
    openGraph: {
      title: post.title,
      description: post.excerpt,
      type: 'article',
      publishedTime: post.publishedAt,
      authors: [post.author.name],
      images: [{ url: post.ogImage, width: 1200, height: 630 }],
    },
    alternates: {
      canonical: `https://example.com/blog/${params.slug}`,
    },
  }
}

export default async function BlogPost({ params }: Props) {
  const post = await getPost(params.slug)

  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.content}</p>
    </article>
  )
}

This is a Server Component. No 'use client' directive. It runs entirely on the server. The HTML that reaches Googlebot includes the full article content and all meta tags. Zero JavaScript rendering required.

Common mistake I see constantly: developers put their data fetching in a 'use client' component, which means the content loads via JavaScript after the initial HTML is sent. Googlebot gets an empty page. Seriously. Move your data fetching to Server Components or generateMetadata. And if you're not sure whether a component is server or client, check for the 'use client' directive at the top of the file — if it's there, nothing in that component tree will be in the initial HTML response.

Sitemap Generation

Next.js App Router has native sitemap support. Create app/sitemap.ts and it generates at /sitemap.xml automatically:

// app/sitemap.ts
import { MetadataRoute } from 'next'

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const posts = await getAllPosts()
  const products = await getAllProducts()

  const postEntries = posts.map((post) => ({
    url: `https://example.com/blog/${post.slug}`,
    lastModified: new Date(post.updatedAt),
    changeFrequency: 'weekly' as const,
    priority: 0.7,
  }))

  const productEntries = products.map((product) => ({
    url: `https://example.com/products/${product.slug}`,
    lastModified: new Date(product.updatedAt),
    changeFrequency: 'daily' as const,
    priority: 0.8,
  }))

  return [
    {
      url: 'https://example.com',
      lastModified: new Date(),
      changeFrequency: 'daily',
      priority: 1,
    },
    ...postEntries,
    ...productEntries,
  ]
}

For sites with more than 50,000 URLs, use generateSitemaps() to create multiple sitemap files. Google's limit is 50,000 URLs per sitemap file.
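The shard math behind that is simple; sketched below is a pure helper for it, plus (in comments) how it would wire into the documented generateSitemaps() / sitemap({ id }) convention — getPostCount and getPosts are hypothetical data helpers:

```typescript
// Pure helper: how many sitemap shards a URL count needs, and the
// id list Next.js expects back from generateSitemaps().
const SITEMAP_LIMIT = 50_000

export function sitemapIds(totalUrls: number): Array<{ id: number }> {
  const shards = Math.max(1, Math.ceil(totalUrls / SITEMAP_LIMIT))
  return Array.from({ length: shards }, (_, id) => ({ id }))
}

// In app/blog/sitemap.ts you would wire it up roughly like this:
//
// export async function generateSitemaps() {
//   return sitemapIds(await getPostCount())
// }
//
// export default async function sitemap({ id }: { id: number }) {
//   const posts = await getPosts({
//     offset: id * SITEMAP_LIMIT,
//     limit: SITEMAP_LIMIT,
//   })
//   return posts.map((post) => ({
//     url: `https://example.com/blog/${post.slug}`,
//     lastModified: new Date(post.updatedAt),
//   }))
// }

console.log(sitemapIds(120_000).length) // 3 shards for 120k URLs
```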

robots.ts

Same pattern. Create app/robots.ts:

// app/robots.ts
import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: '*',
        allow: '/',
        disallow: ['/api/', '/dashboard/', '/admin/'],
      },
    ],
    sitemap: 'https://example.com/sitemap.xml',
  }
}

Dynamic OG Images

Next.js can generate Open Graph images on-the-fly using next/og (built on Vercel's Satori library). This is useful — instead of manually creating OG images for every blog post, you define a template and it renders at request time:

// app/blog/[slug]/opengraph-image.tsx
import { ImageResponse } from 'next/og'

export const size = { width: 1200, height: 630 }
export const contentType = 'image/png'

export default async function Image({
  params,
}: {
  params: { slug: string }
}) {
  const post = await getPost(params.slug)

  return new ImageResponse(
    (
      <div style={{
        display: 'flex',
        flexDirection: 'column',
        justifyContent: 'center',
        padding: '60px',
        background: 'white',
        width: '100%',
        height: '100%',
      }}>
        <h1 style={{ fontSize: 48, fontWeight: 700 }}>
          {post.title}
        </h1>
        <p style={{ fontSize: 24, color: '#666' }}>
          {post.excerpt}
        </p>
      </div>
    ),
    { ...size }
  )
}

Next.js automatically sets the og:image meta tag to point at this generated image. No manual wiring needed.

Image Optimization with next/image

The next/image component handles lazy loading, automatic WebP/AVIF conversion, responsive sizing, and prevents layout shift (CLS). All of these directly affect Core Web Vitals, which Google uses as a ranking signal.

import Image from 'next/image'

// This automatically:
// - Generates WebP/AVIF variants
// - Lazy loads below-the-fold images
// - Sets width/height to prevent CLS
// - Serves responsive sizes
<Image
  src="/hero.jpg"
  alt="Product screenshot showing the dashboard"
  width={1200}
  height={630}
  priority  // Above-the-fold: disable lazy loading
/>

Two things developers consistently get wrong: they forget priority on the LCP image (your largest above-the-fold image), and they use raw <img> tags instead of next/image. Both hurt Core Web Vitals.

Structured Data in Next.js

Add JSON-LD structured data directly in your Server Components. No third-party package needed:

// app/blog/[slug]/page.tsx
export default async function BlogPost({ params }) {
  const post = await getPost(params.slug)

  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: post.title,
    description: post.excerpt,
    datePublished: post.publishedAt,
    dateModified: post.updatedAt,
    author: {
      '@type': 'Person',
      name: post.author.name,
    },
    image: post.ogImage,
  }

  return (
    <>
      {/* Rendered server-side from your own database; escape "<" so
          nothing in the JSON can close the script tag early */}
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{
          __html: JSON.stringify(jsonLd).replace(/</g, '\\u003c'),
        }}
      />
      <article>
        <h1>{post.title}</h1>
        <p>{post.content}</p>
      </article>
    </>
  )
}

Common Next.js SEO Mistakes

I see these on almost every Next.js site I audit. Every single one of them costs rankings.

  • Fetching data in 'use client' components — Your content loads after JavaScript runs. Googlebot sees an empty loading spinner. Move data fetching to Server Components.
  • Missing canonical URLs — Next.js doesn't set canonicals by default. If you have /blog/my-post and /blog/my-post?ref=twitter both indexed, you're splitting authority. Set alternates.canonical in your metadata.
  • No metadata on dynamic routes — Static pages get the metadata export. Dynamic pages need generateMetadata. I've seen sites with perfect metadata on the homepage and <title>undefined</title> on every product page.
  • Using loading.tsx for critical content — The loading file shows a skeleton while content streams in. Googlebot may index the skeleton instead of the actual content. Use loading.tsx for non-critical UI, not for the main page content.
  • Client-side redirects — Using router.push() for redirects. Search engines don't execute JavaScript redirects reliably. Use next.config.js redirects or middleware for server-side 301/302s.

React SPA SEO: Mostly "Don't"

I'm annoyed I have to write this section. It's 2026. (If you're still using Create React App for a public site in 2026, we need to talk.) But the emails keep coming, so here we are.

Don't use a plain React SPA for pages you want indexed. That's the whole point. A plain React SPA — Create React App, Vite with React, whatever — sends an empty HTML shell to every crawler that visits. Google might render it eventually. AI crawlers never will. Use it for dashboards, admin panels, anything behind a login. Don't use it for anything you want to appear in search results.

Stopgap: React Helmet + Prerendering

If you're stuck with a React SPA and can't migrate to Next.js (I understand — it happens), here's the duct tape approach:

// Using react-helmet-async for meta tags
import { Helmet } from 'react-helmet-async'

function ProductPage({ product }) {
  return (
    <>
      <Helmet>
        <title>{product.name} | Your Store</title>
        <meta name="description" content={product.description} />
        <link rel="canonical"
          href={`https://example.com/products/${product.slug}`} />
      </Helmet>
      <h1>{product.name}</h1>
    </>
  )
}

React Helmet manages your <head> tags. But — and this is critical — it still runs client-side. Googlebot still needs to render JavaScript to see these tags. You need a prerendering service (Prerender.io, Rendertron, or your own Puppeteer setup) to serve pre-rendered HTML to crawlers.

This works. I've seen it work. But it's fragile, adds latency for crawler requests, costs money, and you're fighting the framework instead of working with it.

If your SPA has more than 50 pages that need indexing — and those pages have dynamic content that changes weekly, which means the prerendering cache needs constant invalidation, which means someone on your team is now maintaining crawler infrastructure instead of building product features — the cost of maintaining a prerendering setup exceeds the cost of migrating to Next.js. I've run those numbers for clients.

As Addy Osmani and Jason Miller documented on web.dev, prerendering typically adds noticeable server-side latency per page for crawler requests, and edge cases with dynamic content or authenticated states frequently produce stale or incorrect snapshots. It's a valid bridge strategy, not a permanent solution (web.dev: Rendering on the Web).
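Under the hood, these services key off the User-Agent header to decide which requests get the pre-rendered snapshot. A minimal sketch of that detection — the bot list here is illustrative, not exhaustive; real services maintain far longer lists:

```typescript
// Minimal crawler detection of the kind prerendering middleware uses.
// The pattern is a partial, illustrative list of bot tokens.
const CRAWLER_PATTERN =
  /googlebot|bingbot|duckduckbot|gptbot|claudebot|perplexitybot|facebookexternalhit|twitterbot/i

export function isCrawler(userAgent: string): boolean {
  return CRAWLER_PATTERN.test(userAgent)
}

// Express-style usage sketch (prerenderUrl is a hypothetical helper
// that fetches the snapshot from your prerendering service):
//
// app.use(async (req, res, next) => {
//   if (isCrawler(req.headers['user-agent'] ?? '')) {
//     res.send(await prerenderUrl(req.originalUrl))
//   } else {
//     next()
//   }
// })

console.log(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1)')) // true
```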

Nuxt SEO: The Vue Equivalent

If your team is in the Vue ecosystem, Nuxt is to Vue what Next.js is to React. Same idea: take a client-side framework, add server rendering, metadata management, and file-based routing. (I should admit: I've spent far less time with Nuxt than Next.js, so take my opinions here with extra salt.) The SEO story is strong — not quite as polished as Next.js in a few areas, but close enough that it shouldn't be the deciding factor between the two ecosystems.

SSR Is the Default

Nuxt uses universal rendering out of the box. Every page is server-rendered on first load, then hydrated for client-side navigation. You don't have to opt in. You have to opt out. That's the right default for SEO.

useHead() and useSeoMeta()

Nuxt's composable system for metadata is clean. Two options depending on how much control you need:

<!-- pages/blog/[slug].vue -->
<script setup>
const route = useRoute()
const { data: post } = await useFetch(`/api/posts/${route.params.slug}`)

// Option 1: useSeoMeta — type-safe, covers the common cases
useSeoMeta({
  title: () => post.value?.title,
  description: () => post.value?.excerpt,
  ogTitle: () => post.value?.title,
  ogDescription: () => post.value?.excerpt,
  ogType: 'article',
  ogImage: () => post.value?.ogImage,
  twitterCard: 'summary_large_image',
})

// Option 2: useHead — full control over <head> tags
useHead({
  link: [
    {
      rel: 'canonical',
      href: `https://example.com/blog/${route.params.slug}`,
    }
  ],
  script: [
    {
      type: 'application/ld+json',
      innerHTML: JSON.stringify({
        '@context': 'https://schema.org',
        '@type': 'Article',
        headline: post.value?.title,
        datePublished: post.value?.publishedAt,
      }),
    },
  ],
})
</script>

<template>
  <article>
    <h1>{{ post?.title }}</h1>
    <p>{{ post?.content }}</p>
  </article>
</template>

I like useSeoMeta() a lot. It's more opinionated than Next.js's Metadata API but covers 90% of what you need with less boilerplate. The type safety means your IDE will catch typos in meta tag names — something that's caused real bugs on projects I've worked on.

Hybrid Rendering with Route Rules

Nuxt's routeRules let you mix rendering strategies per route. This is one area where Nuxt is arguably ahead of Next.js in developer experience:

// nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    '/':              { prerender: true },     // SSG — homepage
    '/blog/**':       { isr: 3600 },           // ISR — revalidate hourly
    '/products/**':   { ssr: true },           // SSR — always fresh
    '/dashboard/**':  { ssr: false },          // CSR — authenticated pages
    '/docs/**':       { prerender: true },     // SSG — documentation
  }
})

One config object, five different rendering strategies. With Next.js, you'd set export const dynamic = 'force-static' or export const revalidate = 3600 on each individual page file. (Actually, that's not entirely fair — Next.js middleware can handle some of this centrally too. But Nuxt's approach is more explicit about it.) Nuxt's route-level rules scale better when your rendering strategy follows clear URL patterns.

Sitemap Module

Nuxt doesn't have built-in sitemap generation like Next.js. You need the @nuxtjs/sitemap module (which is officially maintained and well-supported):

// nuxt.config.ts
export default defineNuxtConfig({
  modules: ['@nuxtjs/sitemap'],

  sitemap: {
    sources: ['/api/__sitemap__/urls'],
    exclude: ['/dashboard/**', '/admin/**'],
  },

  site: {
    url: 'https://example.com',
  },
})

The module auto-discovers your static pages from the file-based routing. For dynamic pages (blog posts, products), you provide an API endpoint that returns the URL list. Sitemap index files are generated automatically when you exceed 50,000 URLs.
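A sketch of what that dynamic-URL endpoint might look like — getAllPosts is a hypothetical data helper, and the { loc, lastmod } entry shape follows the module's documentation; the mapping itself is pure:

```typescript
// server/api/__sitemap__/urls.ts -- sketch of the endpoint the
// sitemap module consumes for dynamic pages.
type Post = { slug: string; updatedAt: string }

export function toSitemapEntries(posts: Post[]) {
  return posts.map((post) => ({
    loc: `/blog/${post.slug}`,   // relative URLs; site.url is prepended
    lastmod: post.updatedAt,
  }))
}

// Wired into a Nitro handler (commented because getAllPosts is
// hypothetical):
//
// export default defineEventHandler(async () => {
//   return toSitemapEntries(await getAllPosts())
// })

console.log(toSitemapEntries([{ slug: 'hello', updatedAt: '2026-01-01' }]))
```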

Nuxt Content for Blog/Docs

If you're building a blog or documentation site with Nuxt, the @nuxt/content module is worth knowing. It reads Markdown/MDX files from a content/ directory, generates pages with full SSG support, and includes built-in search. For SEO, the key benefit is that every content page is pre-rendered as static HTML — the fastest possible delivery to search engines.

War Story: The Migration That Changed My Mind

Google Lighthouse SEO audit score showing passed and failed SEO checks
Lighthouse SEO audit scores your page on meta tags, crawlability, and mobile-friendliness. Source: Semrush Blog

Fix: Server-render your critical content. If you must have loading states, make sure they only appear for non-critical UI elements. Verify: disable JavaScript in Chrome DevTools (Settings > Debugger > Disable JavaScript), reload the page. Whatever you see is what most crawlers see. If you see a spinner, so does Google.

4. Missing Structured Data

JSON-LD structured data helps search engines understand what your page is about. Product pages without Product schema. Blog posts without Article schema. FAQ pages without FAQPage schema. Every missing schema type is a missed opportunity for rich results.

Fix: Add JSON-LD to every page type. Verify: run Google's Rich Results Test on each page type. Both Next.js and Nuxt make this straightforward — see the code examples above. Or let SEOJuice handle it automatically.

5. Client-Side Redirects

Using JavaScript (window.location.href, router.push(), navigateTo()) for redirects instead of server-side 301/302s. Search engines don't reliably follow JavaScript redirects. Link equity doesn't transfer. Pages that should be consolidated stay split.

Fix: Use server-side redirects. In Next.js, configure them in next.config.js or use middleware. In Nuxt, use routeRules redirects. Verify: curl -I https://your-site.com/old-url — you should see a 301 or 302 status code with a Location header. If you see 200 OK, the redirect is client-side only.
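For the Next.js case, a minimal redirects() sketch — the shape follows the documented next.config.js API; the URLs are placeholders:

```typescript
// next.config.js redirects: evaluated server-side, so crawlers get a
// real 3xx status instead of a JavaScript hop. permanent: true sends
// a 308, which search engines treat like a 301.
const nextConfig = {
  async redirects() {
    return [
      {
        source: '/old-url',
        destination: '/new-url',
        permanent: true,
      },
      {
        source: '/blog/:slug/amp', // path params are forwarded
        destination: '/blog/:slug',
        permanent: true,
      },
    ]
  },
}

export default nextConfig
```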

6. Forgetting About AI Crawlers

This is the new one. Even if your SSR setup is perfect for Googlebot, check whether your robots.txt blocks AI crawlers. Some CDN providers (especially Cloudflare with its "AI Bot" toggle) block GPTBot and similar crawlers by default. If you want visibility in AI search engines, make sure they can access your content. Verify: curl https://your-site.com/robots.txt and search for GPTBot, ClaudeBot, PerplexityBot. If they're disallowed and you didn't put them there, your CDN did.
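If your CDN is the culprit, you fix it in the CDN dashboard. But it's also worth making the allows explicit in app/robots.ts. A sketch (the MetadataRoute.Robots type annotation is dropped so the example stands alone; the user-agent strings are the bots' documented tokens):

```typescript
// app/robots.ts extended with explicit rules for AI crawlers.
export default function robots() {
  return {
    rules: [
      { userAgent: '*', allow: '/', disallow: ['/api/', '/admin/'] },
      // Explicit allow so a copy-pasted blanket disallow elsewhere
      // doesn't lock AI crawlers out. Note: robots.txt does NOT
      // override CDN-level bot blocking -- check that separately.
      {
        userAgent: ['GPTBot', 'ClaudeBot', 'PerplexityBot'],
        allow: '/',
      },
    ],
    sitemap: 'https://example.com/sitemap.xml',
  }
}

console.log(robots().rules.length) // 2
```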

And one final check that covers all of the above: right-click any page, View Page Source. If your content is in the raw HTML, server rendering is working. If you see an empty <div id="root"></div>, it's not. Run Lighthouse's SEO audit too — aim for 100, it's achievable with server rendering. As Jason Miller and Addy Osmani wrote on web.dev: rendering strategies involve real tradeoffs, and you cannot afford to assume your JavaScript will be executed correctly by every crawler (web.dev: Rendering on the Web).

The Honest Take

Google Search Console URL Inspection tool showing indexation status
URL Inspection in Google Search Console shows indexation status and how Googlebot rendered the page. Source: Semrush Blog

JavaScript frameworks and SEO have a complicated history. For years, the advice was "just use server-side rendering" and that was basically it. Now the picture is more nuanced but also more tractable.

Next.js with the App Router is the best option for most teams in 2026. Server Components mean your pages are server-rendered by default, the Metadata API is type-safe and comprehensive, and the ecosystem is mature. If you're in the Vue ecosystem, Nuxt gives you the same core benefits with a slightly different API surface.

Plain React SPAs should not be used for public-facing pages. I'm tired of seeing React SPAs on marketing sites. It's 2026. We should know better. And this isn't going to change — if anything, the rise of AI crawlers that don't execute JavaScript makes this harder, not easier.

One thing I'm genuinely uncertain about: the long-term trajectory of Remix versus Next.js for SEO. Remix's "SSR only, no caching layer" philosophy is appealing in its simplicity. As edge computing gets cheaper, the argument for ISR (Next.js's middle ground) weakens. I wouldn't bet against Remix in 2027. But today, Next.js has more built-in SEO features, a larger ecosystem, and better documentation. Use what solves your problem now.

If all of this sounds like a lot of configuration to get right — it is. That's why SEOJuice exists: you pick the rendering strategy, we automate the rest.


FAQ

Does Google actually render JavaScript now?

Yes, but with caveats. Google uses the Web Rendering Service (an evergreen Chromium) to execute JavaScript. The rendering happens in a second pass — after the initial crawl — and can take anywhere from seconds to days. For established sites with high crawl budgets, it's usually fast. For new or low-authority sites, the delay can be significant. And crucially: Google's rendering is not guaranteed. Pages that depend on complex JavaScript interactions, require authentication, or have rendering errors may never be fully rendered. The safer approach is always to serve complete HTML in the initial response.

Is Nuxt better or worse than Next.js for SEO?

Choose based on whether your team writes Vue or React, not based on SEO differences.

How do I fix SEO for an existing React SPA without rewriting everything?

The fastest path: add Prerender.io or a similar prerendering service as middleware. This intercepts crawler requests and serves pre-rendered HTML. It takes about an hour to set up and immediately makes your content visible to search engines. Then plan a gradual migration to Next.js — the App Router can coexist with existing pages, so you can migrate route by route instead of doing a big-bang rewrite. Start with your highest-traffic pages.
