TL;DR: Lovable ships CSR React apps -- fast and beautiful but invisible to Google without pre-rendering. This checklist covers sitemap, robots.txt, meta tags, and the rendering fix that makes it all work.
I have audited four Lovable sites in the past six months, and every single one had the same three problems: no sitemap, a robots.txt blocking JavaScript assets, and identical meta titles on every route. The sites looked great. The code was clean. The SEO was nonexistent.
This is not a Lovable problem per se -- it is a CSR problem. Lovable builds apps using React + Vite and ships them as client-side rendered single-page applications. Your browser loads a small HTML shell first, then JavaScript renders "pages" as app states. Google can index CSR sites, but it happens in two stages: crawl the initial HTML, then come back later to render JavaScript and capture the full content. That delay introduces predictable SEO pitfalls you do not see with "HTML-first" platforms.
Treat SEO like code: something you deliberately implement, verify, and maintain. Lovable explicitly recommends this "SEO as code" approach in their documentation.
Use this as your "done = published" gate. I am going to be blunt about what matters most: if you do nothing else, fix the first section. Crawlability and indexing problems show up on literally every Lovable site I have seen. The on-page stuff matters too, but a perfectly optimized title tag is worthless if Google cannot render your page.
Your crawlability checklist:
- A sitemap.xml, kept updated when routes change
- A robots.txt that does not block JS, CSS, or /assets/, with your sitemap referenced in robots.txt
- Your AI summary page (/llm.html or similar) included in your sitemap

This is where I start every Lovable audit, and it is where you should start too. I know the temptation is to jump straight to content optimization or link building, but trust me -- I have watched people spend weeks refining meta descriptions on a site that Google could not even render. Fix the foundation first. Everything else is polish.
Lovable calls a custom domain "one of the most important steps" for SEO because it consolidates authority under a stable canonical URL. Pick your format once -- https://example.com vs https://www.example.com, trailing slash or not -- and keep it consistent across canonicals, internal links, sitemaps, and redirects. I cannot stress this enough: pick one format and stick with it. Two of the four sites I audited had inconsistent URLs -- the sitemap used www, the canonicals did not, and the internal links were a mix of both. Google was indexing three versions of every page.
Lovable supports a primary domain mode where other domains redirect automatically. For domains added after October 29, 2025, the first custom domain becomes the primary by default.
Sitemaps are critical for CSR sites because crawlers cannot easily discover SPA routes through link traversal.
Lovable prompt:
Create XML sitemap at /sitemap.xml listing all public routes.
Include lastmod dates and priorities: homepage 1.0, main pages 0.8, blog posts 0.6.
Verify by opening https://example.com/sitemap.xml and confirming all key routes are present. Regenerate when URLs change -- Lovable does not do this automatically.
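For reference, a minimal sitemap matching that prompt might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/pricing</loc>
    <lastmod>2025-01-15</lastmod>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://example.com/blog/first-post</loc>
    <lastmod>2025-01-10</lastmod>
    <priority>0.6</priority>
  </url>
</urlset>
```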
(Side note: one of the four sites I audited had a sitemap that listed only the homepage. They had 23 routes. Google had indexed exactly one page. Twenty minutes of sitemap work fixed a problem they had been debugging for months.)
Lovable explicitly warns: never block CSS, JavaScript, or /assets/ because Google needs those resources to render CSR pages.
Create robots.txt at /public/robots.txt that allows all crawlers
and references Sitemap: https://example.com/sitemap.xml
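The whole file can be this short (example.com is a placeholder):

```
# Allow everything -- never disallow CSS, JS, or /assets/
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```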
Add self-referencing canonical tags to prevent duplicate content issues from SPA routing variations.
Add canonical tags to all pages pointing to their own URLs.
Use https://example.com format with no trailing slash.
Quick verification in the browser console: `console.log('Canonical:', document.querySelector('link[rel="canonical"]')?.href);`
This is the problem that surprises everyone. You build five beautiful pages in Lovable, each with different content, different headers, different purposes. You open them in your browser and everything looks distinct. Then you check Google's index and discover every single page has the same title: "My App - Built with Lovable." That is because Lovable explicitly calls out a CSR limitation: metadata does not update automatically across routes. This means every page shows the same title and description unless you fix it. I checked one Lovable site where all 12 pages had the identical title tag in Google's index. The owner had no idea until I showed them.
The solution is react-helmet-async:
Install react-helmet-async and implement per-route SEO metadata:
unique <title>, meta description, canonical, OG tags, and Twitter Card tags for every important route.
Verify by navigating between routes and confirming the page title, meta description, canonical, and OG tags change in the HTML.
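What that prompt should produce, roughly. This is a sketch, not Lovable's generated code -- the Seo component and its props are illustrative, but HelmetProvider and Helmet are react-helmet-async's actual API:

```tsx
import { Helmet, HelmetProvider } from "react-helmet-async";

// Wrap the app once so Helmet can manage the document head:
// <HelmetProvider><App /></HelmetProvider>

type SeoProps = {
  title: string;
  description: string;
  url: string;    // absolute URL in your one canonical format
  image?: string; // absolute URL for OG/Twitter previews
};

// Hypothetical per-route component: render one on every page.
export function Seo({ title, description, url, image }: SeoProps) {
  return (
    <Helmet>
      <title>{title}</title>
      <meta name="description" content={description} />
      <link rel="canonical" href={url} />
      <meta property="og:title" content={title} />
      <meta property="og:description" content={description} />
      <meta property="og:url" content={url} />
      {image && <meta property="og:image" content={image} />}
      <meta name="twitter:card" content="summary_large_image" />
      <meta name="twitter:title" content={title} />
      <meta name="twitter:description" content={description} />
    </Helmet>
  );
}

// Usage on a route:
// <Seo title="Pricing -- MyApp" description="..." url="https://example.com/pricing" />
```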
Title formulas that work:
- Brand -- One-line value prop
- Feature Name -- Benefit | Brand
- Use Case for [ICP] -- Outcome | Brand
- [Primary Keyword]: Specific promise (Year)

Meta description template: what it is + who it is for + proof + CTA. Keep it between 140 and 160 characters.
One H1 per route. Use H2/H3 for sections, not styling. Use lists and tables for structured information. Put your key value proposition and primary keyword in visible HTML near the top -- not hidden behind interactions.
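For reference, that structure in plain HTML terms (headings and copy are placeholders):

```html
<h1>Primary keyword and value proposition</h1>
<p>One-sentence summary, visible in the HTML near the top.</p>
<h2>First major section</h2>
<h3>Subsection</h3>
<ul>
  <li>Structured point one</li>
  <li>Structured point two</li>
</ul>
```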
Lovable states that internal links help users and search engines navigate, discover content, understand topic relationships, and distribute authority.
Your minimum viable architecture:
- Navigation and footer links to every important page
- Contextual links between related pages in body content
- Real <a href=""> links, not click handlers, so crawlers can follow them (see the sketch below)
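A minimal sketch of the difference, assuming react-router (the Nav component and routes here are illustrative):

```tsx
import { Link, useNavigate } from "react-router-dom";

export function Nav() {
  const navigate = useNavigate();
  return (
    <nav>
      {/* Crawlable: <Link> renders a real <a href="/pricing"> */}
      <Link to="/pricing">Pricing</Link>

      {/* Not crawlable: no href for a crawler to follow */}
      <button onClick={() => navigate("/blog")}>Blog</button>
    </nav>
  );
}
```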
Start with Organization and WebSite schema on the homepage, Article schema on blog posts, Product schema on product pages, and FAQPage schema on FAQ pages.

Lovable prompt:

Add JSON-LD structured data:
- Organization schema on the homepage (name, description, URL, logo, social links)
- Article schema on blog posts
- FAQPage schema on /faq
Validate output and keep it consistent with visible page content.
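The Organization block in the rendered HTML should look roughly like this (all values are placeholders); you can validate it with Google's Rich Results Test:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "description": "One sentence on what Example Co does.",
  "sameAs": [
    "https://twitter.com/example",
    "https://www.linkedin.com/company/example"
  ]
}
</script>
```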
Most social platforms do not execute JavaScript. Without OG and Twitter metadata in the initial HTML, you get generic or broken link previews. Add unique OG title, description, and image per important route. Do not reuse one generic image for every page.
Lovable includes a built-in Speed tool powered by Google Lighthouse. Target scores: Performance 90+, Accessibility 90+, Best Practices 90+, SEO 100.
Improve performance:
- compress large images, use WebP/AVIF
- add width/height attributes to images
- lazy load non-critical images
- defer non-essential scripts
- preload key assets
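Two of those fixes as they appear in the rendered HTML (paths are placeholders):

```html
<!-- Explicit dimensions prevent layout shift; lazy loading defers offscreen images -->
<img src="/assets/screenshot.webp" width="1200" height="630"
     alt="Product screenshot" loading="lazy" />

<!-- Preload a render-critical asset, such as the main web font -->
<link rel="preload" href="/assets/inter.woff2" as="font" type="font/woff2" crossorigin />
```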
Lovable includes an "AI bot access" section showing how to allow or deny bots like GPTBot, PerplexityBot, Claude-Web, and Google-Extended.
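If you prefer to manage this in robots.txt directly, the equivalent entries look like this (which bots you allow is entirely your call):

```
# Example policy: allow OpenAI's crawler, block Google's AI-training crawler
User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Disallow: /
```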
Not yet proven in the SEO community, but Lovable recommends a dedicated summary page that AI systems can easily crawl and cite. Include: what your product does (1-2 sentences), who it is for, key features (bullets), pricing summary, security highlights, links to docs and pricing, and a short FAQ with quoteable answers.
Write short, direct FAQ answers that start with the main answer. Avoid vague marketing language. These become the snippets LLMs lift into their responses.
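A rough skeleton for that summary page, following the structure above (all content is placeholder):

```html
<!-- /llm.html: a plain, easily crawlable summary page -->
<h1>Example Co: one sentence on what the product does</h1>
<p>Built for [who it is for].</p>
<h2>Key features</h2>
<ul>
  <li>Feature one, stated plainly</li>
  <li>Feature two, stated plainly</li>
</ul>
<h2>Pricing</h2>
<p>Plans start at $X/month. Details: https://example.com/pricing</p>
<h2>FAQ</h2>
<p><strong>What does Example Co do?</strong> A direct, quoteable one-sentence answer.</p>
```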
Use Google Search Console as your core monitoring tool. Verify via DNS TXT (recommended by Lovable).
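The verification record itself is a single TXT entry at your DNS provider (the token below is a placeholder -- Search Console gives you the real one):

```
example.com.  3600  IN  TXT  "google-site-verification=your-token-here"
```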
Put this on a maintenance schedule: recheck the fundamentals whenever routes, content, or domain settings change. The recurring pitfalls to watch for:
- Missing or outdated sitemap. Especially important for CSR; regenerate and resubmit when routes change.
- robots.txt blocking rendering resources. Never block CSS/JS or /assets/; re-test in URL Inspection after changes.
- No per-route titles or descriptions. Install react-helmet-async and set unique metadata per route.
- No canonical strategy. Add self-referential canonicals and choose one preferred domain format.
- Weak internal linking. Ensure nav, footer, and contextual links exist; important pages need multiple links.
(Another aside: the easiest way to audit a Lovable site is to open three different routes and check if the browser tab title changes. If it shows the same title everywhere, you have the metadata problem. Takes 10 seconds.)
Can Google index a client-side rendered Lovable site? Yes. Google can index CSR sites via a two-stage process. The key is implementing crawlability and per-route metadata correctly.

Why does CSR rank more slowly than SSR or SSG? CSR requires a second rendering wave for full content extraction. SSR and SSG platforms serve pre-rendered HTML that Google reads immediately.

Do you need a custom domain? If you care about long-term SEO, yes. Lovable strongly recommends it for consolidating authority under one canonical URL.
Your final checklist:
- sitemap.xml (kept updated)
- robots.txt that does not block assets

SEOJuice automates internal linking, meta tag management, and structured data -- tasks that become bottlenecks as your content library grows. It also tracks visibility across AI platforms and Google Search Console.