Why Client-Side Rendering Destroys Your Search Rankings
CSR sends empty HTML to crawlers. Learn why client-side rendering hurts SEO and what to do about it.

The short answer: Client-side rendering (CSR) is the worst rendering strategy for SEO. It sends empty HTML to crawlers, delays Google indexing, breaks social media link previews, and is completely invisible to AI search engines like ChatGPT and Perplexity. If your website relies on CSR, you are losing search traffic.
Every time a crawler visits a client-side rendered page, it sees a nearly empty HTML document. The actual content only appears after JavaScript executes in a browser, and most crawlers either cannot or will not wait for that to happen.
How CSR Works (And Why Crawlers Hate It)
In a client-side rendered application, the server returns something like this:
```html
<html>
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>
```

The browser downloads bundle.js, executes it, makes API calls to fetch data, and then populates the page. A human visitor sees a loading spinner for a moment, then the content appears. But a crawler that does not execute JavaScript sees only that empty `<div>`.
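One quick way to see what a non-JavaScript crawler gets is to fetch your page's raw HTML (with curl or your browser's "view source") and check whether the body contains any visible text at all. Here is a rough sketch of that check; `looksClientRendered` is a hypothetical helper for illustration, not a real crawler simulator:

```typescript
// Heuristic: strip <script> tags and remaining markup from <body>,
// then see whether any visible text survives. A CSR shell leaves nothing.
function looksClientRendered(html: string): boolean {
  const bodyMatch = html.match(/<body[^>]*>([\s\S]*?)<\/body>/i);
  if (!bodyMatch) return true; // no <body> at all: nothing for a crawler
  const visibleText = bodyMatch[1]
    .replace(/<script[\s\S]*?<\/script>/gi, "") // scripts are not content
    .replace(/<[^>]+>/g, "")                    // drop remaining tags
    .trim();
  return visibleText.length === 0;
}

const csrShell =
  `<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>`;
const ssrPage =
  `<html><body><h1>Pricing</h1><p>Plans start at $9.</p></body></html>`;

console.log(looksClientRendered(csrShell)); // true: crawler sees nothing
console.log(looksClientRendered(ssrPage));  // false: content is in the HTML
```

If the check comes back true for your own pages, everything a crawler could index lives behind JavaScript execution.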
Google's Web Rendering Service (WRS) can execute JavaScript, but it does so in two phases. First, Googlebot crawls and indexes the raw HTML. Second, the page enters a render queue where WRS runs an evergreen Chromium instance to execute the JavaScript. Google states this queue delay typically lasts "a few seconds," but independent research from Onely found that Google needs 9x more time to crawl JavaScript pages than plain HTML pages.
The WRS also caches aggressively and may ignore your cache headers. If you update your JavaScript bundle but Google serves a cached version, your latest content changes may not appear in search results for days or weeks.
Not sure if your site has this problem? We can run a free check and show you exactly what Google sees.
AI Crawlers Cannot See CSR Content
This is the most important development for web rendering in 2025 and 2026. AI search engines are growing rapidly, but none of them execute JavaScript.
Vercel analyzed their network traffic and found that AI crawlers make over 900 million requests per month across their platform. Here is what they discovered:
GPTBot (OpenAI) makes 569 million requests per month on Vercel's network. It fetches JavaScript files 11.5% of the time but does not execute them. The files are likely collected for training data, not for understanding page content.
ClaudeBot (Anthropic) makes 370 million requests per month. It fetches JavaScript files 23.84% of the time but does not execute them.
PerplexityBot does not execute JavaScript either. Cloudflare has documented cases of Perplexity using undeclared crawlers that impersonate Chrome to bypass access restrictions, but even these do not appear to execute JavaScript for content rendering.
If your website uses CSR, your content does not exist for any AI search engine. As AI-powered search (ChatGPT search, Perplexity, Google AI Overviews) continues to grow, this gap will only widen.
Want to see if your site is visible to AI crawlers? Get a free audit and we will check your rendering output.
Social Media Previews Break Too
When you share a link on Facebook, Twitter/X, LinkedIn, Discord, or Slack, those platforms send a crawler to fetch your page and extract Open Graph meta tags (og:title, og:image, og:description) for the link preview card.
None of these crawlers execute JavaScript. If your CSR app injects Open Graph tags via client-side JavaScript, every social share will show a blank or broken preview. This hurts click-through rates from social media and reduces the value of every link shared.
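You can verify this the same way the preview crawlers do: read the raw HTML and look for og: meta tags in the markup itself, not in the rendered DOM. A minimal sketch, assuming the common `property="og:…" content="…"` attribute order (a hypothetical helper, not any platform's actual parser):

```typescript
// Extract Open Graph tags from raw HTML the way a link-preview crawler
// does: no JavaScript execution, only the markup the server sent.
function extractOpenGraph(html: string): Record<string, string> {
  const tags: Record<string, string> = {};
  const re = /<meta\s+property="(og:[^"]+)"\s+content="([^"]*)"/gi;
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) {
    tags[m[1]] = m[2];
  }
  return tags;
}

// Server-rendered head: the crawler finds both tags.
const staticHead =
  `<head><meta property="og:title" content="My Product">` +
  `<meta property="og:image" content="https://example.com/card.png"></head>`;
console.log(extractOpenGraph(staticHead)); // finds og:title and og:image

// CSR shell that injects the tags via JS: the crawler finds nothing.
const jsInjectedHead = `<head><script src="/bundle.js"></script></head>`;
console.log(extractOpenGraph(jsInjectedHead)); // empty object
```

If the second case matches your site, the fix is to put Open Graph tags directly in the server-sent HTML head.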
CSR Destroys Core Web Vitals
Google uses Core Web Vitals as a ranking signal. CSR pages consistently score worse than server-rendered pages on the metrics that matter most.
| Metric | CSR Impact | SSR/SSG Impact |
|---|---|---|
| LCP (Largest Contentful Paint) | Poor. Content appears only after JS downloads, parses, fetches data, and renders. Often 4-5 seconds. | Good. Content is in the initial HTML. Typically under 2.5 seconds. |
| CLS (Cumulative Layout Shift) | Higher risk. Content loads in stages as components render, causing layout shifts. | Lower risk. Initial layout is set by server-rendered HTML. |
| INP (Interaction to Next Paint) | Delayed. Page is not interactive until the full JS bundle loads and executes. | Better. SSR pages can be interactive sooner, especially with streaming. |
For businesses trying to optimize their website speed, moving away from CSR is often the single highest-impact change.
How to Fix CSR SEO Problems
The solution is to move your rendering to the server or to build time. You do not need to rewrite your entire application.
React apps: Migrate to Next.js or Remix. Both frameworks accept your existing React components and add SSR or SSG. Next.js also supports ISR for pages that need periodic updates.
Vue apps: Migrate to Nuxt. It supports SSR, SSG, and hybrid rendering out of the box with deployment presets for every major hosting platform.
Starting fresh: Use Astro for content sites (zero JavaScript by default) or SvelteKit for interactive applications (compiled output, tiny bundles). Both deploy to Cloudflare Pages, Vercel, and Netlify.
AI builder sites (Lovable, Bolt): These tools output Vite plus React SPAs. You need to either add a server-rendering layer or rebuild with a framework that includes SSR. Our modern website service handles this migration.
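Framework details differ, but every option above amounts to the same move: produce the full HTML on the server (or at build time) so the first response already contains the content. A framework-free sketch of that idea, with a hypothetical `renderProductPage` helper and `Product` shape standing in for your framework's render step:

```typescript
// The essence of SSR/SSG: content and meta tags are serialized into the
// HTML response itself, so crawlers need zero JavaScript to read them.
// Note: real code should HTML-escape interpolated values.
interface Product {
  name: string;
  description: string;
}

function renderProductPage(product: Product): string {
  return `<!doctype html>
<html>
  <head>
    <title>${product.name}</title>
    <meta property="og:title" content="${product.name}">
  </head>
  <body>
    <h1>${product.name}</h1>
    <p>${product.description}</p>
    <!-- A hydration bundle can still load for interactivity -->
    <script src="/bundle.js"></script>
  </body>
</html>`;
}

const html = renderProductPage({
  name: "Acme Widget",
  description: "A widget crawlers can actually read.",
});
console.log(html.includes("Acme Widget")); // true: content ships in the response
```

Frameworks like Next.js and Nuxt do exactly this under the hood, then hydrate the page in the browser so it stays interactive.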
Google deprecated its dynamic rendering workaround in 2024 and now explicitly recommends SSR, SSG, or hydration instead. The recommended path forward is clear.
For a full comparison of all four rendering strategies, read our pillar guide on SSR vs SSG vs ISR vs CSR. If you are evaluating frameworks, our framework comparison ranks 12 options by SEO capability. And if you are considering edge rendering to further reduce latency, we cover the real-world data on that too.
Summary
- CSR sends empty HTML to crawlers, requiring JavaScript to see any content
- Google renders JavaScript in a delayed second phase, taking significantly longer than plain HTML
- AI crawlers (GPTBot, ClaudeBot, PerplexityBot) do not execute JavaScript at all
- Social media crawlers do not render JavaScript, breaking all link previews
- CSR hurts LCP, CLS, and INP scores, all of which are Google ranking signals
- The fix is migrating to SSR or SSG using your existing framework ecosystem (Next.js for React, Nuxt for Vue)
- Google deprecated dynamic rendering in 2024 and recommends SSR or SSG
References
- Google Search Central: JavaScript SEO Basics - How Googlebot handles JavaScript rendering
- Vercel: The Rise of the AI Crawler - AI crawler analysis showing zero JavaScript execution
- Onely: Google Needs 9x More Time to Crawl JS Than HTML - Independent research on JavaScript crawling delays
- Google Search Central: Dynamic Rendering - Deprecated workaround documentation
- Cloudflare: From Googlebot to GPTBot - 2025 crawler landscape analysis
Frequently Asked Questions
What is client-side rendering?
Client-side rendering (CSR) is when the server sends a minimal HTML file with a JavaScript bundle. The browser downloads the JavaScript, executes it, fetches data from APIs, and then builds the page content on the user's device. Until JavaScript finishes running, the page is empty.
Can Google index client-side rendered pages?
Google can render JavaScript using its Web Rendering Service (WRS), which runs evergreen Chromium. However, rendering happens in a second phase after the initial crawl, which creates an indexing delay. Research from Onely found Google needs 9x more time to crawl JavaScript pages compared to plain HTML.
Do AI search engines like ChatGPT render JavaScript?
No. GPTBot (OpenAI), ClaudeBot (Anthropic), and PerplexityBot do not execute JavaScript. Vercel's analysis of over 900 million monthly AI crawler requests found zero evidence of JavaScript rendering. CSR content is completely invisible to AI search engines.
How does CSR affect Core Web Vitals?
CSR typically results in poor LCP (Largest Contentful Paint) because the browser must download, parse, and execute JavaScript before any content appears. It also increases CLS (Cumulative Layout Shift) risk because content loads in stages as JavaScript renders different components.
Do social media platforms render JavaScript for link previews?
No. Facebook, Twitter/X, LinkedIn, Discord, Slack, and WhatsApp crawlers do not execute JavaScript. If your Open Graph meta tags are injected via client-side JavaScript, your link previews will be blank or broken on all social platforms.
How do I fix CSR SEO problems?
The most effective fix is migrating to a framework with server-side rendering. If your site is built with React, move to Next.js or Remix. If it uses Vue, move to Nuxt. These frameworks let you keep your existing components while adding SSR or SSG. Pre-rendering at build time gives even better results.
Is my Lovable or Bolt website client-side rendered?
Yes. AI website builders like Lovable and Bolt generate Vite plus React projects by default. These are client-side single page applications with no server rendering. The sites look great visually but send empty HTML to search engine crawlers.
Does Google penalize client-side rendered sites?
Google does not penalize CSR directly. However, CSR pages face indexing delays, potential rendering failures, and worse Core Web Vitals scores. The practical result is lower rankings compared to server-rendered alternatives, even if there is no formal penalty.
Related Articles

Edge Rendering and SEO: Does Serving From the Edge Help?
Edge SSR cuts TTFB by serving HTML from locations near users. Learn when it helps SEO and when it does not.
7 min read

MDX vs Markdown for SEO: How Content Format Affects Rankings
MDX and Markdown both compile to HTML. Learn the real SEO differences and when each format works best.
6 min read

Programmatic SEO: How to Build Thousands of Pages That Rank
Learn what programmatic SEO is, how it works with templates and data, and which rendering strategy makes it scale.
7 min read