Common Pitfalls with Dynamic Rendering for JavaScript SEO

Earlier this year, Google announced official support for dynamic rendering as a solution for JavaScript-driven websites: serving search bots a server-side rendered (SSR) page while providing a client-side experience for users. (In truth, many sites have been doing this for years via in-house solutions and services like Prerender.io, but it was great to see Google acknowledge that its JavaScript crawling capabilities aren’t perfect and that dynamic rendering doesn’t count as cloaking.)

We’ve worked with a number of sites and setups over the last several years. Below is a list of common issues we see and items to watch for as sites make the switch to dynamic rendering.

1. Avoid Accidental Cloaking

Minimize differences between the search bot version and the user version. Risk is introduced if you implement things in the SSR that are “for Google only,” such as content users will not see: additional keyword-targeted copy, more links, or the removal of aggressive ads/CTAs. Don’t fall into the trap of adding features or text to the “SEO version” of a page by selectively serving it to your prerender solution. While dynamic rendering isn’t cloaking, using it to present content differences that influence ranking still is.

2. Account for Device User-agents

Ensure that device-specific crawlers get the same experience (content, link graph, and redirects) that a user on that device will receive (e.g., the smartphone crawler receives any mobile-specific design, content, and redirects).

If desktop & mobile return different SSR snapshots, include a “Vary: User-Agent” response header, since this is a form of Dynamic Serving in addition to Dynamic Rendering. (This is generally not a concern for Responsive pages, but some sites are “mostly Responsive,” with certain elements, such as navigation, changing between the two.)

If your site changes for mobile vs. desktop user-agents, you may need to support two separate SSR versions to account for that difference.
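
To make this concrete, below is a rough sketch (TypeScript, using Express) of how a middleware might route crawler requests to the matching desktop or mobile snapshot while setting the Vary header. The user-agent checks and the getSnapshot helper are placeholders for whatever your detection library and prerender solution actually provide.

    import express, { Request, Response, NextFunction } from "express";

    const app = express();

    // Hypothetical helpers: your UA-detection library would supply something more robust.
    const isBot = (ua: string) => /googlebot|bingbot/i.test(ua);
    const isMobileUa = (ua: string) => /mobile|android|iphone/i.test(ua);

    // Stand-in for the snapshot lookup; in reality this would hit your prerenderer or its cache.
    async function getSnapshot(url: string, device: "mobile" | "desktop"): Promise<string> {
      return `<html><!-- prerendered ${device} snapshot of ${url} --></html>`;
    }

    app.use(async (req: Request, res: Response, next: NextFunction) => {
      const ua = req.headers["user-agent"] ?? "";

      // The response differs by user-agent, so tell intermediary caches about it.
      res.setHeader("Vary", "User-Agent");

      if (isBot(ua)) {
        // Serve the snapshot matching the crawler's device type, mirroring
        // what a real user on that device would receive.
        const device = isMobileUa(ua) ? "mobile" : "desktop";
        res.send(await getSnapshot(req.originalUrl, device));
        return;
      }

      next(); // regular users continue on to the client-side rendered app
    });

    app.listen(3000);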

3. Use Caching for Faster Server Response

Depending on the solution used, caching can speed up the response time search bots experience and reduce the load on your server. One of the reasons client-side rendering is used is to speed up a page’s load time or to defer particularly slow elements on the page (such as a Suggested/Recommended Products feature or product search results). Dynamically constructing these on the fly can increase response time, and we want to avoid a prolonged server response as a result of prerendering.
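
As a simple illustration, a snapshot cache can be as small as a keyed map with a TTL. The sketch below (TypeScript, assuming a Node runtime with a global fetch) is only a starting point: the renderPage step is a stand-in for a real headless-browser render, and a production setup would more likely lean on Redis, a CDN, or the cache built into your prerender service.

    // A minimal in-memory snapshot cache with a TTL.
    type CacheEntry = { html: string; renderedAt: number };

    const TTL_MS = 24 * 60 * 60 * 1000; // refresh snapshots roughly daily
    const cache = new Map<string, CacheEntry>();

    // Stand-in for the real render step; in practice this would drive a headless
    // browser so client-side JavaScript executes before the HTML is captured.
    async function renderPage(url: string): Promise<string> {
      const res = await fetch(url);
      return res.text();
    }

    export async function getPrerenderedHtml(url: string): Promise<string> {
      const hit = cache.get(url);
      if (hit && Date.now() - hit.renderedAt < TTL_MS) {
        return hit.html; // fast path: the bot gets an already-rendered response
      }
      const html = await renderPage(url); // slow path: render once, reuse for later hits
      cache.set(url, { html, renderedAt: Date.now() });
      return html;
    }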

4. Account for Non-active Parameters

Depending on the solution you go with, you can often disregard parameters that do not change a page’s content when generating a prerender/cache. For example, if a URL has tracking parameters appended to it, you can return the prerendered version of the canonical URL instead of saving a cache for many different URLs with the same content. This also reduces load by not triggering a new prerender for an artificially “new” URL.

While some of these can be managed using robots.txt, preventing Google from fetching them in the first place, any allowed parameter will still hit whatever system you’re using to handle your SSR. Ignoring these parameters speeds up response times, reduces load, and slims down the size of your cache.
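
Here is a minimal sketch (TypeScript) of that normalization step. The list of ignored parameters is illustrative; build your own from the tracking parameters your site actually sees.

    // Parameters that never change page content; the exact list is site-specific.
    const IGNORED_PARAMS = new Set([
      "utm_source",
      "utm_medium",
      "utm_campaign",
      "gclid",
      "fbclid",
    ]);

    // Normalize a URL so tracking-only variants map to a single cache key,
    // and therefore a single prerender.
    export function cacheKeyFor(rawUrl: string): string {
      const url = new URL(rawUrl);
      for (const param of [...url.searchParams.keys()]) {
        if (IGNORED_PARAMS.has(param)) {
          url.searchParams.delete(param);
        }
      }
      url.searchParams.sort(); // stable ordering so ?a=1&b=2 and ?b=2&a=1 collide
      return url.toString();
    }

    // cacheKeyFor("https://example.com/shoes?utm_source=news&size=9")
    //   -> "https://example.com/shoes?size=9"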

5. Account for Prerendering Failures

Depending on the solution, look for ways the prerender might fail (downtime, slow responses, or a code issue that interferes with the render). You’ll want to avoid caching a blank or incomplete prerender.

  • Signal when rendering is complete: We’ve found it helpful to set an initial status code of 503 that updates to 200 once the content has fully loaded and rendering is complete. We’ve done this using a meta tag with a default value of 503, which is switched to 200 by JavaScript after the page has finished rendering (see the sketch after this list). If the prerendering service gets a 503, it will recrawl until it receives a 200.
  • Watch for tests that break rendering: We’ve seen Adobe Test issues cause blank or incomplete pages to get cached by the prerender service. These problems can be challenging to catch; the status code change above helped solve the issue on sites we’ve worked with. (We discovered it by noticing that Google was caching the content of a single page for many different pages – a sign that the pages all looked like duplicates to Google, because they were “blank.”) Keep a lookout for failed rendering on a percentage of URLs, such as a test applied to 10% of URLs. A percentage-based test can mean only a small portion of URLs fails each day, and the affected URLs cycle as the SSR cache updates, which makes errors hard to reproduce (you see an error in GSC or a crawl, but the URL looks fine by the time you inspect it). These errors can be harming your traffic without you even realizing it.
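
Here is a minimal client-side sketch (TypeScript) of the status-code switch described above. It assumes a prerender service that honors a status-code meta tag (Prerender.io, for example, reads prerender-status-code); the data-loading hooks are hypothetical stand-ins for however your app knows rendering has truly finished.

    // In the page <head>, ship a default "not ready" status for the prerenderer:
    //   <meta name="prerender-status-code" content="503">
    // The prerender service keeps retrying until it sees a 200.

    // Hypothetical app hooks: stand-ins for however your app loads and renders its data.
    async function fetchCriticalData(): Promise<unknown> { return {}; }
    function renderApp(_data: unknown): void { /* mount components, paint content */ }

    function markRenderComplete(): void {
      const meta = document.querySelector<HTMLMetaElement>('meta[name="prerender-status-code"]');
      if (meta) {
        meta.content = "200"; // flip only once the page has truly finished rendering
      }
    }

    fetchCriticalData()
      .then(renderApp)
      .then(markRenderComplete)
      .catch(() => {
        // Leave the 503 in place so a broken render is never cached as "good".
      });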

6. Monitor Logs

Ensure hits to the prerender snapshot/cache are monitored for catastrophic issues. If something fails there, it may not be visible to users, but it can have dramatic impacts on Google’s indexation and your traffic. It could take a week or two to notice, and weeks to months to fix and recover.

You can look at the general hit rate, but you always want to monitor misses/hits and error codes. These should be flagged to on-call IT or NOC teams, as these Googlebot-only errors surface more slowly than traditional signals like downtime, short-term traffic drops, or customer complaints.
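
What that monitoring might look like, roughly: tally bot hits by outcome and page someone when error or miss rates spike. The log-entry shape, thresholds, and alert hook in the TypeScript sketch below are all assumptions to adapt to your own logging stack.

    // Hypothetical shape of one access-log entry from the prerender layer.
    interface PrerenderLogEntry {
      userAgent: string;
      status: number;                // HTTP status returned to the bot
      cacheResult: "hit" | "miss";   // whether a cached snapshot was served
    }

    // Hypothetical alert hook: wire this to your on-call/NOC tooling.
    function alertOnCall(message: string): void {
      console.error(`[ALERT] ${message}`);
    }

    // Flag problems that only search bots would ever see.
    export function checkPrerenderHealth(entries: PrerenderLogEntry[]): void {
      const botHits = entries.filter((e) => /googlebot|bingbot/i.test(e.userAgent));
      if (botHits.length === 0) return;

      const errorRate = botHits.filter((e) => e.status >= 500).length / botHits.length;
      const missRate = botHits.filter((e) => e.cacheResult === "miss").length / botHits.length;

      // Thresholds are illustrative; tune them against your own baseline.
      if (errorRate > 0.01) {
        alertOnCall(`Prerender 5xx rate for bots: ${(errorRate * 100).toFixed(1)}%`);
      }
      if (missRate > 0.5) {
        alertOnCall(`Prerender cache-miss rate for bots: ${(missRate * 100).toFixed(1)}%`);
      }
    }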

7. Set up Crawls for Monitoring SSR Issues

We like to have two regular site crawls (weekly or monthly, depending on resources) to look for issues: one as a Googlebot user-agent and one as a traditional JavaScript crawl (how users see the site). The crawl as Googlebot is helpful for monitoring dynamic rendering, as issues that appear may not be immediately apparent to a user browsing the site. We look for common footprints that indicate a failed render, such as a missing title, h1, or other expected text (price, reviews, etc.). You can also review duplicate content, content similarity, and content change percentages in tools like Botify or DeepCrawl.
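
A rough sketch of that footprint check (TypeScript, assuming a Node runtime with a global fetch): request a sample of URLs with a Googlebot user-agent and flag any page whose snapshot is missing the elements you expect. The footprint patterns are illustrative.

    const GOOGLEBOT_UA =
      "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

    // Footprints we expect on every rendered page; patterns are illustrative.
    const EXPECTED_FOOTPRINTS: RegExp[] = [
      /<title>[^<]+<\/title>/i,
      /<h1[^>]*>/i,
      /itemprop="price"/i,
    ];

    // Fetch each URL as Googlebot and return the ones that look like failed renders.
    export async function auditUrls(urls: string[]): Promise<string[]> {
      const suspect: string[] = [];
      for (const url of urls) {
        const res = await fetch(url, { headers: { "User-Agent": GOOGLEBOT_UA } });
        const html = await res.text();
        const missing = EXPECTED_FOOTPRINTS.filter((re) => !re.test(html));
        if (missing.length > 0) {
          suspect.push(url); // likely a blank or partial render; inspect manually
        }
      }
      return suspect;
    }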

There are also monitoring services, like Little Warden, which can look out for site changes you might not notice right away when auditing the site.

Some servers and CDNs, such as Cloudflare, block non-authentic search bot traffic, making this type of user-agent crawl impossible. If this happens, talk with your team about whitelisting your IP (your personal IP for desktop crawlers like Screaming Frog, or the service’s IP(s) for cloud-based crawlers like DeepCrawl and Botify).

8. Plan for Changes to Auditing the Site

With a change to dynamic rendering, anyone auditing the site for SEO purposes will need to set their user-agent to Googlebot. This can be done easily using Chrome extensions. Forgetting this simple step when auditing means you’re not hitting the Dynamically Rendered version of a page. Get in the habit of always switching your user-agent when navigating the site.

Also account for any additional complexity in your rules, such as Bingbot vs. Googlebot or Desktop vs. Mobile. You’ll want to check those versions too.

9. Watch for Design & CSS Failures in SSR

With Dynamic Rendering, search bots will only get the SSR snapshot version of the page. This is the version of the page design they’ll use for any sort of visual analysis, such as mobile-friendliness, layout, proximity, visual prominence, and images. Ensure you’re happy with how the page appears to a search bot user-agent and within Fetch & Render. Problems accessing CSS resources, and their interactions with JavaScript, can cause the page to render incorrectly in the SSR, so audit the page to confirm it looks right.

10. Watch for Content Set to Invisible

Content you expect to be visible can end up invisible in the snapshot. This can occur when text is set to invisible via inline CSS by default and fails to switch to visible before the SSR is captured, leaving it hidden in the snapshot. While content within the code is still discoverable, hidden text may be treated differently. It’s best to ensure all content you expect to be visible is actually visible within your Dynamically Rendered SSR.
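
One way to catch this is to load the page with a Googlebot user-agent in a headless browser and check the computed visibility of the elements you care about. The sketch below (TypeScript) happens to use Puppeteer, and the selector list is a placeholder for your own key content.

    import puppeteer from "puppeteer";

    // Selectors for content that should always be visible; placeholders, use your own.
    const MUST_BE_VISIBLE = ["h1", ".product-description", ".price"];

    // Load the page as a bot would and report elements left invisible
    // (display:none, visibility:hidden, or zero-sized) in the rendered result.
    export async function findHiddenContent(url: string): Promise<string[]> {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.setUserAgent(
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
      );
      await page.goto(url, { waitUntil: "networkidle0" });

      const hidden = await page.evaluate((selectors: string[]) => {
        return selectors.filter((sel) => {
          const el = document.querySelector<HTMLElement>(sel);
          if (!el) return true; // missing entirely also counts as a problem
          const style = window.getComputedStyle(el);
          const rect = el.getBoundingClientRect();
          return (
            style.display === "none" ||
            style.visibility === "hidden" ||
            rect.width === 0 ||
            rect.height === 0
          );
        });
      }, MUST_BE_VISIBLE);

      await browser.close();
      return hidden;
    }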

11. Watch for User Events

Content that depends on a user action, such as an onClick event, is typically not captured by a prerender solution, so it may remain inaccessible even with Dynamic Rendering in place. A Dynamic Rendering approach means your indexation depends on the abilities of that solution: if your solution can’t render it, search engines can’t index it. Always audit the SSR using a search user-agent to ensure all the text you expect to be in the source code was captured and cached. It’s a best practice to have important content visible by default rather than dependent on a user action.
