Adding Googlebot to your integration
This guide explains how Google's Dynamic Rendering works, why it's important for SEO, and how to make sure Googlebot is covered by your integration.
Overview
Google recommends using Prerender.io for Dynamic Rendering to serve prerendered content to search engines and JavaScript content to users. (Dynamic Rendering documentation)
In May 2018, Google introduced Dynamic Rendering as a strategy for handling JavaScript-heavy websites. Under this model, Googlebot receives prerendered pages, while regular users continue to access dynamic JavaScript content. Google has specifically endorsed this practice as a way to ensure that search engines can properly index content that would otherwise be difficult to crawl due to JavaScript rendering issues.
Prerender.io implements Dynamic Rendering by rendering content for search engine bots, like Googlebot, and delivering it directly to them while serving the live, interactive JavaScript page to users. This ensures that search engines can properly index your content and gives you a better chance of ranking higher in search results.
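To make the model concrete, here is a minimal sketch of the idea in Express. This is illustrative only, not Prerender.io's actual middleware: requests from known bot user agents are answered with prerendered HTML, while everyone else falls through to the normal JavaScript app. The getPrerenderedHtml helper is a hypothetical placeholder for however you fetch or cache rendered pages.

// Minimal sketch of dynamic rendering (illustrative only, not
// Prerender.io's middleware). Requires: npm install express
const express = require('express');
const app = express();

// A few common crawler user agents, matched case-insensitively.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

// Hypothetical helper: return prerendered HTML for a URL, e.g. from
// a prerender service or a cache of rendered pages.
async function getPrerenderedHtml(url) {
  return `<html><body>Prerendered snapshot of ${url}</body></html>`;
}

app.use(async (req, res, next) => {
  const ua = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(ua)) {
    // Bots receive static, fully rendered HTML...
    res.send(await getPrerenderedHtml(req.originalUrl));
  } else {
    // ...while regular users get the live JavaScript application.
    next();
  }
});

app.listen(3000);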
Why It Matters
Prerender.io's service aligns with Google's recommendation for Dynamic Rendering, and using it can improve your site's SEO performance. Serving prerendered content to search engines ensures that:
- Search engines can crawl and index your content: even if your site relies on JavaScript for rendering, bots still receive fully rendered pages.
- Your pages have better ranking potential: pages that are properly indexed are more likely to rank well in search results.
- You avoid cloaking issues: Google permits serving prerendered pages to search engines based on their user agents, so this approach does not count as cloaking (deliberately showing different content to users and search engines).
Prerender.io's middleware is designed to make the process seamless, ensuring that Googlebot and other major search bots receive the correct, fully-rendered version of your content.
Solution: Adding Googlebot to Your Middleware
If you're using Prerender.io, the integration is typically automatic, as Googlebot is included in the list of crawlers we check. If you need to manually add Googlebot (or other crawlers) to your middleware or if you’re using your own server setup, follow the instructions below:
1. Using nginx
In your nginx configuration, add the following block to flag Googlebot and other major crawlers:
if ($http_user_agent ~* "googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit|rogerbot|linkedinbot|embedly|quora link preview|showyoubot|outbrain|pinterest|slackbot|vkShare|W3C_Validator") {
    set $prerender 1;
}
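Note that this block only flags matching requests by setting the $prerender variable; on its own it changes nothing. Elsewhere in your nginx configuration, that variable is used to rewrite or proxy flagged requests to the Prerender service. See Prerender.io's complete nginx example for the surrounding location block.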
2. Using Apache
In your Apache configuration, add the following condition to match Googlebot and other major crawlers, followed by a rule that proxies matching requests to the Prerender service (the RewriteRule target below is illustrative and requires mod_proxy; see Prerender.io's Apache documentation for the exact form):
RewriteCond %{HTTP_USER_AGENT} googlebot|bingbot|yandex|baiduspider|facebookexternalhit|twitterbot|rogerbot|linkedinbot|embedly|quora\ link\ preview|showyoubot|outbrain|pinterest|slackbot|vkShare|W3C_Validator [NC]
RewriteRule ^(.*)$ http://service.prerender.io/%{REQUEST_SCHEME}://%{HTTP_HOST}/$1 [P,L]
3. Using Prerender-node
If you're using prerender-node, extend the crawlerUserAgents array and register the middleware with your Express app:
var express = require('express');
var app = express();

var prerender = require('prerender-node').set('prerenderToken', 'YOUR_TOKEN');
prerender.crawlerUserAgents.push('googlebot');
prerender.crawlerUserAgents.push('bingbot');
prerender.crawlerUserAgents.push('yandex');

app.use(prerender);
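The crawlerUserAgents list is a plain array on the middleware object, so if you want to confirm exactly which user agents will be checked at runtime, you can simply log it:

console.log(prerender.crawlerUserAgents);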
4. Using Prerender-rails
If you're on Prerender for Rails, update the crawler_user_agents
array:
config.middleware.use Rack::Prerender,
  prerender_token: 'YOUR_TOKEN',
  crawler_user_agents: [
    'googlebot',
    'bingbot',
    'yandex',
    'baiduspider',
    'facebookexternalhit',
    'twitterbot',
    'rogerbot',
    'linkedinbot',
    'embedly',
    'bufferbot',
    'quora link preview',
    'showyoubot',
    'outbrain',
    'pinterest/0.',
    'developers.google.com/+/web/snippet',
    'www.google.com/webmasters/tools/richsnippets',
    'slackbot',
    'vkShare',
    'W3C_Validator',
    'redditbot',
    'Applebot',
    'WhatsApp',
    'flipboard',
    'tumblr',
    'bitlybot',
    'SkypeUriPreview',
    'nuzzel',
    'Discordbot',
    'Google Page Speed',
    'Qwantify'
  ]
5. Using ASP.NET MVC
For ASP.NET MVC, modify your web.config to include Googlebot, Bingbot, and Yandex:
<prerender token="YOUR_TOKEN" crawlerUserAgents="googlebot,bingbot,yandex"></prerender>
6. Using Prerender-java
In your web.xml, add Googlebot, Bingbot, and Yandex to the Prerender filter, and make sure the filter is mapped to your URLs:
<filter>
  <filter-name>prerender</filter-name>
  <filter-class>com.github.greengerong.PreRenderSEOFilter</filter-class>
  <init-param>
    <param-name>prerenderToken</param-name>
    <param-value>YOUR_TOKEN</param-value>
  </init-param>
  <init-param>
    <param-name>crawlerUserAgents</param-name>
    <param-value>googlebot,bingbot,yandex</param-value>
  </init-param>
</filter>

<!-- The filter only runs for URLs it is mapped to. -->
<filter-mapping>
  <filter-name>prerender</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
Verify Your Installation
After integration, it’s important to verify that everything is working as expected. We’ve provided an easy-to-follow guide to help you test your setup and confirm that Prerender is functioning properly.
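As a quick spot check, you can also request a page yourself with a Googlebot user agent and compare it to a normal browser request. The sketch below assumes Node 18+ (for the built-in fetch) saved as an ES module (for example check.mjs); https://www.example.com/ is a placeholder for one of your own URLs.

// Quick spot check: fetch a page as Googlebot and as a regular
// browser, then compare. With a working integration, the bot
// response should contain fully rendered HTML.
const url = 'https://www.example.com/'; // placeholder: use your own URL

async function fetchAs(userAgent) {
  const res = await fetch(url, { headers: { 'User-Agent': userAgent } });
  return res.text();
}

const [botHtml, browserHtml] = await Promise.all([
  fetchAs('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'),
  fetchAs('Mozilla/5.0 (Windows NT 10.0; Win64; x64)'),
]);

console.log('Googlebot response length:', botHtml.length);
console.log('Browser response length:', browserHtml.length);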
Tips & Notes
- Why this is important: by adding Googlebot (and other crawlers) to your middleware, you ensure that search engines get access to the full, prerendered version of your page, leading to better SEO and more accurate content indexing.
- Seamless Updates: if you're using Prerender.io's hosted service, you don't need to make these manual changes. Simply keep your middleware up to date and it will automatically check for Googlebot and other major search engine bots.
- Cloaking Considerations: while Google allows sending prerendered pages to Googlebot based on the user agent, this should only be used for SEO purposes. Serving different content to users and search engines is considered cloaking and can lead to penalties. With Prerender.io, you serve the same content, just rendered differently for bots versus users.
- Be Proactive: regularly update your middleware to stay in line with Google's Dynamic Rendering recommendations, especially when Google updates its guidelines for crawling and indexing JavaScript-heavy pages.
By following these guidelines, you can make sure your pages are correctly indexed by Google and other search engines, helping your site rank higher and remain SEO-friendly.