
How to Add Additional Bots (integrations maintained by Prerender)

Easily serve additional bots by adding their User-Agent strings to your integration.

 

Overview

Prerender.io automatically detects and serves prerendered content to a standard set of known search engine bots. However, if you need to serve less common or custom bots (like ChatGPT, ClaudeBot, or internal QA crawlers), you can manually add their User-Agent strings to your integration.

Doing this ensures all relevant bots receive the correct HTML snapshot, improving SEO, bot visibility, and crawlability—especially for JavaScript-heavy websites.

 

How It Works

Prerender integrations rely on User-Agent matching to detect bots. Each middleware (Express, Rails, NGINX, etc.) checks incoming requests against a predefined list of User-Agent strings. If a match is found, the request is routed to Prerender, which serves a static HTML version of the page.
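The detection step described above can be sketched in a few lines of JavaScript. This is an illustrative sketch, not Prerender's actual source: a request counts as a bot when any entry from the list appears, case-insensitively, anywhere in its User-Agent header.

```javascript
// Illustrative sketch of User-Agent matching (not Prerender's actual source):
// a request is routed to Prerender when any list entry appears,
// case-insensitively, anywhere in the User-Agent header.
const crawlerUserAgents = ['googlebot', 'bingbot', 'ClaudeBot', 'integration-test'];

function isCrawler(userAgentHeader) {
  const ua = (userAgentHeader || '').toLowerCase();
  return crawlerUserAgents.some(bot => ua.includes(bot.toLowerCase()));
}

console.log(isCrawler('Mozilla/5.0 (compatible; ClaudeBot/1.0)')); // true
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0')); // false
```

Because the match is a substring check, short entries can be broader than intended; keep patterns as specific as the bot's real User-Agent allows.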

Why Add Custom Bots?

  • Improve Indexing for New Bots: Platforms like ChatGPT, Perplexity, Claude, or Amazonbot increasingly crawl and summarize content.

  • Support Internal or QA Tools: You may run custom tests or previews that rely on receiving prerendered content.

  • Stay Ahead of SEO Trends: Some SEO tools simulate crawlers that aren't yet on the default list.

Tip: Each framework or server has a unique method of defining User-Agent rules. Ensure you test after making changes to confirm Prerender is serving your custom bot correctly.

 

Verify Your Installation

To confirm that a bot is being served prerendered content:

  • Use browser developer tools to inspect network requests

  • Or run a cURL command with a custom User-Agent header

  • For step-by-step instructions, see: Verify Your Prerender Integration
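The cURL approach above can be sketched as follows. The URL and User-Agent are placeholders; substitute your own site and the bot you added:

```shell
# Placeholder URL and User-Agent: substitute your own site and the bot you added.
# Compare the HTML returned here with a plain request; the bot request should
# receive the fully rendered snapshot instead of the empty JavaScript shell.
curl -sS --max-time 10 \
  -A "Mozilla/5.0 (compatible; ClaudeBot/1.0)" \
  "https://example.com/" -o bot.html || true
curl -sS --max-time 10 "https://example.com/" -o plain.html || true
```

If both files contain the same empty application shell, the bot request is not being matched; re-check the pattern you added.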

 


 

ExpressJS (JavaScript) and NuxtJS

After installing the package via npm or yarn, open the index.js file at:

node_modules/prerender-node/index.js

Then add your desired bot User-Agent name to the following list (matching against this list is case-insensitive):

prerender.crawlerUserAgents = [
  'Google-InspectionTool',
  'googlebot',
  'Yahoo! Slurp',
  'bingbot',
  'yandex',
  'baiduspider',
  'facebookexternalhit',
  'twitterbot',
  'rogerbot',
  'linkedinbot',
  'embedly',
  'quora link preview',
  'showyoubot',
  'outbrain',
  'pinterest/0.',
  'developers.google.com/+/web/snippet',
  'slackbot',
  'vkShare',
  'W3C_Validator',
  'redditbot',
  'Applebot',
  'WhatsApp',
  'flipboard',
  'tumblr',
  'bitlybot',
  'SkypeUriPreview',
  'nuzzel',
  'Discordbot',
  'Google Page Speed',
  'Qwantify',
  'pinterestbot',
  'Bitrix link preview',
  'XING-contenttabreceiver',
  'Chrome-Lighthouse',
  'TelegramBot',
  'SeznamBot',
  'OAI-SearchBot',
  'ChatGPT',
  'GPTBot',
  'ClaudeBot',
  'Amazonbot',
  'Perplexity',
  'integration-test'
];
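Keep in mind that edits inside node_modules are overwritten whenever the package is reinstalled. A more durable pattern is to append your bot from your own application code at startup. The sketch below uses a plain array as a stand-in so it is self-contained; in a real Express app the list would be the crawlerUserAgents array exported by prerender-node, and the bot name is hypothetical:

```javascript
// Sketch: append custom bots at startup instead of editing node_modules
// (those edits are lost on every npm/yarn install). In a real app the list
// would be require('prerender-node').crawlerUserAgents; a plain array stands
// in here so the snippet is self-contained. Bot names are illustrative.
const crawlerUserAgents = ['googlebot', 'bingbot']; // stand-in for prerender's list

function addBot(list, bot) {
  // Guard against duplicates; matching is case-insensitive anyway.
  if (!list.some(existing => existing.toLowerCase() === bot.toLowerCase())) {
    list.push(bot);
  }
}

addBot(crawlerUserAgents, 'my-internal-qa-bot'); // hypothetical bot name
addBot(crawlerUserAgents, 'GoogleBot');          // already covered, ignored

console.log(crawlerUserAgents); // ['googlebot', 'bingbot', 'my-internal-qa-bot']
```

This way the customization survives reinstalls and lives under version control with the rest of your configuration.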

 

Rails (Ruby)

First, locate the prerender_rails gem with the following command:

gem which 'prerender_rails'

After opening the file, add your bot to the @crawler_user_agents array, then save it:

@crawler_user_agents = [
  'Google-InspectionTool',
  'googlebot',
  'yahoo',
  'bingbot',
  'baiduspider',
  'facebookexternalhit',
  'twitterbot',
  'rogerbot',
  'linkedinbot',
  'embedly',
  'bufferbot',
  'quora link preview',
  'showyoubot',
  'outbrain',
  'pinterest/0.',
  'developers.google.com/+/web/snippet',
  'www.google.com/webmasters/tools/richsnippets',
  'slackbot',
  'vkShare',
  'W3C_Validator',
  'redditbot',
  'Applebot',
  'WhatsApp',
  'flipboard',
  'tumblr',
  'bitlybot',
  'SkypeUriPreview',
  'nuzzel',
  'Discordbot',
  'Google Page Speed',
  'Qwantify',
  'Chrome-Lighthouse',
  'TelegramBot',
  'SeznamBot',
  'OAI-SearchBot',
  'ChatGPT',
  'GPTBot',
  'ClaudeBot',
  'Amazonbot',
  'Perplexity',
  'integration-test'
]

 

Nginx

Look for the map $http_user_agent $prerender_ua block and add the desired User-Agent to the list:

map $http_user_agent $prerender_ua {
    default       0;
    "~*Prerender" 0;

    "~*Google-InspectionTool" 1;
    "~*googlebot" 1;
    "~*yahoo!\ slurp" 1;
    "~*bingbot" 1;
    "~*yandex" 1;
    "~*baiduspider" 1;
    "~*facebookexternalhit" 1;
    "~*twitterbot" 1;
    "~*rogerbot" 1;
    "~*linkedinbot" 1;
    "~*embedly" 1;
    "~*quora\ link\ preview" 1;
    "~*showyoubot" 1;
    "~*outbrain" 1;
    "~*pinterest\/0\." 1;
    "~*developers.google.com\/\+\/web\/snippet" 1;
    "~*slackbot" 1;
    "~*vkshare" 1;
    "~*w3c_validator" 1;
    "~*redditbot" 1;
    "~*applebot" 1;
    "~*whatsapp" 1;
    "~*flipboard" 1;
    "~*tumblr" 1;
    "~*bitlybot" 1;
    "~*skypeuripreview" 1;
    "~*nuzzel" 1;
    "~*discordbot" 1;
    "~*google\ page\ speed" 1;
    "~*qwantify" 1;
    "~*pinterestbot" 1;
    "~*bitrix\ link\ preview" 1;
    "~*xing-contenttabreceiver" 1;
    "~*chrome-lighthouse" 1;
    "~*telegrambot" 1;
    "~*SeznamBot" 1;
    "~*OAI-SearchBot" 1;
    "~*ChatGPT" 1;
    "~*GPTBot" 1;
    "~*ClaudeBot" 1;
    "~*Amazonbot" 1;
    "~*Perplexity" 1;
    "~*integration-test" 1;
}
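Each ~* entry in the map above is a case-insensitive regular expression, which is why User-Agents containing spaces or slashes (e.g. quora link preview, pinterest/0.) are backslash-escaped. The behavior of one such entry can be sketched in JavaScript:

```javascript
// Sketch of what one nginx "~*" entry matches: a case-insensitive regex
// applied to the User-Agent header. "~*quora\ link\ preview" behaves
// like /quora link preview/i.
const entry = /quora link preview/i;

console.log(entry.test('Mozilla/5.0 Quora Link Preview/1.0')); // true
console.log(entry.test('Mozilla/5.0 Chrome/120.0'));           // false
```

If an added bot is not being matched, a missing backslash before a space or slash in its pattern is a common cause.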

 

Apache

In the .htaccess file, look for the line beginning with RewriteCond %{HTTP_USER_AGENT}, then add the User-Agent to the list, separated by |:

RewriteCond %{HTTP_USER_AGENT} Google-InspectionTool|googlebot|bingbot|yandex|baiduspider|facebookexternalhit|twitterbot|rogerbot|linkedinbot|embedly|quora\ link\ preview|showyoubot|outbrain|pinterest\/0\.|pinterestbot|slackbot|vkShare|W3C_Validator|whatsapp|redditbot|applebot|flipboard|tumblr|bitlybot|skypeuripreview|nuzzel|discordbot|google\ page\ speed|qwantify|bitrix\ link\ preview|xing-contenttabreceiver|chrome-lighthouse|telegrambot|SeznamBot|OAI-SearchBot|ChatGPT|GPTBot|ClaudeBot|Amazonbot|Perplexity|integration-test [NC,OR]

 

IIS

Look for the {HTTP_USER_AGENT} condition, then add your desired new User-Agent to the pattern, separating entries with |:

Google-InspectionTool|googlebot|bingbot|yandex|baiduspider|facebookexternalhit|twitterbot|rogerbot|linkedinbot|embedly|quora\ link\ preview|showyoubot|outbrain|pinterest\/0\.|pinterestbot|slackbot|vkShare|W3C_Validator|whatsapp|redditbot|applebot|flipboard|tumblr|bitlybot|skypeuripreview|nuzzel|discordbot|google\ page\ speed|qwantify|bitrix\ link\ preview|xing-contenttabreceiver|chrome-lighthouse|telegrambot|SeznamBot|OAI-SearchBot|ChatGPT|GPTBot|ClaudeBot|Amazonbot|Perplexity|integration-test
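For context, that pattern lives inside a URL Rewrite rule in web.config. The trimmed sketch below is illustrative, not your exact configuration; the rule name and pattern are shortened, so adapt them to your existing rule:

```xml
<rule name="Prerender" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <!-- Append your bot to this alternation, e.g. ...|telegrambot|my-internal-bot -->
    <add input="{HTTP_USER_AGENT}" pattern="googlebot|bingbot|my-internal-bot" />
  </conditions>
  <action type="Rewrite" url="https://service.prerender.io/https://{HTTP_HOST}{REQUEST_URI}" appendQueryString="false" />
</rule>
```

After editing web.config, the change takes effect when IIS reloads the configuration; no separate restart is normally required.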

 

Cloudflare

Locate and edit the worker you created for Prerender, find const BOT_AGENTS in the code, then add your desired User-Agent to the end of the list:

const BOT_AGENTS = [
  'google-inspectiontool',
  'googlebot',
  'yahoo! slurp',
  'bingbot',
  'yandex',
  'baiduspider',
  'facebookexternalhit',
  'twitterbot',
  'rogerbot',
  'linkedinbot',
  'embedly',
  'quora link preview',
  'showyoubot',
  'outbrain',
  'pinterest/0.',
  'developers.google.com/+/web/snippet',
  'slackbot',
  'vkshare',
  'w3c_validator',
  'redditbot',
  'applebot',
  'whatsapp',
  'flipboard',
  'tumblr',
  'bitlybot',
  'skypeuripreview',
  'nuzzel',
  'discordbot',
  'google page speed',
  'qwantify',
  'pinterestbot',
  'bitrix link preview',
  'xing-contenttabreceiver',
  'chrome-lighthouse',
  'telegrambot',
  'seznambot',
  'oai-searchbot',
  'chatgpt',
  'gptbot',
  'claudebot',
  'amazonbot',
  'perplexity',
  'integration-test'
];

 

Docker

Our Docker middleware uses Nginx, so open nginx.conf, locate the map $http_user_agent $prerender_ua block, and add your desired User-Agent to the list:

map $http_user_agent $prerender_ua {
    default       0;
    "~*Prerender" 0;

    "~*google-inspectiontool" 1;
    "~*googlebot" 1;
    "~*yahoo!\ slurp" 1;
    "~*bingbot" 1;
    "~*yandex" 1;
    "~*baiduspider" 1;
    "~*facebookexternalhit" 1;
    "~*twitterbot" 1;
    "~*rogerbot" 1;
    "~*linkedinbot" 1;
    "~*embedly" 1;
    "~*quora\ link\ preview" 1;
    "~*showyoubot" 1;
    "~*outbrain" 1;
    "~*pinterest\/0\." 1;
    "~*developers.google.com\/\+\/web\/snippet" 1;
    "~*slackbot" 1;
    "~*vkshare" 1;
    "~*w3c_validator" 1;
    "~*redditbot" 1;
    "~*applebot" 1;
    "~*whatsapp" 1;
    "~*flipboard" 1;
    "~*tumblr" 1;
    "~*bitlybot" 1;
    "~*skypeuripreview" 1;
    "~*nuzzel" 1;
    "~*discordbot" 1;
    "~*google\ page\ speed" 1;
    "~*qwantify" 1;
    "~*pinterestbot" 1;
    "~*bitrix\ link\ preview" 1;
    "~*xing-contenttabreceiver" 1;
    "~*chrome-lighthouse" 1;
    "~*telegrambot" 1;
    "~*SeznamBot" 1;
    "~*OAI-SearchBot" 1;
    "~*ChatGPT" 1;
    "~*GPTBot" 1;
    "~*ClaudeBot" 1;
    "~*Amazonbot" 1;
    "~*Perplexity" 1;
    "~*integration-test" 1;
}

 


Best Practices

  • Regularly audit crawler logs to spot new bots worth serving.

  • Use wildcards or regex only where needed; overly broad patterns can degrade performance.

  • Avoid serving prerendered content to human users by accident (don’t match Mozilla, Chrome, etc.).

Tips

  • Test after changes: Mistyped User-Agent patterns won’t be matched.

  • Watch for escape characters: Some User-Agents (e.g., quora link preview) require backslashes in regex.

  • Cloudflare Workers: Changes require a redeploy to take effect.

  • Case-insensitive matching is recommended (~* in regex).

  • Restart your server or redeploy code-based changes.

 
