Best practices for crawler-ready pages
Four configuration techniques that ensure AI crawlers and search engines receive accurate, complete HTML from Prerender.
TL;DR
Prerender serves cached HTML to AI crawlers and search engines, but accurate rendering depends on four configuration choices you control: status codes, page readiness signals, cache management, and cookie consent handling. Each takes a few minutes to implement and prevents crawlers from receiving incomplete or misleading page states.
1. Return the correct status code to crawlers
By default, Prerender caches pages and returns a 200 OK status code. If a page has been deleted or permanently moved, AI crawlers and search engines will receive the wrong signal unless you override it.
Add these meta tags to the <head> section of your HTML to return the correct status code:
Returning a 404:
<meta name="prerender-status-code" content="404">
Returning a 301 redirect:
<meta name="prerender-status-code" content="301">
<meta name="prerender-header" content="Location: http://www.example.com">
ℹ️ Only use these meta tags when you need to override the default 200 OK response—for example, when a page is deleted or permanently moved. Applying them to live, indexable pages will signal the wrong status to crawlers.
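For single-page apps, which cannot return a real 404 from the server, the meta tag can be injected at runtime when a route resolves to a missing page; Prerender picks it up after rendering. This is a minimal sketch: `markPageAsNotFound` is a hypothetical helper name, meant to be called from your router's not-found handler.

```javascript
// Sketch: inject the prerender-status-code meta tag at runtime so Prerender
// serves a 404 to crawlers for a missing SPA route. Hypothetical helper name.
function markPageAsNotFound() {
  const meta = document.createElement('meta');
  meta.name = 'prerender-status-code';
  meta.content = '404';
  document.head.appendChild(meta);
}
```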
2. Signal when your page is fully loaded
Prerender estimates when a page is ready by counting in-flight network requests. For pages with dynamically loaded content—such as data fetched via AJAX—this estimate may be too early, and Prerender may cache an incomplete page.
The window.prerenderReady flag gives you precise control. Set it to false at page load to tell Prerender to keep waiting:
<script> window.prerenderReady = false; </script>
Then set it to true once all content has loaded. Prerender caches the page at that point:
window.prerenderReady = true;
ℹ️ Set window.prerenderReady = true inside the callback or promise that resolves your last data fetch—not on a timer. Timer-based approaches risk caching a page that is still mid-render.
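Putting the two snippets together, the pattern looks like this. The data fetch is simulated with a resolved Promise so the sketch is self-contained; in a real page, replace `fetchProducts` with your actual AJAX call.

```javascript
// Stub `window` when running outside a browser (illustration only).
if (typeof window === 'undefined') globalThis.window = globalThis;

window.prerenderReady = false; // tell Prerender to keep waiting

function fetchProducts() {
  // Stand-in for e.g. fetch('/api/products').then((r) => r.json())
  return Promise.resolve([{ id: 1, name: 'Widget' }]);
}

fetchProducts()
  .then((products) => {
    // ...render the products into the DOM here...
    return products;
  })
  .finally(() => {
    // Flip the flag only after the last fetch settles, never on a timer.
    window.prerenderReady = true;
  });
```

Using `.finally()` means the flag is set even if the fetch fails, so Prerender is never left waiting on a page that errored mid-load.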
3. Manage caching with the API
The Prerender API gives you direct control over when pages are cached and refreshed—without waiting for organic crawler visits or cache expiration.
Three approaches cover most use cases:
- Cache pages on creation. Trigger caching immediately when a new page goes live so AI crawlers and search engines can find it right away.
- Recache pages when content changes. Use the /recache endpoint to update a cached page as soon as its content changes, rather than waiting for automatic expiration.
- Reduce extra renders. Set a high cache expiration value and use the API to selectively recache only pages that have changed. This reduces the frequency of automatic re-rendering and keeps your render count down.
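A recache request can be wired into your publish flow. This is a hedged sketch of a call to the /recache endpoint mentioned above: the request shape (`prerenderToken` and `url` in a JSON body) follows Prerender's API documentation, and `PRERENDER_TOKEN` is a placeholder for your own token.

```javascript
// Sketch: ask Prerender to re-render one URL on demand.
async function recacheUrl(url, token) {
  const res = await fetch('https://api.prerender.io/recache', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prerenderToken: token, url }),
  });
  if (!res.ok) throw new Error(`Recache failed with status ${res.status}`);
}

// Example (not executed here): recache a page right after publishing an edit.
// recacheUrl('https://www.example.com/blog/updated-post', process.env.PRERENDER_TOKEN);
```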
4. Suppress cookie consent banners for crawler user agents
Cookie consent banners are necessary for compliance, but they can block AI crawlers and search engines from reading your page content. If Prerender caches a page with a full-screen consent banner, crawlers may receive a page with little or no readable content.
Disable the consent banner for Prerender's user agents so crawlers receive a clean, complete version of each page. Your website visitors are unaffected—the banner continues to display normally for them.
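If your consent management platform supports custom logic, the gate can be as simple as a user-agent check. This is an illustrative sketch: the `/prerender/i` pattern and `initConsentBanner` are assumptions, so use the user-agent list from Prerender's whitelisting article and your CMP's real initialization call in production.

```javascript
// Sketch: show the consent banner to real visitors only, not to crawlers.
function shouldShowConsentBanner(userAgent) {
  const crawlerPatterns = [/prerender/i]; // extend with other crawler UAs as needed
  return !crawlerPatterns.some((pattern) => pattern.test(userAgent));
}

// Only initialize the banner for real visitors.
if (typeof navigator !== 'undefined' && shouldShowConsentBanner(navigator.userAgent)) {
  // initConsentBanner(); // hypothetical CMP init call
}
```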
See How can I whitelist Prerender's IP addresses and user agents? for the full list to add to your consent management platform.
⚠️ Do not disable the consent banner globally. Whitelist only Prerender's user agents; this keeps you compliant for website visitors while giving crawlers a clean page to index.
💬 Still need help?
If you have questions about status codes, page readiness, API caching, or cookie consent configuration, our support team can help.
→ Contact us at support@prerender.io