
Best Practices

Ensure proper handling of status codes, page readiness, and caching to optimize crawling and SEO.

 

Overview

When integrating Prerender.io with your website, it’s essential to manage how status codes are returned to crawlers, ensure that pages are fully loaded before being cached, and optimize caching strategies to improve site performance and SEO. Here’s how you can control these factors with special meta tags, API calls, and proper handling of cookie consent banners.

 

1. Correctly Returning Status Codes to Crawlers

By default, Prerender.io caches HTML pages and returns a 200 OK status code. However, you may sometimes want to return a different status code to crawlers (e.g., when a page has been deleted or redirected). To do so, use two special meta tags, prerender-status-code and prerender-header, in the <head> section of your HTML (a dynamic example for single-page apps follows the list below).

  • Returning a 404 Status Code:

If your REST endpoint returns a 404 (page not found) and you want crawlers to receive a 404 response as well, add this meta tag:

<meta name="prerender-status-code" content="404">
  • Returning a 301 Redirect:

If your REST endpoint returns a 301 (permanent redirect) and you want crawlers to follow the redirect, add the following meta tags for the status code and the Location header:

<meta name="prerender-status-code" content="301">
<meta name="prerender-header" content="Location: http://www.example.com">

Key Takeaway:
These meta tags help you manage status codes for crawlers, ensuring that Prerender.io delivers the correct response, including redirects or 404s, for search engines to process.

 

2. Ensuring Pages are Fully Loaded Before Prerendering

Prerender.io tries to determine when a page is fully loaded by counting the number of in-flight requests. However, if you want to ensure that a page is completely ready before it’s cached, you can control this process using a special JavaScript flag (a complete example follows the steps below).

  • Set window.prerenderReady to false initially to tell Prerender.io that the page is still loading:
<script> window.prerenderReady = false; </script>
  • Set window.prerenderReady to true once all the necessary content (including AJAX requests) has loaded. This signals Prerender.io to save the page:
window.prerenderReady = true;
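
Putting both steps together, a common pattern is to initialize the flag in the <head> and flip it once your data has arrived. The sketch below assumes a placeholder /api/page-data endpoint and a renderPage() function standing in for your app's own rendering logic:

<script>
  window.prerenderReady = false;
</script>
<script>
  // Placeholder endpoint and render function - substitute your own.
  fetch('/api/page-data')
    .then(function (response) { return response.json(); })
    .then(function (data) {
      renderPage(data);             // your app's rendering logic
      window.prerenderReady = true; // signal Prerender.io that the page can be cached
    });
</script>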

Key Takeaway:
Using the window.prerenderReady flag allows Prerender.io to wait until the page is fully loaded before it is cached, ensuring accurate content rendering for search engines.


3. Using the Prerender API to Manage Caching

You can optimize caching by using Prerender.io’s API both to cache pages initially and to recache them when their content changes. This approach is particularly useful if you want to keep your content fresh without waiting for the cache expiration interval.

  • Use the API to cache pages when they are created:
    You can trigger the caching process immediately when new pages are created, ensuring that they are available to crawlers right away.

  • Use the API to recache pages when they change:
    If your page content changes, you can recache it using the API. This ensures that the latest version of your content is available to crawlers, without having to wait for the cache expiration time.

  • Save on costs:
    By setting a high cache expiration value and using the API to selectively recache pages that have changed, you can reduce the frequency of automatic re-rendering, lowering your costs.

Click here to view our /recache endpoint.
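
As a rough sketch, recaching a single URL is a POST request to the /recache endpoint carrying your Prerender token and the URL to refresh. The field names below follow the standard /recache example, but check the endpoint documentation linked above for the exact payload; the token and URL are placeholders:

// Node.js 18+ sketch using the built-in fetch API.
// YOUR_PRERENDER_TOKEN and the URL are placeholders - replace them with your own values.
fetch('https://api.prerender.io/recache', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    prerenderToken: 'YOUR_PRERENDER_TOKEN',
    url: 'https://www.example.com/products/123'
  })
}).then(function (response) {
  console.log('Recache request returned status', response.status);
});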

Key Takeaway:
Leveraging the Prerender API for caching and recaching pages provides more control over when and how content is rendered, improving efficiency and reducing costs.

 

4. Handling Cookie Consent Banners for SEO

Cookie consent banners are important for compliance but can interfere with user navigation and search engine crawlers. If the cookie consent banner is disruptive to crawlers, it can negatively affect your SEO, as search engines may interpret the page experience as poor.

Best Practice:
To ensure a smooth crawling experience, disable the cookie consent banner for Prerender’s user agents. This will prevent the banner from being included in the cached page version that is served to search engines.

  • Whitelisting Prerender’s User Agents:
    Make sure the banner is not shown to the user agents Prerender uses when rendering your pages. This ensures that crawlers receive a clean, usable version of the page (see the sketch below).

You can find the list of Prerender’s user agents here.
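
One simple approach, sketched below, is to skip initializing the banner when the requesting user agent matches Prerender's. The substring test on "prerender" is an assumption, so confirm it against the user-agent list linked above; showCookieBanner() is a placeholder for your consent library's initialization call:

<script>
  // Assumption: Prerender's rendering user agent contains the string "prerender";
  // verify against the user-agent list linked above.
  var isPrerender = /prerender/i.test(navigator.userAgent);
  if (!isPrerender) {
    showCookieBanner(); // placeholder for your consent banner's init function
  }
</script>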

Key Takeaway:
Disabling cookie consent banners for crawlers ensures that your pages are properly indexed and don't suffer from poor user experience signals that could hurt your SEO rankings.

Tips & Notes

  • Meta Tag Configuration: Ensure that you only use the meta tags for status codes when you need to override the default behavior (e.g., when the page is deleted or redirected). This helps ensure that search engines receive the correct response.

  • Page Readiness: Be mindful that dynamically loaded content (like via AJAX) can delay when a page is considered ready. Using the window.prerenderReady flag gives you precise control over when the page is considered fully loaded.

  • API Usage for Recaching: Use the /recache API to update pages in real-time. This is helpful for sites that frequently update content or have time-sensitive information.

  • SEO Impact: Optimizing how pages are rendered for bots and users (including handling cookie banners) can improve SEO rankings by enhancing the user experience and ensuring proper crawling.

 

By following these best practices, you can ensure that your website is crawled and indexed properly by search engines, improving your SEO performance while reducing costs and maintaining an optimal user experience.

 
