How does Prerender work?
  • 06 Oct 2023


Prerender is designed to speed up serving content to search crawlers and thus improve your SEO score. Prerender has to be integrated with your web server, backend, or CDN to provide a mechanism that determines which requests will be forwarded to Prerender.

You can find out more about integrations on our Available integrations page.

Direct User Access

This is how your system operates by default.
Requests come from your clients and you serve them directly; Prerender doesn't take part in the flow.

Direct User Access.png


First-Time Crawler Request

If a crawler requests a page, your backend identifies it as a crawler and forwards the request to Prerender. The first time this happens, the page won't be in the cache; this is called a cache miss. In this case, Prerender needs to fetch the page from you, render it, store it in the cache, and return it to you. This process may take several seconds.

First-Time Crawler Request.png


Repeated Crawler Request

From then on, the page will be in our cache and we will serve it from there.

Repeated Crawler Request.png
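The cache-miss and cache-hit paths described above can be sketched in a few lines of Python. This is an illustrative model only, not Prerender's actual implementation; the `cache` dictionary and `render_page` function are stand-ins for the real rendering service.

```python
# Minimal sketch of the caching behavior described above (illustrative only):
# the first crawler request is a cache miss and triggers a slow render;
# repeated requests are served straight from the cache.

cache = {}  # url -> rendered HTML

def render_page(url):
    # Stand-in for fetching the page from your server and executing its JavaScript.
    return f"<html>rendered {url}</html>"

def get_prerendered(url):
    if url not in cache:             # cache miss: slow path (may take seconds)
        cache[url] = render_page(url)
    return cache[url]                # cache hit: fast path

first = get_prerendered("https://example.com/")   # miss: renders and caches
second = get_prerendered("https://example.com/")  # hit: served from the cache
```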


Content Update

Your content changes over time. To follow the changes, Prerender needs to periodically refresh its cache. This is called recaching, and you can set the maximum age of your pages in the cache (called cache freshness). In this case, the render is triggered by a timer rather than by an external event.

Content Update.png
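The timer-driven recache can be sketched by extending the cache with a timestamp per page. The `CACHE_FRESHNESS` value and function names below are hypothetical, chosen only to illustrate the freshness check:

```python
import time

CACHE_FRESHNESS = 3 * 24 * 3600  # hypothetical maximum page age: 3 days

cache = {}  # url -> (rendered_html, timestamp)

def render_page(url):
    # Stand-in for fetching and rendering the page.
    return f"<html>rendered {url}</html>"

def get_prerendered(url, now=None):
    now = time.time() if now is None else now
    entry = cache.get(url)
    if entry is None or now - entry[1] > CACHE_FRESHNESS:
        # Missing or stale: the timer, not an external event, triggers the render.
        cache[url] = (render_page(url), now)
    return cache[url][0]

get_prerendered("https://example.com/", now=0)                    # miss: renders and caches
get_prerendered("https://example.com/", now=100)                  # fresh: served from cache
get_prerendered("https://example.com/", now=CACHE_FRESHNESS + 1)  # stale: recached
```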

Integrating Prerender is always a matter of a server-side request rewrite. Your server needs to differentiate between a human visitor and a bot request and, based on that, either fulfill the request normally or rewrite it to Prerender.

With that said, generally, there are two ways to integrate Prerender:

  1. Integrated into the backend service using a so-called 'middleware' provided by Prerender. This is analogous to Node.js middlewares - if you are familiar with those - but we have many languages covered, such as C#, Python, PHP, Ruby, etc. Middlewares are small code snippets that get executed upon every request, but they only affect the response when a bot (e.g., Googlebot) user agent is detected. In that case, the middleware fetches the prerendered content from the cloud service and responds with that. Example middleware integration.
  2. Integrated into the CDN in front of your servers - if any. In this case, it's not a middleware but a set of well-crafted rewrite rules that do the routing between your backend and Prerender's cloud service as needed. Example CDN integration.
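The user-agent check at the heart of both approaches can be sketched in Python. The bot list and the `fetch_prerendered`/`serve_normally` helpers below are illustrative stand-ins, not Prerender's actual middleware:

```python
# Illustrative sketch of the request-rewrite decision a middleware makes.
# BOT_AGENTS is a small sample; real middlewares match a much longer list.
BOT_AGENTS = ("googlebot", "bingbot", "yandexbot", "duckduckbot", "facebookexternalhit")

def is_crawler(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in BOT_AGENTS)

def fetch_prerendered(url: str) -> str:
    # Stand-in for the HTTP call to the prerender cloud service.
    return f"<html>prerendered {url}</html>"

def serve_normally(url: str) -> str:
    # Stand-in for your regular server routes.
    return f"<html>client-side app shell for {url}</html>"

def handle_request(url: str, user_agent: str) -> str:
    if is_crawler(user_agent):
        # Rewrite: respond with the prerendered HTML instead of the JS app shell.
        return fetch_prerendered(url)
    return serve_normally(url)

bot_response = handle_request("https://example.com/", "Googlebot/2.1 (+http://www.google.com/bot.html)")
human_response = handle_request("https://example.com/", "Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")
```

The same branching logic applies to a CDN integration, except it is expressed as rewrite rules rather than code.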

In both cases, the rewrite target is a very simple URL concatenation (your page URL appended to the Prerender service URL) that you can test easily, even without doing any integration, just by issuing the following curl command:

curl -H "X-Prerender-Token: <YOUR_PRERENDER_TOKEN>" https://service.prerender.io/<YOUR_URL>

Both methods have pros and cons: integrating into the backend gives you more flexibility, as it runs actual code, so you have more granular control over what should happen and when. But CDNs or caches in front of your servers may interfere with responses and may give you unwanted results. Integrating into the CDN makes this risk go away, but not all CDNs are capable of routing the way you need them to.

The middleware that you install on your server will check each request to see if it's a request from a crawler. If it is a request from a crawler, the middleware will send a request to Prerender for the static HTML of that page. If not, the request will continue on to your regular server routes. The crawler never knows that you are using Prerender since the response always goes through your server.
