Getting started

How does it work?

Integrating Prerender is always a matter of a server-side request rewrite. Your server needs to differentiate between human visitors and bot requests and, based on that, either fulfill the request normally or rewrite it to Prerender.
With that said, generally, there are two ways to integrate Prerender:
  1. Integrated into the backend service using a so-called 'middleware' provided by Prerender. This is analogous to Node.js middlewares, if you are familiar with those, but many languages are covered, such as C#, Python, PHP, Ruby, etc. Middlewares are small code snippets that run on every request but only affect the response when a bot user agent (e.g., Googlebot) is detected. In that case, the middleware fetches the prerendered content from the cloud service and responds with that.
  2. Integrated into the CDN in front of your servers, if any. In this case, it's not a middleware but a set of well-crafted rewrite rules that route requests between your backend and Prerender's cloud service as needed.
In both cases, the rewrite target is a very simple URL concatenation (the service endpoint followed by <YOUR_URL>) that you can test easily, even without doing any integration, by issuing the following curl command:
curl -H "X-Prerender-Token: <YOUR_PRERENDER_TOKEN>" https://service.prerender.io/<YOUR_URL>
Both methods have pros and cons: integrating into the backend gives you more flexibility, as it runs actual code, so you have more granular control over what should happen and when. But CDNs or caches in front of your servers may interfere with responses and may give you unwanted results. Integrating into the CDN makes this risk go away, but not all CDNs are capable of routing the way you need them to.

The middleware that you install on your server checks each request to see if it comes from a crawler. If it is a crawler request, the middleware sends a request to Prerender for the static HTML of that page. If not, the request continues on to your regular server routes. The crawler never knows that you are using Prerender, since the response always goes through your server.
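The decision logic described above can be sketched as follows. This is an illustrative sketch only: the crawler list and `SERVICE_URL` here are assumptions for the example, not Prerender's official values, and a real middleware ships with a maintained pattern list and also forwards the `X-Prerender-Token` header when fetching.

```python
# Sketch of the routing decision a Prerender middleware makes.
# CRAWLER_AGENTS and SERVICE_URL are illustrative assumptions,
# not an official list.

CRAWLER_AGENTS = ("googlebot", "bingbot", "yandex", "baiduspider",
                  "facebookexternalhit", "twitterbot", "linkedinbot")
SERVICE_URL = "https://service.prerender.io/"  # rewrite target prefix

def is_crawler(user_agent: str) -> bool:
    """Return True when the User-Agent string looks like a known bot."""
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in CRAWLER_AGENTS)

def rewrite_target(requested_url: str) -> str:
    """The simple URL concatenation described above."""
    return SERVICE_URL + requested_url

def handle_request(user_agent: str, requested_url: str) -> str:
    """Decide where the request should be served from."""
    if is_crawler(user_agent):
        # A real middleware would fetch this URL (with the
        # X-Prerender-Token header) and return its body to the bot.
        return rewrite_target(requested_url)
    # Human visitors fall through to the normal application routes.
    return requested_url
```

The human path is untouched, which is why regular visitors never notice the middleware.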

We currently have integrations available for these stacks:

Officially maintained middleware

Community maintained middleware

Testing your middleware

To see a prerendered page exactly as the crawlers will see it, set the User Agent in your browser to Googlebot and visit your URL, or run the following on a command line, changing the URL to your own:

curl -A Googlebot <YOUR_URL>

If you have your middleware set up correctly, you should see your page returned! You'll know it's correct if you view the source of the page and see all of the HTML there instead of just the JavaScript. If you don't see your page rendered correctly, see our FAQ or get in touch with us!

Testing a local development server

We published our core component as an open-source project so you can test sites in a local development environment.

git clone https://github.com/prerender/prerender.git
cd prerender
npm install
node server.js

The default port is 3000. Change the server port with export PORT=1337 if needed.

Now you should have a Prerender server running, and you should be able to prerender pages with the following command:

curl http://localhost:3000/<YOUR_URL>

Try switching out the URL with your own, even locally hosted URLs!
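Programmatically, the same local test is just a GET for the concatenated URL. A minimal sketch, assuming the server from the steps above is running on port 3000 and accepts the concatenated URL as in the curl example:

```python
import urllib.request

# Assumed local endpoint from the steps above.
PRERENDER_LOCAL = "http://localhost:3000/"

def local_prerender_url(page_url: str) -> str:
    """Build the local test URL -- the same concatenation the service uses."""
    return PRERENDER_LOCAL + page_url

def fetch_prerendered(page_url: str) -> str:
    """Fetch the rendered HTML from a locally running Prerender server."""
    with urllib.request.urlopen(local_prerender_url(page_url)) as resp:
        return resp.read().decode("utf-8")
```

For example, `fetch_prerendered("https://example.com/")` returns the rendered HTML while the local server is running.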

Please note: opening these URLs in a browser may give you misleading results, as your browser will fail to load resources (e.g., CSS) referenced by relative URLs.
