Prerender and You: a Case Study in AJAX Crawlability

The story

This year I had the opportunity to work on an interesting technical SEO project for a client: they had a great web-app that prioritized performance and usability, built entirely with AJAX. The app was dynamically generated and rendered fully on the client-side, resulting in a lightweight, seamless user experience that their engineers were justifiably proud of.

It all sounds ideal, right? Almost. The app and its constituent pages weren’t being indexed in Google. To clarify: Google was rendering the JavaScript pages, as evidenced in Webmaster Tools, but failing to reliably crawl and understand them. This might never have been an issue at all, but our client operated in a space where organic search was crucial, and so they came to us for help.

The problem

This is a well-documented problem that Google has been aware of since as early as 2009: its bot was unable to crawl, render, or index dynamically generated JavaScript pages. Bad news for our client’s web app. Google has worked diligently to overcome this flaw over the past few years, with marked results. By the time I’d been tasked with speccing a solution for our client, their web app could be rendered reliably, but that was about it. Although Google had delivered a solution for reading our client’s dynamically generated pages, the bot seemed stymied when it came to taking their web app to the next level of SEO - crawling, indexing, and above all, ranking.

All of this was further compounded by how their web app worked, as well as how it was built. Our client operated in the job-search space and aspired to compete with the likes of Indeed, Monster, and AngelList. Their app incorporated a faceted search function that would generate all sorts of alphanumerical URLs. These were effectively different pages, since the on-page elements changed as users navigated the facets, but the changes weren’t being picked up by Google as new pages. It’s not hard to imagine how this might be problematic; ranking for a term like “Los Angeles Accounting Jobs” without a Los Angeles Accounting Jobs page could be compared to bringing a sock to a gunfight.

Our solution

We rolled up our sleeves, came up with three possible solutions, and immediately abandoned the following two:

Solution 1: Make the web app isomorphic, with static HTML/CSS content on each page

Pros:

- Solves their SEO problem by reducing/minimizing the role of AJAX in serving content

Cons:

- Resource-intensive build for their already-pressed engineers
- Backend becomes doubly difficult to manage

Solution 2: Hard-code all the job listing pages

Pros:

- Best-practice SEO

Cons:

- Too much dev work, not enough time
- Undermines the benefits of AJAX (site speed, user experience)
- Difficult to scale going forward

We ended up going with the third: Prerender. Designed to adhere to Google’s AJAX crawling specification, Prerender works by creating cached HTML versions of pages (in our case, the web app’s faceted search pseudo-pages) and serving those cached pages to Google’s bot when it comes around via an ?_escaped_fragment_= URL path. Compatible with all the major JavaScript frameworks and libraries, Prerender sits on your server as middleware and works with tech stacks including Ruby on Rails, Nginx, Apache, and more.
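To make the mechanics concrete, here’s a sketch (illustrative, not Prerender’s actual code) of the URL mapping at the heart of Google’s AJAX crawling scheme: a hashbang (#!) URL is requested by the crawler in its _escaped_fragment_ form, and middleware like Prerender intercepts that request and answers with cached HTML.

```javascript
// Sketch of the AJAX crawling scheme's URL mapping (illustrative).
// A hashbang URL like
//   https://example.com/#!/jobs/la-accounting
// is requested by Googlebot as
//   https://example.com/?_escaped_fragment_=%2Fjobs%2Fla-accounting
// and the middleware answers that request with a cached HTML snapshot.
function toEscapedFragment(url) {
  const [base, fragment] = url.split('#!');
  if (fragment === undefined) return url; // no hashbang: nothing to rewrite
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

Regular visitors never see the escaped-fragment form; it exists purely so the crawler can request a static snapshot of a dynamic page.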

Implementing Prerender is straightforward (our client was able to roll it out within a week of agreeing to use it), it’s totally open source, and it comes supported by robust documentation. It was a Goldilocks solution given their resources and needs, and it began producing results almost immediately.
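For readers wondering what “middleware” looks like in practice, an nginx setup might look roughly like the sketch below. The port, user-agent list, and file paths are assumptions for illustration, not our client’s actual configuration.

```nginx
# Hypothetical sketch of Prerender sitting in front of an AJAX app.
# Crawler requests (escaped-fragment URLs or known bot user agents)
# get proxied to a local Prerender service; everyone else gets the
# untouched single-page app.
server {
    listen 80;
    server_name example.com;

    location / {
        set $prerender 0;
        if ($args ~ "_escaped_fragment_") {
            set $prerender 1;
        }
        if ($http_user_agent ~* "googlebot|bingbot") {
            set $prerender 1;
        }
        if ($prerender = 1) {
            proxy_pass http://127.0.0.1:3000;  # assumed Prerender port
        }
        try_files $uri /index.html;  # serve the AJAX app otherwise
    }
}
```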

The results

Three words come to mind: indexation hockey stick.

[Webmaster Tools indexation chart, annotated “Prerender Implemented”]

We doubled indexation within three months of implementing Prerender. Google could now crawl and index facet pages from the web app, meaning our client was set up to begin ranking for terms like “Los Angeles Accounting Jobs”.

It’s important to note here that Prerender alone wasn’t enough to guarantee ranking in a competitive SEO space like job search; my high school math/logic teacher might say that Prerender was necessary, but not sufficient. This insufficiency came down to how Prerender selected which faceted pages in the web app to index: it didn’t. Prerender exposed the pages to Google, but did nothing beyond that.


[Facet matrix: job types (e.g., Graphic Design) crossed with companies (Company A, Company B, Company C)]
In the matrix above, the number of possible faceted pages is already (I’m told) 81 combinations. Although Google could now crawl and index the Prerendered pages, it couldn’t sensibly order them in a hierarchy. In other words, there was no corresponding information architecture to support our newly Prerendered pages. Each page in the web app was also missing crucial SEO elements, such as clean URLs (as opposed to something like this) and descriptive title tags, which help search engines understand content. Put simply: Prerender was able to expose the faceted job pages to Google and get them indexed, but the pages themselves weren’t initially built to be landing pages (no hierarchical directory to browse, no readable URLs, no title tags, etc.).
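To illustrate the kind of on-page elements that were missing, each facet combination could have been mapped to a readable URL slug and a descriptive title tag. The function below is a hypothetical sketch; the names and URL scheme are mine, not the client’s.

```javascript
// Hypothetical sketch: derive the on-page SEO elements a faceted
// landing page needs (a readable URL and a descriptive title tag)
// from a facet combination. Names and URL scheme are illustrative.
function facetLandingPage({ city, jobType }) {
  const slug = [city, jobType, 'jobs']
    .join('-')
    .toLowerCase()
    .replace(/\s+/g, '-');
  return {
    url: '/jobs/' + slug,                                  // e.g. /jobs/los-angeles-accounting-jobs
    title: `${city} ${jobType} Jobs | Example Job Board`,  // descriptive title tag
  };
}
```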

To get a better idea of what I’m describing, we can turn to a job search juggernaut: Indeed. Consider the Indeed page that ranks for “NYC accounting jobs”. Readers will note that this page is, in effect, a faceted page - it’s generated with Indeed’s internal search already populated (what: Staff Accountant, where: New York, NY). The difference is that Indeed’s search function is complemented by static directory pages, segmented by state or category and then by city, and that this page (and every page like it on Indeed) has a readable URL and optimized title tags. These elements - the hierarchy and the on-page SEO for each job page - make Indeed’s page a strong contender for SEO; our client’s faceted web-app pages weren’t built with SEO in mind, so they were missing critical elements that Prerender was never meant to address.

Applying Prerender alone helped to A) significantly increase their indexation and B) marginally increase organic traffic. It wasn’t a perfectly self-contained solution, but it was a big step in the right direction for our client.

For any reader who wants to learn more, Airbnb has a documented case study in which they tackled the same problem with their web app and implemented a similar solution.

Looking forward

Google announced one month ago that they were deprecating their AJAX crawling scheme. Context: in 2009, when Google was unable to render dynamically generated pages, they recommended workarounds like ?_escaped_fragment_=, which Prerender neatly encapsulates. Now, six years later, Google’s putting on its big boy pants and declaring that no, in fact, they can crawl and render your JavaScript content (they updated their technical webmaster guidelines, too). Personally, I’ve yet to see compelling evidence that Google can crawl and render dynamic content at scale: individual pages, yes; entire websites (of significant size), no.

For now, Google will still crawl and accept ?_escaped_fragment_= indexation solutions, meaning the deprecation announcement has little in the way of practical implications for the average SEO/webmaster. Directionally, though, Google has set its sights on crawling dynamically generated content. The fact that they’re announcing the deprecation of their AJAX crawling scheme now, while their crawling capabilities are still suboptimal, suggests they’re confident they’ll be able to crawl dynamic content well, if not soon.

In the meantime, Prerender and solutions like it are still reliable ways to get your AJAX and JavaScript content crawled, and that’s unlikely (knock on wood) to change any time soon. So, for anyone with a dynamically generated site who wants to enable crawlability, I’d recommend trying Prerender. It’s not necessarily a perfect solution - I’d describe it as “one size fits most” - but it worked for my client without breaking the bank.
