React application seen as a blank page via “Fetch as Google”
Ensuring that a single page application is correctly indexed by search engines
As part of SEO testing, I have used the Fetch as Google service to ensure that the Google bot is able to render my single page application (SPA). Instead, I saw an empty page:

It is nearly impossible to tell what went wrong, since the “Fetch as Google” service does not share the console output captured during the execution of the website.
TL;DR: The problem was that one of the dependencies in the project tree was using syntax that the Fetch as Google browser engine does not support. The (lazy) solution is to transpile the resulting bundle.
Does it matter?
There is a rumour (no sources) that “Fetch as Google” is not equivalent to whatever the actual Google bots use to render the pages.
I thought: maybe Fetch as Google is just a toy; something that’s not being actively maintained.
After all, there is the Chrome DevTools Protocol, which means that Google could use the latest Chrome version to render pages. Therefore, I let Google index the pages while “Fetch as Google” was showing me an empty page.

It turns out that what “Fetch as Google” shows is important. All of my search results appear without a description. (Note that the document <title /> is rendered server-side.)
Possible reasons
I started searching Google for possible causes.
- Some suggested that it is an issue with react-router 4.0.0-alpha.5.
- Others suggested that the issue is with how long the page is taking to render (waiting for async resources).
- Others suggested that the issue is a missing polyfill.
However, I was sure that none of those were the issue in my case:
- I have tried rendering the app without react-router, with the same result.
- My page loads within 100 ms.
- I am using https://polyfill.io/ to load the required polyfills (see the snippet below).
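For reference, the polyfills are loaded with a single script tag placed before the application bundle (a minimal sketch; the bundle path is a placeholder):

<script src="https://cdn.polyfill.io/v2/polyfill.min.js"></script>
<script src="/static/app.js"></script>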
PhantomJS
Without quoting any sources, I will say that there is a rumour that “Fetch as Google” is using PhantomJS v2.1 to render pages.
I have tested rendering my page using PhantomJS v2.1.1 and got the same result: a blank page. Even more worrying, PhantomJS produced no errors either: no syntax error, no warning, no missing resource, etc.
It turns out that there is a bug in PhantomJS v2.1 that makes syntax errors fail silently.
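For reference, this is roughly the harness I used to test the page in PhantomJS (a sketch; the URL, the timeout and the screenshot path are placeholders). Note that even with onError and onConsoleMessage wired up, nothing was reported:

var page = require('webpage').create();

// Forward console output from the page to the PhantomJS controller.
page.onConsoleMessage = function (message) {
  console.log('console: ' + message);
};

// Report runtime errors raised by the page (the PhantomJS v2.1 bug means
// syntax errors in loaded scripts never reach this handler).
page.onError = function (message) {
  console.log('error: ' + message);
};

page.open('http://127.0.0.1:8000/', function (status) {
  console.log('status: ' + status);

  // Give the SPA a moment to render before capturing the screenshot.
  setTimeout(function () {
    page.render('screenshot.png');
    phantom.exit();
  }, 1000);
});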
I have stripped my entire application down to:
import React from 'react';
import ReactDOM from 'react-dom';

ReactDOM.render(<div>Hello, World!</div>, document.getElementById('app'));
and it worked.
I am using babel-preset-env to transpile the code down to the ECMAScript version supported by browsers with a market share greater than 1%.
[
  "env",
  {
    "loose": true,
    "modules": false,
    "targets": {
      "browsers": [
        ">1%"
      ]
    },
    "useBuiltIns": true
  }
]
Now all eyes are on the dependencies 👀. My guess is that one of the project dependencies is not properly transpiled.
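This is a plausible explanation because the usual babel-loader setup transpiles only the project’s own sources and excludes node_modules, so dependency code ends up in the bundle exactly as published (a sketch of the common configuration, not my exact setup):

module: {
  rules: [
    {
      test: /\.js$/,
      // Dependencies are not run through Babel; whatever syntax they ship
      // with goes into the bundle as-is.
      exclude: /node_modules/,
      use: 'babel-loader'
    }
  ]
}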
Transpile the resulting webpack bundle
I have used babel-webpack-plugin to transpile the resulting code chunks.
new BabelPlugin({
  test: /\.js$/,
  presets: [
    [
      'env',
      {
        exclude: [
          'transform-regenerator'
        ],
        loose: true,
        modules: false,
        targets: {
          browsers: [
            '>1%'
          ]
        },
        useBuiltIns: true
      }
    ]
  ],
  sourceMaps: false,
  compact: false
})
and it worked. Now both PhantomJS and “Fetch as Google” were able to render the page. This means that one of the dependencies includes syntax that these browser engines do not support.
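For completeness, the plugin is registered like any other webpack plugin (a sketch, assuming the package exports its constructor from the package root):

// webpack.config.js
const BabelPlugin = require('babel-webpack-plugin');

module.exports = {
  // ... entry, output, module, etc.
  plugins: [
    new BabelPlugin({
      // configuration as above
    })
  ]
};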
Fun fact
PhantomJS and “Fetch as Google” produced noticeably different screenshots.


Some interesting observations while on the subject:
- “Fetch as Google” uses servers in different locations; as far as I can tell, all of them are in the US. This impacts geolocation-aware services (e.g. using the IP address to determine the user’s location).
- As a result, date-sensitive parts of the website (e.g. listing events in the UK for “today’s” date) show no results. This is because the client is located in the US, and at the time the screenshot was made, the US “today” was still the UK’s yesterday (see the sketch below).
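One way to sidestep the timezone part of this (an illustration only, not something I did at the time) is to compute “today” in the target timezone instead of relying on the client’s local date:

// Format the current date in a fixed timezone, regardless of where the
// client (or the crawler) happens to be located.
const formatter = new Intl.DateTimeFormat('en-GB', {
  timeZone: 'Europe/London',
  year: 'numeric',
  month: '2-digit',
  day: '2-digit'
});

// Produces the UK date (e.g. "25/12/2017") even when the client's local
// date is still a day behind.
console.log(formatter.format(new Date()));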