What we’ve learnt so far about using JavaScript for SEO

14 December 2017 | Andy Allen | Company Updates

Since we launched the Onsite Optimiser back in early September, we’ve come to learn a lot about how effective JavaScript can be at optimising important on-page SEO elements. And we’ve also learnt a lot about Google’s own ability to index JS-rendered content.

There are a lot of great articles coming out of the SEO community about JavaScript and SEO, and awareness is growing considerably. This is fantastic, as we can all learn from each other.

In this blog I’m not going to rabbit on about how effective JavaScript is for making SEO changes, as we’ve spoken about that before when launching the tool (have a read here if you’ve not seen it before).

What I want to share instead are the insights and learnings we’ve discovered in the last few months which we think everyone will find useful.

Some of it you guys may already know, some of it you may not, but we hope you learn at least something along the way!

 

A Couple Of Basics First…

Apologies if I’m teaching anyone to suck eggs here, but the first thing to remember is that you need to always check the DOM when reviewing and inspecting JS changes to a webpage. Google will crawl and then index the rendered DOM, not the ‘traditional’ pre-rendered HTML source.

This requires a couple of changes to the usual way an SEO might inspect a webpage for onsite SEO:

  1. You need to view and inspect the DOM when reviewing the code that Google will index, and
  2. Viewing the source code is no longer relevant

This is especially important because any SEO changes you make via JS (or any changes to the webpage, for that matter) will only be visible in the rendered DOM. This is true whether your website is built using a JS framework, you use our Onsite Optimiser tool, or you’re adding JS manually.
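As a quick sanity check, you can compare the two directly. The console snippet below (a rough sketch – run it in the browser console on the page you’re inspecting) prints the title from the unrendered HTML source next to the title in the rendered DOM:

```javascript
// Compare the unrendered HTML source with the rendered DOM for the current page.
// Run in the browser console on the page you want to inspect.
fetch(location.href)
  .then(function (response) { return response.text(); })
  .then(function (rawHtml) {
    // Title as delivered in the raw, unrendered HTML source
    var match = rawHtml.match(/<title[^>]*>([^<]*)<\/title>/i);
    console.log('Source title:  ', match ? match[1] : '(none found)');
    // Title as it stands in the rendered DOM, after any JS changes
    console.log('Rendered title:', document.title);
  });
```

If the two titles differ, your change lives only in the rendered DOM – which is exactly the version Google will index.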

 

Can You Check To See if Google Has Indexed These Changes?

Errrr yes and no. You can, but it’s not as simple as it is with traditional HTML changes, and there are some important distinctions.

Fetch and Render (F/R)

The first thing to note is that Search Console’s Fetch and Render tool (hereafter known in this article as F/R) can be a bit confusing when it comes to JS. When you F/R a page, it pulls back two views: ‘Fetching’ and ‘Rendering’.

‘Fetching’ only pulls back the unrendered HTML, so any changes made via JS will not be shown.

The ‘Rendering’ tab, however, will display any changes made to the page via JS (note here – Google ‘indexes’ the DOM at a certain point during the load of the webpage, so only JS changes made before this point will be crawled. More on that later 🙂)

Important to note here – especially for those of you who use our Onsite Optimiser tool – is that rendered metadata cannot be checked here. Unfortunately the ‘Rendering’ tab offers no way to view the rendered HTML, so any code that is not visible on the page itself cannot be checked.

Fortunately, we do know that even though it cannot be checked here, Google does in fact index any metadata that is changed via JS. Check out this page to do some quick testing yourself: https://thewebshed.co/test-page/

[Screenshot: SERP result showing JS-updated metadata indexed by Google]
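For reference, updating metadata via JS is straightforward. A minimal sketch (not the Optimiser’s actual code) looks something like this:

```javascript
// Minimal example of updating metadata via JS. These changes exist only in
// the rendered DOM; 'view source' will still show the original HTML.
document.title = 'New, optimised page title';

var description = document.querySelector('meta[name="description"]');
if (!description) {
  // Create a meta description tag if the page doesn't already have one
  description = document.createElement('meta');
  description.setAttribute('name', 'description');
  document.head.appendChild(description);
}
description.setAttribute('content', 'New, optimised meta description.');
```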

Google’s Cache

We just want to highlight here that Google’s cache of a page will show the unrendered HTML, just as the ‘Fetching’ tab above does. In this respect, viewing Google’s cache of a page is kind of useless if you’re updating the page’s content via JS… (check the page title in the screenshot below – it’s the unrendered HTML title)

 

How Efficient is Google at Indexing JS Changes?

This question (and many more, TBH!) has been answered very well by other great SEOs (including Justin Briggs and Bartosz Góralewicz) and it’s a very hot topic of conversation, mainly because there is no conclusive documentation from Google on this; we need to remember that JS indexing is relatively new, even for Google.

The first thing to understand when we approach the above question, though, is at what point Google crawls (essentially takes a snapshot of) the rendered DOM. This may seem like a strange question, but it’s very important, and it can be easily misunderstood.

It’s widely accepted and understood that Google takes a ‘snapshot’ of the page when the browser fires the ‘load’ event. It’s actually easy to see this for yourself: open your browser’s developer tools, click on the Network tab and reload the page.

The red bar signals the load event; resources loaded after this will not be crawled by GBot.

[Screenshots: Network tab waterfalls, with the red bar marking the load event]

We’ve seen this with our customers too, but we’ve found the red bars to be a bit misleading, as the time at which they appear differs between the two instances above. There’s probably a good reason for this – please feel free to comment below if you know why 🙂

Instead, we’ve found that you should focus on the load time detailed at the bottom of the screenshot. As long as resources start loading before this point, they will be indexed by Google. This is what we ensure for our Onsite Optimiser.
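If you’d rather have an exact figure than eyeball the red bar, the Navigation Timing API will tell you when the load event fired. A quick console sketch:

```javascript
// Check when the 'load' event fired, via the Navigation Timing API.
// Run in the console once the page has finished loading; resources that
// finish after this point fall outside GBot's snapshot.
var t = performance.timing;
if (t.loadEventStart > 0) {
  console.log('Load event fired ~' +
    (t.loadEventStart - t.navigationStart) + ' ms after navigation');
} else {
  console.log('Load event has not fired yet');
}
```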

One important note here – tools that cache webpages to improve speed and performance can have an impact on any changes you make, as they can serve stale cached copies of your JS resources.
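One general way to stop a caching layer serving an old copy is to version the script URL, so every update looks like a brand new resource. A rough sketch (the filename and version value here are purely illustrative):

```javascript
// Load the optimiser script with a version parameter in the URL. Bump the
// version whenever the script changes so caches can't serve a stale copy.
// '/optimiser.js' and the version string are hypothetical examples.
var OPTIMISER_VERSION = '2017-12-14';
var script = document.createElement('script');
script.src = '/optimiser.js?v=' + OPTIMISER_VERSION;
document.head.appendChild(script);
```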

We’ve liaised with Barry Adams, who pointed us to a great article on the difference between crawling and indexing in terms of JavaScript. We definitely advise checking it out here, as it brilliantly explains the process Google goes through.

How Long Does it Take for Google to Index JS Changes?

Just this week there have been two great posts on this very subject: one by Eoghan Henn and one by Bartosz Góralewicz. I’m going to share what we’ve found with our clients who use the Onsite Optimiser, but a word of caution here: there is no hard and fast rule about how quickly (or slowly) Google indexes JS, as it will continue to improve. We even asked the main man himself:

Fairly obvious answer, I guess… Probably better to ask an open question next time, Kieran 🙂

So far we have found that any updates made to a page’s content via JS can take between 1 and 14+ days to be indexed, and this varies according to the situation:

  1. If it is the first time Google is seeing JS-made changes on a page, e.g. the first time you update a page title, it usually takes 1–2 days to index and display the new content in the SERPs (sometimes sooner)

  2. If it is not the first time Google is seeing JS-made changes on a page, e.g. the second time you update a page title, it can take anywhere up to 14 days for the content to update in the SERPs

NB: ‘day 1’ here is the day we performed a Fetch + Render + Request Indexing (hereafter known in this article as F/R/I) for that page in Search Console.

This is because Google caches JS resources for an unspecified length of time.

We assume it’s at least 14 days, as this is what our tests have shown us. However, we tried to speed things along by hammering F/R/I for the pages where we’d made an update to previously indexed JS-made content, so if you don’t do that, it could take even longer.

Wait, 2 Weeks is a Long Time, Right?

It sure is. We’re constantly testing to see how we can ‘make’ Google drop their JS cache, and it’s something we are still working on.

However, there is one temporary solution we’ve found to help with this. It’s not perfect, but it currently works. It goes like this:

  1. You’ve made a change to the page’s content via JS, but you want to update it again via JS, and you don’t want to wait 14 days for this change to be reflected in the SERPs
  2. You remove the JS content altogether, so when Google next crawls your site it sees the unrendered HTML content. Speed this up by using F/R/I
  3. You check the SERPs to see that your unrendered HTML content is displaying
  4. You then start over again by adding the new content into the page via JS, and speed up indexing via F/R/I

Like I said, not pretty, but effective.

To help with number two above, we’ve added a ‘Delete Optimiser Changes’ button into the tool so you can quickly remove the changes you made to the page.
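Conceptually, the button does something along these lines (a simplified sketch, not the tool’s actual code – the data-oo-injected marker is just a hypothetical way of tagging injected elements):

```javascript
// Simplified sketch: remove previously injected elements so the next crawl
// sees the original, unrendered HTML content again.
// 'data-oo-injected' is a hypothetical marker attribute for illustration.
function deleteOptimiserChanges() {
  var injected = document.querySelectorAll('[data-oo-injected]');
  for (var i = 0; i < injected.length; i++) {
    injected[i].parentNode.removeChild(injected[i]);
  }
}

deleteOptimiserChanges();
```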

This is why it’s important to make sure there aren’t any caching tools (whether server-side or client-side) preventing your JS-made changes from being visible. It’s also why it’s important that these changes are visible pre-load event.
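On that last point, a defensive pattern like the one below helps make sure your changes run before the load event (applySeoChanges is a hypothetical stand-in for whatever changes you’re applying):

```javascript
// Hypothetical stand-in for whatever onsite changes you're applying
function applySeoChanges() {
  document.title = 'Optimised title';
}

if (document.readyState === 'loading') {
  // DOM still parsing: run as soon as it's ready, comfortably before 'load'
  document.addEventListener('DOMContentLoaded', applySeoChanges);
} else {
  // DOM already parsed: apply immediately, before the 'load' event if possible
  applySeoChanges();
}
```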

 

Conclusion

Making changes to a webpage via JS is a great solution for implementing on-page SEO on tricky/outdated/‘lost login details’ CMS platforms and websites. But it’s important that you still understand the current limitations of Google’s ability to index and re-index JS.

We’re always continuing to improve the Onsite Optimiser, and speeding up Google’s re-index rate is one of the top priorities for us. We’ll be sure to keep you updated along the way.

And please feel free to comment below if you have anything you want to add!

Great Resources on JS SEO

We wouldn’t have been able to create the Onsite Optimiser without the help of the SEO community and its generosity in sharing knowledge. If you’d like to learn more about using JS for SEO, then check out any of the articles below:

Oliver Mason, ‘Googlebot Renders As-If’ – http://ohgm.co.uk/googlebot-renders-as-if/

Eoghan Henn, ‘More JavaScript SEO experiments with Google Tag Manager’ – https://www.searchviu.com/en/javascript-seo-experiments-google-tag-manager/

Justin Briggs, ‘Core Principles of SEO for JavaScript’ – https://www.briggsby.com/dealing-with-javascript-for-seo/

Bartosz Góralewicz, ‘Can Google Properly Crawl and Index JavaScript Frameworks? A JavaScript SEO Experiment’ and the recent follow-up ‘Everything You Know About JavaScript Indexing is Wrong’ – https://www.elephate.com/everything-you-know-about-javascript-indexing-is-wrong

Sam Nemzer, ‘How to Implement SEO Changes Using Google Tag Manager’ – https://moz.com/blog/seo-changes-using-google-tag-manager

Barry Adams, ‘JavaScript and SEO: The Difference Between Crawling and Indexing’ – http://www.stateofdigital.com/javascript-seo-crawling-indexing/

 

P.S. We are currently running another test, but more on that later.
