I made a Google Sheet that does change detection for you based on two Screaming Frog crawls. I'll tell you why that's important.
Two problems frequently come up for SEOs, regardless of whether we're in-house or external:
- Knowing when someone else has made key changes to the site
- Keeping a record of specific changes we made to the site, and when.
Both can sound trivial, but unnoticed changes to a site can undo months of hard work and, particularly with large e-commerce sites, it's often necessary to update internal links, on-page text, and external plugins in search of the best possible performance. That doesn’t just go for SEO, it applies just as much to CRO and Dev teams.
Keeping a record of even just our own changes can be really time-consuming, but without one we often have to rely on memory to work out what we did and when, particularly when we see a pattern of changing traffic or rankings and want to know what might have caused it.
These are people problems. When we can't rely on other teams to tell us about their planned changes, that needs to be fixed at a team level. When we don't have a system for recording the changes we make, that's understandable, particularly for smaller keyword or linking tweaks, but compare us to a Dev team: a record of changes is exactly the kind of thing we'd expect them to build into their process. At the end of the day, when we don't keep track of what we're doing, it's because we either don't have the time or don't have the commitment to a process.
We shouldn’t really be trying to fix people problems with tools. That said, people problems are hard. Sometimes you just need a way of staying on top of things while you fight all the good fights. That's exactly what this is for.
This is a way to highlight the changes other teams have made to key pages, so you can quickly take action if needed, and to keep track of what you’ve done in case you need to undo it.
What sites is this good for?
This sheet is for anyone who needs an idea of what is changing on a fairly large number of pages but can’t afford to pay for big, expensive change detection systems. It’ll work its way through around 1,000 key pages.
That said, 1,000 key pages stretches further than you might think. For many small sites, that'll more than cover all the pages you care about, and even larger eCommerce sites get the vast majority of their ranking potential from a smaller number of category pages. You would be surprised how big a site can get before it needs more than 1,000 category pages.
That 1,000 URL limit is a guideline; the sheet can probably stretch a bit further than that, it's just going to take quite a while to process all of the formulas.
So what changes does it detect?
This Google Sheet looks at your “new crawl” and “old crawl” data and gives you tabs for each of the following:
- Newly found pages - any URL in the new crawl that isn’t in the old crawl
- Newly lost pages - any URL in the old crawl that isn’t in the new crawl
- Indexation changes - e.g. any URL which is now canonicalised or was previously noindexed
- Status code changes - e.g. any URL which was redirected but now returns a 200
- URL-level Title Tag or Meta Description changes
- URL-level H1 or H2 changes
- Any keywords that are newly added or missing sitewide.
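Under the hood, the URL-level comparisons above come down to set operations on the two crawl exports. Here's a minimal Python sketch of the same idea using pandas, assuming column names (`Address`, `Status Code`) in the style of a Screaming Frog internal_all export - this is an illustration of the logic, not the sheet's actual formulas:

```python
import pandas as pd

def crawl_changes(old: pd.DataFrame, new: pd.DataFrame) -> dict:
    """Compare two crawl exports keyed on the URL ('Address') column."""
    old_urls = set(old["Address"])
    new_urls = set(new["Address"])

    found = sorted(new_urls - old_urls)  # newly found pages
    lost = sorted(old_urls - new_urls)   # newly lost pages

    # For URLs present in both crawls, flag status code changes.
    both = old.merge(new, on="Address", suffixes=("_old", "_new"))
    status = both[both["Status Code_old"] != both["Status Code_new"]]

    return {
        "found": found,
        "lost": lost,
        "status_changes": status[["Address", "Status Code_old", "Status Code_new"]],
    }
```

The same merge-and-compare pattern covers titles, meta descriptions, and H1/H2 changes; you just diff a different pair of columns.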
What’s that about keyword change detection?
On many sites, we’re targeting keywords in multiple places at a time. Often we would like to have a clear idea of exactly what we’re targeting where but that’s not always possible.
The thing is, as we said, your pages keep changing, and you keep changing them. When we update titles, meta descriptions, and H1s, we're not checking every page on the site to confirm our keyword coverage. It's quite easy to miss that we're removing some middlingly important keyword from the site completely.
Thanks to a custom function, the Google Sheet splits all of your title tags, meta descriptions, and headings into their component words and finds any that, as of the latest crawl, have either been newly added to, or removed from, the site completely.
It then looks the freshly removed words up against Search Console data to find all the searches you were getting clicks from before, to give you an idea of what you might be missing out on now.
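In other words, the keyword step is a word-level set difference between the two crawls, followed by a lookup of the removed words against Search Console query data. A rough Python sketch of that idea (the function names, the word-splitting regex, and the shape of the Search Console data here are all illustrative assumptions, not the sheet's actual implementation):

```python
import re

def words_in(texts):
    """Split titles/descriptions/headings into a set of lowercase words."""
    words = set()
    for text in texts:
        words.update(re.findall(r"[a-z0-9']+", text.lower()))
    return words

def keyword_changes(old_texts, new_texts, gsc_rows):
    """gsc_rows: iterable of (query, clicks) pairs from Search Console."""
    old_words, new_words = words_in(old_texts), words_in(new_texts)
    added = new_words - old_words      # words now on the site that weren't before
    removed = old_words - new_words    # words removed from the site completely

    # Rough "value lost": clicks on queries containing a removed word.
    value_lost = {
        w: sum(clicks for query, clicks in gsc_rows if w in query.lower().split())
        for w in removed
    }
    return added, removed, value_lost
```

Because the difference is computed sitewide rather than page by page, a word only shows up when it has vanished from (or appeared on) every page at once.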
The fact that it's checking across all your pages means you don't end up with a bunch of stopwords in the list (stopwords being: it, and, but, then, etc.), and you don't have to worry about branded terms being pulled through either. It's very unlikely that you'll completely remove your brand name from all of your title tags and meta descriptions by accident, and if you do, that's probably something you'd want to know about.
How do I use it?
Start by accessing a copy of this Google Sheet so you can edit it. There are step-by-step instructions in the first tab but broadly all you need to do is:
- Run a Screaming Frog crawl of all the pages you want to detect changes on
- Run a second Screaming Frog crawl of the same pages once some time has passed
- Export the internal_all report for both crawls and paste them into the “old crawl” and “new crawl” tabs respectively
- Wait a bit (like 30 minutes)
- Check the results tabs for changes
- (Optional) Import Search Console data to give “value lost” information for keywords you removed.
What do you think?
I hope you find this useful! I was really surprised by what Google Sheets was able to achieve. Is there anything you think I've missed? Anything you would change?