Change Tracking: Monitor Competitors’ Websites for SEO

It's fairly standard practice nowadays to track keyword rankings with tools such as SEOmoz, Authority Labs or Conductor. As SEOs, it just makes sense to keep an eye on rankings, whether or not you're of the school that reports them to clients or your boss. However, we know that with rankings there are so many variables at play that reacting to big changes is more of an art than a science.

Rank tracking helps inform us of how our tactics are working, whether competitors are up to something, or if Google has been playing with the dials again. However, I’ve been thinking recently about what other things we should be routinely tracking, and which of these might be helpful in prompting more specific actions.

One thing that I know some SEOs do, on and off, but that I haven't really done much of until now, is tracking my competitors' sites (their markup, structure and content). Sure, I look at their rankings, and if there have been interesting changes I might look at Open Site Explorer, Majestic or Ahrefs to establish whether they've been doing anything new on the link-building front. But if the changes are internal to their site, I probably won't spot exactly what they were unless it was something in-your-face (like a complete redesign).

However, Google have been rolling out an increasing array of technical changes over the last few years, enabling us to make all sorts of under-the-hood changes that can affect our SEO performance. Sometimes these changes have a dramatic impact in the SERPs, and in those instances it can be really handy to know exactly what your competitor did. Even when the changes are smaller, less dramatic, or even negative, it can be helpful to know about them. By taking frequent snapshots of our competitors' sites (and our own too, if there are other departments that might be making changes!), we can then look at our rankings data, competitive intelligence tools such as Searchmetrics, and any other data points we have to identify any link between a change in rankings/visibility and any on-site changes made just beforehand.

Sure, these changes probably aren't going to be a frequent occurrence, but tracking a few sample pages from your competitors' websites on a weekly or monthly basis need only take a few minutes, and on the occasions you refer back it can turn out to be a high-ROI activity.

Putting it to Use

Here is an anonymised view (sorry, it's covered by NDA) of one of my clients' competitors' SERP visibility, as measured by Searchmetrics:

[Image: Searchmetrics visibility graph]

In early December we can see they took a decent ~15% step up in visibility (note the Y axis is not zero-based), which on a large site is quite significant and likely equates to a very real increase in revenue. Now, normally, I'd start trying to work this out by looking at the link activity that had gone on recently. I'd also take a look at the site but, as I mentioned above, it is hard to identify all the changes without a snapshot to refer back to.

In this instance, there wasn't much out of the ordinary in the linking activity, so at this point I'd look around their site a bit and maybe come away with one or two theories, but nothing concrete and nothing actionable...

However, in this instance one of our consultants, Mike Pantoliano, had been keeping an eye on things, so we had a before-and-after view of the code on the competitor's site. They had rolled out an update to their front end, which changed a lot of the visuals but kept the IA of the site pretty much intact. Hidden amongst all the changes, however, was an under-the-hood tweak: they had added a recently introduced Schema.org vocabulary to their pages. Suddenly our view of the situation looks quite different:

[Image: Searchmetrics visibility graph, annotated with the date of the competitor's markup change]

By regularly taking snapshots of your competitors' sites you can overlay your data with annotations based on what they changed, and suddenly the picture becomes a lot clearer. In the example above, we can be far more confident about the likely cause of the change, and we can examine it in more detail to see if we can replicate it (or do an improved version of the same thing).
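As a quick illustration of that overlay idea, here's a minimal sketch in Python using matplotlib. Every date, visibility figure and change note below is invented purely for illustration; you'd swap in your own visibility export and your log of observed competitor changes:

```python
# Sketch: overlay snapshot-based change annotations on a visibility series.
# All dates, visibility numbers and notes are dummy values for illustration.
import datetime
import matplotlib.pyplot as plt

weeks = [datetime.date(2012, 11, 5) + datetime.timedelta(weeks=i) for i in range(10)]
visibility = [100, 101, 99, 102, 100, 115, 116, 117, 115, 118]  # dummy data
changes = {datetime.date(2012, 12, 3): "Competitor added Schema.org markup"}

fig, ax = plt.subplots()
ax.plot(weeks, visibility, marker="o")
for day, note in changes.items():
    # Mark the date of each observed on-site change on the graph.
    ax.axvline(day, linestyle="--", color="grey")
    ax.annotate(note, xy=(day, max(visibility)), rotation=90,
                va="top", ha="right", fontsize=8)
ax.set_ylabel("SERP visibility (indexed)")
ax.set_title("Visibility with competitor-change annotations")
plt.show()
```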

I won’t labour the point with examples based on rankings data; hopefully you see why this might be useful.

What to Track

For some small sites it might be feasible to download the whole site, but more often than not that simply won't be practical. Our goal here isn't really to track the content across the whole site; it is more focused on code changes and IA updates.

My suggestion is to pick a handful of the ‘money’ pages that represent the main types of page on the site:

  • Homepage
  • Category pages (1-2)
  • Product pages (1-2)
  • Search page
  • Sitemap file(s)
  • Robots.txt file
The exact details of what you track will depend on the site, of course, but the aim is to get a decent sample of the makeup of its main pages. Keeping a simple manifest of the chosen URLs, like the sketch below, helps ensure each snapshot covers the same sample.
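For example, a manifest for an imaginary e-commerce competitor might look something like this (example.com and all of the paths are placeholders, not real recommendations):

```python
# Hypothetical snapshot manifest mirroring the page types listed above.
# example.com and every path here are placeholders for a real competitor site.
SNAPSHOT_URLS = [
    "http://www.example.com/",                     # homepage
    "http://www.example.com/widgets/",             # category page 1
    "http://www.example.com/gadgets/",             # category page 2
    "http://www.example.com/widgets/blue-widget",  # product page 1
    "http://www.example.com/gadgets/red-gadget",   # product page 2
    "http://www.example.com/search?q=widgets",     # search page
    "http://www.example.com/sitemap.xml",          # sitemap file
    "http://www.example.com/robots.txt",           # robots.txt
]
```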

How to Track Changes

OK, so now we know we want to download snapshots of pages that we can examine again at a later date. How do we go about that?

The most straightforward solution is simply to use the “Save as...” option available in most browsers nowadays. You’ll often be given a choice (except for sitemap files and robots.txt files) between saving just the HTML of the page, or the full page with its accompanying files (images, CSS, JS etc.):

[Image: browser “Save as...” dialog]

I recommend saving the complete page and archiving the snapshots by date and URL. Having just the HTML is useful, but having the dependent files allows you to load the page up again at a later date, and I find that ability makes spotting some changes easier. You can do this in Chrome, Firefox and even IE!

If you’re looking to be a bit more geeky, you can write your own scripts that use curl or similar command-line utilities to download the URLs you’ve selected, and then schedule these to run. A full walkthrough is a little outside the scope of this post, so I’ll mostly leave it as an exercise for the reader!
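That said, here is a minimal sketch of such a script in Python (rather than curl) to get you started. The manifest file name, folder layout and error handling are all my own assumptions, not a prescribed setup:

```python
"""Minimal snapshot sketch: download each URL in a plain-text manifest
and archive the raw HTML by date and URL. Manifest name ('urls.txt') and
folder layout ('snapshots/YYYY-MM-DD/') are assumptions, not requirements."""
import datetime
import os
import urllib.parse
import urllib.request

SNAPSHOT_DIR = "snapshots"


def safe_name(url):
    # Turn a URL into a filesystem-safe filename.
    return urllib.parse.quote(url, safe="") + ".html"


def snapshot(urls):
    day = datetime.date.today().isoformat()
    folder = os.path.join(SNAPSHOT_DIR, day)
    os.makedirs(folder, exist_ok=True)
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                body = resp.read()
        except Exception as err:
            print("failed: %s (%s)" % (url, err))
            continue
        with open(os.path.join(folder, safe_name(url)), "wb") as fh:
            fh.write(body)
        print("saved: %s" % url)


if __name__ == "__main__":
    # One URL per line, e.g. the manifest sketched earlier in the post.
    with open("urls.txt") as fh:
        snapshot([line.strip() for line in fh if line.strip()])
```

Scheduled via something like cron (say, weekly on a Monday morning), this quietly builds up an archive you can diff against later.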

If you’re interested in downloading the whole site then I recommend SiteSucker for Mac, which has worked really well for me and has various options for what types of files to download. On the Windows side I’ve not used anything myself, but my research turned up Fresh WebSuction, which looks to be approximately the same deal. If people have alternatives, I’d love to hear about them in the comments.

What Changes to Look For

This could be a whole blog post in and of itself, and it is also something that changes over time, so I’ll largely leave it to your imagination. Broadly speaking, though, any types of structured markup or microformats, meta changes, changes to internal linking, and changes to crawlability are a good place to start; hopefully others will have ideas to add in the comments.
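To make that concrete, here is a small sketch that diffs two archived snapshots of the same page and flags added or removed lines touching that kind of markup. The watch list and the snapshot paths are illustrative assumptions to adapt to your own archive:

```python
# Sketch: diff two dated snapshots of a page and flag interesting changes.
# The watch list and the snapshot paths below are illustrative assumptions.
import difflib

WATCH = ["schema.org", "<meta", 'rel="canonical"', 'rel="nofollow"',
         "noindex", "<a href"]


def diff_snapshots(before_path, after_path):
    with open(before_path, encoding="utf-8", errors="ignore") as fh:
        before = fh.readlines()
    with open(after_path, encoding="utf-8", errors="ignore") as fh:
        after = fh.readlines()
    for line in difflib.unified_diff(before, after, lineterm=""):
        # Only report added/removed lines, skipping the diff headers.
        if line.startswith(("+", "-")) and not line.startswith(("+++", "---")):
            flag = "  <-- watch" if any(w in line.lower() for w in WATCH) else ""
            print(line.rstrip() + flag)


diff_snapshots("snapshots/2012-12-01/homepage.html",
               "snapshots/2013-01-01/homepage.html")
```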

Wrap Up

I think observing your competitors in a structured and routine fashion absolutely makes sense, and it doesn’t need to be a big task. On the occasions that you find your competitors have changed something, or you observe movement in their rankings or search visibility, you are going to be very grateful you have the snapshots to refer back to. I’d like to see more tools out there that help to automate this (maybe there already are; I’ve only recently thought about trying to do this in a more structured fashion).

If you have ideas on how to automate this better, or other tools we could be using, I’d love to hear from you in the comments. Now go and download your competitors’ pages! :)

Tom Anthony

Joining Distilled as an SEO, Tom comes from a background in freelance web development. With a degree in Computer Science, a PhD in Artificial Intelligence (almost – he is still writing his thesis!) and having taught himself to program on a BBC...


10 Comments

  1. Good feedback on tracking competitors. I think this gets overlooked more often than not, since everyone is concerned with what their own site or their client's site is doing. I'll have to try Fresh WebSuction; thanks for the tip.

  2. Sounds interesting; I'm going to try out the Windows app to apply this in my job.

    I think this is how a good SEO must work: viewing the environment in a wider way, not only numbers and SERP rankings.

    Thanks

  3. Takeshi Young

    I don't think it will detect under-the-hood changes like semantic markup, but for onsite changes changedetection.com is a good site. If you forgot to take a snapshot, archive.org may work if your competitor's site is cached often enough (Google or Yandex cache for more recent changes).

  4. It has always been a challenge to understand which activity brings the most results for competitors. Your article has given me some new ideas.
    Nice article - but many spelling mistakes though :) need better editing.

  5. Tom, have you tried this Chrome "Page Monitor" extension?

    https://chrome.google.com/webstore/detail/page-monitor/pemhgklkefakciniebenbfclihhmmfcd

    This is one of the best extensions; we can monitor our competitors easily with it and see what changes they make on their websites.

  6. Alex

    This is something I have been doing for a few months now. I have a Python script that opens and downloads a PNG of all my competitors' websites at the same time each day. Makes for interesting viewing when trying to find what's caused a sudden jump in rankings.

  7. Iain

    I might be missing something here (not uncommon, I'm not particularly smart), but could you not try using archive.org to get the code for previous iterations of competitor sites? Obviously you are depending on them having the right version, so keeping your own data provides a greater guarantee, but only looking it up when you need it might be more efficient?

  8. Freaky timing. We've been building a tool that does something very similar to this on an automated basis (because I am really bad at doing anything regularly without getting bored). Automatically screenshots, saves source and highlights changes on target URLs. Currently checks individual page + robots etc.

    Love the idea of marking the changes against search visibility. Hadn't thought to do that - nicking that idea - thanks Tom :¬)

    Could probably cobble together a public beta if anyone is interested in taking a look (contact via link).

  9. There are some great Chrome plugins you can use to see all the "under the hood" markup in anyone's pages. (You can even do a diff and compare the resulting graphs.) More details and a guide are in this article:

    http://searchengineland.com/how-to-use-rich-snippets-semantic-markup-to-send-rich-signals-139886

  10. I have used http://www.httrack.com/ with success. It's not much to look at, but gets the job done. Great post btw. Thanks for the expansive outlook.

