The Latest 5 Tools I’ve Added to my SEO Toolbox

I’ve written a few posts now on tools: there was the one on Doing a Site Audit Using Google Webmaster Tools, the 8 Alternative Ways To Use Screaming Frog for SEO and most recently one on Simple Tools to Get More Done. But recently I’ve been using some tools that don’t really fit into the topics I’ve covered previously, so I thought I’d write this post.

Chart Intelligence Plugin

Credit for finding this tool goes to Bridget Randolph, who came across it while checking if a client had been hit by any of the Panda updates. Normally, I just refer to the SEOmoz Google Algorithm Change History page, but there have been so many updates that I find it annoying to jump back and forth between windows to check the dates. This plugin means I no longer need to do that. It’s a plugin for Chrome and can be found here. It overlays Google updates onto Google Analytics data; below you can see example data with the Panda updates highlighted.

chart-intelligence

The arrow shows the icon that appears in GA when this plugin is installed. Clicking on it brings up various options to overlay, the most useful being the Google updates, but it can also be used to show public holidays or even your own private data if you’re willing to do some customisation.

chart-intelligence-2
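If you’d rather do the same kind of overlay outside the browser, the idea is simple enough to sketch in a few lines of Python. This isn’t the plugin itself, just an illustration: the CSV export, column names and update dates below are placeholder examples you’d swap for your own Analytics data and the dates from the algorithm change history page.

```python
# Illustrative sketch only - not part of the Chart Intelligence plugin.
# Assumes a CSV exported from Google Analytics with "Day" and "Visits" columns;
# the column names and update dates are hypothetical examples.
import pandas as pd
import matplotlib.pyplot as plt

PANDA_UPDATES = ["2011-02-24", "2011-04-11", "2011-10-14"]  # example dates only

traffic = pd.read_csv("analytics_export.csv", parse_dates=["Day"])
ax = traffic.plot(x="Day", y="Visits", figsize=(10, 4), legend=False)

# Overlay each update as a vertical line so traffic drops line up with update dates
for update in pd.to_datetime(PANDA_UPDATES):
    ax.axvline(update, color="red", linestyle="--", alpha=0.7)

ax.set_ylabel("Visits")
plt.tight_layout()
plt.show()
```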

Link Research Tools

If, after using something like the Chart Intelligence plugin, you find out that your site has been hit by a penalty (for example, the Penguin update), deciding which links to remove can be a painful and slow process. To help with this, lately I’ve been using the Link Research Tools suite. They have too many tools to explain in a single blog post, but I’ve been using one in particular: the Link Detox tool.

Cleaning up a bad link profile is a very labour-intensive task. Link Detox helps scale this by letting you upload a list of backlinks that point to your site and telling you which are potentially harmful.

They use a variety of methods to do this, the most basic being things like checking whether a domain is indexed and whether it has PageRank. Links are then classified as either toxic (these should be removed), suspicious (these could potentially be dangerous) or healthy (no action is needed). The output of the tool is partially shown below.

Link-Detox-Summary

No link-classifying tool will be perfect, so always have a quick look at the results to make sure there are no obvious mistakes. As a general rule, I normally find the toxic links are accurate enough, but I’ll still always give them a once-over to check for glaring errors: better to be safe than sorry.
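Link Detox’s scoring is proprietary, so the following is only a rough sketch of the kind of rules described above. The function, thresholds and example domains are all made up for illustration, and it assumes you’ve already gathered the indexation status and toolbar PageRank for each linking domain.

```python
# Rough illustration of the classification idea described above - not
# Link Detox's actual algorithm. Assumes indexation and PageRank data
# have already been collected; all example domains are made up.
def classify_link(is_indexed, pagerank):
    """Bucket a linking domain as toxic, suspicious or healthy."""
    if not is_indexed:
        return "toxic"       # a de-indexed domain is the clearest red flag
    if not pagerank:
        return "suspicious"  # indexed but no PageRank: worth a manual look
    return "healthy"         # no action needed

# Example usage with made-up data: domain -> (is_indexed, pagerank)
backlinks = {
    "spammy-directory.example": (False, None),
    "thin-blog.example": (True, 0),
    "industry-news.example": (True, 4),
}

for domain, (indexed, pr) in backlinks.items():
    print(domain, "->", classify_link(indexed, pr))
```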

Rmoov

Keeping with the link penalty theme, the next step in the process would be to remove as many as possible of the links that were categorised as toxic, and potentially the suspicious ones too. Again, this is a very manual and time-consuming process, so I’ll always welcome any tools that can help automate or speed that up. Thanks to Cyrus Shepard for writing a blog post on tools for easy link cleanup which brought Rmoov to my attention.

Rmoov

Rmoov is a tool that helps manage the entire link clean-up and reinclusion process. From emailing the websites, to keeping records of the responses that you get (or don’t get), to finally submitting the reinclusion request, it does almost all of this automatically. When you sign up, you are guided through a setup process where you add the list of links that you want removed. It then tries to find contact details for the sites and tells you which ones it can’t find. The final step is adding your email details to allow the emails to be sent and received from within the tool. Once all that is set up, the campaign can be started.

One of the features I like about this tool is the template emails. Let’s be honest, the initial batch of emails is unlikely to get many responses, so you don’t want to waste too much time crafting individual emails for each site. The templates allow you to get up and running as quickly as possible. If no response is received within the time you set (for example, I set it to three days), the tool will automatically send a follow-up email for you. The result is probably something like this:

  • A large percentage of broken email addresses or sites that won’t reply
  • A small percentage of sites that agree to remove the links
  • A small percentage of sites that will remove the links for a fee
All of this information, along with notes on who you emailed and when, is saved in a Google Doc which you can then include as a reference in your reinclusion request.

Note: don’t ever pay someone to remove a link. Just mention in the reinclusion request that they wanted to charge you, and add them to the disavow tool; that’s what it’s for.
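Rmoov handles the sending, follow-ups and record keeping for you, but to illustrate the templated outreach idea, here’s a minimal Python sketch of the same approach. This is not how Rmoov works internally; the SMTP server, credentials, addresses and URLs are all placeholders, and a real campaign needs the tracking and follow-up logic the tool provides.

```python
# Illustration of templated link-removal outreach - not Rmoov's implementation.
# SMTP details, addresses and URLs are placeholders.
import smtplib
from email.message import EmailMessage

TEMPLATE = """Hello,

We are cleaning up our backlink profile and noticed a link to our site
on {page}. Could you please remove it?

Thanks,
{sender_name}
"""

contacts = [
    {"email": "webmaster@example.com", "page": "http://example.com/links.html"},
]

with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()
    smtp.login("you@example.com", "app-password")
    for contact in contacts:
        msg = EmailMessage()
        msg["Subject"] = "Link removal request"
        msg["From"] = "you@example.com"
        msg["To"] = contact["email"]
        msg.set_content(TEMPLATE.format(page=contact["page"], sender_name="Craig"))
        smtp.send_message(msg)
```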

Optimizely

Optimizely is my weapon of choice when it comes to conversion rate optimisation. It has a lot of the same features as many other CRO tools out there, with one big difference: I don’t need to ask a developer to make the changes.

Optimizely makes running CRO experiments easy by using a WYSIWYG interface. Once you’ve added a single line of code sitewide, you can create and apply changes very easily. I’ve been using it with a lot of success recently and I highly recommend it. For more details and to see it in action, see the video below:

Deep Crawl

When it comes to crawling websites, I’m a big fan of Screaming Frog, but the one downside is that I’m limited by the performance of my computer: crawling more than 100k pages is challenging. This is where I turn to DeepCrawl. It’s worth noting that DeepCrawl is very much an enterprise tool, so comparing it to Screaming Frog isn’t really fair.

DeepCrawl is an enterprise-level, cloud-based crawler that allows you to crawl into the millions of pages without using your computer’s processing power or even your office IP. The interface is very easy to use, and what I love about it is the ability to share problems with external developers or other members of staff. For example, if I have a list of 404 pages, a developer or client needs to know where they are being internally linked from in order to fix them. With DeepCrawl I can simply copy and paste the link to that report, and other people can not only see the 404 pages but also click through and find out where those pages are being linked from, as well as other information. I love it.

The image below shows the dashboard style report that you get after a crawl.

deep-crawl
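To make the “404s with their internal link sources” idea concrete, here’s a toy Python sketch of the same kind of report using the requests and beautifulsoup4 packages. It is nothing like a production crawler and is not how DeepCrawl works internally; the seed pages and domain are placeholders, and it only checks a handful of pages.

```python
# Toy sketch of a "404s with internal link sources" report.
# DeepCrawl does this at enterprise scale; this only checks a small seed list.
# All URLs are placeholders.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin
from collections import defaultdict

seed_pages = ["http://www.example.com/", "http://www.example.com/about/"]
linked_from = defaultdict(set)   # target URL -> set of pages linking to it

for page in seed_pages:
    html = requests.get(page, timeout=10).text
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        target = urljoin(page, a["href"])
        if target.startswith("http://www.example.com"):
            linked_from[target].add(page)

# Report any internal links that now return a 404, and where they are linked from
for target, sources in linked_from.items():
    status = requests.head(target, allow_redirects=True, timeout=10).status_code
    if status == 404:
        print(target, "is a 404, linked from:", ", ".join(sources))
```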

ScrapeBox

scrapebox1

I briefly mentioned ScrapeBox in my last blog post. As I said, ScrapeBox gets a bad reputation as a spamming tool, and yes, there are features that allow some pretty spammy tactics, but I only use a few of its features and find it a massive time saver and an excellent tool for diagnosing indexation and penalty problems for my clients. I’ve outlined the features that I use it for below:

1 - Checking if Toxic Links are Still Alive

Before, during and after a link removal project, you’ll want to check whether links have in fact been removed or are still live. Doing this manually isn’t a good use of your time, but it needs to be done. ScrapeBox has a plugin called “Do Follow Test”. It allows you to upload a text file with the URLs of the links you want to check. You then add your domain and press start. It’ll then go and check each of the URLs for links to your client and tell you whether they’re still there, and also whether or not the links are nofollowed. You can see an example of this below.

scrapebox
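Roughly speaking, the check boils down to fetching each page, looking for a link to your domain and inspecting the rel attribute. The Python sketch below is only an illustration of that idea, not the ScrapeBox plugin; the domain and URLs are placeholders, and it needs the requests and beautifulsoup4 packages.

```python
# Rough sketch of a "do follow test": for each page, is there still a link
# to our domain, and is it nofollowed? Not ScrapeBox's implementation;
# the domain and URLs are placeholders.
import requests
from bs4 import BeautifulSoup

OUR_DOMAIN = "ourclient.example"
pages_to_check = ["http://spammy-directory.example/page1.html"]

for page in pages_to_check:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    links = [a for a in soup.find_all("a", href=True) if OUR_DOMAIN in a["href"]]
    if not links:
        print(page, "-> link removed")
    else:
        nofollowed = all("nofollow" in (a.get("rel") or []) for a in links)
        print(page, "-> link still live,", "nofollowed" if nofollowed else "followed")
```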

2 - Indexation and PageRank Checker

As I mentioned earlier in the Link Research Tools section, two of the most basic checks you can do to see whether a link is toxic are whether the domain is indexed and whether the home page has any PageRank. It’s rarely as black and white as that, but it’s a good place to start. ScrapeBox has a feature to check both very quickly.

scrapebox2

Just paste a list of URLs that you want to check into the URLs box and press either of the buttons shown on the right of the image above. The full list can then be exported.
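The indexation check is essentially a site: query for each URL; ScrapeBox runs those queries for you, with the delays and proxies that automated querying needs. As a small illustration only, the Python sketch below just builds the query URLs for a list of pages so you can see what is being checked; the example URLs are placeholders.

```python
# Build site: query URLs for a list of pages - an illustration of what an
# indexation check looks like, not a replacement for ScrapeBox's checker.
# The example URLs are placeholders.
from urllib.parse import quote_plus

urls = [
    "http://example.com/category/widgets/",
    "http://example.com/old-page/",
]

for url in urls:
    query = f"site:{url}"
    print(f"https://www.google.com/search?q={quote_plus(query)}")
```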

3 - Sitemap Scraper

The final thing I use ScrapeBox for is the sitemap scraping plugin. Using this in conjunction with the index checker allows me to find navigational and indexation problems with my clients’ sites. To use it, just paste the location of the sitemap into the URLs box, then download the sitemap scraper plugin from the add-on menu. When you open it, there will be an “import/export” option in the bottom right corner; click it and select import from ScrapeBox. This loads the sitemap URL that you just added and you can then hit start. The result will be something like the screenshot below:

scrapebox3

These can then be exported to a text document or, as in my case, used to check whether those pages are all indexed. To do this, I press “show extracted links”, click inside the box, select all and copy the full list of URLs.

I can then go back to ScrapeBox, paste the full list into the URL box again and hit “check indexed”. I’ll then get a list of all the pages in the sitemap, showing which are indexed and which are not, as shown below.

index-checker
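Outside ScrapeBox, the sitemap extraction step is also easy to script. The Python sketch below pulls the URLs out of an XML sitemap so they can then be fed into whatever index check you use; the sitemap location is a placeholder and nested sitemap index files aren’t handled.

```python
# Pull URLs out of an XML sitemap - the same job the ScrapeBox plugin does.
# The sitemap location is a placeholder; nested sitemap indexes aren't handled.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

print(len(urls), "URLs found in the sitemap")
for url in urls[:10]:
    print(url)
```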

These are just some of the tools I’ve been using recently. If you’ve got suggestions for other tools that I’ve not heard about, please feel free to add them to the comments. Also, I’ve written a similar post on SEOmoz about some single-purpose tools and tools to make you work more efficiently; you can read that here. Thanks for reading and don’t forget to comment.

About the author
Craig Bradford


Craig moved from Glasgow, Scotland to join Distilled in 2011 as an SEO analyst. Since joining, he has consulted with a range of clients, from start-ups to some of the biggest brands in the world. Specialist areas include technical SEO, analytics and...