Microsoft IIS SEO Toolkit: SEO tips and tricks

Install the Microsoft IIS SEO Toolkit

Bow before me for I am root

I recently went through the not inconsiderable pain of installing the Microsoft IIS SEO Toolkit. It’s probably easier if you already run IIS servers; since we are a LAMP shop, I had used Microsoft servers before but never installed one. I don’t quite know why this tool is a plugin to the server, since it is essentially a crawler: you don’t have to run it over sites on your own machine (or even sites hosted on MS servers). Whatever the ins and outs of that, it’s worth following this guy, who is the original developer.

The web installer is a really nice touch, but it doesn’t play nicely with user permissions on Windows 7. I managed to make it work by logging in as root before using the web installer (which meant setting up a bunch of browser settings in my superuser account).

Run the application as administrator

Even after doing that, I still need to run the application as administrator each time.

If you are used to the general terminology of the web and crawlers like Xenu, then you’ll find it pretty easy to get started, but this is a great walkthrough of the basics.

My intention with this post is not to walk through the features (I’ll leave that as an exercise for the interested reader) but rather to point out some cool specific SEO tasks:

Digging into errors

Status Code Summary

One of the first places I check out is the status code summary.

The nice feature here is the ability to drill down by double-clicking on statuses. I have found this to be one of the quickest ways of digging into structural issues across big sites. There would have been other ways of finding it, but this view quickly highlighted an issue on a recent client’s site where a large proportion of internal links had accidentally been changed to go via 301 redirects. On large sites, it can be hard to spot patterns like this without a scalable tool.
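The toolkit does this drill-down in its UI, but the idea is easy to reproduce over any crawl export. As a rough sketch (the URLs and status codes below are made up for illustration, not from a real crawl):

```python
from collections import defaultdict

def status_summary(crawl_records):
    """Group crawled URLs by HTTP status code, mimicking the
    toolkit's status-code summary and drill-down."""
    summary = defaultdict(list)
    for url, status in crawl_records:
        summary[status].append(url)
    return summary

# Hypothetical (url, status) pairs as a crawler might report them
records = [
    ("/", 200),
    ("/old-page", 301),
    ("/products/old-widget", 301),
    ("/missing", 404),
]
summary = status_summary(records)
print(sorted(summary[301]))  # the internal links hiding behind 301s
```

Scanning the 301 bucket like this is exactly how a pattern such as "most product links now redirect" jumps out.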


I have recently become a big fan of WebPagetest for creating management-friendly videos of sites loading side by side (see this corporate comparison, for example). This is only indicative, however: it doesn’t help you make wide-scale improvements to large sites. Your (client’s) developers are going to want to know the patterns - the kinds of pages that are generally slow, and so on.

In the ‘performance’ tab, you can select to see slow pages by directory and sort by ‘count’ descending:

Slow pages report
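The same grouping the report does can be sketched in a few lines of Python: bucket slow pages by their top-level directory and sort by count descending. The URLs, load times, and the 1,000 ms threshold below are all assumptions for illustration:

```python
from collections import Counter
from urllib.parse import urlparse

def slow_dirs(pages, threshold_ms=1000):
    """Count slow pages per top-level directory, most affected first."""
    counts = Counter()
    for url, load_ms in pages:
        if load_ms >= threshold_ms:
            segments = urlparse(url).path.strip("/").split("/")
            top = "/" + segments[0] if segments[0] else "/"
            counts[top] += 1
    # most_common() gives the 'sort by count descending' view
    return counts.most_common()

# Hypothetical crawl output: (url, load time in ms)
pages = [
    ("http://example.com/blog/post-1", 2400),
    ("http://example.com/blog/post-2", 1800),
    ("http://example.com/products/widget", 1500),
    ("http://example.com/", 300),
]
ranking = slow_dirs(pages)
```

A ranking like `[('/blog', 2), ('/products', 1)]` is the kind of pattern developers can actually act on, rather than a list of individual slow URLs.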

Click depth

Rather than bore you with more screenshots and detailed explanations of pivot tables, I thought I’d demonstrate with a quick screencast how to mash up the IIS crawl data to tell you how many pages you have at each ‘level’ of the crawl (this post from Rand explains why you would care about this kind of thing). I don’t know of any other good way of getting this data, so I hope this is of interest as a use for the IIS Toolkit. Note that the video was recorded in the Distilled office rather than a recording studio, so there is some background hum; hopefully it serves its purpose:
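Under the hood, ‘level’ is just breadth-first distance from the homepage over the internal-link graph. A minimal sketch of the same calculation, assuming a made-up link graph rather than real crawl data:

```python
from collections import Counter, deque

def click_depth(start, links):
    """Breadth-first search over an internal-link graph.
    Returns {url: minimum clicks from the start page}."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical internal-link graph: page -> pages it links to
links = {
    "/": ["/products", "/about"],
    "/products": ["/products/widget"],
}
depths = click_depth("/", links)
# Pages per level - the number the pivot-table mashup produces
levels = Counter(depths.values())
```

Here `levels` comes out as one page at depth 0, two at depth 1, and one at depth 2; on a real crawl export, the shape of that distribution is what tells you how deeply buried your content is.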

Will Critchlow


Will founded Distilled with Duncan in 2005. Since then, he has consulted with some of the world’s largest organisations and most famous websites, spoken at most major industry events and regularly appeared in local and national press. Will is part...



  1. Nice overview of the toolkit - I’ve been using the tool for around a year now, and it’s one of the best all-in-one tools around, in my opinion.

    Once you know how to create custom queries, the information you can pull from it is endless :)

  2. I really like this tool as well. One annoying issue that may crop up for some is that IIS and Apache don't seem to play well together. Not surprising. The IIS server starts up automatically by default and needs to be shut down by opening the IIS manager before Apache can run on the same machine.

  3. aj

    I just installed it to my desktop and run it from there all the time. No real need to put it on a server. It’s a great program; just be sure to check the advanced settings when running it, because by default it is set to crawl only a few thousand pages and, I think, 100 KB per page.

  4. I had used the old version and wasn’t too impressed... from what I see here, I should probably reevaluate it.

    By the way, I read SEOmoz all the time and have been here a couple of times... you always have great content, and this article has made you worthy of my RSS feeds. Congrats :)

    BTW, I’m @cartercole

  5. This looks awesome, guys. I’ve used the older one a few times, but it’s good to see they got all the kinks out of it. Also, I’m loving that T-shirt; I’ve got to get me one of those.

  6. I've just installed this. It will be an interesting tool to have in place over time. It will also be equally interesting as to how it will evolve over time; i.e. what sort of developments to maybe expect from Microsoft for the application.

  7. Digital Marketing Pro

    Awesome tool, and I use it extensively. However, I usually face problems working with reports when crawling large sites for clients (more than 1 million pages).

    The CSV export for such large sites does not include all the URLs that the tool finds. As a result, the data is incomplete and not useful. Does anyone know how to get around this?

