I recently went through the not inconsiderable pain of installing the Microsoft IIS SEO Toolkit. It’s probably easier if you already run IIS servers; since we are a LAMP shop, I had used Microsoft servers before but never installed one. I don’t quite know why this tool is a plugin to the server when it is essentially a crawler - you don’t have to run it over sites on your own machine (or even over sites hosted on MS servers). Whatever the ins and outs of that, it’s worth following this guy, the original developer.
The web installer is a really nice touch, but it doesn’t play nicely with user permissions on Windows 7. I managed to make it work by logging in as an administrator before using the web installer (which meant setting up a bunch of browser settings in my administrator account).
Even after doing that, I still need to run the application as administrator each time.
My intention with this post is not to walk through the features (I’ll leave that as an exercise for the interested reader) but rather to highlight a few specific SEO tasks the toolkit handles well:
## Digging into errors
One of the first places I check out is the status code summary.
The nice feature here is the ability to drill down by double-clicking on statuses. I have found this to be one of the quickest ways of digging into structural issues across big sites. It quickly highlighted an issue on a recent client’s site, where a large proportion of their internal links had accidentally been changed to go via 301 redirects. There would have been other ways of finding that, but on large sites it can be hard to spot patterns like this without a scalable tool.
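For developers who want to run the same check over their own crawl export, the pattern is easy to sketch. This is a hypothetical example, not toolkit code: it assumes you have rows of (URL, status code, redirect target) from any crawl, and tallies which targets are being reached via 301s from internal links.

```python
# Hypothetical sketch: tally internal links that resolve via 301 redirects,
# given (url, status, redirect_target) rows from a crawl export.
# The crawl_rows data below is invented for illustration.
from collections import Counter
from urllib.parse import urlparse

def redirected_internal_links(crawl_rows, site_host):
    """Return a Counter of redirect targets reached via 301 from internal URLs."""
    hits = Counter()
    for url, status, target in crawl_rows:
        if status == 301 and urlparse(url).netloc == site_host:
            hits[target] += 1
    return hits

crawl_rows = [
    ("http://example.com/old-page", 301, "http://example.com/new-page"),
    ("http://example.com/ok-page", 200, None),
    ("http://example.com/old-cat", 301, "http://example.com/new-cat"),
    ("http://example.com/old-page", 301, "http://example.com/new-page"),
]

for target, count in redirected_internal_links(crawl_rows, "example.com").most_common():
    print(target, count)
```

A high count against a single target is usually the smoking gun: some template change has rewritten a whole class of internal links to the old URL.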
I have recently become a big fan of webpagetest for creating management-friendly videos of sites loading side by side (see this corporate comparison for example). That is only indicative, however; it doesn’t help you make wide-scale improvements to large sites. Your (client’s) developers are going to want to know the patterns - which kinds of pages are generally slow, and so on.
In the ‘performance’ tab, you can select to see slow pages by directory and sort by ‘count’ descending:
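The same roll-up is easy to reproduce outside the toolkit if you have page load times from any crawl. This is a sketch under made-up data, assuming a list of (URL, load time in seconds) pairs and an arbitrary one-second "slow" threshold:

```python
# Hypothetical sketch: count slow pages per top-level directory,
# sorted with the most-affected directory first.
# The pages list and the 1.0s threshold are invented for illustration.
from collections import Counter
from urllib.parse import urlparse

def slow_pages_by_directory(pages, threshold=1.0):
    """Return [(directory, slow_page_count), ...] in descending count order."""
    counts = Counter()
    for url, seconds in pages:
        if seconds > threshold:
            path = urlparse(url).path
            directory = "/" + path.lstrip("/").split("/")[0]
            counts[directory] += 1
    return counts.most_common()

pages = [
    ("http://example.com/blog/a", 2.3),
    ("http://example.com/blog/b", 1.8),
    ("http://example.com/shop/x", 0.4),
    ("http://example.com/shop/y", 1.2),
]
print(slow_pages_by_directory(pages))  # [('/blog', 2), ('/shop', 1)]
```

Grouping by directory is a crude but effective proxy for "page template", which is usually what the developers actually need to fix.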
## Click depth
Rather than bore you with more screenshots and detailed explanations of pivot tables, I thought I’d demonstrate with a quick screencast how to mash up the IIS crawl data to tell you how many pages you have at each ‘level’ of the crawl (this post from Rand explains why you would care about this kind of thing). I don’t know of any other good ways of getting this data, so I hope this is of interest as a use for the IIS toolkit. Note that the video was recorded in the Distilled office rather than a recording studio, so there is some background hum; hopefully it serves its purpose:
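If you would rather script the click-depth calculation than pivot it by hand, the underlying idea is just a breadth-first walk over the internal link graph, counting pages at each depth from the homepage. The link graph below is invented for illustration, not real crawl output:

```python
# Hypothetical sketch: pages per crawl 'level' via breadth-first search
# from the homepage over an internal link graph.
# The links mapping is made-up sample data.
from collections import Counter, deque

def pages_per_level(links, start):
    """Return {depth: page_count} for pages reachable from `start`."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:          # first visit = shortest click depth
                depth[target] = depth[page] + 1
                queue.append(target)
    return dict(Counter(depth.values()))

links = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/about"],
}
print(pages_per_level(links, "/"))  # {0: 1, 1: 2, 2: 2}
```

Because BFS visits each page first along a shortest path, the depth recorded for each page is its minimum click depth, which is the number you care about for crawlability.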