How to do a 5 Step Site Audit

Sometimes, it can be overwhelming and difficult to look at a new site and know where to start.

Do I go straight into the code? Do I navigate around? Should I go straight into their backlink profile? I JUST DON'T KNOW!

Although these are all important, they may not give you a high-level understanding of the site or a clear picture of how the project should proceed. To tackle that "deer-in-the-headlights" feeling, I've put together 5 simple steps that will help you identify some common and crucial mistakes. These steps are especially useful when you begin speaking with a potential client and they want to know how you can help.

So away we go...

1) Duplicate Content

Why does it matter? Duplicate content can be any site's downfall. When there is more than one copy of the same content, it can be difficult for search engines to distinguish between the original source and mere copies. This causes the search engines to present less relevant information and, more importantly for the site, leads to lower rankings and traffic.

How to check?

    - Copy a snippet of text
    - Paste the text into Google (or your search engine of choice :))
    - Analyze the results!

If you want to be sure there isn't duplicate content on your own domain (which you should definitely do!), combine the above method with a site: search. By doing this, your query will only return instances of duplicate text from that site.
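The steps above are easy to do by hand, but if you check snippets often, you can build the exact-match query URL programmatically. This is just a sketch: the snippet and domain below are hypothetical placeholders, and it simply constructs the search URL for you to open in a browser.

```python
from urllib.parse import quote_plus

def duplicate_check_url(snippet, domain=None):
    """Build a Google search URL for an exact-match text snippet,
    optionally restricted to one site with the site: operator."""
    query = f'"{snippet}"'
    if domain:
        query += f" site:{domain}"
    return "https://www.google.com/search?q=" + quote_plus(query)

# Hypothetical snippet and domain, for illustration only:
print(duplicate_check_url("our unique product description", "example.com"))
```

Open the resulting URL; if more than one result comes back for the site-restricted search, you have on-site duplication to investigate.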

2) Check Robots.txt

Why does it matter? Utilizing robots.txt can be super helpful when you're trying to block the search engines from crawling certain pages or folders. However, it can be just as detrimental if you do not implement it correctly. To be certain that a site isn't blocking itself (I've seen this... a few times!) or any of its own important content or pages, check the robots.txt file of the website.

How to check? Super easy! Taking Distilled as an example:
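Just load the file in your browser by appending /robots.txt to the domain (e.g. distilled.net/robots.txt). If you want to verify exactly which URLs a given crawler is blocked from, Python's standard library can parse the rules for you. Below is a sketch using a deliberately broken robots.txt, the classic "site blocking itself" mistake often left over from a staging environment; the example.com URL is a placeholder.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that accidentally blocks the entire site from all crawlers:
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is now locked out of every page, including the homepage:
print(parser.can_fetch("Googlebot", "https://www.example.com/"))
```

In a real audit you would fetch the live file from the site's /robots.txt and feed its lines to the parser the same way, then test the site's important URLs one by one.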

3) View the Cache Version of the page

Why does it matter? You've heard of cloaking, right?! Don't do it. By showing different content or URLs to Google, you're asking for a world of pain to be brought down upon your rankings and traffic. My advice: don't wake a sleeping bear (that's a saying, right?!)

This check is also useful because you can see the last time Google crawled a specific page. This is important because you want your site to be crawled as frequently as possible; a long lapse in time can indicate a possible penalty or crawling issues.

How to check? Again, super easy!
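You can type cache: followed by the URL straight into the Google search bar, or go to the cache URL directly. A tiny sketch of building that URL (the distilled.net page below is just an example path):

```python
def google_cache_url(page_url):
    """Build the direct URL for Google's cached copy of a page.
    Equivalent to searching cache:page_url in Google."""
    return "http://webcache.googleusercontent.com/search?q=cache:" + page_url

print(google_cache_url("www.distilled.net/blog/"))
```

The cached page shows a banner with the date and time Google last retrieved it, which is the crawl-recency signal described above.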

4) Canonicalization

Why does it matter? Just like duplicate content, having multiple versions of the same content across multiple locations can decrease your rankings, spread out your link equity, or more simply devalue your site.

How to check? Easy as pie. Check that all the common versions of the site redirect to the canonical version:

If the page loads the same content without redirecting then you've got some canonical issues, my friend. Don't get too panicked though, this is a basic mistake that can be easily rectified.
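As a checklist, these are the variants worth testing for any domain. The sketch below (example.com is a placeholder) just enumerates them; to audit a real site you would fetch each one, e.g. with urllib.request.urlopen(variant).geturl(), and confirm the final URL after redirects matches the canonical version.

```python
def common_variants(domain):
    """Return the usual URL variations that should all redirect
    to the single canonical version of the homepage."""
    return [
        f"http://{domain}/",
        f"http://www.{domain}/",
        f"https://{domain}/",
        f"https://www.{domain}/",
        f"http://{domain}/index.html",
    ]

for url in common_variants("example.com"):
    print(url)
```

If any of these loads the same content at its own address instead of redirecting, that's the canonicalization problem described above.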

5) Top Page Report

Why does it matter? You can often find missed opportunities by looking at the top pages report in Open Site Explorer (OSE). By doing this, you can see which pages have the most links pointing to them, as well as their response codes. For a site's top pages, you want to ensure that they are in fact the site's "top pages" (AKA most important) and that they return the right response code. If a page returns anything but a 200 or a 301, it might not be passing valuable link juice.

How to check? Input the site into OSE and analyze the top pages, linking root domains, and HTTP status codes.
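Once you've exported the top pages, flagging the problem URLs is a one-liner. A sketch applying the 200/301 rule of thumb from above; the URLs and status codes are a made-up stand-in for an OSE export:

```python
def passes_link_equity(status_code):
    """Rule of thumb from above: only 200s and 301s are likely to
    pass link equity; 302s, 404s, 500s, etc. waste those links."""
    return status_code in (200, 301)

# Hypothetical top-pages export: (URL, HTTP status code) pairs.
top_pages = [
    ("https://example.com/", 200),
    ("https://example.com/old-product", 404),
    ("https://example.com/promo", 302),
]

problems = [url for url, code in top_pages if not passes_link_equity(code)]
print(problems)
```

Any URL in the problems list has links pointing at it whose value is likely being lost, so those are the first candidates for a redirect or fix.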

Get to it!

So get to it! Hope this helps and remember, it's as simple as pie. Specifically, pecan pie. mmmm :)
