Richard Baxter - Information & Site Architecture - a4uexpo Live Blogging

Right, apologies for not covering the pre-lunch session with Will (beyond a bit on Twitter); covering these sessions was starting to take its toll, but by popular demand I’ll cover another one here, and Will should be covering one in the next timeslot.

Rich Baxter of SEOGadget kicks us off, drawing on his experience with massive sites in the travel industry to help us understand the best way to spread link equity across a site, and how to get information architecture (IA) and internal linking right on sites of all sizes.

Rich started out by giving us a couple of examples of “not so good site architecture”. These sites are typically light at the top (only a few pages linked from the top level) and several clicks away from the money (category and product pages).

This all comes down to the ability of search engines and users to get through your site to the most important/valuable pages as quickly as possible, in as few clicks as possible. The aim is to flow authority down the site as fluidly and easily as possible. Any links a page attracts pass this equity down, and it is our job to pass that authority down in the easiest way possible. If your lowest level of content (voucher code or product pages) sits in this type of link structure, it will starve for links, visits and conversions! This is an area where you need to compensate if you’ve not got it right.

Start with Your Homepage

The right way to start is to focus on your homepage/the landing page. Bad example (with high potential): BA, which serves a 302 redirect if you are a crawler that can’t read JavaScript (i.e. Google). This is a terrible idea.

Flattening Site Architecture

General rules: no more than three clicks, and certainly no more than four, to get to the base layer of content. Three clicks is certainly to be preferred.

Tricks & tips for achieving this: Rich used vouchercodes.co.uk as an example of doing it well, planning for multiple paths of entry to deeper pages.

Keyphrase Research Before Information Architecture

Rich used “flights” as a broad term and “flights to new york” as a destination term. The best place to start is by thinking about keyphrases first, then using that data to decide what information you need. This is Rich’s preferred approach (keyphrase research first) because it keeps you from being influenced by architecture when going after keyphrases.

(Sorry- As a quick aside it is also worth mentioning at this point that Rich will be unveiling his advanced keyphrase research at PRO seminar)

Identify Keyphrase Groups:

Rich used an example from the travel industry where you would be looking at Generic, Destination and Route groups. Identifying these groups (covering “flights”, “flights to new york” and “flights from Portland, ME to New York”) tells you what content you’re going to need before you’ve even begun to figure out how these pages link together and where they sit. The name of the game is knowing the content you need first, then figuring out how it all fits together.

Similarly, for a Retail site he suggested: generic (gifts, gift ideas), category (home and garden), subcategory (bathroom gifts), Products (candy floss maker). Mapping keyphrases is great.
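As a rough sketch of the grouping step above, here is one way you might bucket keyphrases into groups before drawing any architecture at all. The patterns and group names are illustrative assumptions on my part, not something Rich showed:

```python
import re

def keyphrase_group(phrase):
    """Bucket a flights keyphrase as generic, destination, or route."""
    if re.search(r"\bfrom\b.+\bto\b", phrase):
        return "route"          # e.g. "flights from portland, me to new york"
    if re.search(r"\bto\b", phrase):
        return "destination"    # e.g. "flights to new york"
    return "generic"            # e.g. "flights"

# Group the session's example phrases into the content buckets we'll need.
pages = {}
for kw in ["flights",
           "flights to new york",
           "flights from portland, me to new york"]:
    pages.setdefault(keyphrase_group(kw), []).append(kw)
```

The point is simply that the keyphrase data drives the page inventory; linking comes later.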

Rich recommends using Visio to draw out each page and arrange them. You don’t even need to figure out how to link them together yet; just think about how they would fit together. At this point we’re seriously beginning to get an idea of what these categories/sections look like.

Next: how to link it together.

Navigation and Crosslinks

Planning these elements out can be huge and is an absolutely essential piece to the success of any good architecture: it can’t just be about a few pages, a few categories, and a clear set of term silos.

CSS-styled DHTML (a cross-browser, validating, cascading drop-down menu) is the right way to make sure the navigation is search engine friendly. Technology is key to making sure this universal navigation fits across the site and, perhaps most importantly, that its links can be used by search engines and users alike (irrespective of whether they can use JavaScript, Ajax, etc.). Basically, if links can’t be seen without JavaScript you are doing it wrong, especially links to the category pages, because this is how you link to these SUPER important pages across the whole site.
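A minimal sketch of what this looks like in markup (the URLs and category names are invented for illustration): the category links are ordinary `<a>` elements in the HTML, so crawlers see them with JavaScript off, and only the drop-down behaviour is layered on with CSS.

```html
<!-- Crawlable drop-down sketch: every link exists as plain markup;
     only the show/hide behaviour depends on CSS (:hover). -->
<nav>
  <ul class="main-nav">
    <li><a href="/gifts/">Gifts</a>
      <ul class="drop">
        <li><a href="/gifts/home-and-garden/">Home &amp; Garden</a></li>
        <li><a href="/gifts/bathroom/">Bathroom Gifts</a></li>
      </ul>
    </li>
  </ul>
</nav>
<style>
  .main-nav .drop { display: none; }
  .main-nav li:hover > .drop { display: block; }
</style>
```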

Importance of Internal Links

Each page in your IA (information architecture) is going to attract a certain number of links. It is important that some of these links go deeper into the site rather than just to the home page.

Potential problem: the more we link down based purely on a clear hierarchy (e.g. just a global navigation), the more we create silos within which link equity will not be spread. This is an inferior way of spreading link equity and getting the most out of your pages. Navigational enhancements (beyond global navigation) can help spread this equity more fairly.

Crosslinking Ideas

What strategies can we use? Think about how users are navigating around your website for starters!

Once we’ve got the hierarchy done we need to figure out ways to help throw links around both for search engines, but perhaps more importantly for the users. Bearing the user in mind and what they were looking for is the smartest way to address these issues.

Come up with modules that reflect popularity and behaviour on your pages (recent products, top products, “users who viewed this product also viewed…”, etc.).
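A “users who viewed this also viewed” block can be derived from nothing more than per-session view logs. This is a hedged sketch of one way to do it; the session data and product slugs are invented:

```python
from collections import Counter
from itertools import combinations

# Invented view logs: each inner list is one user session.
sessions = [
    ["candy-floss-maker", "popcorn-maker", "chocolate-fountain"],
    ["candy-floss-maker", "popcorn-maker"],
    ["popcorn-maker", "chocolate-fountain"],
]

# Count how often each pair of products appears in the same session.
co_views = Counter()
for viewed in sessions:
    for a, b in combinations(sorted(set(viewed)), 2):
        co_views[(a, b)] += 1

def also_viewed(product, top_n=2):
    """Products most often seen in the same session as `product`."""
    scores = Counter()
    for (a, b), n in co_views.items():
        if a == product:
            scores[b] += n
        elif b == product:
            scores[a] += n
    return [p for p, _ in scores.most_common(top_n)]
```

Each recommendation is also an internal link, which is exactly the kind of behaviour-driven crosslinking the session advocates.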

Pick “popular review categories” (or popular tags, or whatever is most relevant to your site). This could be another smart way to get other seemingly unrelated pages indexed fairly well.

One thing that works well:

Behind a “show all” link you can add in more links to those pages (dynamic HTML pop-ups). This can create a better user experience and put more links on a page without harming the usability of the site.

Onpage SEO

This is an extremely important part of the architecture and in getting the most pages indexed (one of the key aims of any information architecture). Rich mentioned the need to be very careful with duplicate content and the amount of this in the affiliate space.

User Generated Content can be huge in helping you create unique content for a page, you just need to be a bit clever about using it higher up than just on the product page.

Stop Using Boilerplate Text

This never should have been a powerful strategy, but it was once. It isn’t now. You should be doing more with your site than “Looking for [category] in [location]...”. Find an outsourcer (Rich mentioned oDesk) to get someone to take these and rewrite them. Boilerplate will work to an extent, but it’s not going to have anywhere near the impact of carefully crafted individual copy.

Case study: pulled in UGC (out of JavaScript) and reworked some content on top category pages. The result was a serious increase in the number of keyphrases sending traffic and in overall traffic from organic search.

Category Page Tips

Rich reckons these pages are perhaps the most important, and this was the most thorough dissection of a category page I’ve ever seen (which is extremely helpful for a certain project I am currently working on).

How can you make a category page better and improve crosslinking?

Rich mentioned Visio again as being great for creating mock-up sites. Here are the top areas he addressed as ways to make a better category page.

Get your breadcrumbs right. You don’t just need ONE word; you can create some unique information/text here in your content manager.

Unique content is the name of the game here and there are a number of ways to do this. Along the same lines, unique descriptions/text on this page are essential.


Pagination (use rel=canonical and nofollow both wisely and carefully); messing this up can cause huge issues. Rich suggested you really should not use rel=canonical if the page is unique enough, even in terms of the product listings on the next page. He would generally advise against canonicalising page 2 of a category back to page 1; a far better way to deal with this is to create unique content on each of these pages, as people often mess this up.
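To make the advice above concrete, here is a small sketch of self-referencing canonicals for paginated category pages, i.e. page 2 canonicalises to itself rather than to page 1. The URL shape (`?page=N`) is an assumption for illustration:

```python
def canonical_for(category_url, page):
    """Self-referencing canonical URL for a paginated category page.

    Page 2+ points at itself, not back at page 1, matching the advice
    against canonicalising deeper pages into the first page.
    """
    if page <= 1:
        return category_url
    return f"{category_url}?page={page}"

def canonical_tag(category_url, page):
    """The <link> element a template would emit in <head>."""
    return f'<link rel="canonical" href="{canonical_for(category_url, page)}">'
```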


UGC/hReview/microformats: if you’ve got some information further down (on the product page), why not bring some of it up to the category page? This can count as unique content, as long as it has been reworded/summarised sufficiently.


Cross-link categories: spread links more generously and make them as relevant to the user as possible; think carefully about what they might be looking for on this site. Rich also mentioned related categories. This doesn’t all need to follow the hierarchy: you can relate topics at this level to prevent some of the silo effect we discussed earlier, even if you wouldn’t expect them in the same hierarchy (e.g. on a christening gifts page, add toddler and infant gifts; or a car model-and-year category page should also link to parts for that particular make and model).


Review content: encourage users to leave a review. REWARD people with competitions, the occasional voucher, a top user/reviewer list, etc. This activity usually happens down at the lowest level, so why not syndicate it up onto a category page? hReview-aggregate adds some uniqueness to these category pages.
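As a hedged sketch of what that syndication might emit, here is a function producing hReview-aggregate markup for a category page; the item name and figures are invented, and the class names follow the hReview-aggregate microformat:

```python
def hreview_aggregate(item, average, count):
    """Build an hReview-aggregate snippet summarising product reviews.

    `item`, `average` and `count` would come from reviews pulled up
    from the product pages in this category.
    """
    return (
        '<div class="hreview-aggregate">'
        f'<span class="item"><span class="fn">{item}</span></span> '
        f'rated <span class="rating"><span class="average">{average}</span>/5</span> '
        f'based on <span class="count">{count}</span> reviews'
        '</div>'
    )
```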


Use internal site search to find the most popular searches that result in a page view, or that have the highest likelihood to convert, etc.

These category pages tend to have the most potential of any page so ignore them at your own peril!

Supercharged UGC

Rich is a strong believer in microformats. If you have people willing to get involved and leave reviews, you should get them to review. Rich snippets are the outcome and having these star ratings can help with conversion, CTR and display within the SERPs.

Tips:

Pull through content and aggregate it onto category and home pages (don’t duplicate, but use these elsewhere)

Expired pages? Keep the UGC from any of these pages and use it elsewhere; it doesn’t have to disappear just because the product has. Along these lines Rich also suggested limiting the number of reviews you show on any one page so you can use the rest elsewhere.

Technical Issues

There will often be technical issues around creating a good information architecture. Issues:

Duplicate content

Rel=canonical is great if used right, but having more unique content is even better, especially if you know how to use noindex properly. Identify the duplicate content on your site (e.g. site:example.com inurl:product1); you can decide what to do with it later.
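Beyond search-operator spot checks, duplicate content can be flagged programmatically. This is an illustrative sketch (not from the talk) comparing pages by word shingles with Jaccard similarity; the 0.8 threshold is an assumption:

```python
def shingles(text, k=3):
    """The set of k-word shingles in a page's body text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def is_near_duplicate(a, b, threshold=0.8):
    """Flag two pages as near-duplicates above a similarity threshold."""
    return jaccard(a, b) >= threshold
```

Run this pairwise over templated pages (boilerplate descriptions are exactly where it fires) to build the list of pages that need rewriting, noindexing, or canonicalising.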

Tools for duplicate content (new tools!):
SEOmoz pro toolset can do this and catch other errors on your site.

Webmaster tools can warn you about duplicate meta descriptions and titles

IIS SEO Toolkit (gets another shout!): not the easiest thing to install, particularly if you are a new user, and Rich has written a guide about this. It is well worth learning and running against your site, as it will give you a heads up on anything it thinks is wrong. It will help you catch crippling things early on and let you know where you need to start.

For click depth, Rich suggests Xenu can also be really good. Do a full crawl with Xenu, scroll to the right-hand side, and you can see the level at which each URL sits and the number of outlinks vs. inlinks on that page. If you sort by inlinks and look for pages that are low in the architecture (i.e. several clicks deep), you can identify some great opportunities.
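The check Xenu performs is essentially a breadth-first search over the internal link graph, with each page’s depth being its click distance from the homepage. A minimal sketch, with an invented link graph for illustration:

```python
from collections import deque

# Invented internal link graph: page -> pages it links to.
links = {
    "/": ["/gifts/"],
    "/gifts/": ["/gifts/bathroom/"],
    "/gifts/bathroom/": ["/products/candy-floss-maker"],
    "/products/candy-floss-maker": [],
}

def click_depths(graph, start="/"):
    """Breadth-first search giving each page's click depth from `start`."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

Pages that come back with a depth over three are the ones the flattening advice earlier says to pull closer to the homepage (e.g. via crosslinks).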

 

Site Speed

Rich only touched on this briefly, but it is a hot topic of late. Whilst Rich seemed somewhat skeptical about site speed’s potential as a major ranking factor, he thinks it has had an impact and has also done a lot for usability. The vast majority of people do not expect to wait more than two seconds per page.


Optimal site speed isn’t just about doing better in the rankings; it’s about improving your business! Looking at abandonment after two to three seconds and the like can make it much easier to make the business case for this.

Faster (Re)Crawl with Conditional Get

What can we do to make sure the site gets crawled quickly and efficiently?

A 304 Not Modified response (according to Bing, this can help your crawl) stops crawlers wasting time re-fetching pages that have not been modified since they were last crawled. Bing has shown that, on larger sites, using the right response code can increase the number of pages that get indexed.
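A hedged sketch of the conditional-GET logic behind this: compare the crawler’s If-Modified-Since header against the page’s last-modified time and answer 304 with an empty body when nothing has changed. The function shape is my own illustration, not any particular framework’s API:

```python
from email.utils import parsedate_to_datetime

def respond(last_modified, if_modified_since=None):
    """Return (status, body) for a page given its last-modified datetime.

    `if_modified_since` is the raw HTTP-date string from the request
    header, if the crawler sent one.
    """
    if if_modified_since is not None:
        since = parsedate_to_datetime(if_modified_since)
        if last_modified <= since:
            return 304, b""  # unchanged: crawl budget goes elsewhere
    return 200, b"<html>...full page...</html>"
```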

Fun Fact:

According to Rich they pay $5,000/month [?] (one of the reasons the site can be quite slow); Rich is shocked that it hasn’t toppled over altogether given just how massive the site is and how much traffic it gets.

Pro-tip from Q&A:

Check out Google’s AJAX crawling definitions and figure out how to serve up a crawler-friendly, non-AJAX version of the page.
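For context, Google’s (since deprecated) AJAX crawling scheme of that era mapped hashbang (`#!`) URLs to an `_escaped_fragment_=` query parameter, at which the server returned a static HTML snapshot. A small sketch of that mapping:

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a #! (hashbang) URL to the URL Googlebot would fetch.

    Per the scheme, the fragment after #! is URL-encoded and appended
    as the _escaped_fragment_ query parameter.
    """
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment)}"
```

The server then detects `_escaped_fragment_` in incoming requests and serves the pre-rendered, non-AJAX version of that state.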
