This week I have come across a couple of interesting ways that the web is just a little bit broken.
## Strange Cloaking from the Telegraph
The first is some strange behaviour from the Telegraph blogs area. Try viewing this page in the following three ways:
1. normally, with cookies enabled, via your usual browser
2. via a browser with cookies disabled
3. with cookies disabled, identifying yourself as one of the main search engines' crawlers in your user-agent
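If you'd rather not fiddle with browser settings, the three views are easy to script. Here's a minimal sketch using Python's standard library; the user-agent strings are illustrative placeholders (check each engine's documentation for what its crawler actually sends), and you'd substitute the Telegraph blog URL yourself.

```python
import urllib.request
from http.cookiejar import CookieJar, DefaultCookiePolicy

# Illustrative user-agent strings -- not guaranteed to match what the
# engines' crawlers send today.
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def make_opener(user_agent, allow_cookies=True):
    """Build a URL opener with the given user-agent, optionally rejecting all cookies."""
    if allow_cookies:
        jar = CookieJar()
    else:
        # A policy with an empty allowed-domains list silently drops every Set-Cookie.
        jar = CookieJar(policy=DefaultCookiePolicy(allowed_domains=[]))
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    opener.addheaders = [("User-Agent", user_agent)]
    return opener

# The three views from the list above (url is whatever page you're testing):
# 1. make_opener(BROWSER_UA).open(url)                         # cookies on
# 2. make_opener(BROWSER_UA, allow_cookies=False).open(url)    # cookies off
# 3. make_opener(GOOGLEBOT_UA, allow_cookies=False).open(url)  # cookies off, crawler UA
```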
What you will see is: the page working normally in case 1, a broken page in case 2 (served, remarkably, with a "200 OK"), and the working page again in case 3.
I have a few questions about this, but the main ones are:
1. WHY? If your site doesn't break without cookies (as evidenced by the fact that it works for search engine spiders), why stop people browsing without cookies? It can't be to do with advertising tracking, because the main Telegraph site works fine without cookies.
2. If you must do this, could you at least return an error status code on the page you deliver to cookieless users, telling them the site is broken for them? A "200 OK" on a broken page breaks the internet. Really.
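Returning an honest status code doesn't take much. Here's a bare WSGI sketch of the behaviour I'm asking for (the crawler tokens and markup are made up for illustration): if a server is going to refuse cookieless browsers, a 403 at least tells clients and caches that this isn't the real page.

```python
# Minimal WSGI sketch: if you insist on blocking cookieless browsers,
# send an honest 4xx rather than a "200 OK" on a broken page.
# The crawler tokens and HTML below are illustrative only.
CRAWLER_TOKENS = ("Googlebot", "bingbot", "Slurp")

def app(environ, start_response):
    has_cookies = bool(environ.get("HTTP_COOKIE"))
    user_agent = environ.get("HTTP_USER_AGENT", "")
    is_crawler = any(token in user_agent for token in CRAWLER_TOKENS)

    if has_cookies or is_crawler:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html>the actual article</html>"]

    # An error status tells browsers, caches and crawlers the truth.
    start_response("403 Forbidden", [("Content-Type", "text/html")])
    return [b"<html>Sorry, this site requires cookies.</html>"]
```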
I'm not sure what I think about this from the search engines' perspective. It's clearly against the letter of the guidelines (treating googlebot et al differently based on their user-agents) yet in many ways it doesn't actually go against the spirit of the guidelines as the search engines are served the same content that the majority of users see. We had a vigorous debate in the office about how we would treat this behaviour if we were a search engine. I'm not quite sure. I'd like to stop it happening (I feel it breaks the internet) but it doesn't really feel bad enough to get penalised. What do you think? What would you do?
## Unfortunate Geo-Delivery from the BBC
Try a Google search like `site:news.bbc.co.uk/weather` and you will find the homepage has a title tag of "BBC Weather | United States of America".
Strange, huh? For such a solidly UK-centric organisation to have its weather homepage US-focussed? That's what I thought.
Browse there (at least from the UK) and you find a title tag of "BBC Weather | United Kingdom". Not a perfect title tag (I would have wanted "UK" and "forecast" in there, I think), but a lot better.
The BBC (the beeb! of all people) have fallen prey to what I called the 'world series spidering problem' in an SEOmoz post last year: since the search engines typically send their spiders out from the US, if you blindly geo-deliver content, you will end up with your US-focussed content being indexed.
Ironically, if the BBC really wants to geo-deliver US-focussed content to US visitors (while presumably being more focussed on the UK market, given their funding!), arguably they should be engaged in the kind of 'conditional delivery' the Telegraph is doing. Should they show the search engines their UK-focussed 'primary market' content while showing regular US browsers US-targeted content?
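To make the trade-off concrete, here is roughly what that kind of conditional delivery might look like. This is a sketch, not a recommendation: the crawler tokens are illustrative, the two-edition split is a simplification I've assumed, and `country_code` stands in for whatever IP geolocation lookup the site actually uses.

```python
CRAWLER_TOKENS = ("Googlebot", "bingbot", "Slurp")  # illustrative, not exhaustive

def pick_edition(user_agent, country_code):
    """Choose which weather edition to serve.

    Spiders mostly crawl from US IP addresses, so blind geo-delivery
    gets the US edition indexed (the 'world series spidering problem').
    Serving crawlers the primary-market edition avoids that -- at the
    cost of treating them differently from ordinary US visitors.
    """
    if any(token in user_agent for token in CRAWLER_TOKENS):
        return "UK"  # primary-market content for the index
    return "US" if country_code == "US" else "UK"
```

Whether a search engine should tolerate this is exactly the question raised above: the crawler sees real content that the primary market sees, but it isn't what a US visitor would get.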
Incidentally, this results in all kinds of ranking confusions - I'll leave trying out all the combinations of searches for weather forecasts in the UK and US from the UK and US to the interested reader.