As you all know by now there has been a massive update to Google’s algorithms in the past week. Initially called the Farmer update by Danny Sullivan it’s now been revealed the internal name at Google was the Panda update. This post is in response to the latest Wired article which features an interview with Amit Singhal and Matt Cutts from Google.
Some quick facts we know:
- The initial update happened Wed 23rd Feb in the US only
- There was another smaller update on Tuesday 1st March which gave some sites their traffic back
For background reading, you should check out these key blog posts:
- Search Engine Land
- Search Engine Land part 2 including updated winners and losers
- SEOmoz correctly identifies this update is largely not about links
Then (I’m quite proud of this) it looks like I might have accurately predicted what some of the key factors were:
Some of these things were confirmed in this Wired post published just now:
Some key takeaways as I see it:
Looks like I was right – aggressive ads are a factor
Cutts: There was an engineer who came up with a rigorous set of questions, everything from "Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?" Questions along those lines.
This was one of the major factors I had already guessed went into the algorithm update. You can see this for yourself. I made a Google Custom Search Engine for all the sites listed in the sistrix data that were big losers. Try it here:
More or less any search you do will return pages plastered in ads. Coincidence? I think not. Looking at this CSE is one of the main reasons I thought aggressive ads might be a factor.
But ads aren’t the only factor, trust is a big deal too
Singhal: we asked the raters questions like: “Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?”
So it’s not just ads that play a factor – it’s the overall look and feel, perception and trustworthiness of the page that determines whether it was affected or not. Of course, these things can’t be measured directly by algorithms, but from the interview it seems like they have come up with a whole blend of factors which, combined, gives an accurate measure of how much the site is trusted.
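To make the "blend of factors" idea concrete, here's a minimal sketch of combining several imperfect trust proxies into a single score. To be clear: the signal names and weights below are entirely my own invention for illustration – Google hasn't disclosed anything like this.

```python
# Illustrative only: blending several 0..1 trust proxies into one score.
# The signal names and weights are invented, not anything from Google.

WEIGHTS = {
    "content_uniqueness": 0.35,
    "design_quality":     0.25,
    "ad_restraint":       0.20,  # higher = fewer/less aggressive ads
    "user_trust_rating":  0.20,
}

def blended_trust(signals):
    """Weighted average of 0..1 signals; higher means more trustworthy."""
    return sum(WEIGHTS[name] * value for name, value in signals.items())

# A site with strong content but aggressive ads
site = {
    "content_uniqueness": 0.9,
    "design_quality":     0.7,
    "ad_restraint":       0.3,   # heavy ads drag the blended score down
    "user_trust_rating":  0.8,
}
print(round(blended_trust(site), 3))  # 0.71
```

The point isn't the arithmetic – it's that no single signal decides the outcome, so a site can't game one factor in isolation.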
This update is multi-faceted
As I mentioned in my comment, and as supported by the Wired article, this change is a BIG DEAL. It’s not just subtle fiddling with key factors in the algorithm – it sounds like they either introduced a whole bunch of new factors or figured out a way of combining them in some new and complex way:
Amit Singhal: Well, we named it internally after an engineer, and his name is Panda. So internally we called it big Panda. He was one of the key guys. He basically came up with the breakthrough a few months back that made it possible.
If it requires a breakthrough to accomplish, it’s gotta be pretty complex, right? Either they’re computing a whole bunch of data they weren’t previously or they’ve figured out how to process data signals quickly enough to use them in search results. Either way – this isn’t an update where it’s just one thing that matters, it’s an update that touches on all kinds of different factors which together form a rich blend.
There’s more where this came from – brace yourself
Singhal: I won’t call out any site by name. However, our classifier that we built this time does a very good job of finding low-quality sites. It was more cautious with mixed-quality sites, because caution is important.
I feel confident in saying this is just the beginning. It sounds like Google are going to be updating this over the coming days/weeks/months to target the mixed-quality sites. This update appears to have been largely site-wide but maybe the next update will concentrate on individual pages more closely? Either way – watch out, because if you’re a mixed-quality site in a crowded niche then you’re going to get hit pretty soon, imho.
When will this update hit the UK?
Frankly, I have no idea. But I do have a theory (I love theories!). This update appears to have been based largely off feedback from users – Google is looking at what trust signals people use to classify sites. But those trust signals are going to vary from country to country right? This is just a hunch but I feel like it might be a little while before this rolls out in the UK. Give them some time to run the same kinds of analysis and find the right trust and quality metrics for the regional search engines.
That said – I could be wrong and it might roll out tomorrow.
Did they fit the algorithm to the human signals?
Singhal: You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red.
What does this comment mean? Other than being fascinating and mentioning hyperspace planes does this comment hint that they have actually built an algorithm that is designed to fit a bunch of human signals? This would be the first time I know of that they have admitted to doing such a thing. Maybe that’s clutching at straws though.
So what’s the actionable insight? What do you DO about it?
Firstly, it’s very early days. Updates to the algorithm are on their way and more details will be released I’m sure about what factors are really playing a part in this update. That said, if your site was affected (or if you’re in the UK and worried this update will hit you when it rolls out into the UK):
- Ensure your content is unique and useful – there’s nothing terribly new here (good content has always been best practice) but I do think Google tightened ship slightly on duplicate content and thin content pages
- Ensure there is a good amount of value added to users above the fold on your page – value here is a difficult concept to nail down algorithmically but for your site it should be fairly easy to figure out. Perhaps consider using a service like Feedback Army to test your site with random people. Ask them some of the questions that Google ask – would you put credit card details into this site? Do you trust this site? If the answers don’t give a positive response – keep going with adding value (or making the value add clearer!)
- Get rid of aggressive ads and images that look like ad blocks. Whether it’s adsense, kontera, banner ads or just large images that look like ads – make sure they don’t take precedence over your content
- Re-design your site to make it look like a web 2.0 site. Did I really just say that? Yes, I did. Because Google’s algorithms are now based on what people want to see, not where the actual value is – so you need to fit in. If your site is the ugly one then you’re far more likely to get hit than if your site is good looking.
It’s also worth noting Google’s own request, posted in their webmaster forums: “If you know of a high quality site that has been negatively affected by this change, please bring it to our attention in this thread. Note that as this is an algorithmic change we are unable to make manual exceptions, but in cases of high quality content we can pass the examples along to the engineers who will look at them as they work on future iterations and improvements to the algorithm. So even if you don’t see us responding, know that we’re doing a lot of listening.”
Note these are all based off a combination of data analysis, reading information from Google and others in the community, and gut feel. What else is there to go off?
Update: There’s some new information from Google released at SMX West covered by Search Engine Land:
Looks like a bunch of my hunches have proved correct but also the addition of social media/brand metrics is a new one (but which also makes a lot of sense).
Update 2: Although this is PPC-related I find the post here absolutely fascinating about what % of the page above the fold (and on what screen resolution) is allowed to be advertising and what % should be unique content:
This is a great tool to see how different screen sizes see your site:
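In that spirit, here's a back-of-envelope way to estimate what fraction of the above-the-fold area a given resolution gives over to ads. The viewport and ad-unit sizes below are just illustrative examples – there is no published Google threshold to check against.

```python
# Back-of-envelope estimate of above-the-fold ad coverage.
# The resolutions and ad sizes are illustrative, not Google thresholds.

def ad_coverage(viewport_w, viewport_h, ad_blocks):
    """Fraction of the above-the-fold viewport covered by ad blocks.

    ad_blocks: list of (width, height) in pixels, assumed fully
    above the fold and non-overlapping.
    """
    fold_area = viewport_w * viewport_h
    ad_area = sum(w * h for w, h in ad_blocks)
    return ad_area / fold_area

# A 1024x768 viewport with a 728x90 leaderboard and a 300x250 rectangle
coverage = ad_coverage(1024, 768, [(728, 90), (300, 250)])
print(f"{coverage:.0%}")  # roughly 18%
```

Run the same ad layout against a 1366x768 or 1920x1080 viewport and the percentage drops – which is exactly why the screen resolution matters as much as the ad units themselves.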
Update 3: More information released here:
The only really new information is this:
Matt confirmed that the algorithm change is still U.S.-only at this point, but is being tested internationally and would be rolling out to additional countries “within weeks”. He said that the type of low quality content targeted by the changes is more prevalent in the United States than in other countries, so the impact won’t be as strong outside the U.S.
We’ll be watching closely to try and spot when this rolls out.
Tom Critchlow is VP Operations for the NYC office, living in Brooklyn and working in Manhattan. Fiercely curious about most things and passionate about everything.