Google’s Panda/Farmer Update - What To Do About It

As you all know by now, there has been a massive update to Google’s algorithms in the past week. Initially called the Farmer update by Danny Sullivan, it’s now been revealed that the internal name at Google was the Panda update. This post is in response to the latest Wired article, which features an interview with Amit Singhal and Matt Cutts from Google.

Some quick facts we know:

  • The initial update happened Wed 23rd Feb in the US only
  • There was another smaller update on Tuesday 1st March which gave some sites their traffic back

For background reading, you should check out these key blog posts:

  1. Search Engine Land
  2. Search Engine Land part 2 including updated winners and losers
  3. SEOmoz correctly identifies this update is largely not about links

Then (I’m quite proud of this) it looks like I might have accurately predicted what some of the key factors were:

Some of these things were confirmed in this Wired post published just now:

http://www.wired.com/epicenter/2011/03/the-panda-that-hates-farms/all/1

Some key takeaways as I see it:

Looks like I was right - aggressive ads are a factor

Cutts: There was an engineer who came up with a rigorous set of questions, everything from, “Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?” Questions along those lines.

This was one of the major factors I had already guessed went into the algorithm update. You can see this for yourself. I made a Google Custom Search Engine for all the sites listed in the Sistrix data that were big losers. Try it here:

More or less any search you do will return pages plastered in ads. Coincidence? I think not. Looking at this CSE is one of the main reasons I thought aggressive ads might be a factor.

But ads aren’t the only factor, trust is a big deal too

Singhal: we asked the raters questions like: “Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?”

So it’s not just ads that play a part - it’s the overall look and feel, perception and trustworthiness of the page that goes into whether it was affected or not. Of course, these things can’t be measured directly by algorithms, but from the interview it seems like they have come up with a whole blend of factors which, combined, gives you an accurate measure of how much a site is trusted.

This update is multi-faceted

As I mentioned in my comment, and as supported by the Wired article, this change is a BIG DEAL. It’s not just a subtle fiddling with key factors in the algorithm - it sounds like they either introduced a whole bunch of new factors or figured out a way of combining them in some new and complex way:

Amit Singhal: Well, we named it internally after an engineer, and his name is Panda. So internally we called it the big Panda. He was one of the key guys. He basically came up with the breakthrough a few months back that made it possible.

If it requires a breakthrough to accomplish, it’s gotta be pretty complex, right? Either they’re computing a whole bunch of data they weren’t previously or they’ve figured out how to process data signals quickly enough to use them in search results. Either way - this isn’t an update where it’s just one thing that matters, it’s an update that touches on all kinds of different factors which together form a rich blend.

There’s more where this came from - brace yourself

Singhal: I won’t call out any site by name. However, our classifier that we built this time does a very good job of finding low-quality sites. It was more cautious with mixed-quality sites, because caution is important

I feel confident in saying this is just the beginning. It sounds like Google are going to be updating this over the coming days/weeks/months to target the mixed-quality sites. This update appears to have been largely site-wide, but maybe the next update will concentrate on individual pages more closely? Either way - watch out, because if you’re a mixed-quality site in a crowded niche then you’re going to get hit pretty soon, imho.

When will this update hit the UK?

Frankly, I have no idea. But I do have a theory (I love theories!). This update appears to have been based largely on feedback from users - Google is looking at what trust signals people use to classify sites. But those trust signals are going to vary from country to country, right? This is just a hunch, but I feel like it *might* be a little while before this rolls out in the UK. Give them some time to run the same kinds of analysis and find the right trust and quality metrics for the regional search engines.

That said - I could be wrong and it might roll out tomorrow.

Did they fit the algorithm to the human signals?

Singhal: You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red.

What does this comment mean? Other than being fascinating and mentioning hyperspace planes, does this comment hint that they have actually built an algorithm that is designed to fit a bunch of human signals? This would be the first time I know of that they have admitted to doing such a thing. Maybe that’s clutching at straws though.
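
As an aside, here’s a minimal sketch of what that hyperspace analogy describes. To be clear, this is purely my own illustration, not anything Google has published, and the signal names (ad density, word count, trust-page score) are invented: human raters supply the red/green labels, measurable signals supply the coordinates, and a classifier learns the separating plane.

    # Illustrative sketch only - invented signals and labels, nothing Google has published.
    import numpy as np
    from sklearn.svm import LinearSVC

    # One row per site: [ad_density, words_per_page (thousands), trust_pages_score]
    signals = np.array([
        [0.65, 0.25, 0.0],   # ad-heavy, thin content, no trust pages
        [0.70, 0.18, 0.0],
        [0.10, 1.20, 1.0],   # light ads, substantial content, trust pages present
        [0.15, 0.90, 1.0],
        [0.40, 0.40, 0.5],   # mixed-quality sites sit near the boundary
        [0.35, 0.60, 0.5],
    ])
    # Human rater verdicts: 0 = low quality ("red"), 1 = high quality ("green")
    labels = np.array([0, 0, 1, 1, 0, 1])

    classifier = LinearSVC()       # learns a separating hyperplane in signal space
    classifier.fit(signals, labels)

    new_site = np.array([[0.55, 0.30, 0.0]])
    print(classifier.predict(new_site))   # e.g. [0] - lands on the low-quality side

Whatever Google is actually doing will involve vastly more signals and far cleverer machinery, but if the interview hints at anything, it’s this general shape: fit a model to human judgements, then apply it at scale.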

So what’s the actionable insight? What do you DO about it?

Firstly, it’s very early days. Updates to the algorithm are on their way, and I’m sure more details will be released about what factors are really playing a part in this update. That said, if your site was affected (or if you’re in the UK and worried this update will hit you when it rolls out here):

  • Ensure your content is unique and useful - there’s nothing terribly new here (good content has always been best practice), but I do think Google tightened the ship slightly on duplicate content and thin content pages
  • Ensure there is a good amount of value added to users above the fold on your page - value here is a difficult concept to nail down algorithmically, but for your site it should be fairly easy to figure out. Perhaps consider using a service like Feedback Army to test your site with random people. Ask them some of the questions that Google ask - would you put credit card details into this site? Do you trust this site? If the answers aren’t positive, keep going with adding value (or making the value-add clearer!).
  • Get rid of aggressive ads and images that look like ad blocks. Whether it’s AdSense, Kontera, banner ads or just large images that look like ads - make sure they don’t take precedence over your content
  • Re-design your site to make it look like a web 2.0 site. Did I really just say that? Yes, I did. Because Google’s algorithms are now based on what people want to see, not where the actual value is - so you need to fit in. If your site is the ugly one then you’re far more likely to get hit than if your site is good looking.
  • Make your site look more trusted. This has always been a good thing - it helps conversion rates - but you should ensure you have a good about page, privacy policy, contact page etc. to ensure people feel comfortable using your site. (There’s a rough self-audit sketch just after this list.)
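
If you want a crude way of sanity-checking a few of these points on your own pages, here’s a rough sketch - my own, not any official checklist. It pulls a page and looks for some of the signals above; the ad-network hints, the trust-page names and any thresholds you read into the output are guesses for illustration only.

    # Rough self-audit sketch: thin content, visible ad networks, basic trust pages.
    # The ad-network hints and trust-page names below are illustrative guesses.
    import requests
    from bs4 import BeautifulSoup

    AD_HINTS = ["googlesyndication.com", "kontera.com", "doubleclick.net"]
    TRUST_PAGES = ["about", "privacy", "contact"]

    def quick_audit(url):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        word_count = len(soup.get_text(" ", strip=True).split())
        ad_networks = [hint for hint in AD_HINTS if hint in html]
        links = [a.get("href", "").lower() for a in soup.find_all("a")]
        trust_found = [page for page in TRUST_PAGES if any(page in href for href in links)]

        return {
            "word_count": word_count,          # very low counts suggest thin content
            "ad_networks_seen": ad_networks,   # lots of ad code is a warning sign
            "trust_pages_linked": trust_found, # about/privacy/contact linked?
        }

    print(quick_audit("http://example.com"))

It won’t tell you anything about value or trust directly - only real users can do that - but it’s a quick way to spot the obvious red flags before you pay for testing.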

Update: There’s also a posting by a Google employee over here. If you’ve been hit unfairly, I suggest you jump in and leave a comment. (Thanks to Balibones for alerting me to the thread)

If you know of a high quality site that has been negatively affected by this change, please bring it to our attention in this thread. Note that as this is an algorithmic change we are unable to make manual exceptions, but in cases of high quality content we can pass the examples along to the engineers who will look at them as they work on future iterations and improvements to the algorithm. So even if you don’t see us responding, know that we’re doing a lot of listening.

Note these are all based on a combination of data analysis, reading information from Google and others in the community, and gut feel. What else is there to go on?

Update: There’s some new information from Google released at SMX West covered by Search Engine Land:

http://searchengineland.com/the-farmerpanda-update-new-information-from-google-and-the-latest-from-smx-west-67574

Looks like a bunch of my hunches have proved correct, but the addition of social media/brand metrics is a new one (which also makes a lot of sense).

Update 2: Although this is PPC-related, I find the post here absolutely fascinating on what % of the page above the fold (and at what screen resolution) is allowed to be advertising and what % should be unique content:

http://andrewhansen.name/affiliate-marketing/google-checkmates-me-but-reveals-internal-secrets/

This is a great tool to see how different screen sizes see your site:

http://browsersize.googlelabs.com/
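
To make the above-the-fold idea concrete, here’s a toy calculation - my own sketch, with invented element positions; the thresholds in the post above are that author’s own. Given a viewport height and the positions of your ad blocks, it estimates what share of the first screenful is advertising.

    # Toy above-the-fold calculation with made-up element positions. In practice
    # you'd pull real coordinates from the rendered page (dev tools or a headless
    # browser) and check a couple of common resolutions. Vertical coverage only.

    def above_fold_ad_ratio(viewport_height, ad_blocks):
        """ad_blocks: list of (top_px, height_px) rectangles for ad units."""
        ad_pixels = 0
        for top, height in ad_blocks:
            visible = max(0, min(top + height, viewport_height) - max(top, 0))
            ad_pixels += visible
        return ad_pixels / viewport_height

    # A 768px-tall viewport with a 90px leaderboard and a 250px block starting at 300px
    print(above_fold_ad_ratio(768, [(0, 90), (300, 250)]))  # ~0.44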

Update 3: More information released here:

http://searchengineland.com/lessons-learned-at-smx-west-googles-farmerpanda-update-white-hat-cloaking-and-link-building-67838

The only really new information is this:

Matt confirmed that the algorithm change is still U.S.-only at this point, but is being tested internationally and would be rolling out to additional countries “within weeks”. He said that the type of low quality content targeted by the changes is more prevalent in the United States than in other countries, so the impact won’t be as strong outside the U.S.

We’ll be watching closely to try and spot when this rolls out.


21 Comments

  1. Bill Slawski

    Hi Will

    Google's Biswanath Panda, along with a number of other Googlers, wrote an interesting paper about Tree Ensemble classification using MapReduce in 2009:

    PLANET: Massively Parallel Learning of Tree Ensembles with MapReduce (pdf).

    Can't be certain at all that the method described in that paper was behind the quality classifications in the Farmer/Panda update, but it's interesting to think about.

    A snippet from the paper:

    "We are currently applying PLANET to problems within the sponsored search domain. Our experience is that the system scales well and performs reliably in this context, and we expect results would be similar in a variety of other domains involving large scale learning problems."

  2. John

    Brilliant Tom, really brilliant. I found it incredibly interesting that Mstt Cutts (funny how not many SEOs are willing to just call him "Matt") and Singhai admitted that they used the Chrome blocking extension to compare against their algorithm. A brilliant move on their part, really, because they probably knew it would be SEOs and advanced SE users installing/using it.

    I like your takeaways too. I would maybe add in site navigability? I.e don't silo your users too much, but make them feel comfortablr on the sitr. Maybe this is inckuded in your "Web 2.0" comment, but I thought it might be a useful suggestion too.

    Cheers.

  3. John

    I apologize for the typos too... iPad fail :-(

  4. Great post buddy, very helpful. I would love for you to do an article for our magazine (it's new); obviously we will give you credit and linkbacks. You have my details, and you can Skype me too: smokey1661

  5. Bill Slawski

    Sorry, Tom

    Don't know where that "Will" came in.

    Agree with your actionable insights.

  6. Excellent article - but I'm wondering about the effect on UK ecommerce sites, as my site certainly seemed to have been affected within that time scale, and pushed down the SERPs.
    The sites replacing mine mainly seemed to have the same format - plenty of on-page products with no descriptions, whereas I've concentrated on creating unique descriptions, which did tend to be a tad 'flowery'.
    It seems as though, initially, Google threw the baby out with the bath water.
    I'm glad to report that some pages are slowly sneaking back onto the first page, without any major changes on my part, although they are in the pipeline - it's no good ranting and railing, you just gotta go with the flow.

  7. "Get rid of aggressive ads and images that look like ad blocks. Whether it’s adsense, kontera, banner ads or just large images that look like ads – make sure they don’t take precedence over your content"

    What about google served ads?

  8. @Bill - heh - I'd love to take credit for the "ads" spot, but that was all Tom.

    I love the custom search engine trick - smart move Tom.

  9. Great post Tom. I really liked the CSE work you did.

    Do you know if there is a UK version of the Sistrix site available? I've been wanting to demo their tools for ages now - but I don't speak ze German very well.

  10. Brilliant post that leaves a lot to think about. It seems like with most updates, if you try to be as professional as possible online anyway, then you won't be affected.

    Old websites, I expect, will be hit. It's not often, but you can still come across 5/6 year old websites that haven't been updated for years but still rank for various keywords - these are usually very old domains though.

  11. Great analysis and love the use of the Google custom engine.

    Just on reducing ads, can you get clever with the way you apply ads to your site, in terms of loading after content, iframes etc.? Would any of this make a difference?

    Could you explain this a little more:

    "Re-design your site to make it look like a web 2.0 site. Did I really just say that? Yes, I did. Because Google’s algorithms are now based on what people want to see, not where the actual value is – so you need to fit in. If your site is the ugly one then you’re far more likely to get hit than if your site is good looking."

    Design is such a personal thing - I don't see how an algorithm can determine what's appealing. Are we talking usability here in terms of engagement metrics, bounce etc. being used? Some ugly sites can convert really well - think of all those product launches in IM :)

    It still seems to be early days, I have seen posts where auto blog networks saw an increase in traffic after this update. I am sure there will be a lot more on it.

    On Gary's point - it would be good if Google reduced the weight on aged sites. Plus, I wonder if they are ever going to deal with sites that sell counterfeit goods. I still see burn and crash methods for these sorts of sites working fine.

    Thanks again for the great info.

  12. Great post guys, this answers a lot of the debating going on in the office! My question though, which I don’t think has been touched upon, is: do you think outbound links from sources that have seen large traffic drops, i.e. EzineArticles, will be devalued? Personally I can’t see why they wouldn't, as they have already been identified as low quality content. What are your thoughts on this?

  13. Brilliant article. I think that you nailed the Panda update more or less spot on. And actually it is an amazing thought that Google is able to incorporate "human vision" in their algorithm. And a bit scary too, because who is going to be the final judge over what is good and bad then?

    Again: I think your article is THE most spot on concerning the latest update from Google.

  14. Nice job on the prediction Tom.

    My wife publishes articles on Suite101 every now and then just for fun. There is actually some decent content on that site, but whenever I'd look at one of her new articles I'd be amazed at how Suite101 would blanket the page with ads in prominent positions.

    It was a huge turnoff to me as a user and it really made the articles look low quality, even though (in her case) they weren't. So I wouldn't be surprised at all to learn that they'd failed any kind of human quality testing.

  15. Good post guys, although Panda has not yet hit the UK I believe the effects of it have. Sites that are heavily reliant on US based blog networks to get their inbound links will see those links devalued if Panda deems the site supplying the link to be of low quality. I have seen some evidence of this already.

  16. So Panda/Farmer hit today, and it's quite interesting to see its impact on my sites: 3 down, 1 up in the SERPs. All the sites are, I would broadly say, of similar quality in terms of content. The bizarre thing is that the site that has risen in rank is the one with the least amount of content and the fewest links but the oldest domain... not quite sure what that says, if anything, but until I can analyse the data it'll be interesting to see if the impact has been site-wide or on specific pages...

    It will be very interesting in the coming weeks to see how it will affect the UK.

  17. emtwo

    Brilliant post Tom, the takeaways are great food for thought and will perhaps prompt some usability / user journey thoughts as a by-product of the desire to be ranked.

    @Kieran, I hear what you're saying about the product launches, interesting point.

  18. I can almost see why they would call it a Farmer update, but a Panda? I'm lost on that one. I think sites that mostly used article marketing like EzineArticles or similar services for link building will get hit hard; we've seen similar results happen here in the States.

  19. Ryan, it's named after one of the Google engineers who came up with the breakthrough.

  20. ChrisJ

    I'm getting the impression from various sources that backlinks from these content sites which have taken syndicated content or scraped it (which one has no control over) are also downranking sites.

  21. This is one of the best articles I've read about the Google Panda update. Do you have any statistics on whether it has had a major impact on some of your websites?

