Beating the Panda - Diagnosing and Rescuing a Client’s Traffic

A few months ago, Distilled was approached by a large UK website that had been hit by Panda and also had some link-based penalties in place. At first I was not sure they had been hit, but when I dug a little deeper I saw it. We then put together a strategy for them to implement that we hoped would bring their Google traffic out of the depths.

Today I want to show you how we diagnosed their traffic issues and then what steps we took to bring them out of Panda. I figure this post is timely since Panda 2.5 was just released last week and some sites took a huge hit again.

Diagnosing a Panda Attack

When the Panda update rolled through in February for the first time, people had no idea what to do. When the Panda update rolled through for the first time in the UK in early to mid April, people still had little idea what to do.

First Look: Google-Only Traffic

First, I segmented traffic down to organic search traffic only. This is pretty easy to do in Google Analytics:

Traffic Sources -> Sources -> Search -> Organic

This is a necessary first step for SEOs whenever you want to look at organic traffic from all search engines. Here is where to find it:

Google-Only Traffic

Next, get your traffic that is coming from Google. The default view on Search -> Organic is Keywords, so you need to change this to Source and then select the Google option. Here is where to find it:

Broaden your Date Range

Now you will want to broaden your date range so that you can see if a drop has occurred in Organic Traffic around the times that Panda hit. If you are looking for a good resource on the dates of Panda, Dr Pete has the full list in the Google Algorithm Change History. For this client, I had to go back to the beginning of February and run the date range until the middle of July (when I got access to Analytics).

Pro tip: Graph the traffic by week, not by day or month.

When I segmented to just around the date Panda first hit the UK (April 11th-ish), I saw this:

Not very helpful, right?

But when I stretched the date range out from February 1 to July 16, I saw this:

Second Look: Number of Referring Keywords and Landing Pages Receiving Traffic

The next way to get a good idea of whether your site was hit by Panda is to look at the number of referring keywords and the number of organic landing pages before and after the update. The easiest way I have found to do this is in Excel: export the two weeks before and the two weeks after the Panda update, then compare the number of landing pages and the number of keywords across the two periods. If both numbers have gone down, it's worth more investigation.
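If you would rather script that comparison than do it in Excel, here is a minimal sketch in Python. The column names ("Keyword", "Landing Page") and the sample data are assumptions — adjust them to match your own Analytics export:

```python
import csv
import io

def count_keywords_and_pages(csvfile):
    """Count distinct keywords and landing pages in a GA organic export.

    Assumes the export has 'Keyword' and 'Landing Page' columns --
    rename these to match your own CSV.
    """
    keywords, pages = set(), set()
    for row in csv.DictReader(csvfile):
        keywords.add(row["Keyword"])
        pages.add(row["Landing Page"])
    return len(keywords), len(pages)

# With real exports you would use: open("two_weeks_before.csv", newline="")
# Inline sample data here so the sketch runs on its own.
before = io.StringIO(
    "Keyword,Landing Page\n"
    "red widgets,/widgets/red\n"
    "blue widgets,/widgets/blue\n"
    "widgets,/widgets/red\n"
)
after = io.StringIO(
    "Keyword,Landing Page\n"
    "red widgets,/widgets/red\n"
)

before_kw, before_lp = count_keywords_and_pages(before)
after_kw, after_lp = count_keywords_and_pages(after)

print(f"Keywords: {before_kw} -> {after_kw}")        # 3 -> 1
print(f"Landing pages: {before_lp} -> {after_lp}")   # 2 -> 1
if after_kw < before_kw and after_lp < before_lp:
    print("Both dropped - worth more investigation.")
```

If both counts fall together across the update date, that is the same "worth more investigation" signal described above.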

Another quick way to compare the number of longtail referring keywords is to use the "4 or more words" Advanced Segment I talked about in a post on SEOmoz. Simply apply this Advanced Segment and take note of the number of keywords during a span of two or three weeks after the update, then switch the timespan to two or three weeks before it. This should give you a pretty good idea of whether something is amiss.
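If you want to run the same "4 or more words" check outside of Analytics, it boils down to a regular expression. A quick Python sketch (the exact pattern GA uses in that segment is my assumption, not taken from the SEOmoz post):

```python
import re

# A keyword with four or more words: one run of non-space characters,
# followed by at least three more space-separated runs.
FOUR_PLUS_WORDS = re.compile(r"^\s*\S+(\s+\S+){3,}\s*$")

# Hypothetical keyword list standing in for an exported keyword report.
keywords = [
    "widgets",
    "buy red widgets",
    "where to buy red widgets",
    "cheap red widgets online uk",
]

longtail = [kw for kw in keywords if FOUR_PLUS_WORDS.match(kw)]
print(len(longtail))  # 2
```

Run it against the before and after keyword exports and compare the two counts, just as with the Advanced Segment.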

Here’s the Advanced Segment:


What I Recommended and What Worked

This specific client had a lot of individual pages for their site's offerings that contained very little content, as well as a lot of customer profiles with very little content on them. Added together, we are talking about tens of thousands of pages with thin or expired content, and sometimes even duplicate content.

My thought was that these pages were probably hurting them the most. So I recommended that they use the meta noindex tag to remove the thin pages from the index, and we also recommended that they nofollow the links to these pages. Ultimately, it would have been best for them to completely remove these thin pages, but from a vendor perspective they needed to keep the pages live on the site. The client went a step beyond our recommendations and stopped linking to the pages entirely, so they are not in the index.
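For reference, the two pieces of markup involved look like this. This is a generic sketch — the URL and anchor text are made up, not the client's actual pages:

```html
<!-- In the <head> of each thin page: keep it out of the index,
     but let the crawler still follow its outbound links -->
<meta name="robots" content="noindex, follow">

<!-- On pages that link to a thin page -->
<a href="/profiles/example-customer" rel="nofollow">Customer profile</a>
```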

For the expired content, we recommended that they 301 redirect it back to the next level up; as far as I know, this has happened. The client had done some testing with a rel=canonical tag to try to pass the link juice back to the level above, but we did not feel that this was a good use of rel=canonical.
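As a sketch, a 301 of an expired page back to the next level up might look like this in an Apache .htaccess file. The paths here are hypothetical — adjust the rule to your own URL structure:

```apache
RewriteEngine On
# Send an expired offer page back to its parent category with a 301
RewriteRule ^offers/expired-widget-sale/?$ /offers/ [R=301,L]
```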

We have also recommended some site-architecture changes that are in the process of being implemented, as well as changes to the sitemaps they declare to the search engines. Interestingly, even without these changes, the traffic has been on the up and up. Our hope is that with the architecture and meta content changes we have suggested, their rankings for head terms will increase as well.

Here are some statistics and graphs:

The week before Panda hit the UK
Organic Traffic: 3,759 visits
Traffic from keywords with 3+ words: 2,827
Traffic from keywords with 4+ words: 1,635
Landing pages from organic: 630

The week after Panda hit
Organic traffic: 2,708 (a 28% decrease)
Traffic from keywords with 3+ words: 2,165 (a 23% decline)
Traffic from keywords with 4+ words: 1,342 (an 18% decline)
Landing pages from organic: 560 (an 11% decline)

Here are the two weeks compared:

Where are we now?

As soon as the client noindexed all of those pages and quit linking to them, the site's traffic started to rise. Interestingly, the gains have been incremental: even with new updates to the Panda algorithm, the recovery has been a steady upward trend in traffic, not a quick bounce back. A bit of recovery started in the middle of July, but the big gains came in August and September as these low-quality pages were removed from the index.

To get back to this level, we were brutal about noindexing as much thin content as possible. I recognize that we were able to do so because the content was not, for example, someone's blog posts; still, I think that as SEOs we need to educate webmasters who come to us for help that they may have to noindex and remove a lot of thin pages, or flesh those pages out substantially. And this will take work.

The recovery plan will obviously vary by website, but I do think we need to take Google's advice and get rid of as much thin content as possible. More so today than ever, we really should ask, "Does this content deserve to rank? Is it providing the most value?" I understand that businesses are at stake here, but it also seems that Google is cracking down on those building their businesses off gaming Google.

Here’s the traffic as of last week:

Organic traffic: 4,010
Traffic from keywords with 3+ words: 3,171
Traffic from keywords with 4+ words: 2,006
Landing pages from organic: 1,247


My Panda Theory

After conversations with a number of SEOs who have also seen full Panda recoveries, I have realized that all of the recoveries I have heard of have been gradual. No one, to my knowledge, has recovered fully in the very next update. Since Panda is not a "penalty" in the traditional sense of the word (i.e., you were caught buying links and received a penalty until you removed them and groveled your way back into Google's good graces), I bet that your domain's trust in Google's eyes (i.e., your true PageRank, not Toolbar PageRank) goes down when you are caught in the low-quality Panda algorithm filter. Therefore, you have to build it back up over time. As Google regains trust in your site, they will crawl it more and more, and you'll get your traffic back.

That is, until you get caught again.



  1. Sure that's a Panda issue? 10k+ pages but only 3-4k visits per week seems a bit off. How many pages were there in total before you ditched the 10k? Depends on the industry and how well optimised they were before, but a 100 page site could easily bring in 3-4k visits a week.

    Also, the traffic drop seems so slight compared to the 50-80% drops other Panda affected sites are reporting, couldn't it just be a previously untidy site that started to suffer and recovered some due to general SEO housekeeping being implemented?

    I've seen some sites lose rankings due to being propped up by links from heavy hitter sites that were slapped by Panda - the gradual loss of traffic and subsequent tidy up and slow recovery is more in line with that than an actual Panda victim / recovery.

    • John Doherty

      Hey Scott -
      I didn't believe it at first either, but after digging deeper I came to believe it especially because of the longtail traffic taking a hit. 20% is not as big as some of the other sites that got hit, but then again we don't hear much about sites that got "dinged". Instead we hear about the ones that got "decimated".

      Also, the traffic kept on declining and the longtail traffic kept getting worse, which is classic Panda. Also, you're right that with 10k+ indexed pages but only 3-4k uniques per week, there was definitely a lot of thin content that was not ranking for anything. Those unnecessary pages were dragging them down.

      Also...the quality of the links was not great. As I stated, they had some penalties in place too, but that traffic drop was not nearly in line with what happened in this timeframe.

      Thanks for the comment!

  2. Very true - it's all the horror stories that make it to the public eye! They make better reading! :)

    I had a couple of sites hit but they're too low key for me to really bother spending time on fixing them to be honest - I really should set aside some time and tinker with them.

    Nice article John!

  3. John - great, clear article. But I sure would love to hear if you have any information on what sorts of sites Panda 2.5 is cracking down on. Was this most recent update supposed to fight specific practices, e.g. duplicate content, poor sitemaps, spelling & grammar errors, bad anchor text, etc.? As you know, a number of my sites got hit pretty hard this weekend, but everything that you mention in this article I am doing correctly. I really want to find out more about what specific changes I can make to combat these recent drops in the SERPs. Thanks.

    • John Doherty

      Hi Daniel -
      To be totally honest, this is not something that I have looked into. I would like to do some analysis on the sites that were hit to see if there are any patterns.

      Also, Matt Cutts came out with an interesting (and timely) Webmaster video yesterday talking about spelling and grammar that would be worth a listen:

  4. Interesting post. I have a site with a similar looking analytics chart. However, when I did some checking, it looks like some of my backlinks which I had built using expired domains lost all of their PR. In my case it wasn't Panda at all.

    I could be wrong, but I thought that most sites that had a Panda hit had a sudden drop off the charts rather than a slow decline like your chart shows.

  5. Our website lost about 40% of its traffic between April 10th and April 30th. Obviously I thought it was Panda initially. Although we made some small content changes, we hadn't actually got round to doing anything before traffic improved back to pre-April 10th levels over the next two weeks. Having seen this, I just put it down to those crazy couple of weeks when the Royal Wedding was happening and everyone buggered off on holiday!

  6. Levin

    As always, very insightful John! Interesting how less can be more nowadays (in terms of quantity).
    One thing I would take into account with this kind of analysis is other external factors. I bet causation was pretty clear in this case, but PR/publicity and seasonal effects in particular can have quite an impact on organic traffic in some industries. I work for a B2B service that just experienced a stagnation/slight dip in organic traffic over the summer - the same happened to a competitor's traffic according to Alexa (I am aware of the validity of this ;)
    Which makes sense in retrospect, since a big share of the workforce is on vacation during the summer.


  7. Good article and analysis of the client's website - thanks for sharing the data and tools used to discover and diagnose the issues at hand. I also believe your theory about Panda is spot on: although the PageRank in your toolbar may not change, the trust that Google once had in the site is lower and takes time to regain. I've also seen similar situations where removing the low-quality or thin pages helped a site regain some of the lost traffic, but it doesn't happen overnight.

  8. rteodor

    Thanks for sharing the data you discovered - a great article for learning something new.
    I must admit I read it a couple of times before I finally got it all. I would also recommend that everyone who reads this also read the Google Algorithm Change History from SEOmoz, which you mention in this article. Thanks for sharing, again. Now, back to studying the Google algorithm changes to find the similarities and differences...

  9. That might be a Panda-slapped site. Daily reports in GA will add to the evidence, as Panda traffic drops happen overnight. This is because Panda, unlike other parts of the Google algo, is run as a one-off.

    G has told us that sites then either have or have not been deemed 'low quality' (i.e., slapped). If slapped, a site-wide hit is applied. This is 'penalty by algo'.

    The slap can't be lifted until the next algo run. So any recovery before the next run is not the site getting out of Panda - you just optimised a Panda-slapped site like you can any other.

    Of course, G may have changed how Panda works. It has to change sometime, for sure. And there may be more than one type of 'Panda' penalty. That's the kind of sneaky thing G would do.

  10. I haven't seen evidence that thin content is the problem per se, but rather that users landing on it send a significant negative signal to Google. I see people all the time indicating that they were hit by Panda even though all of their content is high quality and unique. Almost universally, they have UX and branding issues.

    Take DaniWeb for example, an edge case since it may or may not have been hit by Panda. When I reviewed its top traffic-driving terms, I found that many of the pages returning in Google were extremely dated (back to 2003). Upon landing on such an old, likely irrelevant page, there wasn't clear internal navigation to somewhere I could find more useful information. It seems a large number of people simply exit and do the same search again, which I suspect is a very significant negative signal.

    • John Doherty

      Rick -
      Very interesting insights here, and thanks for sharing. The Bounce thing is a theory that has been thrown around a lot, and I tend to agree with you. It's also interesting that on most of the sites really hit hard, there is not a good internal architecture to the site, so people bounce quickly. I'm wondering if onsite organization is part of it as well. If you have a lot of good info, yet no one can find it, it's not going to be a good user experience, and since Panda is an algo built around actual users testing, this makes sense.

      Thanks for dropping by!

  11. Donal Doherty (no known relation)

    Hey John,

    great post. Just wondering what your thoughts are on noindexing content versus disallowing via robots.txt with regard to Panda?

    • John Doherty

      Hey Donal -

      Even if we're not related, we should meet! I love meeting other Dohertys. Are you in Ireland? I'm not...but I get there every few years.

      Regarding the SEO at hand: I would personally prefer to use noindex,follow because you can still pass link juice. I rarely block things with robots.txt, as then the crawler just completely ignores the page and you lose out on any potential link juice.

  12. Donal Doherty

    Hey John,

    I live in Manchester, but grew up in Derry, where Dohertys are like rats in a sewer - it's full of 'em ;-) Next time you are in my neck of the woods I'll buy you a pint and we can try to figure out if we have any connections!

    Thanks for the wisdom.

  13. Rvll

    Hey John,

    This article has been doing the rounds in my office and will be seen as a post panda bible should any of our clients get hit over the coming months.

    Nice one to Distilled, the content on here over the last few months has been excellent

  14. James Porter

    great article, thanks for sharing...

    I'm surprised to see you implemented nofollow links. I thought current thinking (according to SEOmoz, Matt Cutts, etc.) was not to use nofollow links, as you risked being seen as a PageRank sculptor?

    Also, I'd be interested to know why you decided to meta robots 'noindex, follow' and 'nofollow' the links to those pages. It seems a bit unnecessary.

    I love the smell of SEO on a Sunday Morning

    • John Doherty

      Hey James -

      Thanks for the comment. We recommended the nofollows because desperate times call for drastic measures. Also, nofollows are not against the rules, nor are they bad when used in the right circumstances. In this case, we had to do something drastic, it involved nofollows, and it worked. Is it an optimal solution for most people? No, but in this case I think it would have worked even if they had not stopped linking to the pages.

      Also, good point about the "noindex, follow" and "nofollow" on the links. It is a bit unnecessary, but there was always the chance that the noindexed page would be linked from an external site, so we decided to do the "follow" just in case.

      Thanks for the thought provoking comment!

  15. Derek

    Are you or other people going to react to the 2.6 Panda? Other people I know and I have been hit hard by this update, while not being affected by any other Panda update...

    • John Doherty

      Hey Derek -

      I have not yet seen a site that has been hit by Panda 2.6, though I honestly have not really been looking. None of my clients were hit, but if we come across one and see a recovery, or have insights into what was different about 2.6 as opposed to others, I'm sure we'll write about it!


  16. I spend much of my time doing this exact thing for many clients' sites - particularly one whose issues I have been dealing with since late July. Thanks so much for the different perspectives here. After reading this, it's off to work I go again, but with new ideas and angles to consider. All of the feedback here helps loads too. Liked it so much that I Liked it, +1'd it, and tweeted it twice. Thanks, John and your other commenters!

  17. Chris

    John, I've got a site that seems to be in a similar position (lots of thin company pages, hit hard by Panda 2.2). With your client's site, were you concerned about the negative impact of pulling thousands of pages of content from the index? If those pages, weak as they were, made up the majority of urls on the site, were you worried about the impact on pagerank?

    I'm considering this move, but am weighing the risks.

    • John Doherty

      Hi Chris -
      Thanks for the question. If the pages make up the majority of the URLs on the site, I would be careful. The site I was working on had a lot more pages left over. If I were you, I would worry about losing a lot of internal linkjuice as well as pages that can rank. Are you able to bulk up the pages at all, maybe make some layout changes to highlight content over ads? The site I was working on had minimal ads that were not causing a problem.

      Sounds to me like this might not be your solution.


    • Greg

      We have an eCommerce site from which we are purging most of the product pages from the index because they are all so similar. These product pages make up 80% of the site, but they are all either thin and/or very similar in content. They are second-level pages on the site with very few links from inside the site and zero links from third-party sources. Since they are mostly considered thin pages anyway, wouldn't they just be diluting the overall PR of the site? Possibly dragging the site down because of bad metrics compared to the category (top-level) pages - similar to the thin company pages you are referring to here? I was under the impression that Google Panda was about removing or improving badly performing pages to focus on the better performing stuff. Our top-level pages represent 95% of our overall traffic to the site, so I was thinking losing the thin pages wouldn't be a huge loss. Also, when using the noindex tag, doesn't PageRank still flow through these pages to the pages they link to, and doesn't the bot continue to follow links from the pages you are noindexing as well? It's all so confusing!

  18. Hi John,

    I enjoyed reading your article and you did make some very valid points. But to your statement:

    "After conversations with a number of SEOs who have also seen full Panda recoveries, I have realized that all of the recoveries I have heard of have been gradual. No one, to my knowledge, has recovered fully in the next update. "

    I felt that I have to add my experience:

    A website owner was hit by Panda on the 2nd of April this year, and I advised him on what had to be done. He followed all my instructions blindly, and on the 28th of April his traffic and rankings came back, better than ever before.

    That said, the 25-day Panda horror paid off for him. How about that?

    • John Doherty

      Hey John -
      That's pretty amazing and I haven't seen that happen with other sites. I guess this goes to show that maybe it can happen.

  19. I like this article; insightful and very inspiring. Thank you so much. Now I know what to do with my blog - it has actually been affected by the Farmer update since October 14, 2011. I hope my old traffic comes back soon.

  20. radu

    I've seen several Panda hits, including UK-based ones, that got back on track suddenly - not during the next Panda update, but at the point when the 'Panda-related issues' were covered/fixed (including some big UK brands that got hit by Panda 2.0).

    Based on your case study, I think it's safe to say that there are several ways to get out - progressive recovery or sudden recovery - but there is also the option to live with Panda and survive if you cannot fix the content because it is all bad :) You can improve rankings and even get back to pre-Panda traffic levels with other techniques... without going over to the dark side, of course :)

  21. Hey John! One question from me: can a Panda-dropped website only recover in the next Panda refresh?

