Beating the Panda - Diagnosing and Rescuing a Client’s Traffic

A few months ago, Distilled was approached by a large website in the UK that had been hit by Panda and also had some link-based penalties in place. At first I was not sure that they had been hit by Panda, but digging a little deeper made it clear. We then put together a strategy for them to implement that we hoped would bring their Google traffic out of the depths.

Today I want to show you how we diagnosed their traffic issues and then what steps we took to bring them out of Panda. I figure this post is timely since Panda 2.5 was just released last week and some sites took a huge hit again.

Diagnosing a Panda Attack

When the Panda update rolled through in February for the first time, people had no idea what to do. When the Panda update rolled through for the first time in the UK in early to mid April, people still had little idea what to do.

First Look: Google-Only Traffic

First, I segmented the traffic down to organic search traffic only. This is pretty easy to do:

Traffic Sources -> Sources -> Search -> Organic

This is a necessary first step for SEOs when you want to look at organic traffic from all search engines. Here is where to find it:

Google-Only Traffic

Next, get your traffic that is coming from Google. The default view on Search -> Organic is Keywords, so you need to change this to Source and then select the Google option. Here is where to find it:

Broaden your Date Range

Now you will want to broaden your date range so that you can see whether a drop occurred in organic traffic around the times that Panda hit. If you are looking for a good resource on the dates of the Panda updates, Dr Pete has the full list in the Google Algorithm Change History. For this client, I had to go back to the beginning of February and run the date range until the middle of July (when I got access to Analytics).

Pro tip: Graph the traffic by week, not by day or month.
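The same weekly rollup is easy to reproduce on an exported data set. Here is a minimal sketch, with made-up daily visit counts standing in for a real Analytics export:

```python
from datetime import date, timedelta

# Hypothetical daily visit counts, standing in for an Analytics export.
daily = {date(2011, 4, 1) + timedelta(days=i): 100 + (i % 7) * 10
         for i in range(28)}

def weekly_totals(daily_visits):
    """Roll daily visits up into ISO-week totals so the trend stands out."""
    weeks = {}
    for day, visits in daily_visits.items():
        year, week, _ = day.isocalendar()
        weeks[(year, week)] = weeks.get((year, week), 0) + visits
    return weeks
```

Graphed by week, the daily noise flattens out and a step change around an algorithm update becomes much easier to spot.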

If I segmented to just around the date when Panda first hit the UK (April 11th-ish), I saw this:

Not very helpful, right?

But when I drew the date range out from February 1 to July 16, I saw this:

Second Look: Referring Keywords and Landing Pages Receiving Traffic

The next way to get a good idea of whether your site was hit by Panda is to compare the number of referring keywords and the number of organic landing pages before and after the update. The easiest way I have found to do this is in Excel: export the two weeks before and the two weeks after the Panda update, then compare the number of landing pages and the number of keywords across the two periods. If both numbers have gone down, it's worth more investigation.
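The comparison itself is simple enough to sketch in code. This assumes each export is a list of (keyword, landing page, visits) rows; the rows here are invented for illustration:

```python
# A minimal sketch of the before/after comparison, assuming each export is a
# list of (keyword, landing_page, visits) rows; the rows here are invented.
before = [("london hotels", "/hotels/london", 40),
          ("cheap hotels in london uk", "/hotels/london", 12),
          ("hotel reviews", "/reviews", 9)]
after = [("london hotels", "/hotels/london", 31)]

def summarize(rows):
    """Count the distinct keywords and landing pages that drove any traffic."""
    keywords = {kw for kw, _, _ in rows}
    pages = {page for _, page, _ in rows}
    return len(keywords), len(pages)

kw_before, pages_before = summarize(before)
kw_after, pages_after = summarize(after)
# Both counts dropping at once is the signal worth investigating further.
both_dropped = kw_after < kw_before and pages_after < pages_before
```

Excel does the same job with a couple of COUNTA formulas; the point is that you are counting distinct keywords and distinct landing pages, not visits.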

Another quick way to compare the number of longtail referring keywords is to use the "4 or more words" Advanced Segment I talked about in a post on SEOmoz. Simply apply this Advanced Segment and take note of the number of keywords during a span of two or three weeks after the update, then switch to the two or three weeks before it. This should give you a pretty good idea if something is amiss.
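The segment's logic boils down to matching keywords with four or more words. One regex that expresses that (not necessarily the exact pattern the segment uses) looks like this:

```python
import re

# Matches a keyword of four or more whitespace-separated words:
# one word, then at least three more.
FOUR_PLUS_WORDS = re.compile(r"^\s*\S+(?:\s+\S+){3,}\s*$")

keywords = ["hotels", "cheap london hotels", "cheap hotels near hyde park"]
longtail = [kw for kw in keywords if FOUR_PLUS_WORDS.match(kw)]
```

If the longtail bucket shrinks sharply after an update while head terms hold steady, that is a classic Panda fingerprint.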

Here’s the Advanced Segment:


What I Recommended and What Worked

This specific client had a lot of individual pages for their site's offerings that contained very little content, as well as a lot of customer profiles with very little content on them. Added together, we are talking about tens of thousands of pages with thin or expired content, and sometimes even duplicate content.

My thought was that these pages were probably hurting them the most, so I recommended that they use the meta noindex tag to remove the thin pages from the index. We also recommended that they nofollow the links to these pages. Ultimately, it would have been best to remove these thin pages completely, but from a vendor perspective they needed to keep the pages live on the site. The client even went a step beyond our recommendations and no longer links to the pages at all, so they have dropped out of the index.
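With tens of thousands of pages involved, it is worth spot-checking that the noindex tag actually made it onto the thin pages. A small sketch using Python's standard HTML parser (the `is_noindexed` helper is mine, not a real library call):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "meta" and attr.get("name", "").lower() == "robots":
            self.directives.append(attr.get("content", "").lower())

def is_noindexed(html):
    """True if the page carries a robots meta tag containing noindex."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

thin_page = ('<html><head>'
             '<meta name="robots" content="noindex,nofollow">'
             '</head><body></body></html>')
```

Run a check like this over a sample of the thin URLs after the change goes live; it is an easy way to catch templates that were missed.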

For the expired content, we recommended that they 301 redirect it back to the next level up; as far as I know, this has happened. The client had done some testing with a rel=canonical tag to pass the link juice back to the level above, but we did not feel that this was a good use of rel=canonical.
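The actual 301s live in the server config, but the mapping we recommended, expired page back to the next level up, is simple to express. A sketch with a hypothetical `parent_url` helper:

```python
from urllib.parse import urlsplit, urlunsplit

def parent_url(url):
    """Map an expired page to the directory one level up,
    e.g. /offers/expired-widget -> /offers/."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/")
    parent = path.rsplit("/", 1)[0] + "/"
    return urlunsplit((parts.scheme, parts.netloc, parent, "", ""))
```

A generated list of old-URL to parent-URL pairs like this can then be turned into redirect rules in whatever format the server takes.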

We have also recommended some site-architecture changes that are in the process of being implemented, as well as changes to the sitemaps they declare to the search engines. Interestingly, even without these changes, the traffic has been on the up and up. Our hope is that with the architecture and meta content changes we have suggested, their rankings for head terms will increase as well.

Here are some statistics and graphs:

The week before Panda in UK
Organic Traffic: 3,759 visits
Traffic from keywords with 3+ words: 2,827
Traffic from keywords with 4+ words: 1,635
Landing pages from organic: 630

The week after Panda hit
Organic traffic: 2,708 visits (a 28% decrease)
Traffic from keywords with 3+ words: 2,165 (a 23% decline)
Traffic from keywords with 4+ words: 1,342 (an 18% decline)
Landing pages from organic: 560 (an 11% decline)
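A quick way to double-check drops like these from the raw counts:

```python
# Week-over-week declines computed from the raw visit counts.
before = {"organic": 3759, "kw3plus": 2827, "kw4plus": 1635, "pages": 630}
after = {"organic": 2708, "kw3plus": 2165, "kw4plus": 1342, "pages": 560}

def pct_decline(b, a):
    """Percentage drop from b to a, rounded to one decimal place."""
    return round((b - a) / b * 100, 1)

declines = {metric: pct_decline(before[metric], after[metric])
            for metric in before}
```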

Here are the two weeks compared:

Where are we now?

As soon as the client noindexed all of those pages and quit linking to them, the site's traffic started to rise. Interestingly, the gains have been incremental: even with new updates to the Panda algorithm, the recovery has been a steady upward trend in traffic, not a quick bounce back. A bit of recovery started in the middle of July, but the big gains came in August and September as these low-quality pages were removed from the index.

To get back to this level, we were brutal about noindexing as much thin content as possible. I recognize that this was a case where we could be that aggressive, since the content was not, for example, someone's blog posts. Even so, I think that as SEOs we need to educate webmasters who come to us for help that they may have to noindex and remove a lot of thin pages, or flesh those pages out considerably. And this will take work.

The recovery plan will obviously vary by website, but I do think we need to take Google's advice and get rid of as much thin content as possible. Even more so today than ever, we really should ask: does this content deserve to rank? Is it providing the most value? I understand that businesses are at stake here, but it also seems that Google is cracking down on those building their businesses off of gaming Google.

Here’s the traffic as of last week:

Organic traffic: 4,010
Traffic from keywords with 3+ words: 3,171
Traffic from keywords with 4+ words: 2,006
Landing pages from organic: 1,247


My Panda Theory

After conversations with a number of SEOs who have also seen full Panda recoveries, I have realized that every recovery I have heard of has been gradual. No one, to my knowledge, has recovered fully in the very next update. Since Panda is not a "penalty" in the traditional sense of the word (i.e. you were caught buying links and received a penalty until you removed them and groveled your way back into Google's good graces), I bet that your domain's trust in Google's eyes (i.e. your true PageRank, not Toolbar PageRank) goes down when you are caught by the low-quality Panda algorithmic filter. Therefore, you have to build it back up over time. As Google regains trust in your site, it will crawl it more and more and you'll get your traffic back.

That is, until you get caught again.