Replicate Google’s Panda Questionnaire: Processing

As we all know, Google’s Panda update aimed to improve the quality of results returned in search. As Will recently explained on SEOmoz’s Whiteboard Friday, questionnaires that collect users’ opinions of a page or a whole site can provide an outside view of page (or site) quality. These survey results can be a useful tool for persuading clients or site owners that changes need to be made, based on a number of quality factors.

Running the Survey:

The questions were chosen to gauge how users felt about the quality of a page. We collected the data using Smartsheet and used Mechanical Turk to recruit web users to answer the questions. For this survey, respondents didn’t need any particular demographic characteristics beyond being familiar with looking at websites. Administered this way, the questionnaire can be made available to any number of respondents, depending on time and budget. Respondents were asked to answer ‘yes’, ‘no’ or ‘don’t know’ to the following (a short sketch after the list shows one way to keep the question set as structured data):

  • Would you trust information from this website?
  • Is this website written by experts?
  • Would you give this site your credit card details?
  • Do the pages on this site have obvious errors?
  • Does the website provide original content or info?
  • Would you recognise this site as an authority?
  • Does this website contain insightful analysis?
  • Would you consider bookmarking pages on this site?
  • Are there excessive adverts on this website?
  • Could pages from this site appear in print?
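
If you plan to script the later steps, it helps to keep this question set as structured data so the responses downloaded from Mechanical Turk can be validated before counting. A minimal sketch in Python; the structure is our own convenience, not a format that Smartsheet or Mechanical Turk produces:

```python
# The ten fixed-response questions, kept as data so later steps can
# iterate over them and validate the answers respondents gave.
QUESTIONS = [
    "Would you trust information from this website?",
    "Is this website written by experts?",
    "Would you give this site your credit card details?",
    "Do the pages on this site have obvious errors?",
    "Does the website provide original content or info?",
    "Would you recognise this site as an authority?",
    "Does this website contain insightful analysis?",
    "Would you consider bookmarking pages on this site?",
    "Are there excessive adverts on this website?",
    "Could pages from this site appear in print?",
]

VALID_RESPONSES = {"yes", "no", "don't know"}

def is_valid(response: str) -> bool:
    """True if a response matches one of the allowed fixed answers."""
    return response.strip().lower() in VALID_RESPONSES
```
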
Getting Answers:

The responses were downloaded as a CSV file and processed using Excel. As these were fixed-response questions, we did a frequency count of the responses to each question. The quickest way to do this in Excel is to use a pivot table (Insert > Pivot Table) for each question: select the data and drag the question into both the Row Labels and Values areas in the Pivot Table Field List.
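
If you would rather script this step than build a pivot table per question, the same frequency counts can be produced with pandas. This is a sketch under assumptions: the CSV file name is hypothetical, and the file is assumed to hold one row per respondent and one column per question.

```python
import pandas as pd

# Assumed layout: one row per respondent, one column per question.
responses = pd.read_csv("survey_responses.csv")

# Count the frequency of each answer per question -- the equivalent of
# dragging a question into both Row Labels and Values in a pivot table.
counts = responses.apply(pd.Series.value_counts).fillna(0).astype(int)
print(counts)
```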

This summarised data can then be copied out and used to calculate the percentage of respondents giving each answer to each question.
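
Continuing the sketch above, the percentage step is a one-liner: divide each question’s counts by its column total (this assumes every respondent answered every question; adjust if there are blanks).

```python
# Convert raw counts to the percentage of respondents per answer.
# counts.sum() totals each question's column, so each column sums to 100.
percentages = counts / counts.sum() * 100
print(percentages.round(1))
```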

Presenting the Results to Encourage Action:

The processed results should be presented in a way that clearly identifies where there are problems. The best way to do this is a table with some conditional formatting. The coloured cell for each question shows how the majority of respondents answered and whether that is a good or a bad sign for perceived quality. Check through the responses and consider what they mean as measures of quality: is a majority response of ‘yes’ a good thing given the question? Green shows where the majority response is positive for quality, yellow shows where there is little or no difference between responses, and red shows where the majority response indicates an area of concern. For example:

[Example: a colour-coded table of responses to each question]

Colour coding the responses makes it easy to see at a glance where there are problems. In the example above, 70% of respondents answered ‘No’ to ‘Would you give this site your credit card details?’: potentially a big problem for an ecommerce site, where taking payment is the whole point. A stacked bar or bar chart is inappropriate for these results, as a particular answer (‘yes’ or ‘no’) doesn’t consistently indicate an area performing well or an area of concern. The table can be accompanied by notes explaining what the responses mean and suggesting follow-up actions.
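
One way to encode that majority check, sketched here rather than taken from the original spreadsheet: map each question to the answer that is good news for quality, then compare the ‘yes’ and ‘no’ percentages. The 5-point margin for a yellow cell is an arbitrary choice, as is the illustrative ‘yes’ share in the usage example.

```python
# For each question, the answer that is positive for quality. Note that
# 'obvious errors' and 'excessive adverts' are positive when answered 'no'.
POSITIVE_ANSWER = {
    "Would you trust information from this website?": "yes",
    "Is this website written by experts?": "yes",
    "Would you give this site your credit card details?": "yes",
    "Do the pages on this site have obvious errors?": "no",
    "Does the website provide original content or info?": "yes",
    "Would you recognise this site as an authority?": "yes",
    "Does this website contain insightful analysis?": "yes",
    "Would you consider bookmarking pages on this site?": "yes",
    "Are there excessive adverts on this website?": "no",
    "Could pages from this site appear in print?": "yes",
}

def classify(question: str, pct_yes: float, pct_no: float, margin: float = 5.0) -> str:
    """Traffic-light colour for one question's summarised results."""
    if abs(pct_yes - pct_no) <= margin:
        return "yellow"  # no clear majority either way
    majority = "yes" if pct_yes > pct_no else "no"
    return "green" if majority == POSITIVE_ANSWER[question] else "red"

# The credit card example above: 70% answered 'no' (the 'yes' share here
# is illustrative), and the quality-positive answer is 'yes', so: red.
print(classify("Would you give this site your credit card details?", 25.0, 70.0))
```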

Collecting users’ opinions is a fast, easy, and inexpensive way of getting authentic feedback from outside your site. It is a potentially powerful tool when trying to bring about change to problematic pages, and it may help to improve quality overall.