On 14th and 15th October, we made our annual visit to The Brewery in London for our UK edition of SearchLove. This year’s conference was our most successful yet, not only in terms of the number of folks attending but also with regard to the high calibre of speakers who joined us over the jam-packed two days to share their invaluable industry insights.
This post is a quick-fire summary of the knowledge our speakers had to share, plus their slides & a few photos from across the conference. All sessions in their entirety will be available with a DistilledU membership in a couple of weeks' time. And don’t forget that if you feel you missed out this year, make sure you sign up to our mailing list to be the first in the know for next year’s conference! Are you ready? Let’s get started!
Marie Haynes - ‘Practical Tips For Improving E-A-T’
Google’s algorithms are increasingly considering E-A-T components (expertise, authority and trust) when evaluating sites. Marie shared why and how to improve E-A-T so that you have the best chance at winning in the current and future search landscape.
- One of the most important things to focus on is the accuracy of the information on your site. This is especially important if your pages are primarily YMYL (‘your money or your life’, in other words, content that can affect someone’s health, safety, financial stability, etc.).
- Google’s quality raters use the quality raters guidelines as their textbook. If you take a look at the guidelines, you can get a better idea about what Google is actually looking at when they’re evaluating E-A-T components. Try doing a CTRL+F for your industry to see what they suggest for your vertical.
- There are some practical things you can do on your site to help Google understand that you’re trustworthy and authoritative:
- Have contact information available.
- If you’re eCommerce, ensure that your refund policy and customer service information is clearly accessible.
- Make sure your site is secure (HTTPS)
- Have correct grammar. How your page reads is important!
- Make sure that the information on your site doesn’t contradict the scientific consensus on a topic. Cite all sources as necessary.
Sarah Gurbach - ‘Using Qualitative Data To Make Human-Centered Decisions’
SEOs have a huge amount of data to work with, but often, the data that gets overlooked is that which comes directly from the humans who are driving all of our data points.
By performing qualitative research in tandem with quantitative, we can get insights on the actual human wants, barriers, and confusions that drive our customers to make their decisions and move through the funnel.
Sarah’s steps to conducting qualitative research include:
- Defining your objective. Write it as a question. Keep it specific, focused and simple.
- Asking open-ended questions to customers to define the personas you should be targeting. Sarah recommends surveys of 10 questions to 5 customers that should only take around 20 minutes each. More than this will likely be redundant.
- Actually observing our users to figure out what and how they’re searching and moving through the funnel.
- You can then quantify this data by combining it with other data sources (i.e. PPC data, conversion data, etc.).
If you don’t have time to conduct surveys, then you can go to social media and ask a question!
Want more on questions you can ask your customers? Check out this resource from Sarah.
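As a quick sketch of that quantifying step (all themes, personas and numbers below are invented for illustration), hand-coded survey responses can be counted up and set alongside a quantitative source:

```python
from collections import Counter

# Hypothetical open-ended survey responses, hand-coded into themes.
responses = [
    {"persona": "first-time buyer", "theme": "price uncertainty"},
    {"persona": "first-time buyer", "theme": "delivery speed"},
    {"persona": "returning customer", "theme": "stock availability"},
    {"persona": "first-time buyer", "theme": "price uncertainty"},
    {"persona": "returning customer", "theme": "price uncertainty"},
]

# Step 1: quantify the qualitative data - how often is each theme raised?
theme_counts = Counter(r["theme"] for r in responses)

# Step 2: set it alongside a quantitative source, e.g. conversion rate by
# landing-page theme from PPC data (values made up for this sketch).
ppc_conversion_rate = {
    "price uncertainty": 0.021,
    "delivery speed": 0.035,
    "stock availability": 0.028,
}

# Themes ranked by how often customers raise them, with the quantitative
# context alongside - a starting point for prioritising changes.
for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned {count}x, PPC CVR {ppc_conversion_rate[theme]:.1%}")
```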
Greg Gifford - ‘Doc Brown’s Plutonium-Powered SEO Playbook’
Greg delivered an entertaining, informative and best of all highly actionable talk on local SEO. If you have physical locations for your business, you should not be neglecting your local SEO strategy! It’s important to remember that there is a different algorithm for local SEO compared to the traditional SERP, and therefore you need to approach local SEO slightly differently.
Greg’s key tips to nailing your local SEO strategy are as follows:
- Links are weighted differently for local SEO! Make sure you acquire local links - quality, and whether these are follow or nofollow, matters far less than in the standard SERP. The key is to make sure your links are local - get your hands dirty with some old-school marketing and get out into your local community to build links from churches, businesses and community websites in your area.
- Content needs to actually be about your business and local area. If you could use your website copy for a site in another area, you’re doing it wrong. Also, make sure that your blog is a local destination - if your content is more localised than your competitors’, you’ll be one step ahead.
- Citations are also important, but you only need a handful! Make sure you link to your website from places that customers will actually see, such as your Facebook, Twitter and other social profiles. Ensure your business information is accurate across platforms.
- Reviews need to be strong across platforms - there’s no use having excellent reviews in Google My Business, and then bad reviews on TripAdvisor!
- Google My Business is your new homepage, so make sure you give it some attention!
- Bear in mind that users can not only ask questions but also answer them - make sure you create your own Q&A here and upvote your answers so that they appear at the top.
- Also be aware that clicks from GMB are recorded as direct! If you use UTM tracking parameters, then you can update the tracking so that you can attribute it correctly to organic.
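That last tip is easy to action. Here’s a small sketch of appending UTM parameters to the website URL you give Google My Business (the parameter values below are one common convention, not an official standard - use whatever scheme your analytics setup expects):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def add_gmb_tracking(url: str) -> str:
    """Append UTM parameters so that clicks from a Google My Business
    listing are attributed to organic in analytics, not to direct."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query.update({
        "utm_source": "google",
        "utm_medium": "organic",
        "utm_campaign": "gmb-listing",  # hypothetical campaign name
    })
    return urlunsplit(parts._replace(query=urlencode(query)))

print(add_gmb_tracking("https://www.example.com/"))
```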
Luke Carthy - ‘Finding Powerful CRO and UX Opportunities Using SEO Crawlers’
Luke Carthy discussed the importance of not always striving to drive more traffic, but making the most of the traffic you currently do have. More traffic does not necessarily equal more conversions! He explored different ways to identify opportunities using crawl software and custom extraction, and to use these insights to improve conversion rates on your website.
His top recommendations include:
- Look at the internal search experience of users - do they get a ‘no results found’ page? What does this look like - does it provide a good user experience? Does it guide users to alternative products?
- Custom extraction is an excellent way to mine websites for information (your own and especially competitors!)
- Consider scraping product recommendations:
- What products are competitor sites recommending? These are often based on dynamic algorithms, so provide a good insight into what products customers buy together
- Also pay attention to the price of the recommended products vs. the main product - recommended items are often more expensive, to encourage users to spend more
- Also consider scraping competitor sites for prices, reviews and stock
- Are you cheaper than competitors?
- Do competitors have popular products that you don’t have? What are their best and worst-performing products? Often category or search results pages are ordered by best-sellers, and you can take advantage of this by mining this information
- To deepen your analysis, plugin other data such as log file data, Google Analytics, XML sitemaps and backlinks to try to understand how you can improve your current results, and to obtain comprehensive insights that you can share with the wider team
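As a toy illustration of custom extraction (the markup, class names and prices below are invented - real competitor sites will differ, and crawlers like Screaming Frog let you do this with CSS selectors or XPath), Python’s built-in HTML parser can pull recommended product names and prices out of a page:

```python
from html.parser import HTMLParser

# Invented snippet of a competitor's product-recommendation markup.
HTML = """
<div class="recommended">
  <div class="product"><span class="name">Widget Pro</span><span class="price">£24.99</span></div>
  <div class="product"><span class="name">Widget Case</span><span class="price">£9.99</span></div>
</div>
"""

class RecommendationExtractor(HTMLParser):
    """Collect (name, price) pairs from spans inside recommended products."""

    def __init__(self):
        super().__init__()
        self._field = None   # which span we are currently inside, if any
        self._current = {}   # fields gathered for the product in progress
        self.products = []   # extracted (name, price) tuples

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if tag == "span" and cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            if len(self._current) == 2:  # both name and price seen
                self.products.append((self._current["name"], self._current["price"]))
                self._current = {}
            self._field = None

extractor = RecommendationExtractor()
extractor.feed(HTML)
print(extractor.products)
```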
Andi Jarvis - ‘The Science of Persuasion’
Human psychology affects consumers’ buying behavior tremendously. Andi covered how we as SEOs can better understand these factors to influence our SEO strategy and improve conversions.
- Scarcity: you can create the impression of scarcity even when it doesn’t exist, by creating scarcity of time to drive demand. An example of this is how Hotels.com creates a sense of urgency by including things like “Only 4 rooms left!” Test and learn with different time scales (hours, days, weeks or more) to see what works best for your product offering.
- Authority: building authority helps people understand who they should trust. When you’ve got authority, you are more likely to persuade people. You can build authority simply by talking about yourself, and by labelling yourself as an authority in your industry.
- Likeability: The reason that influencer marketing works is due to the principle of liking: we prefer to buy from people who we are attracted to and who we aspire to be. If we can envision ourselves using a product or service by seeing ourselves in its marketing, then we are more likely to convert.
- Pretty Little Thing has started doing this by using two different models to showcase the same clothing, increasing the likelihood that users identify with the people modelling it
- Purpose: people are more likely to buy when they feel they are contributing to a cause. For example, Pampers partners with Unicef, so consumers feel like they are doing a good deed when they buy Pampers products. This is known as cause-based or purpose-based marketing.
- Social proofing: It’s been known for a long time that people are influenced by the behaviour of others. In the early 1800s, theatres would pay people to clap at the right moments in a show, to encourage others to join in. Similarly today, if a brand has several endorsements from celebrities or users, people are more likely to purchase their products.
- Reciprocation: Offering customers a free gift (even if small) can have a positive impact on re-purchase rates. Make sure though that you evolve what you do if you have a regular purchase cycle - offer customers different gifts so that they don’t know what to expect, otherwise the positive effect wears off.
Heather Physioc - ‘Building a Discoverability Powerhouse: Lessons From Integrating Organic Search, Paid Search & Performance Content’
Organic search, paid search and performance content all impact discoverability. Yet, in many organisations, these teams are siloed. Heather discussed tips for integrating and collaborating between teams to build a “discoverability powerhouse”.
- There are definite obstacles to integrating marketing teams like paid, social, or organic.
- Merging teams too tightly can actually diminish agility. Depending on what marketing needs are at different times, allow for independence of teams when it’s necessary to get a job done.
- Every team has their own processes for getting things done. Don’t try to overhaul everything at once. Talk with each other to see where integration makes the most sense.
- There are also clear wins when you’re able to collaborate effectively.
- When you’re in harmony with each team, you can more seamlessly find opportunities for discoverability. This can ultimately lead to up-sells or cross-sells.
- By working together, we can share knowledge more deeply and have richer data. We can then leverage this to capture as much of the SERP as possible.
- Cross-training teams can help build empathy and trust. When separate teams gain an understanding of how and why certain tasks (i.e. keyword research) are done, it can help everyone work better together and streamline processes.
Robin Lord - ‘Excel? It Would Be Easier To Go To Jupyter’
Robin, a senior consultant here at Distilled, demonstrated the various shortcomings of Excel and showed an easier, repeatable, and more effective way to get things done - using Jupyter Notebooks and Python.
Below we outline Robin’s main points:
- Excel and Google Sheets are very error-prone - especially if you’re dealing with larger data sets! If you need to process a lot of data, then you should consider using Jupyter Notebooks, as it can handle much bigger data sets (think: analysing backlinks, doing keyword research, log file analysis)
- Jupyter Notebooks are reusable: if you create a Jupyter script to do any repeatable task (i.e. reporting or keyword research) then you can reuse it. This makes your life much easier because you don’t have to go back and dissect an old process.
- Jupyter allows you to use regex. This gives it a huge advantage over Excel because it is far more efficient at accounting for misspellings - giving you, for example, a far more accurate chance of capturing all of your branded search query permutations.
- Jupyter allows you to write notes and keep every step in your process ordered. This means that your methodology is noted and the next time you perform this task, you remember exactly the steps you took. This is especially useful for when clients ask you questions about your work weeks or months down the line!
- Finally - Jupyter notebooks allow us to get answers that we can’t get from Excel. We’re able to not only consider the data set from new angles, but we also have more time to go about other tasks, such as thinking about client strategy or improving other processes.
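To illustrate the regex point with a hypothetical brand (“distilled”, with an invented set of permutations), a single pattern can sweep up misspellings that exact-match lookups in Excel would miss:

```python
import re

# One pattern covering several permutations of a hypothetical brand name:
# an e/i swap, a dropped vowel, and a single or doubled "l". Extend it as
# you spot new variants in your query data.
branded = re.compile(r"\bd[ei]sti?l{1,2}e?d\b", re.IGNORECASE)

queries = [
    "distilled seo training",  # exact
    "destilled conference",    # e/i swap
    "distiled searchlove",     # single l
    "python for seo",          # not branded
]

for q in queries:
    label = "brand" if branded.search(q) else "non-brand"
    print(f"{label:9} {q}")
```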
Robin has so many slides it breaks Slideshare. Instead, you can download his slides from Dropbox.
Jes Scholz - ‘Giving Robots An All Access Pass’
Jes Scholz uses the analogy of a nightclub to explain how Googlebot interacts with your website. The goal? To become part of the exclusive “Club Valid”. Her main points are outlined below:
- As stated by John Mueller himself, “crawl budget is overrated - most sites never need to worry about this”. So instead of focusing on how much Google is crawling your site, you should be most concerned with how Google is crawling it
- Status codes are not good or bad - there are right codes and wrong codes for different situations
- In a similar vein, duplicate content is not “bad”, in fact, it’s entirely natural. You just need to make sure that you’re handling it correctly
Rand Fishkin - ‘The Search Landscape in 2019’
As the web evolves, it’s important to evaluate the areas you could invest in carefully. Rand explored the key changes affecting search marketers and how SEOs can take these changes into account when determining strategy.
- Should you invest in voice search? It’s probably a bit too early. There is little difference in the results you get from a voice search vs. a manual search.
- Both mobile and desktop are big - don’t neglect one at the expense of the other!
- The zero-click search is where the biggest search growth is happening right now. It now accounts for about half (48.96% in the US) of all searches!
- If you could benefit from answering zero-click searches, then you should prepare for that. You can determine whether you’d benefit by evaluating the value in ranking for a particular query without necessarily getting traffic.
- With recent changes to Google’s search appearance, ads have become more seamless in the SERP. This has led to paid click-through rate rising significantly. However, if history is any guide, it will probably decline slowly until the next big change to search.
- As Google’s algorithms evolve, you’ll likely receive huge ranking benefits from focusing on growing authority signals (E-A-T).
Check out Rand’s slides to see where you should be spending your time and money as the search landscape evolves.
Emily Potter - ‘Can Anything in SEO Be Proven? A Deep-Dive Into SEO Split-Testing’
Split testing SEO changes allows us to say with confidence whether or not a specific change hurts or helps organic traffic. Emily discussed various SEO split tests she’s run and potential reasons for their outcomes.
- The main levers for SEO tend to boil down to:
- 1. Improving organic click-through-rate (CTR)
- 2. Improving organic rankings of current keywords
- 3. Ranking for new keywords
- Split testing changes that we want to make to our site can help us to make business cases, rescue sessions, and gain a competitive advantage.
- Determining which of the three levers caused a particular test to be positive or negative is challenging: because they all impact each other, the data is noisy. Measuring organic sessions relieves us of this noise.
- Following “best practices” or copying what your competitors are doing will not always result in wins. Testing shows you what actually works for your site. For example, adding the number of products to your titles or structured data for breadcrumbs might actually negatively impact your SEO, even if it seems like everyone else is doing it.
Check out Emily’s slides to see more split test case studies and learnings!
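As a much-simplified sketch of the underlying idea (real SEO split tests, including the ones Emily described, use forecasting rather than a raw before/after comparison, and all numbers below are invented), comparing how variant and control page buckets move gives an uplift estimate:

```python
from statistics import mean

# Daily organic sessions for pages with the change applied (variant)
# and similar pages left alone (control), before and after launch.
variant_before, variant_after = [1200, 1150, 1230], [1380, 1400, 1350]
control_before, control_after = [980, 1010, 990], [1000, 990, 1020]

# Relative change in each bucket, then the difference between them -
# the control strips out seasonality and site-wide trends.
variant_change = mean(variant_after) / mean(variant_before) - 1
control_change = mean(control_after) / mean(control_before) - 1
estimated_uplift = variant_change - control_change

print(f"Estimated uplift from the change: {estimated_uplift:+.1%}")
```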
Jill Quick - ‘Segments: How To Get Juicy Insights & Avoid The Pips!’
In her excellent talk, Jill highlights how “average data gives you average insights”, and discusses the importance of segmenting your data to gain deeper insights into user behaviour. While analytics and segments are awesome, don’t become overwhelmed with the possibilities - focus on your strategy and work from there.
Jill’s other tips include:
- Adding custom dimensions to forms on your website allows you to create more relevant and specific data segments
- For example, if you have a website in the education sector, you can add custom dimensions to a form that asks people to fill in their profession. You can then create a segment where custom dimension = headteacher, and you can then analyse the behaviour of this specific group of people
- Build segments that look at your best buyers (people who convert well) as well as your worst customers (those who spend barely any time on site and never convert). You can learn a lot about your ideal customer, as well as what you need to improve on your site, by doing this.
- Use your segments to build retargeting lists - this will usually result in lower CPAs for paid search, helping your PPC budget go further
- Don’t forget to use advanced segments (using sequences and conditions) to create granular segments that matter to your business
- You can use segments in Google Data Studio, which is awesome! Just bear in mind that in Data Studio you can’t see if your segment data is sampled, so it’s best to go into the GA interface to check
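As a minimal sketch of Jill’s education-sector example (the rows, professions and numbers are invented), segmenting by a custom dimension is just a filter-and-aggregate over your exported analytics data:

```python
# Hypothetical analytics export rows; "profession" is a custom dimension
# captured from a form, as in Jill's education-sector example.
rows = [
    {"profession": "headteacher", "sessions": 40, "conversions": 6},
    {"profession": "headteacher", "sessions": 25, "conversions": 4},
    {"profession": "teacher", "sessions": 80, "conversions": 3},
    {"profession": "parent", "sessions": 120, "conversions": 2},
]

def segment_cvr(rows, dimension, value):
    """Conversion rate for the segment where a custom dimension matches."""
    seg = [r for r in rows if r[dimension] == value]
    sessions = sum(r["sessions"] for r in seg)
    conversions = sum(r["conversions"] for r in seg)
    return conversions / sessions if sessions else 0.0

overall = sum(r["conversions"] for r in rows) / sum(r["sessions"] for r in rows)
headteachers = segment_cvr(rows, "profession", "headteacher")
print(f"Overall CVR: {overall:.1%}, headteacher segment: {headteachers:.1%}")
```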
If you want to hear more about Jill's session, she's written a post to supplement her slides.
Rory Truesdale - ‘Using The SERPs to Know Your Audience’
It can be easy to get lost in evaluating metrics like monthly search volume, but we often forget that for each query, there is a person with a very specific motivation and need. Rory discussed how we can utilise Google’s algorithmic re-writing of the SERP to help identify those motivations and more effectively optimise for search intent - the SERPs give us amazing insight into what customers want!
- Google rewrites the meta description displayed in the SERP 84% of the time (it thinks it’s smarter than us!). However, we can use this rewrite data to our advantage.
- The best ways to get SERP data are crawling SERPs in Screaming Frog, using the Scraper API or Chrome extension, or using “Thruuu” (a SERP analysis tool), and then analysing the output in Jupyter Notebooks.
- Scraping SERPs, product reviews, comments, or Reddit forums can be really powerful: it gives you a data source that can reveal insights about what your customers want. You can then optimise the content on your pages to appeal to them.
- If you can get a better idea about what language and tone resonates with users, you can incorporate it into CTAs and content.
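One hedged way to quantify the rewrite rate on your own pages (the sample pairs below are invented - a real analysis would join your crawled meta descriptions with scraped SERP snippets) is to compare each meta description with the snippet Google actually displayed:

```python
from difflib import SequenceMatcher

# Invented pairs of (page meta description, snippet shown in the SERP).
pairs = [
    ("Buy ballet shoes online with free UK delivery.",
     "Buy ballet shoes online with free UK delivery."),
    ("Buy ballet shoes online with free UK delivery.",
     "Discover the best ballet shoes for beginners, with sizing advice from our fitting experts."),
    ("Compare pointe shoes from leading brands.",
     "Compare pointe shoes from leading brands."),
]

def is_rewritten(meta, snippet, threshold=0.8):
    """Treat the snippet as rewritten when it isn't close to the meta.

    The 0.8 similarity threshold is an assumption - tune it on your data.
    """
    return SequenceMatcher(None, meta, snippet).ratio() < threshold

rewritten = sum(is_rewritten(m, s) for m, s in pairs)
print(f"Google rewrote {rewritten}/{len(pairs)} descriptions")
```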
Check out Rory’s slides as well as the Jupyter notebook he uses to analyse SERP data.
Miracle Inameti Archibong - ‘The Complete Guide To Actionable Speed Audits: Getting Your Developer To Work With You’
It can be a huge challenge to get devs to implement our wishlist of SEO recommendations. Miracle discussed the practical steps to getting developers to take your recommendations seriously.
- If you take some time to understand the Web Dev roles at your company, then it will help you better communicate your needs as an SEO and get things rolled out. You can do this by:
- Learning the language that they’re using. Do some research into the terminology as well as possible limitations of your ask. This will make you more credible and you’re more likely to be taken seriously.
- A team of developers may have different KPIs than you. It may be beneficial to use something like revenue as a way to get them on board with the change you want to make.
- Try to make every ask more collaborative rather than instructive. For example, instead of simply presenting “insert this code,” try “here’s some example code, maybe we can incorporate x elements. What do you think?” A conversation may be the difference in effecting change.
- Prioritising your requests in an easily readable way for web dev teams is always a good idea. It will give them the most information on what needs to get done in what timeline.
Faisal Anderson - ‘Spying On Google: Using Log File Analysis To Reveal Invaluable SEO Insights’
Log files contain hugely valuable insight on how Googlebot and other crawlers behave on your site. Faisal uncovered why you should be looking at your logs as well as how to analyse them effectively to reveal big wins that you may have otherwise been unable to quantify.
- Looking at log files is a great way to see the truest and freshest data on how Google is crawling your site. It’s most accurate because it’s the actual logs of how Google (and any other bot) is crawling your website.
- Getting log file data can be tricky, so it’s helpful to ask devs about your hosting setup (if your server uses load balancing, the log files may be split between various hosts). You’ll want to get 6 months of data if you can.
- The three main things to evaluate when you’re analysing log files:
- Crawl behaviour: look at your most and least crawled URLs, and at crawl frequency by depth and internal links
- Budget waste: find low-value URLs (faceted navigation, query parameters, etc.) - there are likely some subdirectories you want crawled more than others
- Site health: look for inconsistent server responses
- Using Jupyter to do log file analysis is great because it’s reusable and you’ll be able to use it again and again.
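As a small sketch of this kind of analysis (the log lines below are invented, and verifying real Googlebot traffic should also include a reverse-DNS check rather than trusting the user agent string), a few lines of Python can surface which URLs Googlebot hits most:

```python
import re
from collections import Counter

# Invented lines in combined log format.
LOG_LINES = [
    '66.249.66.1 - - [15/Oct/2019:09:12:01 +0000] "GET /widgets HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [15/Oct/2019:09:12:05 +0000] "GET /widgets?colour=red HTTP/1.1" 200 5014 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [15/Oct/2019:09:12:09 +0000] "GET /widgets HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

# Pull out the request URL, status code, and user agent from each line.
LINE_RE = re.compile(
    r'"(?P<method>\w+) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$'
)

googlebot_hits = Counter()
for line in LOG_LINES:
    m = LINE_RE.search(line)
    if m and "Googlebot" in m.group("ua"):
        googlebot_hits[m.group("url")] += 1

# Parameterised URLs showing up here hint at crawl budget waste.
print(googlebot_hits.most_common())
```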
Dr Pete Myers - ‘Scaling Keyword Research: More Isn’t Better’
Dr Pete Myers discussed how more is not better when it comes to keyword research! Ditch the thousands of keywords and instead focus on a smaller set of keywords that actually matter for you or your clients. Below are his top tips:
- Pete has developed a simple metric called RankVol to help determine the importance of a keyword
- RankVol = 1 / (rank x square root of volume)
- Using this metric is better than sorting by search volume, as often the highest volume keywords that a site is appearing for are not the most relevant
- Lots of data in keyword research can be irrelevant. Using John Lewis as an example:
- 9% of the keywords John Lewis ranks for are misspellings
- Almost 20% of keywords they rank for are very close variants (plural vs. singular, for example)
- Dr Pete provides a short script in his deck to group keywords to help strip out noise in your data set
- If sitelinks appear for your website, Google thinks you’re a brand
- A new SERP feature (‘best of’ carousel) is appearing in the US, and will likely be rolled out into Europe soon
- This feature takes you to a heavily paid SERP, with lots of ads (some well-disguised!)
- If a keyword has a heavily paid SERP, you should probably not bother trying to rank for it, as the pay-off will be small
- ‘People also ask’ is on 90% of searches - be sure to try and take advantage of this SERP space
- To summarise, perception is everything with keyword research - make sure you filter out the noise!
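RankVol is easy to compute yourself. Here’s a quick sketch with invented keyword data (not Dr Pete’s actual script):

```python
from math import sqrt

# Dr Pete's RankVol metric: 1 / (rank x square root of volume).
# Keywords, ranks and volumes below are made up for illustration.
keywords = [
    {"keyword": "sofas", "rank": 9, "volume": 450000},
    {"keyword": "john lewis sofas", "rank": 1, "volume": 12000},
    {"keyword": "mid century sofa", "rank": 3, "volume": 8000},
]

for kw in keywords:
    kw["rankvol"] = 1 / (kw["rank"] * sqrt(kw["volume"]))

# Sorting by RankVol surfaces keywords that are both relevant (ranking
# well) and meaningful (decent volume), not just the huge-volume terms.
for kw in sorted(keywords, key=lambda k: k["rankvol"], reverse=True):
    print(f"{kw['keyword']:20} rank {kw['rank']:>2}  vol {kw['volume']:>7}  RankVol {kw['rankvol']:.6f}")
```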
Lindsay Wassell - ‘Managing Multinational & Multilingual SEO in Motion’
Lindsay covered the many challenges involved in handling migrations involving multiple international site variants. Her key points are highlighted below:
- Ask your dev team to make sure it’s possible to implement hreflang via XML sitemaps or on-page; then if there are problems implementing one method, you have another as a fall-back option
- When deciding site structure and where international sites should be located (subfolder? Subdomain? ccTLD?), bear in mind that there are no one-size-fits-all solutions. It may be best to have a mixture, depending on each market.
- If you have hreflang relationship issues, Lindsay advises to use Google Sheets to manage hreflang mappings, in combination with a script that can automatically generate XML sitemaps (link provided in her deck)
- In order to encourage more people in your organisation to understand the importance of SEO and to make it a priority, highlight statistics such as traffic levels and revenue coming from organic search
- Also keep in mind that every department has a wish list when it comes to a migration! Be tactical and tack onto other people’s wishlists to get SEO items implemented
- As a final tip - check redirects before going live, as often dev teams will say it’s under control, and then there can be problems at the last minute
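Here’s a minimal sketch of that Sheets-plus-script idea (not Lindsay’s actual script - the domains and structure are invented, and a real sitemap also needs the XML declaration, the urlset namespaces, and an x-default entry where appropriate). Each URL must list every alternate, including itself:

```python
# Hypothetical mapping of hreflang value -> localised URL, as you might
# maintain it in a Google Sheet and export for this script.
hreflang_map = {
    "en-gb": "https://www.example.com/uk/",
    "en-us": "https://www.example.com/us/",
    "de-de": "https://www.example.com/de/",
}

def sitemap_entries(mapping):
    """Build <url> blocks where every page cross-references all variants."""
    lines = []
    for url in mapping.values():
        lines.append("  <url>")
        lines.append(f"    <loc>{url}</loc>")
        for lang, alt in mapping.items():
            lines.append(
                f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{alt}"/>'
            )
        lines.append("  </url>")
    return "\n".join(lines)

print(sitemap_entries(hreflang_map))
```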
Stacey MacNaught - ‘Actioning Search Intent - What To Do With All That Data’
By analysing search intent, you can gain a ton of really insightful data. Stacey discussed how you can utilise all of this data to optimise your site for organic search and ultimately increase revenue and traffic.
- Traditionally, search intent is categorised broadly as navigational, informational, and transactional. However, it’s often unclear where things are categorised because sometimes keywords are really ambiguous. Often you can break these categories down into more specific categories.
- In terms of targeting keywords on your site, look out for opportunities where you may not be delivering the right content based on what query you’re targeting.
- For example, if you’re targeting an informational keyword with a transactional result, you’re not going to rank. This can be an opportunity for you to create the kind of page that will rank for a select query. If the phrase is “best ballet shoes” and the results are informational pages, then you shouldn’t be serving a transactional result.
- If you can be objective about the topic at hand and you have someone qualified to write that content, then you should definitely do it.
- If your rankings drop but revenue is unaffected, it’s likely you’ve lost rankings on informational keywords
- Don’t assume that users will come back of their own accord - work with PPC and get them to retarget to users who have read your content
- Build out different audience lists according to the types of content or topics that users have been reading
- Build out separate PPC campaigns for this so you can easily monitor results
- Stacey saw CPA fall by 34% when she did this for a healthcare site
- To generate content ideas, talk to the sales and customer service teams to find out what users are asking, then build content around it
- You can also use Google Forms to survey previous customers to find out what drove their purchase
Will Critchlow - ‘Misunderstood Concepts at the Heart of SEO - Get An Edge By Understanding These Areas’
Most things in SEO can be boiled down to technical accessibility, relevance, quality, and authority. Or: can it be crawled, does it meet a keyword need, and is it trustworthy? However, some of the foundational elements of SEO are misunderstood.
- Regarding crawlability, it’s important to understand how setting directives in robots.txt will impact your site if handled incorrectly.
- Robots.txt directives do not cascade. For example, if you set a specific directive to disallow Googlebot from /example, that is the only group it will follow. Even if you specify elsewhere in the file that * (all user agents) is disallowed from /dont-crawl, Googlebot will only follow its own group’s directive not to crawl /example, and will still be able to crawl /dont-crawl.
- The Google documentation, robots.txt checker in GSC, and the open source parser tend to disagree on what is allowed and disallowed. So, you’ll need to do some testing to ensure that the directives you’re setting are what you intended.
- We often have a lot of intuition about how things like PageRank work, but too many of our recommendations are based on misconceptions about how authority flows
- There are some huge changes coming to major browser cookie handling. The cookie window will be shorter, which means that a lot of traffic that’s currently classified as organic will be classified as direct. Understanding the language around the changes that are happening is, and will be, important
- There are common misconceptions too about the meaning of ‘long tail keywords’
- 50% of Twitter respondents incorrectly think it means that there are many words in a query
- 40% understand the correct meaning, which is that they are keywords with low search volume
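The non-cascading point is easy to verify with Python’s built-in robots.txt parser, which follows the same group-matching rule (the paths here mirror the example above):

```python
from urllib import robotparser

# A robots.txt with a Googlebot-specific group and a wildcard group.
rules = """
User-agent: Googlebot
Disallow: /example

User-agent: *
Disallow: /dont-crawl
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot matches its own group, so only that group's rules apply -
# the wildcard /dont-crawl rule does not cascade down to it.
print(rp.can_fetch("Googlebot", "/example"))      # blocked by its group
print(rp.can_fetch("Googlebot", "/dont-crawl"))   # allowed - * is ignored
print(rp.can_fetch("SomeOtherBot", "/dont-crawl"))  # * applies here
```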
That's it for our London conference for another year. But the good news is we are heading to San Diego in March where we'll be getting some sun, sea and search at SearchLove San Diego!
If you have any questions about our conferences please leave a comment below or come and say hello over on Twitter.