Interactive: robots.txt

11 lessons · 1h 10m total · rated 4.3 / 5

Basic exclusion

Time required: 10m
Teacher: Will Critchlow

There are a variety of ways to control the behavior of search engine crawlers; you can learn more about the alternatives in our technical SEO module. Robots.txt is a plain-text file found at the root of a domain (e.g. www.example.com/robots.txt). It is a widely acknowledged standard that allows webmasters to control all kinds of automated consumption of their site, not only by search engines.

Beyond reading about the protocol, robots.txt is one of the more accessible areas of SEO, since you can inspect any site's robots.txt file. Once you have completed this module, you will find value in studying the robots.txt files of some large sites (for example, Google and Amazon).

What you will learn in this module:

  • How to block all robots from certain areas of your site
  • How to restrict your robots.txt instructions to apply only to certain robots
  • How to override exclusion rules to allow access to specific areas of your site
  • How to use wildcards to apply your rules to whole swathes of your site
  • Other robots.txt syntax such as sitemap file directives

The most common use case for robots.txt is to block robots from accessing specific pages. The simplest version applies the rule to all robots with a line reading User-agent: *. Subsequent lines contain specific exclusions that work cumulatively, so the code below blocks robots from accessing /secret.html.
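For example, a minimal robots.txt blocking all robots from /secret.html looks like this:

    User-agent: *
    Disallow: /secret.html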

Add another rule to block access to /secret2.html in addition to /secret.html.



Clue: add another Disallow: directive that blocks robots from /secret2.html.
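One possible answer, building on the file above:

    User-agent: *
    Disallow: /secret.html
    Disallow: /secret2.html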

Exclude directories

Time required: 5m
Teacher: Will Critchlow
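As a sketch of the technique this lesson covers: ending a Disallow value with a slash excludes an entire directory rather than a single page (the directory name here is a hypothetical example):

    User-agent: *
    Disallow: /private/

This blocks every URL whose path begins with /private/, such as /private/report.html.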

Allow specific paths

Time required: 5m
Teacher: Will Critchlow
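A sketch of what this lesson covers: major search engines support an Allow directive that carves an exception out of a broader exclusion, with the more specific (longer) rule winning (both paths are hypothetical examples):

    User-agent: *
    Disallow: /private/
    Allow: /private/annual-report.html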

Restrict to specific user agents

Time required: 5m
Teacher: Will Critchlow
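A sketch of the technique: naming a crawler in the User-agent line makes the group apply only to that crawler; each robot follows the group that matches it most specifically and ignores the others (the path is a hypothetical example):

    User-agent: Googlebot
    Disallow: /not-for-google/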

Add multiple blocks

Time required: 5m
Teacher: Will Critchlow
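A sketch: a file can contain several groups, each introduced by its own User-agent line and conventionally separated by blank lines (paths are hypothetical examples):

    User-agent: Googlebot
    Disallow: /not-for-google/

    User-agent: *
    Disallow: /not-for-anyone/

Note that Googlebot follows only the group that names it, so here it ignores the rules under User-agent: *.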

Use more specific user-agents

Time required: 10m
Teacher: Will Critchlow
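A sketch: Google operates several crawlers (Googlebot-News and Googlebot-Image among them), and each obeys the group whose User-agent token matches it most specifically, falling back to the generic Googlebot group and then to * (paths are hypothetical examples):

    User-agent: Googlebot-News
    Disallow: /not-for-news/

    User-agent: Googlebot
    Disallow: /not-for-google/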

Basic wildcards

Time required: 5m
Teacher: Will Critchlow
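A sketch: the major engines treat * inside a pattern as "match any sequence of characters", so one rule can cover many URLs; here a hypothetical rule blocks any archive directory one or more levels below the root:

    User-agent: *
    Disallow: /*/archive/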

Block certain parameters

Time required: 5m
Teacher: Will Critchlow
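A sketch of the usual pattern: combining wildcards with ? lets you block URLs that carry a particular query parameter (sessionid is a hypothetical parameter name):

    User-agent: *
    Disallow: /*?*sessionid=

This matches any URL whose query string contains sessionid=, wherever the parameter appears.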

Match whole filenames

Time required: 10m
Teacher: Will Critchlow
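A sketch: a trailing $ anchors a pattern to the end of the URL, so you can match complete filenames or extensions instead of prefixes:

    User-agent: *
    Disallow: /*.pdf$

Without the $, the rule would also block URLs that merely continue past .pdf, such as /report.pdf?download=1; with it, only URLs ending in .pdf match.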

Add an XML sitemap

Time required: 5m
Teacher: Will Critchlow
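A sketch: a Sitemap directive takes the absolute URL of your XML sitemap and sits outside the user-agent groups, applying to all crawlers (the URL is a placeholder):

    Sitemap: https://www.example.com/sitemap.xml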

Add a video sitemap

Time required: 5m
Teacher: Will Critchlow
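A sketch: robots.txt allows multiple Sitemap lines, so a video sitemap is referenced the same way as the main one (both URLs are placeholders):

    Sitemap: https://www.example.com/sitemap.xml
    Sitemap: https://www.example.com/video-sitemap.xml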