Web servers often contain directories that do not need to be indexed. You create a text file with search engine indexing restrictions and place it in the root directory of the web server:
User-agent: *
Disallow: /images/
Disallow: /banners/
Disallow: /Forms/
Disallow: /Dictionary/
Disallow: /_borders/
Disallow: /_fpclass/
Disallow: /_overlay/
Disallow: /_private/
Disallow: /_themes/
What is the name of this file?
A. robots.txt
B. search.txt
C. blocklist.txt
D. spf.txt
A is the correct answer.

robots.txt

The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine crawlers) how to crawl and index pages on their website.
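For illustration only (this is not part of the original question), here is a minimal sketch of how a compliant crawler could honor the directives shown above, using Python's standard-library urllib.robotparser. The hostname www.example.com and the sample paths are hypothetical placeholders.

```python
from urllib.robotparser import RobotFileParser

# The directives from the question, fed directly to the parser
# (no network access needed for this sketch).
ROBOTS_TXT = """\
User-agent: *
Disallow: /images/
Disallow: /banners/
Disallow: /Forms/
Disallow: /Dictionary/
Disallow: /_borders/
Disallow: /_fpclass/
Disallow: /_overlay/
Disallow: /_private/
Disallow: /_themes/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler checks each URL against the rules before fetching it.
# www.example.com is a placeholder hostname for this example.
for path in ("/index.html", "/images/logo.png", "/_private/notes.html"):
    url = "https://www.example.com" + path
    allowed = rp.can_fetch("*", url)  # "*" matches the "User-agent: *" group
    print(f"{url} -> {'allowed' if allowed else 'disallowed'}")
```

In this sketch, /index.html would be reported as allowed, while /images/logo.png and /_private/notes.html fall under the Disallow rules. Note that robots.txt is advisory: it keeps cooperative crawlers out of the listed directories but provides no access control against malicious clients.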