Fix merge #101 by replacing a non-working Yahoo! link with Wikipedia

The link was meant to point to information about the robots.txt Crawl-Delay directive
This commit is contained in:
Mikael Nordfeldth 2016-02-26 13:41:14 +01:00
parent 5227483855
commit a3c5ef59d6
1 changed file with 1 addition and 1 deletion


@@ -674,7 +674,7 @@ Web crawlers. See http://www.robotstxt.org/ for more information
 on the format of this file.
 crawldelay: if non-empty, this value is provided as the Crawl-Delay:
-for the robots.txt file. see https://help.yahoo.com/kb/search
+for the robots.txt file. see <https://en.wikipedia.org/wiki/Robots_exclusion_standard#Crawl-delay_directive>
 for more information. Default is zero, no explicit delay.
 disallow: Array of (virtual) directories to disallow. Default is 'main',
 'search', 'message', 'settings', 'admin'. Ignored when site
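For context, with a non-empty crawldelay (10 here is a hypothetical value) and the default disallow list described above, the served robots.txt would look roughly like this sketch; the exact paths depend on the site's URL layout and are an assumption:

```text
User-agent: *
Crawl-delay: 10
Disallow: /main
Disallow: /search
Disallow: /message
Disallow: /settings
Disallow: /admin
```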