
Commit ed87faa

dsofeirrhukster authored and committed
Update robots.txt (getgrav#2632)
I have found that Bing/Yahoo/DuckDuckGo, Yandex, and Google report crawl errors when using the default robots.txt: their bots will not crawl the path '/' or any sub-paths. I agree that the current robots.txt should work and properly implements the specification, but in practice it does not. In my experience, explicitly permitting the path '/' by adding the directive Allow: / resolves the issue. More details can be found in a blog post about the issue here: https://www.dfoley.ie/blog/starting-with-the-indieweb
1 parent 20b9ca5 commit ed87faa

File tree

1 file changed (+1 −0)


robots.txt

Lines changed: 1 addition & 0 deletions
@@ -10,3 +10,4 @@ Disallow: /user/
 Allow: /user/pages/
 Allow: /user/themes/
 Allow: /user/images/
+Allow: /
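As a sanity check (illustrative, not part of the commit), the updated rule set can be exercised with Python's standard-library robots.txt parser. The host example.com and the /user/accounts/ path below are made-up examples. One caveat: urllib.robotparser evaluates rules in file order (first match wins) rather than by Google's longest-match rule, so only paths where both interpretations agree are checked here.

```python
# Exercise the updated robots.txt rules with Python's stdlib parser.
# Note: urllib.robotparser uses first-match ordering, not longest-match,
# so this is only a rough approximation of how major crawlers evaluate it.
from urllib.robotparser import RobotFileParser

# Reconstructed tail of the updated robots.txt (User-agent line assumed).
ROBOTS_TXT = """\
User-agent: *
Disallow: /user/
Allow: /user/pages/
Allow: /user/themes/
Allow: /user/images/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# With the explicit Allow: /, the site root is permitted for all agents,
# while /user/ sub-paths outside the allowed directories stay blocked.
print(parser.can_fetch("*", "https://example.com/"))                # True
print(parser.can_fetch("*", "https://example.com/user/accounts/"))  # False
```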

0 commit comments
