Enable sitemap.xml generation & reintroduce robots.txt #669
I disagree with the decision to remove robots.txt in #668. At the very least, having it removes the 404 errors from the logs, and sitemap.xml can only help with SEO; neither should be harmful.
However, I am not sure whether privacytools.io has more than one page, and I was unable to get Jekyll working locally on this PC.
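For reference, enabling sitemap generation is only a one-line plugin entry. A minimal sketch of what this could look like in `_config.yml` (the exact layout of this repo's config, and the `url` value, are assumptions):

```yaml
# _config.yml (sketch) — enable automatic sitemap.xml generation
plugins:
  - jekyll-sitemap

# jekyll-sitemap builds absolute URLs from `url`, so it should be set
url: "https://www.privacytools.io"
```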
What's the point of having robots.txt when it simply says "all pages allowed"?
I see the point of a sitemap, though.
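For context, an "all pages allowed" robots.txt is only a couple of lines, and the main benefit of reintroducing it is that crawlers requesting it stop producing 404s. A sketch (the `Sitemap` line and the exact domain are assumptions):

```
# robots.txt (sketch) — explicitly allow all crawlers on all paths
User-agent: *
Disallow:

# Optional: point crawlers at the generated sitemap
Sitemap: https://www.privacytools.io/sitemap.xml
```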
Though as I expected, the sitemap is pretty useless since we have only two pages:
Can you verify that the `jekyll-sitemap` plugin is supported by GitHub Pages?

Good point. Though we don't really need a sitemap.
Regarding robots.txt: not having it can spam the HTTP server's error.log, but since you serve GitHub Pages through CloudFlare, I don't know whether anyone would ever see those errors.
Would there be any harm if there were more pages in the future? I don't know about your plans, though.
Yes, see https://pages.github.com/versions/
I agree with this. Having it in place doesn't hurt and it can only benefit SEO.
I guess this can be useful when we have links like `/it/index.html`, `/de/index.html`, etc.
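To illustrate, once localized pages exist, the generated sitemap.xml would list each of them, which is where it starts to pay off. A sketch with hypothetical output (the actual entries depend on the pages Jekyll builds and on the configured `url`):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.privacytools.io/</loc></url>
  <url><loc>https://www.privacytools.io/it/</loc></url>
  <url><loc>https://www.privacytools.io/de/</loc></url>
</urlset>
```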