Take the basic robots.txt file from oe24 and upload it to our other Rails applications.
What we have so far applies a general allow/disallow of specific pages to all user agents, with a 10-second crawl delay.
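For reference, a minimal sketch of what such a file might look like; the paths here are hypothetical placeholders, not the actual oe24 rules:

```
# Rules for all user agents
User-agent: *
# Wait 10 seconds between requests (nonstandard directive;
# honored by Bing and others, ignored by Google)
Crawl-delay: 10

# Hypothetical examples of allowed and blocked paths
Allow: /catalog
Disallow: /admin
Disallow: /users
```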
Some additional considerations (example directives are sketched after this list):
- Are there any specific user agents that we want to block or allow?
- Do all our Rails apps have sitemaps? Do we care about including sitemaps for SEO?
- Are there any other specific pages we want to block from being crawled, and what are their paths?

A matrix comparing multiple academic libraries' robots.txt files could help inform these decisions.
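If we decide to block specific crawlers or to advertise sitemaps, the corresponding directives would look roughly like this; the agent name and URL are illustrative, not decisions we've made:

```
# Block one specific crawler entirely (bot name is illustrative)
User-agent: ExampleBot
Disallow: /

# Point crawlers at a sitemap for SEO (URL is hypothetical)
Sitemap: https://example.edu/sitemap.xml
```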