
add robots.txt file #1445

@CB987

Description


Take the basic robots.txt file from oe24 and upload it to our other Rails applications.
What we have so far includes general allow/disallow rules for specific pages, applied to all user agents, plus a 10-second crawl delay.

Some additional considerations:

- Are there any specific user agents we want to block or allow?
- Do all our Rails apps have sitemaps? Do we care about including sitemaps for SEO?
- Are there any other specific pages we want to block from being crawled, and what are their paths?
- Build a matrix comparing multiple academic libraries' robots.txt files.
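For reference, a minimal sketch of what the shared file could look like, based on the rules described above (the disallowed paths here are placeholders, not the actual oe24 paths, and the Sitemap line only applies if the app publishes one):

```txt
# Sketch of a baseline robots.txt for our Rails apps.
# Paths below are hypothetical examples, not the real oe24 rules.
User-agent: *
Crawl-delay: 10
Disallow: /admin/        # example: block admin pages
Disallow: /search        # example: block crawl-heavy search results
Allow: /

# Only include if the app actually generates a sitemap:
Sitemap: https://example.edu/sitemap.xml
```

In Rails, this file is served statically from `public/robots.txt`, so deploying it is just a matter of dropping the file into each app's `public/` directory.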
