Crawlable

Site is available for indexing by well-behaved agents.

About

The robots.txt file tells AI crawlers and search engines which parts of a website they may or may not access. This lets website owners control what data is available to AI models while protecting private or sensitive content. It also prevents unnecessary crawling, reducing server load and improving site performance. The file is advisory rather than enforced, but well-behaved crawlers honor its rules, respecting website preferences and supporting ethical data collection.
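The access rules described above can be checked programmatically. The sketch below uses Python's standard-library `urllib.robotparser` to parse a hypothetical robots.txt and test whether a given URL may be fetched; the domain, paths, and rules are illustrative assumptions, not taken from any real site.

```python
from urllib import robotparser

# Hypothetical robots.txt contents for illustration only.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved crawler consults these rules before each request.
print(rp.can_fetch("*", "https://example.com/public/page"))   # public path: allowed
print(rp.can_fetch("*", "https://example.com/private/data"))  # disallowed prefix: blocked
```

In practice a crawler would first download `https://<site>/robots.txt` (for example with `rp.set_url(...)` and `rp.read()`) rather than parse a hard-coded string, and would check `can_fetch` with its own user-agent string instead of `*`.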


Thank you to these organizations for supporting Project ScanGov:
