Directory of website pages.


All government websites must have a sitemap.


A sitemap is a file that lists the pages on a website, along with information about each page, so that search engines can crawl the site more intelligently.

Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data, allowing crawlers that support them to pick up all Uniform Resource Locators (URLs) in the sitemap and learn about each one from its associated metadata. Using the sitemap protocol doesn’t guarantee that web pages will be included in search engines, but it gives web crawlers hints that help them do a better job of crawling your site.
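To illustrate the crawler side of the protocol, here is a minimal Python sketch that reads a sitemap and collects each URL with its metadata. The inline sitemap string and the `example.gov` address are hypothetical placeholders; a real crawler would fetch the file over HTTP.

```python
# Sketch of how a crawler might extract URLs and metadata from a sitemap.
import xml.etree.ElementTree as ET

# The sitemap protocol's XML namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Hypothetical sitemap content, inlined for illustration.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.gov/</loc><lastmod>2024-01-01</lastmod></url>
</urlset>"""

def extract_urls(sitemap_xml):
    """Return a list of (loc, lastmod) pairs from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    entries = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        entries.append((loc, lastmod))
    return entries

print(extract_urls(SITEMAP))
```

A crawler that supports sitemaps would run logic like this over the fetched file, then use fields such as `lastmod` to decide which pages to revisit.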

The sitemap is an Extensible Markup Language (XML) file in the website’s root directory that includes metadata for each URL, such as:

  • when it was last updated
  • how often it usually changes
  • how important it is, relative to other URLs in the site
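The metadata fields above can be generated programmatically. The sketch below builds a sitemap with Python’s standard-library `xml.etree.ElementTree`; the page URL, date, and field values are hypothetical placeholders.

```python
# Minimal sketch of generating a sitemap file with xml.etree.ElementTree.
import xml.etree.ElementTree as ET

# The sitemap protocol's XML namespace.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of dicts with loc and optional lastmod/changefreq/priority."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        for field in ("loc", "lastmod", "changefreq", "priority"):
            if field in page:
                ET.SubElement(url, field).text = str(page[field])
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Hypothetical page entry for illustration.
print(build_sitemap([
    {"loc": "https://www.example.gov/", "lastmod": "2024-01-01",
     "changefreq": "monthly", "priority": "0.8"},
]))
```

Each optional field maps directly to one of the bullets above: `lastmod` for when the page was updated, `changefreq` for how often it changes, and `priority` for its importance relative to other URLs on the site.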


Example government website sitemaps:


Example sitemap code:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.gov/</loc>
  </url>
</urlset>