Search engines read a site's sitemap.xml when one is available. The sitemap tells them which URLs exist, when each was last modified, and roughly how often it changes. Without a sitemap, crawlers must discover pages by following links from the homepage, and pages that nothing links to (orphan pages) may never be found.
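The fields described above map directly onto the sitemap protocol's XML. A minimal illustrative file (the URL and dates are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- which URL exists -->
    <loc>https://example.com/docs/getting-started</loc>
    <!-- when it was last modified -->
    <lastmod>2024-05-01</lastmod>
    <!-- how often it tends to change (advisory only) -->
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Each `<url>` entry is independent, so a generator only needs to emit one block per page.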
Sitemaps must list only URLs that return 200 with valid HTML. Sitemap URLs that 404, redirect, or return non-HTML content actively hurt: search engines treat the inconsistency as a quality signal.
ngrok's sitemap is data-driven: every entry in the use-case, ngrok-errors, integrations, learn, blog, compare, glossary, and tools data files appears automatically. Adding a record adds a sitemap row.
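A data-driven sitemap like the one described can be sketched as a flatten step over the data files. This is a minimal sketch, not the actual implementation: the section names, `slug`/`updatedAt` fields, and the `example.com` domain are all assumptions for illustration.

```typescript
// Hypothetical shape of a record in one of the data files.
type Entry = { slug: string; updatedAt: string };

// Stand-ins for the real data files; in practice these would be imports.
const useCases: Entry[] = [{ slug: "webhooks", updatedAt: "2024-01-10" }];
const glossary: Entry[] = [{ slug: "tunnel", updatedAt: "2024-02-01" }];

// Each section maps a URL prefix to its data file.
const sections: Record<string, Entry[]> = {
  "use-cases": useCases,
  glossary: glossary,
};

// Flatten every record into a <url> row: adding a record adds a row.
const urls = Object.entries(sections).flatMap(([prefix, entries]) =>
  entries.map(
    (e) =>
      `  <url><loc>https://example.com/${prefix}/${e.slug}</loc>` +
      `<lastmod>${e.updatedAt}</lastmod></url>`
  )
);

const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  `${urls.join("\n")}\n</urlset>`;

console.log(sitemap);
```

Because the sitemap is derived rather than hand-maintained, it cannot drift out of sync with the data files: a page exists in the sitemap exactly when its record exists.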