#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/robotstxt.html

User-agent: *
# Note: not all crawlers honor Crawl-delay (Googlebot ignores it).
Crawl-delay: 10

# Homepage
Allow: /$

# Public website pages
Allow: /en/*
Allow: /fr/*
Allow: /de/*
Allow: /es/*
Allow: /pl/*
Allow: /pt/*
Allow: /it/*

# Entry point pages
Allow: /register.html
Allow: /login.html

# API Documentation
Allow: /ShippyPro-API-Documentation/

# Allow old pages to let crawlers see redirects
Allow: /*.html$

# Allow sitemaps
Allow: /sitemap.xml
Allow: /sitemap-*.xml$

# Disallow everything else (the more specific Allow rules above take precedence)
Disallow: /

# Sitemap
Sitemap: https://www.shippypro.com/sitemap.xml