SEO-related files should reside in the root folder of your site; search engine spiders use them when crawling your site.
1. 'sitemap.xml' file. This file describes the page structure of your site and speeds up search engine indexing of all pages throughout your site. We recommend re-generating and re-submitting the sitemap to search engines whenever you apply major changes to your site.
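For reference, a minimal sitemap follows the Sitemaps XML protocol; the URL and date below are placeholders you would replace with your own pages:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

Each page of your site gets its own <url> entry; the <lastmod> tag is optional but helps crawlers detect updated pages.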
2. 'robots.txt' file. This file tells crawlers which directories may or may not be crawled. Uploading such a file can also serve as an invitation for search spiders to index every page of your site.
How to add a robots.txt file to your site:
2.1. Open a text editor, such as Notepad, and write these lines:
If all pages should be crawled:
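An empty Disallow directive permits crawlers to visit everything:

```
User-agent: *
Disallow:
```

The asterisk after User-agent means the rule applies to all crawlers.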
If you don't want some of your pages to be crawled, exclude them by adding their URLs:
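For example, the following blocks all crawlers from a directory and a single page (the paths shown are placeholders for your own URLs):

```
User-agent: *
Disallow: /private/
Disallow: /temp/page.html
```

Save the file as robots.txt and upload it to the root folder of your site. Note that paths are relative to the site root and a trailing slash blocks the whole directory.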