As part of this kwork I will configure two essential files for any site on any CMS: robots.txt and sitemap.xml.
What are these files for?
robots.txt is a file of directives that sets the indexing rules for different search robots (which sections and pages to index and which to skip);
sitemap.xml is a site map containing structured links to all available HTML pages of the site, which makes it easier for search robots to understand its structure and crawl the site.
For robots.txt:
1. Specify User-agent rules for all the important search robots (Yandex, Google, a general rule, etc.)
2. Remove the deprecated Host directive, which search engines no longer support
3. Write the path to the sitemap (the Sitemap directive)
4. Close JS scripts and CSS styles from indexing for Yandex
5. Close all 404 pages and broken pages from indexing
6. Check that everything is valid
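The result of the steps above might look roughly like this (the host name and paths are placeholders, not taken from a real order):

```
# General rule for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /search/

# Yandex-specific rules; note there is no Host directive (it is deprecated)
User-agent: Yandex
Disallow: /admin/
Disallow: /search/
Disallow: /*.js
Disallow: /*.css

# Path to the sitemap
Sitemap: https://example.com/sitemap.xml
```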
For sitemap.xml:
1. Build a correct sitemap containing only the right pages
2. Structure everything and set correct priorities for all pages
3. Check for validity
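For reference, a valid sitemap.xml in the standard sitemaps.org 0.9 format looks like this (URLs and priority values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/catalog/</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```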
I work with all CMS: WordPress, OpenCart / OcStore, 1C-Bitrix, Joomla, Magento, PrestaShop, Shop-Script, Simpla, Megagroup, Tiu.ru, WIX, WebAsyst, DLE, Drupal, MODX, UMI.CMS, NetCat, HostCMS, InSales, Tilda, etc.
To do the work I need a link to the site, FTP access and access to the site's admin panel.
If you have any non-standard wishes for the order, write to me before you pay; I will analyze everything and give you an answer.
Pay attention to additional options.
If you need a separate sitemap for all the images, write in advance and specify this. It is not suitable for every site, as some CMS store images on third-party servers.
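An image sitemap uses Google's image extension namespace on top of the standard format; a minimal sketch (URLs are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/product/</loc>
    <image:image>
      <image:loc>https://example.com/images/product.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

This only works when the `<image:loc>` URLs point to images the search engine can actually crawl, which is why CMS that host images on third-party servers are a problem.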