Kwork Overview
Robots.txt and sitemap.xml - creation and proper configuration
What are these files for?
robots.txt - a file of directives that sets the indexing rules of the website for different search robots (which sections and pages to index and which to skip);
sitemap.xml - a site map that contains structured links to all available HTML pages of the website, making it easier for search robots to understand its structure and crawl the site.
For robots.txt:
** Specify User-agent rules for all the important search robots (Yandex, Google, a general rule, etc.)
** Remove the Host directive, which no longer works
** Add the Sitemap directive with the path to the sitemap
** Block JS scripts and CSS styles from Yandex
** Block all 404 errors and broken pages from indexing
** Validate the whole file (a sample result is shown after this list)
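The finished file might look roughly like this (a minimal sketch: the /admin/ and /search/ sections, the broken page, and the domain are placeholders assumed for illustration; your site's paths will differ):

User-agent: *
Disallow: /admin/
Disallow: /search/
Disallow: /old-broken-page.html

User-agent: Yandex
Disallow: /admin/
Disallow: /search/
Disallow: /old-broken-page.html
Disallow: /*.js
Disallow: /*.css

Sitemap: https://example.com/sitemap.xml

Note that there is no Host directive anywhere, and the Sitemap line points robots to the site map described below.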
For sitemap.xml:
** Build the sitemap from the correct set of pages
** Structure everything and set correct priorities for all pages
** Validate the file (a sample is shown after this list)
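A valid sitemap.xml then looks roughly like this (the domain, dates, and priority values are placeholders; a real map lists every indexable page):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/catalog/</loc>
    <lastmod>2024-01-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>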
I work with all major CMSs: WordPress, OpenCart / OcStore, 1C-Bitrix, Joomla, Magento, PrestaShop, Simpla, Megagroup, Tiu.ru, WIX, WebAsyst, DLE, Drupal, ModX, UMI.CMS, NetCat, HostCMS, InSales, Tilda, etc.
To do the work, I need a link to the site and access to FTP and the site admin panel.
If you have any non-standard requirements for the order - write to me before you pay; I will analyze everything and give you an answer.
Please pay attention to the additional options.
If you need a separate sitemap for all the images - write and specify this in advance. It does not suit every site, as some CMSs store images on third-party servers.
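Where the images are hosted on the site itself, such an image sitemap uses Google's image extension and looks roughly like this (the URLs are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/product/</loc>
    <image:image>
      <image:loc>https://example.com/images/product.jpg</image:loc>
    </image:image>
  </url>
</urlset>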