A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to keep crawlers from overloading your site with requests; it is not a mechanism for keeping a web page out of Google.
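As a minimal sketch, a robots.txt file placed at the root of your site (e.g. https://example.com/robots.txt) might look like this. The /admin/ and /tmp/ paths below are hypothetical placeholders, not paths from the original answer:

```
# Apply these rules to all crawlers
User-agent: *
# Ask crawlers not to request these (hypothetical) directories
Disallow: /admin/
Disallow: /tmp/

# Give one specific crawler unrestricted access
User-agent: Googlebot
Disallow:
```

Note that these rules are advisory: well-behaved crawlers honor them, but they do not block access, which is why robots.txt alone will not keep a page out of search results.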