robots.txt is a file that implements the Robots Exclusion Standard: websites use it to tell web robots and crawlers which parts of the site they may or may not fetch. Search engine crawlers check a site's robots.txt before crawling it and use its rules to decide which URLs to request. I would like to know more about what robots.txt does and why it is used. Can someone explain its functions and purpose?
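For context, here is a minimal sketch of what such a file can look like (the paths and sitemap URL are hypothetical, just for illustration):

```
# Rules for all crawlers
User-agent: *
Disallow: /private/      # do not crawl anything under /private/
Allow: /                 # everything else is allowed

# Optional pointer to the site's sitemap
Sitemap: https://www.example.com/sitemap.xml
```

From what I understand, the file is placed at the root of the site (e.g. `https://www.example.com/robots.txt`), but I'm unclear on the details beyond that.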