WebGlossary.info
robots.txt
- A special file, and the accompanying standard, used on websites to communicate with web robots and crawlers. The standard specifies how to tell a robot which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard: email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the website they have been told to stay out of. ← Wikipedia ↑ robotstxt.org
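- The standard requires the file to sit at the root of the host it governs (e.g. https://example.com/robots.txt). A minimal sketch of such a file, with a hypothetical /private/ area standing in for content the site does not want crawled:

    # Applies to every crawler that honors the standard
    User-agent: *
    # Ask robots not to crawl anything under /private/
    Disallow: /private/

  A cooperating crawler fetches this file before requesting other URLs and skips the disallowed paths; non-cooperating robots simply ignore it.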