One of the names given to special bots that visit websites and collect information for search engines. Every engine has its own spiders: standalone programs that follow links and add page data to the search engine's database. Site owners can manage how their site is indexed by placing instructions for these robots in the robots.txt file.
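For illustration, a minimal robots.txt sketch (the paths and sitemap URL here are placeholders, not part of any real site) might look like:

```
# Rules apply to all crawlers
User-agent: *
# Keep spiders out of the admin area
Disallow: /admin/
# Everything else may be crawled
Allow: /

# Optional hint pointing crawlers to the sitemap
Sitemap: https://example.com/sitemap.xml
```

The file must be placed at the root of the site (e.g. `https://example.com/robots.txt`); well-behaved crawlers fetch it before indexing any pages.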