# The servers: hercules.kgs.ku.edu, deuteron.kgs.ku.edu, drysdale.kgs.ku.edu, neutrino.kgs.ku.edu
# follow the robots.txt file that wikipedia.org uses. We got burned by HTTrack on November 24, 2003.
#
#
# robots.txt for http://www.wikipedia.org/ and friends
#
# Please note: There are a lot of pages on this site, and there are
# some misbehaved spiders out there that go _way_ too fast. If you're
# irresponsible, your access to the site may be blocked.
#
User-agent: *
Disallow: /

# Wikipedia work bots:
User-agent: IsraBot
Allow: /

# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: askjeeves
Disallow: /

User-agent: AskJeeves
Disallow: /

User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /

#
# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance (see the usage sketch at the end of this file).
#
User-agent: wget
Disallow: /

#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /

#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /

#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
#
User-agent: NPBot
Disallow: /

#
# Friendly, low-speed bots are welcome viewing article pages, but not
# dynamically-generated pages, please.
#
# Inktomi's "Slurp" can read a minimum delay between hits; if your
# bot supports such a thing using the 'Crawl-delay' or another
# instruction, please let us know.
#
#User-agent: *
#Disallow: /w/
#Crawl-delay: 15
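
#
# For reference, a polite recursive fetch with GNU wget might look like
# the sketch below. This is a suggestion only, not an endorsed command;
# the 15-second delay, the rate cap, and the choice of hercules as the
# target are illustrative assumptions, chosen to match the Crawl-delay
# above. (GNU wget honors robots.txt by default when run recursively.)
#
#   wget --recursive --wait=15 --random-wait --limit-rate=20k \
#        http://hercules.kgs.ku.edu/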