Internet bots, also known as web robots, WWW robots or simply bots, are software applications that run automated tasks over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone. One popular use of bots is in web spidering, in which an automated script fetches, analyzes and files information from web servers at many times the speed of a human.
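The fetch-analyze-file loop of a web spider can be sketched with the Python standard library. The example below shows only the "analyze" step — extracting the hyperlinks a crawler would follow next — using a hypothetical `LinkExtractor` class and a hardcoded sample page in place of a real HTTP response:

```python
from html.parser import HTMLParser

# Minimal spider component (illustrative, not a full crawler): collects
# the hyperlinks a bot would queue for its next fetches. A real spider
# would retrieve each page with urllib.request and feed the response
# body to this parser in a loop.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the href attribute of every anchor tag encountered.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Sample HTML standing in for a fetched response body.
page = '<html><body><a href="/about">About</a> <a href="https://example.com/">Home</a></body></html>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # URLs the spider would visit next
```

A production crawler adds the pieces omitted here: a frontier queue of unvisited URLs, deduplication, politeness delays, and respect for each site's robots.txt.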
Malicious bots (and botnets) include the following types:
- Spambots that harvest email addresses from internet forums, contact forms or guestbook pages
- Downloader programs that consume a server's bandwidth by downloading entire websites
- Web site scrapers that grab the content of web sites and re-use it without permission on automatically generated doorway pages
- Viruses and worms that propagate themselves automatically across networks
- Bots used to launch DDoS (distributed denial-of-service) attacks
- Botnets: networks of compromised "zombie" computers controlled remotely by an attacker
- File-name modifiers on peer-to-peer file-sharing networks. These change the names of files (often containing malware) to match user search queries.