Machine agents such as robots or crawlers (called bots) visit millions of Web sites daily. Often their use is benign, as when search engines deploy them to update their indexes. But bots can also be harmful: they can probe a Web site for vulnerabilities or harvest e-mail addresses for future spamming, and they can consume a site’s bandwidth.
A new site, Botslist.com, maintains an updated list of bad bots and crawlers. Web site administrators and business owners can download the data for free and use it to configure their servers to block the offenders. It’s this month’s A Site to See.
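As a rough illustration of how such a downloaded list might be put to use, here is a minimal Python sketch that checks an incoming request’s User-Agent header against a local bad-bot list. The file format assumed here (one User-Agent substring per line, with `#` comments) is hypothetical, not Botslist.com’s documented format.

```python
# Sketch: filter requests whose User-Agent matches a downloaded bad-bot list.
# The file name and format (one substring per line, '#' comments allowed)
# are assumptions for illustration, not Botslist.com's actual format.

def load_bad_bots(path):
    """Read bad-bot User-Agent substrings, one per line, skipping comments."""
    bots = []
    with open(path) as f:
        for line in f:
            entry = line.strip()
            if entry and not entry.startswith("#"):
                bots.append(entry.lower())
    return bots

def is_bad_bot(user_agent, bad_bots):
    """Return True if the User-Agent contains any known bad-bot name."""
    ua = user_agent.lower()
    return any(bot in ua for bot in bad_bots)
```

In practice, the same list would more commonly be translated into server configuration, such as Apache rewrite rules or firewall entries, rather than checked in application code.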