Developers are increasingly employing bots, spiders, and scrapers to gather and use information from websites. Bots and scrapers can be divided into four categories based on their desirability and aggressiveness. Understanding these categories, and how to mitigate the risks each poses, is an important component of a web security strategy. Learn how to evaluate which bots to allow access to your company’s website in this summary presentation, then download the full report at www.stateoftheinternet.com/security-reports.