August 31, 2016
Bots are all the rage today. From Facebook to Google, it seems that every company is rolling out its own. Though technically robots, internet bots have capabilities quite different from those of the robots featured in pop culture. In reality, bots are just another evolution of applications that we have already been using for years.
You might be surprised to know that up to 60 percent of website traffic could actually be coming from bots. This matters because bots can create huge problems for website owners, from stealing content to taking a website offline. After working to protect websites from cybercriminals for nearly a decade, I have seen the good, the bad and the ugly when it comes to bots.
The Good Bots
Not all bots are bad. Bots are known by a few different names, such as “internet bots” or even “WWW robots.” Good bots focus on helping organizations run more efficiently. Ultimately, organizations use good bots to gather data by running automated tasks through an online software application. For example, good bots can help identify whether a website is healthy or vulnerable.
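To make the idea concrete, here is a minimal sketch of one such automated task: a bot that fetches a page and classifies the site's health from the HTTP status code and response time. The user agent string and thresholds are illustrative assumptions, not anything from a real monitoring product.

```python
# Hypothetical "good bot" task: measure a page and judge its health.
import time
from urllib.request import Request, urlopen

def fetch_status(url, timeout=10):
    """Return (HTTP status code, seconds elapsed) for one request."""
    req = Request(url, headers={"User-Agent": "health-check-bot/0.1"})
    start = time.monotonic()
    with urlopen(req, timeout=timeout) as resp:
        return resp.status, time.monotonic() - start

def classify(status, seconds, slow_threshold=2.0):
    """Turn a raw measurement into a simple health verdict."""
    if status >= 500:
        return "down"      # server-side error
    if status >= 400:
        return "broken"    # page missing or access denied
    if seconds > slow_threshold:
        return "slow"      # responding, but too slowly
    return "healthy"
```

A real monitoring bot would run `fetch_status` on a schedule across many URLs and alert on anything `classify` flags, but the core loop is no more complicated than this.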
Another example of a good bot is Google’s web crawler, known as “Googlebot” or a “spider.” Its main job is indexing: it looks to discover additional websites and pages to add to Google’s index. The name “spider” hints at how crawlers function — they use algorithms to determine which sites to crawl and how many pages to retrieve. The data they gather feeds Google’s ranking algorithms, which reward sites with good SEO practices and penalize sites that use bad techniques.
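A spider’s core loop — fetch a page, pull out its links, queue them for later crawling — can be sketched in a few lines. The snippet below is a toy illustration of just the link-extraction step using Python’s standard HTML parser; it is not Googlebot’s actual code, and the sample page is made up.

```python
# Toy link extraction, the step a crawler runs on every fetched page.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag seen in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<a href="/about">About</a> <a href="https://example.com/blog">Blog</a>'
print(extract_links(page))  # ['/about', 'https://example.com/blog']
```

A full crawler wraps this in a queue: each extracted link is fetched in turn, its links extracted, and so on, which is exactly the “spidering” behavior the name describes.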
When Good Bots Go Bad
Just as good bots help organizations run more efficiently, cybercriminals can employ bad bots of equal power to carry out malicious tasks. There are many different types of bad bots, which together represent more than 35 percent of all bot traffic. These include website scrapers, spammers/spam bots and botnets.
Scrapers lift content from reputable sites and publish it to other sites without the permission of the original site. This can hurt your SEO rankings, as search engines may treat the content as duplicate. These bandit bots stalk your website’s RSS feed, watching for new content so they can grab it and republish it elsewhere. Unfortunately, a website can be penalized regardless of whether it was you or a malicious bot that published the duplicate content.
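Part of why this kind of scraping is so common is that watching a feed is mechanically trivial. As a rough sketch (the feed XML below is a made-up example, not from any real site), a bot only has to parse the RSS feed and compare item GUIDs against those it has already seen:

```python
# Sketch of feed watching: report items whose GUIDs are new.
import xml.etree.ElementTree as ET

def new_items(feed_xml, seen_guids):
    """Return (guid, title) pairs for feed items not yet seen."""
    root = ET.fromstring(feed_xml)
    fresh = []
    for item in root.iter("item"):
        guid = item.findtext("guid") or item.findtext("link")
        if guid and guid not in seen_guids:
            fresh.append((guid, item.findtext("title")))
    return fresh

FEED = """<rss><channel>
  <item><title>Old post</title><guid>post-1</guid></item>
  <item><title>New post</title><guid>post-2</guid></item>
</channel></rss>"""

print(new_items(FEED, {"post-1"}))  # [('post-2', 'New post')]
```

Run on a timer, a loop like this alerts the scraper the moment new content appears — which is also exactly how legitimate feed readers work, underscoring that the same mechanics serve both good and bad bots.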