Featured Article: Guess What Most Web Traffic Is Made Up Of?

In this article, we look at how a surprisingly large proportion of Internet traffic is made up of bots, how many of these can be ‘bad bots’, and what businesses can do to keep enjoying the benefits of good bots while guarding against the threats of bad bots.

Two-Thirds of Internet Traffic is Bots

The recent Barracuda Networks ‘Top Threats and Trends’ report found that bots make up nearly two-thirds (64 percent) of Internet traffic, although other surveys have put this figure closer to 50 percent.  ‘Bots’ generally refers to software apps that run automated tasks (scripts) over the Internet, performing tasks that are simple, repetitive, and that wouldn’t be viable for humans to perform.  For example, popular bots include search engine crawlers, social network bots, aggregator crawlers, shop bots, and monitoring bots.  These could be regarded as ‘good bots’ because they serve a practical (rather than a deliberately malicious) purpose and are helpful to businesses and other Internet users.  Good bots obey the website owner’s rules (e.g. as specified in the robots.txt file, which dictates what is indexed), they publish methods of validating them so it’s clear they are what they say they are, and they don’t overload the websites and apps they visit.
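To illustrate the kind of rules good bots obey, a site’s robots.txt file might look something like the sketch below (the paths and delay value are purely illustrative examples, not recommendations):

```
# robots.txt - served at the site root, e.g. https://example.com/robots.txt
User-agent: Googlebot
Disallow: /admin/        # keep the admin area out of Google's index

User-agent: *
Disallow: /private/      # all other bots: stay out of /private/
Crawl-delay: 10          # ask polite bots to pause between requests

Sitemap: https://example.com/sitemap.xml
```

Well-behaved crawlers fetch this file first and honour its rules; bad bots typically ignore it, which is one reason robots.txt alone is not a security control.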

Bad/malicious bots include, for example, botnets used in Distributed Denial-of-Service (DDoS) attacks, which use other malware-infected devices (‘zombies’) to bombard a server with requests to the point where it becomes overwhelmed and is rendered out of action. Other ways in which bots are used for nefarious purposes include web and price scraping, inventory hoarding, account takeover attacks, intelligence harvesting (for fraud), auction sniping (last-minute bids), spam relay, click fraud, fake vulnerability scanners, and more.  Most ‘bad bot’ traffic comes from the US (67 percent) and mostly from two large public clouds (AWS and Microsoft Azure).

Percentage of Good/Bad Bots

The Barracuda Networks report, for example, suggests that 25 percent of Internet traffic is made up of good bots, while 39 percent is made up of bad bots.

Worst Hit Industries

Those industries worst hit by bad bots (Imperva figures, 2020) are Telecom & ISPs (45.7 percent), Computing & IT (41.1 percent), Sports (33.7 percent), News (33 percent), and Business Services (29.7 percent).

The Challenges

One of the key challenges for all website owners is ensuring protection is in place that can distinguish between good and bad bots (bad bots are often disguised as good ones) and filter out the bad. Bad bots are also increasingly prevalent because they are easy to build and can be purchased for very little money.

Cost, Threats, and Damage

Bad bots can be a real threat to businesses: they can exploit vulnerabilities in (often outdated) software in your systems, be used to deliver malware in a number of ways (e.g. trojans and malicious email attachments), or be used in concentrated attacks such as DDoS.  The damage caused can be very costly to businesses in terms of damage to networks/systems, disruption of business continuity, reputational damage and worse. The growth of the IoT, with vulnerabilities such as default passwords, has further fuelled the popularity of bad bots.

Beating The Bad Bots

With bad bots potentially making up nearly 40 percent of your web traffic, it’s important to know how to protect your business from them.  Examples of ways to keep bad bots at bay include:

– Investing in web application firewall (WAF)/WAF-as-a-Service offerings or Web Application and API Protection (WAAP) technology.

– Checking that your chosen security solution offers anti-bot protection.

– Using ‘machine learning’ security solutions.

– Making sure credential stuffing protection is in place.
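On the last point, credential stuffing defences typically start with rate-limiting login attempts per source. The sketch below is only an illustration of the idea (the threshold and window values are arbitrary examples, and the function names are ours, not from any particular product):

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # look-back window for counting attempts (example value)
MAX_ATTEMPTS = 5      # allowed login attempts per window (example value)

attempts = defaultdict(deque)  # source IP -> timestamps of recent attempts

def allow_login_attempt(ip, now=None):
    """Return True if this IP may attempt a login, False if rate-limited."""
    now = now if now is not None else time.time()
    q = attempts[ip]
    # Drop timestamps that have aged out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_ATTEMPTS:
        return False  # burst of attempts - likely automated credential stuffing
    q.append(now)
    return True
```

A real deployment would also track attempts per account (not just per IP, since botnets rotate addresses) and escalate to CAPTCHA or multi-factor challenges rather than blocking outright.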

Upstream and Downstream Traffic

Computer and Internet traffic is often categorised using the terms ‘upstream’ and ‘downstream’. Broadly speaking (as a basic definition), upstream traffic is data sent from a computer or network (e.g. sending e-mails, uploading files), while downstream traffic is data received by a computer or network (e.g. traffic that’s downloaded onto your PC). For example, this could be receiving e-mail messages, downloading files, visiting web pages, Zoom calls (data, video, and audio) and more.

One Third Human Traffic, or More?

According to the Barracuda Networks report, bots/automated traffic makes up two-thirds of Internet traffic.  This suggests that human traffic makes up the remaining third.  Other surveys provide different figures.  For example, the 8th Annual Bad Bot Report from Imperva suggests that human traffic actually made up 60 percent of all website traffic in 2020.

Monitoring and Measuring

If we accept that somewhere between one-third and one-half of Internet traffic is automated/bots, this has implications for how accurate your web analytics and paid ad stats are.  Analytics programs therefore tend to offer known-bot filtering options. For example, Google Analytics has an automatic filter for known bots and spiders (a check box in the settings). You can also set up filters for certain hostnames if you notice spikes from particular sources (spikes can be a sign of bots).
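To illustrate what such filtering does under the hood, a crude first pass over raw server logs often just matches user-agent strings against known bot signatures. The sketch below assumes a small, illustrative signature list; real filters use much larger, maintained lists and behavioural signals as well:

```python
import re

# Illustrative sample of substrings that commonly appear in bot user agents.
BOT_SIGNATURES = ["bot", "spider", "crawler", "slurp", "curl", "wget"]
BOT_PATTERN = re.compile("|".join(map(re.escape, BOT_SIGNATURES)), re.IGNORECASE)

def is_probable_bot(user_agent):
    """Crude check: does the user-agent string match a known bot signature?"""
    return bool(BOT_PATTERN.search(user_agent or ""))

def split_traffic(user_agents):
    """Partition a list of user-agent strings into (human, bot) lists."""
    humans, bots = [], []
    for ua in user_agents:
        (bots if is_probable_bot(ua) else humans).append(ua)
    return humans, bots
```

Note that this only catches bots that identify themselves honestly; bad bots frequently spoof browser user agents, which is why the report stresses dedicated anti-bot protection over simple filtering.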

What Does This Mean For Your Business?

Good bots undoubtedly save overheads and time and help to make the Internet work as smoothly as it does. However, realising that anywhere between one-third and one-half of web traffic is automated (bots), that the majority of these bots are malicious, and that this appears to be an upward trend should make businesses take a closer look at how their cyber-security defences are set up to tackle the threat of bad bots. Automated threats are likely to be constant and increasingly sophisticated, fuelled by the seemingly unstoppable growth of a less-than-secure IoT and by the ease with which attackers can obtain and execute bot-based attack methods; the risk and potential costs of ignoring this should motivate businesses to make security a top priority. AI and machine learning provide some hope in identifying potential bot threats, but for most businesses, as outlined above in this article, there are basic precautions that can and should be taken to protect the business right now.
