Bots Are Getting More Sophisticated; It's Time Your Cyber Defenses Do Too
February 17, 2016

Kent Alstad
Radware


This year, 2016, is set to host the "battle of the bots," as bot-generated attacks targeting web application infrastructure grow in both volume and scope, according to a recent survey by Radware profiling the cybersecurity threats expected to increase in the coming year.

One important fact to note is that not all bots are bad. There are plenty of bots and computer-generated traffic programs that are essential for the daily support and maintenance of web applications. Prominent examples include search engine bots, such as Baidu Spider and Bingbot. While these bots exist to support the infrastructure, IT managers do need to be aware of the bad bots out there as well: they are numerous and can pose a serious threat to web application performance.

These bad bots carry out a variety of web attacks; among the most common are SQL injection, cross-site request forgery, web scraping, and, of course, the ever-looming DDoS attack.

Every web administrator knows the fear: application performance slowing to a crawl, then crashing entirely, all because of a massive, unforeseen influx of traffic from a bot network that the application simply cannot handle.

Since humans can be just as great a threat to web applications as bots, it's vital for organizations to be able to distinguish between human and bot activity in order to properly mitigate threats. One common form of detection is the CAPTCHA challenge, a reverse Turing test that gauges whether a computer program can mimic human behavior. While this practice is an acceptable means of detecting simple, script-based bots, the rise of "advanced bots" has posed a challenge to the IT industry.

These newer, more sophisticated bots are based on headless browser technology and significantly complicate detection. Advanced bots mimic human user behavior to a much higher degree than their script-based counterparts, using techniques such as executing JavaScript and following links graphically to trick detection systems into reading their activity as human. They are also capable of passing CAPTCHA challenges and rotating through dynamic IP addresses, which keeps the activity rate of each individual IP on the bot network low, thus evading IP-based detection thresholds.
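As a concrete illustration, here is a minimal client-side sketch (TypeScript) of the kind of heuristics defenders use to spot headless browsers. The specific checks, such as the window._phantom global left behind by PhantomJS, are well-known artifacts of particular automation tools; real detection products combine far more signals, since any single check is easy to spoof.

```typescript
// A minimal sketch of common headless-browser heuristics. Each check is a
// known artifact of a specific automation tool; real detectors combine
// many more signals, since any single check is easy to spoof.
function looksHeadless(): boolean {
  const w = window as any;
  const checks: boolean[] = [
    (navigator as any).webdriver === true, // set by WebDriver-driven browsers
    Boolean(w._phantom || w.callPhantom),  // globals left behind by PhantomJS
    navigator.plugins.length === 0,        // headless builds often expose no plugins
    !navigator.languages || navigator.languages.length === 0,
  ];
  return checks.some(Boolean);
}
```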

Defending Against the Bots

So how can organizations defend themselves against such sophisticated bots?

The first step is to ensure the use of IP-agnostic bot detection, since successful detection requires correlation across sessions. Without this correlation, it is extremely difficult to detect advanced bots as they jump from IP to IP; relying solely on IP-based detection is insufficient and can conceal larger threats. Creating this IP-agnostic system requires fingerprinting.

Device fingerprinting gives IT managers the ability to identify browsers and automated web client tools through data collection. Fingerprinting tools gather information of various kinds, such as operating system specifications, TCP/IP configuration, underlying hardware attributes, and browser attributes. Commonly, this data is collected through JavaScript processing, although some of it, such as TCP/IP parameters, can be collected passively, without obvious querying.
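As a rough sketch of what such JavaScript-based collection can look like, the TypeScript below gathers a handful of standard browser attributes. The BrowserAttributes shape and collectAttributes helper are illustrative names, and the field list is deliberately small; production fingerprinting gathers far more (canvas rendering, installed fonts, WebGL details, and so on).

```typescript
// A minimal sketch of client-side attribute collection using only
// standard browser APIs. The attribute set is illustrative, not complete.
interface BrowserAttributes {
  userAgent: string;
  platform: string;
  language: string;
  screen: string;
  timezoneOffset: number;
  colorDepth: number;
}

function collectAttributes(): BrowserAttributes {
  return {
    userAgent: navigator.userAgent,
    platform: navigator.platform,
    language: navigator.language,
    screen: `${screen.width}x${screen.height}`,
    timezoneOffset: new Date().getTimezoneOffset(),
    colorDepth: screen.colorDepth,
  };
}
```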

A wide range of client-side browser attributes can be collected to form a device fingerprint. While any single attribute may be common, it is the consolidation and combination of this information that yields sufficiently distinct device fingerprints.
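One simple way to perform that consolidation, shown as a sketch building on the collection example above, is to canonicalize the attributes and hash them, here with the Web Crypto API's SHA-256. The computeFingerprint helper is an illustrative name, not a production scheme.

```typescript
// A sketch of consolidating collected attributes into one identifier by
// canonicalizing and hashing them with SHA-256 (Web Crypto API). No single
// attribute is distinctive; the combination is what narrows the space.
async function computeFingerprint(attrs: BrowserAttributes): Promise<string> {
  // Serialize with sorted keys so the same attributes always hash the same.
  const canonical = JSON.stringify(attrs, Object.keys(attrs).sort());
  const bytes = new TextEncoder().encode(canonical);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```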

As attacks by advanced bots become increasingly common, the maintenance of an IP-agnostic detection environment is becoming more critical, as is the ability to track bots jumping across IPs via a single, consistent fingerprint.
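A minimal server-side sketch of that idea: request counts are keyed by fingerprint rather than by source IP, so a bot rotating through many addresses still accumulates against a single identity. The window and threshold values here are purely illustrative.

```typescript
// A sketch of IP-agnostic tracking: activity is keyed by device fingerprint,
// not source IP, so rotating addresses does not reset a bot's counters.
const WINDOW_MS = 60_000;            // illustrative 1-minute window
const MAX_REQUESTS_PER_WINDOW = 300; // illustrative threshold

const activity = new Map<string, { windowStart: number; count: number }>();

// Returns true if the request should be allowed, false if this fingerprint
// has exceeded its budget and should be challenged or blocked.
function recordRequest(fp: string, now: number = Date.now()): boolean {
  const entry = activity.get(fp);
  if (!entry || now - entry.windowStart > WINDOW_MS) {
    activity.set(fp, { windowStart: now, count: 1 });
    return true;
  }
  entry.count += 1;
  return entry.count <= MAX_REQUESTS_PER_WINDOW;
}
```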

Finally, it’s important to gauge the threat to applications across multiple attack vectors. An application DDoS attack may target specific resources, while a data-focused scraping attack is typically aimed at particular web pages with the goal of extracting information. Apply device fingerprinting where it makes the most sense, whether that is a single point of interest within an application or a global implementation across all domain resources.
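As a sketch of that choice, assuming an Express-style server, the recordRequest helper from the previous example, and a hypothetical x-device-fp header carrying the client's fingerprint, the same gate can be mounted globally or only on the routes most attractive to scrapers:

```typescript
// A sketch of scoping enforcement: the same middleware can be applied
// globally (every resource on the domain) or only to targeted routes.
// The x-device-fp header is a hypothetical transport for the fingerprint;
// recordRequest is the per-fingerprint counter from the previous sketch.
import express from "express";

const app = express();

function fingerprintGate(
  req: express.Request,
  res: express.Response,
  next: express.NextFunction
): void {
  const fp = req.header("x-device-fp");
  if (!fp || !recordRequest(fp)) {
    res.status(429).send("Too Many Requests");
    return;
  }
  next();
}

// Global implementation across all domain resources:
// app.use(fingerprintGate);

// Or a single point of interest, e.g. a scrape-prone catalog listing:
app.get("/catalog/:page", fingerprintGate, (_req, res) => {
  res.send("catalog page");
});

app.listen(3000);
```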

Kent Alstad is VP of Acceleration at Radware.
