This year, 2016, is set to host the "battle of the bots," as bot-generated attacks targeting web application infrastructure increase in both volume and scope, according to a recent survey conducted by Radware, which profiled the cybersecurity threats expected to grow in the coming year.
One important fact to note is that not all bots are bad. There are plenty of bots and computer-generated traffic programs that are essential for the daily support and maintenance of web applications. Some prominent examples include search engine bots, such as Baidu Spider and Bingbot. While these bots exist to support the infrastructure, IT managers do need to be aware of the bad bots out there, as they are also numerous, and can pose a serious threat to web application performance.
These bad bots generate a variety of web attacks, some of the most common being SQL injection, Cross-Site Request Forgery (CSRF), web scraping, and, of course, the ever-looming threat of DDoS attacks.
Every web administrator knows the fear – application performance slowing to a crawl, then crashing entirely, all because of a massive, unforeseen influx of traffic from a botnet. Web applications simply cannot absorb that volume of traffic, and performance suffers.
Since humans can be just as great a threat to web applications as bots, it’s vital for organizations to be able to distinguish between human and bot activity in order to properly mitigate threats. One common form of detection is the CAPTCHA challenge, a reverse Turing test used to gauge the ability of a computer program to mimic human behavior. However, while this practice is an acceptable means to detect simple, script-based bots, the rise of "advanced bots" has posed a challenge to the IT industry.
These newer, more sophisticated bots are based on headless browser technology and pose significant complications to the detection process. Advanced bots are capable of mimicking human user behavior to a much higher degree than their script-based counterparts, using techniques such as running JavaScript and following links graphically to trick detection protocols into thinking they are performing human activities. These bots are also capable of passing CAPTCHA challenges and setting up dynamic IP addresses, which allows them to maintain low rates of activity per individual IP on a bot network, thus evading IP-based detection parameters.
Defending Against the Bots
So how can organizations defend themselves against such sophisticated bots?
The first step is to ensure the use of IP-agnostic bot detection, as successful detection requires correlation across sessions. Without this correlation, it can be highly challenging to detect advanced bots jumping from IP to IP. Relying solely on IP-based detection is not sufficient and can conceal larger threats. To create this IP-agnostic system, fingerprinting is required.
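The correlation described above can be sketched as follows. This is a minimal illustration, not a production detector: it assumes requests already carry a device fingerprint, and the threshold value is a hypothetical tuning parameter.

```python
from collections import defaultdict

# Hypothetical threshold: a single fingerprint seen on many distinct
# source IPs within one observation window is a strong sign of an
# advanced bot rotating addresses to stay under per-IP rate limits.
IP_SPREAD_THRESHOLD = 5

def find_ip_hopping_bots(requests, threshold=IP_SPREAD_THRESHOLD):
    """requests: iterable of (fingerprint, source_ip) pairs.

    Returns the fingerprints observed across more distinct IPs
    than the threshold allows.
    """
    ips_per_fingerprint = defaultdict(set)
    for fingerprint, ip in requests:
        ips_per_fingerprint[fingerprint].add(ip)
    return {fp for fp, ips in ips_per_fingerprint.items()
            if len(ips) > threshold}

# Each individual IP stays at a low request rate, so per-IP detection
# sees nothing unusual - but the shared fingerprint ties the sessions
# together and exposes the bot.
traffic = [("bot-fp", f"10.0.0.{i}") for i in range(8)]
traffic += [("human-fp", "192.168.1.20")] * 8
print(find_ip_hopping_bots(traffic))  # → {'bot-fp'}
```

The key design point is that the grouping key is the fingerprint, never the IP, so address rotation does not fragment the bot's activity across separate counters.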
The use of device fingerprinting offers IT managers the ability to identify browsers or automated web client tools through data collection. These tools are able to collect information in various forms, such as operating system specifications, TCP/IP configuration, underlying hardware attributes, and browser attributes. Commonly, this data is collected through JavaScript processing, although some types, like TCP/IP configuration, can be collected passively without obvious querying.
A great deal of client-side browser attributes can be collected to form a device fingerprint. While some attributes may seem common, the consolidation and combination of this information is what yields power and sufficiently distinct device fingerprints.
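The idea that individually common attributes become distinctive in combination can be sketched as below. The attribute set shown is hypothetical; real collectors gather dozens of such values via JavaScript and passive TCP/IP observation.

```python
import hashlib

def device_fingerprint(attributes):
    """Combine collected client attributes into one stable identifier.

    Individually common values (e.g. a popular browser version or a
    standard screen resolution) become highly distinctive once
    canonicalized, concatenated, and hashed together.
    """
    canonical = "|".join(f"{key}={attributes[key]}"
                         for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical attribute set for illustration only.
client = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080x24",
    "timezone": "UTC-5",
    "fonts": "Arial,Calibri,Verdana",
    "tcp_window_size": "65535",
}
print(device_fingerprint(client))
```

Sorting the keys before hashing matters: the same device must always produce the same fingerprint regardless of the order in which attributes were collected.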
As attacks by advanced bots become increasingly common, the maintenance of an IP-agnostic detection environment is becoming more critical, as is the ability to track bots jumping across IPs via a single, consistent fingerprint.
Finally, it’s important to gauge the threat to applications across multiple attack vectors. An application DDoS attack may target specific resources, while a data-focused scraping attack is typically aimed at specific web pages with the goal of information extraction. Be sure to apply device fingerprinting where it makes the most sense, whether that be a single point of interest within an application or global implementation across domain resources.
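A scoped application of fingerprinting, per the point above, might look like the sketch below. The protected paths and request limit are assumed values chosen for illustration; the point is that the check runs only against the pages a scraper would target rather than across the whole domain.

```python
from collections import Counter

# Hypothetical policy: fingerprint-based counting applied only to the
# high-value pages a scraping attack would target.
PROTECTED_PATHS = {"/catalog", "/pricing"}
PAGE_LIMIT = 100  # requests per fingerprint per window (assumed)

def scraping_suspects(requests, protected=PROTECTED_PATHS,
                      limit=PAGE_LIMIT):
    """requests: iterable of (fingerprint, path) pairs for one window.

    Returns fingerprints whose hit count on protected pages exceeds
    the per-window limit.
    """
    hits = Counter(fp for fp, path in requests if path in protected)
    return {fp for fp, count in hits.items() if count > limit}

# A scraper hammers the catalog; an ordinary visitor does not.
window = [("scraper-fp", "/catalog")] * 150
window += [("visitor-fp", "/catalog")] * 3
window += [("visitor-fp", "/home")] * 50
print(scraping_suspects(window))  # → {'scraper-fp'}
```

Confining the counter to a few protected paths keeps the detection cheap while still catching the extraction pattern the article describes.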
Kent Alstad is VP of Acceleration at Radware.