While security experts may disagree on exactly how to secure a network, one thing they all agree on is that you cannot defend against what you cannot see. In other words, network visibility IS network security.
Visibility needs to be the starting point. After that, you can implement whatever appliances, processes, and configurations are needed to complete the security architecture. By adopting this strategy, IT gains deeper insight into network and application performance, which strengthens both security defenses and breach remediation.
One easy way to gain this insight is to implement a visibility architecture that utilizes application intelligence. This type of architecture delivers the critical intelligence needed to boost network security protection and create more efficiencies.
For instance, early detection of breaches using application data reduces the loss of personally identifiable information (PII) and lowers breach costs. Specifically, application-level information can be used to expose indicators of compromise, provide geolocation of attack vectors, and combat secure sockets layer (SSL) encrypted threats.
You might be asking, what is a visibility architecture?
A visibility architecture is nothing more than an end-to-end infrastructure which enables physical and virtual network, application, and security visibility. This includes taps, bypass switches, packet brokers, security and monitoring tools, and application-level solutions.
Let's look at a couple use cases to see the real benefits.
Use Case #1 – Application filtering for security and monitoring tools
A core benefit of application intelligence is the ability to use application data filtering to improve security and monitoring tool efficiency. Delivering the right information is critical because, as we all know, garbage in results in garbage out.
For instance, by screening application data before it is sent to an intrusion detection system (IDS), traffic that typically does not require screening (e.g., voice and video) can be routed downstream and bypass IDS inspection. Eliminating inspection of this low-risk data can make your IDS solution up to 35% more efficient.
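The routing decision described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not a real packet broker API: the application names, the port-to-application table, and the function names are all assumptions, and a production system would classify applications with deep packet inspection signatures rather than static ports.

```python
# Hypothetical sketch: route each flow by application class so that
# low-risk traffic (voice/video) bypasses the IDS, while everything
# else is sent to the IDS for inspection.

LOW_RISK_APPS = {"rtp_voice", "rtp_video"}

# Toy classification table keyed on destination port. A real packet
# broker would identify applications via DPI signatures instead.
PORT_TO_APP = {
    5004: "rtp_voice",
    5005: "rtp_video",
    443: "https",
    21: "ftp",
}

def classify(dst_port: int) -> str:
    """Map a destination port to an application label ('unknown' if unlisted)."""
    return PORT_TO_APP.get(dst_port, "unknown")

def route_flow(dst_port: int) -> str:
    """Return the egress path for a flow: 'bypass' or 'ids'."""
    app = classify(dst_port)
    return "bypass" if app in LOW_RISK_APPS else "ids"
```

Under this scheme, a voice flow on port 5004 takes the bypass path, while FTP on port 21, or any unrecognized traffic, is still inspected; the efficiency gain comes from the IDS never seeing the high-volume media streams.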
Use Case #2 – Exposing Indicators of Compromise (IOC)
The main purpose of investigating indicators of compromise is to discover and remediate breaches faster. Security breaches almost always leave behind some indication of the intrusion, whether it is malware, suspicious activity, signs of some other exploit, or the IP addresses of the malware controller.
Despite this, according to the 2016 Verizon Data Breach Investigations Report, most victimized companies do not discover security breaches themselves. Approximately 75% have to be informed by law enforcement or third parties (customers, suppliers, business partners, etc.) that they have been breached. In other words, these companies had no idea a breach had occurred.
To make matters worse, the average time for the breach detection was 168 days, according to the 2016 Trustwave Global Security Report.
To thwart these security attacks, you need the ability to detect application signatures and monitor your network so that you know what is, and what is not, happening on it. This allows you to see rogue applications running on your network, along with the visible footprints that hackers leave as they travel through your systems. The key is to take a macroscopic, application-level view of the network when looking for IOCs.
For instance, suppose a foreign actor in Eastern Europe (or another part of the world) has gained access to your network. Using application data and geolocation information, you would easily be able to see that someone in Eastern Europe is transferring files off the network, from an FTP server in Dallas, Texas, back to an address in Eastern Europe. Is this an issue? That depends on whether you have authorized users in that location. If not, it is probably a problem.
Thanks to application intelligence, you now know that the activity is happening. The rest is up to you: decide whether this is an indicator of compromise for your network or not.
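The FTP scenario above can be reduced to a simple check: is this application's remote endpoint in a country where we have authorized users? The sketch below is purely illustrative and makes several assumptions: the geo-IP lookup is a stub table (a real deployment would query a geolocation database), the IP addresses come from documentation ranges, and the list of authorized countries is invented.

```python
# Hypothetical sketch: flag an outbound FTP transfer as a possible
# indicator of compromise (IOC) when the remote endpoint geolocates
# to a country with no authorized users.

AUTHORIZED_COUNTRIES = {"US"}  # assumption: authorized users only in the US

# Stub geo-IP data using documentation addresses; a real system would
# use a geolocation database service instead of a static dict.
GEO_DB = {
    "203.0.113.7": "RO",   # Eastern Europe
    "198.51.100.9": "US",  # Dallas, Texas
}

def country_of(ip: str) -> str:
    """Resolve an IP to a two-letter country code ('??' if unknown)."""
    return GEO_DB.get(ip, "??")

def is_possible_ioc(app: str, remote_ip: str) -> bool:
    """Treat an FTP transfer to an unauthorized country as a possible IOC."""
    return app == "ftp" and country_of(remote_ip) not in AUTHORIZED_COUNTRIES

# Scan observed flows and collect the suspicious ones for review.
flows = [("ftp", "203.0.113.7"), ("ftp", "198.51.100.9"), ("https", "203.0.113.7")]
alerts = [ip for app, ip in flows if is_possible_ioc(app, ip)]
```

Note that the check only surfaces the activity; as the article says, deciding whether a flagged transfer is actually a compromise still rests with your security team.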