There is a fundamental shift happening in operational technology today: the move from core computing to edge computing. This shift is driven by massive growth in data that is already underway. According to Cisco Systems, network traffic will reach 4.8 zettabytes (4.8 billion terabytes) by 2022.
Businesses cannot continue as usual and still keep up with network performance demands, security threats, and business decisions. In response, network architects are moving as many core compute resources as they can to the edge of the network, helping IT reduce costs, improve network performance, and maintain a secure network.
However, is the shifting of resources to the edge the right approach?
Done poorly, it could have a negative impact on the network: new security holes, performance issues from remote equipment, and reduced network visibility. Done right, however, the pendulum swings the other way, and there could be great improvements to network security, performance, and visibility.
The answer comes down to how the new architecture is deployed. The pivotal tactic is to deploy a visibility architecture that can support the application services and monitoring functions needed. You need network visibility more than ever to access the data you need, filter it properly, inspect it for security threats, and manage SLAs to keep latency low from the core to the edge.
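To make the SLA-management piece concrete, here is a minimal sketch (not any vendor's API; the function name and thresholds are illustrative assumptions) of the kind of check a monitoring layer performs when evaluating core-to-edge latency against an SLA ceiling:

```python
# Illustrative sketch: evaluate a batch of core-to-edge latency samples
# against an SLA ceiling. Names and thresholds are hypothetical.
from statistics import mean

def check_latency_sla(samples_ms, sla_ms, violation_ratio=0.05):
    """Return (ok, violation_fraction, average_latency).

    samples_ms      -- round-trip latency samples in milliseconds
    sla_ms          -- the SLA latency ceiling in milliseconds
    violation_ratio -- max tolerated fraction of samples over the ceiling
    """
    violations = sum(1 for s in samples_ms if s > sla_ms)
    fraction = violations / len(samples_ms)
    return fraction <= violation_ratio, fraction, mean(samples_ms)

# Example: a mostly healthy edge link with one latency spike.
# The average looks fine, but the spike rate still breaches the SLA,
# which is why per-sample visibility matters more than averages.
ok, fraction, avg = check_latency_sla(
    [12, 14, 11, 13, 95, 12, 13, 14, 12, 13], sla_ms=50)
```

The point of the sketch: an aggregate average can hide the tail latency that actually violates the SLA, which is exactly the data a visibility architecture must surface.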
Two key components are necessary for visibility in this situation: a network packet broker (NPB) and SD-WAN. The NPB provides data aggregation and filtering, application filtering, and performance monitoring all the way to edge devices. SD-WAN services can (and probably should) then be layered on top of the IP-based links to guarantee link performance, since Internet-based services can introduce unacceptable levels of latency and packet loss into the network.
Edge computing deployments have already begun. According to a report from Gartner Research, by year-end 2021 more than 50% of large enterprises will deploy at least one edge computing use case to support IoT or immersive experiences, up from less than 5% in 2019.
When it comes down to it, the promise of edge computing is real, but the actual deployment scenario (and whether you build network visibility into your network) is what will make or break the performance of your new architecture.