Dispelling 3 Common Network Automation Myths
May 22, 2023

Rich Martin
Itential


As with any journey, before we get started we often think about what we need to begin, what we may need along the way, and how long it will take. The network automation journey is really no different.

Before network engineers even begin the automation process, they tend to start with preconceived notions that, if acted upon, can hinder the process. To prevent that from happening, it's important to identify a few common misconceptions and show how networking teams can overcome them. So, let's address the three most common network automation myths.

Myth #1: A SINGLE Source of Truth & Standardized Data Are Prerequisites for Meaningful Automation

Most network engineers simply don't trust the systems that store network data because of the many failed attempts they've experienced trying to maintain accurate information. Why do these systems lack accurate data? Simply put, the spreadsheets and databases tracking the data are "offline": they are "in" the configuration change process on paper, but "outside" any mechanism that actually requires an update after every change.

Second, the update processes are human-centric, often handled by inexperienced engineers during maintenance windows (typically between 12am and 5am) or as emergency fixes performed on the fly without timely documentation. This lack of timely data updates erodes confidence that these systems are accurate.

This is where DDI platforms come in. DDI is a unified solution that combines three core networking elements: domain name system (DNS), dynamic host configuration protocol (DHCP), and IP address management (IPAM). These platforms serve as reservation and tracking systems for IP addresses and DNS records, which must be unique and accurate for the network to behave properly. Even so, the DDI data and the actual network configuration can drift out of sync, leaving the DDI platform with incorrect data.
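As a rough illustration, here is a minimal sketch of a drift check between a DDI/IPAM record and what the network actually answers. The IPAM endpoint and response fields are hypothetical, not any particular product's API:

```python
# Minimal sketch of a drift check between a DDI/IPAM record and the live
# network. The IPAM endpoint and response fields are hypothetical, not any
# particular product's API.
import socket

import requests

IPAM_URL = "https://ipam.example.com/api/v1"  # hypothetical DDI/IPAM endpoint


def ipam_address_for(hostname: str) -> str:
    """Ask the IPAM system which address it has on record for this host."""
    resp = requests.get(f"{IPAM_URL}/hosts/{hostname}", timeout=10)
    resp.raise_for_status()
    return resp.json()["ip_address"]  # hypothetical response field


def live_address_for(hostname: str) -> str:
    """Ask the network itself, via DNS, which address it actually serves."""
    return socket.gethostbyname(hostname)


def check_drift(hostname: str) -> None:
    recorded = ipam_address_for(hostname)
    actual = live_address_for(hostname)
    if recorded != actual:
        print(f"{hostname}: IPAM says {recorded}, network says {actual} (out of sync)")
    else:
        print(f"{hostname}: IPAM and the network agree on {actual}")
```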

Some tools were built to put automation on top of a specific Source of Truth (SoT), tightly coupling the automation with the SoT data in that database. However, there are other sources of truth within the network that the automation doesn't operate on or integrate with, leading to incomplete or incorrect data; the automation ends up limited to individual tasks rather than entire processes. I believe the SoT is the configuration of the network itself, not an offline copy of the system data that may or may not reflect the latest changes.

Source of Truth is important to the automation journey, but relying on a single source of truth can quickly lead to inaccuracy. So how do you decide when to apply SoT and when not to?

First, it's always a good idea to apply a source of truth for parts of the network that aren't programmable, for example, port assignments.

Second, some programmable network infrastructure is its own SoT; anything in the cloud and SD-WAN, for example. Amazon Web Services (AWS) is the source of truth for AWS. An SD-WAN controller is the source of truth for SD-WAN. These systems are programmable and always accurate, which means you don't need an offline copy. Copies are the source of discrepancies that drive errors in automation. Multiple sources of truth and "fresh" data will enable better automation.
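To illustrate, here is a minimal sketch of querying AWS directly with boto3, so the answer comes from the authoritative system rather than an offline copy. It assumes AWS credentials are already configured, and the region is a placeholder:

```python
# Minimal sketch: querying AWS directly with boto3, so the answer comes from
# the authoritative system instead of an offline copy. Assumes AWS credentials
# are already configured; the region is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Ask AWS which VPCs and subnets exist right now. The answer is authoritative
# by definition, so there is no offline copy to drift out of date.
for vpc in ec2.describe_vpcs()["Vpcs"]:
    print(f"VPC {vpc['VpcId']}: {vpc['CidrBlock']}")

for subnet in ec2.describe_subnets()["Subnets"]:
    print(f"  Subnet {subnet['SubnetId']}: {subnet['CidrBlock']} in {subnet['AvailabilityZone']}")
```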

Myth #2: Network Scripts as a Strategy

When network engineers identify activities they want to automate, they usually turn to network "scripting," since many don't consider themselves developers. Two tools have become the go-to platforms for network scripting: Python and Ansible.

Python, first released in 1991, has become the default programming language for network operations and has many network-friendly libraries.
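For example, here is a minimal sketch of the kind of script engineers typically start with, using the Netmiko library (one of those network-friendly libraries; the device details below are placeholders):

```python
# Minimal sketch of a typical network script built on Netmiko, one of Python's
# network-friendly libraries. The device details below are placeholders.
from netmiko import ConnectHandler

device = {
    "device_type": "cisco_ios",
    "host": "192.0.2.1",      # placeholder (documentation address)
    "username": "admin",      # placeholder
    "password": "change-me",  # placeholder
}

with ConnectHandler(**device) as conn:
    # Read the running configuration straight from the device itself.
    output = conn.send_command("show running-config")
    print(output)
```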

Ansible has also become a crowd favorite for two reasons: first, it simplifies (and deliberately limits) the functionality needed for automation and uses YAML as a description language for automation tasks. Second, it has broad support for the command line interfaces (CLIs) of most network vendors.

However, both options have limitations. Ansible is often only viable for task-based automations. It isn't a full-fledged programming language like Python, yet it still requires learning YAML and how YAML is applied in an Ansible playbook.

It also isn't truly usable at scale. Ansible tries to be simpler than writing code, but that simplicity comes with serious limitations around integration and scale. For example, stringing multiple playbooks together and exchanging data between them requires custom code, which brings you right back to learning Python and using a programming language.
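To make that concrete, here is a rough sketch of the glue code that chaining playbooks tends to require, using the ansible-runner library. The playbook names, directory, and variable names are hypothetical placeholders:

```python
# Rough sketch of the custom glue code needed to chain two playbooks and pass
# data between them, using the ansible-runner library. The playbook names,
# directory, and variable names are hypothetical placeholders.
import ansible_runner

# Run the first playbook and scrape a value out of its event stream --
# exactly the kind of plumbing Ansible alone doesn't provide.
first = ansible_runner.run(private_data_dir="/tmp/automation",
                           playbook="allocate_vlan.yml")
vlan_id = None
for event in first.events:
    result = event.get("event_data", {}).get("res", {})
    if isinstance(result, dict) and "vlan_id" in result:
        vlan_id = result["vlan_id"]

# Feed the value into the second playbook as an extra variable.
second = ansible_runner.run(private_data_dir="/tmp/automation",
                            playbook="configure_port.yml",
                            extravars={"vlan_id": vlan_id})
print(first.status, second.status)
```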

Whether you use Ansible or Python to pursue a script strategy, the fundamental challenge is that there is very little collaboration or shared awareness across everyone's scripts. Teams end up not knowing who has which scripts or how to use them, with little version control to ensure people are running the correct version.

Myth #3: Mapping and Modeling of the Network Are Needed Before Automating ("If I Can't See It, I Can't Automate It")

Oftentimes, network engineers believe modeling and/or mapping the entire network is a prerequisite before beginning the automation journey. However, this isn't a feasible plan, especially when we're talking about larger networks with many devices.

Why isn't mapping the network feasible?

What many don't realize is that completely mapping an entire network can take several months. While the map is being built, the network keeps changing, so the process never really ends and automation never begins. Additionally, requiring models of different network devices as a prerequisite to automation comes with some severe downsides.

First, your network automation software vendor must support a particular network vendor, model, and operating system version in their application before any automation can be done. So right from the start, network teams either have to choose software based on what it already supports, or buy something that hasn't been modeled and simply go without automation until the vendor supports it.

Also, automation vendors who use modeling as the basis for automation must create models for every CLI command and feature supported in each OS. This requires time and resources, which forces vendors who model this way to support a very limited number of network vendors, models, and operating systems.

While mapping and modeling are important to the automation journey, they should not be viewed as prerequisites, simply because treating them that way wastes too much time. Rather, both mapping and modeling should be seen as supporting automation, not gating it.

At the end of the day, we see more enterprises embracing network automation because of the efficiencies it delivers. But if you're going to automate your infrastructure, your automation solution will need to gather authoritative information using multiple sources of truth.

With today's programmable networks, relying on a single source of truth is based on a flawed assumption that we can always have a synchronized database. With network automation, organizations can adopt a distributed source of truth solution by enabling the multiple systems of record, and their collective data, to act as the source of truth.
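As a rough sketch of what that distributed approach looks like in practice, the example below routes each kind of lookup to the system of record that owns it. All system names and lookups here are hypothetical placeholders for real API calls:

```python
# Minimal sketch of a distributed source of truth: each system of record is
# authoritative for its own slice of the network, and automation queries it
# live rather than consulting a single offline database. All lookups below
# are hypothetical placeholders for real API calls.
from typing import Callable, Dict

def from_ipam(query: str) -> str:
    return f"ipam-answer-for-{query}"   # placeholder: live DDI/IPAM API call

def from_cloud(query: str) -> str:
    return f"cloud-answer-for-{query}"  # placeholder: live cloud API call

def from_sdwan(query: str) -> str:
    return f"sdwan-answer-for-{query}"  # placeholder: SD-WAN controller API call

# Route each kind of question to the system that owns that data.
SOURCES: Dict[str, Callable[[str], str]] = {
    "ip_assignment": from_ipam,
    "vpc_layout": from_cloud,
    "wan_policy": from_sdwan,
}

def lookup(domain: str, query: str) -> str:
    """Fetch fresh data from whichever system of record owns this domain."""
    return SOURCES[domain](query)

print(lookup("vpc_layout", "prod-vpc"))
```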

Rich Martin is Director of Technical Marketing at Itential