New Relic's 2024 Observability Forecast report features insights and analysis on key growth areas, challenges, and trends influencing the observability industry. The findings highlight the vital role of observability in improving operational efficiency and business performance, and show a strong correlation between full-stack observability and reduced downtime, fewer disruptions, and lower annual outage costs.
Here are six key takeaways from the report.
1. Full-stack observability results in lower outage costs
High-business-impact outages are costly, making fast MTTx metrics, mean time to detect (MTTD) and mean time to resolve (MTTR), crucial: 62% of businesses reported losing at least $1 million per hour of downtime. However, enterprises with full-stack observability experienced 79% less downtime than those without (70 hours compared to 338 hours), leading to $42 million in cost savings.
In fact, the data revealed an inverse correlation between the number of observability capabilities deployed and both median annual downtime and median hourly outage costs. For example, respondents with 10 or more observability capabilities deployed experienced 74% lower median annual downtime and 32% lower median hourly outage costs.
2. Business observability is a top priority
Business observability, the ability to correlate telemetry data with business outcomes and report on them in real time, was ranked by respondents as the third most important vendor selection criterion. True business observability involves integrating business-related data with telemetry data (metrics, events, logs, and traces). Most respondents (87%) reported integrating at least one business-related data type with their telemetry data, with operations data the most commonly integrated (43%). Further, 59% said they planned to integrate five or more business-related data types with their telemetry data in the next one to three years.
Notably, 40% of those surveyed have deployed business observability and reported significant benefits. According to the data, organizations with business observability experienced 40% less annual downtime, 24% lower hourly outage costs, and 25% less time spent managing disruptions than those without it.
3. AI is driving the need for observability
As adoption of artificial intelligence (AI) technologies continues to rise, AI has emerged as a leading trend driving demand for observability: 41% of respondents cited an increased focus on AI as a contributor to their need for observability. The investment has shown tangible payoffs, too. Companies that invested in AI, particularly for IT operations (AIOps), reported a 28% higher annual value received from observability.
4. Tech professionals are seeking to consolidate tools
An overwhelming majority of those surveyed (88%) indicated they use multiple observability tools, while only 6% rely on a single tool. Yet, those with just one tool reported notable advantages.
Enterprises using a single observability tool spent 50% less engineering time managing disruptions — about 7 hours compared to 13 hours within a 40-hour work week. They also experienced 18% less median annual downtime (249 hours per year compared to 305 hours) and incurred 45% lower median hourly outage costs ($1.1 million per hour compared to $2 million).
Overall, using a single observability tool led to cost savings, with respondents reporting a 65% lower median annual observability spend ($700,000 compared to $2 million). Moreover, tool consolidation is on the rise, with 41% of respondents planning to consolidate tools in the next year.
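For readers who want to check how the single-tool percentages above follow from the raw medians, here is a minimal sketch of the arithmetic (the figures are from the report; the helper function name is illustrative):

```python
# Illustrative sketch: deriving the percentage reductions quoted above from
# the single-tool vs. multi-tool medians reported in the survey.
def pct_lower(single_tool: float, multi_tool: float) -> int:
    """Percent reduction going from the multi-tool to the single-tool figure."""
    return round(100 * (multi_tool - single_tool) / multi_tool)

print(pct_lower(7, 13))         # engineering hours per week: ~46%, reported as ~50%
print(pct_lower(249, 305))      # annual downtime hours: ~18%
print(pct_lower(1.1e6, 2.0e6))  # hourly outage cost: 45%
print(pct_lower(0.7e6, 2.0e6))  # annual observability spend: 65%
```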
5. Investment in observability is paying off
According to the report, the median annual spend on observability was $1.95 million. In comparison, the estimated median annual value derived from observability was significantly higher at $8.15 million, demonstrating the potential of observability to drive business value.
The median return on investment (ROI) for those who invested in observability was 4x — meaning that for every $1 spent, the respondents believe they receive $4 of value — underscoring the effectiveness of observability in improving operational efficiency and delivering returns for organizations.
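The 4x figure follows directly from the spend and value medians quoted above; a quick sketch of the arithmetic, using the report's numbers:

```python
# Sanity-checking the ROI multiple from the report's median figures.
median_annual_spend = 1.95e6   # $1.95 million median annual observability spend
median_annual_value = 8.15e6   # $8.15 million estimated median annual value

roi_multiple = median_annual_value / median_annual_spend
print(f"{roi_multiple:.1f}x")  # about 4.2x, which the report rounds to 4x
```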
6. Observability adoption will continue to rise
With the growing complexity of the tech landscape, companies are increasingly looking to observability to better understand their systems, identify issues in real time, optimize performance, and align metrics with business outcomes. To get the most out of their observability spend, most survey respondents (91%) expect to deploy at least one new observability capability in 2025, and 86% expect to deploy six or more new observability capabilities by 2025, emphasizing the overall business value of observability.
Methodology: Released in October, the report draws on data from over 1,700 technology professionals across 16 countries.