Progressing Expectations in Advanced IT Analytics: How the Industry is Still Getting it Wrong - Part 1
March 21, 2016

Dennis Drogseth
EMA


New EMA research is just in on advanced IT analytics (AIA) and the results are telling. I'll be giving a webinar on April 13 with much more detail and insight than I can present here, but in this 2-part blog I wanted to share a few highlights — and a few opinions about the data — in advance.

We spoke to 250 respondents, 100 in Europe and 150 in North America. Company size was 500 employees and above, and all respondents were actively involved in AIA. They spanned many roles across IT, with a strong executive presence and a meaningful percentage of business stakeholders.

We were able to compare the results with earlier research from 2014: some areas show marked advancement, while other data points have remained surprisingly consistent. This year we focused ONLY on actual deployments, and we targeted two specific use cases:

■ Performance and availability analytics

■ Change and capacity/optimization analytics

However, we did ask proactively about security-driven analytics, which have become more and more intertwined with performance and change.

Rather than forcing a template of technologies or data sources on our respondents, our exploratory research let the "real world" of active AIA deployments define itself.

Here's Some of What We Learned

Maybe the biggest single surprise was that 100% of our respondents were using AIA for performance. Of these, 60% were also using AIA for change management or capacity/optimization. What this indicates, of course, is that performance and availability are the mainstream use case, a place to begin. Change management and capacity/optimization are next-step initiatives that generally involve more AIA technologies and more data sources but, interestingly, show slightly lower success rates.

Just a few other highlights are:

■ IT respondents wanted AIA coverage for more than 7 domains, 4 triage support options, and 4.5 business impact metrics.

■ In 2016 the average number of roles (domain, cross-domain and business) supported by AIA was 11, compared to 9 roles in 2014.

■ IT respondents seek to invest in nearly 4 distinct analytic technologies as a part of their AIA initiatives, and draw from 5 different types of data sources. The top analytic choices were process analytics and anomaly detection. The top two data sources were security information and event management (SIEM) and the Internet of Things (IoT). Both of these priorities differ from 2014 and suggest an even broader use-case focus, with increasing interest in business alignment.

■ Respondents want to integrate about 15 monitoring or other third-party tool sources into an AIA investment.

■ The average respondent indicated about four unique benefits achieved via AIA. The top three were more efficient use of cloud resources, more efficient use of storage, and faster time to repair problems.
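To make one of the survey's top analytic choices concrete: anomaly detection over performance metrics is, at its simplest, a statistical baseline plus a deviation test. The sketch below is purely illustrative and assumes nothing about any specific AIA product; the function name, window size, and threshold are all hypothetical choices, not findings from the research.

```python
# A minimal sketch of metric anomaly detection, one of the top analytic
# choices named by respondents. All names and thresholds here are
# illustrative assumptions, not part of any specific AIA tool.

from statistics import mean, stdev

def detect_anomalies(samples, window=20, z_threshold=3.0):
    """Flag points whose z-score against a trailing window exceeds the threshold."""
    anomalies = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]   # trailing window of recent history
        mu = mean(baseline)
        sigma = stdev(baseline)
        if sigma > 0 and abs(samples[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Example: steady ~100 ms response times with one injected latency spike
latencies = [100 + (i % 5) for i in range(40)]
latencies[30] = 400  # the anomaly
print(detect_anomalies(latencies))  # [30]
```

Real AIA deployments layer far more on top of this (seasonality, multivariate correlation, topology awareness), but the underlying idea of baselining "normal" and flagging statistically significant deviations is the same.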

Read Part 2: Progressing Expectations in Advanced IT Analytics: How the Industry is Still Getting it Wrong

Dennis Drogseth is VP at Enterprise Management Associates (EMA)
