In APMdigest's exclusive interview, Nicola Sanna, President and CEO of Netuitive, discusses the importance of real-time analytics in solving the challenge of APM-generated Big Data.
NS: The APM market is confronted with a Big Data problem. Over the last two years, companies have deployed more and more agents. They are measuring not only how a machine is doing, but also how an application is doing. They are also starting to measure their business activity in real-time.
This is generating terabytes of data that existing tools find increasingly intractable to store, visualize and analyze for operational intelligence. So by that fundamental definition, there is a challenge with APM-generated Big Data.
Last year was the first year that Bank of America's online banking transactions surpassed the number of transactions at its retail branches. It was the first year that the volume of cell phone traffic surpassed the traffic on fixed lines. Businesses are increasingly driven by the Internet and the cloud. Consequently, there has been a shift in the industry in the amount of data being collected over the last two years.
Businesses used to monitor activity on a periodic basis – quarterly, monthly, weekly, daily. But now that many of those business activities are sustained by Web applications, they can measure the activity in real-time. That has been a significant change: they can start forecasting when they may have issues supporting and sustaining the business activity.
Five years ago they were unable to do that: their applications were not online, they did not have the instrumentation to capture business activity, and they did not have analytics like Netuitive to help them make sense of it.
That is the most significant change in our market in the last 10 years. And we see this trend continuing and leading to a complete explosion of harvested business data, which will require analytics, and a scalable infrastructure behind those analytics, to address.
NS: Virtually every conversation about Big Data problems includes a discussion of volume, variety, velocity and complexity of the data. It’s no different in APM.
One of the Big Data challenges is the multiplicity of data sources. The data comes from a very diverse set of monitoring tools and domains. Companies used to analyze data in silos: they would look at databases for business intelligence on consumer sentiment, or at numbers relating to a specific product line. Now suddenly they are looking at more complex problems, such as correlating business activity data with application and infrastructure performance in real-time.
So first, data capture is a problem. Customers used to ask us to analyze IT infrastructure data. Now customers are looking to integrate a dozen different data sources including application and infrastructure data, business activity metrics, customer experience metrics, and more.
Second is the cost of storing the data. The cost of relational databases is making alternative approaches much more attractive. We have new data models and distributed, NoSQL-style storage options such as Hadoop and Apache Cassandra, which are much more cost-effective ways to store large volumes of data.
The third challenge is the analytics. The traditional tools do not have the ability to analyze across multiple data sources and understand complex behavioral patterns in real time.
The fourth challenge is sharing the data. You need to provide rapid, real-time access to the users of the data: business owners, technical staff, application support. A variety of constituencies want to mine the data.
And the fifth challenge is visualizing the data in a way that users can consume without being completely overwhelmed. Displaying 20,000 elements on a screen does not make sense; sending customers 50,000 alerts does not make sense. What do we want to visualize, and how do we do that?
So the capturing, storing, sharing, analyzing and visualizing the data are all components of the Big Data problem in APM.
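As an illustration of that visualization and alerting challenge, raw alerts can be collapsed into a short per-component summary instead of being shown one by one. This is a generic sketch with invented component names and counts, not any product's actual schema:

```python
from collections import Counter

def summarize_alerts(alerts, top=3):
    """Collapse a flood of raw alerts into a per-component summary so a
    dashboard can show the noisiest components instead of every event.
    `alerts` is a list of (component, message) pairs -- an illustrative
    shape chosen for this sketch."""
    counts = Counter(component for component, _ in alerts)
    return counts.most_common(top)

# Hypothetical alert flood: 70 raw alerts reduce to three summary rows.
raw = ([("db-7", "slow query")] * 40
       + [("web-3", "timeout")] * 25
       + [("cache-1", "evictions")] * 5)
print(summarize_alerts(raw))  # -> [('db-7', 40), ('web-3', 25), ('cache-1', 5)]
```

The same idea scales up: users see a ranked handful of problem areas and drill down on demand, rather than facing tens of thousands of undifferentiated alerts.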
NS: One of our customers correlates 1 billion data points daily. Ten years ago companies monitored 500 to 1,000 servers every 15 minutes. Now they monitor 100,000 to 200,000 elements every minute. We are no longer monitoring only servers; we are also monitoring applications, storage, virtual data center components, end-user experience and business transactions.
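That scale jump can be sanity-checked with back-of-the-envelope arithmetic, assuming one reading per monitored element per collection interval:

```python
# Yesterday's scale: 1,000 servers polled every 15 minutes.
readings_then = 1000 * (24 * 60 // 15)   # 96,000 readings/day

# Today's scale: 200,000 monitored elements polled every minute.
readings_now = 200_000 * (24 * 60)       # 288,000,000 readings/day

print(readings_then, readings_now, readings_now // readings_then)
```

That is a roughly 3,000-fold increase in daily readings, and collecting even a handful of metrics per element pushes the total toward the one-billion-points-per-day figure cited above.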
Another company we are engaged with is a major wireless carrier. They operate 5,000 shops across the country, and they need to make sure that every time a person comes into a shop to buy a wireless device, it can be activated within minutes, providing a seamless customer experience without latency or delays. Just analyzing end-user experience at that level – 5,000 shops, each with multiple point-of-sale terminals, monitoring everything from billing to activation codes to latency to business activity – and then correlating those metrics with application and infrastructure performance leads to an explosion of data. That is why it is not surprising that we are getting into these numbers of data points every day. Our customers tell us that without advanced ways to integrate the data, correlate it, and analyze it, there is no way for them to understand where a problem lies.
NS: Absolutely. The explosion of data will continue. I think some customers will experience a tenfold increase in data over the next three years or so.
NS: With this explosion of data in the APM market, much of the data is now coming from tools we had never heard of before – open source tools and custom solutions – so we needed to upgrade our solution to account for that. It is no longer just infrastructure data coming from conventional monitoring tools such as HP, IBM, CA, BMC, Microsoft, VMware, etc.
Now a multitude of vendors have come up with very original ways of measuring business activity and transaction data and we have to find a way to capture that. So with 6.0 we had to extend our platform with a powerful integration and modeling SDK called Studio to account for the new level of data being generated in real-time, and to capture it, analyze it and correlate it with the rest of the application and infrastructure data. That is the most notable improvement we have made.
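To illustrate the kind of cross-source capture described here (a generic sketch, not Studio's actual API), metrics from different tools first have to be aligned on a shared timestamp so they become comparable. All names and values below are hypothetical:

```python
from collections import defaultdict

def align_metrics(*sources):
    """Merge several named {timestamp: value} metric streams into one
    timestamp-keyed view so readings from different tools can be compared
    side by side. Assumes timestamps are already normalized to a common
    clock and granularity."""
    merged = defaultdict(dict)
    for name, series in sources:
        for ts, value in series.items():
            merged[ts][name] = value
    return dict(merged)

# Hypothetical feeds: an infrastructure metric, an application metric,
# and a business-activity metric, each from a different tool.
cpu_pct = {"12:00": 45, "12:01": 97}
latency_ms = {"12:00": 120, "12:01": 840}
orders_per_min = {"12:00": 310, "12:01": 140}

view = align_metrics(("cpu_pct", cpu_pct),
                     ("latency_ms", latency_ms),
                     ("orders_per_min", orders_per_min))
print(view["12:01"])  # all three metrics for the same minute
```

Once the streams share a timeline, correlation and analysis across infrastructure, application, and business data become possible.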
Another improvement in version 6.0 is enhanced support for the Java application ecosystem. We realized that many mission-critical applications are written in Java – 70% to 80% of the enterprise market. Customers told us they do not want to write more rules and scripts to understand how their Java ecosystem is working; they want business and user experience insight out of the box. They asked us for even more advanced analytics that provide a solid understanding of anomalies in those environments and pinpoint the sources of issues in Java ecosystems with a high degree of accuracy, automatically. We have responded to that demand.
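Automatic anomaly detection of this kind is commonly built on behavioral baselining. The following toy sketch uses a sliding-window z-score to flag readings that deviate sharply from recently learned behavior; it illustrates the general idea only and is not Netuitive's actual algorithm:

```python
import math
from collections import deque

class RollingBaseline:
    """Learn a metric's normal range from a sliding window of recent
    values and flag readings that deviate sharply from it."""
    def __init__(self, window=60, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, x):
        anomalous = False
        if len(self.values) >= 10:  # need some history before judging
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = math.sqrt(var) or 1e-9  # avoid division by zero
            anomalous = abs(x - mean) / std > self.threshold
        self.values.append(x)
        return anomalous

baseline = RollingBaseline()
for v in [50, 52, 49, 51, 50, 48, 52, 51, 49, 50, 51]:
    baseline.observe(v)       # normal readings build the baseline
print(baseline.observe(200))  # -> True: a sharp spike is flagged
```

No rules or scripts are written here: the "normal" range is learned from the data itself, which is the appeal of this class of technique for large Java estates.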
NS: Two factors. First, we offer an open platform that takes any time series data into the product. This allows users to not be locked into any existing integration or data model. That is one of Netuitive's greatest strengths. We are the only open analytics solution out there that can do that for time series data.
Second, our predictive analytics engine is the only one that can self-learn and correlate the behaviors of complex systems and applications across these various data sources. For example, Netuitive is the only solution that can correlate business activity or transaction data with application performance and infrastructure performance. Being able to self-learn the behavior of complex systems that rely on multiple data sources to sustain service levels or drive business activity – we have not seen that in any other solution. Self-learning, not just from past patterns of behavior but from context, and then forecasting outcomes in the near future, is completely unique.
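As a minimal stand-in for that kind of cross-source correlation (the series below are invented, and a plain Pearson coefficient is far simpler than a production analytics engine), one can measure how tightly a business metric tracks an infrastructure metric:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length series: +1 means they
    move together, -1 means they move in opposite directions."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

# Hypothetical series: orders per minute fall as response time climbs.
response_ms = [100, 110, 105, 400, 900, 850]
orders = [300, 295, 305, 180, 60, 75]
print(round(pearson(response_ms, orders), 2))  # strongly negative
```

A strongly negative coefficient here quantifies the link between an infrastructure symptom and a business outcome, which is the relationship the interview describes learning and forecasting automatically.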
Large enterprises are looking for solutions like Netuitive that deliver real-time analytics against all data, including APM-generated Big Data, to visualize, isolate, and proactively alert application owners of impending performance issues before they impact quality of service.
NS: Next, we want to democratize the use of predictive analytics technology for a wider set of customers beyond the very large enterprises. In the future, most companies will look to access that service in the cloud. Netuitive is now working on SaaS-based solutions that will significantly shorten adoption time while lowering the cost of ownership. Stay tuned for more product announcements as we start delivering on our SaaS strategy.
Nicola Sanna is the President and CEO of Netuitive. Under his leadership Netuitive released its first commercial products and implemented a go-to-market strategy that has established Netuitive as the leader in self-learning performance management software. Prior to Netuitive, Sanna served as President and CEO of e-Security (acquired by Novell) and Chief Operating Officer of Allen Systems Group (ASG). Sanna sits on the North American board of the “Economy of Communion” Initiative, which promotes ethical and social entrepreneurship. He regularly lectures on the subject to undergraduate and graduate program students in business and economics at universities throughout the US.
To learn more about APM and Big Data, check out this Gartner webcast that addresses how predictive analytics is used by large organizations to manage application performance and detect anomalies before they cascade into service degradation and outages.