In Part Two of APMdigest's exclusive interview, Will Cappelli, Gartner Research VP in Enterprise Management, discusses the past and future of APM, and its key components such as analytics and end-user experience monitoring.
Click here to read Part One of the interview.
APM: How has APM technology changed in the last couple of years?
A couple of significant changes. One has been the recognition that APM is what we call a multidimensional problem. So you need a collection of different technologies that are looking at the application from different perspectives in order to create a fully rounded picture of what is going on. In the past, many of these different technologies were seen as being competitors of one another. But now I think enterprise buyers recognize that they complement one another.
Second, I would say what has changed most significantly over the last few years is the focus on end-user experience monitoring, especially the drive to capture the real user experience as opposed to capturing some proxy of that user experience. Utilizing synthetic transactions is still seen as valuable in a supplementary way. But the main event around end-user experience monitoring is being able to capture what is actually going on when a user is accessing the system.
The other critical component in all of this is the increasing importance accorded to analytics. As application architectures become more distributed, as they become more dynamic, the ability to see what is happening in the application becomes limited. So you want tools that are able to learn, almost on-the-fly, the relationships among the different variables that describe the states of different components within the application, and then learn the behavior of this dynamic system.
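As an illustration of what learning relationships among metric variables "almost on-the-fly" can mean in practice, here is a minimal sketch of a streaming correlation tracker: it updates the relationship between two component metrics one sample at a time, without storing history. This is a generic single-pass (Welford-style) technique, not the method of any particular APM product; all names are hypothetical.

```python
class StreamingCorrelation:
    """Incrementally tracks the Pearson correlation of two metric streams."""

    def __init__(self):
        self.n = 0
        self.mean_x = 0.0
        self.mean_y = 0.0
        self.m2_x = 0.0   # sum of squared deviations of x
        self.m2_y = 0.0   # sum of squared deviations of y
        self.cov = 0.0    # co-moment of x and y

    def update(self, x, y):
        # Single-pass update: no need to retain the raw history.
        self.n += 1
        dx = x - self.mean_x          # deviation from the old mean of x
        dy = y - self.mean_y          # deviation from the old mean of y
        self.mean_x += dx / self.n
        self.mean_y += dy / self.n
        self.m2_x += dx * (x - self.mean_x)
        self.m2_y += dy * (y - self.mean_y)
        self.cov += dx * (y - self.mean_y)

    def correlation(self):
        denom = (self.m2_x * self.m2_y) ** 0.5
        return self.cov / denom if denom > 0 else 0.0
```

Feeding it, say, web-tier latency and database queue depth samples would surface how strongly the two components move together, which is the kind of relationship such analytics tools aim to discover automatically.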
In the future, we expect many of the other elements that constitute APM to play an increasingly subsidiary role. The center of APM will become the rich capture of end-user experience data, supplemented by a very powerful analytics capability that draws data from whatever sources are available at the time to learn the causal patterns that describe the application's behavior.
APM: Is the market focusing more on analytics because it is a key component of APM?
APM is one of the factors that has driven increased attention to analytics. Other critical factors are the complexity and dynamism of the virtual environment, trying to deal with some of the complexities of monitoring cloud-based applications, and the ongoing increase in the complexity of IT overall. All these factors are working together.
But APM is one of the most important use cases because you have multiple data sources which need to be looked at simultaneously. Each of the data collection technologies generates vast quantities of data. And then you also have the issue that although the data sets are large, they are not redundant. You cannot just sample a small part of these datasets and figure you've gotten the message of the whole. You really need to be looking at large segments of those datasets in order to learn the lesson they are trying to teach you. That means some kind of automated capability that will allow you to discover the patterns inherent in those data sets.
APM: You mentioned the large datasets. How do you solve that old problem of performance monitoring systems delivering too much information?
You are hitting on a fundamental point. The data volumes are exploding, and they are much more difficult to manage by themselves. In the old days, you would have a person sitting in the NOC looking at a screen, trying to decide whether events were green, yellow or red. Those days are rapidly going away. You need an aggregating, simplifying pattern discovery capability that will overlay the data and help you make sense of it all.
There are some key, straightforward statistics that you want to present in their simplicity because they are meaningful in and of themselves. There is a discipline, an art, to creating a meaningful higher-level health index that can be positioned to the executive, or even to the IT operations professional, with a minimum of explanation. Here is where analytical technologies play a significant role, in that they allow you to extract meaningful and intuitive graphs that describe the relationships among the different variables that impact the system.
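A minimal sketch of such a higher-level health index: each raw metric is normalized against a target threshold and the results are combined into a single 0-100 score that needs little explanation. The metric names, thresholds, and the linear scoring scheme are all hypothetical, chosen only to illustrate the idea.

```python
def health_index(metrics, targets, weights=None):
    """Combine raw metrics into a single 0-100 health score.

    metrics -- observed values, e.g. {"latency_ms": 180, "error_rate": 0.02}
    targets -- "good" thresholds for each metric; lower observed is better
    weights -- optional relative importance per metric (defaults to equal)
    """
    weights = weights or {name: 1.0 for name in metrics}
    total_weight = sum(weights.values())
    score = 0.0
    for name, value in metrics.items():
        # 100 at or below target, degrading linearly to 0 at twice the target.
        ratio = value / targets[name]
        per_metric = max(0.0, min(100.0, 100.0 * (2.0 - ratio)))
        score += weights[name] * per_metric
    return score / total_weight
```

The design choice here is deliberate simplicity: an executive only needs to know that 100 is healthy and the score falls as metrics drift past their targets, while the underlying analytics decide which metrics and weights matter.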
APM: The Magic Quadrant talks about the five functionalities of APM, including end-user experience monitoring and analytics, as well as runtime application architecture discovery, modeling and display; user-defined transaction profiling; and component deep-dive monitoring. Are these functionalities based on the vendor offerings?
These five functionalities represent more or less the conceptual model that enterprise buyers have in their heads. I think that, in fact, the vendors came to support that model kicking and screaming. Many tried to narrow the APM problem to one dimension or another. If you go back and look at the various head-to-head competitions and marketing arguments that took place even as recently as two years ago, you see vendors pushing one of the five functional areas as being the key to APM. I think it was only the persistent demand on the part of enterprise buyers for all five capabilities that drove vendors to populate their portfolios in a way that would adequately reflect those five functionalities.
APM: What is missing in APM today?
There are a couple of key areas. It all comes down to the fact that this generation of APM technology has emerged to deal with traditional web-based applications. This points to where the gaps are right now.
First of all, there is the fact that the Internet is becoming more complex. It is much more difficult to see what is happening at the edge of the Internet unless you are actually there monitoring it. At the same time, the impact of what happens at the edge on the user's perception of the application's performance is growing, so the need to get into the edge of the Internet and see what is going on there has increased. The way we can summarize that is to say these tools need to get a lot better at monitoring Web 2.0 applications and applications being accessed over mobile devices.
Second, you also have the issue of all those legacy applications that are not web-based. We are finding many enterprises looking at the more traditional legacy applications and becoming frustrated because current APM tools don't handle those environments that well. That includes not only traditional big client environments like SAP or PeopleSoft, but also some of the Citrix-based environments as well. They are not handled by current technologies with the same degree of thoroughness that web-based applications are.
The last area where there is very patchy treatment by the vendor community is vertical industry applications. These technologies do quite well with in-house developed applications built in Java and .NET, but when it comes to that whole realm of off-the-shelf, industry-specific applications – banking applications, healthcare applications – APM technologies still fall short.
APM: Are companies utilizing APM side by side with their own in-house monitoring for the industry niche apps?
What ends up happening is that you have a great inequality in the degree to which applications are monitored and managed. The in-house developed applications are well monitored, while the packaged apps are usually monitored using some fairly low-level functionality that is offered by the vendors of the application packages themselves.
APM: You predicted that analytics and end-user experience monitoring will become even more important. Do you have any other predictions about major changes to come for APM?
A couple of other key changes will happen. I think we will see APM becoming increasingly embedded into our overall lifecycle approach to application management. This is an old topic but it has really been revitalized over the last year and a half or so. Sometimes you hear the term “devops”, which comes from the cloud community, but it is basically nothing more than a reawakening of interest in the overall application lifecycle. So we see APM and application development coming together into a single lifecycle approach. And many of the technologies that are used in production are being ported over to the development side in order to create a consistent view of the application across its lifecycle.
Second, we anticipate APM will become increasingly embedded into an automation cycle: it will be used for dynamic provisioning and dynamic infrastructure configuration, and to create feedback loops. So if a performance problem is picked up by the APM system, that feeds back into your data center automation system, which may reprovision some resources so that the application starts to perform well again.
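The feedback loop described here can be sketched in a few lines: an observed performance signal drives a provisioning decision, which the automation layer then acts on. This is only an illustrative toy policy; the SLO threshold, the scale-by-one rule, and the function names are hypothetical, not drawn from any real APM or automation product.

```python
LATENCY_SLO_MS = 250.0  # hypothetical service-level objective

def feedback_step(observed_latency_ms, current_instances, max_instances=10):
    """Return the instance count the automation layer should provision next.

    Stands in for the APM -> data center automation feedback loop:
    the monitoring signal (latency) drives a reprovisioning decision.
    """
    if observed_latency_ms > LATENCY_SLO_MS and current_instances < max_instances:
        # Performance problem detected: scale out by one instance.
        return current_instances + 1
    if observed_latency_ms < 0.5 * LATENCY_SLO_MS and current_instances > 1:
        # Comfortable headroom: scale back in to free resources.
        return current_instances - 1
    # Within the acceptable band: leave provisioning unchanged.
    return current_instances
```

In a real deployment this decision would be re-evaluated on each monitoring interval, with the APM system supplying `observed_latency_ms` and the automation system applying the returned instance count.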
Third, we think there will be increased integration between the monitoring and management of business applications and the monitoring and management of what are now considered to be more network services, such as VoIP and IP video. We anticipate that organizations will use more and more of the same stack to monitor both types of services.
Finally, I think we will see more integration between APM and overall business process monitoring, where applications and business processes will become more entangled over time and hence will need to be managed in conjunction.
APM: The Magic Quadrant mentioned some statistics that show a 10-15% rise in APM adoption. Do you see that continuing to rise?
Barring economic apocalypse, I think it is safe to predict that the adoption rate will continue over the next 4 or 5 years. I think it is difficult to see beyond that because of the general changing nature of IT itself. The relationship between users and the IT environment will change so significantly over the next 10 years that it may be hard to identify something as an application in 10 years.