End User Monitoring - Reports of EUM's Death Have Been Greatly Exaggerated
February 22, 2016

Larry Haig
Intechnica


Once upon a time (as they say), client-side performance was a relatively straightforward matter. The principles were known (or at least available – thank you, Steve Souders et al), and the parameters surrounding delivery, whilst generally limited in modern terms (IE5/Netscape, dial-up connectivity anyone?), were at least reasonably predictable.

This didn't mean that enough people addressed client-side performance (then or now, for that matter), despite the alleged 80% of delivery time spent on the user's machine, and the undoubted association between application performance and business outcomes.

From a monitoring and analysis point of view, synthetic external testing (or end user monitoring) did the job. Much has been written (not least by myself) on the need to apply best practice and to select your tooling appropriately. The advent of “real user monitoring” (RUM) came some 10 years ago – a move at first decried, then rapidly embraced, by most of the “standalone” external test vendors. The undoubted advantages of real user monitoring in terms of breadth of coverage and granular visibility into multiple user end points – geography, O/S, device, browser – tended for a time to mask the different, though complementary, strengths of consistent, repeated performance monitoring at page or individual (e.g. third-party) object level.
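To make the distinction concrete, here is a minimal sketch of how a RUM beacon typically gathers its data in the browser, using the W3C Navigation Timing API. The /rum-beacon endpoint and the metric names are illustrative assumptions, not any particular vendor's implementation:

```typescript
// Minimal RUM sketch: read W3C Navigation Timing values once the page
// has loaded and beacon them to a (hypothetical) collection endpoint.
window.addEventListener('load', () => {
  // Defer one tick so loadEventEnd has been populated.
  setTimeout(() => {
    const t = performance.timing;
    const sample = {
      dns:      t.domainLookupEnd - t.domainLookupStart,
      connect:  t.connectEnd - t.connectStart,
      ttfb:     t.responseStart - t.requestStart,
      domReady: t.domContentLoadedEventEnd - t.navigationStart,
      onload:   t.loadEventEnd - t.navigationStart,
      // Captured so the backend can segment by device, browser and O/S.
      userAgent: navigator.userAgent,
    };
    // sendBeacon is designed to survive page unload; the URL is illustrative.
    navigator.sendBeacon('/rum-beacon', JSON.stringify(sample));
  }, 0);
});
```

Synthetic tools, by contrast, generate comparable measurements from controlled agents on a fixed schedule – which is precisely what makes them consistent and repeatable.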

Fast forward to today, though, and the situation demands a variety of approaches to cope with the extreme diversity of delivery conditions. The rise and rise of mobile (as just one example, major UK retailer JohnLewis.com quoted over 60% of digital orders as derived from mobile devices during 2015/16 peak trading) brings many challenges to Front-End Optimization (FEO) practice. These include the diversity of device types and versions, of browsers, and of limiting connectivity conditions.

This situation is compounded by the development of the applications themselves. As far as the web is concerned, monitoring challenges are introduced by, amongst other things: Single Page Applications (either full or partial); “server push content”; and mobile “WebApps” driven by service worker interactions. Mobile applications, whether native or hybrid, present their own analysis challenges, which I will also address in subsequent posts.
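For Single Page Applications in particular, the core difficulty is that onload fires only once, on the initial hard navigation; subsequent route changes have to be instrumented explicitly. Below is a sketch using the User Timing API – the startRouteChange/finishRouteChange hooks are hypothetical placeholders for whatever router or framework is in use:

```typescript
// Sketch: timing an SPA "soft navigation" with the User Timing API,
// since the browser's onload event only covers the initial page load.
// startRouteChange/finishRouteChange are hypothetical hooks to be wired
// into the application's router.
function startRouteChange(routeName: string): void {
  performance.mark(`${routeName}:start`);
}

function finishRouteChange(routeName: string): void {
  performance.mark(`${routeName}:end`);
  performance.measure(routeName, `${routeName}:start`, `${routeName}:end`);
  const [entry] = performance.getEntriesByName(routeName, 'measure');
  // In a real RUM setup this duration would be beaconed, not logged.
  console.log(`${routeName} completed in ${entry.duration.toFixed(0)} ms`);
}
```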

This already rich mix is further complicated by business demands for more on-site content – multimedia, exotic fonts and other rich assets. Increasingly large amounts of client-side logic, whether as part of SPAs or otherwise, demand focused attention to avoid unacceptable performance in edge-case conditions.

As if this weren't enough, the (at last!) emergence of HTTP/2 introduces both advantages and anti-patterns relative to former best practice – long-standing techniques such as domain sharding and asset concatenation, for example, can now work against you.

The primitive simplicity of page onload navigation timing endpoints has moved beyond irrelevance to become positively misleading, regardless of the type of tool used.
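By way of illustration, here is a sketch that captures a more user-centric milestone alongside onload. It assumes a browser that implements the Paint Timing API, and the reporting is deliberately simplified:

```typescript
// Sketch: contrast the classic onload milestone with first contentful
// paint, assuming Paint Timing API support in the browser.
const paintObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      console.log(`First contentful paint: ${entry.startTime.toFixed(0)} ms`);
    }
  }
});
paintObserver.observe({ entryTypes: ['paint'] });

window.addEventListener('load', () => {
  const t = performance.timing;
  // On a heavily scripted page these two numbers can differ wildly,
  // in either direction.
  console.log(`onload: ${t.loadEventStart - t.navigationStart} ms`);
});
```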

These changes therefore require a more subtle approach, combined with a range of tools, to ensure that FEO recommendations are both relevant and effective.

I will provide some thoughts in subsequent blogs as to effective FEO approaches to derive maximum business benefit in each of these cases.

The bottom line is, however, that FEO is more important than ever in ensuring optimal business outcomes from digital channels.

Larry Haig is Senior Consultant at Intechnica.

