New Relic, a SaaS web application performance management provider, has joined forces with the Internet Archive to support its new HTTP Archive project, the largest repository of Web performance data, monitoring 1 million of the Internet's top websites. Web performance leaders including Google, Mozilla, Etsy and others have also joined New Relic in supporting HTTP Archive's efforts.
"We share a common interest with the HTTP Archive project: to create a faster, better Web experience for everyone who uses the Internet," says New Relic founder and CEO Lew Cirne. "Understanding application performance is critical to achieving that goal. We believe in providing application and website creators with comprehensive visibility into performance data, so they can build apps that people want to use time and again. The performance data captured by HTTP Archive will prove to be a valuable resource, allowing businesses to benchmark their apps against the most popular sites around the world."
The HTTP Archive is a permanent repository of Web performance information created by Steven Souders in early 2011. When first launched, the HTTP Archive monitored just 20,000 of the top websites. Today, the project is growing to monitor 1 million sites, gathering information such as page size, failed requests, and technologies utilized. This performance information highlights trends in how the Web is built and provides a common data set from which to conduct Web performance research.
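The page-level metrics described above (page size, failed requests) can be illustrated with a simple aggregation. This is a hypothetical sketch, not the HTTP Archive's actual schema or collection pipeline; the record shape is invented for illustration.

```python
# Minimal sketch of per-page metrics like those the HTTP Archive gathers.
# The record shape below (url/status/bytes dicts) is hypothetical.

def summarize_page(responses):
    """Aggregate simple performance metrics from a list of HTTP responses."""
    total_bytes = sum(r["bytes"] for r in responses)
    failed = [r["url"] for r in responses if r["status"] >= 400]
    return {
        "requests": len(responses),            # total requests made by the page
        "total_kb": round(total_bytes / 1024, 1),  # overall page weight
        "failed_requests": len(failed),        # 4xx/5xx responses
    }

responses = [
    {"url": "https://example.com/", "status": 200, "bytes": 14_336},
    {"url": "https://example.com/app.js", "status": 200, "bytes": 102_400},
    {"url": "https://example.com/missing.png", "status": 404, "bytes": 0},
]
print(summarize_page(responses))
# → {'requests': 3, 'total_kb': 114.0, 'failed_requests': 1}
```

Aggregated across a million sites, metrics of this kind are what make it possible to benchmark one site against the broader Web.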