The total number of datacenters (of all types) in the United States declined for the first time in 2009, falling by 0.7%, triggered by the economic crisis of 2008 and the resultant closing of thousands of remote locations with server closets and rooms. At the same time, total datacenter capacity grew by slightly more than 1% as larger datacenter environments continued to expand despite the economic slowdown. According to new research from International Data Corporation (IDC), these trends have continued in the years since 2009 and reflect a major change in datacenter and IT asset deployment that will accelerate further in coming years.
The dynamics driving these changes in the US datacenter market center around the fast-growing array of applications and devices used to communicate and conduct business, the rapid digitization of vast amounts of unstructured data, and the desire to collect, store, and analyze this information in ever-greater volume and detail. These dynamics have had a significant impact on how businesses build, organize, and invest in datacenter facilities and assets.
"CIOs are increasingly being asked to improve business agility while reducing the cost of doing business through aggressive use of technologies in the datacenter," said Rick Villars, vice president, Datacenter and Cloud Research at IDC. "At the same time, they have to ensure the integrity of the business and its information assets in the face of natural disasters, datacenter disruptions, or local system failures. To achieve both sets of objectives, IT decision makers had to rethink their approach to the datacenter."
The most notable factor reshaping datacenter dynamics was the dramatic increase in the use of server virtualization to consolidate server assets. Virtualization and server consolidation drove significant declines in physical datacenter size and eliminated the need for many smaller datacenters as applications were moved to larger central datacenters. It also made investments in power and energy management that much more critical for datacenter managers.
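To make the consolidation arithmetic concrete, here is a minimal sketch of how virtualization shrinks a physical footprint. The fleet size, consolidation ratio, rack-unit, and power figures are hypothetical assumptions chosen for illustration; they are not figures from the IDC report.

```python
# Illustrative consolidation arithmetic: how server virtualization shrinks a
# physical footprint. All inputs below are hypothetical assumptions, not
# figures from the IDC report.
import math

def consolidation_footprint(legacy_servers: int,
                            vms_per_host: int,
                            ru_per_machine: int = 2,
                            watts_legacy: int = 400,
                            watts_host: int = 800) -> dict:
    """Estimate host count, rack space, and power draw after consolidating
    standalone servers onto virtualization hosts at a given VM ratio."""
    hosts = math.ceil(legacy_servers / vms_per_host)
    return {
        "hosts_needed": hosts,
        "rack_units_before": legacy_servers * ru_per_machine,
        "rack_units_after": hosts * ru_per_machine,
        "kw_before": legacy_servers * watts_legacy / 1000,
        "kw_after": hosts * watts_host / 1000,
    }

# Example: 200 legacy servers consolidated at 10 VMs per host leaves 20 hosts,
# roughly a 10x drop in rack space and a 5x drop in total power under these
# assumptions -- but each remaining rack runs much denser, which is why power
# and energy management becomes more critical, not less.
print(consolidation_footprint(200, 10))
```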
While the aggressive use of virtualization has reduced the rate of growth in server deployments in datacenters, the creation, organization, and distribution of files and rich content are creating a rapid and sustained increase in storage deployments. One of the key characteristics of the content explosion is data centralization, driven by performance, compliance, and scale requirements. As a result, midsize and large datacenters are the main segments where the content explosion is having a major impact.
A third factor shaping the datacenter dynamic has been the shift toward a cloud model for application, platform, and infrastructure delivery. Here the focus is on extending the value and scale of virtualization by boosting operational efficiency and improving IT agility. Along with the content explosion, the buildout of public cloud offerings is driving major growth in the number and size of larger datacenters.
Combined, these factors will continue to drive a slow but steady decline in the number and size of smaller internal datacenters. For similar reasons, large internal datacenters will not grow at anywhere near the same rate as very large datacenters operated by service providers.
IDC expects the total number of datacenters in the US to decline from 2.94 million in 2012 to 2.89 million in 2016. This decline will be concentrated in internal server rooms and closets, with a very small decline in mid-sized local datacenters.
Despite the slight decline in total datacenters, total datacenter space will increase significantly, growing from 611.4 million square feet in 2012 to more than 700 million square feet in 2016. By the end of the forecast period, IDC expects service providers will account for more than a quarter of all large datacenter capacity in place in the United States.
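For context, the compound annual rates implied by these forecast figures can be worked out directly from the 2012 and 2016 values quoted above. The back-of-the-envelope check below is not part of the IDC report; it simply applies the standard CAGR formula to the reported numbers.

```python
# Back-of-the-envelope check on the compound annual growth rates (CAGR)
# implied by the figures quoted above (2012 -> 2016, i.e. 4 years).

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

count_rate = cagr(2.94e6, 2.89e6, 4)    # total US datacenters
space_rate = cagr(611.4e6, 700e6, 4)    # total datacenter square feet

print(f"datacenter count: {count_rate:+.2%} per year")  # roughly -0.4%/yr
print(f"datacenter space: {space_rate:+.2%} per year")  # roughly +3.4%/yr
```

The contrast is the article's point in miniature: the number of sites shrinks by well under half a percent per year, while total floor space grows by several percent per year as capacity concentrates in larger facilities.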
The IDC report, U.S. Datacenter 2012-2016 Forecast (Doc #237070), provides a census of U.S. datacenters by size, sophistication, and ownership. The report forecasts datacenter investment plans through 2016 and assesses the impact of changing industry business models, as well as IT and network developments, on datacenter design, build, and management. The report also includes a new datacenter taxonomy based on a multitude of factors, including scope of IT personnel control, physical location, types of applications supported, power and cooling, downtime, floor area, and staff skill sets.