Enterprises Waste Over $2 Million Each Year on Data Availability Failures
January 16, 2015

Pete Goldin
APMdigest

Of those surveyed, 82 percent of CIOs admit that they are unable to meet their organization’s need for immediate, always-on access to IT services, according to the Veeam Data Center Availability Report 2014.

This availability gap carries immediate costs: application failures cost enterprises more than $2 million a year in lost revenue, lost productivity, missed opportunities, and data irretrievably lost when backups fail to recover. These costs will only increase as the global economy requires enterprises to work with partners, customers and stakeholders across time zones, pressuring data center assets to be always-on regardless of location. With emerging markets predicted to generate 40 percent of global growth within the next 15 years, missing global opportunities due to downtime can cause irrevocable damage.

“The availability of IT is more important than ever. Yet businesses globally are being failed by an IT industry that has led them to believe they have to accept downtime,” says Ratmir Timashev, CEO at Veeam. “This isn’t acceptable. Organizations can’t afford to lose millions of dollars from IT failures, nor can they continue to gamble with data availability. The good news is things are set to change. Organizations just need to throw away what they’ve been told for years about availability and demand better. If every organization does this, then in five years application availability will become a redundant topic as consumers and employees across the planet access what they want, when they want it.”

Key findings of the report include:

■ 82 percent of CIOs said they cannot meet the needs of their business. More than 90 percent of CIOs are under pressure both to recover data faster, reducing the financial impact of unplanned downtime, and to back up data more often, reducing the risk of data loss.

■ The reasons CIOs are under pressure include more frequent, real-time interactions among customers, partners, suppliers and employees (65 percent of respondents); the need to access applications across time zones (56 percent); increased adoption of mobile devices (56 percent); employees working outside regular hours (54 percent); and an increasing level of automation for decision making and transactions (53 percent).

■ Unplanned application downtime occurs more than once per month (13 times per year).

■ Unplanned application downtime costs an organization between $1.4 million and $2.3 million annually in lost revenue, decreased productivity and missed opportunities.

■ One in six backup recoveries fails, meaning that with 13 incidents of application downtime per year, data will be permanently lost at least twice (see the sketch after this list). This lost data costs enterprises a minimum of $682,000 annually.

■ Organizations are also risking between $4.4 million and $7.9 million of lost application data from downtime incidents each year.
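
Taken together, these figures produce the "more than $2 million a year" estimate cited at the top of the article. A minimal Python sketch of the arithmetic, using only the numbers quoted in the list above; the combined total at the end is illustrative rather than a figure published in the report:

```python
# Back-of-the-envelope sketch of the data-loss arithmetic above, using only
# the figures quoted in this list. The combined total at the end is
# illustrative and is not a figure published in the report.

DOWNTIME_INCIDENTS_PER_YEAR = 13   # unplanned application downtime events
RECOVERY_FAILURE_RATE = 1 / 6      # one in six backup recoveries fails

expected_loss_events = DOWNTIME_INCIDENTS_PER_YEAR * RECOVERY_FAILURE_RATE
print(f"Expected permanent data-loss events per year: {expected_loss_events:.1f}")
# -> about 2.2, i.e. data is permanently lost at least twice a year

downtime_cost_range = (1_400_000, 2_300_000)  # annual cost of downtime (USD)
lost_data_cost_floor = 682_000                # minimum annual cost of lost data (USD)

low = downtime_cost_range[0] + lost_data_cost_floor
high = downtime_cost_range[1] + lost_data_cost_floor
print(f"Combined annual exposure: ${low:,} to ${high:,}")
# -> $2,082,000 to $2,982,000, consistent with the "more than $2 million" estimate
```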

Businesses are already calling for greater availability. However, IT departments are missing the recovery time objective (RTO) their businesses demand for mission-critical data by more than an hour, and are more than 2.5 hours away from the always-on standards set by modern availability solutions. They are also missing the required recovery point objective (RPO), which governs how often data is backed up, by 1.5 hours, and are 4.5 hours away from modern always-on standards.
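
For context, RTO measures how quickly service must be restored after an outage, while RPO measures how much data, in time, the business can afford to lose between backups. The report publishes only the size of the gaps, so the hour values in the sketch below are hypothetical inputs chosen to yield gaps of roughly the reported size; the availability_gap helper is illustrative rather than a metric defined by Veeam:

```python
# Hypothetical illustration of how the RTO/RPO availability gap is measured.
# The hour values are assumptions chosen to yield gaps of roughly the size the
# survey reports; the report itself does not publish the underlying targets.

def availability_gap(required_hours: float, delivered_hours: float) -> float:
    """Hours by which IT misses the business's recovery target (positive = miss)."""
    return delivered_hours - required_hours

# Recovery time objective (RTO): how quickly service must be restored.
rto_required = 2.0    # assumed business demand for mission-critical data
rto_delivered = 3.1   # assumed time IT actually takes to recover
print(f"RTO gap: {availability_gap(rto_required, rto_delivered):.1f} h")  # just over an hour

# Recovery point objective (RPO): how much data, in time, can be lost,
# i.e. how frequently data must be backed up.
rpo_required = 1.0    # assumed backup interval the business needs
rpo_delivered = 2.5   # assumed interval IT currently delivers
print(f"RPO gap: {availability_gap(rpo_required, rpo_delivered):.1f} h")  # the 1.5 hour miss
```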

“Make no mistake, we are already in the era of the Always-On Business,” adds Timashev. “To keep pace, enterprises need entirely new types of solutions that enable 24/7 availability in a way that legacy data protection and backup products could never do. This means high-speed, guaranteed recovery of every file, application or virtual server when needed. It means leveraging backup data and environments to test the deployment of new applications, mitigating the risk of failure. And it means complete visibility, with proactive monitoring and alerting of issues before they affect operations. CIOs clearly recognize this, with 78 percent planning to change their data protection product in the next two years in order to get the availability that they need. As a result, the availability gap will start to become a thing of the past.”

Survey methodology: The Veeam Data Center Availability Report 2014 is based on a survey conducted online among 760 CIOs of companies with more than 1,000 employees across the United States, United Kingdom, Germany, France, Italy, the Netherlands, Switzerland, Brazil, Australia and Singapore. Vanson Bourne, an independent market research organization, directed the survey on behalf of Veeam.

Pete Goldin is Editor and Publisher of APMdigest