Study of U.S.-based data centers quantifies the cost of an unplanned data center outage.
Data center downtime remains a costly line item for organizations, and the cost has increased significantly in the last three years, according to the results of the “2013 Cost of Data Center Outages,” a new Ponemon Institute study sponsored by Emerson Network Power. The study of U.S.-based data centers quantifies the cost of an unplanned data center outage at slightly more than $7,900 per minute, a 41% increase from the $5,600 per minute reported in 2010, when Emerson Network Power first partnered with the Ponemon Institute to calculate the costs associated with data center downtime.
This year’s report analyzes costs incurred over the past year at 67 data centers (each a minimum of 2,500 sq ft) across varying industry segments. It provides a comprehensive analysis of the direct, indirect, and opportunity costs of data center outages, including damage to mission-critical data, the impact of downtime on organizational productivity, damage to equipment, legal and regulatory repercussions, and lost confidence and trust among key stakeholders.
Highlights of the 2013 Cost of Data Center Outages report include:
• The average cost of data center downtime across industries was approximately $7,900 per minute, a 41% increase from $5,600 in 2010.
• The average reported incident length was 86 minutes, resulting in an average cost per incident of approximately $690,200. (In 2010, it was 97 minutes at approximately $505,500.)
• For a total data center outage, which had an average recovery time of 119 minutes, average costs were approximately $901,500. (In 2010, it was 134 minutes at approximately $680,700.)
• For a partial data center outage, which averaged 56 minutes in length, average costs were approximately $350,400. (In 2010, it was 59 minutes at approximately $258,000.)
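The year-over-year increases implied by these figures can be verified with a quick calculation. The sketch below uses only the 2013 and 2010 averages reported above; the percentage labels for the per-incident and per-outage figures are computed here, not quoted from the study, which itself cites only the 41% per-minute increase:

```python
# Reported averages from the 2013 and 2010 Ponemon studies (2013, 2010).
# Only the 41% per-minute increase appears in the report; the other
# percentages below are derived for illustration.
figures = {
    "cost per minute":     (7_900, 5_600),
    "cost per incident":   (690_200, 505_500),
    "total outage cost":   (901_500, 680_700),
    "partial outage cost": (350_400, 258_000),
}

for label, (cost_2013, cost_2010) in figures.items():
    pct_increase = (cost_2013 / cost_2010 - 1) * 100
    print(f"{label}: +{pct_increase:.0f}% vs. 2010")

# cost per minute: +41% vs. 2010
# cost per incident: +37% vs. 2010
# total outage cost: +32% vs. 2010
# partial outage cost: +36% vs. 2010
```

Note that per-incident costs rose even as average incident lengths shortened, which is consistent with the per-minute cost rising faster than durations fell.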
Organizations with revenue models that depend on the data center’s ability to deliver IT and networking services to customers, such as telecommunications service providers and e-commerce companies, and those that handle large amounts of secure data, such as defense contractors and financial institutions, continue to incur the most significant downtime costs, with the highest cost of a single event exceeding $1.7 million.
These same industries did see a slight decrease (2% to 5%) compared to 2010 costs, while organizations that traditionally have been less dependent on their data centers saw a significant increase. The largest increase was in the hospitality sector, at 129%, followed by the public sector (116%), transportation (108%), and media organizations (104%).
Emerson Network Power also issued an infographic summarizing the costs of data center downtime and the findings of the study.
In September, Emerson Network Power released the first part of this study, which surveyed more than 450 U.S.-based data center professionals and focused on the root causes and frequency of data center downtime. Survey respondents experienced an average of two complete data center outages during the past two years, while partial outages, or those limited to certain racks, occurred six times in the same timeframe. The average number of device-level outages, or those limited to individual servers, was the highest at 11. These frequencies have declined slightly from the 2010 findings (complete: 2.5, partial: 7, device level: 10).
Eighty-three percent of respondents said they knew the root cause of the unplanned outage. The three most frequently cited root causes of outages remain unchanged from the 2010 report: UPS battery failure (55%), accidental EPO (emergency power off)/human error (48%), and UPS capacity exceeded (46%). Thirty-four percent of respondents cited cyber attacks, up from 15% in 2010, while 30% cited weather-related causes, up from 20% in 2010. Fifty-two percent believe all or most of the unplanned outages could have been prevented.