Valovic on Virtualization


Tideway’s CEO on the Green Data Center

It's often hard for people outside of the slightly specialized world of distributed computing and data centers to get their heads around quite how critical and complex they have become. So I find the attention the media is now lavishing upon the data center industry to be bittersweet. “Sweet” because it certainly serves to put into perspective the dark art of data center computing. A recent article in the Economist noted that by 2010, the total number of servers in the U.S. is expected to grow to 15.8 million, located in 7,000 data centers nationwide – the biggest of which currently contain up to 80,000 servers each.

“Bitter” because of the increasingly significant impact data center energy consumption is having on the planet. Between 2000 and 2005, annual energy usage rose from just over 50 billion kWh to over 150 billion kWh. Over half of this energy is used to power servers, and 40% is used to keep those servers cool enough to operate. According to the EPA, data centers now account for 1.5% of all electricity consumption in the U.S., up from 0.6% in 2000 and 1% in 2005. McKinsey and the Uptime Institute estimate that data centers globally account for more annual carbon dioxide emissions than the entire country of Argentina.

In this context, the new key metric is “performance-per-watt,” and there is plenty of scope to improve it. The problem is that the people, processes and systems required to drive this initiative forward are seriously lagging behind. A research program on best practices set up by the EPA has attracted only 54 volunteers. Most data center administrators are nowhere close to knowing what’s running on which servers in their data centers, which makes adopting more efficient hardware and consolidating software onto a single box through virtualization risky, slow and arduous.
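To make the metric concrete, here is a minimal back-of-the-envelope sketch in Python of how performance-per-watt might be compared before and after consolidating workloads onto fewer, better-utilized machines. The throughput and power figures are illustrative assumptions, not measurements from any particular data center.

```python
# Illustrative sketch: comparing performance-per-watt before and after
# consolidation. All figures below are made-up examples, not measured data.

def perf_per_watt(requests_per_sec: float, watts: float) -> float:
    """Throughput delivered per watt of power drawn."""
    return requests_per_sec / watts

# Hypothetical scenario: ten lightly loaded servers vs. two well-utilized ones.
legacy = perf_per_watt(requests_per_sec=10 * 150, watts=10 * 400)       # ~0.38 req/s per watt
consolidated = perf_per_watt(requests_per_sec=2 * 750, watts=2 * 450)   # ~1.67 req/s per watt

print(f"Legacy:       {legacy:.2f} req/s per watt")
print(f"Consolidated: {consolidated:.2f} req/s per watt")
```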

Power efficiency is only one constraint when it comes to data center design, planning and operation. Consider compliance regulations, real estate costs, and the need to locate servers and compute cycles with latency, cooling and power supply in mind, and the truly mind-bending challenge of data center optimization becomes clear. As virtualization and cloud computing in all their variants really take off, energy consumption and cost may be optimized by moving workloads from one virtual machine to another, but this will only increase the cost and complexity of managing the applications.

The right IT management tools can help a company reduce its carbon footprint – and save considerable costs in the process. That’s not a myth. The process isn’t necessarily simple and there is certainly additional work involved when we’re talking about adopting an enterprise-wide green strategy. But before they can get to the big work in the data center – decommissioning the inefficient or useless servers, virtualizing the existing resources, relocating – organizations need to be able to approach these initiatives with accurate and thorough intelligence about their IT assets and the relationships between infrastructure, specific business services and energy consumption. Without this intelligence, it can be hard to know what to do to reduce emissions, much less how to do it without risking the interruption of business-critical, revenue-generating services.

The onus is on IT to get the basics right. In the average data center, most power-hungry servers run at only 10-15% utilization. Combine this with the 30-plus percent of servers the Uptime Institute estimates are obsolete or effectively decommissioned but still drawing power, and it should be clear that there are significant and immediate cost and green savings to be had by implementing processes that quickly identify and remove these inefficiencies. In a journey of a thousand steps, these are the first strides.
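As a rough illustration of the scale of those first strides, here is a hypothetical estimate of the energy and money wasted by idle-but-powered servers. The estate size, per-server draw, cooling overhead and electricity price are assumptions for illustration only; only the 30% idle share comes from the Uptime Institute figure cited above.

```python
# Back-of-the-envelope sketch of the savings from decommissioning idle servers.
# The server count, power draw, overhead, and tariff below are assumptions;
# plug in your own inventory data.

SERVERS_TOTAL = 1_000          # servers in the estate (assumed)
IDLE_FRACTION = 0.30           # share doing no useful work, per the Uptime figure above
WATTS_PER_SERVER = 400         # average draw of an idle-but-powered server (assumed)
COOLING_OVERHEAD = 1.7         # facility watts per IT watt (assumed PUE-style multiplier)
PRICE_PER_KWH = 0.10           # dollars per kWh (assumed)

idle_servers = SERVERS_TOTAL * IDLE_FRACTION
kwh_per_year = idle_servers * WATTS_PER_SERVER * COOLING_OVERHEAD * 8760 / 1000

print(f"Idle servers:        {idle_servers:.0f}")
print(f"Energy wasted/year:  {kwh_per_year:,.0f} kWh")
print(f"Cost wasted/year:    ${kwh_per_year * PRICE_PER_KWH:,.0f}")
```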

Posted by Tom Valovic on 10/20/2008 at 12:49 PM

