The tech industry is waking up to the need for greener data centers, but how much money can be saved through better energy-saving practices? A report out today from the Environmental Protection Agency says that $4 billion in annual electricity costs could be saved through greater energy efficiency in U.S. data centers.
This represents a major opportunity for startups that can make IT equipment and the heating and cooling infrastructure more efficient. Companies focused on virtualization, such as VMware, have been growing rapidly.
The EPA’s 133-page “Report to Congress on Server and Data Center Energy Efficiency” examines trends in data center energy use and suggests ways to improve energy efficiency. It’s chock-full of stats pointing out the inefficiencies and power-hogging practices of U.S. data centers.
Here are some key findings:
- Data centers consumed roughly 60 billion kilowatt-hours (kWh) in 2006, or about 1.5 percent of total U.S. electricity consumption.
- The energy consumption of servers and data centers has doubled in the past five years and is expected to almost double again in the next five years to more than 100 billion kWh, costing about $7.4 billion annually.
- Federal servers and data centers by themselves account for about 6 billion kWh (10 percent) of that electricity use, which costs about $450 million per year.
- Existing technologies can reduce server energy use by an estimated 25 percent, and newer technologies could cut it further.
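For a rough sense of scale, the report's headline numbers can be sanity-checked with a few lines of arithmetic. This is only a back-of-the-envelope sketch: the per-kWh rate is implied by dividing the projected cost by the projected consumption, not stated in the report itself.

```python
# Back-of-the-envelope check of the EPA figures cited above.
total_2006_kwh = 60e9      # 2006 data center consumption (~1.5% of U.S. total)
projected_kwh = 100e9      # projected consumption after another doubling
projected_cost = 7.4e9     # projected annual cost in dollars

# Implied electricity rate (an assumption derived from the two figures above)
rate = projected_cost / projected_kwh  # ~$0.074 per kWh

# Doubling over five years implies this compound annual growth rate
annual_growth = 2 ** (1 / 5) - 1  # ~14.9% per year

# Annual savings from a 25% cut at the projected consumption level,
# priced at the implied rate
savings = 0.25 * projected_kwh * rate

print(f"Implied rate: ${rate:.3f}/kWh")
print(f"Implied annual growth: {annual_growth:.1%}")
print(f"25% cut at projected consumption: ${savings / 1e9:.2f}B/yr")
```

The 25 percent server-only cut prices out below the report's $4 billion headline figure, which also counts savings from more efficient cooling and power infrastructure.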