How to keep the cloud from killing the planet

Jason Hoffman (Joyent), Anant Agarwal (Tilera Corporation), Jason Waxman (Intel) - Structure 2011

The cloud may have driven down the cost of bandwidth and computing instances dramatically, but engineers shouldn’t take this as carte blanche to be wasteful with their resources, said Intel’s GM of high-density computing, Jason Waxman, at GigaOM’s Structure conference in San Francisco on Thursday.

“There’s always a cost,” Waxman said, even if it may feel like small change to the individual corporate consumer. “If you look at the number of servers that are going to be deployed in the next five years for cloud types of architecture, you are looking at power that drives demand for 45 coal-powered plants around the world,” he forecasted.

So how do you keep the cloud from killing the planet? Being more efficient is one thing, but having better analytics will go a long way as well. Waxman predicted that within the next two years Intel will be able to tell you exactly how much power and resources a search query will consume.

Aside from power consumption, there are other areas Intel wants to tackle to benefit the cloud. One of the biggest issues has been security, Waxman said. “It’s a big bogeyman,” he opined, “and it’s holding everyone back.” Making the cloud more secure also means having transparency and openness across various offerings, which is why Intel has been making a strong push for open standards.

Intel helped launch the Open Data Center Alliance, which brings together 280 cloud customers, including companies like AT&T, BMW and Lockheed Martin, with the goal of achieving consistency and transparency across cloud offerings. “People are telling us they don’t want lock-in,” Waxman said.

