4 Comments

Summary:

Rackable announced today an update to its CloudRack servers. The CloudRack C2 servers can run at 104 degrees inside the data center, and they offload power supply to the rack to reduce energy wasted in converting AC electricity from the wall to DC electricity used by […]

Rackable announced today an update to its CloudRack servers. The CloudRack C2 servers can run at 104 degrees inside the data center, and they move the power supply into the rack, cutting the energy wasted in converting AC electricity from the wall to the DC electricity used by the box to 1 percent. Since these beasts can pack 1,280 cores, or 320 processors, into a rack, they’re not exactly in the power-saving category, but the design ensures that the electricity goes to powering the processors rather than being lost as heat or waste.
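Those two claims, 1,280 cores per rack and a 1 percent conversion loss, are easy to put in perspective with a quick back-of-the-envelope calculation. In the Python sketch below, only the core and processor counts and the 1 percent rack-level loss come from the article; the per-processor wattage and the roughly 15 percent loss assumed for a conventional in-server power supply are illustrative guesses, not Rackable figures.

```python
# Back-of-the-envelope check on the article's density and conversion
# figures. Only the core/processor counts and the 1 percent rack-level
# loss come from the article; the rest are illustrative assumptions.

PROCESSORS_PER_RACK = 320      # from the article
CORES_PER_RACK = 1280          # from the article
WATTS_PER_PROCESSOR = 80       # assumed per-CPU draw (hypothetical)
TYPICAL_SUPPLY_LOSS = 0.15     # assumed loss for a commodity in-server PSU
RACK_SUPPLY_LOSS = 0.01        # 1 percent loss claimed for the rack supply

cores_per_processor = CORES_PER_RACK // PROCESSORS_PER_RACK  # -> 4
dc_load_watts = PROCESSORS_PER_RACK * WATTS_PER_PROCESSOR    # -> 25,600 W

# AC input power needed to deliver that DC load at each efficiency.
typical_input = dc_load_watts / (1 - TYPICAL_SUPPLY_LOSS)
rack_input = dc_load_watts / (1 - RACK_SUPPLY_LOSS)

print(f"{cores_per_processor} cores per processor")
print(f"Waste with per-server supplies: {typical_input - dc_load_watts:,.0f} W")
print(f"Waste with rack-level supply:   {rack_input - dc_load_watts:,.0f} W")
```

Under those assumptions, a rack-level supply wastes a few hundred watts where per-server supplies would waste several kilowatts, which is the whole point of moving the conversion out of the box.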

The updated servers feature a fan mounted behind the rack, rather than attached to each server, which cuts the power consumed for cooling to 8 percent of total energy, down from 25 percent. Rackable also announced that customers will eventually be able to build out servers in the CloudRack trays using Intel’s lower-power Atom chips, which they can use for jobs that don’t need the full horsepower of the upcoming Nehalem-based Xeon chips. Customizing processors is one more way that data center operators are trying to boost efficiency.
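The cooling claim can be sanity-checked the same way. In this short sketch, the total rack draw is an assumed round number; only the 8 and 25 percent cooling shares are taken from the article.

```python
# Compare cooling overhead at 25 percent vs. 8 percent of total energy.
# The total rack draw is an assumed round number; only the two
# percentages come from the article.

TOTAL_RACK_WATTS = 30_000  # assumed total draw (hypothetical)

for share in (0.25, 0.08):
    cooling = TOTAL_RACK_WATTS * share
    compute = TOTAL_RACK_WATTS - cooling
    print(f"cooling at {share:.0%}: {cooling:,.0f} W on cooling, "
          f"{compute:,.0f} W left for the servers")
```

At the assumed 30 kW draw, that shift frees roughly 5 kW per rack to do actual work instead of moving air.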

The rising competition to design power-efficient, heat-tolerant servers is driven by the need to lower electricity and cooling costs in the data center while still packing as much computing as possible into each box to run web-scale applications. Essentially, we need more computing but have less electricity to squander. Rackable can sell into corporate data centers, but its target market is the web world’s giants running thousands of servers.

It’s a market that’s growing increasingly competitive, with Cisco planning a new line of servers dubbed the Unified Computing System, Dell creating a separate business unit just to deal with web-scale customers, and HP fielding a web-scale service design team as well. Intel estimates that 25 percent of its chips will go into web-scale boxes by the year 2012. I’m sure Rackable’s hoping that many of them will go into its servers.

  1. What is the current data center server thermal environment (versus the 104°F/40°C in the post)?

    For telco central offices, the equipment has to comply with a 45°C inlet (ambient air) temperature, and it must tolerate a 55°C ambient temperature for a 72-hour window.

    1. Stacey Higginbotham Thursday, March 19, 2009

      Most data centers run at the temperature of a cool room, about 68 to 74 degrees. That’s changing as folks play around with running them hotter.

  2. [...] Rackable’s New Servers Like It Hot It’s amazing where server technology is going. Check out this one, it can run at 104 degrees. Boom! I hope I never have to set up another server. Just have some guy maintain these and I’ll access everything in the cloud. [...]

  3. [...] mega data center party. SGI (formerly Rackable) has built special-purpose machines for years, and keeps introducing new options for the mega data center. IBM even launched its own, highly proprietary iDataPlex hardware for the [...]


Comments have been disabled for this post