Deep in the bowels of Google’s offices in Mountain View, Calif., and Seattle, a group of researchers has been consuming an incredible amount of computing resources trying to make scientific discoveries that they hope will help change the world.
On Monday, Google announced on its Research Blog the six projects to which it granted 100 million core-hours of computing apiece as part of last year’s Google Exacycle for Visiting Faculty program. The projects, most of which are led by university researchers (and one by a Google researcher), tackle a variety of pharmaceutical and biological challenges, as well as the problem of analyzing petabytes of data generated by a forthcoming astronomical survey. The latter hopes to “[reduce] the time required to simulate one night of [Large Synoptic Survey Telescope] observing, roughly 5 million images, from 3 months down to a few days.”
Although these projects require a lot of horsepower to tackle enormous datasets and tough compute problems, Google’s Exacycle program is hardly the first attempt to use the cloud for good. In 2010, for example, Microsoft launched a version of the NCBI BLAST data-analysis tool on its Windows Azure cloud and gave researchers free access to it. Amazon Web Services also hosts its own research grant program that awards access to its cloud computing infrastructure. Both companies also host large genomic and other datasets that are available to anyone using their services.
Free or not, though, cloud computing has proven a boon for researchers who are hungry for computing power but not keen on waiting in line for supercomputer time or shipping massive datasets across the network. Google pitched its Compute Engine cloud as a scientist’s best friend when launching the service in June, and AWS recently highlighted how one customer built an 8,000-core cluster that churned through 115 years’ worth of computation in just one week.
Feature image courtesy of Shutterstock user Sashkin.