Cloud computing providers Microsoft and Amazon Web Services are opening up their pools of virtual servers to climate researchers via two new grant programs announced on Tuesday.
Microsoft is offering a year’s worth of Azure cloud resources (180,000 computing hours and 20 terabytes of storage) to 20 recipients submitting proposals related to food resilience. According to a Microsoft Research blog post announcing the grants, “The overarching goal is to encourage data providers, scientists, farmers, food producers and the public to discover the food supply’s key vulnerabilities and inherent resiliency.” Proposals are due Sept. 15, 2014.
The food resilience grants will focus on a handful of USDA datasets, and are part of a broader mission Microsoft says it’s pursuing to address the issue. These efforts include “workshops, webinars, and ‘appathons’ to demonstrate the value of open access data and to promote the development of tools for understanding these datasets,” as well as the addition of more datasets over the next year.
The food resilience effort itself is part of an even broader climate change campaign by Microsoft that includes the hosting of — and free access to — a large collection of government datasets as part of President Obama’s Climate Data Initiative, and a more general-purpose research grant program.
The AWS grant program is broader in scope and is being run in conjunction with the Climate Data Initiative. AWS hasn’t specified a number of winners, but has promised it will award a total of 50 million hours of access to supercomputing-class Amazon EC2 Spot Instances. Spot Instances can be tricky for paying customers because they come and go (or become subject to huge price jumps) as demand ebbs and flows, but they’re ideal for long-running workloads that can tolerate interruptions and don’t require constant state.
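The ebb and flow described above can be sketched with a toy simulation (illustrative only, not an AWS API): under the classic Spot pricing model, an instance runs while the fluctuating market price stays at or below the customer’s bid, and is interrupted whenever a demand spike pushes the price above it. The function and price figures here are hypothetical.

```python
def simulate_spot_hours(bid, hourly_spot_prices):
    """Count billable hours for a hypothetical Spot workload: in the
    classic bid-based model, the instance runs only during hours when
    the market price is at or below the customer's bid."""
    return sum(1 for price in hourly_spot_prices if price <= bid)

# A demand spike in hours 4-5 pushes the price above a $0.05/hour bid,
# interrupting the workload until the price falls back.
prices = [0.03, 0.04, 0.04, 0.09, 0.12, 0.04, 0.03]
print(simulate_spot_hours(0.05, prices))  # → 5
```

This interruption behavior is exactly why Spot capacity suits batch-style research computing: a climate model that checkpoints its progress can simply resume when capacity returns.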
AWS’s submission process closes on Aug. 29, and winners will have the opportunity to preview initial findings at the AWS re:Invent conference in November. According to the grant website, entries must include:
- Description of the ability of the proposal to drive increased understanding of the scope and effects of climate change
- Analysis that could suggest potential mitigating actions for the climate
- Potential to increase the understanding of how to become more resilient to the effects of climate change
Providing cloud resources, data and even data-analysis tools to tackle tough problems is nothing new. Microsoft and AWS have both been doing it for years across a number of areas ranging from social media and emotions to genomics. Google has also been involved, launching, for example, its Earth Engine tool in 2010 and, in May of this year, providing access to the GDELT database and the ability to analyze it using the BigQuery service.
The whole notion of “big data” is really a product of increases in readily available data combined with the cheaper, more abundant computing needed to process all of it. The ability to do all of this in the cloud — which means researchers don’t have to worry about downloading or storing massive datasets, or owning powerful computing systems — is still just a few years old. Hopefully, all the data, all the hours of computing on it and all the new researchers able to access it will ultimately result in a better world.