Since the first federal census in 1790, the U.S. government has been accumulating statistics and crunching numbers on everything from family size to crime to how many bushels a farmer harvests. Until recently, government agencies had limited capabilities for analyzing the vast quantities of data they collected and published. Now, however, with Hadoop and its surrounding tools, the federal government has the potential to improve decision making and save millions through operational efficiencies in areas as diverse as security, bioinformatics, predictive policing, and health care.
Big data solutions are also making dramatic improvements in the government’s ability to serve citizens with information through programs like the Apache Hadoop–based USASearch.gov, now driving search results on more than 1,000 government websites. In this webinar, we will explore some of the issues surrounding the U.S. government and big data, and we will discuss its potential for achieving efficiencies, savings, and big breakthroughs.
In this analyst roundtable, our panel of experts will address these topics and more:
- What are the top use cases for government big data solutions?
- What do successful government big data programs have in common?
- What new or unsolved requirements remain?
Panelists:
- Matthew Spady, research director, GigaOM Pro
- Bob Gourley, editor, CTOvision.com
- Dan Olds, analyst, GigaOM Pro
- Dr. Amr Awadallah, chief technology officer, Cloudera
Register here to join GigaOM Pro and our sponsor Cloudera for “Parsing the union.” This free analyst roundtable webinar takes place on Wednesday, Oct. 10, 2012, at 10 a.m. PDT.
Photo courtesy Flickr user [email protected]