Since Hadoop hit the scene almost a decade ago, IT shops have been quietly funneling enormous amounts of data into it, for compelling reasons: it’s vastly cheaper than traditional data warehouse technology, it can process any kind of structured or unstructured data, and it scales out on commodity hardware. But eventually it was going to catch the eye of corporate auditors, and that day has come. Any public enterprise subject to Sarbanes-Oxley (SOX), HIPAA, PCI or Basel II faces strict regulations about data availability and accessibility. Hardening Hadoop across global data centers and ensuring continuous availability, even during maintenance windows, are now key enterprise requirements.
In this webinar, our panel will address these topics:
- What’s driving Hadoop adoption in the enterprise? (Use cases and examples)
- How are companies hardening Hadoop to meet IT audits?
- Trends in Hadoop infrastructure over the next two to three years
Our panelists:
- Ron Bodkin, founder and CEO, Think Big Analytics
- John Webster, senior partner, Evaluator Group
- Richard Winter, president, WinterCorp
- Jagane Sundar, CTO and VP of Engineering, Big Data, WANdisco
Register here to join Gigaom Research and our sponsor WANdisco for “Getting Hadoop through an IT audit,” a free analyst webinar on Tuesday, Jan. 28, 2014, at 10:00 a.m. PT.
Read more about this topic in the latest Gigaom Research report “How Hadoop passes an IT audit,” underwritten by WANdisco.