9 Companies that Pushed the Infrastructure Discussion in 2010


We’ve already covered the trends that began to shape up in 2010 and will really materialize in 2011, and several companies played, and will continue to play, a big role in making those trends happen. Some of them are the usual suspects — companies that are part and parcel of cloud computing and/or enterprise IT — but others came seemingly from nowhere to make a big splash. Not all their contributions were positive, but, even so, they served as catalysts for serious discussions about what should and shouldn’t happen going forward.

Amazon Web Services. As usual, Amazon Web Services carried the cloud computing torch during 2010, primarily by redefining cloud capabilities and pricing. Apart from upgrading many of its existing services with advanced features, AWS introduced two entirely new offerings at opposite ends of the computing spectrum — Micro Instances and Cluster Compute Instances. Micro Instances made using Amazon EC2 even less expensive for customers running low-throughput applications by giving them only 613MB of RAM and burstable CPU capacity per instance. On the other end of the spectrum, Cluster Compute Instances gave users direct access to Intel Nehalem processors and ran atop a high-throughput 10GbE network infrastructure. Already popular among HPC users wanting on-demand resources, Cluster Compute Instances earned AWS the No. 231 spot on the latest Top500 supercomputer list.

We also got some insights into the business of cloud computing thanks to AWS use cases and revenue projections. Netflix’s extensive use of AWS for core aspects of its service proved the cloud can be used for mission-critical workloads, while Eli Lilly’s public spat with AWS over contractual liability exposed an oft-overlooked dark side of cloud computing — namely, that customers have limited avenues of redress in the case of data breaches or system outages. On the economic front, investment bank UBS published revenue estimates and projections for AWS. If UBS is correct in its estimate of approximately $500 million in revenue for AWS in 2010, growing to approximately $2.5 billion by 2014, selling infrastructure as a service might be a more-profitable business than conventional wisdom had suggested.
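As a rough sanity check on those figures, the jump from an estimated $500 million in 2010 to a projected $2.5 billion in 2014 implies a compound annual growth rate of nearly 50 percent. A back-of-the-envelope sketch, using only the numbers above:

```python
# Implied compound annual growth rate (CAGR) from the UBS figures:
# ~$500M in 2010 growing to ~$2.5B by 2014, i.e. four years of growth.
rev_2010 = 500e6   # UBS estimate for 2010
rev_2014 = 2.5e9   # UBS projection for 2014
years = 2014 - 2010

# CAGR = (ending / starting) ** (1 / years) - 1
cagr = (rev_2014 / rev_2010) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 49.5% per year
```

A five-fold revenue increase over four years is the kind of growth curve that explains why analysts began treating infrastructure as a service as a serious standalone business.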

ARM Holdings. We’ve talked about alternatives to x86-based servers for a couple of years, and, in 2010, ARM finally confirmed it will be part of this market. It did so by introducing its Cortex-A15 server processor architecture and contributing to the $48 million funding round Calxeda announced in August. Chips based on the Cortex-A15 architecture likely won’t ship until 2012, but will sport up to 16 cores and run at 2.5GHz when they do. ARM also announced virtualization extensions for the Cortex-A15, which will let ARM-based server shops further their high-density, low-power ambitions. Calxeda (née Smooth-Stone) plans to sell ARM-based servers, possibly as early as 2011. Considering the current mandates to cut data center energy costs while still ensuring enough capacity to meet ever-increasing computing demand, the native energy efficiency of ARM processors could be a big draw — assuming software vendors will support them.

But ARM’s existing designs are already bearing server fruit, even though they weren’t necessarily designed to do so. In May, Marvell announced plans to build quad-core systems-on-a-chip based on the ARM architecture. Already, server maker ZT Systems is selling servers based on ARM’s Cortex-A9 processor — technically not a server processor but, apparently, capable of being worked into one with the right tweaks.

CA Technologies. CA Technologies carried its 2009 cloud buying spree well into 2010, creating a systems-management force to be reckoned with in the process. What began last year with Cassatt and NetQoS culminated with CA buying, in succession, Oblicore, 3Tera, Nimsoft, 4Base Technology, Arcot Systems and Hyperformix. CEO Bill McCracken predicted the company would spend at least $300 million on cloud computing acquisitions, and he didn’t disappoint.

What to make of all this buying? CA realizes it’s a new IT world, far removed from the mainframe (not that it has given up on that business), and it desperately wants to be a part of it. In May, it announced the Cloud-Connected Management Suite, which combines the Oblicore, Cassatt and Nimsoft capabilities into a trio of unique products, and which also takes a progressive approach to dealing with external cloud services. By purchasing a consulting firm, an identity-management firm and a virtualization-management vendor over the summer, CA proved that it understands the whole cloud picture and where it needed to improve its offerings.

Cloudera. What else is there to say about Cloudera other than that it made Hadoop a household (or at least an IT-department-wide) term in 2010? Many organizations interested in analyzing their growing volumes of unstructured data were already interested in Hadoop, but Cloudera’s enterprise-class distribution and support — available since 2009 — made deploying a Hadoop cluster a less-scary proposition. Cloudera lowered the barriers even further this year by releasing the first-ever proprietary (and not free) Hadoop distribution. Furthermore, it has been proactive in spreading the word of customer successes, showing potential customers across industries the wide range of possibilities for processing data with Hadoop.

Cloudera also did a lot to reduce customer anxiety about adding new infrastructure to existing data-management strategies. Thanks to a relentless campaign of technology partnerships and integrations with leading data warehousing, database and business-intelligence vendors, Cloudera began to position Hadoop in its rightful light — as a complement to, not a replacement for, existing database and BI software. The company wrapped up 2010 with its second-annual Hadoop World event, which featured many large enterprise customers and developers — including General Electric, eBay, Orbitz and the Chicago Mercantile Exchange — and with a $25 million Series C funding round.

Dell. Dell’s consumer business might be in the toilet, but Dell proved this year that its enterprise business is alive and kicking, and will continue to evolve. Its flagship server business outpaced the rest of the industry in terms of growth, led by the thriving Data Center Solutions group that sells optimized boxes for webscale deployments. But servers were just the beginning.

Dell’s biggest advancement came on the software side of things, where the company seems dedicated to giving customers whatever cloud experience they desire. Want internal cloud management for enterprise applications? Dell bought Scalent and launched its Virtual Integrated System software. Want a highly scalable environment for web apps? Dell has an OEM deal with Joyent. Want to throw some cloud services into the mix? Dell bought Boomi. Want Windows Azure in-house (or possibly as a Dell-hosted service)? Dell’s on that, too, with its Windows Azure Appliance partnership. It’s even getting in on the Big Data movement via an OEM deal with Aster Data Systems, and via storage acquisitions Compellent (after, perhaps luckily, losing out on 3PAR), Exanet and Ocarina Networks.

Facebook. In 2010, Facebook’s technological innovations weren’t as big news as its new coal-powered data center. The social-networking leader barely had time to congratulate itself on being big enough to require its own data center, announced in January, before becoming the focal point of a discussion about whether cutting-edge web data centers should rely on dirty energy sources. Aside from touting its specialized methods for reducing energy use, Facebook didn’t seem to care much about the source of that energy: It doubled the size of its first data center during the build phase and announced a second coal- and nuclear-powered data center in November. It’s fair to question Facebook’s energy decisions, especially given the green slant of many of its Generation Y and Z users, but critics might be holding it to too high a standard. After all, clean energy is still plenty expensive, and most companies selling Green IT hardware and/or software, or designing energy-efficient data centers — an area where Facebook is a leader — receive nothing but praise.

Of course, Facebook did do quite a bit on the software side, too. In February, it open sourced its HipHop tool for boosting the performance of PHP applications. In November, it shared the details of the infrastructural transformation necessary to enable its new Messages service. Among the highlights was the use of the Hadoop-spawned HBase as the primary database. Speaking of Hadoop, Facebook also open sourced its latest tools for managing its multiple — and very large — production Hadoop clusters. It wasn’t all kudos for Facebook on the infrastructure front, though, as the company suffered two noteworthy outages — one in April and one in September — after touting its reliability at the F8 developers conference.

Microsoft. Steve Ballmer’s proclamation in March that Microsoft is “all in” on cloud computing received a lot of attention, much of it skeptical at best. And yet, despite arguments that Microsoft doesn’t get the cloud, Ballmer’s words are ringing true. Probably nowhere is this more apparent than with Windows Azure, which became publicly available in February and has been advancing rapidly since then. As a PaaS offering, Windows Azure is in a league of its own for a variety of reasons — including Microsoft’s mission to combine applications with the platform — but now it’s so much more. The Windows Azure Appliance gives large-scale customers and partners the opportunity to deploy the platform within their own data centers. The new Virtual Machine Role brings an IaaS-like capability that makes Windows Azure a more-direct competitor to Amazon EC2.

Microsoft also had a big year in terms of proving itself against Google in the world of online collaboration and productivity applications. For example, whereas Google scored a high-profile deal with the City of Los Angeles, Microsoft won a deal with the State of California. It also won big deals with the State of Minnesota, New York City and multiple large enterprise customers. Skeptics wondered whether Microsoft would be willing to give up the large profit margins of traditional software licenses, or whether it could produce cloud-capable offerings at all, but these wins suggest it can do both.

Oracle. With European Union approval in January, Oracle was finally able to begin integrating Sun Microsystems into Larry Ellison’s software empire. Some results were expected and others weren’t, but, either way, they have already had major effects on the IT landscape. As expected, Oracle has taken a different approach to open source than Sun did, resulting in tiffs with the Java developer community, a high-profile lawsuit against Google over its use of Java code in the Android operating system, and the elimination of free licenses for the MySQL database. Also as expected, much of Sun’s Java, MySQL and general open-source talent has left the company since the acquisition closed.

Somewhat unexpectedly, Oracle not only maintained but also embraced Sun’s hardware business. When the deal was announced, after all, critics wondered whether Oracle would want to take on the burden of adding a hardware business to its highly profitable and smooth-running software business. In what one might call true Oracle fashion, however, it had a plan — to eliminate low-margin commodity sales and sell hardware primarily in system form. From the Exadata analytics appliance to the Exalogic private cloud system, Oracle has used Sun’s server and processor technologies to take fully integrated converged infrastructure to the next level.

Interestingly, Oracle also placed itself in the middle of HP’s executive woes, thanks to Larry Ellison publicly lampooning HP for firing CEO Mark Hurd, then quickly hiring Hurd as a co-president to replace the outgoing Charles Phillips. That move instigated a lawsuit by HP against Hurd, which was quickly resolved.

VMware. VMware really stepped up its cloud game in 2010, transforming its strategy from one of “That would be cool” to one of “Damn, that is cool.” It did so by starting to deliver on the promise of its SpringSource acquisition, and by establishing that it intends to have a data center presence far beyond the virtualization layer.

The eye-catching pieces of news on the SpringSource side were the two PaaS partnerships: with Salesforce.com on VMforce, and with Google on App Engine for Business. By establishing the Spring Framework as the method of deploying Java applications on these platforms, VMware began what could be a long process of establishing Spring as the de facto Java-deployment method across the cloud. Less reported, however, were VMware’s acquisitions of RabbitMQ and GemStone Systems as components of the SpringSource division, as well as the upcoming vFabric portfolio. Essentially a conglomeration of SpringSource products right now, vFabric should become the foundation for VMware’s PaaS strategy — something that will be necessary as cloud computing forces abstraction between applications and infrastructure.

In terms of its extra-data-center moves, VMware also bought Zimbra, TriCipher and Integrien. Zimbra turns VMware into an application vendor, TriCipher addresses the tricky problems of identity management across various cloud services, and Integrien provides an IT-analytics product that correlates and analyzes all sorts of data from throughout the infrastructure. Along the same lines, at VMworld, VMware expanded its vCloud partner program to include new service providers and capabilities, and released vCloud Director to let customers manage all their VMware infrastructure — on-premises or in the cloud — from one place.
