
Written by Geva Perry, chief marketing officer at GigaSpaces Technologies.

We are witnessing a seismic shift in information technology — the kind that comes around every decade or so. It is so massive that it affects not only business models, but the underlying architecture of how we develop, deploy, run and deliver applications. This shift has given a new relevance to ideas such as cloud computing and utility computing. Not surprisingly, these two different ideas are often lumped together.

What is Utility Computing?

While utility computing often requires a cloud-like infrastructure, its focus is on the business model under which computing services are delivered. Simply put, a utility computing service is one in which customers receive computing resources (hardware and/or software) from a service provider and “pay by the drink,” much as you do for your electric service at home – an analogy that Nicholas Carr discusses extensively in “The Big Switch.”


Amazon Web Services (AWS), despite a recent outage, is the current poster child for this model as it provides a variety of services, among them the Elastic Compute Cloud (EC2), in which customers pay for compute resources by the hour, and Simple Storage Service (S3), for which customers pay based on storage capacity. Other utility services include Sun’s Network.com, EMC’s recently launched storage cloud service, and those offered by startups such as Joyent and Mosso.

The main benefit of utility computing is better economics. Corporate data centers are notoriously underutilized, with resources such as servers often idle 85 percent of the time. This is due to overprovisioning — buying more hardware than is needed on average in order to handle peaks (such as the opening of the Wall Street trading day or the holiday shopping season), to handle expected future loads and to prepare for unanticipated surges in demand. Utility computing allows companies to only pay for the computing resources they need, when they need them.
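
As a back-of-the-envelope illustration of why this matters, consider the comparison below. Every figure in it is invented for illustration; it is not an actual data-center or provider price.

    # Hypothetical cost comparison: owned, overprovisioned capacity vs. pay-per-use.
    # All figures are illustrative only.
    peak_servers = 100                    # capacity provisioned for peak load
    avg_utilization = 0.15                # servers busy ~15% of the time (85% idle)
    owned_cost_per_server_hour = 0.50     # amortized hardware + power + operations
    utility_cost_per_server_hour = 0.80   # pay-per-use rate, including the provider's premium
    hours_per_month = 730

    # Owned capacity is paid for around the clock, busy or idle.
    owned_monthly = peak_servers * hours_per_month * owned_cost_per_server_hour

    # In a utility model you pay only for the server-hours actually consumed.
    utility_monthly = peak_servers * avg_utilization * hours_per_month * utility_cost_per_server_hour

    print(f"owned:   ${owned_monthly:,.0f}/month")
    print(f"utility: ${utility_monthly:,.0f}/month")

With these invented numbers the pay-per-use bill comes to roughly a quarter of the always-on one, even at a higher hourly rate, simply because the idle 85 percent is no longer paid for.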

What is Cloud Computing?

Cloud computing is a broader concept than utility computing and relates to the underlying architecture on which the services are built. It may be applied equally to utility services and internal corporate data centers, as George Gilder reported in a story for Wired magazine titled “The Information Factories.” Wall Street firms have been implementing internal clouds for years. They call it “grid computing,” but the concepts are the same.

Although it is difficult to come up with a precise and comprehensive definition of cloud computing, at the heart of it is the idea that applications run somewhere in the “cloud” (whether an internal corporate network or the public Internet) – we don’t know or care where. For end users, though, that’s not big news: we’ve been using web applications for years without any concern as to where they actually run.

The big news is for application developers and IT operations. Done right, cloud computing allows them to develop, deploy and run applications that can easily grow capacity (scalability), work fast (performance), and never — or at least rarely — fail (reliability), all without any concern as to the nature and location of the underlying infrastructure.

Taken to the next step, this implies that cloud computing infrastructures, and specifically their middleware and application platforms, should ideally have these characteristics:

  • Self-healing: In case of failure, there will be a hot backup instance of the application ready to take over without disruption (known as failover). It also means that if I set a policy that says everything should always have a backup, then when a failure occurs and my backup becomes the primary, the system launches a new backup, maintaining my reliability policy (a minimal sketch of this behavior, together with SLA-driven scaling, appears after this list).
  • SLA-driven: The system is dynamically managed by service-level agreements that define policies such as how quickly responses to requests need to be delivered. If the system is experiencing peaks in load, it will create additional instances of the application on more servers in order to comply with the committed service levels — even at the expense of a low-priority application.
  • Multi-tenancy: The system is built in a way that allows several customers to share infrastructure, without the customers being aware of it and without compromising the privacy and security of each customer’s data.
  • Service-oriented: The system allows composing applications out of discrete services that are loosely coupled (independent of each other). Changes to or failure of one service will not disrupt other services. It also means I can re-use services.
  • Virtualized: Applications are decoupled from the underlying hardware. Multiple applications can run on one computer (virtualization a la VMware) or multiple computers can be used to run one application (grid computing).
  • Linearly Scalable: Perhaps the biggest challenge. The system will be predictable and efficient in growing the application. If one server can process 1,000 transactions per second, two servers should be able to process 2,000 transactions per second, and so forth.
  • Data, Data, Data: The key to many of these aspects is management of the data: its distribution, partitioning, security and synchronization. New technologies, such as Amazon’s SimpleDB, are part of the answer, not large-scale relational databases. And don’t let the name fool you. As my colleague Nati Shalom rightfully proclaims, SimpleDB is not really a database. Another approach that is gaining momentum is in-memory data grids.
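
To make the self-healing and SLA-driven characteristics concrete, here is a minimal, illustrative control loop of the kind such a platform might run. The class and function names, the health check and the thresholds are all invented for this sketch; it is not GigaSpaces’ or any other vendor’s actual API.

    # Illustrative control loop for an SLA-driven, self-healing platform.
    # All names and thresholds are hypothetical.
    import random

    class Instance:
        def __init__(self, role):
            self.role = role

        def is_alive(self):
            # Stand-in health check; a real platform would ping the process or its host.
            return random.random() > 0.05

    def launch_instance(role):
        # Stand-in for provisioning a fresh application instance on spare capacity.
        return Instance(role)

    class Cluster:
        def __init__(self, num_primaries, sla_latency_ms):
            self.primaries = [launch_instance("primary") for _ in range(num_primaries)]
            self.backups = [launch_instance("backup") for _ in range(num_primaries)]
            self.sla_latency_ms = sla_latency_ms

        def enforce_policies(self, observed_latency_ms):
            # Self-healing: promote a hot backup when a primary fails, then launch
            # a replacement backup so the "always have a backup" policy still holds.
            for instance in list(self.primaries):
                if not instance.is_alive():
                    self.primaries.remove(instance)
                    self.primaries.append(self.backups.pop())
                    self.backups.append(launch_instance("backup"))

            # SLA-driven scaling: add capacity when response times breach the SLA.
            if observed_latency_ms > self.sla_latency_ms:
                self.primaries.append(launch_instance("primary"))
                self.backups.append(launch_instance("backup"))

    cluster = Cluster(num_primaries=4, sla_latency_ms=200)
    cluster.enforce_policies(observed_latency_ms=350)  # an SLA breach triggers scale-out

The point of the sketch is only that both behaviors are policies the platform enforces automatically, with no operator in the loop and no assumption about which physical machines end up running the new instances.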

One thing is certain: The way the industry has traditionally built software applications just won’t cut it on the cloud. That’s why companies such as Google, Amazon and eBay have developed their own infrastructure software, opting not to rely on products from the large middleware vendors such as Oracle and BEA, who designed them with a very different approach in mind.

For this reason, we are seeing the emergence of a new generation of application platform vendors. These vendors, which include my own company, GigaSpaces, are building software platforms made for the cloud from the ground up: “cloudware,” if you will.

So although they are often lumped together, the differences between utility computing and cloud computing are crucial. Utility computing relates to the business model in which application infrastructure resources (hardware and/or software) are delivered, while cloud computing relates to the way we design, build, deploy and run applications that operate in a virtualized environment, sharing resources and able to dynamically grow, shrink and self-heal.


  1. In your post you say, “New technologies, such as Amazon’s SimpleDB, are part of the answer, not large-scale relational databases.” I would argue that large-scale relational databases as they are implemented today are not the answer. We have yet to fully dive into Codd’s true meaning of relational database, and concepts such as those expressed in “The Third Manifesto” have yet to see true daylight. Granted, implementing a clustered D relational database is not for the faint of heart, but writing off relational databases for cloud computing seems a little eager. I think we’ll see a renaissance of relational databases as a result of cloud computing. In the same breath I agree that a relational approach is not always necessary for a successful cloud application. Databases (or data stores, if you will) like CouchDB and GigaSpaces can sometimes fulfill the need better. I just don’t want to write off relational databases because of currently poor implementations.

  2. To put it in another way, cloud computing is software as a service (where companies run their own software) and utility computing is hardware as a service (where you can run your own software).

  3. Matt – thanks for your comment. I did not mean to suggest that relational databases are going away. RDBMSs are great for very complex querying of very large sets of data, durability of long-lived data, etc. However, I definitely do see relational databases significantly changing (diminishing) their role — and especially in clouds and other distributed environments. I’ll refer you again to my friend Nati’s post: Putting the Database Where it Belongs (http://tinyurl.com/ywhdgo). It refers to great links about this topic from none other than Michael Stonebraker, Pat Helland and others. Also, I suggest reading this from Vivek Ranadivé, TIBCO CEO/founder: http://tinyurl.com/ys2ovo. Here’s a taste: “There has been really very little innovation because in the last 20 years we’ve been locked into extortionist database architecture. The mother of all databases, the relational database, is at the center of everything, which is why prices of databases have stayed the same.”

  4. @Krish — Perhaps it’s a nuance, but again, I would say that SaaS is really a “business model” rather than a technology. Essentially, a software subscription model and delivery over the Web (or some other network).

    Software can also be delivered in a utility model – that’s exactly what Amazon is doing with things like SimpleDB, and my company, GigaSpaces, also provides the option of getting our software in a utility model on EC2.

    Cloud computing is more of an architecture in my mind.

    I wouldn’t nitpick normally, but I think part of the point of this exercise is to start agreeing on some common terminology in the industry.

  5. Geva,

    You’ve written an interesting attempt to clarify a subject that’s confusing many people at the moment and I think that much of what you’ve written about cloud computing is quite interesting. However, I’m afraid your definition of utility is overly simplistic. Utility computing, beyond the economic model, is a set of technologies that make it possible to package computing resources for consumption in a utility fashion. More to the topic though, utility computing is a critical technology for developing cloud computing to the level you describe.

    As an example, without utility computing, if you were asked to deploy an instance of your system across a few hundred servers in Europe tomorrow, you’d be hard pressed to accomplish the task. The long-accepted method of deploying distributed systems across servers, software, network, storage and security is simply too labor intensive. Utility computing replaces that labor with technology, so that whether you’re deploying to 1 server or 1,000, the process is exactly the same. Fortunately, now that there are providers in Europe building utility services today (many on 3Tera’s own AppLogic), the task of setting up a 100-server application is trivial.

    Looking beyond mere deployment, I believe there is an even more symbiotic link between cloud and utility computing. Consider for a moment one of your defining characteristics of cloud computing: self-healing. How, precisely, will your cloud middleware layer obtain the extra hardware resources? Will an operator be required to add a node? That would hardly be economical or expeditious. No, the system must be able to provision that node itself, which leads to the question of whether that intelligence must be built into your system. Will all clouds be required to do so? That hardly seems practical either. No, just as all applications that run atop Windows utilize device drivers provided for Windows to access hardware, cloud systems will utilize utility computing to access their hardware – even if that hardware resides across the globe.
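
    To illustrate the point with a purely hypothetical sketch (the function names below are invented and do not reflect AppLogic’s or any provider’s actual API), programmatic provisioning makes deploying to one server or to a hundred the same operation:

        # Hypothetical provisioning sketch; names are invented for illustration only.
        def provision_node(region, template):
            # In a utility service this would be an API call that allocates a server,
            # attaches storage and networking, and boots the application image.
            return {"region": region, "template": template, "state": "running"}

        def deploy(app_template, region, count):
            # Whether count is 1 or 1,000, the process is the same call repeated --
            # the manual labor of racking, cabling and configuring is replaced by code.
            return [provision_node(region, app_template) for _ in range(count)]

        european_cluster = deploy("my-distributed-app", region="eu-west", count=100)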

  6. Geva,

    Probably I didn’t put it right. When I said software, I meant software for the underlying architecture. In fact, I agree with your differentiation. I have been waiting to write a post like this because many people use both these terms loosely.

  7. @Bert Armijo — First, I think 3Tera is great and on the cutting edge. I don’t think I was being overly simplistic; I was consciously trying to simplify, or rather, to more clearly define terms that are used interchangeably. So I am distinguishing between the business model of renting resources by usage (as in the electric utility) and the underlying architecture/deployment/provisioning model, which I call cloud/grid. Sure, it would be almost unthinkable to run a utility service without an underlying cloud/grid architecture. But the opposite is not true: not all clouds/grids operate in a utility model (according to my definition). For example, many of our customers have internal clouds/grids.

    I think we are arguing semantics with your example of self-healing. I just don’t call that utility. I call that cloud/grid (done right).

  8. Quickthink » Blog Archive » Mixing Things Up Friday, February 29, 2008

    [...] time the villain was an alleged difference between cloud computing and utility computing. GigaOM warns that these present a “seismic shift in information technology”.  With yet another [...]

  9. Harry Quackenboss Friday, February 29, 2008

    Geva,

    Thanks for your thought-provoking article.

    I am struggling with your definition of utility computing. It’s a lot narrower than the definition first used in the 1960s for computer utilities and accessible computing (http://www.multicians.org/mgc.html#computerutility).

    Consumers don’t think of utility electricity as a business model, they think of it as a service. Electricity can be acquired on a metered basis, at a flat rate, or for free (meaning it is bundled with something else, such as your rent or your tax bill).

    But using your definition of utility computing to assert that cloud computing is a broader concept is like claiming that hydroelectric power plants are a broader concept than utility electricity.

  10. As a non-technical person, but one who needs to approve IT budgets and development plans, I found your article informative. It has provided me with a business-level checklist and a clear, high-level linkage between technical design choices and business implications/customer experience.

    Great job.
