Business and IT are no longer separate silos. Your application is your business, and software development is the glue that holds it together. But how can you develop your business and the applications that support it at the same time? How far can you push one before the other breaks?

The monolithic application is dead, and it's time we started acting like it. Modern applications provide and consume different services with different needs, and each of those services passes through many hands on the journey from conception to production. As a result, there is no single "right" platform for application development and production. For some applications, the quick provisioning of the public cloud may provide an ideal prototyping or dev/test environment, while actual production happens on in-house servers. For others, competitive or privacy concerns may dictate exactly the opposite strategy.

To extract maximum value from their infrastructure, developers must move beyond binary thinking and evaluate platform choice based on the various phases of application development and the many roles that touch an application during those phases. An increasingly large swath of businesses is realizing that the cloud-plus-data-center model can provide the best of both worlds: integrating the public virtual cloud with the physical data center is the best way to cost-effectively scale, secure, and serve modern production workloads.
Key findings include:
- Public cloud infrastructure is excellent for many use cases, but it exposes only roughly the most common 70 percent of the functionality that a business may need.
- The cost, control, and security benefits of a data center can be significant differentiators for business applications.
- Cloud-plus-data-center is the best option for most businesses, balancing cost, functionality, and agility.