
Summary:

Amazon says its new Simple Workflow Service (SWF) will run applications that are distributed between customer sites and Amazon's cloud infrastructure, thus further blurring the line between the customer's own data center and their chosen cloud.

Amazon CTO Werner Vogels

Amazon Web Services says its new Simple Workflow Service (SWF) will run applications that are distributed between customer sites and Amazon’s cloud infrastructure, further blurring the line between the customer’s data center and their chosen cloud.

Amazon, the leader in public cloud infrastructure, appears intent on erasing the line between applications customers run in-house and those “out in the cloud.” The ability to run some workloads (or put some data) in the AWS cloud is important for many businesses that want to save money. But many also balk at putting their mission-critical work outside their own firewalls into a public cloud. If Amazon makes it easier for them to run distributed applications securely across internal and outside infrastructure, AWS stands to gain.

Orchestrating workflow is an important concern in enterprise applications. As Amazon CTO Werner Vogels said in his blog, many applications now rely on asynchronous and distributed processing to achieve scale.

By designing autonomous distributed components, developers get the flexibility to deploy and scale out parts of the application independently as load increases. The asynchronous and distributed model has the benefits of loose coupling and selective scalability, but it also creates new challenges. Application developers must coordinate multiple distributed components to get the desired results.

Vogels went on to say that SWF, which appears to be a much more complex tool than the company's existing Simple Queue Service, relies on a "decider" process to orchestrate tasks, the logical units of work, which can be executable code, scripts, web service calls or human actions. Developers, he wrote, will have full control over implementing and orchestrating these tasks but won't have to worry about underlying complexities such as tracking their progress and keeping their state.
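
To make the decider model a little more concrete, here is a minimal sketch of a decider loop written against the boto3 AWS SDK for Python. It is an illustration under assumptions rather than Amazon's reference implementation: the domain, the task lists and the "ProcessOrder" activity are hypothetical placeholders.

```python
# Minimal decider sketch using boto3; the domain, task-list and activity
# names are hypothetical placeholders, not anything Amazon ships.
import boto3

swf = boto3.client("swf", region_name="us-east-1")

while True:
    # Long-poll SWF for a decision task; the call returns without a
    # taskToken if the poll times out with no work.
    task = swf.poll_for_decision_task(
        domain="my-domain",
        taskList={"name": "order-decisions"},
        identity="example-decider",
    )
    token = task.get("taskToken")
    if not token:
        continue

    # Inspect the workflow history SWF hands back and decide the next step.
    event_types = [e["eventType"] for e in task["events"]]

    if "ActivityTaskCompleted" in event_types:
        # Our single activity has finished, so close the workflow.
        decisions = [{
            "decisionType": "CompleteWorkflowExecution",
            "completeWorkflowExecutionDecisionAttributes": {"result": "done"},
        }]
    else:
        # Nothing scheduled yet: hand the hypothetical ProcessOrder activity
        # to whichever worker polls the activity task list.
        decisions = [{
            "decisionType": "ScheduleActivityTask",
            "scheduleActivityTaskDecisionAttributes": {
                "activityId": "process-order-1",
                "activityType": {"name": "ProcessOrder", "version": "1.0"},
                "taskList": {"name": "order-activities"},
            },
        }]

    swf.respond_decision_task_completed(taskToken=token, decisions=decisions)
```

The state handling Vogels describes is visible here: the decider stores nothing itself; it simply reads the execution history SWF supplies and returns decisions.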

In his post on the Amazon Web Services blog, evangelist Jeff Barr said:

This new service gives you the ability to build and run distributed, fault-tolerant applications that span multiple systems (cloud-based, on-premise or both). Amazon Simple Workflow coordinates the flow of synchronous or asynchronous tasks (logical application steps) so that you can focus on your business and your application instead of having to worry about the infrastructure.
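
Barr's point about spanning cloud and on-premise systems follows from how workers talk to SWF: an activity worker only makes outbound HTTPS calls to poll for work, so the same code can run on EC2 or behind a corporate firewall without opening inbound ports. A rough sketch of such a worker, again in boto3 and reusing the hypothetical names from the decider above:

```python
# Sketch of an activity worker for the hypothetical ProcessOrder task.
# Because it only polls SWF over outbound HTTPS, it can run on EC2 or
# inside a company's own data center unchanged.
import boto3

swf = boto3.client("swf", region_name="us-east-1")

while True:
    task = swf.poll_for_activity_task(
        domain="my-domain",
        taskList={"name": "order-activities"},
        identity="on-prem-worker-1",
    )
    token = task.get("taskToken")
    if not token:
        continue  # long poll returned with no work

    payload = task.get("input", "")      # input passed along by the decider
    result = "processed:" + payload      # stand-in for real business logic

    # Report success back to SWF; respond_activity_task_failed() would be
    # used to report an error instead.
    swf.respond_activity_task_completed(taskToken=token, result=result)
```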

SWF will feed into Amazon’s Management Console and will give developers a view into each step of application execution, according to Vogels. SWF will retain the history of executions for a period determined by the developer, up to 90 days.
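
That retention period is configured when the SWF domain is registered. The following one-time setup sketch (boto3 again, with hypothetical names and timeout values) registers a domain with the 90-day maximum, registers the workflow and activity types used above, and starts an execution:

```python
# One-time setup sketch: hypothetical domain, types, IDs and timeouts.
# register_domain() raises DomainAlreadyExistsFault if run a second time.
import boto3

swf = boto3.client("swf", region_name="us-east-1")

swf.register_domain(
    name="my-domain",
    description="example domain",
    workflowExecutionRetentionPeriodInDays="90",   # history kept up to 90 days
)

swf.register_workflow_type(
    domain="my-domain",
    name="OrderWorkflow",
    version="1.0",
    defaultTaskList={"name": "order-decisions"},
    defaultTaskStartToCloseTimeout="30",           # timeouts are strings, in seconds
    defaultExecutionStartToCloseTimeout="3600",
    defaultChildPolicy="TERMINATE",
)

swf.register_activity_type(
    domain="my-domain",
    name="ProcessOrder",
    version="1.0",
    defaultTaskList={"name": "order-activities"},
    defaultTaskScheduleToStartTimeout="300",
    defaultTaskStartToCloseTimeout="300",
    defaultTaskScheduleToCloseTimeout="600",
    defaultTaskHeartbeatTimeout="NONE",
)

# Kick off a run; the decider and worker loops above will pick it up.
swf.start_workflow_execution(
    domain="my-domain",
    workflowId="order-42",
    workflowType={"name": "OrderWorkflow", "version": "1.0"},
    input="order-42-payload",
)
```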

This is just the latest Amazon service to blur what was once a fairly clear line between a customer's own data center and Amazon's public cloud. In late January, Amazon introduced its Storage Gateway that, as my colleague Derrick Harris reported at the time, lets companies upload data to Amazon's cloud-storage services directly from their on-premise storage systems. That happened just a week after it unveiled its DynamoDB NoSQL database.


The news is not a complete surprise. TechCrunch wrote about the proposed service earlier this month.

All of this sounds great, but it will likely take some time for enterprise developers to truly get on board with the new architectures AWS is presenting to them. One thing is clear: Amazon's attempt to be an enterprise application platform is ambitious indeed.

  1. Great feature, bearing in mind that eventually all your infrastructure moves should be linked to the actual business/application transactions.

  2. Cool, something new to play with this weekend :)

  3. A tiny correction: DynamoDB, not DynamicsDB.

