Amazon.com’s S3 storage web service is proving to be quite a hit amongst very early-stage start-ups.

Despite my early skepticism, the growing number of early-stage start-ups signing up for Amazon S3 indicates that something big is afoot. One reason for S3’s growing popularity is that the service is optimized for developers and offers REST/SOAP access to its system at pretty affordable prices. Amazon currently charges $0.15 per gigabyte of storage per month and $0.20 per gigabyte of data transferred.
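To give a sense of what that developer-friendly REST access looks like, here is a rough sketch of an object upload against the S3 REST interface. The bucket name, key, filename and credentials are placeholders, and the request-signing details are paraphrased from Amazon’s published API documentation, so treat it as illustrative rather than production code.

```python
# Minimal sketch of an S3 REST PUT using only the Python standard library.
# Placeholders: "my-bucket", "photos/cat.jpg", "cat.jpg", and the credentials.
import base64, hashlib, hmac, http.client
from email.utils import formatdate

ACCESS_KEY = "AKIA..."            # placeholder access key ID
SECRET_KEY = b"..."               # placeholder secret key
bucket, key = "my-bucket", "photos/cat.jpg"
body = open("cat.jpg", "rb").read()
date = formatdate(usegmt=True)
content_type = "image/jpeg"

# S3 signs each request with Base64(HMAC-SHA1(secret, string-to-sign)).
string_to_sign = "\n".join(["PUT", "", content_type, date, f"/{bucket}/{key}"])
signature = base64.b64encode(
    hmac.new(SECRET_KEY, string_to_sign.encode(), hashlib.sha1).digest()
).decode()

conn = http.client.HTTPConnection("s3.amazonaws.com")
conn.request("PUT", f"/{bucket}/{key}", body, {
    "Date": date,
    "Content-Type": content_type,
    "Authorization": f"AWS {ACCESS_KEY}:{signature}",
})
print(conn.getresponse().status)  # expect 200 on success
```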

These prices, as Jeff Barr of Amazon explained at SF TechSessions in June, will trend lower, thanks to constant commoditization of hardware and Amazon’s scale. “We are not speculating on the future except to say that we will continue to offer Amazon S3 at extremely competitive pricing by passing along Amazon’s own benefits of scale to Amazon S3 customers,” a company spokesperson said.

S3 is proving to be particularly attractive to community-based media companies – homegrown photos, video, even music. Altexa, Elephant Drive, Jungle Disk, MediaSilo, Ookles, Plum and SmugMug are some of the start-ups currently using Amazon’s S3. Don MacAskill, CEO of online photo-sharing company SmugMug, seems to be one happy customer, and with good reason!

He was facing a hefty tab for storage – SmugMug is adding about ten terabytes worth of photos every month – and he claims to have saved almost $500,000 in storage expenses. His monthly tab just for storage is around $1,500. For comparison, an Apple 7TB Xserve RAID costs about $13,000. Of course there are cheaper options, but it is still a lot of savings.
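As a rough back-of-the-envelope check of that figure, using Amazon’s published prices and the 10TB-per-month number above (this is not SmugMug’s actual bill):

```python
# Storage-only check of the monthly figure quoted above (illustrative).
STORAGE_PRICE = 0.15        # $ per GB per month
UPLOAD_PRICE = 0.20         # $ per GB transferred in
NEW_DATA_GB = 10_000        # ~10 TB of new photos per month

storage_cost = NEW_DATA_GB * STORAGE_PRICE   # $1,500 to store one month's uploads
upload_cost = NEW_DATA_GB * UPLOAD_PRICE     # $2,000 to transfer them in
print(f"storage ≈ ${storage_cost:,.0f}/month, upload ≈ ${upload_cost:,.0f}")
```

Note that this covers only the newest month of uploads; previously stored data keeps accruing the $0.15/GB charge, a point several commenters pick up below.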

S3’s early success makes you think that if on-demand infrastructure can be delivered at an affordable price, the cost of setting up an online business is going to decline even further, perhaps prompting a whole new cycle of entrepreneurial activity. Amazon’s Alexa platform plays into this trend quite well, since it allows developers to process and analyze data on Amazon, store it (on S3), and serve it back out to the world. (Amazon, after all, is the harbinger of Web 2.0 trends.)

Is this extreme commoditization? In the first stage of commoditization, we saw the value shift from specialized chips to all-purpose processors, only to be followed by an appliance movement; the value moved into the software. Maybe what we are seeing now is the early signs of value moving into user experience and developer skill set. (Nick Carr, if you are reading this, do let us know what you think. Jonathan Schwartz, chime in, for I know you know a lot about this trend, first hand!)

Despite its early adoption, it is hard to say whether S3 (or Alexa) is a financial success. Yet it is not hard to imagine S3 as a rallying symbol of on-demand infrastructure, just as Salesforce.com turned software-as-a-service (SaaS) into a viable business option. What do you think?

Nick Carr’s take on this is here.

  1. Amazon should be applauded for S3. However, I can not imagine walking into a VC’s office on Sand Hill Rd. and telling a VC that I’ve implemented my business on top of S3. What if Amazon pulls the plug for some unforeseen reason? Or what if there is a global recession and Amazon changes its mind about pricing and decides to raise its prices? Or another scenario — some fiend later sues Amazon for patent infringement à la the BlackBerry/RIM problems? Oops, we have to shut down the S3 service because some judge forced us to do so, due to a lurking patent that we didn’t know about in July 2006?

    There’s an old saying — never put all your eggs in one basket. It would be great to see a similar large player with “web scale” capability (e.g., Yahoo) offer a competing service to S3 in order for a healthy marketplace to develop. Until such time, I think Om Malik’s cautiousness is valid when he says:

    “Despite my early skepticism”

    Note: no reason why a startup can’t build a nice prototype service on S3.

  2. I’d echo Harrold’s skepticism about the potential reliability issues with S3. I’d also worry about using it for a real-time, online task where the latency of contacting a remote service over a WAN might be an issue for customers.

    By the way, the $1,500/month number sounds suspicious to me. If SmugMug is adding 10TB of data per month, the cost of transferring that to S3 is $0.20 * 10,000 = $2,000 and the cost of storing it on S3 is $0.15 * 10,000 = $1,500/month, so the total cost appears to be $3,500/month just from adding the new data.

    In addition, all the old data is charged $0.15 per GB per month for storage, and $0.20 per GB for the fraction of the old data that is accessed. That’s on top of the $3,500/month for each additional 10TB of data added.

    So it is not clear how these numbers from SmugMug add up, unless they have some special deal with Amazon.
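    A small script makes that compounding concrete — this is just a sketch using the published prices and a flat 10TB/month of new data, ignoring download traffic entirely:

```python
# Sketch: how the S3 bill would grow for a site adding 10 TB every month.
# Uses the 2006 published prices; ignores charges for serving data back out.
STORAGE_PRICE = 0.15       # $ per GB-month
UPLOAD_PRICE = 0.20        # $ per GB transferred in
NEW_GB_PER_MONTH = 10_000  # ~10 TB

stored_gb = 0
for month in range(1, 7):
    stored_gb += NEW_GB_PER_MONTH
    bill = stored_gb * STORAGE_PRICE + NEW_GB_PER_MONTH * UPLOAD_PRICE
    print(f"month {month}: {stored_gb:,} GB stored, bill ≈ ${bill:,.0f}")
# month 1 ≈ $3,500; by month 6 the bill is ≈ $11,000 and still climbing.
```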

  3. Om,

    Have you heard how much data (on average) any of the bigger S3 customers are transferring to and from S3? If my math is right, SmugMug is pushing about 30 Mbps to S3 just from uploads. Do you have any info on whether S3 has been keeping up from a performance perspective?

  4. This week we’ve seen two disparate views on the future of storage: Sun’s X4500 “Thumper” $2/GB server and Amazon’s S3. Luckily they’re both cheap; the own vs. rent decision seems to come down to operational skill instead of pricing.

  5. We’ll see how it turns out. MyBlogLog is going to depend on S3 at least for a while. We’ve got about 30k images up there now (e.g. the GigaOm screenshot from Om’s blog community http://s3.amazonaws.com/buzzsh/2006031413475299sh.png ), which will grow to several hundred thousand in the next month or so.

  6. Glad to read such a nice piece of information.

  7. SmugMug demonstrates the need and the ability of every serial entrepreneur to be ABLE to upload and download photos and videos at affordable prices… this backs my Level 3 investment, because those 10TBs of data only indicate to me that everything is going digital and everything is IP… and the video explosion is about to HIT come July 25th…
    skibare

  8. Jake Kaldenbaugh Monday, July 17, 2006

    Doesn’t this raise the question: if this service is successful and becomes supported by additional players (Salesforce’s AppExchange, Google Base, etc.), what happens when the provided platform becomes competitive with the core offering? Won’t people want a pure-play provider that won’t have those conflict-of-interest issues?

  9. I don’t know why everyone still claims that Amazon S3 is very cheap (especially for large-scale deployments like SmugMug).

    Let’s look at the facts:
    – the price per Mbit in most co-location facilities is ~$30/month (if you buy > 10 Mbit). For 1,000GB of transfer you need ~3 Mbit, or ~$90/month
    – a dedicated server with 1,000GB of transfer included is ~$100/month
    – S3-based transfer of 1,000GB costs $200/month

    So unless you are a very small operation, there is not much business sense in using S3…

    But if they add an Akamai-like service that allows locality-based balancing of these files (as they use for their own web site), I may consider them as a replacement for our home-grown CDN in SiteKreator (assuming they keep the same prices).
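    For reference, the Mbps-to-GB conversion behind those numbers works out roughly as follows (a sketch using the figures above; real colo bills are usually 95th-percentile, which this ignores):

```python
# Rough conversion from committed bandwidth (Mbps) to monthly transfer (GB),
# comparing a colo's per-Mbps price with S3's per-GB transfer price.
SECONDS_PER_MONTH = 30 * 24 * 3600
COLO_PRICE_PER_MBPS = 30.0    # $/Mbps/month (figure quoted above)
S3_TRANSFER_PRICE = 0.20      # $/GB

def gb_per_month(mbps: float) -> float:
    """GB transferred in a month at a sustained rate of `mbps`."""
    return mbps * 1e6 / 8 * SECONDS_PER_MONTH / 1e9

for mbps in (3, 12):
    gb = gb_per_month(mbps)
    print(f"{mbps} Mbps ≈ {gb:,.0f} GB/month: "
          f"colo ≈ ${mbps * COLO_PRICE_PER_MBPS:,.0f}, "
          f"S3 transfer ≈ ${gb * S3_TRANSFER_PRICE:,.0f}")
```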

  10. lenkov,

    Buying a dedicated server means you have to “pre-pay” for all that bandwidth and storage.

    S3 provides “pay as you go”. In addition, S3 copies your data to several Amazon data centers, effectively giving you multiple copies of the same data. You would have to buy 2 servers just to get 100% data redundancy.

    Plus, fewer headaches when it comes to “scaling”.

  11. Terence,

    I agree that “pay as you go” is better if you don’t have any idea what your consumption will be (a small-scale op).

    But if you know that for the last 6 months you were using between 10 and 12 Mbit, you get 12 (to be on the safe side) for $360, instead of roughly $800 for the equivalent transfer on S3.

    About the redundancy and server/storage cost — for $0.15 per GB you can perfectly well get redundant storage/power/etc. (again, for a large-scale deployment).

    I know this because I’m a much smaller fish than SmugMug and I did the calculation for my own infrastructure (I operate from 4 DCs with ~20 servers per location), and it just doesn’t make sense to use S3. Most likely the story of SmugMug saving $500K is total PR BS.

  12. It would be cool if there were a P2P storage service that had an API on it. In this way you could have distributed storage (with the data perhaps replicated in several places in case one node of the network went down), with API access for FREE.

  13. @Dave, interesting idea.

    I just blogged about S3 and FolderShare earlier this evening. See: http://billday.com/2006/07/24/network-storage-and-file-sharing/

    Maybe what we really need is a combination of both? “S3 BitTorrent”?

  14. It’s good to see someone else worrying about dependency on a single supplier. Coincidentally, that was the theme of today’s post in my blog series “S3 in Business” – I make some suggestions for how Amazon might reassure us: comments welcome!

    The easiest link to the whole series (which covers many other risks and opportunities) is http://www.tunesafe.com.

  15. Amazon is not in this to make stellar profits from on-demand infrastructure. Commoditization of hardware will add to their margin; the introduction of newer services to fend off competition will eat at their margins, making this a balanced game of keeping customers at any cost.

    The bigger game, just like Google’s, is to have more customers use their infrastructure to pass/share their data, leading to better personalized services. That is where the end goal lies.

  16. Amazon’s SQS (Simple Queue Service) released today…

    On the back of the growing success of Amazon’s amazing S3 service, today they have just announced…

  17. It would be great to see one of the open-source web CMS providers tap into S3 as a repository for content. For small/medium businesses this would provide cheap data protection, and an object caching layer would address performance and bandwidth cost concerns.

  18. [...] or Amazon make once mundane and expensive business processes cheap. Store your customer data on Amazon’s S3 storage service; buy computer [processing] power on demand via Amazon EC2. Don’t want to manage your own [...]

  19. [...] presentation of their photographic assets. Headquartered in Mountain View, SmugMug has gotten a lot of press for its incorporation of Amazon’s S3 storage service, but its model is creative on a number [...]

  20. [...] also racking up a number of passionate users who swear by it for reliability and cost savings. Phanfare is just the most recent example, albeit [...]

  21. [...] also racking up a number of passionate users who swear by it for reliability and cost savings. Phanfare is just the most recent example, albeit [...]

  22. [...] also racking up a number of passionate users who swear by it for reliability and cost savings. Phanfare is just the most recent example, albeit [...]

  23. [...] solutions. And even some of the big players like Amazon that have emerged in the storage space, serving Smugmug, and the massive virtual world SecondLife doesn’t have a storage footprint in a datacenter, [...]

  24. [...] a flickr/youtube clone). Who knows? Maybe Amazon just started EC2 when they realized about all that people moving his heavy content to [...]

  25. [...] up the infrastructure for a content delivery network. Startup companies are embracing it for their online storage solution, and even bloggers are starting to use it to host their images and other static media (such as mp3 [...]

  26. [...] or Amazon make once mundane and expensive business processes cheap. Store your customer data on Amazon’s S3 storage service; buy computer [processing] power on demand via Amazon EC2. Don’t want to manage your own [...]

  27. [...] is a popular choice for startups. For example, SmugMug uses S3 as their primary data storage source. There have been a few minor [...]

  28. The outlook is bright with S3 coming in. I really hope we get more online businesses going; that is the next level now that corporations have taken over. This kind of program gives the little guys more of a chance.

    Thanks,
    Tal Lifschitz,
    http://www.cashrichmoney.com

  29. There is a very important question you have to ask yourself before deciding which service to use: what are you really looking for – remote storage, content delivery, or both? These are crucial to distinguish.

    What I observe is that most people treat Amazon S3 as a content delivery service. While this is not inherently wrong, one has to note that S3 was designed specifically to be a STORAGE service.

    The point is, since terabyte hard drives are affordable nowadays and internet traffic grows steadily, the stress falls on content delivery rather than on storage. If you are not concerned about storage, there are much better services suited specifically for content delivery.

    SteadyOffload.com provides an innovative, subtle and convenient way to offload static content. The whole mechanism there is quite different from Amazon S3. Instead of permanently uploading your files to a third-party host, their cachebot crawls your site and mirrors the content in a temporary cache on their servers. Content remains stored on your server while it is being delivered from the SteadyOffload cache. The URL of the cached object on their server is dynamically generated at page-loading time, heavily scrambled and changes often, so you don’t have to worry about hotlinking. This means there is an almost non-existent chance that the cached content gets exposed outside of your web application.

    It’s definitely worth trying because it’s not a storage service like S3, but a service built exactly for offloading static content.

    Watch this:
    http://video.google.com/videoplay?docid=-8193919167634099306 (the video shows integration with WordPress, but it is integrable with any other webpage)
    http://www.steadyoffload.com/
    http://codex.wordpress.org/WordPress_Optimization/Offloading

    The cost of bandwidth comes in under $0.20 per GB – affordable, efficient and convenient. It looks like a startup, but it tempts me very much. Definitely simpler and safer than Amazon S3.

  30. [...] StartUps Embracing Amazon S3 – GigaOM [...]

  31. [...] importantly it is going to be inexpensive.And that will make the service even more attractive to hundreds of small companies who are already using Amazon Web Services for their web operations, who don’t want to sign long contracts with CDN [...]

  32. [...] for data transfers and requests made to the Amazon S3 System. Nevertheless this is still going to save some dollars for start-ups that are using S3 service. The company in making the announcement gave some interesting data [...]

  33. [...] for data transfers and requests made to the Amazon S3 system. Nevertheless, this is still going to save some dollars for startups that are using the S3 service. Amazon, in making the announcement, gave some interesting data [...]

  34. I know this is an old post, but I’m only now realizing and discovering the amazing uses of Amazon S3.

    My research now stems from using QNAP NAS boxes that now integrate with S3 for offsite backups.

    I’m increasingly intrigued by S3 for small-business technology solutions, whether for backups or other applications.

    The web is looking pretty snazzy these days.

