

Jeremy Zawodny, who works for Yahoo and qualifies for the uber-geek title, has come to the conclusion that it is cheaper to store his important data on Amazon’s S3 service than on a server at home.

He argues that it cost him about $1,400 to build his server, and that at current prices he is spending at least $13 a month running it; all told, he puts the true monthly cost at around $35.

By his math, if he transferred all his data now – about 125 gigabytes – storing it would cost him $18.75 per month.

Let’s further assume that I increase that by 1GB/month for the next five years (mostly photos) and transfer about 2GB every week doing backups (log files, mail, and other temporary stuff is much of that). 2GB every week is 8GB every four weeks, which costs another $1.60 every four week “month” for a total of $20.80 per year or $104 over 5 years. ….. Adding it all up, if those guesses are right and we assume that Amazon’s prices don’t fall (they certainly could in a few years), I’d end up paying $1,688. In other words, switching to S3 could save me $587 over five years!
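Plugging S3’s 2006 prices ($0.15 per GB-month of storage, $0.20 per GB transferred) into a short script confirms the sub-figures Zawodny quotes:

```python
# Check the figures quoted from Zawodny's post, using Amazon S3's
# 2006 pricing: $0.15 per GB-month of storage, $0.20 per GB transferred.
STORAGE_PER_GB_MONTH = 0.15
TRANSFER_PER_GB = 0.20

initial_data_gb = 125
monthly_storage = initial_data_gb * STORAGE_PER_GB_MONTH         # $18.75/month

backup_gb_per_week = 2
transfer_per_4_weeks = backup_gb_per_week * 4 * TRANSFER_PER_GB  # $1.60 per 4-week "month"
transfer_per_year = transfer_per_4_weeks * 13                    # 13 four-week "months" per year
transfer_5_years = transfer_per_year * 5

print(f"storage: ${monthly_storage:.2f}/month")
print(f"backup transfer: ${transfer_per_4_weeks:.2f} per 4 weeks, "
      f"${transfer_per_year:.2f}/year, ${transfer_5_years:.2f} over 5 years")
```

His $1,688 total also folds in the growing photo archive and the initial upload, so it isn’t reproduced line by line here, but the per-month and per-year backup numbers check out exactly.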

S3 has become quite a hit with the geek community, and new applications for the storage service are showing up on a daily basis.

Interarchy, a file transfer client for Mac OS X, has recently added support for WebDAV and Amazon S3. If you are in the San Francisco Bay Area, you can learn more about S3 by attending a talk by AWS evangelist Jinesh Varia. In Chicago, AWS evangelist Mike Culver will give an overview of Amazon Web Services, including a demonstration of how developers can easily utilize on-demand server capacity, on October 16, 2006 at 150 N Michigan, 28th floor, at 6 pm.

If you are using S3 and building cool applications and/or using it for virtual work, let us know. You know we are suckers for total geekouts.


  1. We use S3 for the thumbnails at mefeedia.com. Not only is it way cheaper and MUCH easier to scale than storing them on the server, but storing millions of little files isn’t something a regular Linux server deals with well (we had all sorts of unexpected problems with server jobs getting stuck, etc.). S3 has been brilliant.

  2. I do all offsite backups to S3. It’s incredibly cost efficient.

  3. Guys,

    Any suggestions on how someone like me could use this to back up my data, and how does it compare with, say, .Mac? I would love to get some help on that and use it as a guest post. Anyone interested?

  4. I attempted to use S3 for backups for most of the summer. The best Windows client by far I found was Jungle Disk, http://www.jungledisk.com/. While Jungle Disk is planning on integrating backup features into their client, currently you need to find your own backup software. Unfortunately, I was unable to overcome a variety of issues, despite using several different pieces of backup software. The solution? Carbonite! http://www.carbonite.com/ Carbonite is CHEAP ($50) and, even more importantly, JUST WORKS™. It quietly works away in the background, backing up the 50 gigs of data I don’t want to lose. In fact, Carbonite is so easy and functional that I’ve been very strongly recommending it to my technophobe family and friends. Long story short? Consumer clients for S3 aren’t quite there yet; give them a few months, and in the meantime download the Carbonite trial. [FULL DISCLOSURE: I have no financial or other interest whatsoever in any of the companies mentioned here.]

  5. I’m thinking about exploring Amazon’s Elastic Compute Cloud (EC2, currently in beta) along with S3 to provide programmable servers on demand for some projects. Specifically, I’m considering an attempt at the Netflix Prize (http://www.netflixprize.com), and if I do, I’ll need an ad hoc grid to analyze multiple gigabytes of data (the prize data download uncompresses to nearly 2 GB, and analyzing it will likely generate far more data while the analysis is running, as my software clusters and sorts the data sets).

    What’s pretty interesting about the EC2 stuff is that you can either create server images (and store them on S3) OR select from a variety of pre-configured server images – and then you can launch instances of those servers via an API call (and shut them down either via standard shutdown commands, or via another API call). This gives rise to the possibility of server farms that are created on the fly, programmatically, as they are needed (i.e. automatic reactions to being Slashdotted, GigaOmed, TechCrunched, etc.).

    It also points, I think, to the likely future of server hosting and servers in general – rather than spending a lot of time and money buying depreciating assets, you should be able to buy the use of servers on demand, precisely configured for your application’s requirements – and then, instead of leaving them live as targets for hackers, you simply have them automatically turned off when they are no longer needed, paying only for the computation that is used and a very nominal charge for the storage.

    One concern, shared by many others, about using S3 for off-site backups is the bandwidth issue – not on Amazon’s side but on my local network. I would have about 70 GB to back up (at least once) and then about 500 MB–1 GB PER DAY that changes on my local system (possibly much more – if my email were backed up as a new copy of the .pst file each night, that alone would be over 2 GB to back up each night).

    Another concern, as Doug Kaye points out in a blog post (linked to in the comments on Jeremy’s post), is whether backups via S3 will be versioned, or whether they will be more like mirroring drives (probably encrypted) offsite.

    (Which raises another issue/thought – I can’t wait for Leopard from Apple; their new automated backup feature seems really, really good – especially if it could work both with a local backup HD and with an offsite tool such as S3: versioned backups with easy restores, ideally handling backups over the wire/network smoothly. That may be enough to get me to switch from XP to the Mac.)

    Shannon
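The launch-on-demand pattern Shannon describes can be sketched in a few lines. Note that `Ec2Client` below is a hypothetical stub standing in for Amazon’s real API, used only to illustrate the launch/terminate lifecycle and pay-per-hour billing (EC2 small instances cost $0.10/hour at launch):

```python
# Schematic sketch of the "servers on demand" pattern described above.
# Ec2Client is a hypothetical stub, NOT Amazon's real API; it only
# illustrates the launch -> work -> terminate lifecycle and its billing.

class Ec2Client:
    """Stand-in for an EC2-style API: launch instances from a stored
    server image, then shut them down when the job is done."""
    def __init__(self, hourly_rate=0.10):
        self.hourly_rate = hourly_rate   # $0.10/hour per small instance (2006)
        self.running = {}                # instance id -> image id

    def run_instance(self, image_id):
        instance_id = f"i-{len(self.running):04d}"
        self.running[instance_id] = image_id
        return instance_id

    def terminate_instance(self, instance_id):
        del self.running[instance_id]

def burst_compute(client, image_id, n_instances, hours):
    """Spin up a short-lived grid, then tear it down; return the bill."""
    ids = [client.run_instance(image_id) for _ in range(n_instances)]
    # ... the actual analysis jobs would run here ...
    for instance_id in ids:
        client.terminate_instance(instance_id)
    return n_instances * hours * client.hourly_rate

client = Ec2Client()
cost = burst_compute(client, "ami-netflix-analysis", n_instances=10, hours=6)
print(f"10 instances for 6 hours: ${cost:.2f}")
assert not client.running   # nothing left live as a target for hackers
```

The appeal is exactly what the comment says: the grid exists only while the analysis runs, and you pay only for the hours it was up.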

  6. I use Dreamhost.com – it’s even cheaper. At $8/month, that’s $96 per year; over 5 years, Mr. Jeremy would be saving roughly $1,200.

    You get 200 GB of storage space, increasing by 1 GB/week, along with the usual array of web services, all for $8/month. You also get 2 TB (that’s right, terabytes) of transfer, increasing by 16 GB/month. There’s nothing new to learn either – the web administrative interface is ultra easy, and they support FTP, SFTP, and WebDAV. Heck, they’ll even register a domain name for you for free.

    Just so you know, I have no financial connection to Dreamhost–they’d give me a referral bonus if you actually sign up under my reference, but I’m not putting that link here just to prove it. They’re just that good.
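The savings figure in that comment follows directly from Zawodny’s own five-year S3 estimate of $1,688 quoted in the post above:

```python
# Five-year totals: Dreamhost at $8/month vs. Zawodny's estimated
# $1,688 on S3 (from the post above).
dreamhost_5yr = 8 * 12 * 5          # $480 over five years
s3_estimate = 1688
savings = s3_estimate - dreamhost_5yr
print(f"Dreamhost: ${dreamhost_5yr}, savings vs. S3: ${savings}")
```

$1,208, or "roughly $1,200" as the commenter puts it.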

  7. I use Dreamhost too, for my My Pictures folder. The good thing about that is that it’s OK if I forget to back up now and then (I do it manually with FTP). And I have 200 GB of space available, which is more than my laptop hard drive.

    For my My Business folder I use Mozy. It’s free, and every day when I come back to the computer there’s that reassuring “Your files have been backed up” dialog box. It really makes me feel great :) Because it’s the My Business folder (mostly Word docs), I haven’t run into the free size limitation yet.

  8. I use S3 with Jungle Disk and have no problems. At first, I went out and bought an expensive backup drive, only to think about a fire and decide that I needed to use cloud storage instead.

    You cannot beat the price of Amazon’s S3 (Jungle Disk is free to use). In fact, my bill for last month was as follows: I transferred 0.267 GB worth of data, which cost $0.06, and my total storage was 5.095 GB, which cost $0.77 to host, for a grand total of $0.83 for the month. Impossible to beat that price.
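That bill is easy to reproduce at S3’s 2006 prices. Assuming each line item is rounded up to the next cent (an assumption on our part, but one that matches the quoted figures exactly):

```python
import math

# Reproduce the bill above at S3's 2006 prices: $0.20/GB transferred,
# $0.15/GB-month stored. Each charge is assumed to round UP to the
# next cent -- an assumption, but it matches the quoted figures.
def cents_up(amount):
    return math.ceil(amount * 100) / 100

transfer_cost = cents_up(0.267 * 0.20)   # $0.06
storage_cost = cents_up(5.095 * 0.15)    # $0.77
total = transfer_cost + storage_cost     # $0.83

print(f"transfer ${transfer_cost:.2f} + storage ${storage_cost:.2f} "
      f"= ${total:.2f}")
```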

  9. Web Worker Daily » Blog Archive » S3, Online Storage & Data Details « Sunday, October 8, 2006

    [...] Our previous post on using Amazon S3 storage service as a personal back-up option was quite popular. Though many pointed out that backing to that service wasn’t all that easy. Jeremy Zawodny has come up with a list of popular tools to help you with all that. Of the lot, the best one from ease of use standpoint seems to be JungleDisk. [...]

  10. Has anyone checked out Tilana Reserve?

    Tilana Reserve…
    – protects files securely, off-site
    – syncs files between your computers
    – includes Web access
    – keeps versions when you save
    – archives deleted files
    – makes it easy to retrieve files
    – works in the background

    You can use Tilana Reserve on as many computers as you want. The software doesn’t cost anything – just download it for free from the website.

    Whenever you create a file, or save changes, Tilana Reserve updates the files to your personal space at the Tilana Reserve off-site data center.

    You don’t have to swap out tapes, manage backup disks, or even remember to press a button, like some external backup drives.

    When you protect files in Tilana Reserve, you can also sync them between any of your computers on the same account. Every time you save a file, the new version gets protected in the data center, and your other computers automatically get it from there, so they’re always current.

    Check it out at http://www.tilana.com
