To get a sense of how big a trillion is, Amazon’s Jeff Barr, in a blog post announcing the new peak, offers the following examples: “That’s 142 objects for every person on Planet Earth or 3.3 objects for every star in our Galaxy. If you could count one object per second it would take you 31,710 years to count them all.” I recently heard TED founder Richard Saul Wurman put the national debt in similar terms: to rack up a trillion-dollar debt, you’d have to lose $1 million a day, every day, for about 2,739 years.
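Those back-of-envelope figures are easy to verify. Here’s a minimal Python sketch; the constants (a roughly 7 billion world population circa 2012, a 365-day year) are my own approximations, not Amazon’s:

```python
# Rough sanity checks on the trillion-object comparisons above
SECONDS_PER_YEAR = 60 * 60 * 24 * 365  # 31,536,000

objects = 1_000_000_000_000       # one trillion
world_population = 7_000_000_000  # ~7 billion people, circa 2012

print(objects / world_population)      # ~142 objects per person
print(objects / SECONDS_PER_YEAR / 1)  # ~31,710 years at one object per second
print(1_000_000 / 365)                 # ~2,739 years losing $1M per day
```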
So, yeah, Amazon S3 is a wildly successful cloud storage service attached to a wildly successful cloud computing platform overall. And as more big data services use S3 as their storage layer — off the top of my head, I can think of several Hadoop services alone that do — it’s just going to keep growing beyond the scale that web applications alone would demand.
For a bit of perspective on its rate of growth, consider the following: As of October, it was hosting 566 billion objects, growing to 762 billion in January and 905 billion in April. According to Barr, “Lately, we’ve seen the object count grow by up to 3.5 billion objects in a single day (that’s over 40,000 new objects per second).”
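Barr’s parenthetical checks out, too. A quick sketch (using his stated peak figure, nothing else assumed):

```python
# Does 3.5 billion objects per day really mean "over 40,000 per second"?
peak_objects_per_day = 3.5e9
seconds_per_day = 24 * 60 * 60  # 86,400

rate = peak_objects_per_day / seconds_per_day
print(rate)  # ~40,509 objects per second
```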
How soon until we hit the quadrillion mark?
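Not soon, if growth stayed linear. A naive extrapolation — my own back-of-envelope, assuming S3 somehow sustained Barr’s peak daily rate indefinitely, which it obviously won’t do linearly:

```python
# Naive linear extrapolation from one trillion to one quadrillion objects
current_objects = 1e12        # the new milestone
quadrillion = 1e15
peak_rate_per_day = 3.5e9     # Barr's peak daily growth figure

days = (quadrillion - current_objects) / peak_rate_per_day
print(days / 365)  # ~782 years at a constant peak rate
```

The real answer depends on growth compounding, which is exactly what the October-to-April numbers suggest is happening.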
Feature image courtesy of Shutterstock user Slavoljub Pantelic.