On The Web

This article in IEEE Spectrum highlights some interesting research into making cloud computing more efficient by balancing the carbon footprints of global data centers against the latency of serving requests from those data centers. It’s an admittedly incomplete study, and one that’s probably more important to large web companies (e.g., Facebook) and cloud providers than to normal businesses just consuming cloud resources. Smarter load balancing could help providers cut operational costs and pass the savings on to users. Of course, large cloud users such as Netflix might want to research and develop their own systems as well.
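To make the tradeoff concrete, here’s a minimal sketch of the kind of scoring a carbon-aware load balancer might do. The data center names, carbon-intensity figures and latency numbers below are invented for illustration; they aren’t from the study.

```python
# Toy carbon-aware request routing: pick the data center that minimizes a
# weighted combination of carbon intensity and user-perceived latency.
# All numbers below are invented for illustration.

DATA_CENTERS = {
    "us-east":      {"carbon_g_per_kwh": 450, "latency_ms": 40},
    "eu-west":      {"carbon_g_per_kwh": 300, "latency_ms": 95},
    "us-northwest": {"carbon_g_per_kwh": 120, "latency_ms": 70},  # hydro-heavy grid
}

def score(dc, carbon_weight=0.5):
    """Lower is better; carbon_weight trades off footprint vs. responsiveness."""
    # Normalize each term to a rough 0-1 range before mixing.
    carbon = dc["carbon_g_per_kwh"] / 500.0
    latency = dc["latency_ms"] / 100.0
    return carbon_weight * carbon + (1 - carbon_weight) * latency

def pick_data_center(carbon_weight=0.5):
    return min(DATA_CENTERS, key=lambda name: score(DATA_CENTERS[name], carbon_weight))

if __name__ == "__main__":
    print(pick_data_center(carbon_weight=0.7))  # favors the cleaner grid
    print(pick_data_center(carbon_weight=0.1))  # favors the closer data center
```

The interesting design question is simply where to set that weight: a provider worried about cost and carbon nudges it up, while a latency-sensitive service nudges it down.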

In Brief

Researchers from the National University of Singapore and the King Abdullah University of Science and Technology in Saudi Arabia have developed a new method of storing data on magnetoresistive random-access memory, or MRAM, chips that they claim can retain data for at least 20 years. Some believe MRAM has promise in future consumer devices and in embedded systems because it’s faster, denser and longer-lasting than traditional DRAM and flash memory. However, it’s not exactly clear how revolutionary the researchers’ work is: An Arizona-based company called Everspin already produces MRAM technology it claims can last more than 20 years.

In Brief

IDC has released its forecast for the big data market and predicts it will grow to $32.4 billion by 2017, at a compound annual growth rate of 27 percent. The fastest-growing segment will be cloud infrastructure, which IDC predicts will grow at 49 percent per year. Last year, IDC predicted a $23.8 billion market by 2016 at a CAGR of 31.7 percent. It also predicted that storage would be the fastest-growing segment, at 53.4 percent. The research firm acknowledges the changes and attributes them (and other changes in methodology) to fluidity in a new, fast-moving market — a fair point and a smart decision if the company wants its forecast to remain relevant.
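For reference, here’s a quick sketch of the arithmetic behind a compound annual growth rate like the ones IDC cites. The four-year span and the implied base value are assumptions for illustration, not figures from the report.

```python
# Compound annual growth rate (CAGR) arithmetic behind forecasts like IDC's.
# The four-year span used below is assumed for illustration only.

def project(start_value, cagr, years):
    """Project a value forward at a constant compound annual growth rate."""
    return start_value * (1 + cagr) ** years

def implied_cagr(start_value, end_value, years):
    """Back out the CAGR implied by a start value, end value and time span."""
    return (end_value / start_value) ** (1 / years) - 1

if __name__ == "__main__":
    # Working backward from the headline: $32.4B in 2017 at a 27% CAGR
    # implies a base of roughly 32.4 / 1.27**4, or about $12.5B, four years earlier.
    base = 32.4 / (1.27 ** 4)
    print(f"Implied base four years earlier: ${base:.1f}B")
    print(f"Check: {implied_cagr(base, 32.4, 4) * 100:.1f}% CAGR")
```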

In Brief

Datameer, one of the first companies to offer software for easily analyzing data stored in Hadoop (it gives users a spreadsheet interface for working with that data), has raised a $19 million series D round of venture capital. Next World Capital led the round, which also included its existing investors as well as Workday, Citi Ventures and Software AG. Datameer has proven pretty resilient even as other early Hadoop-ecosystem startups have suffered, in part, I assume, because it has been proactive about product design — it was quick to adopt HTML5 and add advanced visualization options — and about getting the product into potential users’ hands via single-node and laptop versions.

photo: Yahoo Finance

Violin Memory CEO Don Basile has been terminated amid a spate of class-action lawsuits, a tumbling stock price and concerns over his compensation. He is being replaced on an interim basis by the chairman of Violin’s board of directors. Read more »

In Brief

Open source database startup RethinkDB has raised an $8 million series A round, led by Highland Capital Partners. Like MongoDB, RethinkDB is designed to store JSON documents, but it boasts administration, scalability and other features that MongoDB doesn’t offer. Even getting close to MongoDB in terms of users (especially meaningful ones running in production) will be a tall order, but RethinkDB has raised $12.2 million in capital thus far, which isn’t exactly chump change when it’s all going right back into the product and the community.
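For a sense of the document model, here’s a minimal sketch using RethinkDB’s Python driver as it existed around this time. The table name, sample document and local connection settings are placeholders, so check the driver docs before relying on the exact calls.

```python
# Minimal RethinkDB example: store and query JSON documents with the
# Python driver (ReQL). Connection details and data are placeholders.
import rethinkdb as r

conn = r.connect(host="localhost", port=28015, db="test")

# Create a table for JSON documents (skip if it already exists).
if "posts" not in r.table_list().run(conn):
    r.table_create("posts").run(conn)

# Insert a document; there is no schema to declare up front.
r.table("posts").insert({
    "title": "Hello RethinkDB",
    "tags": ["databases", "json"],
    "views": 42,
}).run(conn)

# Query it back: filter on a field, much as you would with a
# document store in the MongoDB mold.
for doc in r.table("posts").filter(r.row["views"] > 10).run(conn):
    print(doc["title"])

conn.close()
```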

On The Web

O’Reilly Radar has a useful post from Jetpac CTO Pete Warden on how his company, which offers a visual guide to popular places, uses Amazon EC2, Hadoop and an open source computer-vision library called OpenCV to analyze Instagram images. It’s amazing how pervasive cloud computing and big data technologies have become, and how fast they’ve evolved consumers’ expectations of what an application should be. The bad news is that developers need to get smart about how to process lots of data. The good news is that the tools to do it are getting less expensive by the day.
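As a flavor of what that kind of image analysis looks like, here’s a toy OpenCV example that estimates how much of a photo is blue sky. It’s a generic illustration, not Jetpac’s actual pipeline, and the file path and HSV thresholds are assumptions.

```python
# Toy OpenCV analysis in the spirit of large-scale photo mining: estimate the
# fraction of an image that looks like blue sky. Thresholds are rough guesses.
import cv2
import numpy as np

def blue_sky_fraction(path):
    """Return the share of pixels falling in a loose 'blue sky' HSV range."""
    img = cv2.imread(path)
    if img is None:
        raise ValueError(f"could not read image: {path}")

    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

    # Loose bounds for sky-like blues (hue ~100-130 in OpenCV's 0-179 scale).
    lower = np.array([100, 50, 120])
    upper = np.array([130, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)

    return float(np.count_nonzero(mask)) / mask.size

if __name__ == "__main__":
    # Placeholder path; in a Hadoop job this function would run over
    # millions of downloaded images instead of a single file.
    print(blue_sky_fraction("photo.jpg"))
```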

An example Suro workflow. Source: Netflix

Netflix has open sourced a tool called Suro that collects event data from disparate application servers before sending it on to other data platforms such as Hadoop and Elasticsearch. It’s more big data innovation that will hopefully find its way into the mainstream. Read more »
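Conceptually, a collector like this sits between producers and sinks and routes events by type. The sketch below is a generic Python illustration of that pattern, not Suro’s actual Java API, and the sink names and event fields are invented.

```python
# Generic event-routing pattern of the kind Suro implements: application
# servers emit events, a collector fans them out to sinks (e.g., HDFS for
# batch analysis, Elasticsearch for search). An illustration, not Suro's API.
from collections import defaultdict

class Sink:
    """Stand-in for a real sink such as an HDFS writer or an Elasticsearch indexer."""
    def __init__(self, name):
        self.name = name
        self.events = []

    def write(self, event):
        self.events.append(event)

class Collector:
    def __init__(self):
        self.routes = defaultdict(list)  # event type -> list of sinks

    def add_route(self, event_type, sink):
        self.routes[event_type].append(sink)

    def collect(self, event):
        # Fan each event out to every sink registered for its type.
        for sink in self.routes.get(event.get("type"), []):
            sink.write(event)

if __name__ == "__main__":
    hdfs, es = Sink("hdfs"), Sink("elasticsearch")
    collector = Collector()
    collector.add_route("pageview", hdfs)
    collector.add_route("error", es)

    collector.collect({"type": "pageview", "path": "/home"})
    collector.collect({"type": "error", "msg": "timeout"})
    print(len(hdfs.events), len(es.events))  # 1 1
```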

On The Web

This is an interesting (and pretty funny) post from MailChimp data scientist John Foreman about analyzing email addresses. For example, Gmail and Hotmail are similar in terms of number and age of users (although possibly for different reasons), as well as preferred browser. AOL and Comcast email users, on the other hand, are older and interested in way different things than Gmail users. Oh, and a surprising number of people still use the AOL browser.
