
Summary:

In a quest to make faster chips and deliver low-power computing, scientists have created good-enough chips that, instead of performing every calculation to its exact decimal point, are allowed to make mistakes. This field of computing could improve big data analysis, networking and even hearing aids.

Avinash Lingamneni uses the new pruning technique.

In a quest to keep pushing the envelope on faster chips and low-power computing, scientists have embarked down the path of creating good-enough chips that, instead of performing every calculation to its exact decimal point, are allowed to make mistakes. Such chips may help with big data processing, networking, and even specific-application chips such as those for hearing aids or camera sensors. Today, Rice University and a team of international researchers released another finding in this field of probabilistic computing with their discovery that by cutting away parts of an integrated circuit that are rarely used, they can double the performance while cutting the power.

Krishna Palem, the Ken and Audrey Kennedy Professor of Computing at Rice University in Houston, who holds a joint appointment at Nanyang Technological University (NTU) in Singapore, said the team has boosted performance and cut energy use by eliminating unnecessary portions of the integrated circuits typically used in hearing aids, cameras and other multimedia devices. The technique is called “probabilistic pruning.” The researchers found that by accepting an 8 percent error rate, they can cut energy consumption in half and double performance, all with a smaller chip.
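The tradeoff described above can be illustrated with a toy model. This sketch is not the researchers’ actual circuit technique; it simply truncates the least-significant bits of an addition, standing in for the idea of removing rarely needed hardware in exchange for a small, bounded error:

```python
import random

def exact_add(a, b):
    return a + b

def pruned_add(a, b, dropped_bits=3):
    # Toy stand-in for pruning: skip the low-order carry logic by
    # truncating the least-significant bits of each operand.
    mask = ~((1 << dropped_bits) - 1)
    return (a & mask) + (b & mask)

random.seed(0)
samples = [(random.randrange(1 << 16), random.randrange(1 << 16))
           for _ in range(10_000)]
errors = [abs(exact_add(a, b) - pruned_add(a, b)) / max(exact_add(a, b), 1)
          for a, b in samples]
avg_rel_error = sum(errors) / len(errors)
print(f"average relative error: {avg_rel_error:.4%}")
```

On random 16-bit inputs the average relative error stays tiny, which is why applications like audio or image processing, where small numeric errors are imperceptible, tolerate this kind of hardware shortcut so well.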

Palem and collaborators at Switzerland’s Center for Electronics and Microtechnology (CSEM) are unveiling the new “pruning” technique this week in Grenoble, France, at DATE11, a European conference on microelectronics design, automation and testing. Pruning is another method of building chips that can deliver results without consuming so much power. We covered Palem’s efforts in this space two years ago:

Reminiscent of the Infinite Improbability Drive in the Hitchhiker’s Guide to the Galaxy, a PCMOS chip has a tolerance for randomness that researchers have termed “probabilistic logic,” which it uses to solve problems rather than the Boolean logic used in digital circuits. Instead of making calculations that are accurate pretty much all of the time, a designer can program this chip to be right 8 times out of 10 — or even less. Lowering the threshold of accuracy generates the power savings and speed.
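The “right 8 times out of 10” idea can be sketched in software. The model below is hypothetical, not PCMOS itself: a logic gate that returns the correct answer with a designed probability and occasionally flips its output, the way a probabilistic circuit trades accuracy for power:

```python
import random

def probabilistic_and(a, b, p_correct=0.8):
    """Toy AND gate that is correct with probability p_correct and
    flips its output bit otherwise (a stand-in for the energy/accuracy
    tradeoff in probabilistic logic)."""
    result = a & b
    if random.random() > p_correct:
        result ^= 1  # occasional error, in exchange for hypothetical power savings
    return result

random.seed(42)
trials = 100_000
correct = sum(probabilistic_and(1, 1) == 1 for _ in range(trials))
print(f"measured accuracy: {correct / trials:.1%}")
```

Over many trials the measured accuracy converges to the designed 80 percent threshold, and lowering `p_correct` further is what generates the additional power savings and speed described above.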

The test prototypes contain both traditional circuits and pruned circuits that were produced side by side on the same silicon chip, and the goal is to begin designing a prototype chip that relies on probabilistic pruning for a hearing aid this summer. With the power savings, the hope is that a hearing aid or even a camera sensor could run four to five times longer on a single battery than it does using normal chips. The consensus is that with certain applications such as auditory or visual processing, having a chip that isn’t exactly accurate will still create workable representations of the real world.

Hearing aids aren’t the only application where researchers want to deploy probabilistic computing. Om has written about how Lyric Semiconductor, a firm coming out of MIT, hopes to use probabilistic computing to help process big data even faster. The company has raised $20 million to help improve Flash memory, networking functions and big data processing through its technology. The catch for Lyric, and for any company with probabilistic computing goals, is that it changes the way software is written and thought about. The need or perceived benefits will have to be huge to justify the programming changes; otherwise, applications that adopt the specialized computing will be confined to niche spaces.


  1. From the first paragraph it appears GigaOm is using such processors to run spell-checking functions. If so, the error rate is more than acceptable.

    Sounds like this works for streams or blocks of data where the majority of correct data can determine a pattern or context to allow correction of the occasional bad data (so we read ‘chips’ in the first paragraph even though it is spelled with an extra ‘v’).

    But I’d hate to use one of these for discrete and significant calculations.

    1. It’s actually the reverse. The copy editor is, unfortunately, not running a chip utilizing probabilistic pruning, and is therefore too slow to get to all posts before they publish. I do try, though.

  2. I don’t quite understand this. For example, software always assumes zero errors in the code and zero errors in the memory. The only aspect that is normally allowed any tolerance is I/O.

  3. Google Tandem and TMR (Triple Modular Redundancy).
    Apps don’t need to change to run on faulty hardware.
