Don’t eat that! SRI built a calorie-counting food app that works via a photo snap

The wonks over at SRI, the lab responsible for technology behind Apple’s Siri, have come up with a pretty sweet way to apply image recognition to food. Dror Oren, executive director at SRI Ventures, says SRI has built a method that could let users snap a photo (or several) of their plates to get an approximate calorie count. Users of Meal Snap, which uses a camera and Amazon’s Mechanical Turk service to count calories, will find the idea familiar. (There’s also this futuristic device that uses laser spectroscopy to measure what’s in your food.)

Instead of apportioning the job out to Mechanical Turk or lasers, however, SRI’s project, internally dubbed Ceres, lets computers do the work of estimating calorie count. To do this, the SRI folk had to solve two problems. The first was an image recognition problem — namely, figuring out what’s on your plate. The second was a volume problem, assessing how much is on your plate.

Oren explains that because of hidden fats like the oils and butter used in cooking, you can't get a precisely accurate calorie count for every food, but he stresses that for many people, getting a range (between 400 and 600 calories, for instance) is still helpful. The plan is also to use context clues about where a person is or what they have eaten in the past to refine the estimate. For example, if you're taking a picture of your burrito at Chipotle, Ceres could figure out the right questions to ask to narrow in on a more accurate calorie count.
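To make the range idea concrete, here's a minimal sketch of how a recognized food plus an estimated volume could yield a calorie range rather than a single number. The food labels, calorie densities, and uncertainty factor are illustrative assumptions on my part, not SRI's actual data or method.

```python
# Hypothetical calories-per-milliliter figures for a few foods.
# These values and the lookup approach are assumptions for illustration.
CALORIE_DENSITY = {
    "rice": 1.3,
    "chili": 1.1,
    "burrito": 2.0,
}

# Hidden fats (oils, butter) make an exact count impossible, so widen
# the point estimate into a range with a fixed uncertainty factor.
UNCERTAINTY = 0.25

def calorie_range(food_label, volume_ml):
    """Return a (low, high) calorie estimate for a recognized food."""
    midpoint = CALORIE_DENSITY[food_label] * volume_ml
    return (round(midpoint * (1 - UNCERTAINTY)),
            round(midpoint * (1 + UNCERTAINTY)))

print(calorie_range("rice", 200))  # a bowl of rice -> (195, 325)
```

Context clues like a known restaurant menu would, in this framing, simply replace the generic density table with item-specific numbers, tightening the range.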

Norman Winarsky, the VP of SRI Ventures, expanded on the concept, suggesting that conversations overheard at the table could be sent to the cloud, along with other information, perhaps pulled in from apps like food-logging services. The primary technology, though, is the image recognition. “If a human can recognize the food, so can the app,” said Oren.

I’m a bit skeptical, given the preponderance of lookalike foods (wheat bread that resembles white bread, say, or chili made with turkey instead of ground beef), but I like the idea and the simplicity. The image processing and recognition are done in the cloud, and the algorithms were developed by many members of the same team that helped invent HD television.

The idea to recognize and track food came from the National Institutes of Health, and the plan is to find partners who can parlay the image and volume recognition technology into mainstream consumer life. It might be an app, a partnership with a company like The Orange Chef, or even integration into cooking shows.

Winarsky says SRI is still looking for more partners and expects the technology to hit the mainstream in about a year. I’m just excited that we may soon have an easier way to count calories and the beginnings of a database for food.

6 Responses to “Don’t eat that! SRI built a calorie-counting food app that works via a photo snap”

  1. Peter Simones

    The fallacy here – and in most apps focused on food consumption – is that calorie tracking is necessary/correct. TwoGrand is an app and community built around photo food journaling without calorie counting. You can log a full day of food and exercise in under 3 minutes. No app should get in the way of actually eating your food. And the act of journaling alone provides 90% of the value of tracking your intake.

    • I agree completely but did not want to engage in that discussion. It would be interesting to know, though, whether the founders believe in calorie tracking or do it only because the mainstream demands it. The latter would be more problematic.

  2. I think it’s better both for SRI and the possible partners that the deal never gets made, simply because this will never work. I’m talking as an image recognition expert and I think this concept is full of holes.

    – “If a human can recognize the food, so can the app,” said Oren. –
    Well, this is not the problem you are solving. You should say ‘if a human can count the calories in a plate, so can the app’. Have you tested this? I seriously doubt that this is possible, even to a 1000-calorie range, given the diversity and complexity of foods.

    Guys, there are many other unsolved problems out there, please find something more realistic.

  3. djones

    Carbs. This needs to have the amount of carbohydrates added to its information. Doing this would be HUGE for Type 1 Diabetics counting carb loads for bolusing insulin.
    HUGE, I tell you.

  4. realjjj

    That’s better for glasses than phones, and it should integrate location into guessing what the food is, since that would help them build a database quickly for places that serve food. Next step would be to learn the user’s habits, ofc.
    But then again, if they can do image recognition well, they can make lots of apps for glasses that help people build, repair things, cook, do gardening and so on.