Summary:

Thanks to its camera, Google Glass can see the world around it. And with Emotient’s software, Glass can even read the emotions of people, offering real-time data that could be useful in education, retail and other market segments.

San Diego startup Emotient has raised $6 million in Series B funding and announced the private beta of a new Google Glass app that uses the company’s Emotion API technology. The Next Web says the Glassware builds on Emotient’s earlier work reading emotions from people in images or in real time through camera software. Beta testers running Emotient’s software on Glass will get instant feedback on the emotions of the people Glass sees.

Emotient says the Google Glass beta software provides “Sentiment Analysis”:

The Emotient software processes facial expressions and provides an aggregate emotional read-out, measuring overall sentiment (positive, negative or neutral); primary emotions (joy, surprise, sadness, fear, disgust, contempt and anger); and advanced emotions (frustration and confusion). The Emotient software detects and processes anonymous facial expressions of individuals and groups that the Glass wearer sees to determine an aggregate sentiment read-out; it does not store video or images.
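Loosely, you can picture that aggregate read-out as a small data structure. The sketch below is purely illustrative: the class and field names are my own assumptions based on the categories quoted above, not anything from Emotient’s actual Emotion API.

```python
# Purely illustrative model of the aggregate read-out described above.
# "SentimentReadout" and all field names are assumptions; Emotient's real
# Emotion API may look nothing like this.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class SentimentReadout:
    overall: str  # "positive", "negative" or "neutral"
    # Primary emotions, each scored as an assumed 0.0-1.0 intensity
    primary: Dict[str, float] = field(default_factory=lambda: dict.fromkeys(
        ["joy", "surprise", "sadness", "fear", "disgust", "contempt", "anger"], 0.0))
    # Advanced emotions
    advanced: Dict[str, float] = field(default_factory=lambda: dict.fromkeys(
        ["frustration", "confusion"], 0.0))

# Per Emotient's description, such a read-out is an aggregate over the
# anonymous faces in view; no video or images are stored to produce it.
```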

While someone wearing Glass can very likely gauge the general sentiment of the person they’re speaking with, there’s still value in software that understands emotions as well. Emotient will be testing the Glassware in the retail sector, for example, where vendors could better understand reactions to certain promotions or products; theoretically, the software could even suggest alternative selling points based on a customer’s emotions.

The auto industry is keen on Emotient’s technology as well. Ken Denman, CEO of Emotient, told The Next Web that Honda was an early Emotient customer because the technology can help engineers and designers understand which features customers do and don’t like in a vehicle.

The beta software isn’t something that most Google Glass owners would likely use today, but I see potential for the technology in industries beyond retail and automotive.

Educators and teaching apps could tailor learning modules to a student’s frustration level, for example, as sketched below. Restaurants could quickly see that diners aren’t enjoying their meal even before anyone asks, “How did everything turn out?” Again, humans can already read surface emotions fairly well, but software tools that sharpen that read could be very useful.
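
As a toy illustration of the education case, a tutoring app might gate module difficulty on aggregate emotion scores. The function name and thresholds here are hypothetical, not part of any Emotient product:

```python
# Toy illustration of the education idea above. The signature and thresholds
# are hypothetical assumptions; nothing here is Emotient's actual API.
def next_difficulty(frustration: float, joy: float, current: int) -> int:
    """Pick the next module difficulty from aggregate emotion scores (0.0-1.0)."""
    if frustration > 0.6:                 # student is struggling: ease off
        return max(1, current - 1)
    if joy > 0.7 and frustration < 0.2:   # engaged and comfortable: step up
        return current + 1
    return current                        # otherwise hold steady
```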
