Name says it all

Be My Eyes app harnesses volunteers to help sight-impaired people

The Danish developer Robocat, which I’ve previously covered for its Thermodo thermometer dongle, has fully launched a new app that could be of great use to visually-impaired people.

Robocat built the open-source Be My Eyes iOS app for a non-profit startup of the same name. It’s quite a simple concept: visually-impaired people use the camera on their mobile device to shoot live video of whatever they need identified or read, and a sighted volunteer on the other end tells them what they need to know.

“We have launched this locally in Denmark for a few months now. We have about 700 helpers. Apple is planning to feature the app in the App Store which will help to promote it and get more helpers,” Robocat founder and lead developer Willi Wu told me. “We are not only looking for helpers, but also blind people so they can get the help when they need it. So we are trying to get the word of mouth out to people who know blind people that could be useful for them.”

Right now the idea is to get traction on iOS before looking at other platforms, Wu said, pointing out that this is a non-profit venture. He added that the app’s points and feedback system makes it possible to rate the helpers.

As I say, it’s a straightforward concept that could be very useful, as long as that volunteer base grows sufficiently and becomes diverse enough to handle visually-impaired users who speak different languages, for instance. Other interesting apps in this space include KNFB Reader, which provides an audio read-out of printed text captured through the camera, and others that you can find listed here.

Here’s a video of how Be My Eyes works:

[vimeo 113872517 w=500 h=281]

This article was updated at 4.45am PT to note Wu’s comments on mobile platforms and volunteer rating.

8 Responses to “Be My Eyes app harnesses volunteers to help sight-impaired people”

  1. Jeannette Isabella

    I just got a notification but then it disappeared. I really wanted to help and now it tells me to sign in. I must be signed in already to have gotten a notification. Can anyone tell me what I did wrong? Thank you.

  2. Martijn van der Spek

    We have developed a visual recognition app as well: ‘Talking Goggles’. This doesn’t use volunteers as it uses computer vision to recognize objects, but it is free and available for Android as well as iOS.

  3. we hope quality captioning is added asap to this video – we believe “deaf/hoh” may help blind and “blind” may help deaf, and there are also many “deaf/blind” in our wide wonderful world – we also have some volunteer captioning folks who may have time to help – ccacaptioning@gmail.com

  4. Michael W. Perry

    Clever idea. I do hope they have an option for location-specific help. Navigating around a city will go better if the sighted person knows the city and can say, “Oh, you’re at Third and Vine. The place you want to go is just around the corner.”

  5. How do they select helpers? You can’t have kids having fun sending sight-impaired people into walls or worse.
    This would be better for Google Glass though and Glass would also act as a magnifier for those that have some vision. Glass has audio too, without occupying a hand.
    Or on the LG G3, using the auto-focus laser would be useful.
    So I guess glasses with Project Tango would be kinda ideal.