Be My Eyes app harnesses volunteers to help sight-impaired people


Credit: Be My Eyes

The Danish developer Robocat, which I’ve previously covered for its Thermodo thermometer dongle, has fully launched a new app that could be of great use to visually-impaired people.

Robocat built the open-source Be My Eyes iOS app for a non-profit startup of the same name. It’s quite a simple concept: visually-impaired people use the camera on their mobile device to shoot live video of whatever they need identified or read, and a sighted volunteer on the other end tells them what they need to know.

“We have launched this locally in Denmark for a few months now. We have about 700 helpers. Apple is planning to feature the app in the App Store, which will help to promote it and get more helpers,” Robocat founder and lead developer Willi Wu told me. “We are not only looking for helpers, but also blind people so they can get the help when they need it. So we are trying to get the word out to people who know blind people for whom it could be useful.”

Right now the idea is to get traction on iOS before looking at other platforms, Wu said, pointing out that this is a non-profit venture. He added that the app’s points system and feedback make it possible to rate the helpers.

As I say, it’s a straightforward concept that could be very useful, as long as that volunteer base grows sufficiently and becomes diverse enough to handle visually-impaired users who speak different languages, for instance. Other interesting apps in this space include KNFB Reader, which provides an audio read-out of printed text captured through the camera, and others that you can find listed here.

Here’s a video of how Be My Eyes works:

This article was updated at 4.45am PT to note Wu’s comments on mobile platforms and volunteer rating.


Jeannette Isabella

I just got a notification but then it disappeared. I really wanted to help and now it tells me to sign in. I must be signed in already to have gotten a notification. Can anyone tell me what I did wrong? Thank you.

Martijn van der Spek

We have developed a visual recognition app as well: ‘Talking Goggles’. This doesn’t use volunteers as it uses computer vision to recognize objects, but it is free and available for Android as well as iOS.


We hope quality captioning is added ASAP to this video. We believe the deaf/hard-of-hearing may help the blind, the blind may help the deaf, and there are also many deaf-blind people in our wide wonderful world. We also have some volunteer captioning folks who may have time to help.

Michael W. Perry

Clever idea. I do hope they have an option for location-specific help. Navigating around a city will go better if the sighted person knows the city and can say, “Oh, you’re at Third and Vine. The place you want to go is just around the corner.”


How do they select helpers? You can’t have kids having fun sending sight-impaired people into walls or worse.
This would be better on Google Glass, though, and Glass would also act as a magnifier for those that have some vision. Glass has audio too, without occupying a hand.
Or on an LG G3, the auto-focus laser would be useful.
So I guess glasses with Project Tango would be kind of ideal.


Given the BOM, there is no reason for something like Glass to be more than $200, and it should drop to $100 and less once the category takes off.
According to the WHO:
“285 million people are estimated to be visually impaired worldwide: 39 million are blind and 246 million have low vision.
About 90% of the world’s visually impaired live in low-income settings.”

So $500 would not be anywhere near accessible for most who need it; an effective solution needs to be much cheaper.
Likewise, targeting iOS with this app is nonsense given Apple’s ridiculous pricing. India, for example, has a lot of blind people, and chances are they could find a way to buy a $40 Android phone, but almost none could afford an $840 iPhone.

Comments are closed.