Seeing eyeball to eyeball, through a smartphone

Picture this: Following an intense study session, Grand Valley State University student Juanita Lillie feels like kicking back and watching a favorite movie.

This, like many everyday activities, can be complicated for Juanita because she has a visual impairment.

First, she wants to be sure she really deserves that study break. How long was she actually studying?

Second, which of the audio-described DVDs on her bookshelf is the movie she wants to pop into the player?

When Juanita needs a set of eyeballs, she picks up her iPhone.

Mobile applications that pair the phone's camera with text-to-speech can read her the time on a clock and the title on a DVD case. Some of those apps also have versions that run on Android and Windows mobile devices.

The newest such app is Be My Eyes, which connects people with visual impairments with sighted volunteers who want to help in real time. Using the smartphone camera as a live video feed, the sighted person describes aloud whatever the person who is blind points the device at.

Since Be My Eyes debuted in January 2015, more than 190,000 sighted people, and more than 17,000 people with visual impairments worldwide, have downloaded the free app.

By April 1, the app had made possible 65,000 instances where people who are blind received real-time help from someone they didn’t know.

The app was developed by an employee of the Danish Blind Society who has a visual impairment himself. He recognized that momentarily “borrowing” a working pair of eyes through technology would significantly ease everyday challenges for people who are blind.

Juanita has used Be My Eyes to select food from her freezer that she wants to defrost and to identify canned goods.

During the app’s first week, wait times for assistance often stretched to 15 minutes. Over-eager helpers sometimes quiz her about her disability; Juanita, although grateful for the help, often just needs to get back to her cooking.

She suspects those annoyances will dissipate as Be My Eyes’ novelty fades.

“I have another app that I use most for reading printed text to me,” Juanita said, “but an advantage of Be My Eyes is there’s a person on the other end who can tell you how to position the camera so the type is visible.”

The fact that the app connects to a real volunteer could also be its downfall.

“You should be really careful where you’re pointing the camera,” Juanita said.

For example, using Be My Eyes to distinguish your bottles of shampoo and conditioner would not be wise if you’ve already disrobed for a shower.

Jeff Sykes, assistive technology coordinator at GVSU, predicts the greatest value of Be My Eyes will be realized when a user who is blind becomes lost in a parking lot or some other uniform environment. A real-time connection to someone who can read signposts and detect other visual markers can be critical in such situations.

The app could also be very useful if a person with a visual impairment were in a chaotic situation and unable to determine from his or her other senses how to get out.

Sykes, who has sight, is a registered user of Be My Eyes but has not yet received an assistance call.

About 20 of GVSU’s 25,000 students have low or no vision. Sykes typically teaches them to use a mobile app called Tap TapSee.

People with vision impairments are often fiercely independent and reluctant to seek help even in small measures, Sykes said. Some connect with family and friends over mobile apps like FaceTime and Skype if they’re in a bind, but hate interrupting again and again.

Sometimes a non-human alternative like Tap TapSee is preferred, Sykes said.

The user takes a photo with the phone, and Tap TapSee sends it from the camera roll to a server, which identifies the subject through object-recognition technology. Seconds later, the app speaks the identification aloud.

The process works quickly and well, as long as the user snaps the photo 8 to 12 inches from the object. (Audible auto-focus alerts can serve as a guide.)
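
For readers curious what that round trip looks like under the hood, here is a minimal sketch in Python of the same photo-to-server-to-speech pattern. The endpoint URL, request field and response format are placeholders for illustration, not Tap TapSee's actual (private) interface.

```python
# Rough sketch of the photo-to-description flow described above: upload a
# snapshot to a recognition server, wait for a label, then speak it aloud.
# The URL and field names below are hypothetical placeholders.
import requests
import pyttsx3

RECOGNITION_URL = "https://example.com/recognize"  # hypothetical endpoint

def describe_photo(photo_path: str) -> str:
    """Send a photo to an object-recognition service and return its label."""
    with open(photo_path, "rb") as photo:
        response = requests.post(RECOGNITION_URL, files={"image": photo}, timeout=30)
    response.raise_for_status()
    return response.json().get("description", "No description available")

def speak(text: str) -> None:
    """Read the description aloud with the device's text-to-speech engine."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    speak(describe_photo("camera_roll/latest.jpg"))
```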

Users say the audio descriptions aren’t always precise. A user might be told there are “yellow flowers,” leaving her to wonder whether they’re daffodils, dahlias or dandelions.

“Good app, but it’s not free forever,” Sykes said.

Downloading Tap TapSee is free. After an initial 100 trial photos, users have to choose a plan: the next 100 photos prepaid for $7.99, or three months of unlimited use for $24.99.
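
A quick back-of-the-envelope comparison, using only the prices quoted above, shows where the unlimited plan starts to pay off:

```python
# Cost comparison for the two Tap TapSee plans mentioned above.
PHOTO_PACK_PRICE = 7.99    # 100 prepaid photos
PHOTO_PACK_SIZE = 100
UNLIMITED_PRICE = 24.99    # three months of unlimited use

cost_per_photo = PHOTO_PACK_PRICE / PHOTO_PACK_SIZE   # about $0.08 per photo
break_even_photos = UNLIMITED_PRICE / cost_per_photo  # about 313 photos

print(f"Pay-per-photo: roughly ${cost_per_photo:.2f} each")
print(f"Unlimited plan pays off past ~{break_even_photos:.0f} photos in three months")
```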

Juanita said her hands-down favorite mobile app is KNFB, which loads all the functionality of a stand-alone reading machine into her multi-functional smartphone.

KNFB works similarly to Tap TapSee: character-recognition software reads the printed text in a photograph and instantly speaks it using text-to-speech software.

As long as the print in the photo is clear, the app works great, Juanita said. It doesn’t even need an Internet connection.
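
For the technically curious, the general recipe behind an offline reading app of this kind can be sketched in a few lines. The example below substitutes the open-source Tesseract engine and the pyttsx3 speech library for KNFB's proprietary components; it illustrates the technique, not KNFB's actual code.

```python
# Minimal sketch of on-device text reading: optical character recognition on a
# photo, then text-to-speech, with no network call. Uses open-source stand-ins
# (Tesseract via pytesseract, pyttsx3) purely for illustration.
from PIL import Image
import pytesseract
import pyttsx3

def read_aloud(photo_path: str) -> None:
    """Extract printed text from a photo and speak it."""
    text = pytesseract.image_to_string(Image.open(photo_path))
    if not text.strip():
        text = "No readable text found"
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

read_aloud("snapshots/restaurant_menu.jpg")  # hypothetical photo path
```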

But KNFB, at $100, is pricey.

Be My Eyes is completely free, open-source software developed by a nonprofit organization and powered by volunteers.

It’ll be interesting to see whether it catches on with blind and sighted users.