Wearable Computer-Vision Rig for the Visually Impaired (Case 2241)

Brown University professor Benjamin Kimia, a leader in computer vision, image processing, and artificial intelligence, has developed a wearable device that helps the visually impaired navigate indoor environments. Called BlindFind, the wearable rig pairs cameras, object-recognition algorithms, and motion-sensing technologies that are already common in consumer electronics with a proprietary data set of navigation information about public indoor spaces such as government buildings, airports, and schools. BlindFind can “see” on behalf of its user and guide them via haptic or audio cues.

Market Opportunity
The advent of Global Positioning System (GPS) technology revolutionized the way we navigate the outside world: GPS receivers in our smartphones and vehicles can pinpoint a person’s exact location and route them turn-by-turn to their destination. But navigating indoor spaces is not so simple, and technology to help people find their way around complex buildings has lagged behind.

According to the CDC, roughly 12 million Americans over the age of 40 have a vision impairment, including nearly 1 million who are entirely blind. BlindFind would allow the visually impaired to ask their smart devices to route them to a bathroom, for example, or to a particular gate in an airport. And while Kimia and colleagues developed the device with blind users in mind, sighted people could also use it to find their way through large, complex spaces such as transportation hubs and museums, where old-fashioned analog maps remain the primary wayfinding technology.

Innovation and Meaningful Advantages 
BlindFind is a wearable computer-vision rig that links multiple hardware components to a mobile device, most likely a smartphone. Four cameras—two on the front and two on the sides—are mounted on a custom piece of eyewear. Algorithms process the images from these cameras to determine the wearer’s position in the indoor environment. The system can then direct the wearer to their destination via auditory instructions delivered through bone-conduction headphones, or via a haptic belt that vibrates to indicate which direction to walk.
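The guidance loop described above—estimate the wearer’s pose, then translate it into a left/right/straight cue—can be sketched as follows. This is a minimal illustration under assumed conventions (heading measured counterclockwise from east, a simple 2D map frame); the function names and tolerance are hypothetical and not BlindFind’s actual design.

```python
import math

def heading_error_deg(wearer_heading_deg, wearer_xy, waypoint_xy):
    """Signed angle (degrees, -180..180) the wearer must turn to face the waypoint."""
    dx = waypoint_xy[0] - wearer_xy[0]
    dy = waypoint_xy[1] - wearer_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Wrap the difference into (-180, 180] so "turn left" vs "turn right" is unambiguous.
    return (bearing - wearer_heading_deg + 180) % 360 - 180

def guidance_cue(error_deg, straight_tolerance_deg=15):
    """Map a heading error to a coarse cue a belt motor or voice prompt could deliver."""
    if abs(error_deg) <= straight_tolerance_deg:
        return "straight"
    return "left" if error_deg > 0 else "right"

# Example: the wearer faces east (0 degrees) and the waypoint is due north.
err = heading_error_deg(0.0, (0.0, 0.0), (0.0, 5.0))
print(guidance_cue(err))  # "left" (the waypoint is 90 degrees counterclockwise)
```

In a real system the pose estimate would come from the camera images and motion sensors, and the cue would drive a specific belt motor or audio prompt rather than a string.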

A key part of BlindFind is its ability to improve its navigational performance over time. As wearers move through a building, images of that space are stored in a central data repository, and with each traversal the system refines its understanding of the layout. Users can also annotate the map with points of interest, such as the location of a bathroom or an obstacle that must be navigated around.
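The shared, user-annotated map described above might be organized along these lines. This is an illustrative sketch only; the class names, fields, and coordinate scheme are assumptions for the example, not BlindFind’s actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class PointOfInterest:
    label: str       # e.g. "restroom", "low-hanging obstacle"
    position: tuple  # (x, y) coordinates in the building's map frame
    added_by: str    # the user who contributed the annotation

@dataclass
class IndoorMap:
    building_id: str
    points_of_interest: list = field(default_factory=list)

    def annotate(self, label, position, user):
        """Record a user-contributed point of interest on the shared map."""
        self.points_of_interest.append(PointOfInterest(label, position, user))

    def find(self, label):
        """Return all annotations matching a label, e.g. every known restroom."""
        return [p for p in self.points_of_interest if p.label == label]

# Two users traverse the same building and contribute annotations.
m = IndoorMap("airport-terminal-b")
m.annotate("restroom", (12.0, 4.5), "user-1")
m.annotate("obstacle", (3.0, 7.2), "user-2")
print(len(m.find("restroom")))  # 1
```

Because annotations carry a position in a shared map frame, a later wearer’s route can be planned toward any labeled point that earlier users contributed.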

Collaboration Opportunity
We are seeking a licensing opportunity for this innovative technology. Potential customers include the visually impaired and operators of government buildings or transportation hubs.

Principal Investigator
Benjamin B. Kimia, PhD
Professor of Engineering
Brown University
Brown Tech ID #2241
Benjamin_Kimia@brown.edu


For Information, Contact:
Benjamin Kimia
Brown Technology Innovations
350 Eddy Street - Box 1949
Providence, RI 02903