Sophisticated Smartphone Apps Help Blind Users with Indoor Navigation



Researchers at UC Santa Cruz have developed two innovative smartphone applications designed to assist blind individuals in navigating indoor spaces. These apps provide spoken directions, enabling users to safely find their way in areas where GPS is ineffective, such as buildings with thick walls that block satellite signals.

In a recent paper published in ACM Transactions on Accessible Computing, Professor of Computer Science and Engineering Roberto Manduchi and his team introduce two distinct applications that allow users to navigate to specific points and trace back their previous routes. These apps are designed to operate without requiring users to hold their smartphones in front of them, which can be cumbersome and draw unnecessary attention.

Smartphones serve as an ideal platform for this technology due to their affordability and built-in sensors, which are critical for navigation. Unlike systems that require users to hold their phones out in front of them, Manduchi’s apps leave users’ hands free for guide dogs or canes, enhancing safety and convenience. Furthermore, existing indoor navigation solutions from tech giants like Apple and Google often depend on costly sensor installations within buildings, which limits their scalability.

Manduchi’s system functions similarly to GPS services like Google Maps but relies on the smartphone’s inertial sensors (its accelerometers and gyroscopes) to provide accurate spoken instructions. The app tracks users’ movements against a map of the building’s interior, ensuring they stay on course. Because inertial tracking accumulates error over time, the researchers incorporated particle filtering, a technique that keeps the estimated path physically plausible and rules out impossible interpretations, such as a route that passes through a wall.
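
To illustrate the idea, here is a minimal sketch of a wall-aware particle filter for step-based indoor tracking. Everything in it (the grid floor plan, the noise parameters, and the step length) is an illustrative assumption, not the researchers’ actual implementation:

```python
import math
import random

# Hypothetical floor plan on a 1 m grid: 1 = walkable, 0 = wall (illustrative only).
FLOOR_PLAN = [
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 0, 1],
    [0, 0, 1, 1, 1],
]
CELL_SIZE = 1.0  # metres per grid cell (assumed)

def walkable(x, y):
    """True if position (x, y) in metres falls on a walkable cell of the map."""
    col, row = math.floor(x / CELL_SIZE), math.floor(y / CELL_SIZE)
    if 0 <= row < len(FLOOR_PLAN) and 0 <= col < len(FLOOR_PLAN[0]):
        return FLOOR_PLAN[row][col] == 1
    return False

class Particle:
    """One hypothesis of where the user is and which way they are facing."""
    def __init__(self, x, y, heading):
        self.x, self.y, self.heading = x, y, heading

def propagate(particles, step_length, gyro_heading):
    """Advance every particle by one detected step, with noise on stride and heading.
    Particles whose new position would pass through a wall are discarded."""
    survivors = []
    for p in particles:
        heading = gyro_heading + random.gauss(0, 0.1)   # rad, gyroscope noise (assumed)
        length = step_length + random.gauss(0, 0.05)    # m, stride variability (assumed)
        nx = p.x + length * math.cos(heading)
        ny = p.y + length * math.sin(heading)
        if walkable(nx, ny):
            survivors.append(Particle(nx, ny, heading))
    return survivors

def resample(particles, n):
    """Replenish the particle set by sampling (with replacement) from survivors."""
    return [Particle(p.x, p.y, p.heading)
            for p in (random.choice(particles) for _ in range(n))]

def estimate(particles):
    """Position estimate: the mean over the particle cloud."""
    n = len(particles)
    return (sum(p.x for p in particles) / n, sum(p.y for p in particles) / n)

# Simulate a few steps heading east along the top corridor.
particles = [Particle(0.5 + random.gauss(0, 0.2),
                      0.5 + random.gauss(0, 0.2), 0.0) for _ in range(500)]
for _ in range(3):
    moved = propagate(particles, step_length=0.7, gyro_heading=0.0)
    if not moved:   # all hypotheses hit walls; a real system would reinject particles
        break
    particles = resample(moved, 500)
print("Estimated position (m):", estimate(particles))
```

Pruning particles that cross walls is what keeps the position estimate consistent with the floor plan even as the raw inertial readings drift.
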

The second app allows users to retrace their steps, which is ideal for someone who has been guided into a room and wishes to exit independently. In addition to the inertial sensors, this app uses the phone’s magnetometer to detect magnetic field anomalies, which serve as landmarks that make the return route easier to follow.
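
As a rough illustration of how such anomalies could act as landmarks, the sketch below records field magnitudes along an outbound path and matches them on the way back. The baseline, thresholds, and readings are invented for the example; the paper does not specify these details:

```python
import math

def magnitude(bx, by, bz):
    """Total magnetic field strength (microtesla) from a 3-axis magnetometer reading."""
    return math.sqrt(bx**2 + by**2 + bz**2)

def find_anomalies(readings, baseline=50.0, threshold=8.0):
    """Flag step indices where the field deviates strongly from the ambient
    baseline (~50 uT for Earth's field). Steel doors, elevators, and wiring
    cause such deviations, which can serve as landmarks."""
    return [(i, m) for i, m in enumerate(readings)
            if abs(m - baseline) > threshold]

def match_landmark(current, landmarks, tolerance=2.0):
    """On the return trip, check whether the current reading matches a recorded
    landmark, confirming progress along the retraced route."""
    for step_index, recorded in landmarks:
        if abs(current - recorded) < tolerance:
            return step_index
    return None

# Field magnitudes (uT) logged once per detected step on the way in (made-up data):
outbound = [50.1, 49.8, 63.2, 50.3, 50.0, 41.5, 50.2]
landmarks = find_anomalies(outbound)
print("Landmarks at steps:", landmarks)            # anomalies at steps 2 and 5
print("Matched landmark at step:", match_landmark(63.0, landmarks))
```

Because these anomalies are stable features of the building itself, re-observing one on the way out corrects the drift that pure step counting would accumulate.
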

According to TechXplore, both applications communicate directions through spoken instructions, with the option to use smartwatches for vibration alerts. Manduchi emphasizes the importance of sharing navigational responsibility between technology and the user. “You need to work with the system,” he said, suggesting that users should still engage their senses and judgment during navigation.

During testing in the Baskin Engineering building at UC Santa Cruz, participants successfully navigated the facility’s hallways and turns. Looking ahead, the team plans to incorporate AI features that would allow users to take photos of their surroundings for scene descriptions, especially in challenging areas. They also aim to facilitate access to building maps, potentially through an open-source software ecosystem.

With these advancements, Manduchi and his team are poised to significantly enhance the independence and safety of blind individuals in indoor environments.