
A recent trip to the Detroit Institute of Arts (DIA) led a team of students from the University of Michigan Multidisciplinary Design Program (MDP) down an unexpected path in creating the campus of the future. Now three months into a partnership with Vectorform, who are serving as mentors, the team is working throughout 2017 to create an in-building pedestrian navigation system for blind and visually impaired people. The students recently presented their latest design, and I’m excited about the direction they’re taking.

The students have been hard at work validating different technologies to solve their problem, and they’ve landed on Tango, Google’s augmented reality (AR) platform. Tango was recently in the news for its use at the DIA.

The team took a trip out to the museum to see what it was all about, and came back convinced that Tango is what they need. So why would they use an AR platform for people with limited sight? One of the trickiest parts of AR is creating a digital representation of physical space, and that also happens to be the biggest hurdle of the students’ project. Tango, however, handles building this digital representation with a special mix of hardware and software. This has the added bonus of not requiring any modifications to the building itself, which makes replicating the system in other buildings easier.


The students are working with the Lenovo Phab 2 Pro, the first phone to feature Tango. The phone comes with a large battery, an infrared projector, an infrared camera, a fisheye motion-tracking camera, and the usual motion-tracking hardware such as a gyroscope and accelerometer. Tango fuses all this sensor data to provide motion tracking, area learning, and depth perception. With this digital representation of the world available, the team can focus directly on the core requirement of their project: getting someone with limited sight from point A to point B without walking into anything unexpected.
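To make that guidance idea concrete, here is a minimal sketch (mine, not the team’s code) of the kind of math a navigation layer might run on top of Tango’s motion tracking: given the device’s tracked 2D position and heading, compute how far away the next waypoint is and how much the user should turn to face it. The function name and inputs are hypothetical, chosen for illustration; Python is used here purely as a sketch language, since real Tango apps are written against Google’s Java/C APIs.

```python
import math

def guidance(pose_xy, heading_rad, target_xy):
    """Given a tracked position and heading (as motion tracking would
    supply) and a target waypoint, return (distance, turn_angle).
    turn_angle is normalized to (-pi, pi], so its sign says whether
    to turn left (positive) or right (negative)."""
    dx = target_xy[0] - pose_xy[0]
    dy = target_xy[1] - pose_xy[1]
    distance = math.hypot(dx, dy)          # straight-line distance to target
    bearing = math.atan2(dy, dx)           # world-frame direction to target
    # Wrap the difference so the suggested turn is always the minimal one.
    turn = (bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi
    return distance, turn

# Example: standing at the origin facing along +x, waypoint at (3, 4).
dist, turn = guidance((0.0, 0.0), 0.0, (3.0, 4.0))
# dist is 5.0; turn is atan2(4, 3), a left turn of about 53 degrees.
```

A real system would recompute this continuously as Tango updates the pose, and combine it with depth data to warn about obstacles along the way.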

Someone might ask, “Why use a platform that doesn’t exist on anyone’s phone?” The team believes that someday soon this type of technology will be on everyone’s phone. In time, it should become more efficient, affordable, and widespread. As we’re proving now, Tango has uses beyond just AR, although I’m sure it’ll do great with that as well.

For more information on this project, please visit the U of M MDP website, and check back for updates on Vectorform’s News & Views page.
