Microsoft announced plenty of exciting stuff at Build 2016, but nothing that moved the audience quite like the new AI tech that could potentially change the lives of visually impaired people.
Using a pair of smart sunglasses and artificially intelligent image recognition software, the Seeing AI platform enables blind people to receive audio descriptions of what’s happening in the world around them.
Written by Microsoft software engineer and Londoner Saqib Shaikh, who lost his sight at age seven, the platform can recognise activities as well as the gender, approximate age, physical characteristics and emotions of people the user may be talking to.
“When you’re talking to bigger groups, you can talk and talk and there’s no response. You think ‘is everyone listening really well or are they half asleep?’ and you never know,” Shaikh said.
“The app can describe the general age and gender of the people around me and what their emotions are. That’s incredible.”
As shown in an inspiring video during the keynote presentation at Build, the tech also lets users point the smartphone app at a restaurant menu and have the items read aloud via text-to-speech.
The idea is a collaboration between Pivothead, the maker of GoPro-like smart glasses, and Microsoft’s Cognitive Services team.
Pivothead says the app is still in development and there are no immediate plans for release. The manufacturer added that it may be built into a future iteration of its own smart glasses.
Check out the video below. Prepare to be wowed, and beware, you might need to keep a tissue handy.