Microsoft developer Saqib Sheikh, who lost his eyesight at the age of seven, has developed an app to help the visually challenged.
Microsoft released a video at its developer conference, Build 2016, explaining how the intelligent software system, called Seeing AI, works.
In the video, Sheikh recounts how he attended a school for the blind, where he was introduced to talking computers. After joining Microsoft as a software engineer, he always wanted to develop apps that would help others.
The app can read out text from posters or menus once the user takes a photo of them, and it also guides the user in taking that photo. For instance, if the person is visiting a restaurant, the app tells them how to move the phone so that the full menu fits in the frame. Once the picture is taken, the person can command the app to read out the menu.
In the video, Sheikh says, "The app not only runs on the smartphone but on Pivothead smart camera sunglasses as well, so the person can be hands-free." The Pivothead sunglasses are a smart wearable device with a built-in camera; the wearer can take photos or videos simply by touching the side panel. Seeing AI then recognises the scene and describes it, for example as "a young man on a skateboard jumping in the air", which the wearer hears through the built-in audio feedback system.
If a blind person is talking to a group of people and wants to know their reactions, he can take a picture and the app will describe them. In the video, Sheikh takes such a picture and the app describes two people as a "40-year-old bearded male looking surprised" and a "20-year-old woman looking happy".
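To make the idea concrete, the spoken descriptions above can be imagined as a formatting step over the results of a face-analysis model. The sketch below is purely illustrative: the `FaceResult` fields and the phrasing are assumptions for this article, not the actual Seeing AI or Microsoft Face API schema.

```python
# Illustrative sketch only: FaceResult and describe_face are hypothetical
# names, not part of Seeing AI or any Microsoft API.
from dataclasses import dataclass

@dataclass
class FaceResult:
    age: int
    gender: str       # e.g. "male", "woman"
    has_beard: bool
    emotion: str      # e.g. "surprised", "happy"

def describe_face(face: FaceResult) -> str:
    """Turn one analysed face into a spoken-style description."""
    beard = "bearded " if face.has_beard else ""
    return f"{face.age}-year-old {beard}{face.gender} looking {face.emotion}"

# The two faces described in Sheikh's demo, as hypothetical model output.
faces = [
    FaceResult(age=40, gender="male", has_beard=True, emotion="surprised"),
    FaceResult(age=20, gender="woman", has_beard=False, emotion="happy"),
]

for face in faces:
    print(describe_face(face))
```

A real system would of course run a vision model first; the point here is only how raw attributes become a sentence a screen reader can speak.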
"It is an honour to share a stage with Saqib today. He took his passion and empathy to change the world. He is here to inspire and teach about these applications. Microsoft Build is all about taking the dreams we have to reality," said Microsoft CEO Satya Nadella.
Microsoft had earlier released a website, How-Old, which tried to guess a person's age from a webcam snapshot. It had also launched CaptionBot AI, a website that tries to identify the objects in a picture.
Recently, Facebook debuted a new way for visually challenged people to experience photos: the social network describes the image to the user in audio. For one example photo, the app says, "Image may contain: two people, smiling, sunglasses, sky, outdoor [sic], water".
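Under the hood, a description like the one quoted is just a list of detected concepts joined into a fixed template. The sketch below shows that assembly step under assumed names (`alt_text` and its input are illustrative, not Facebook's actual API):

```python
def alt_text(concepts):
    """Join a list of detected concepts into a Facebook-style
    automatic alt-text string. Illustrative only: the function name
    and template are assumptions, not Facebook's implementation."""
    return "Image may contain: " + ", ".join(concepts)

# Concepts a vision model might have detected for the example photo.
print(alt_text(["two people", "smiling", "sunglasses", "sky", "outdoor", "water"]))
```

The hard part, of course, is the image recognition that produces the concept list; the template itself is deliberately simple so screen readers can speak it predictably.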
Twitter has also added image support for assistive technology. Now, when you post an image, you can add a 420-character description so that differently abled people can access the information through screen readers or braille displays.