Microsoft empowering people with disabilities via Ability Summit

Microsoft wants to empower people with disabilities. So far, though, no Surface product Microsoft has released has fully allowed people with disabilities to feel at home. The latest Surface tablet can track the eye movements of people with ALS, but software engineer Jay Beavers wants to go beyond what has already been done to make Microsoft products easier for people with disabilities to use.

Microsoft Ability Summit

“If you look at the technology for people with ALS, there’s more we can do,” Beavers said.  “Our ultimate goal is to empower people with ALS to do more — talk more easily, play with their kids and move their wheelchairs independently.”

This week, Microsoft is holding its Ability Summit, where engineers from within the company, both with and without disabilities, come together with participants from outside. The goal is to create the next generation of technology that empowers people with disabilities.

“We think about how we imagine stuff, build amazing new products and services, and enable people with disabilities to do more and build more inclusive workplaces,” according to Jenny Lay-Flurrie, senior director of the Trusted Experience. “We have an opportunity at Microsoft to empower the world.”

The summit also includes a hackathon, in which several teams build accessibility projects from early in the day until the late hours of the night. From what we understand, the best performers will be given an opportunity at Microsoft to help the company grow its talent pool and insight.

The summit also promotes diversity and shows that Microsoft is a company with an excellent working environment for everyone.

Last year’s hackathon was won by Beavers’ team. We understand the team found that people with disabilities would rather be able to communicate more easily with loved ones than have improved mobility, which means the focus at this year’s hackathon could be all about communication.

The team is working on a device that allows people with ALS to type just by looking. As the user looks at the keys on an on-screen keyboard, the software tracks their eye movements to determine where they are looking, then automatically selects the corresponding letter.
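Microsoft hasn’t published how its software works, but gaze-typing systems like the one described typically use dwell-time selection: a letter is typed once the gaze rests on a key long enough. The sketch below is a minimal, hypothetical illustration of that idea (the `Key` layout, `DwellTyper` class, and 0.8-second threshold are our own assumptions, not Microsoft’s implementation):

```python
from dataclasses import dataclass

# Hypothetical on-screen keyboard: each key is an axis-aligned rectangle.
@dataclass
class Key:
    char: str
    x: float   # left edge
    y: float   # top edge
    w: float   # width
    h: float   # height

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h


class DwellTyper:
    """Emit a character once the gaze rests on one key for `dwell_s` seconds."""

    def __init__(self, keys, dwell_s=0.8):
        self.keys = keys
        self.dwell_s = dwell_s
        self._current = None   # key currently under the gaze, if any
        self._since = None     # timestamp when the gaze entered that key

    def feed(self, t: float, gx: float, gy: float):
        """Process one gaze sample (time, x, y); return a typed char or None."""
        key = next((k for k in self.keys if k.contains(gx, gy)), None)
        if key is not self._current:
            # Gaze moved to a different key (or off the keyboard): restart dwell.
            self._current, self._since = key, t
            return None
        if key is not None and t - self._since >= self.dwell_s:
            self._since = t  # reset so the same key can repeat after another dwell
            return key.char
        return None


# Usage: the gaze lingers on the "a" key for 0.9 s, long enough to type it.
keys = [Key("a", 0, 0, 50, 50), Key("b", 50, 0, 50, 50)]
typer = DwellTyper(keys, dwell_s=0.8)
typed = [c for t in (0.0, 0.4, 0.9) if (c := typer.feed(t, 25, 25))]
print("".join(typed))  # → "a"
```

A real system would add smoothing of the noisy gaze signal and word prediction on top of this loop, but the dwell threshold is the core trade-off: shorter dwell means faster typing, longer dwell means fewer accidental selections.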

We can’t wait to see this device in action, as it could actually become a game changer.

Vamien McKalin possesses the awesome power of walking on water like a boss. He's also a person who enjoys writing about technology, comics, video games, and anything related to the geek world.