Working with ALS – Insights from the Ability Summit


A few weeks ago I attended the 14th annual Ability Summit, a global event hosted by Microsoft that presents the latest technology innovations and best practices for accessibility and inclusion. The event has three main session tracks: Imagine, Build, and Include, each examining a different aspect of how technology can empower people with disabilities and make the world more inclusive. The event is free, anyone can register online to attend, and all sessions are recorded so they can be watched on demand at any time.

Ability Summit 2024 Highlights

As we think about our enduring commitment and goal at Microsoft, which is to build that culture of accessibility and embed it into everything we do, grounded always by the insights of people with disabilities. – Jenny Lay-Flurrie

In the first keynote, Microsoft CEO Satya Nadella and Chief Accessibility Officer Jenny Lay-Flurrie talked about how AI can remove obstacles and create more accessible experiences, while also addressing the challenges and concerns of responsible AI. The keynote showed several examples of how AI can help people with disabilities, such as voice banking for people with ALS, descriptive audio for people with low vision, and Copilot for people with diverse speech patterns. It was very impressive to see Team Gleason featured as a partner with Microsoft to work on AI to help the ALS community preserve their voice.

Team Gleason and Microsoft Team Up to Give a Person with ALS His Voice Back

As a platform company, we have to absolutely lean into that and make sure that everything we’re doing, whether it’s Copilot and Copilot Extensibility or the Copilot stack in Azure is all ultimately helping our ISVs, our customers, our partners, all achieve their own goals around innovation, around accessibility. – Satya Nadella

Build Session: Bridging the Disability Divide with AI

The conference had many sessions and keynotes, but this one on the disability divide and AI stood out to me. Here are the three main points I took away from it: 1) how people with disabilities are benefiting from AI in their personal and professional lives; 2) advice on how to begin and advance an AI journey with accessibility as a priority; and 3) the significance of accessibility as a core value when developing technologies that empower everyone.

The session also pointed to resources and opportunities for learning more about AI and accessibility: the Accessibility Bot, a chatbot that answers accessibility questions about Microsoft’s products and services; AI Studio, a platform for exploring and building AI applications using various cognitive services and SDKs; and the AI Platform Team, a group of developers and researchers working to make AI more accessible and inclusive.

In Real Life

I belong to the ALS community (I have ALS), and I rely on a lot of accessible technology, both hardware and software, to accomplish my work. To write this blog post, I used a combination of Voice Access in Windows 11, a Stream Deck foot pedal, a foot-operated joystick on my wheelchair, and Microsoft 365 Copilot. Voice Access handles dictation and specific commands such as selecting paragraphs or capitalizing words. The Stream Deck lets me perform backspace and delete actions. The foot-operated joystick acts as a mouse. Copilot assists me with summarizing and rewriting content. As you can tell, we need a whole set of tools to suit our needs; no single tool or method works for all of us. I’m excited to see how AI will enhance accessibility for everyone, and my goal is to keep sharing the tools and techniques I use to live and work with ALS through my blog and YouTube channel.

Original post

