At its 14th Ability Summit, which begins today, Microsoft is highlighting developments and collaborations across its portfolio of assistive products. Much of the news centers on Azure AI, including features announced yesterday such as AI-powered audio descriptions and Azure AI Studio, which makes it easier for developers with disabilities to build machine learning applications. The company also showed updates such as more languages and richer AI-generated descriptions for its Seeing AI tool, along with new playbooks offering best-practice guidance in areas such as building accessible campuses and better supporting mental health.

The company also previewed a feature called “Speak For Me,” coming later this year. Like Apple’s Personal Voice, Speak For Me can help people with ALS and other speech disabilities communicate using personalized neural voices. Work on the project has been ongoing “for some time” with partners such as the ALS nonprofit Team Gleason, and Microsoft said it is “committed to ensuring this technology is used for good and plans to launch later this year.” The company also shared that it is working with Answer ALS and the ALS Therapy Development Institute (TDI) to “nearly double the clinical and genomic data available for research.”

One of the most significant accessibility updates coming this month is that Copilot will gain new accessibility skills, letting users ask the assistant to launch Live Captions and Narrator, among other assistive tools. The Accessibility Assistant feature announced last year becomes available today in Insider Preview for Microsoft 365 apps like Word, with the company saying it’s coming “soon” to Outlook and PowerPoint. Microsoft also published four new playbooks today, including a mental health toolkit that covers “advice for product makers to build experiences that support mental health conditions,” created in partnership with Mental Health America.

Ahead of the summit, the company’s chief accessibility officer, Jenny Lay-Flurrie, spoke with Engadget to share more insight into the news, as well as her thoughts on the role of generative AI in building assistive products.

“In many ways, AI is not new,” she said, adding that “this chapter is new.” Generative AI may be all the rage right now, but Lay-Flurrie believes the core principle her team relies on hasn’t changed. “Responsible AI is accessible AI,” she said.

Still, generative AI can bring real benefits. “However, this chapter unlocks some potential opportunities for the accessibility industry and for people with disabilities to be more productive and use technology to fuel their day,” she said. She pointed to a survey the company ran with the neurodiverse community around Microsoft 365 Copilot; the response from the several hundred people who took part was that “it cuts down on the time for me to create content and shortens the gap between thought and action,” Lay-Flurrie said.

The idea of being responsible while embracing new technological trends in designing for accessibility is never far from Lay-Flurrie’s mind. “We still have to be very principled, careful, and if we hold back, it’s to make sure we protect those basic accessibility rights.”

Elsewhere at the summit, Microsoft is featuring guest speakers such as actor Michelle Williams and its own employee Katie Jo Wright, who will discuss mental health and living with chronic Lyme disease, respectively. The Rijksmuseum in Amsterdam will also share how it uses Azure AI computer vision and generative AI to provide image descriptions of more than a million artworks for visitors who are blind or have low vision.

