In Microsoft Teams, we believe that communication should be inclusive, seamless, and empowering for everyone. As part of that commitment, we are excited to announce a new feature called Sign Language Mode - a step forward in bridging communication and collaboration gaps and creating more inclusive meeting experiences for sign language users.

Developed in collaboration with our internal Deaf and hard-of-hearing (D/HH) community, Sign Language Mode is designed to support the unique needs of signers and their interpreters in Teams meetings. It introduces role-aware capabilities that make signers easier to find on the meeting stage and improve video quality for those using sign language. Sign Language Mode also uses a new, state-of-the-art sign language detection model to ensure that D/HH participants have the same presence as their hearing counterparts while they are signing.
What’s New in Sign Language Mode
- Self-identification - D/HH signers and interpreters can now identify their roles in meeting settings. This allows Teams to tailor the meeting experience to sign language–related scenarios, such as the video and visibility enhancements described below.
- Improved video quality - Participants who identify as sign language users will automatically receive enhanced video fidelity, ensuring that their signing is clearer and easier to follow throughout the meeting.
- Interpreter visibility - Interpreters are now easier to locate on the meeting stage and in the participant roster, helping D/HH participants discover and stay visually connected with their interpreters.
- Automatic sign language detection - Our new sign language detection model recognizes when someone is actively signing and automatically elevates them to active speaker status, complete with a border around their video tile, just as it would for someone speaking through audio. This ensures that the D/HH signer can be seen by everyone while they are actively participating in the meeting.
Using Sign Language Mode
To learn how to enable and use Sign Language Mode in Microsoft Teams, including step-by-step instructions and feature details, please visit our official support article: Use Sign Language Mode in Microsoft Teams Meetings.
Behind the Scenes of the Sign Language Detection Feature
A new post on the Microsoft Design blog highlights how Deaf leadership, inclusive design, and cross-disciplinary collaboration shaped the development of sign language detection in Microsoft Teams. Read the full story to see how this work came together and what it means for the future of accessible communication: Centering Sign Language in AI and design: A Deaf-led approach to making Sign Language a core principle of inclusive design.
What’s Next
We will continue to evolve Sign Language Mode over the coming months, focusing on improving how D/HH signers and their sign language interpreters appear, participate, and are represented throughout the Teams meeting experience.
Microsoft is committed to delivering the best possible experience for all customers. If you have a disability or need assistance with accessibility features in any Microsoft product, please contact the Microsoft Disability Answer Desk. This dedicated support team is trained to work with many popular assistive technologies and offers assistance in English, Spanish, French, and American Sign Language (ASL) via direct video.