Today, we celebrate Global Accessibility Awareness Day by reflecting on Meta's ongoing commitment to fostering a more accessible future. Building and enhancing accessibility features is integral to creating meaningful impact for everyone, and in this article I am proud to share some of our latest advancements aimed at improving accessibility for all users.
The introduction of Ray-Ban Meta glasses marks a significant step in providing a hands-free solution for navigating daily life, particularly benefiting the blind and low-vision community. These glasses integrate Meta AI features, enabling users to capture and share photos, send text or voice messages, make phone calls, take video calls, listen to music, and even translate speech in real time. This technology empowers individuals to interact with their environment and stay connected with loved ones.
Since the launch of Ray-Ban Meta, users have shared millions of moments, showcasing the various ways these glasses enhance connectivity. Today, we are excited to announce a new customization feature for Meta AI, allowing it to provide detailed responses based on the user's environment. This enhancement will start rolling out to users in the U.S. and Canada in the coming weeks, with plans for broader availability in the future. Users can easily enable this feature in the Device settings section of the Meta AI app by toggling on detailed responses under Accessibility.
Additionally, our Call a Volunteer feature, developed in collaboration with Be My Eyes, will launch later this month in all 18 countries where Meta AI is available. This innovative feature connects blind and low-vision individuals with a network of sighted volunteers in real time, assisting them in completing everyday tasks with ease.
Our commitment to accessibility extends to the development of wristband devices designed to enhance human-computer interaction (HCI) for individuals with diverse physical abilities. We are actively researching the potential of sEMG (surface electromyography) wristbands, which use muscle signals as a means of input. These wristbands are particularly promising for users who may have limited mobility due to conditions such as spinal cord injuries or tremors.
The sEMG wristband technology used in our Orion AR glasses prototype represents our latest advancement in this field. By investing in collaborative research focused on accessibility use cases, we have made significant progress. Recently, we completed data collection with a Clinical Research Organization (CRO) to evaluate how individuals with hand tremors can use sEMG models for computer controls, such as swiping and clicking. We are also collaborating with Carnegie Mellon University to enable people with hand paralysis to use sEMG-based controls for HCI, so they can communicate effectively from their very first use of the system.
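Meta has not published the implementation details of these sEMG models, but conceptually such a system maps short windows of muscle-signal samples to discrete input events. The following is a minimal sketch of that idea using a simple energy threshold; the thresholds, event names, and simulated signals are all hypothetical stand-ins for what would, in practice, be a learned model calibrated to the wearer:

```python
import math

def rms(window):
    """Root-mean-square energy of one window of sEMG samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def classify_window(window, click_threshold=0.5, swipe_threshold=1.5):
    """Map a window's energy to a hypothetical input event.

    Thresholds here are illustrative, not real calibration values.
    """
    energy = rms(window)
    if energy >= swipe_threshold:
        return "swipe"
    if energy >= click_threshold:
        return "click"
    return None  # rest: no intentional gesture detected

# Simulated signal windows: rest, a brief contraction, a stronger one.
windows = [
    [0.01, -0.02, 0.03, 0.01],
    [0.6, -0.7, 0.8, -0.6],
    [1.9, -2.0, 1.8, -1.7],
]
print([classify_window(w) for w in windows])  # [None, 'click', 'swipe']
```

A real system would replace the threshold rule with a personalized classifier, which is precisely why research with people who have tremors or hand paralysis matters: their signal patterns differ from those of the typical training population.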
We are dedicated to making the metaverse more accessible by integrating live captions and live speech capabilities into our extended reality products. Live captions convert spoken words into text in real time, allowing users to read along as content is delivered. This feature is available across various platforms, including the Quest system, Meta Horizon calls, and Meta Horizon Worlds.
Our live speech feature transforms text into synthetic audio, providing an alternative communication method for those who may find verbal interactions challenging or prefer not to use their voice. Since its launch, we have seen high user retention, prompting us to roll out enhancements like personalized saved messages.
Moreover, our Llama collection of open-source AI models is being used to enhance accessibility. Developers at Sign-Speak have integrated their API with Llama to create a WhatsApp chatbot that translates American Sign Language (ASL). This software enables Deaf individuals to sign ASL into a device, which then translates it into English text for hearing users. In turn, hearing users can communicate via voice or text, and the software translates their messages into ASL through an avatar for Deaf individuals.
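Sign-Speak's actual API is not reproduced here, but the two-way flow just described can be sketched with stand-in translation functions. Everything below is a placeholder: the function names are invented, and the stubs simulate what would really be calls to an ASL-recognition model and an avatar renderer:

```python
def asl_video_to_english(video_frames):
    """Stub for an ASL-recognition model call (placeholder logic)."""
    known = {"frames:greeting": "Hello, how are you?"}
    return known.get(video_frames, "[unrecognized signing]")

def english_to_asl_avatar(text):
    """Stub for text-to-ASL avatar rendering (placeholder logic)."""
    return f"<avatar signs: {text}>"

def relay_from_deaf_user(video_frames):
    # Deaf user signs into the camera; hearing user receives English text.
    return asl_video_to_english(video_frames)

def relay_from_hearing_user(text):
    # Hearing user types or speaks; Deaf user sees an avatar signing it.
    return english_to_asl_avatar(text)

print(relay_from_deaf_user("frames:greeting"))       # Hello, how are you?
print(relay_from_hearing_user("I'm well, thanks!"))  # <avatar signs: I'm well, thanks!>
```

The key design point is symmetry: each direction of the conversation has its own translation step, so neither participant has to leave their preferred mode of communication.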
At Meta, we are steadfast in our commitment to invest in features and products that simplify communication and connection for all users. We will continue to evolve and adapt our offerings to meet the diverse needs of the billions of individuals who rely on our technology worldwide. Together, we can create a more inclusive and accessible future.