Apple wants to make all of its devices accessible to everyone, and with iOS 15 it will take one more step toward that goal. Without a doubt, there are no devices on the market more accessible to people with disabilities than the iPhone, iPad, and Apple Watch.
Apple has announced new accessibility features coming to its various products and services at the end of the year with iOS 15. These are features designed for people with mobility, vision, hearing, and cognitive disabilities.
Sarah Herrlinger, Apple's Senior Director of Global Accessibility Policy and Initiatives, said the following about it: "At Apple, we have long felt that the best technology in the world should serve everyone's needs, and our teams are working tirelessly to build accessibility into everything we do. With these new features, we are pushing the limits of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people, and we can't wait to share them with our users."
These are the new accessibility features coming with iOS 15:
Apple announces many more in its statement, which we link at the end of the article. Here we show you the most interesting ones:
AssistiveTouch for Apple Watch:
For users with limited mobility, AssistiveTouch allows you to use the watch without touching the screen or controls. Built-in motion sensors, the optical heart rate sensor, and on-device machine learning will allow the Apple Watch to detect subtle differences in muscle movement and tendon activity, letting users control an on-screen cursor through hand gestures such as a pinch or a clench.
Here is a video where Apple shows off this wonderful feature:
iPad Eye Tracking:
Later this year, iPadOS will support third-party eye-tracking devices, allowing people to control their iPad with their eyes.
Background Sounds:

The sounds around us can be distracting and cause discomfort or stress. As a show of support for neurodiversity, Apple will add new background sounds that minimize distractions and help users focus, relax, or unwind. We will be able to play balanced, bright, or dark noise in the background, as well as the sound of the ocean, rain, or a stream, to mask the noise of our surroundings. Plus, all of this mixes and integrates with other system sounds and notifications.
SignTime, contact Apple using sign language:
SignTime will allow customers to communicate with AppleCare and retail customer service using sign language. The feature launched on May 20 in the US, the UK, and France, and will reach more countries in the future.
VoiceOver improvements:

Recent updates to VoiceOver allow users to explore more details about people, text, table data, and other objects within images. VoiceOver can describe a person's position relative to other objects in a picture, and with Markup, users can add image descriptions to personalize their photos.
MFi Hearing Aid Enhancements (Made for iPhone):
Apple is introducing new support for bidirectional hearing aids, enabling hands-free phone and FaceTime conversations. Next-generation models from MFi partners will arrive later this year.
Headphones will also receive audiogram support, allowing users to personalize their audio by importing the results of their latest hearing test.
In addition, other features are on the way, which we summarize briefly below:
- Sound Actions will replace physical buttons with mouth sounds, such as a click or an "ee" sound, for users who are non-speaking and have limited mobility.
- Display settings and text size can be configured per app in each compatible app, making the screen easier to see for users with color blindness or other vision difficulties.
- New customization options for Memoji will offer greater representation by allowing users to add oxygen tubes, cochlear implants, and protective helmets.
Many of these new features are scheduled for release later this year, which suggests they will be included in iOS 15 or in one of its updates.