Apple has introduced several features to help people with physical impairments use its products, with software updates arriving by the end of the year. The most intriguing of these is AssistiveTouch for Apple Watch, which lets users with limb differences navigate the watch’s interface. New accessibility features are also coming to iPhone and iPad. In addition, Apple has announced SignTime, a service that provides a sign language interpreter for customers interacting with AppleCare and retail customer service.
With AssistiveTouch on watchOS, Apple Watch owners can move a cursor on the screen with a series of hand gestures, such as pinching or clenching. Apple says the Watch’s gyroscope, accelerometer, and optical heart rate sensor, combined with on-device machine learning, detect subtle differences in muscle movement and tendon activity.
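Apple has not published details of its gesture model, but the pipeline described here, sensor streams reduced to features and fed to an on-device classifier, can be sketched roughly as follows. The feature choices, thresholds, and rule-based "classifier" below are purely illustrative stand-ins for the real machine-learning model.

```python
# Illustrative sketch of wrist-gesture detection from sensor windows.
# All feature names and thresholds are invented for demonstration;
# Apple's actual sensor fusion and ML model are not public.

def extract_features(accel_window, hr_window):
    """Reduce raw sensor windows to a few summary features."""
    mean_accel = sum(accel_window) / len(accel_window)
    peak_accel = max(accel_window)
    hr_delta = max(hr_window) - min(hr_window)  # optical-sensor variation
    return mean_accel, peak_accel, hr_delta

def classify_gesture(accel_window, hr_window):
    """Map sensor features to a coarse gesture label (toy rules
    standing in for an on-device machine-learning classifier)."""
    _, peak_accel, hr_delta = extract_features(accel_window, hr_window)
    if peak_accel > 2.0 and hr_delta > 5:
        return "clench"  # large, sustained muscle/tendon activity
    if peak_accel > 1.2:
        return "pinch"   # brief, smaller movement
    return "none"

# Example: a short burst of movement reads as a pinch.
print(classify_gesture([0.1, 1.5, 0.2], [70, 71, 72]))  # pinch
```

In a real system the windows would be short, overlapping slices of continuous sensor streams, and the classifier would be a trained model rather than hand-written thresholds.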
AssistiveTouch’s new gesture controls will make it easier for people with limb differences to use an Apple Watch without touching the display or rotating the Digital Crown, whether to answer calls, control the on-screen motion pointer, or open Notification Centre or Control Centre. Apple has not yet specified which Apple Watch models will support the updated features.
iPadOS will add support for third-party eye-tracking devices, allowing users to command an iPad with their eyes instead of their hands, much like the Apple Watch’s gesture controls. According to Apple, MFi (Made for iPad) devices will be able to detect the user’s gaze and adjust the cursor’s position to follow it. This method will allow users to tap and perform other actions on the iPad without touching the screen.
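Apple has not described how a gaze that rests on a control is translated into a tap, but dwell-to-select, the common approach in eye-tracking interfaces, works roughly as sketched below. The one-second dwell time and 30-point radius are assumptions for illustration, not documented values.

```python
# Illustrative dwell-to-select logic for an eye-tracking pointer.
# A "tap" fires once the gaze has stayed within a small radius for
# long enough; the dwell time and radius here are assumptions.

import math

DWELL_SECONDS = 1.0   # how long the gaze must hold still
DWELL_RADIUS = 30.0   # how far (in points) it may wander

def detect_tap(samples):
    """samples: list of (timestamp, x, y) gaze points in time order.
    Returns the (x, y) of the first dwell-triggered tap, or None."""
    start = 0
    for t, x, y in samples:
        # Advance the window start while the gaze has drifted too far
        # from where the current dwell began.
        while math.hypot(x - samples[start][1], y - samples[start][2]) > DWELL_RADIUS:
            start += 1
        if t - samples[start][0] >= DWELL_SECONDS:
            return (x, y)
    return None

# A steady gaze near (100, 100) for just over a second triggers a tap.
print(detect_tap([(0, 100, 100), (0.5, 105, 102), (1.1, 103, 101)]))
```

Production eye-tracking systems layer smoothing, calibration, and visual dwell-progress feedback on top of this basic idea.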
Apple is also enhancing VoiceOver, its built-in screen reader, so that users can explore images in greater detail, including any text, table data, and other objects they contain. With Markup, users can also add their own captions to photos for a more personal touch.
For neurodiverse users and anyone easily distracted by everyday sounds, Apple is adding background sounds: balanced, bright, and dark noise, as well as ocean, rain, and stream sounds. Apple says these will “help users focus, stay calm, or rest.”
For users who cannot speak or who have limited mobility, Apple is introducing mouth sounds, such as a click, pop, or “ee” sound, as replacements for physical buttons and switches. Users will also be able to adjust the font and text size separately for each app. New, user-tailored Memoji options are also on the horizon, including oxygen tanks, cochlear implants, and a soft helmet.
In addition to the major software updates, Apple is expanding its MFi (Made for iPhone) hearing devices program to include support for bidirectional hearing aids. Next-generation models from MFi partners will be available later in the year.
Apple is also adding support for audiograms, the charts that display the results of a hearing test, to its Headphone Accommodations feature. Users can upload their hearing test results to Headphone Accommodations, which will boost quiet sounds and adjust the balance of different frequencies to suit their individual needs.
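Apple has not published how Headphone Accommodations converts an audiogram into audio adjustments, but the underlying idea, giving more gain to the frequency bands where hearing thresholds are worse, can be sketched with the classic half-gain fitting heuristic. That rule and the sample thresholds below are illustrative, not Apple’s actual algorithm.

```python
# Illustrative per-band amplification derived from an audiogram.
# An audiogram records hearing thresholds in dB HL per frequency;
# bands with higher thresholds (worse hearing) receive more gain.
# The half-gain rule is a classic simple fitting heuristic, used
# here only as a stand-in for Apple's undisclosed algorithm.

def gains_from_audiogram(audiogram):
    """audiogram: dict mapping frequency (Hz) -> threshold (dB HL).
    Returns dict mapping frequency -> gain (dB), half the threshold,
    floored at zero so normal-hearing bands are left untouched."""
    return {freq: max(0.0, threshold / 2.0)
            for freq, threshold in audiogram.items()}

# Mild high-frequency loss: more boost where thresholds are higher.
audiogram = {250: 10, 1000: 20, 4000: 40, 8000: 50}
print(gains_from_audiogram(audiogram))
# {250: 5.0, 1000: 10.0, 4000: 20.0, 8000: 25.0}
```

A real fitting would also compress loud sounds rather than apply flat gain, so that amplified audio stays within a comfortable range.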
Apple has not said when customers can expect the updates to arrive. There is little doubt, however, that next month’s Apple Worldwide Developers Conference (WWDC) will reveal at least some of the specifics.
SignTime, an online service that supports American Sign Language (ASL), British Sign Language (BSL), and French Sign Language (LSF) for communicating with AppleCare and retail customer care, is also set to launch soon. Customers visiting an Apple Store in person will also be able to use it to connect with a remote sign language interpreter without booking an appointment in advance.