Apple on Wednesday unveiled a host of new accessibility features coming later this year to Apple Watch, iPhone and iPad users. And it said one new offering, which allows people who are deaf to communicate with Apple support through sign language interpreters, will go live Thursday online and in its stores in the US, UK and France.
For people who only have the use of one arm, such as amputees, Apple has developed a way to control an Apple Watch without ever having to touch the display. The technology detects the movement of muscles and tendons in the hand and wrist to perform actions and navigate around the watch screen. It will be available with the next version of watchOS, likely arriving this fall.
For people who have issues with too much surrounding noise and stimulation, Apple plans to introduce a feature called background sounds. The noise — such as the sound of a burbling stream — plays in the background on iPhones and iPads to help reduce surrounding sounds, helping the person focus or stay calm. If you’re trying to work in a coffee shop, for instance, the sound can help drown out the loudness of the surroundings.
Apple also plans to introduce new features for people who use hearing aids, people who are blind or who have low vision and people who have other special needs. The news, timed for Thursday’s Global Accessibility Awareness Day, arrives nearly three weeks before the company’s upcoming developer conference and months before its updated mobile devices are expected to hit the market. Apple tends to keep new features quiet before unveiling them during its splashy keynotes, which makes Wednesday’s announcement rare.
“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make,” Sarah Herrlinger, Apple’s senior director of global accessibility policy and initiatives, said in a statement. “With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people.”
The US Centers for Disease Control and Prevention estimates that a quarter of Americans live with some sort of disability. In the past, people with special needs had to shell out thousands of dollars for technology that magnified their computer screens, spoke navigation directions aloud, identified their money and recognized the color of their clothes. Today, they need only smartphones, computers and a handful of apps and accessories to help them get through their physical and online worlds.
Apple has made accessibility a focus for decades. It builds features into its technology to help people with low vision navigate the iPhone’s touch screen and allow people with motor impairments to virtually tap on interface icons. In November, it introduced an iPhone 12 Pro feature that lets people who are blind detect others around them using lidar, the same technology that enables self-driving cars.
The company typically unveils new accessibility features at its big product launches, and in late 2016, it even kicked off one of its flashy product launches by talking about accessibility and showing off its new, dedicated site. Apple’s next event is expected to be its Worldwide Developers Conference from June 7 to 11.
Along with unveiling new, upcoming accessibility features, Apple said it’s launching new features, sessions and curated collections in its various services in celebration of Global Accessibility Awareness Day. Information about accessibility, recommendations and other items will appear in Apple Fitness Plus, Apple’s new virtual workout service; the App Store; Apple TV; Apple Books and Apple Maps. Today at Apple, the company’s classes for consumers, will feature live, virtual sessions in American and British Sign Language throughout the day on Thursday that teach the basics of iPhones and iPads for people with disabilities. In some regions, the company will offer more accessibility sessions in its stores through May 30.
Apple has long offered the ability for people who are deaf to set up sessions with sign language interpreters in its stores, but those had to be scheduled in advance. Starting Thursday, anyone who walks into an Apple store will be able to communicate with the staff through sign language via a new service called SignTime. Using an iPad or iPhone, Apple store employees will connect over video with remote sign language interpreters to help communicate. And you’ll be able to sign in to AppleCare support through a web browser and communicate using sign language.
Initially, SignTime will launch in the US, UK and France with American Sign Language, British Sign Language and French Sign Language, respectively. But Apple aims to expand it to other regions in the future. In the US, the service will be available from 5 a.m. to 8 p.m. PT and will appear as an option in Apple’s general contact page. Customers can access SignTime in the UK and France in stores from 8 a.m. to 8 p.m. local time.
Control an Apple Watch without touching the display
Apple’s AssistiveTouch feature for iOS devices helps people with limited dexterity simplify actions on an iPhone, such as by replacing presses of the device’s physical side buttons with onscreen controls. The feature will come to the Apple Watch in the fall, letting people control the device without using the touch screen.
Apple uses the built-in motion sensors — like the gyroscope and accelerometer — along with the optical heart rate monitor and on-device machine learning to detect subtle differences in muscle movement and tendon activity, like a finger pinch or a fist clench. Those signals let users navigate a cursor on the watch’s display through hand gestures and control the device with one hand. They’ll be able to more easily answer incoming calls, control an onscreen motion pointer and access the Apple Watch’s notification center and control center.
To activate AssistiveTouch on an Apple Watch, you enable it in settings and then clench your fist twice to turn it on. Pinching an index finger and thumb lets you navigate to a button on a screen — like the stop button for a timer — while clenching again confirms the selection. Some applications use the gestures for quick actions, like double clenching to answer a call.
You won’t be able to customize the movements to set up your own responses on the watch. Instead, there initially will be a standard set of actions that spur responses on the Apple Watch.
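Apple hasn’t published how its on-device model distinguishes gestures, but the general idea — classifying a short window of motion-sensor readings as a sustained clench versus a brief pinch — can be illustrated with a toy sketch. Every threshold and value below is invented for illustration; real systems use trained machine learning models on richer sensor data.

```python
# Toy sketch: classify a window of wrist accelerometer magnitudes as a
# "clench" (sustained movement) or a "pinch" (brief spike). The 1.2
# threshold and 0.5 activity fraction are invented for illustration.

def classify_gesture(window):
    """window: list of accelerometer magnitudes sampled over ~0.5 s."""
    peak = max(window)
    # Fraction of samples showing significant movement.
    active = sum(1 for s in window if s > 1.2) / len(window)
    if peak > 1.2 and active > 0.5:
        return "clench"      # movement sustained across the window
    if peak > 1.2:
        return "pinch"       # only a short spike of movement
    return "none"

clench = [1.5] * 40 + [1.4] * 10   # sustained movement
pinch = [1.0] * 45 + [1.8] * 5     # brief spike near the end
rest = [1.0] * 50                  # no gesture
print(classify_gesture(clench))    # clench
print(classify_gesture(pinch))     # pinch
print(classify_gesture(rest))      # none
```

The point of the sketch is only the shape of the problem: the same raw signal stream must be mapped to a small, fixed vocabulary of gestures, matching the standard (non-customizable) action set Apple describes.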
Background sounds to tune out noise
For some people, surrounding noises can be distracting or overwhelming, which prompted Apple to introduce background sounds for the neurodiverse community. The aim is to minimize distractions and help the person using it focus or rest. The sounds include “balanced, bright or dark noise, as well as ocean, rain, or stream sounds” that continuously play in the background to mask unwanted environmental or external noise. The sounds mix into or duck under other audio and system sounds — they can even play underneath apps like Apple Music.
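Apple hasn’t said how it generates these sounds, but “dark” noise is conventionally understood as noise with its energy shifted toward low frequencies. A minimal, purely illustrative sketch: run white noise through a simple one-pole low-pass filter, which smooths out the rapid fluctuations and leaves a deeper, rumbling character. The filter coefficient here is an arbitrary choice.

```python
# Toy sketch: "dark" noise as low-pass-filtered white noise.
# alpha controls how heavily high frequencies are smoothed away;
# the value 0.1 is illustrative, not anything Apple has published.
import random

def dark_noise(n, alpha=0.1, seed=42):
    random.seed(seed)
    samples, prev = [], 0.0
    for _ in range(n):
        white = random.uniform(-1.0, 1.0)
        prev = prev + alpha * (white - prev)   # one-pole low-pass filter
        samples.append(prev)
    return samples

noise = dark_noise(1000)
# Smoothing limits sample-to-sample jumps compared with raw white noise,
# which is what gives the output its darker, less hissy character.
```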
Eye tracking through third-party iPad accessories
People who have diseases like amyotrophic lateral sclerosis, better known as ALS or Lou Gehrig’s disease, lose the ability to control their muscles and instead rely on their eyes to communicate. In the upcoming version of iPadOS, the tablet will support third-party eye-tracking devices that make it possible for people to control their iPad using just their eyes.
Starting later this year, compatible devices will be able to track where a person is looking on-screen, and the pointer will move to follow the person’s gaze. Extended eye contact performs an action, like a tap.
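The “extended eye contact” behavior described above is a standard assistive-technology pattern called dwell selection: if the gaze stays near one spot long enough, the system treats it as a tap. A toy sketch of that logic, with an invented dwell time and radius:

```python
# Toy sketch of dwell selection: if the gaze point stays within a small
# radius of the same spot for a set duration, treat it as a tap.
# DWELL_SECONDS and RADIUS are invented values for illustration.
import math

DWELL_SECONDS = 1.0
RADIUS = 20.0  # pixels

def find_dwell_tap(samples):
    """samples: time-ordered list of (timestamp, x, y) gaze points.
    Returns the (x, y) of the first dwell-triggered tap, or None."""
    anchor = None
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > RADIUS:
            anchor = (t, x, y)        # gaze moved: restart the dwell timer
        elif t - t0 >= DWELL_SECONDS:
            return (x0, y0)           # gaze held long enough: tap here
    return None

# Gaze wanders, then settles near (100, 200) for over a second.
gaze = [(0.0, 300, 50), (0.2, 100, 200), (0.6, 104, 198), (1.3, 99, 203)]
print(find_dwell_tap(gaze))  # (100, 200)
```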
iPhone-compatible hearing aids that include mics
Apple’s “Made for iPhone” Hearing Aid program uses Bluetooth and a special protocol technology from Apple to make it easy for people to hear audio from their iPhones and iPads. You can stream audio directly to your ears, much as you would with regular wireless Bluetooth headphones such as Apple’s AirPods. But responding to audio, like talking to someone during a FaceTime video call, has required you to hold an iPhone up to your mouth to use the smartphone’s microphone.
New Made for iPhone hearing aids will now come with mics to let people carry on phone and FaceTime conversations hands-free, as they would while wearing AirPods or other earbuds. Some of these new hearing aids will likely arrive in the fall.
Apple is also bringing support for recognizing audiograms, charts that show the results of a hearing test, to Headphone Accommodations, which launched as part of iOS 14. Headphone Accommodations lets you adjust the frequencies of audio streamed through your AirPods Pro, second-gen AirPods, select Beats headphones and EarPods. Each individual can customize the settings for what’s right for them, either dampening or amplifying particular sounds. With the update coming later this year, you’ll be able to quickly customize audio with your latest hearing test results imported from a paper or PDF audiogram.
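Apple hasn’t detailed how Headphone Accommodations maps an audiogram to audio adjustments, but a classic audiology heuristic, the half-gain rule, illustrates the general idea: boost each frequency band by roughly half the measured hearing loss at that frequency. The sketch below uses that textbook rule, not Apple’s actual algorithm.

```python
# Toy sketch: derive per-band amplification from audiogram thresholds
# using the half-gain rule (boost each band by half the hearing loss
# in dB). This is a textbook heuristic, not Apple's implementation.

def gains_from_audiogram(thresholds_db):
    """thresholds_db: {frequency_hz: hearing threshold in dB HL}.
    Returns {frequency_hz: suggested gain in dB}, never negative."""
    return {freq: max(0.0, loss / 2.0) for freq, loss in thresholds_db.items()}

# Example audiogram with mild high-frequency hearing loss.
audiogram = {250: 10, 500: 15, 1000: 20, 2000: 40, 4000: 60, 8000: 70}
print(gains_from_audiogram(audiogram))
# {250: 5.0, 500: 7.5, 1000: 10.0, 2000: 20.0, 4000: 30.0, 8000: 35.0}
```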
Better VoiceOver descriptions of photos
Apple’s VoiceOver screen-reading technology has been one of its most popular accessibility features. The technology speaks descriptions aloud to people who are blind or who have low vision, telling them what’s on the screen or reading text to them. With last year’s iOS 14, VoiceOver gained the ability to better describe photos. With the next version of Apple’s mobile software, VoiceOver will get even more descriptive thanks to artificial intelligence. Instead of saying that a photo is of a car, it can say where the car is located in the image.
If a person is tagged in the photo library (like “Dad”), VoiceOver will say something like “Dad is wearing a plaid shirt, has red hair and a beard,” rather than “there’s a man in the photo.” VoiceOver also works better with written receipts and nutrition labels: instead of running all of the text together and speaking it quickly, it will break the information down like a table — by row and column, complete with table headers. VoiceOver will also be able to describe a person’s position along with other objects within images, and Markup will let people add their own image descriptions to personalize photos.
New Memoji features
Apple plans to introduce new Memoji customizations later this year that better depict people with oxygen tubes, cochlear implants and a soft helmet for headwear.
Customizable display and text size
People who are colorblind or have other vision challenges will be able to customize the display and text size to make the screen easier to see. You’ll be able to customize the settings on an app-by-app basis for all supported apps.
Sound Actions for Switch Control
Switch Control helps people with motor difficulties better navigate their Macs by letting them click a switch to do tasks like enter text, choose menus and move the pointer. Keyboard keys, mouse buttons, trackpad buttons, joysticks and adaptive devices can all be used as switches. The new Sound Actions for Switch Control will let Mac owners who are nonspeaking and have limited mobility replace physical buttons and switches with mouth sounds, such as a click, pop, or “ee” sound.

News Source: MSN