In keeping with its long-standing commitment to accessibility, Apple on Thursday unveiled a redesigned Apple.com/accessibility site to make everyone aware of ways to personalise their Apple products to work the way they do. As part of this relaunch, coinciding with International Day of Persons with Disabilities on December 3, Apple will also release a series of videos explaining how to use these accessibility features, some of which have been enhanced on the latest iPhone 12 series devices.
The updated site is organised into four sections, Vision, Mobility, Hearing, and Cognitive, making it easier for users to find the features they need. The site lists helpful tips about the scores of accessibility features built into Apple devices, including those that might not be immediately apparent to users with disabilities.
iPhones pack a range of accessibility features, including the Magnifier app that lets users see everything better and closer, with the exact colours they want. With the LiDAR scanner on the new iPhone 12 Pro series, the app's People Detection feature can tell users how far away a person is, especially useful if a visually challenged person is waiting in a queue or entering a store. This feature uses the best of what the phone has to offer, tapping into its range of sensors and the power of the Neural Engine.
For some time now, iPhones have supported Back Tap, which lets users quickly trigger actions or accessibility shortcuts by tapping the back of the phone. Users can customise what a double or triple tap does for them.
Among the more popular accessibility features is VoiceOver, which tells users what they are touching on the screen, helping with navigation and reading out emails and messages. Now, harnessing the power of AI, the feature lets users virtually see what the iPhone camera is seeing. Within a fraction of a second, the camera can describe what is in front of it, such as a sky, a sidewalk, trees and people, making it a very important tool for those with visual disabilities.
Interestingly, Apple has made it so easy to switch on accessibility features on the iPhone that a new user just needs to triple-click the Home button to turn VoiceOver on. This simplicity is built into every iPhone when it comes to accessibility, so there is no friction in switching these features on.
Then there is Voice Control, an Apple technology that helps those with severe physical motor limitations or delays control their Mac, iPhone, and iPad entirely with their voice. Even a selfie can be taken with just voice commands using this feature.
As one of the few companies that treats accessibility as a core value, Apple tries to ensure its devices are accessible from the very early stages of the design process. This means projects are not even started until their accessibility has been evaluated. Many features are also driven by user feedback and are integrated even if the user base is small, helping ensure that everyone can use the devices the way they want. This is why many of the accessibility features on Apple devices feel natural rather than something added as an afterthought.