A video was posted as part of Apple's Worldwide Developers Conference that discusses a few new accessibility features in the soon-to-be-released Apple operating systems. A summary of the video is provided below.
OS X and iOS have had Switch Control for a couple of years, and it is now coming to tvOS as well. There is also an on-screen remote control, which will allow switch users to perform remote-style functions.
Apple has added the ability to change the entire color of the display. An example was provided visually, but no auditory information accompanied it. It appears to be related to contrast, as the presenter mentioned it helping to address the difficulty that black-on-white text presents for people with certain types of low vision.
Feel the Time!
In watchOS 3, a feature called Taptic Time will let you tell the time through a series of vibrations delivered by the Taptic Engine. From the brief description provided, it sounds similar to what the TimeBuzz app does now.
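The video does not explain how the vibrations encode the time. Purely as an illustration of the general idea, here is one hypothetical scheme; the function name and the long/short tap encoding are assumptions, not Apple's actual design:

```python
def taptic_time(hour: int, minute: int) -> list:
    """Encode a time as a sequence of long and short taps.

    Hypothetical scheme: one long tap per ten units, one short tap
    per remaining unit, with a pause separating hours from minutes.
    """
    def encode(value):
        tens, ones = divmod(value, 10)
        return ["long"] * tens + ["short"] * ones

    return encode(hour) + ["pause"] + encode(minute)

# 3:42 -> three short taps, a pause, then four long and two short taps
print(taptic_time(3, 42))
```

Apps such as TimeBuzz use encodings along these lines; whatever scheme Apple has chosen for Taptic Time may differ.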
With iOS, Apple is introducing a new feature called Magnifier, which will allow the user to magnify objects in their physical environment using the camera. Once enabled, it can be toggled by triple-clicking the Home button. Magnifier will also support many visual filters, which can be set using the filters already built into the Accessibility menu in Settings on the iDevice.
A software version of TTY will be built into iOS 10 for users who are deaf or hard of hearing. Follow the link provided for a definition of TTY if you are interested. According to the presentation, the built-in TTY software will allow the user to place relay calls through their carrier, and will also let a deaf or hard of hearing user contact other TTY users directly. It's unclear whether this feature will be accessible with braille.
Enhancements have been made to Speak Selection and typing feedback, providing more audio feedback to the user. When there is a pause in typing, the iOS device will speak the last character typed. You will also be able to customize the verbosity so that it speaks either the last character or the last word.
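As a rough illustration of the kind of pause-then-speak logic described (the class, the half-second threshold, and the verbosity names here are assumptions for the sketch, not Apple's implementation):

```python
from typing import Optional


class TypingFeedback:
    """Speak the most recent character (or word) after a pause in typing."""

    def __init__(self, verbosity: str = "character", pause: float = 0.5):
        self.verbosity = verbosity   # "character" or "word"
        self.pause = pause           # seconds of silence before speaking
        self.buffer = ""
        self.last_keypress = 0.0

    def key_typed(self, char: str, now: float) -> None:
        """Record a keystroke and the time it occurred."""
        self.buffer += char
        self.last_keypress = now

    def feedback(self, now: float) -> Optional[str]:
        """Return what to speak once the pause threshold has passed, else None."""
        if not self.buffer or now - self.last_keypress < self.pause:
            return None
        if self.verbosity == "character":
            spoken = self.buffer[-1]
        else:
            spoken = self.buffer.split()[-1]
        self.buffer = ""
        return spoken
```

For example, with verbosity set to "word", typing "hi there" and then pausing would cause "there" to be spoken; with "character", only the final "e" would be.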
And one more.
During this session, it was also announced that Apple is unveiling a new Accessibility Inspector. For more specific details, please see the source link, which points to the video itself.
Another topic of interest, not touched on in this presentation but announced at the keynote on Monday, is that iOS 10 will be able to perform object recognition on images on the device. It will not require uploading to servers; it is entirely self-contained. According to the presentation, iOS performs eleven billion calculations per photo to determine what objects are in your photos. It can not only detect objects but also perform facial recognition, which can compare photos against your list of contacts to tag people automatically. While no mention was made of specific uses for those who are blind or who have low vision, this could be yet another enhancement to come with iOS 10.
It's not clear to the public whether there are additional enhancements to accessibility settings and features with the new operating systems coming in the fall. As the non-disclosure agreement doesn't permit developers to discuss what has not been revealed publicly by Apple, this is the extent to which this topic can be covered at this time. For a rundown of WWDC's keynote address, you can read this AppleVis article, or find any number of others on the subject.

Source: Apple