A Preview of New Accessibility Features Set to Come Later this Year in iOS 11, watchOS 4, tvOS 11 and macOS 10.13 High Sierra

Submitted by AppleVis on 7 June, 2017 and last modified on 10 June, 2017

At its annual Worldwide Developers Conference this week, Apple previewed some of the new features coming to its iOS, watchOS, tvOS, and macOS platforms later this year. At this point, it's worth noting that this software is still in beta form, and not all features--particularly, the specifics of how they will be implemented--are finalized. With that said, below is what Apple has told us will be coming for users of accessibility features:

iOS

Enhanced Dynamic Type: Text now grows to larger sizes especially designed for users with low vision, and app UIs adapt to accommodate those sizes.

Redesigned Invert Colors: While using Invert Colors, media content and images won't invert with the rest of the screen, making them easier to view.

VoiceOver descriptions for images: Perform a three-finger tap on an image to have VoiceOver describe what's there. VoiceOver can detect text that's embedded in an image, even if it hasn't been annotated. It can also tell you whether a photo contains a tree, a dog, or four faces with smiles.

Expanded Braille Editing: Using a Braille display with Apple devices--now including Apple TV--is easier than ever. Your display shows you the text you're editing in context, and your edits are seamlessly converted between Braille and printed text. You can also customize the actions that your Braille display performs on your Apple device by programming new keyboard commands.

Improved PDF support including access to forms: Tagged PDFs now receive support for reading detailed information such as tables and lists.

Type to Siri for Accessibility: Supports basic search queries for those who prefer to type when interacting with their device.

Highlight Colors in Speak Selection and Speak Screen: You can now customize the colors that your iOS device uses when highlighting text with Speak Selection and Speak Screen.

Switch Control typing: It's easier than ever to type with Switch Control. Get access to more predictions, so that you can scan and type whole words at a time.

Additional closed captioning style: Video captions now include the option of a new larger, outlined style for subtitles and captions.

Additional audit capabilities in Xcode’s Accessibility Inspector: Apple’s powerful Accessibility auditing tool, the Accessibility Inspector built into Xcode, now has capabilities to help developers find Large Text bugs in iOS applications.
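For developers, the Enhanced Dynamic Type and Accessibility Inspector items above are the ones most likely to require code changes. As a rough illustration of what the Inspector's Large Text audits check for, here is a minimal sketch of adopting Dynamic Type in an iOS app; the view controller and label names are our own illustrative examples, not from Apple's announcement:

```swift
import UIKit

final class ArticleViewController: UIViewController {
    // A hypothetical label used to illustrate Dynamic Type adoption.
    private let bodyLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Use a text style rather than a fixed point size, so the font
        // scales with the user's preferred text size, including the
        // larger accessibility sizes.
        bodyLabel.font = UIFont.preferredFont(forTextStyle: .body)

        // Update the font automatically when the user changes their
        // text size setting (available since iOS 10).
        bodyLabel.adjustsFontForContentSizeCategory = true

        // Let the label wrap onto multiple lines instead of
        // truncating when the text grows large.
        bodyLabel.numberOfLines = 0
        view.addSubview(bodyLabel)
    }
}
```

Apps that size text this way are exactly the ones whose UIs can "adapt to accommodate those sizes" as described above.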

Additionally, users will now be able to add and organise some of what's displayed in Control Center. Among the items which you will be able to add are:

Accessibility Shortcut

Guided Access

Magnifier

Text Size

The Siri voice has been enhanced to sound more "natural". At this time we do not know if any VoiceOver TTS voices have also been improved, or if there are new voice options.

One of the highlight new features in iOS 11 will be drag and drop support. According to 9to5Mac, Apple is working to ensure that VoiceOver users will be able to take full advantage of this:

... Apple is introducing a new API to enable drag and drop with VoiceOver support, which means visually impaired users will be able to take advantage of drag and drop features when developers implement it.
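The specifics of that API were not public at the time of the keynote, but adopting the iOS 11 drag API generally looks like the sketch below. The view class and its content are placeholders of our own; how VoiceOver exposes the interaction to users is handled by the system once a drag interaction is attached, so treat this as a hedged illustration rather than Apple's documented VoiceOver hooks:

```swift
import UIKit

final class DraggableCardView: UIView, UIDragInteractionDelegate {
    // Illustrative content for the drag; not from Apple's announcement.
    var cardTitle = "Example card"

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Attaching a drag interaction opts this view in to the
        // system drag-and-drop machinery, which is what VoiceOver
        // can then surface to users.
        addInteraction(UIDragInteraction(delegate: self))
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // Called when a drag begins; return the items to be dragged.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        let provider = NSItemProvider(object: cardTitle as NSString)
        return [UIDragItem(itemProvider: provider)]
    }
}
```

The key point from the 9to5Mac report is that once a developer implements the delegate, VoiceOver support comes along with it rather than requiring separate accessibility work.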

One new iOS feature that isn't directly related to accessibility but is still likely to be of real value to many blind and low vision users is the ability to make and share a recording of your screen. Although this doesn't automatically record sound output, it does have the option to simultaneously capture sound input from your device's microphone. This should make for a very easy but effective way of letting developers see and hear for themselves exactly what you are encountering when you reach out asking for accessibility improvements to an app.

watchOS

Ability to adjust the click speed of the side button, so that if you have motor impairments you have more time to bring up Apple Pay and other double- and triple-click functions.

Improvements to Large Text support in the Apple Watch-related iOS apps.

tvOS

Support for Braille Displays: Connected Braille displays deliver text of what’s on screen (button labels, movie titles, etc.) that you are currently hovering over. As you move focus, VoiceOver speaks and Braille displays print the text. In addition, Braille display buttons map to VoiceOver commands and support basic navigation around the Apple TV UI.

Support for VoiceOver keyboard commands: Bluetooth connected keyboards will allow you to issue VoiceOver commands for more efficient browsing/navigation.

Switch Control Menu for Media Playback: Switch Control users now have access to a playback control panel to pause, play, fast forward, rewind, and skip while media is playing.

Additional closed captioning style: Video captions now include the option of a new larger, outlined style for subtitles and captions.

macOS

Improved zoom features: New zoom options make zooming in on your content a better experience. Quickly zoom in and out with a new zoom toggle by pressing Ctrl + Option keys. Detach the screen when zooming in and see contextual tool tips using the Ctrl + Command keys. Zoom improves the ability to speak text under your mouse, and flashes the screen when a notification appears that’s not currently visible.

Accessibility Keyboard: macOS can now be used entirely without a physical keyboard. The new Accessibility Keyboard is a customizable, onscreen keyboard that gives users with mobility impairments advanced typing and navigation capabilities.

Type to Siri for Accessibility: Those who prefer to communicate by typing have the option to interact with Siri by typing requests and questions.

Improved Grade2 Braille editing: Improved Grade2 Braille entry and editing creates a more seamless Braille experience for VoiceOver users. For example, you’ll be able to see the context of what you’re typing on the Braille display. Also as you edit in Grade2 Braille, text won’t get translated back into Grade1 Braille.

Enhanced multilingual VoiceOver support: VoiceOver automatically switches to the correct voice when reading in a different language. For example, VoiceOver has a Japanese voice when you visit ja.wikipedia.org from a Mac with English as the primary language.

Improved PDF support including access to forms: Tagged PDFs now receive support for reading detailed information such as tables and lists.

VoiceOver descriptions for images: With a simple key command, VoiceOver can describe text appearing in the image, even if it hasn't been annotated. And it can tell you whether a photo contains a tree, a dog, or four faces with smiles.

Additional closed captioning style: Video captions now include the option of a new larger, outlined style for subtitles and captions. The new video captions appear in iTunes, iMovie, Safari, and QuickTime.

Looking Ahead

It is expected that all of these forthcoming software releases will enter public beta before the end of June, and be released to the public in the fall.

For those of you who will be participating in either the developer or public beta programs, we cannot stress enough how important it is that you report to Apple any bugs that you encounter. Doing so will help to ensure that the final releases for each of these platforms will be as stable, polished and accessible as possible.

We know that Apple's Accessibility Team will be particularly keen for you to explore the new features mentioned above, so do please give these a thorough testing and report any problems that you encounter.

We should also stress that we are only at the beginning of the beta cycle, so it's entirely possible that some of the new features mentioned above may change or evolve between now and the final release of the software. We can also be certain that the Accessibility Team will be working on additional enhancements, improvements and fixes.

We would love to hear your thoughts on these new features, so do please let us know by adding a comment below.

... A nice update. I really like the improvement to Invert Colors, as I use that feature all the time, and hate how it inverts pictures. Speaking text in a picture is a nice little feature though. What would be really cool in addition to that is having the ability to set in Accessibility a print-screen-to-OCR feature. Essentially you could take a screenshot, and, if this feature is enabled, instead of saving it to your camera, it could use OCR to read the text and just say it back. This would be very useful, for instance with video content on which a lot of text is displayed.

Hello all,
I was not impressed by the keynote on Monday. However, the improvements I see above make me a bit happier. Improved PDF support is something I have been seeking for quite a while, and it looks like it is finally going to happen. I am also looking forward to the improved web and mail navigation mentioned. I also think that with the hopefully soon-to-be-released Orbit Reader 20, the braille improvements will be very useful.

Hello! Nice enhancements. I am waiting for the new option to switch languages. It's very welcome to see after a long time. I am happy about Apple TV support via a braille display.
I can't wait to see more surprises.

I cannot wait to play around with all the new features! I'm especially excited to try out the image descriptions and text within an image! I wonder if I could use this on GIFs. If so, that'd be a huge improvement. While I will not be entering any betas, I'll be anxiously awaiting the update in the fall. I also hope that they squash some of the bugs still crawling around iOS 10. Nonetheless, I'm excited with anticipation. Also, I wonder if the captions will work on YouTube videos, or if the videos will have to be done and edited or tagged certain ways. It'd be nice to be able to watch videos on YouTube and know at least a bit more about what's going on. I have some channels on my subscribed list, but it's just Fox News and The Tonight Show With Jimmy Fallon, because those at least are pretty much auditory. Other than that, I just use it for music.

I'm highly impressed by the new stuff coming. Now my world will open up more; I mean, I can hardly imagine the pics described by VoiceOver, and those pics of text on FB will never be a mystery again. At least I hope that feature is worked on heavily.

I'm looking forward to Apple's work on automatic image description. I want to say that that's an incredibly difficult job, to which thousands of academic research-hours have been thrown, so I will not be surprised if the function has limitations or false data. I'm of course very interested in it, however, as I'm sure all of you are as well. Actually, the data for videos seems even more useful to me than for plain images, as we already view videos often, and any additional data could be very helpful. I look forward to seeing it in action.

I cannot wait to try out these accessibility enhancements in the fall when I update to High Sierra. I've been waiting with bated breath for better native PDF support, and hopefully it will actually become reality this time 'round. Automatic language detection will be super cool too. Well done Apple accessibility team, as always!

Already feeling the excitement of what will be the new features that Apple will bring for all of us. Image description will be a tremendous step. Finally, we will be able to read text embedded in images and/or find out the content of a picture. Nice, Apple. Way to go. Language detection will be excellent. I just can't wait to try it out. Thank you, Apple accessibility.

For the first time in years, I feel like Apple is actually showing some love for the Mac. It is long overdue in my opinion, but I am still grateful it is happening.

I have preferred to check email on my iPad instead of on my Mac because of how VoiceOver treated the message as a single block of text, even if it was formatted like a webpage. It sounds like that is changing, and VoiceOver on the Mac will begin interacting with emails more consistently with how webpages are treated. That will be a very welcome addition for me.

I am also really glad to see support for tagged PDFs finally showing up. This should have happened ages ago, but better late than never.

Having said all that, the one feature that I am looking forward to more than any other, by far, is the ability to announce text in images. A recent trend I hate is when somebody releases a statement for the media, and the statement is posted on Twitter. Because of the 140 character limit, the statement is converted into an image of text, which visually looks just like the rest of the text on the screen, but to VoiceOver the text within the image is invisible. I know I can copy the image and drop it into an OCR app of one sort or another, but that is a pain. If I can extract the text directly, with a single VoiceOver command, I will be very happy indeed. This should be easy, since the text in the image is just like any other on the screen, so the issues that normally get in the way of OCR accuracy should not be a problem.

Wow! Apple finally figured out that it is not helpful for low vision people to see negatives of all media!
Kidding aside, the Invert Colors thing was much needed and long overdue. Picture descriptions are cool. OCR of unreadable text would definitely do a lot to catch VoiceOver up to other screen readers. Also, I am still going to hope against all available evidence that one day we might have control over the functions of the different gestures.

I'm really looking forward to the new accessibility features coming in iOS and macOS. I'm looking forward to hopefully being able to tell what the text says when people post images of text on FB and Twitter. I'm also anxious to see how the changes in braille will work. Editing in Grade 2 without having text converted to Grade 1 will be very interesting. I definitely plan to participate in the public beta cycle for macOS since I have access to multiple Mac computers. I would love to test iOS, but being that I only have one iOS device that can run iOS 11, it won't be practical for me to beta test. At any rate, I'm pretty excited about this year's upcoming updates.

The automatic transcription of voice mails needs to go away. Apple talks so much about privacy, yet gives the user no option as to whether or not private voice mails are sent to its servers; that seems kind of wrong.

This is the first time we've seen some of these features, and not everything will be perfect. So, in the first release of iOS 11, be prepared for a bit of a bumpy road. But as time goes on I'm sure any bugs will be squashed. This is how it usually goes with Apple. They perfect things as much as possible, release the new version, and squash any bugs in subsequent releases.

Perhaps there are already functions like this and I don't know about them, but here's my VO wish list.
1. More voices; Acapela or Eloquence speech would be nice
2. Verbal/tactile time announcement in the Clock app. You could have a voice and/or chime of your choosing on the hour or half-hour or quarter-hour and it would sync with Apple Watch. The apps designed for this purpose don't sync with the Apple Watch and since this feature is on the Mac to some degree, we should have it on our iDevices as well.
3. Customizable VO sounds; more options and the ability to mute the VO sounds you don't like and keep the ones you do like.
4. Some sort of vibration/haptic feedback/sound when an iOS device is successfully turned on. When I turn my phone off, it's always a guessing game as to whether the phone is turned on or not because there's no feedback unless you can see the apple icon on the screen.
5. Customizable speech output; if you had a Braille Note, you remember the "Speech on request" feature. Basically, while you were using the display, the speech was sort of half on. So, if you needed speech for reading a document or whatever, you could activate that particular feature without turning on speech for everything. I like speech, but only for certain things like reading long strings of text, etc.
6. An easier way to pair Braille displays without being forced to use speech, particularly for deaf/blind users.
7. Haptic time on iOS devices. So when you're in a meeting, you just activate haptic time without the phone talking.
8. For the Apple Watch, a set gesture that "wakes" up the watch instead of accidentally activating it when you bump it against something or press a button without meaning to. I'm sure this would save a lot of battery as well as make for a quieter experience.
And this is all for now. Perhaps more ideas will come later.

Usually, a 3 finger tap will tell you where you are on the screen and what row and column it is. Does this mean that this info will be replaced with the image description feature? Not that I am complaining. I am not really interested in columns and rows anyway.

I like the time suggestion. I have two other suggestions for the Mac, and perhaps iOS devices, although I don't currently own one of those so I may be jumping ahead of myself here. The first suggestion has to do with the time thing, and other announcements. It would be cool if we had the ability to choose from a list of possible sayings, and/or make up our own. For instance, when in the VO utility and sampling the different voices and such, there would be a list of possible sayings. I like what we have now too, which is: "The best way to predict the future is to invent it." My other suggestion is haptic feedback for the Mac, or would that be too hard to implement? The idea of haptic feedback intrigues me. I only saw it once, and that was a few years ago when I was helping a PhD student out with his research project.

Hi,
Well, the new MacBooks have haptic feedback in the trackpads. I don't think they'll put haptic feedback in the whole machine, as it's on your lap, and any alerts that come in would usually be announced by a tone anyway. I don't see haptic feedback in the Mac ever happening.

1. The UX I understand is that the captions of a video can be rendered by VoiceOver in braille and/or speech. I have the setting enabled and have tried videos in iTunes, Netflix, Prime Video, and YouTube, but the captions are not uttered by VoiceOver. Does the app developer or content producer have to interface with some new accessibility API to have this feature come alive?
2. The UX I understand for PDF accessibility is that a tagged PDF produced in Acrobat Pro or Microsoft Word and consumed using iOS 11 VoiceOver would allow the tags to be accessible. For example, it should be possible to navigate by headings and read alt text in images. I tried a tagged PDF in iBooks, but it was not possible to navigate by headings or images, nor were the headings announced during touch exploration, and the alt text of an image was not spoken. Does this require an iBooks update, or does the content producer need to apply specialized tags specific to iOS 11 VoiceOver?

Have others had success with these features? Any tips or pointers are greatly appreciated. Thanks!

Hi all, can I set an auto-advance mode with a braille display for iOS, or is that feature only available in Windows or Mac for displays that support it? I am still considering a Focus 14, but if I have to keep panning I will find it frustrating. Thoughts?

Hello,
The only auto-scrolling setting I know of for VoiceOver for iOS is one that will turn the page on multi-page documents such as in News.
I assume you want auto-panning? I think you will need a display that can do that for you, at least in iOS 10. At this time the only one I know of that can do it is the Handy Tech Actilino.

An auto-pan, or as HumanWare calls it, auto-advance, would be very useful, especially when reading very long texts. Safari and my Braille Sense really do not like each other very much, so a bit of love for web support would also be very nice. I am downloading the macOS beta as I write this, so I will have to take another look later.

It would be cool if Pages had this feature as well. Or perhaps I'm missing something? I received a newspaper article earlier this week from my mother, about a longtime family friend who passed away suddenly earlier this year. I was able to read the whole article with VoiceOver, but I had to do it manually with the up and down arrow keys because Pages stopped reading automatically after the first page. I even tried a Say-All but that didn't work. This has been the case with other things I've read using Pages and TextEdit. I've looked through the menus and found nothing. But I suppose I'll try that again.

It's great to see such significant improvements possibly coming, especially text recognition in images. I hope that one of the improvements to webpage navigation on Mac will be proper support for reading a webpage using the down or up arrow with QuickNav turned off, that is, in the same manner as one would read a text document. Currently, when the keyboard focus enters a text field, the VoiceOver cursor gets stuck there and you cannot continue to read the webpage with the down arrow, or with any other arrow key navigation commands for that matter.

Another thing is that when reading a webpage with QuickNav off, the item type, such as heading or link, is announced before the item text itself, and this cannot be configured using VoiceOver Utility.

Better PDF support is also a very welcome one. Apart from navigation by headings and links, which was not mentioned by Apple but which I consider quite an important feature, it would be great if it were possible to select text in PDFs just as one can do in Pages documents using Shift and arrow keys. Currently, there is no sensible way of selecting and copying text from PDFs.

Regarding the auto-detection of a document's language and voice switching: like many other people, I don't much believe this could be reliable, and it doesn't work for multilingual documents; I never use it on iPhone. Rather, I would welcome a single keyboard shortcut for switching to the last used voice, similar to how the speech rotor works, since currently it's not handy to cycle through the speech rotor settings all the time to find the voice setting. Therefore, e.g., VO + Shift + left bracket for the previous voice and VO + Shift + right bracket for the next voice would be very nice. Speech rate modification could also have direct keyboard shortcuts; it can currently have them via the Keyboard Commander, but that needs configuration. Default shortcuts like VO + left bracket to decrease the speech rate and VO + right bracket to increase it, so that the shortcuts are close to the previously suggested voice-switching shortcuts and are comfortable for the fingers, would be highly appreciated. Unfortunately, these keys are already occupied by other commands; however, to my mind those are not so essential, and they could be moved to VO + Command + comma and period, for instance.

Speaking of shortcuts, I wonder why there is no standard one for reading first the current time, including seconds, then the date and lastly the day of the week, reflecting the system language and region settings. A keyboard shortcut for reading the battery level and whether it's charging is also missing. These announcements can be done via AppleScript, but that's not for everyone, and launching these scripts has some delay after pressing the shortcut.

Recently we have seen Pages add support for document navigation by heading using different kinds of rotors; I am very grateful for that. I just wish that navigation by headings, links, tables, etc. using single-key QuickNav as well as VO + Command + letter will soon also be possible, hopefully in Microsoft Word too. Something for Pages to borrow from Microsoft Word might be using Ctrl + Shift + Right or Left arrow to quickly set a heading style for the current paragraph or increase/decrease its heading level, and keyboard shortcuts for converting to numbered or bulleted lists.

This more or less sums up the most crucial points I am experiencing on the Mac. Anyway, thanks to the AppleVis team for sharing these plans for accessibility which look very good.

For those who would like paragraph navigation: if you are using a braille display, you can assign a key combination to the previous- and next-paragraph settings. For example, I've set mine to backspace with dots 2 3 to go to the previous paragraph, and backspace with dots 5 6 to skip to the next paragraph. This is under the "Braille commands" section of the info page of the braille display setup in the braille settings.