Accessibility Features of iOS for the iPad and iPhone

Covering all iOS accessibility features, with in-depth videos on Guided Access, AssistiveTouch, and VoiceOver

4.3
(93 ratings)

Curriculum

Welcome to the course "Accessibility Features on the iPad." My name is Sami Rahman. I'm the author of "Getting Started: iPads for Special Needs." I am also the co-founder of an online community called bridgingapps.org.

We're a community of parents, teachers, therapists, doctors, and people with disabilities who are trying to exchange information around using mobile device technology like the iPad and Android tablet, as well as the iPhone and the Android cellphone with those with disabilities.

I also am the parent of a little boy named Noah who is going to turn five here very shortly. Noah has both cerebral palsy and autism. Two years ago, almost three years ago now, we got him an iPad, and his world changed and our world changed. The basis for all of this is how we can help other people have that same change with their users.

In this course, I'm going to assume a couple things. First thing I'm going to assume is that you're not a therapist or a teacher. You may be those things as well, but I'm going to assume that you're not a professional when it comes to assistive technology.

I'm going to also assume that you're not a professional when it comes to technology in general. I'm going to use everyday words to describe the features on the iPad. All right. With that, let's get started, and thank you very much.

Our objectives today. In this course we have two primary objectives. The first thing we're going to do is go through all of the accessibility features as Apple defines them on the iPad. Then, we're going to go into three very specific features, VoiceOver, Guided Access, and AssistiveTouch, with a series of videos that we've done on this subject matter.

This is not only set-up but usage and a couple of 'gotchas' when it comes to it. All right. Let's get started.

The first dilemma you have to overcome with the iPad is this: the iPad has a few buttons on the outside, but generally speaking, the primary interface is a large 10-inch touchscreen. If you are completely or partially blind, a touchscreen may or may not be an adequate interface for you.

VoiceOver is technology that is actually used on the desktop as well, and it's been brought over to the iPad. The first problem that has to be conquered is that the iPad doesn't have a cursor, so the iPad doesn't know where you are onscreen.

VoiceOver: You can think of VoiceOver as voice narration for where you are onscreen. It's voice navigation on the home screen, and it can also be an aid to information. For example, when you're reading email, it can read the email back to you.

It's an audio cursor, and it's constantly giving you feedback. These are called screen readers, and the way screen readers generally work is they start in the upper left-hand corner and go to the lower right-hand corner. If you're on, for example, the home screen, and you've got two dozen items on it, then navigating from the upper left-hand corner to the lower right-hand corner is not a problem. You tap anywhere onscreen, and it moves the cursor from one item to the next.

The dilemma arises when you're on a webpage. When you think about letters, words, hyperlinks, and paragraphs, there may be literally thousands of items on a page. Starting in the upper left-hand corner and going letter by letter all the way down to the lower right-hand corner of the screen becomes very tedious and not very productive for the user.

There's a key concept here, and that is the concept of the rotor. Let me step back, and let me define the rotor first. Then I'm going to tell you the context of this.

The rotor allows you to change what unit VoiceOver navigates by. For example, Setting One could be letter by letter. Setting Two could be word by word. Setting Three could be sentence by sentence. Setting Four could be link by link.

The way the rotor works is you place two fingers on the screen and rotate them, as if twisting a dial, and that changes the setting. Let's say you're on a webpage, you're starting in the upper left-hand corner going letter by letter, and that's just too tedious.

You use the rotor to change to word by word, sentence by sentence, paragraph by paragraph, or link by link. Now you can navigate down to where you want and find the information you're after. Then you can change the rotor back to word by word, and it can read that content back to you. The rotor allows you to change the VoiceOver settings on the fly.
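The rotor idea, the same content split into larger or smaller navigation units, can be sketched in code. This is a toy model to illustrate the concept, not Apple's implementation; the function name and the splitting rules are my own.

```python
# Toy sketch of rotor-style navigation granularity: the same text
# is split into different units depending on the rotor setting,
# and the screen-reader cursor then steps unit by unit.
import re

def units(text, setting):
    """Split text into navigation units for the given rotor setting."""
    if setting == "character":
        return list(text)
    if setting == "word":
        return text.split()
    if setting == "sentence":
        # Split after sentence-ending punctuation followed by whitespace.
        return [s for s in re.split(r"(?<=[.!?])\s+", text) if s]
    raise ValueError(f"unknown rotor setting: {setting}")

text = "VoiceOver reads the screen. The rotor changes how you move."
print(len(units(text, "character")))   # letter by letter: many tiny steps
print(len(units(text, "word")))        # word by word: far fewer steps
print(units(text, "sentence"))         # sentence by sentence: two big steps
```

Twisting the rotor is, in effect, switching which `setting` the cursor uses, which is why it turns a thousand-step crawl through a webpage into a handful of jumps.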

There are a couple of key concepts about the rotor. The first is that the rotor allows you to change the VoiceOver settings without having to go back into the Settings program. The second is that VoiceOver, in combination with the rotor, can be adjusted on a per-program basis.

The technology behind Siri is something called speech to text. The way this works is you touch a very specific Siri button on the device. It then listens for your voice, converts your voice to text, and tries to execute commands based on whatever that conversion produced.

The beauty of this is that if you've got a physical impairment and/or a visual impairment, you can interface more quickly. Using Siri, you can launch applications instead of navigating with VoiceOver. You can also go get data off the internet.

For example, you can ask Siri about the weather, and it'll go out and get the data, the weather, and bring it back. This can dramatically reduce the amount of keystrokes, as it were, or touches that you have to do to go get that information. This fundamentally is a tremendous shortcut for both people with disabilities and people without disabilities, so a lot of people love Siri just for that.

Now, one of the dilemmas is that if you have a speech impediment and/or a thick accent, Siri may not interpret your commands properly. This is a relatively new feature, and I think it's hit and miss. Some users find it works really, really well for them. Others find it doesn't work well at all and get very frustrated.

The beauty of this is that I think it will develop and get better over time. The technology has already moved fairly rapidly in a short amount of time, so I think this is a great area to watch.

Zoom is an operating-system-level feature, which means it works in almost every program; I think there are some exceptions out there. It's off by default, but once you have it turned on, you can zoom into any portion of the screen and effectively turn a small typeface into a very large typeface.

Now, I also have a learning disability, and so I actually use zoom. If there's too much information onscreen, I have a hard time visually handling it. I use zoom to reduce the amount of information onscreen, make it a little larger, and allow me to interpret the information a little bit better.

Zoom can also be used with other accessibility features. If you have partial blindness, you can use Zoom in conjunction with VoiceOver in most applications. You use touch commands to zoom in and out and to scroll around the screen, and you can get a fairly high level of zoom; I think you can go higher than the equivalent of 70-point text.

Black on White is the next feature, and I have it here in brackets as [invert colors], so I want to explain what that is. Black on White is the name of the feature, and you can turn it on. Basically, what it does is invert the colors of the screen.

If you've got black text on a white background, inverting gives you a black background with white text. This is really good for users who need high-contrast screens.

What invert colors really does is take any colored text or colored image and shift each color to its opposite. With black text on a white background, you get really high contrast, but when you've got colored text on a colored background, the contrast may not be so high.

You'll have to experiment to see which programs this works well with for you. It does not make everything black and white with super-high contrast the way a desktop application might; it just inverts the colors. Again, you'll need to do some experimenting to see if it works with every program you want.

Speak Selection is a feature that is turned off by default. What it does is, any time you highlight text, it adds a button that will speak that text back to you using a synthesized voice.

This is good not only for the visually impaired, but also for those who, like me, have a reading disability. If I have a problem managing words on a page, I can highlight them real quick, hit Speak Selection, and have the computer speak it back to me.

It speaks the base languages built into the system, but it's primarily an English-language feature, so you're going to have to test it with other languages.

One of the most amazing things about all of the iPad products is that they use tactile buttons. What is a tactile button? It's a physical button, versus what I would call a 'soft button'. There are a number of touchscreen devices out there that use buttons you can't physically feel.

This matters particularly if you have fine motor problems... I see it with my son all the time. If he's using my Android cellphone, as an example, he touches a button without knowing he's touched it, and goes to the homepage when he doesn't mean to.

There's no way to do that by accident here with physical buttons. The home button is recessed, so you can't accidentally swipe it; you have to physically press it. This can be really good for those with visual impairments, both for knowing the orientation of the device and for actually executing the commands the buttons are designed for. It makes the device very, very functional.

Within the operating system, under the 'Accessibility' tab in Settings, you have the ability to enlarge text up to 56 points, not the 70-point equivalent you can reach with Zoom, but up to 56 points. What this does not do is enlarge the navigation text. It doesn't enlarge everything; it just enlarges large text blocks.

It will enlarge the email but won't enlarge the folders next to it. It'll enlarge the webpage text but won't enlarge the buttons on it. This can be really good for reading text, but when it comes to navigation, you're going to need Zoom or VoiceOver. You can use them in combination: Zoom and Large Text.

Large Text can be particularly difficult to use on a smaller device like an iPod Touch or an iPhone, just because there isn't as much screen real estate to show large blocks of text. Like I said, it can be used in conjunction with Zoom to make a complete solution.

You can think about a braille display like a braille printer. The iPad sends a string of text to the display, and the display shows the first line of text, up to as many characters as it has. There are braille displays with 20 characters and braille displays with 80 characters. It will display as many characters as it has until you ask it for the next line of text.
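The character-window behavior just described can be sketched as follows. This is an illustrative model of the windowing idea only, not the actual Bluetooth braille protocol.

```python
# Illustrative model of how a fixed-width braille display "windows"
# a longer string of text: it shows as many characters as it has
# cells, and advances to the next window when the reader pans.
def braille_windows(text, cells):
    """Yield successive chunks of `text`, each at most `cells` wide."""
    for start in range(0, len(text), cells):
        yield text[start:start + cells]

line = "The quick brown fox jumps over the lazy dog."
# A 20-cell display shows the line in 20-character windows...
print(list(braille_windows(line, 20)))
# ...while an 80-cell display shows the whole line at once.
print(list(braille_windows(line, 80)))
```

This is why cell count matters when shopping: more cells means fewer pans through the same email or webpage.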

This is a really good interface for users who are proficient with braille. For users who are not only proficient with braille but are also deaf and blind, with both visual and auditory impairments, it can be a great interface.

The braille displays are connected to the iPad wirelessly via Bluetooth. The same display that can be used on the iPad can be used on a laptop and a desktop, and so you can have a consistent interface across a lot of your technology.

This is supported at the base operating-system level, so iOS connects to the braille display by default. Using the printer analogy, the connection happens at the operating-system level, but you need to test on a per-application basis, because each application also needs to be able to send text to the braille display.

For a lot of games, for example, a braille display is not going to be a great interface. For things like email and webpages, it is. From a cost perspective, braille displays are coming down pretty dramatically now; I've seen them as low as $300-$400 and as high as $4,000.

The good news is that that is substantially less expensive than they were, and so this is becoming a more ubiquitous kind of interface.

The first hearing feature we're going to discuss is FaceTime. FaceTime is a video conferencing system that can be used throughout the iOS and Mac OS environment. What that means is you can FaceTime between two iPads, between an iPhone and an iPad, or between an iPad and a Mac desktop.

There is other software out there, non-FaceTime related, that will let you do this across PCs as well; Skype is one example. FaceTime is built into the system and very well supported.

For those with hearing impairments, it allows you to lip-read. There are also services that will convert speech to text. One user can use FaceTime, and the other can use a braille display; that's very functional. As a side note, on a non-hearing-related special needs issue, we use FaceTime with my son because of his autism.

Looking people straight in the eye and interacting with faces is kind of difficult for him, so we get on FaceTime with his grandparents, and he gets to interact with them and have good social interaction. His grandparents live out of state, so it's kind of a treat for everybody.

FaceTime is a great way to just promote social interaction in general. And then specifically for the hearing impaired, it has a number of features built in.

A couple of caveats. Closed captioning is text that's displayed onscreen while a video is being played. Now, here is the catch: the player has to support closed captioning, you have to have it turned on, and the video itself has to be captioned in a format the player understands.

There is the default onboard video player, and there are also third-party video players that play closed captioning. The key here is that your video has to be captioned, your player has to understand the captioning, and the captioning has to be turned on. If all of those are true, then you get subtitles and captions on your video.

For those who need a higher decibel level or higher volume level, you can use the headphone jack to add speakers to the iPad. This is good if you're using the iPad as a communication device, and the built-in speaker isn't loud enough. This is also good in reverse.

The reverse also works: if you are in a crowded room and want to listen to the iPad without disturbing everyone else, you can use the headphone jack for that too. The key is that you want to be able to control the volume not only on the device, using the volume settings, but also on the external device itself; if you need an additional boost, the amplification needs to be adjustable there as well.

Along with the headphone jack, there is also Bluetooth audio: the ability to use Bluetooth headphones, Bluetooth microphones, and Bluetooth speakers with the iPad. What's great about this is that Bluetooth connects wirelessly, so your speakers can be separate from your iPad; you don't necessarily have to carry them together.

Here is the key, especially with speakers. The whole idea behind a Bluetooth speaker is that you can add amplification to the sound. If you're using the iPad as a communication device, the built-in speaker may not be loud enough, so you can augment that with a Bluetooth speaker. Here's the key concept though.

The iPad has a 10-hour battery life, but a lot of Bluetooth devices, particularly less expensive ones, do not have a battery life that long. My recommendation is that your battery life, particularly for your speaker, be as long as or longer than the iPad's. Otherwise, if your speaker only has a two-hour battery life, then you're effectively only allowing that person to communicate for two hours.

You need to have technology alignment there and make sure those devices have enough power. But because it's wireless, the speaker can be set somewhere else, near the people who need it, while the iPad is in a different place. It gives you the freedom of being wireless.

You can convert the stereo output of the iPad to mono at the operating-system level, so it's always that way. Then you can choose center, right, or left, depending mainly on your source; some sources lean one way or the other. This is from an output perspective: if you can hear in one ear and not the other, it lets you set which ear the audio is sent to.
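The idea behind the setting can be sketched with a few lines of code. This is a toy downmix to show the concept, not iOS's actual audio pipeline; the function and its `route` parameter are invented for illustration.

```python
# Toy sketch of a stereo-to-mono downmix, the idea behind the
# Mono Audio setting: both channels are averaged into one signal,
# which can then be routed entirely to the left or right ear.
def downmix(left, right, route="both"):
    mono = [(l + r) / 2 for l, r in zip(left, right)]
    if route == "left":
        return mono, [0.0] * len(mono)   # all audio to the left ear
    if route == "right":
        return [0.0] * len(mono), mono   # all audio to the right ear
    return mono, mono                    # same signal in both ears

left_channel = [0.2, 0.4]
right_channel = [0.6, 0.0]
print(downmix(left_channel, right_channel, route="left"))
# → ([0.4, 0.2], [0.0, 0.0])
```

The point of the average is that nothing recorded only in one channel gets lost, which is exactly what you want if you can only hear in one ear.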

Messaging is a huge component of the special needs community. It allows people to be fluid in a world where they're not always fluid, so Messages happens to be the application they use. There are a number of other text messaging applications, but if you have a hearing impairment, messaging can be a great lifeline out to the world.

The iPad, particularly with iOS 6 and above, has a tremendous number of ways to alert you. You can choose on an application basis to not be alerted, to have the alert in the banner, or to have an alert pop up right in the middle of the screen.

Every application gives you all sorts of choices, so you can really customize how you want to be alerted on a per-application basis. This is a very extensive feature, and for those with hearing impairments, it becomes a great alternative to a normal audio notification.

One of the first features is simply that the iPad is thin and light. When you contrast a tablet computer with a piece of durable medical gear, the durable medical gear is often designed to be mounted to a wheelchair, a bed, or some other fixed object, and isn't designed to be completely portable.

Think about it: as technology develops, we get more and more capability in a smaller and smaller space. The iPad is a great example of that. We have a lot of computing power, a lot of battery life, and a lot of connectivity in a very thin, light package.

This can also work against users who have fine motor and physical issues and may not have total control over their movements; the thinness and lightness can work against you. But in the same way, you can put the iPad in a case and mount it on a wheelchair. So it works both ways.

Prior to tablet computing, the only way to get a touchscreen device was a very large computer monitor with a very specialized touchscreen that, because the operating system didn't support touch, was only used in certain applications. Once we got to the iPad or tablet level, we got extensive battery life in a thin, light package, and because the operating system was built for touch, everything was touch.

If you think about the way learning happens through our lives, a lot of it happens by touch, particularly early learning. It's very, very intuitive that a user touches a screen and something happens. The iPad supports this very intuitive way of learning.

I'm going to take a step back here, and talk about the contrast. Let's look at a computer. A computer is a non-direct interface device. It's a remote interface device. You have to interface with a mouse and a keyboard and a monitor before you interface with the software on that device.

Let's contrast that with an iPad. On an iPad, you load a game, you touch something, and something happens immediately. It's a direct-interface device. There's this concept of direct interface versus remote interface, and that direct-interface capability at the operating-system level, meaning across all programs, lowers the cognitive threshold you need to be able to use a computer.

Where you once had to be four years old to understand the concepts behind interfacing with a computer, now a one-year-old can interface with one quite easily. It's literally lowering the cognitive threshold for using a computer. And because touch is such a key component of early learning, it really is intuitive for the user.

I cannot say enough about large multi-touch displays. Contrast this with a small multi-touch display like an iPhone: for a user with a physical impairment, hitting a larger target on an iPad is much easier than hitting a smaller target on an iPhone.

This is one of the features we're going to go through in depth, but let me explain here what AssistiveTouch is and how it can be used. Suppose you have the iPad mounted in a case, and that case is mounted on a wheelchair.

Now you have an application that needs you to shake the iPad, and you have a physical impairment; let's say you don't have the use of your arms. How do you shake the iPad? It's mounted in a case, so you can't shake the case, and you can't shake the wheelchair. What AssistiveTouch gives you is an onscreen menu and button system for the physical interfaces of the device.

All of the physical buttons have virtual buttons in AssistiveTouch, and actions like shake have virtual buttons as well. AssistiveTouch also has the ability to record custom gestures, like a flick up for scrolling a webpage, and lets you trigger them by just tapping the screen.

It gives you a soft way of accessing all of the physical buttons when you either can't do it physically or it's just easier to do it via software. That's what AssistiveTouch is.

This can be good for users who have a hard time controlling the physical iPad itself, but combined with a visual impairment, it can also be a good way to do things like rotate the screen.

The screen reorients itself automatically, so the multi-orientation support keeps the device completely functional however it's held, whether upside down or right side up; it corrects itself and gets into the right position automatically. This just makes it easier to use.

Let me explain what this is. While predictive text has been in the desktop assistive technology world for a long time, most people know it from cellphones with numeric keypads. Based on the input you're giving the computer, it tries to predict what full word you want, and then gives you a shortcut to enter that word.

With typical users, this is used for text messaging. For special needs users, it can dramatically reduce frustration, both in hitting the keyboard properly and in typing long words. It just makes your life more efficient.

Predictive text is turned on automatically. It also learns: if you type a word, it tries to replace it, and you tell it, no, I want this word, it learns that word. That word will then come up from your dictionary when you type it again. You have to participate in teaching it, but it expands with you as you use the device.
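That suggest-and-learn loop can be sketched as follows. This is a minimal, hypothetical model of a learning word list, not Apple's actual implementation; the class and method names are made up.

```python
# Minimal sketch of "learning" predictive text: the system suggests
# a completion from its dictionary, but when the user rejects the
# suggestion, the typed word is learned for next time.
class PredictiveText:
    def __init__(self, dictionary):
        self.words = set(dictionary)

    def suggest(self, typed):
        """Return a known word that completes `typed`, or None."""
        for word in sorted(self.words):
            if word.startswith(typed) and word != typed:
                return word
        return None

    def reject(self, typed):
        """The user insisted on their own word: learn it."""
        self.words.add(typed)

pt = PredictiveText({"hello", "help"})
print(pt.suggest("hel"))        # completes from the built-in dictionary
pt.reject("helicoptr")          # user keeps their unusual spelling
print("helicoptr" in pt.words)  # → True: the word has been learned
```

The `reject` step is the "you have to participate in teaching it" part: the dictionary only grows when you push back on a replacement.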

Within the world of assistive technology, external keyboards are very, very critical. An external keyboard can also be used with visual impairment: you can use a large-display keyboard for those who are visually impaired.

Also for physical impairment, hitting that soft button onscreen can be very difficult. If you don't have control over your fine motor motion, you can accidentally touch a bunch of buttons you don't want to touch.

A physical keyboard can reduce that. So can a keyguard, placed over the top of a physical keyboard or over the soft keyboard onscreen, helping the user hit the intended button every time.

You've seen onscreen here that there are also Arabic letters, for multiple languages. My son happens to interface with a hard keyboard much better than a soft keyboard. He can position it better in relation to the screen when the keyboard is detached. For a number of reasons, a keyboard really works well for him.

He's learning Arabic, so we also put the Arabic letters on there; we had one keyboard do a lot of different things for us. Most users find they type faster on a physical keyboard than on a soft keyboard, so it can also be an efficiency issue, regardless of whether you have special needs or not.

The audio jack on the iPad can do a lot of different things. It can be not only an output device for headphones or speakers but also an input device for a microphone.

With the right device plugged in, you can also control the volume from the headphones themselves. For example, if your headphones are Apple stereo headset compatible, then using the larger volume buttons on the headphones or their cord, you can adjust the volume of the iPhone or iPad without having to go into the settings.

This becomes an easy way of doing that. For a user with sensory issues, or one who has a hard time hitting those small volume buttons, this compatibility gives you an alternate solution.

Automatic sync allows you to back up your iPad. There are three ways to back up an iPad. The first is iCloud, a cloud service, which lets you back up wirelessly. The biggest dilemma there is that the programs themselves also have to be iCloud compatible, and not every program is.

A lot of users think they're backing something up, like a document in a word processor, when they're not. All of the Apple software is iCloud compatible, but not all third-party software is. That's the first method, and it's completely wireless.

At the exact opposite end, the other way to back up is wired. You plug your USB cable into your computer, run iTunes on your Mac or PC, and plug the iPad into the other end of the cable, using the 30-pin connector, or the Lightning connector now with the iPad 4, and it backs up everything on the iPad.

The advantage is that everything is backed up, whether or not it was designed to be, and it's all saved on your computer.

Connecting those wires can be difficult for a user with a physical impairment, so there's a third alternative: WiFi sync. If you have a WiFi network, your computer is on, and iTunes is running, you can set it up.

It's a setting you have to turn on; once your iPad gets onto your local network, it will start making automatic backups of the entire device.

The first time, you have to set it up wired: you make a complete backup over the cable and turn the setting on. After that, it does small incremental backups of the device. In my household, for example, we use automatic sync: every time I bring the iPads into the house, they sync up automatically the moment they're plugged into a power supply.

That keeps everything in sync without having to actually plug wires in and out. And that's very critical for those users with fine motor issues.

At the operating-system level, the iPad is not switch compatible. So let me explain what switch compatibility is and how you make the iPad switch compatible, and then we'll talk about the ins and outs of this.

There are two elements you need to use a switch, which is a binary interface with the iPad: the switch itself and a switch interface. All of the switch interfaces connect to the iPad via Bluetooth.

Here is the key concept. Braille interfaces work at the operating-system level: the operating system understands them, and there is a standard way of communicating with them. Switch interfaces are not at the operating-system level, and there is no standard way for the operating system to communicate with them. That means the program has to communicate not just with a switch, but with that specific switch interface.

So there are three things that have to line up. The switch needs to be compatible with the switch interface; switches are fairly standard, so that's the least of your worries. The switch interface then has to be compatible with the program you're running on the iPad. All of that has to be true for a switch to work with that particular program.
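The compatibility chain just described can be written down as a simple check. This is purely illustrative: the function and the device and app names are made up, and real compatibility obviously isn't a set lookup.

```python
# Sketch of the three-way compatibility chain for switch access:
# a setup only works if the switch matches the interface AND the
# interface matches the specific app. All names here are invented.
def switch_setup_works(switch, interface, app):
    return (switch in interface["supported_switches"]
            and interface["name"] in app["supported_interfaces"])

button_switch = "button-switch"
interface = {"name": "bt-interface",
             "supported_switches": {"button-switch", "sip-puff"}}
story_app = {"supported_interfaces": {"bt-interface"}}   # switch aware
game_app = {"supported_interfaces": set()}               # not switch aware

print(switch_setup_works(button_switch, interface, story_app))  # → True
print(switch_setup_works(button_switch, interface, game_app))   # → False
```

The second result is the practical pain point: because there's no operating-system standard, every app's `supported_interfaces` set has to be tested individually.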

You're going to have to do a lot of testing with this. The beauty of the switches themselves is that they're fairly compatible across multiple devices, so you just need to pick your switch interface and then test with your applications.

There are some devices that are both a switch and a switch interface in one, so shop around; we'll have more information on that later. More and more applications are becoming switch aware, but because it's not at the operating-system level, the set is really small. The application itself has to be designed for that particular switch interface to be switch compatible.

In this second section of the course, we're going to go through three of the primary accessibility features in depth: VoiceOver, Guided Access, and AssistiveTouch, with all the ins and outs of each. So, without further ado, let's get to it.

We're going to use three different types of visual indicators. We'll show a hand every time I tap, an icon every time I press a button on the outside of the device, and arrows to point out key information onscreen.

So, before you set up VoiceOver, the first thing you want to check is the 'International' settings. VoiceOver will speak the language set there. Tap 'General', scroll down until you find 'International', and check what your language is set to; that's what VoiceOver will use when speaking back to you. This is really important for an international audience.

Let’s set up VoiceOver for the first time. Tap on 'Settings'. Tap on 'General'. And scroll down to 'Accessibility' and tap on 'Accessibility'. And then let’s tap on 'VoiceOver'. You’re going to see a lot of settings inside of VoiceOver so let’s go through them.

The first one is 'Speak Hints', and it's really good when you're first starting out. When you move your finger over an item and VoiceOver tells you what it is, Speak Hints, if it's turned on, will add something like "Double-tap to launch the item" or "Open the folder", or whatever you can do there. There are a lot of commands within the VoiceOver environment, and this is a really good way of reinforcing them. Obviously, once you get used to them, you'll turn it off.

The speak rate, which is the next item: the turtle is for slow, the rabbit is for fast. You can see that I am not very good at this, so I have it fairly slow; I just don't usually interface with the computer this way.

Then 'Type Feedback' is your next option; I'm gonna tap on that. What Type Feedback does is, with a keyboard, you can choose to have it speak back words, speak back characters, or, in this case the default setting, speak back both characters and words. Whether you want it to read back characters or just whole words will make more sense as you use VoiceOver. So I'm going to leave the default there.

'Phonetics': when it comes to single characters, it will tell you A as in Alpha and N as in November. I have that turned on, although I don't really hear it very much. 'Use Pitch Change' and 'Compact Voice' are just ways of fine-tuning the voice feedback for you.

‘Braille Interface’ - let's tap on that. Here you can see I've tapped on the braille interface; all the braille interfaces are Bluetooth, so it has sensed that my Bluetooth is turned off and asked, "Do you want me to turn on Bluetooth?" I'm going to select no, but if you had a braille interface, you would turn your Bluetooth on, and it would then ask you to pair up with your interface. So I'm going to leave this off. And then you have your different braille options.

Going back: when VoiceOver is active, the rotor is a way of changing VoiceOver's settings without having to keep going back to the Settings app. I'm going to show you here: we have Characters selected, Words selected, and, most importantly, down here, 'Vertical Navigation', which is not on by default, so you want to select it. You're going to use VoiceOver's tools to navigate and to read content back to you, and the rotor lets you switch the navigation from horizontal, left to right, to vertical, up and down. So you definitely want to select Vertical Navigation.

The Language Rotor is for the situation where you want to change the language; I don't have any other languages selected, so I left that blank and there will never be a Language Rotor. 'Navigate Images' can be set to always, with descriptions, or never; this controls what's read back to you when you're in the web browser. And then 'Speak Notifications': these are all of your auditory and visual alerts, and this will actually speak them back to you. So those are the settings, set the way that I want them. You can go ahead and turn on VoiceOver by just tapping and sliding to the right, and now VoiceOver's on.
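As a developer aside: in addition to the system alerts that Speak Notifications covers, an app can hand its own messages to VoiceOver. A sketch using modern Swift API names (the message text here is made up):

```swift
import UIKit

// Ask VoiceOver to speak an arbitrary announcement.
UIAccessibility.post(notification: .announcement, argument: "Download complete")

// Apps can also check whether VoiceOver is on, e.g. to pause a purely visual effect.
if UIAccessibility.isVoiceOverRunning {
    // adjust behavior for VoiceOver users
}
```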

So we just turned on VoiceOver and you'll see that it just popped up the practice button. VoiceOver practice. So I'm going to swipe across to select it and then doubletap now to launch VoiceOver practice. Okay so.

>> Voiceover: VoiceOver practice heading.

>> Sami: So it just read me the heading. This is just a free area where I can do different gestures; it'll tell me what the gesture is and what it does. I'm just gonna show you a couple of them. This is a flick to the right.

>> VoiceOver: Practice VoiceOver gestures. Commands. And typing in this area. Select the done button in the top right corner and doubletap to exit.

>> Sami: So you just heard the instructions when I flicked to the right. I'm gonna flick to the right again.

>> Sami: So you can see this is giving me a lot of different feedback on each of my gestures, and there are a ton of different gestures; I am not going to be able to do them justice in this video. Notice how when I used a left or right flick, it would advance to the next item or the previous item, but if I flicked up or flicked down, it would do whatever the rotor setting was set to. So again, this is a way of customizing: the rotor is a key component in customizing VoiceOver's usage.

When you first launch VoiceOver, there are two ways of navigating through, in this example, the home screen. The first way I can navigate is I can tap, and what it will do is...

>> VoiceOver: 97 new items. 16 apps. Doubletap to open.

You can see that it read that folder and then it told me to doubletap to open.

>> VoiceOver: Productivity tools folder. 14 apps. Doubletap to open.

>> Sami: Okay so I swiped again on the screen, it moved to the next folder which is productivity tools, it told me there were 14 apps and it told me to doubletap on the screen to open. So that's one way of navigating and I can keep going from upper left-hand corner to lower right-hand corner by swiping one time to the right. A second way of using VoiceOver is if I have limited sight I can swipe across an icon and it will tell me the content of that icon so folder or the name of the icon and it'll also give me hints. So I'm gonna do that with Papercamera first.

>> VoiceOver: Adobe Viewer. Doubletap to open.

>> Sami: I just swiped over across Adobe Viewer and it told me doubletap to open and then I'm now gonna swipe across another folder. And it'll tell me the contents of that folder.

>> VoiceOver: Content tools folder. 6 apps. Doubletap to open.

>> Sami: So I swiped over content tools folder, it told me there were six apps in there and if I doubletap I can open it.

Let's go ahead and doubletap and let's open.

>> VoiceOver: Opening content tools folder. Content tools. Opened.

>> Sami: One of the usage methodologies behind VoiceOver is to drag your finger across items. So I'm gonna show you by dragging my fingers across and down and over, across these app icons and you'll see that VoiceOver starts reading them back to me, so watch.

>> Sami: The way that I did that was: tap, and then drag my finger across whatever I wanted it to read back to me. Now I wanna use VoiceOver to help me navigate from page to page. As you can tell on screen, I'm on the last icon of the page, and if I tap or flick over the page icons at the bottom, I can actually go to the page scroller, so I'm gonna do that.

And now VoiceOver is giving me feedback about what page it is. If I doubletap again I can go to page 2.

>> VoiceOver: Page 2 of 9. Adjustable.

>> Sami: And if I doubletap again I can go to page three.

>> VoiceOver: Page 3 of 9. Adjustable.

>> Sami: Now, if I swipe across an icon or an app, I can then select that app.

>> VoiceOver: Injini

>> Sami: And if I doubletap, that app will open and launch. If I wanna get back home, I press the Home button once and I'm back on the Home screen.

So that's how you go backwards. You can cycle all the way through but often times it's just easier to just go back to the home screen.

VoiceOver is a very powerful feature, and in order to change the settings of how VoiceOver behaves without having to go back to the Settings application, the iOS environment has implemented this rotor concept. So, I have VoiceOver activated. If you remember from the earlier part of the video, when you flick to the right or to the left, you navigate left to right; but if you flick up or down, depending on how you have the rotor set, you get different behavior from VoiceOver. So let's focus on activating the rotor first, changing the setting, and then showing the setting.

The way that you activate the rotor is sort of like turning a dial. It's a two-finger action: you put both fingers down on the iPad surface and you turn them as if you were turning a dial. And that's exactly what I did; I changed it to 'Words'. I'm gonna change it again, turning my little dial: now it's on 'Containers'. Then Headings, and Vertical. Now I'm at vertical navigation. So if I were to flick down, instead of reading the word of the container back to me, it's actually going to go down to the 'Social' folder, and if I go down again, it'll go down to 'Development'.

>> VoiceOver: In background

>> Sami: If I go up, it will go up, and that's because I have vertical navigation set. Now I'm gonna change this one more time, to Characters, and I'm gonna flick up and down, and you'll hear it read the name of the folder letter by letter.

>> VoiceOver: Sami Folder. A M

>> Sami: So that's where you can change to characters, or have it read the word, or scroll vertically. And you can do all of this on the fly while VoiceOver's activated.

>> Sami: VoiceOver can be used as an effective navigational tool on the iPad. So in this specific scenario what I'm gonna do is I'm gonna navigate to a separate page and then I'm gonna launch an app. So the first thing I'm gonna do is I'm gonna swipe over my page navigation.

>> VoiceOver: Page 1 of 9. Adjustable.

>> Sami: That highlights my page navigation.

>> VoiceOver: Finger to adjust the values.

>> Sami: And I'm gonna doubletap to move to the next page.

>> VoiceOver: Page 2 of 9. Adjustable.

>> Sami: And I'm gonna doubletap again to move to the next page.

>> VoiceOver: Page 3 of 9. Adjustable.

>> Sami: And let's say I wanted to launch the alphabet app. The way that I'm gonna do that is I'm gonna swipe, stop my finger at the point of the app, and let go. So I'm gonna swipe and then stop. What that did was select the alphabet app, and now I just doubletap anywhere on the screen and it actually launches the app.

One of the key usage scenarios for VoiceOver is reading email back to the user. So what I have here is my email open. You can see that I get email from Ebay, and I'm gonna just go through how you might process email and how the rotor changes some of the settings. So right now, if I swipe to the right, it'll start reading line by line, so let me do that.

>> VoiceOver: Ebay sent this message to Sami Rahman. Srahman2000.

>> Sami: Okay, so if I wanted to change the rotor (let's say instead of reading it left to right, top to bottom, I wanted to scroll through a little bit faster), I can use the rotor to change that. There's Characters. Words. Again, all of these settings are changeable in my 'Rotor Settings'. But let me try to find... this one will just jump from link to link, which is pretty interesting; that can be very useful. Then Form Control and Vertical Navigation. So I'm gonna start swiping down.

>> Sami: Right, so it tells me that it's an image. I can then read the text and the links underneath it as well. The idea here is that this is one of the settings you can turn on and off inside the system, if that's something you desire. Another key usage scenario for VoiceOver is web surfing. As you can see on this page, there are a lot of links and a lot of information, so as I'm moving left to right, I may want to skip over some of that information. I'll just go through normally first, and then I'll use the rotor. This would be normally going through every line item, left to right. But then I can use the rotor here to change to characters or words. I'm gonna go to links.

>> VoiceOver: Links

>> Sami: Links. So now I'm gonna use the VoiceOver to just switch from link to link.

>> VoiceOver: Three point zero out of five stars.

>> Sami: Turns out the stars are a link.

>> VoiceOver: All reviews.

>> Sami: All reviews button is a link.

>> VoiceOver: Customer reviews.

>> Sami: So in this particular area, notice how I jumped from customer reviews all the way down to the sellers, and I can now keep jumping down the page.

>> VoiceOver: [VoiceOver in the background]

>> Sami: So by switching to links, what I've done is navigate the page faster. I can do the same thing with headings, which may be a little more effective, especially with blogs; or with lines, so that I can just skip lines real quick. And then as I get down to where I want, let's say I wanna read this section, I can turn my rotor to something else, maybe vertical navigation. Now remember, there are two control elements: I can always go left to right by swiping left and right, and then I can use the rotor for a secondary navigation, like line by line, vertical navigation, and so forth. This becomes a really effective way of using VoiceOver to read a webpage.

Let's start with setting up Guided Access. Tap on the Settings app. Tap on the General tab. On the General tab, scroll up or down to find Accessibility. Tap on Accessibility. Using your finger, scroll up or down to find Guided Access. Under General, Accessibility, Guided Access, you will be able to turn Guided Access on. So, to turn it on, I touched the On/Off slider and two settings appeared. First is Set Passcode. Now, when I was first experimenting with this, it wasn't apparent that you had to set a passcode. So, what I'm telling you is, you have to set a passcode; otherwise you're going to run into the same problem that I did. Setting a passcode is really easy. Tap on Set Passcode. It's going to ask you for a four-digit number. For the sake of this demo, I'm going to pick a very easy one; I would recommend that you pick something more complex, but for this demo... There we go. I've now set my passcode and confirmed it.

The other thing is that the iPad assumes that when you are in Guided Access mode, you are facilitating this for someone else's usage and you want the application to be on all the time. If you're in a teaching environment, as an example, and you want the iPad's screen to be able to turn off even while you're staying in that application, then you would enable Screen Sleep; by default it is not enabled.

So, let's see Guided Access in action. The first thing I'm going to do is find an app, to show you how this works. So, this is a one-button app and when I touch the button, it makes a noise. I'll show you. I'll tap on the button right now, and you can see, it made that noise. It says the word, Radical.

Probably the most exciting feature of Guided Access is the ability to shut off the Home button. I'll show you an example of when the Home button is not shut off. In this case, I haven't turned on Guided Access yet, so when I tap the Home button, which I'm going to do right now, it takes me back to the Home screen. Let's get back in the application. If I want to disable that feature, I need to turn on Guided Access. Now, there are other ways of disabling it, like putting a physical barrier over the Home button: a piece of foam core, a cap, or a plastic plate. I've also seen cases that have a slider switch that covers the Home button. So, this is a way of doing it via software, versus hardware.

So, let me show you how to do that. The first thing I need to do is turn Guided Access on for the application and, in order to do that, I triple tap on the Home button. So, one, two, three, and it launches the Guided Access menu. In the lower left-hand corner is the hardware button setting and, as you can see, it is set to always off.

One of the primary features of Guided Access is that, when it's activated, it automatically turns off the Home button. So, let me show you how that works. I need to start Guided Access: in the upper right-hand corner, I tap on the Start button, and you can see by the pop-up that Guided Access is now turned on. If I tap on the Home button, let me show you what happens. So, I'm going to tap on the Home button and I get a message that says, "Guided Access is enabled, triple tap to exit." I'm going to now double tap; again, I get the same message. I have to triple tap. So, now I just triple tapped, and this is where the passcode comes into play. This is why you have to set a passcode; otherwise you couldn't get past this point. So, let me type in our passcode. And that's how Guided Access turns off the Home button.

Guided Access also does two other really great things. The first is that it turns off certain sensors. So, if you have an application where you want to turn off the entire screen, let's say it's a music app that reacts to the motion sensor and you want to turn off the screen to prevent users from doing other things with it, then you can tap on Touch, and you can see now the screen is grayed out. So, if I hit Resume... Let's do that. I'll hit Resume. And if I tap anywhere, which I'm doing on the screen right now, I'm tapping all over the screen and the button isn't being activated. So, that's how to turn the whole screen off. Tap three times, one, two, three, and now add my passcode. Now I'm back in the system.

I'm going to turn Touch back on and that's simply by touching the On/Off button. By tapping the On/Off button, it'll turn back on, and I can also turn off the motion sensor. So, this would be an example, let's say, you have an application like an Etch a Sketch. It's a drawing application that erases itself, or plays something or does something with the motion sensor. You can turn that off as well. I'm not going to demonstrate it here, but you get the idea.

So, the third and final feature of Guided Access that everyone is excited about is the ability to turn off portions of the screen. How would you turn off portions of the screen? The way you do that is by drawing. So, in this Guided Access area, you draw where you want the screen to be turned off. So, I'll show you a couple of different shapes you can draw. So, I can draw a square. I can draw a circle, and I can draw a long rectangle. You can also, as you can see here, draw unusual shapes as well. So, I'm going to exit that.

Let's say I was trying to draw around this area there. You can see that sometimes it makes unusual shapes and sometimes it doesn't, but let's say, for the sake of this demonstration, that I just want to turn off the Radical button; let's say there were other buttons, as an example. So, I'm going to just draw myself a circle. Then, using my finger, I'll tap and slide it over. Then, I'm going to tap and hold on one of the handles and draw this out to be a little bit bigger, and I'm going to exit out of that small one. So, you can see here, I've only turned off a portion of the screen. Now, where this is really exciting is those little setup buttons in the corner, or those buttons that launch out to other applications because they always want you to buy more stuff: you can turn all of those off. What's very, very cool about Guided Access is that this is saved on a per-application basis, so you can go through and set these all up in advance, and then it's turned off per application. These hot spots are per application. So, let me show you this. I'm going to hit Resume to activate the app, and you can now see that in that circle area, my app is deactivated. I'm going to touch it. I just touched it; you can't see me touching it, but I just touched it a couple of times and it does not trigger the Radical button. So those are the key features of Guided Access.

There's one last feature I want to show you, and that is: what happens if you try to put the wrong passcode in? So, one, two, three, and I'm going to put in the wrong passcode and look at what happens. For 10 seconds, I'm locked out. Now, this is interesting, because what if you have a user that just sits there, every 10 seconds, and keeps trying to type in passcodes to defeat your code? I'm going to do it again, and now I'm locked out for 60 seconds. So, that's very interesting; it's a way of keeping people from fixating on the button and sort of managing that process. I can also imagine that some users would be very, very frustrated by this, and that might be a cause of great dissatisfaction. So, you may have to manage that a little bit with your user.

One of the first usage scenarios that came to mind when we started looking at Guided Access is when you're going to use the iPad primarily as a single-use device. So, for example, the thing that immediately comes to mind is using an iPad primarily as an augmentative and alternative communication, or AAC, device. In those scenarios, or often in those scenarios, the user, 95% of the time, is going to be in one application. They may have multiple iPads doing a couple of different things, but they're going to be using this one in one primary function. So, it's perfect.

So, let me show you how to set this up for an AAC user, set up Guided Access for an AAC user, where you wouldn't want them to get out of the app. So, on-screen, I have Speech button. I'm going to tap on Speech button, and I have a communication board set up already. So, what I want to do is just turn on Guided Access. So, triple tap one, two, three, and all I have to do is start. So, now I'm locked in. If I tap on the Home button, it'll give me an error message and now, every time I want to come back to this device, it's always set up, on this page, for AAC or within this app. So, if you had multiple pages, you could scroll within the app. So, it's just perfect for that application.

Now, the other comment that I will make is, if you go to our setup video, there's also a setting that lets the screen dim. So, if you're going to do AAC and the iPad is always going to be locked as an AAC device, or single-use device, then I would also enable Screen Sleep, so that it will protect the battery life.

I want to show you a Guided Access usage scenario we're calling "In App Control". There are times when you're going to want to prevent the user from touching certain buttons inside of an application, so I want to show you how to set that up using Guided Access. On-screen, I'm going to tap on the flash card app, and it launches the flash card app. In the upper left-hand corner, you can see the Facebook and Twitter buttons. If I tap on those buttons, it's going to launch the web browser, knock me out of this application, and put me in a totally different application. So, if I want to prevent the user from doing that, I'm going to use Guided Access. Let me show you how to set that up.

Tap on the home screen three times, one, two, three and now you're in the Guided Access control panel. So, I just want to draw a box around Facebook and Twitter. So, you can see me do that, and there's my box. So, I just now press the Start button and Guided Access is now started, and you can see in the upper left-hand corner that it's grayed out where I do my little box. If I touch on those boxes, it does not launch out Facebook and Twitter. So, that's exactly what I want.

Now, let me show you some problems that we found. In the upper right-hand corner, you can see the Setup button. So, if I tap on the Setup button, you can see that. You can see the Mute button, and you can see a Lock button. Now, I'm going to use Guided Access to draw a shape around those, to prevent my users from accessing them, and I want to show you what happens. One, two, three to launch Guided Access. I need to enter my passcode because I'm in Guided Access mode. Type in my passcode. I'm going to draw a box around that area. There it is; there's my rectangle. I made my box around those three items, and I'm going to press Resume now to start Guided Access again. Now, Facebook: I'm tapping on Facebook, and that didn't work. But if I tap on the Setup button, even though it's grayed out, watch what happens. So, I'm tapping now. There, it launches Setup, and I can hit the Done button. Not only were the Setup button and the Done button underneath the restricted area for Guided Access, but they both worked.

So, what does that mean? It means that buttons have different access levels within an application, but it also means, from an end-user perspective, that you need to test all of your Guided Access setup well before you're going to use it with an end user, to make sure that it works the way you want it to work. Some buttons and some areas of the screen are going to work perfectly. Other areas, like Setup buttons, may or may not work, depending on whether the developer has written for Guided Access or not, and depending on how that program is written for that button. So, test. That's the answer.
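For developers curious what "written for Guided Access" can mean in practice: iOS lets an app detect a Guided Access session and adjust itself. This is a sketch using modern Swift API names (the hidden settings button is a hypothetical example, not from the apps shown in the video):

```swift
import UIKit

// Hypothetical button an app might hide while locked down.
let settingsButton = UIButton(type: .system)

// True when the device is currently in a Guided Access session.
if UIAccessibility.isGuidedAccessEnabled {
    settingsButton.isHidden = true
}

// Observe entering or leaving Guided Access while the app runs.
NotificationCenter.default.addObserver(
    forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
    object: nil, queue: .main
) { _ in
    print("Guided Access enabled:", UIAccessibility.isGuidedAccessEnabled)
}
```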

I think the usage scenario that gets the most people excited about Guided Access is the idea that they don't have to use a plastic plate, or keep their hand over the Home button, to prevent users from switching from app to app. So, let me show you how you would do that. If I'm in a group setting and I want to use this application, I launch it, and then I need to put it in Guided Access mode: triple tap and then hit Start. Then, I can hand it off to a user, and if the user tries to get out of the app by hitting the Home button, they get the familiar message.

The problem that we see is in an environment where you're switching from app to app to app using Guided Access: this just needs to be managed, because you have a couple of extra steps. I've got to triple tap, enter my passcode, and end Guided Access. I then need to hit the Home button, launch the second app, then triple tap again and hit Start for the second app. So, as long as it's managed, you can work around the extra time it's going to take.

Here's how to set up AssistiveTouch. Tap the settings app. Tap General. Scroll down until you find Accessibility and tap Accessibility. Scroll down again until you find AssistiveTouch and tap AssistiveTouch.

So, now you're in the Settings application, under the General tab, under the Accessibility tab, under the AssistiveTouch tab. Here you want to tap and slide the AssistiveTouch slider to turn it on. You're going to notice two things: a little button appeared on the right-hand side, and custom gesture settings appeared.

Let me focus on the button first. This button is the way that you activate AssistiveTouch, and there are eight positions for the button - one, two, three, four, five, six, seven, eight. They will slide into place if you let go. The other thing that's interesting about the button - I'm going to go to the homepage and I'll show this to you.

When I go to Search, so I'm going to slide and Search, you'll notice that the AssistiveTouch button automatically came up and moved out of the way of the keyboard. So, it's an intelligent button, and it'll try to stay out of your way until you need it.

I'm going to show you how to record a custom gesture for AssistiveTouch. Tap on the settings app. Tap on the General tab, and then tap and scroll down until you get to Accessibility. Tap on Accessibility. Tap and scroll down until you get to AssistiveTouch. Under the AssistiveTouch tab, you'll see Create New Gesture, so tap on that.

Then you're given a screen where you can, anytime you touch it, you'll record a gesture and then you can save that. So, I'm going to record a flick up gesture, which allows me to scroll down the screen. So, I'm going to do that. I'm going to tap down and flick up. You can see that it automatically went into record mode and recorded that gesture, and you can see it onscreen.

I'm going to now stop, and then play back that gesture to make sure it's what I want, and it is what I want. Now, in the upper right-hand corner, I'm going to hit Save, and it'll give me a dialog, so I'm going to type in something meaningful. When I'm in a web browser, or even in the settings menu, this gesture actually allows me to scroll down, so I'm going to call it "scroll down". I'm going to hit Save, and now you can see "scroll down" in my custom gestures menu.

If I wanted to delete it, let's say I made a mistake or I no longer need it, all I have to do is hit the Edit button, which is in the right-hand corner, then tap that "Do Not Enter" sign, and then tap Delete. I'm not going to do that right now, because I want to show you how the gesture works, but that's all there is to deleting one.

Before I do a tour of the extensive menuing system of AssistiveTouch, I want to talk a little bit about the actual menu itself and how it works. In order to activate the menu, tap on the little white button, and in my case it's on the right-hand side. So, I'm going to tap on that. You'll see four items in that button. The top three items actually lead into other menus, and we'll get into that in a second.

One of the things that's not apparent, though... let's talk about exiting AssistiveTouch first. I can tap anywhere outside of the menu itself, and that will exit out of AssistiveTouch. So, I've just tapped on the home screen and it exited out; that's one item.

The second item that I want you to know - and I'm going to launch AssistiveTouch again - is that in the middle, especially when you get into gestures, both custom gestures and stock gestures, to deactivate a gesture, you tap in the center of the menu, and that also will turn the menu off. So, it does two things. It deactivates a gesture, and turns the menu off, so that's really important.

The other thing you'll note, for those of you familiar with iOS 5: this is a beta version of iOS 6, and the AssistiveTouch menu is much larger in the new iOS 6 than it is in the old iOS 5. There are also some new menu items, so I did want to point that out.

So, that's the menu; now let's get into a little bit of action here. The first thing you're going to see is the Home button at the bottom. If I tap on it, which I will do right now, it does exactly what the physical Home button does. So, I tap on it, and notice that it shut down the AssistiveTouch menu and moved me from the home screen to the search pad; this is exactly the same behavior as touching the physical Home button.

I will show you. I'm going to tap and scroll over to the home page, and I'm now going to tap on the physical Home button. Tap, and you'll see that it did exactly what it did when I did it with the AssistiveTouch menu.

The next submenu I want to focus on in AssistiveTouch is the Gestures submenu. So, I'm going to tap on AssistiveTouch to activate the main menu, and then tap on Gestures. I'm presented with five options: I can mimic a two-finger, three-finger, four-finger, or five-finger gesture, or I can use the arrow to get back out of the menu system. I have multitasking gestures turned on, so if I use four fingers onscreen and swipe upwards, I get the multitasking bar.

So, I'm going to show you how to do that with one finger. I'm going to use one finger to tap on the four-fingered gesture. You can see onscreen I have four fingers: one, two, three, four. I'm going to tap and flick up, and then watch. As you can see, I've just opened up the multitasking bar. To get rid of it, I can tap and flick down.

The reason I'm able to do that is that I have multitasking gestures turned on, but that gives you a good sense. I did all of this with one finger, so it's a good example of how the gesture system works. Now, notice that I still have those four fingers onscreen. To turn off the gesture once it's activated, I need to tap on the AssistiveTouch menu again, and then tap in the center. That deactivates the last gesture.

One of the most extensive submenus of the AssistiveTouch menu is the Device submenu. So, tap on the AssistiveTouch button, tap on Device, and you get the first of three submenus. In the upper left-hand corner is Lock Screen. If I tapped on this, it would do exactly the same thing the Wake/Sleep button does. For the sake of this demonstration I won't actually tap on it, but it would cause the screen to go blank and put the iPad to sleep.

The next button over is the Rotate Screen submenu. So, I'm going to tap on that, and you get four options plus the ability to go back. If I tap on Portrait, you'll actually see the screen rotate, and then I can tap on Left and that will bring me back. I can rotate the screen a full 360 degrees, tapping the center to get back out.

The next button is the Lock Rotation button. If I tap on that, you'll see the lock icon appear in the upper right-hand corner, and if I were to move the iPad, the screen would not rotate. So, I'm going to unlock that.

The next two buttons, before we get to the More button, are Volume Up and Volume Down. I'm going to tap twice on Volume Up: tap, tap, and the volume goes up. Now I'll tap twice on Volume Down, and the volume comes back down. So, these work exactly like the hardware buttons.

Next, when I hit the More button, I'm taken to another submenu. This is a new submenu in iOS 6, and this is a beta version of iOS 6, so this may not be the final configuration. Previously, in iOS 5, the Shake button existed, but it was in the first submenu rather than in this sub-submenu. There are also new options here. The Triple Click Home button mimics pressing the physical Home button three times.

I need to program it, in the Accessibility settings, to do something. I don't have it programmed here, but you could have it programmed to launch and deactivate AssistiveTouch, as an example. Then there's Screenshot. I'm actually going to do that one: I tap on it, and it takes a screenshot. It shuts down the AssistiveTouch menu and captures the actual screen.

Device, More, Multitasking brings up the multitasking bar. So, I'm going to tap on that, and you can see it appear. If I tap the same button again, it shuts the multitasking bar down. So: AssistiveTouch, Device, More submenu, tap Multitasking, and it brings the bar down and takes me back to where I was.

The last one, under Device, More, is Shake. When a developer has programmed for it, Shake will mimic the device being shaken. It doesn't actually shake the device; it just tells the app that the shake sensor has gone off.

I was very excited about the Shake button, because if you have the device mounted to a wheelchair, Shake lets you use applications that rely on the motion sensor. But in my testing it really hasn't panned out, because not every developer supports it, which is unfortunate.
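To see why Shake only works in some apps: AssistiveTouch's Shake button delivers the same motion event a physical shake does, but an app only reacts if its developer implemented the motion-event callback. A minimal sketch in Swift (modern API names, not the iOS 6-era Objective-C; the class name is illustrative):

```swift
import UIKit

// A view controller that responds to the shake gesture -- whether the
// device is physically shaken or AssistiveTouch's Shake button is tapped.
class ShakeAwareViewController: UIViewController {

    // The view controller must be first responder to receive motion events.
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    // UIKit calls this for real and simulated shakes alike.
    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        guard motion == .motionShake else { return }
        // Respond to the shake here, e.g. undo the last action.
        print("Shake received")
    }
}
```

Apps that skip this override simply ignore the shake, which is why the feature "hasn't panned out" universally.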

The last submenu I want to discuss is the Favorites submenu, which is where custom gestures can be accessed within the AssistiveTouch menu. Tap on AssistiveTouch, and then tap on Favorites. Earlier, I created a scroll-down custom gesture, which is marked with the star. The default is the Pinch gesture. So, let me illustrate how to use Pinch, and then I'll illustrate the scroll-down gesture.

So, the best application I can think of to illustrate pinch is the photo album. So, I'm going to tap out of AssistiveTouch, I'm going to tap on photos, and you're going to see our camera roll. I'm going to tap into one of the photographs to bring it up. Then, I'm going to bring up the pinch gesture.

So, I'm going to tap on AssistiveTouch, tap on Favorites, and tap on Pinch. Dragging outward makes the photo bigger, dragging inward makes it smaller, exactly as if you were pinching with two fingers. Wherever I tap, I still have the pinch gesture turned on. So, I want to turn that off now, because I want to show you the other gesture. I tap on AssistiveTouch and hit the center of the AssistiveTouch menu, and that turns off the pinch gesture.

Now, for the next one, I'm going to go back to the home page and tap on the Settings application. I'm using my finger to scroll up and down - you see how long that is? Well, if you have a user who has a hard time with fine motor control, you can turn on AssistiveTouch, tap Favorites, and use that scroll-down gesture I just made. Now, just by tapping on the screen once, it scrolls every time. There it is. The same is true over here.

We're going to use three different types of visual indicators. We'll use a hand every time I tap, an icon every time I press a button on the outside of the device, and arrows to help point out key information onscreen.

One of the usage scenarios is invoking custom gestures that you've created. One of the hardest things to do when you first get an iPad, especially for a user with fine motor issues, is unlocking the iPad when it's turned off. Once you turn it on, you've got to unlock it to get in. What I've done, inside of Settings, under Accessibility, under AssistiveTouch, is create a custom gesture called "swipe right". We're going to use that to unlock the screen.

So, I'm going to go ahead and turn this off, and then turn it back on by hitting the Home button. Now I've got to unlock it. Well, if I have fine motor issues, I may have a hard time unlocking that lock. So, I can tap on the AssistiveTouch menu, tap on Favorites, and Swipe Right, and then all I have to do is tap the button and it unlocks it for me.

The next usage scenario I want to cover with AssistiveTouch is the idea of button replacement. So, I'm going to scroll and find an application here that has some music, and we'll use the AssistiveTouch menu to control the volume instead of the physical buttons. Let me show you. Tap on Flashcards, and you can hear the music in the background. I can tap on the AssistiveTouch menu, hit Device, and increase or decrease the volume.

I can do this - I'm going to turn this all the way down - without even touching the buttons. For users who have fine motor issues, pressing the Volume button, especially inside of a case, can be very, very difficult. The same holds true for lock rotation, screen rotation, and a number of other functions where using the physical buttons or the little slider switch may be a problem. It may simply be easier to hit a soft button onscreen than a hard button on the physical device.

Resource List: The first place I'm going to take you is the Bridging Apps website. This is a community of parents, teachers, therapists, doctors, and individuals with disabilities, and we share information both in physical meetings and online via the community tools.

We also have a search engine called "Insignio", where you can live-search iTunes as well as Android. We have about 1,000 special needs apps, of which almost 400 have been completely reviewed. The reviews are done with a special needs user and a therapist or special education teacher.

We have very fixed criteria: we're not diagnosis-based, we're skill-based. We track more than 180 skills from ages 0 to 18, so from birth to adulthood. Beyond those 180+ skills, we also have five keystones.

They are auditory, cognitive, fine motor, language, and visual impairment, and those are all tracked from zero months of age through 18 and into adulthood. So, you can search by typical age range as well as by skill.

I also link you to the Apple website. Apple has pretty extensive manuals on all of their accessibility features. And for those who are particularly interested in switch access, Jane Farrell [SP] has a number of great resources; she maintains a list of switch-accessible apps and switch interfaces. Bridging Apps has a list as well, but she really is a great source for that.

Instructor Biography

Sami Rahman is the CEO of SmartEdTech, which develops software that helps children with disabilities learn and grow. Mr. Rahman holds certifications in the Assistive Technology Applications Program offered by California State University and in Mobile Devices for Children with Disabilities from TCEA. He is the author of Getting Started: iPads for Special Needs, available in print with a full version online for free.