iPhone, iPad and iPod touch come with assistive features that have changed the learning landscape for students with special needs. These innovative technologies allow every student to experience the fun and function of iOS.

iOS devices are fun and powerful learning tools for students with attention deficits or other cognitive and learning disabilities. Teachers can minimise visual stimulation by limiting access to a single app, and students can use FaceTime and Camera to communicate with more than just words.

Guided Access

Guided Access helps students with autism or other attention and sensory challenges stay on task. A teacher or therapist can limit an iOS device to stay on one app by disabling the Home button, and even restrict touch input on certain areas of the screen. So wandering taps and gestures won’t distract from learning.
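For app developers supporting these classrooms, iOS also exposes Guided Access programmatically. The sketch below is illustrative only: `requestGuidedAccessSession(enabled:)` is a real UIKit API, but it only succeeds on supervised (school-managed) devices; on an ordinary student device a teacher starts Guided Access by triple-clicking the Home button instead.

```swift
import UIKit

// Sketch: checking and requesting Guided Access from inside an app.
// Assumes a supervised (MDM-managed) device; on unmanaged devices the
// request simply fails and Guided Access must be started manually.
func lockToThisApp() {
    if UIAccessibility.isGuidedAccessEnabled {
        print("Guided Access is already active")
        return
    }
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        // succeeded is false on unsupervised devices
        print(succeeded ? "Locked to this app" : "Request was denied")
    }
}
```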

Speak Selection

Hearing a word as it’s being read can help with comprehension for a wide range of learners. Speak Selection can read a student’s email, iMessages, web pages and e-books out loud. Double-tap to highlight text in any application, tap Speak, and the device reads the selected text. Students can have words highlighted as they’re being read so they can follow along. And the voice’s dialect and speaking rate can be adjusted to suit students’ needs.
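The same speech engine behind Speak Selection is available to apps through AVFoundation, so educational apps can read their own content aloud. A minimal sketch, with illustrative values for the voice dialect and speaking rate:

```swift
import AVFoundation

// Sketch: text-to-speech with an adjustable dialect and speaking rate.
// "en-AU" and 0.45 are example settings, not defaults.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-AU") // dialect
    utterance.rate = 0.45 // speaking rate, between 0.0 and 1.0
    synthesizer.speak(utterance)
}

speak("Students can have words read aloud as they follow along.")
```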

Siri

For some students, typing can be a challenge. Siri, Apple’s built-in personal assistant, can help students do the things they do every day — just by asking. They can say “Remind me to hand in my history assignment on Friday” or “Text Mum and say I’m staying back after school”. Siri can help students who struggle with organisation by scheduling activities and setting reminders. And Siri is integrated with VoiceOver — an advanced screen reader — so blind and vision-impaired students can ask where the nearest music shop is and hear the answer read out loud.1

Dictation

For students with print disabilities like dyslexia, it may be easier to speak a thought than to type it. With Dictation they can reply to an email, make a note, search the web or write a report using just their voice. Tap the microphone button and Dictation converts words (and numbers and characters) into text.

iBooks

iBooks Author gives teachers a way to create customised learning materials for iPad to support a wide range of learning needs. Interactive features like 3D images, video, audio and photo galleries provide multimodal learning opportunities that make iBooks textbooks more engaging to learners. Features like multicolour highlighting, notes, search, study cards and the glossary help students be better organised and better prepared. Built-in review questions give students an immediate assessment of their knowledge so they understand where to focus more study time. And iBooks supports VoiceOver, Speak Selection and closed-captioned videos to help all types of learners.

Safari Reader

For some students, navigating the web can mean sensory overload. Safari Reader reduces the visual clutter on a web page by removing distractions. It strips away ads, buttons and navigation bars, allowing students to focus on just the content they want. And Safari Reader works with Speak Selection and VoiceOver, so students with print disabilities can get auditory feedback.

Dictionary

Stumbling across unfamiliar words is bound to happen when reading new texts or learning new subjects. Students can look up words by using the dictionary integrated in iOS.† They’ll have quick access to definitions and commonly used phrases to help with grammar, spelling and pronunciation — even if they’re offline.

FaceTime

With FaceTime, students can easily make video calls over Wi-Fi. And FaceTime can be a window into the classroom, letting students who are stuck at home or in hospital engage with the rest of the class, or letting a therapist observe a student in action without disrupting the teacher’s lesson. Thanks to its high-quality video and fast frame rate, FaceTime is also ideal for students who communicate using sign language. Every gesture and facial expression shows in crystal-clear detail. And because FaceTime comes standard on the Mac, iPhone, iPad and iPod touch, students can use it to communicate with other OS X or iOS users.2

Photo Booth

Students can use Photo Booth to take and share snapshots, giving them another way to communicate. For instance, students who struggle with personal interaction — like answering a direct question — may find it easier to see their own face on the screen in order to begin communicating. And because Photo Booth is integrated with the built-in iSight camera, it displays images the moment they’re captured.

Camera

Every iOS device comes with a built-in camera that can be used for still photos and video. Therapists working with students can capture examples of student behaviour or model expected behaviour. Speech pathologists and physical therapists can record therapy sessions to document student progress. Teachers can record classes, experiments and excursions to share with students who are hospitalised or can’t leave home. Or students can record themselves completing assignments — to test reading fluency, for example — and send them to the teacher.

Word Prediction

Word prediction in iOS can help students who have dyslexia or cognitive challenges — or those learning English — improve their vocabulary and word-building skills. iOS suggests the correct spelling after just a few letters are typed. With Speak Auto-text enabled, students hear a sound effect and the suggested word spoken. They can keep typing to ignore the word, or tap the Space bar to have iOS type it. So students can learn new words without struggling to spell them correctly.

Photos and iMovie

For students who have a hard time communicating their thoughts in written words, Photos and iMovie allow them to express themselves through multimedia. With the built-in camera and Photos, many aspects of learning that are traditionally print oriented can be captured in a concrete, visual way. This can help students who struggle with reading or are learning English. And teachers can create photo books to teach social situations and life skills or to model appropriate behaviour, so students can refer to these stories for future use.

With iMovie, students may find the process of writing both the visual and the audio elements of a script — and the overall excitement of making a movie — more engaging than other kinds of narrative writing assignments. iMovie can also help strengthen sequential ordering skills, and give students the chance to use visual and spatial strengths to develop their storytelling skills.

Speech

Students who have difficulty with expressive speech can benefit from the assistive features built into iOS. FaceTime lets students communicate visually — through sign language, gestures or facial expressions.2 iMessage lets students chat with classmates about homework via text.3 And Speak Selection lets them hear words read aloud to help with speech development. It can even communicate for them by speaking the words they type.

Students who are blind or have low vision can use VoiceOver, an advanced screen reader, to get the most from their iOS devices. And Siri and Dictation help students type, launch apps and read their calendars.

VoiceOver

VoiceOver is a gesture-based screen reader that lets students know what’s happening on their Multi-Touch screen — and helps them navigate it — even if they can’t see it. Students can triple-click the Home button wherever they are in iOS to access VoiceOver. They can hear a description of everything happening on the screen, so they can tell which app their finger is on, find a passage in an essay or have an e-book read aloud. Students can also adjust the VoiceOver speaking rate and pitch to suit their needs.
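Because VoiceOver speaks whatever description an app supplies, custom classroom apps work best when each onscreen element carries an accessibility label. A brief sketch, assuming a hypothetical hand-in button; the label and announcement text are illustrative:

```swift
import UIKit

// Sketch: making a custom control friendly to VoiceOver users.
// The button, icon and wording are hypothetical examples.
let submitButton = UIButton(type: .system)
submitButton.setImage(UIImage(systemName: "paperplane"), for: .normal)
submitButton.accessibilityLabel = "Hand in assignment"          // what VoiceOver speaks
submitButton.accessibilityHint = "Sends your work to the teacher"

// Apps can also check whether VoiceOver is running and announce changes.
if UIAccessibility.isVoiceOverRunning {
    UIAccessibility.post(notification: .announcement,
                         argument: "Assignment submitted")
}
```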

Siri

Siri, Apple’s built-in personal assistant, can help students do the things they do every day — just by asking. They can say “Remind me to hand in my history assignment on Friday” or “Text Mum and say I’m at the library”. Siri can make phone calls, send messages, schedule meetings, set reminders and more. And Siri is integrated with VoiceOver — an advanced screen reader — so blind and vision-impaired students can ask where the nearest music shop is and hear the answer read out loud.1

Dictation

Dictation lets students talk where they would type. They can reply to an email, make a note, search the web or write a report using just their voice. Tap the microphone button and Dictation converts words (and numbers and characters) into text.

Zoom

Zoom is a built-in screen magnifier that works anywhere in iOS, so students can better read an essay, view a diagram or get details on a map. And it works with all apps from the App Store. A simple double-tap with three fingers instantly zooms to 200 per cent, and the magnification can go up to 500 per cent. While zoomed in, everything works as usual. Students can use all the familiar gestures to navigate their device. And Zoom works with VoiceOver, so they can better see — and hear — what’s happening on the screen.

Invert Colours

If higher contrast helps students better see what’s on the screen, iOS lets them invert the colours shown onscreen. This works with text, graphics and even video. Once the colours are set, the settings apply system-wide, so the view stays consistent no matter which app they’re using. Invert Colours can also be used with Zoom and VoiceOver.

FaceTime helps students who are deaf or hard of hearing communicate and stay connected. And closed captions and mono audio let them get the most out of the content on their iPhone, iPad and iPod touch.

FaceTime

Thanks to its high-quality video and fast frame rate, FaceTime is ideal for students who communicate using sign language. Every gesture and facial expression is in crystal-clear detail. And because FaceTime comes standard on the Mac, iPhone, iPad and iPod touch, students can talk to OS X or iOS users in a classroom down the hall — or halfway around the world. As if they’re face to face.2

Mono Audio

Stereo recordings usually have distinct left- and right-channel audio tracks. So students who are deaf or hard of hearing in one ear may miss some of the audio contained in that channel. iOS can help by playing both audio channels in both ears, and lets students adjust the balance for greater volume in either ear, so they can experience all the audio in a lecture, video or musical composition.

Closed Captions

Closed captions give visual learners a way to follow the dialogue and sounds in a video, helping with comprehension. Captions appear onscreen in easy-to-read white type on a black background. iOS supports closed captioning — as well as open captions and subtitles — across a wealth of educational materials, from podcasts in iTunes U courses to embedded videos in iBooks textbooks.

GarageBand

GarageBand can help improve auditory comprehension among students who are deaf or hard of hearing — particularly those adjusting to new cochlear implants. Teachers can create podcasts of conversational speech and download them to a Mac, iPhone, iPad or iPod touch. Students use the podcasts to learn inflection and how to differentiate one voice from another. GarageBand is also great for speech therapy, learning tonal languages like Chinese or helping deaf students gain an understanding of how loud things sound by watching an audio waveform.

Innovative iOS technologies make the Multi-Touch screen more accessible to those with physical or motor challenges. And features like Siri let students control their iPhone, iPad and iPod touch just by talking.

AssistiveTouch

iOS devices feature high-precision, touch-sensitive displays that require no physical force, just simple contact with the surface. And AssistiveTouch allows students with limited motor capabilities to adapt the Multi-Touch screen of their iOS device to their needs. So more complicated Multi-Touch gestures, like a pinch or multifinger swipe, are accessible with just the tap of a finger. Or students can create customised gestures. And if they have trouble pressing the Home button, they can activate it with an onscreen tap. Gestures like rotate and shake are available even if the iOS device is mounted on a wheelchair. And for students who need assistive devices such as joysticks, iOS devices also support a number of third-party options.

Siri

Siri, Apple’s built-in personal assistant, can help students do the things they do every day — just by asking. They can say “Remind me to hand in my history assignment on Friday” or “Text Mum and say I’m at the library”. Siri can make phone calls, send messages, schedule meetings, set reminders and more, all with minimal physical touch. And Siri is integrated with VoiceOver — an advanced screen reader — so students can ask where the nearest art supply shop is, and hear the answer read out loud.1

Dictation

Dictation lets students talk where they would type. They can reply to an email, make a note, search the web or write a report using just their voice. Tap the microphone button and Dictation converts words (and numbers and characters) into text.

Apple Footer

1. Siri is available in Beta and requires Internet access. Siri may not be available in all languages or in all areas, and features may vary by area. Mobile data charges may apply.

2. FaceTime video calling requires a FaceTime-enabled device for the caller and recipient, and a Wi-Fi connection. Availability over a mobile network depends on carrier policies; data charges may apply.

3. Normal carrier data rates may apply. Messages may be sent as SMS when iMessage is unavailable; carrier messaging fees apply.

† Dictionary is available in English (US), English (UK), Chinese (Simplified), Japanese, German, Spanish and French.