iOS devices are fun and powerful learning tools for students with attention deficits or other cognitive and learning disabilities. Teachers can minimize visual stimulation by limiting access to a single app, and students can use FaceTime and Camera to communicate with more than just words.
Guided Access helps students with autism or other attention and sensory challenges stay on task. A teacher or therapist can limit an iOS device to a single app by disabling the Home button, and even restrict touch input in certain areas of the screen, so wandering taps and gestures won't distract from learning.
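For developers building classroom apps, iOS also reports Guided Access status through the UIAccessibility API, so an app can simplify its own interface while a session is running. A minimal Swift sketch, assuming a hypothetical lesson app (the banner text and helper name are illustrative):

```swift
import UIKit

/// Text for a status banner in a hypothetical lesson app.
func focusBannerText(guidedAccessOn: Bool) -> String {
    guidedAccessOn ? "Focus mode on: stay in this activity"
                   : "Focus mode off"
}

// At launch, and whenever a teacher starts or ends a session:
let token = NotificationCenter.default.addObserver(
    forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
    object: nil, queue: .main) { _ in
    let text = focusBannerText(
        guidedAccessOn: UIAccessibility.isGuidedAccessEnabled)
    print(text)  // In a real app, update a label instead.
}
```

The system handles locking the device to the app; the sketch only shows how an app can notice that state and respond.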
Hearing a word as it’s being read can help with comprehension for a wide range of learners. Speak Selection can read a student’s email, iMessages, web pages, and ebooks out loud. Double-tap to highlight text in any application, tap Speak, and the device reads the selected text. Students can have words highlighted as they’re being read so they can follow along. And the voice’s dialect and speaking rate can be adjusted to suit students’ needs.
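Speak Selection itself is a system feature, but the same adjustable speech engine is available to apps through AVFoundation's AVSpeechSynthesizer. A hedged sketch of reading a passage aloud at a chosen rate and dialect (the helper name and sample text are illustrative):

```swift
import AVFoundation

// Illustrative helper mirroring what Speak Selection does:
// read a passage aloud at an adjustable rate, in a chosen dialect.
func makeUtterance(_ text: String, language: String, rate: Float) -> AVSpeechUtterance {
    let utterance = AVSpeechUtterance(string: text)
    // e.g. "en-GB" vs. "en-US" to match a student's dialect.
    utterance.voice = AVSpeechSynthesisVoice(language: language)
    // Clamp to the range AVFoundation accepts (0.0 slow ... 1.0 fast).
    utterance.rate = min(max(rate, AVSpeechUtteranceMinimumSpeechRate),
                         AVSpeechUtteranceMaximumSpeechRate)
    return utterance
}

let synthesizer = AVSpeechSynthesizer()
synthesizer.speak(makeUtterance("Photosynthesis converts light into energy.",
                                language: "en-GB", rate: 0.4))
```

Slower rates give struggling readers more time to follow along, which is exactly the adjustment the system feature exposes in Settings.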
iBooks Author gives teachers a way to create customized learning materials for iPad to support a wide range of learning needs. Features like multicolor highlighting, notes, search, study cards, and the glossary help students be better organized and better prepared. Built-in review questions give students an immediate assessment of their knowledge so they understand where to focus more study time. And iBooks supports VoiceOver, Speak Selection, and closed-captioned videos to help all types of learners.
For some students, navigating the web can be a sensory overload. Safari Reader reduces the visual clutter on a web page by removing distractions. It strips away ads, buttons, and navigation bars, allowing students to focus on just the content they want. And Safari Reader works with Speak Selection and VoiceOver, so students with print disabilities can get auditory feedback.
Stumbling across unfamiliar words is bound to happen when reading new texts or learning new subjects. Students can look up words by using the dictionary integrated in iOS. They’ll have quick access to definitions and commonly used phrases to help with grammar, spelling, and pronunciation — even if they’re offline.
With FaceTime, students can easily make video calls over Wi-Fi. FaceTime can be a window into the classroom, letting students who are housebound or hospitalized engage with the rest of the class, or letting a therapist observe a student in action without disrupting the teacher's lesson. Thanks to its high-quality video and fast frame rate, FaceTime is also ideal for students who communicate using sign language. Every gesture and facial expression shows in crystal-clear detail. And because FaceTime comes standard on the Mac, iPhone, iPad, and iPod touch, students can use it to communicate with other OS X or iOS users.1
Students can use Photo Booth to take and share snapshots, giving them another way to communicate. For instance, students who struggle with personal interaction — like answering a direct question — may find it easier to see their own face on the screen in order to begin communicating. And because Photo Booth is integrated with the built-in iSight camera, it displays images the moment they’re captured.
Every iOS device comes with a built-in camera that can be used for still photos and video. Therapists working with students can capture examples of student behavior or model expected behavior. Speech-language pathologists and physical therapists can record therapy sessions to document student progress. Teachers can record classes, experiments, and field trips to share with hospitalized or housebound students. Or students can record themselves completing assignments, such as reading-fluency exercises, and send them to the teacher.
Word prediction in iOS can help students who have dyslexia or cognitive challenges or are learning English improve their vocabulary and word-building skills. iOS suggests the correct spelling after just a few letters are typed. With Speak Auto-text enabled, students hear a sound effect and the suggested word spoken. They can keep typing to ignore the word, or tap the Space bar to accept it. So students can learn new words without struggling to spell them correctly.
Photos and iMovie
For students who have a hard time communicating their thoughts in written words, Photos and iMovie allow them to express themselves through multimedia. With the built-in camera and Photos, many aspects of learning that are traditionally print oriented can be captured in a concrete, visual way. This can help students who are struggling readers or learning English. And teachers can create photo books to teach social situations and life skills or model appropriate behavior, so students can refer back to these stories later.
With iMovie, students may find the process of writing both the visual and the audio elements of a script — and the overall excitement of making a movie — more engaging than other kinds of narrative writing assignments. iMovie can also help strengthen sequential ordering skills, and give students the chance to use visual-spatial strengths and develop their storytelling skills.
Students who are blind or have low vision can use VoiceOver, an advanced screen reader, to get the most from their iOS devices.
VoiceOver is a gesture-based screen reader that lets students know what's happening on their Multi-Touch screen — and helps them navigate it — even if they can't see it. Students can triple-click the Home button at any time to turn VoiceOver on or off. They can hear a description of everything happening on the screen, so they can know which app their finger is on, find a passage in an essay, or have an ebook read aloud. And students can adjust the VoiceOver speaking rate and pitch to just the way they want.
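VoiceOver works out of the box with the built-in apps, and developers can make their own interfaces speak just as clearly through the UIAccessibility API. A brief sketch, assuming a hypothetical custom chart view (the label and announcement strings are illustrative):

```swift
import UIKit

// Give a purely visual element a spoken description under VoiceOver.
let chartView = UIView()
chartView.isAccessibilityElement = true
chartView.accessibilityLabel = "Reading progress"
chartView.accessibilityValue = "Four of five chapters complete"

// Apps can also adapt when VoiceOver is on, for example replacing
// a visual cue with a spoken announcement.
if UIAccessibility.isVoiceOverRunning {
    UIAccessibility.post(notification: .announcement,
                         argument: "Chapter downloaded")
}
```

With the label and value set, VoiceOver reads "Reading progress, four of five chapters complete" when a student touches the chart, instead of saying nothing at all.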
If higher contrast helps students better see what's on the screen, iOS lets them invert the onscreen colors. This works with text, graphics, and even video. Once set, the inverted colors apply systemwide, so students get the same view in every app. And it can be used together with Zoom and VoiceOver.
Zoom is a built-in screen magnifier that works anywhere in iOS, so students can better read an essay, view a diagram, or get details on a map. And it works with all apps from the App Store. A simple double-tap with three fingers instantly zooms in to 200 percent, and the magnification can go up to 500 percent. While zoomed in, everything works as usual: students can use all the familiar gestures to navigate their device. And Zoom works with VoiceOver, so they can better see — and hear — what's happening on the screen.
FaceTime helps students who are deaf or hard of hearing communicate and stay connected. And closed captions and mono audio let them get the most out of the content on their iPhone, iPad, and iPod touch.
Thanks to its high-quality video and fast frame rate, FaceTime is ideal for students who communicate using sign language. Every gesture and facial expression shows in crystal-clear detail. And because FaceTime comes standard on the Mac, iPhone, iPad, and iPod touch, students can talk to OS X or iOS users in a classroom down the hall — or halfway around the world. As if they're face to face.1
Stereo recordings usually have distinct left- and right-channel audio tracks. So students who are deaf or hard of hearing in one ear may miss some of the audio contained in that channel. iOS can help by playing both audio channels in both ears, and lets students adjust the balance for greater volume in either ear, so they can experience all the audio in a lecture, video, or musical composition.
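The idea behind mono audio can be sketched in a few lines: each output sample becomes the average of the left and right channels, so content from either channel reaches both ears. This is illustrative only — iOS performs the mixing at the system level, and the function name here is hypothetical:

```swift
// Illustrative only: iOS does this mixing systemwide.
// Each mono sample is the average of the left and right channels.
func mixToMono(left: [Float], right: [Float]) -> [Float] {
    zip(left, right).map { ($0 + $1) / 2 }
}

// A sound present only in the left channel, then only in the right:
let mono = mixToMono(left: [1.0, 0.0], right: [0.0, 1.0])
// Both samples now carry signal: [0.5, 0.5]
print(mono)
```

A student who hears with only one ear receives `0.5` in both positions instead of silence in one of them, which is exactly the effect the feature describes.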
GarageBand can help improve auditory comprehension among deaf and hard of hearing students — particularly those adjusting to new cochlear implants. Teachers can create podcasts of conversational speech and download them to a Mac, iPhone, iPad, or iPod touch. Students use the podcasts to learn inflection and how to differentiate one voice from another. GarageBand is also great for speech therapy, for learning tonal languages like Chinese, and for helping deaf students understand how loud sounds are by viewing the audio waveform.
Innovative iOS technologies make the Multi-Touch screen more accessible to those with physical or motor challenges.
iOS devices feature high-precision, touch-sensitive displays that require no physical force, just simple contact with the surface. And AssistiveTouch allows students with limited motor capabilities to adapt the Multi‑Touch screen of their iOS device to their needs. So more complicated Multi‑Touch gestures, like a pinch or multi‑finger swipe, are accessible with just the tap of a finger. Or students can create custom gestures. And if they have trouble pressing the Home button, they can activate it with an onscreen tap. Gestures like rotate and shake are available even if the iOS device is mounted on a wheelchair. And for students who need assistive devices such as joysticks, iOS devices also support a number of third-party options.