Mobile Screen Reader Practice
Practice using a screen reader on a mobile device and learn how it differs from the desktop experience.
Overview
In this lesson, you will learn the basics of using screen readers in a mobile environment. You might imagine the ideas are similar to desktop, but touch-based devices take a very different approach. You will learn to do similar things, like navigate content, announce structure, and interact with form controls. This lesson covers some of the most common screen reader commands, explains how content is read aloud, and highlights key accessibility considerations on iOS and Android.
Goals
You have the following goals for this lesson:
- Learn about how screen readers are used on touch devices
- Learn basic gestures and commands on touch devices
- Observe how mobile screen readers announce content
- Practice navigating a web page using a mobile screen reader
- Recognize platform-specific challenges
Warm up
Desktop screen readers rely on structural navigation to get around; linear navigation is, after all, very slow. Tablets and phones offer the same capability, but through a jarringly different experience that takes some getting used to. Further, behavior is inconsistent between operating systems, and even more inconsistent across browsers within each operating system. What do you think might be different, and how do you imagine navigation might work in all these scenarios?
Vocabulary
You will be learning about the following vocabulary words:
Term | Definition |
---|---|
VoiceOver Rotor | An iOS feature that lets users change navigation options (like by headings or links) using a circular dial gesture. |
Explore By Touch | A method where users move their finger around the screen to hear item names and information. |
TalkBack | Android’s screen reader that provides spoken feedback and uses gestures for navigation and interaction. |
Context Menu | A TalkBack menu that offers actions specific to the selected item or screen context. |
Gesture | A specific finger movement (like swipe or tap) that performs a screen reader command. |
Web Content Accessibility Standards
This lesson covers the following standards:
- WCAG 2.2 - 1.1.1 Non-text Content
- WCAG 2.2 - 1.3.1 Info and Relationships
- WCAG 2.2 - 1.3.2 Meaningful Sequence
- WCAG 2.2 - 1.3.3 Sensory Characteristics
- WCAG 2.2 - 2.5 Input Modalities
- WCAG 2.2 - 2.4.3 Focus Order
- WCAG 2.2 - 2.4.6 Headings and Labels
- WCAG 2.2 - 3.3.2 Labels or Instructions
Explore
While desktop screen readers like NVDA or JAWS rely on a more traditional computer setup, mobile screen readers are designed for smaller, touch-based devices. Navigation and interaction on mobile screen readers thus need to work with gestures and vibrotactile feedback. Exactly what they should do in this situation is an active topic of academic research.
The two major screen readers you will encounter when working with mobile devices are TalkBack and VoiceOver. TalkBack is built into Android devices, while VoiceOver is built into iOS devices. While the name VoiceOver is the same as in the previous sections, it is a different tool that even uses different libraries under the hood. Both screen readers rely heavily on touch-based gestures such as swipes, taps, and pinches. This shift in interaction model requires a different approach to accessibility: you need to learn not only how to navigate by touch, but also, on a technical level, how to manage rather significant inconsistencies across these platforms.
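As one concrete illustration, here is a minimal sketch in browser TypeScript (the element names and handlers are illustrative, not from this lesson) of a strategy that sidesteps many of these inconsistencies: preferring native semantic elements over custom widgets. A real button is generally exposed and activated consistently by both TalkBack and VoiceOver, while a bare div with a click handler may not be focusable or announced at all.

```typescript
// Custom control: TalkBack and VoiceOver may skip this entirely,
// since a bare <div> has no role and is not focusable by default.
const fakeButton = document.createElement("div");
fakeButton.textContent = "Submit";
fakeButton.addEventListener("click", () => console.log("submitted"));

// Native control: generally announced as "Submit, button" on both
// platforms and activated by the standard double-tap gesture.
const realButton = document.createElement("button");
realButton.textContent = "Submit";
realButton.addEventListener("click", () => console.log("submitted"));

document.body.append(fakeButton, realButton);
```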
Turning The Screen Reader On
Mobile platforms vary in how screen readers are turned on. Further, because Android runs on a wide range of devices, there may be several ways to enable the screen reader depending on the device. The most common way to start TalkBack on an Android device is to first open the Settings app. From there, go to Accessibility and find the TalkBack option under the Screen readers section. Select it and toggle the switch to enable TalkBack. Once enabled, TalkBack will provide spoken feedback for all on-screen elements, and you can begin using touch gestures to navigate your device.
To enable VoiceOver on an iOS device, open the Settings app and navigate to Accessibility. Select the VoiceOver option and toggle the switch to turn it on. You will see a confirmation message letting you know that VoiceOver will change the way your device responds to touch gestures. Once activated, VoiceOver will read aloud on-screen text and interface elements, and you can start using the touch gestures specific to VoiceOver for navigation and interaction.
Navigation
With a desktop screen reader, you can still use your computer as normal; the screen reader runs alongside and only adds functionality in specific cases. On mobile, this is not really the case: the screen reader takes over how you interact with the device. You can still tap specific spots on the screen, but to navigate with a mobile screen reader on, you will need to learn how the gestures change.
TalkBack Gesture (one-finger) | Action |
---|---|
Swipe Right | Move to next item |
Swipe Left | Move to previous item |
Swipe Down | Jump to next item based on reading control settings |
Swipe Up | Jump to previous item based on reading control settings |
Double Tap | Activate selected item |
Swipe Right then Left | Scroll forward |
Swipe Left then Right | Scroll backward |
Swipe Up then Left | Go to Home screen |
Swipe Down then Left | Go back |
Swipe Up then Right | Open Local Context Menu |
Swipe Down then Right | Open Global Context Menu |
VoiceOver Gesture | Action |
---|---|
One-finger Tap | Speak an item |
One-finger Swipe Right | Move to next item |
One-finger Swipe Left | Move to previous item |
One-finger Swipe Down | Jump to next item based on Rotor setting
One-finger Swipe Up | Jump to previous item based on Rotor setting
One-finger Double Tap | Activate the selected item |
Two-finger Swipe Up | Speak the screen from the top |
Two-finger Swipe Down | Speak the screen from current item |
Two-finger Tap | Pause or continue speaking |
Two-finger Double Tap | Start or stop current action |
Three-finger Swipe Up | Scroll down one page
Three-finger Swipe Down | Scroll up one page
Three-finger Double Tap | Mute or unmute VoiceOver
For high-level context, the most crucial point to understand is that with the screen reader on, navigation is again all about structure. On touch devices, this is rational when you think about it: if you have vision challenges, moving your finger around the screen hunting for something is certainly possible, but clunky. Navigating with gestures instead lets you move quickly through high-level items to find what you want. If you remember nothing else about navigation, swipes and double tap can get you pretty far.
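To see why structure matters so much here, consider a minimal sketch in browser TypeScript (the page content is illustrative). With the reading control in TalkBack or the Rotor in VoiceOver set to headings, a one-finger swipe down jumps directly between the heading elements below instead of reading every node in between.

```typescript
// Headings form the skeleton that swipe-based navigation jumps along.
const page = document.createElement("main");
page.innerHTML = `
  <h1>Store</h1>
  <h2>New arrivals</h2>
  <p>Browse the latest products.</p>
  <h2>Sale items</h2>
  <p>See what is discounted this week.</p>
`;
document.body.append(page);
```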
TalkBack Context Menus
TalkBack features a series of menus that allow you to quickly change certain settings and controls. There is a global context menu with options that apply anywhere on the device, and a local context menu with options relevant to the content currently on the screen.
Global Context Menu
To access the global context menu, swipe down then right with one finger in a single motion. This menu offers several important options for screen reader users. You can choose to have the screen read from the top or from the next item, repeat the last thing that was spoken, or activate screen search to find specific content. The menu also includes options for adjusting language or speech settings, accessing TalkBack preferences, and using text editing tools. These features allow users to better control how content is read aloud and make navigation more efficient.
Local Context Menu
To access the local context menu, swipe up then right with one finger in a single motion. The menu that appears changes based on what is currently open. For example, if you are in Chrome, there will be navigation options like headings, links, and paragraphs.
Also note that because Android is available on a wider range of devices, the local context menu can vary between devices. For example, Samsung devices ship their own customized version of TalkBack, so the menu may look different from that of a Google Pixel. While it would be ideal if this were standardized, or at least more consistent across Android, in practice these inconsistencies mean it can take quite a bit of testing to figure out what users will actually experience at a technical level. Although it works differently, the local context menu serves a role broadly similar to VoiceOver's Rotor.
VoiceOver Rotor
The Rotor in VoiceOver for iOS is built around the same idea: make it easier to navigate complex content. It allows you to jump between different types of content, such as headings, links, and form fields, using gestures. The Rotor provides a quick way to reach specific elements on a web page or within an app without manually swiping through each one, making it particularly useful when working with long or complex documents.
To pull up the Rotor in VoiceOver on iOS, you need to use a two-finger twist gesture, similar to rotating a dial. This action allows the Rotor to appear with a selection of options tailored to the content on the screen. Once activated, the Rotor will display different navigation choices, such as Headings, Links, Form Controls, Text, Landmarks, and more, depending on the content being viewed.
![A screenshot of the VoiceOver Rotor on iOS, showing a circular dial with options like Words, Headings and Links. Two fingers are shown twisting on the screen to demonstrate how to use the Rotor gesture. Source: BAE Systems Accessibility [1].](/media/Accessibility/MobileScreenReaderPractice/VoiceOverRotorImage.png)
After twisting the Rotor to an option, a one-finger swipe up or down moves between items of that type. For example, if you are reading a web page and select Headings from the Rotor, you can swipe up or down to quickly move between the headings on the page. If you want to jump between links, select Links from the Rotor and then swipe to move through the hyperlinks. This saves the time and effort of stepping through each element in order to find a specific section or link.
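The Rotor's categories come from the semantics of the page. Here is a minimal sketch in browser TypeScript (the markup is illustrative): the nav and main elements generally surface under Landmarks, the h1 under Headings, the anchor under Links, and the labelled input under Form Controls.

```typescript
// Semantic markup populates the Rotor categories on a web page.
document.body.innerHTML = `
  <nav aria-label="Primary">
    <a href="/articles">Articles</a>
  </nav>
  <main>
    <h1>Search the archive</h1>
    <label for="query">Search term</label>
    <input id="query" type="search" />
  </main>
`;
```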
Explore by Touch
If you use a mobile device, you are probably used to navigating by looking at where an item is physically located on the screen and touching that spot. Mobile screen readers have a feature that closely resembles this, called Explore by Touch. Explore by Touch allows you to navigate content by physically moving your finger around the screen. When your finger is over an item, the screen reader reads it aloud, typically accompanied by haptic feedback.
Explore by Touch is an attempt to help you feel the content on the screen. It mimics the way a sighted person scans the screen visually by translating that into a tactile and audio experience of moving your finger across the screen. While Explore by Touch is useful and offers a more intuitive way to start navigating with a mobile screen reader, it is still important to understand how to get around with gestures alone, ignoring the physical positioning of objects on the screen.
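Explore by Touch only works as well as the accessible names under the user's finger. Here is a minimal sketch in browser TypeScript (the labels are illustrative): the first button contains only a symbol glyph, which may be announced unhelpfully, while the aria-label on the second gives the screen reader a clear name to speak.

```typescript
// Without a text label, touch exploration has little to announce.
const unlabeled = document.createElement("button");
unlabeled.textContent = "\u2715"; // icon-style glyph only

// aria-label overrides the glyph with a meaningful accessible name,
// so Explore by Touch can announce "Close dialog, button".
const labeled = document.createElement("button");
labeled.textContent = "\u2715";
labeled.setAttribute("aria-label", "Close dialog");

document.body.append(unlabeled, labeled);
```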
Citations
- BAE Systems. VoiceOver – iPhone, iPad, iPod Touch iOS 11. Accessibility. Figure 21. BAE Systems website.
Engage
Mobile screen readers offer a very different approach to using mobile devices than you might be used to. To understand the experience and needs of a mobile screen reader user, actually using one is again likely the most useful way to learn. This is especially true for the mobile web: using mobile screen readers yourself is critical when testing for accessibility in practice.
In this activity, you will explore what it is like to use a mobile device with a screen reader and gestures only. This is real usage of the tools that blind and low-vision users rely on every day to read, navigate, and interact with content on these devices.
Directions
Different smartphones and operating systems (like iOS and Android) behave differently when it comes to accessibility features, so for this activity, you will work in groups. If possible, try to find partners who have different smartphones than you; the more variety of devices, the better. Once you have formed groups, choose the appropriate screen reader for each device: TalkBack for Android devices and VoiceOver for iOS devices.
Take a moment to get comfortable with the basic gestures on each of the devices:
- One-finger swipes left and right
- Double tap to activate
- Two-finger swipes to scroll
- Explore by Touch
Note that with a mobile screen reader, common mobile gestures might not work as they would normally. Being able to turn the screen reader on or off for your own testing can be helpful if you are not already comfortable with it.
Now, each group will pick applications to explore. Pick at least two different types of applications, including one native app and one webpage.
It is important to try both types of applications because web browsers sometimes have inconsistencies that do not exist in native apps, especially in regard to gestures. Once you have chosen your two apps, choose a task for partners in the group to try. Have one person attempt it while talking aloud and the others observe. Here are some example tasks you might consider:
- Find and open a specific setting in the app
- Navigate to a specific article and read out a paragraph
- Check text messages and read them aloud (if appropriate)
This is a talk-aloud exercise, and mobile screen readers can arguably be somewhat unintuitive. As such, your goal is not just to play around, but to listen to your partner describe their experience as they interact with the application.
Wrap up
The digital space has become increasingly mobile-centered, bringing new challenges for usability and accessibility. By the end of this lesson, you should have a foundational understanding of how mobile screen readers work and how important they are for non-visual access to digital content. You should also have used the basic gestures and seen how structure and semantics shape screen reader output. While not discussed here, writing code that maintains accessibility and consistency across these devices can be quite complex. What do you think the primary programmatic challenges might be?
Next Tutorial
In the next tutorial, we will discuss Introduction to ARIA, which describes the basics of ARIA and how it affects assistive technologies.