
VoiceOver Your TalkBack To Me: A Mobile Screen Reader Overview

iOS VoiceOver, Android TalkBack

Automated accessibility scanner tools are a great help when testing websites and apps, but they only get you so far. Screen Readers bridge the gap, helping your accessibility efforts reach users across a range of impairments.

Using Screen Readers

Text-to-Speech Screen Reader technology (generally referred to as simply “Screen Readers”) can be used to help people who are blind or otherwise visually impaired to use and enjoy websites and applications. Onscreen content and instructions can be read aloud by the screen reader software so that the user is able to read, navigate, and access the content and functionality available to sighted users.

Because most Screen Reader users don’t use a mouse, a variety of keyboard commands (for desktop and laptop computers) and touch commands (for mobile handheld devices) are available. Here, we’ll focus on some of the commands for mobile Screen Readers.

For developers, using modern, valid HTML will get your website or application most of the way toward a good Screen Reader experience, but to optimize that experience your code will need some explicit configuration, as sketched after the list below.

Additional Code Elements Include:

  • Accessibility Identifiers: a mobile element identification ID that allows for easy traversal of screen elements. It eliminates possible XPath errors and can also be used for QA Automation
  • “alt” attributes for images: these attributes are applied to HTML tags, and are best used to provide concise descriptions of images on the screen
  • ARIA attributes:
    • “aria-label”: used to provide additional description, often used for interactive elements that don’t have an accessible name (e.g. icons)
    • “role”: used to describe elements that don’t natively exist in HTML, or elements that do exist but don’t yet have full browser support
    • Various ARIA state and property attributes (see https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA/ARIA_Techniques)
  • Headers (h1-h6): used properly, these form a sort of outline of the content of a page
  • Landmarks: a way to identify the various regions of a page (e.g. navigation, main content, etc.)
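To make these hooks concrete on the native side, here is a minimal SwiftUI sketch; the view, image, and identifier names are made up for illustration. accessibilityLabel plays the role of an “alt” attribute or “aria-label”, the .isHeader trait plays the role of a heading, and accessibilityIdentifier is the element ID that automation tools can target.

    import SwiftUI

    // Hypothetical screen illustrating the accessibility hooks listed above.
    struct OrderSummaryView: View {
        var body: some View {
            VStack(alignment: .leading, spacing: 12) {
                // Counterpart of an h1 heading: VoiceOver announces "Order Summary, heading".
                Text("Order Summary")
                    .accessibilityAddTraits(.isHeader)

                // Counterpart of an image "alt" attribute: a concise spoken description.
                Image("shipping-truck")
                    .accessibilityLabel("Package out for delivery")

                // Counterpart of "aria-label" on an icon-only control, plus an
                // identifier that UI automation can target instead of an XPath query.
                Button(action: { /* reorder the items */ }) {
                    Image(systemName: "arrow.clockwise")
                }
                .accessibilityLabel("Reorder")
                .accessibilityIdentifier("reorderButton")
            }
            .padding()
        }
    }

In a UI test, that identifier can be queried directly (for example, app.buttons["reorderButton"].tap() in XCUITest) rather than through an XPath lookup, which is what makes it useful for QA Automation as well.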

The two most widely used mobile screen readers are VoiceOver (iOS) and TalkBack (Android).

Users need to learn a set of touch gestures to use these mobile screen readers. The most basic gestures are tapping to select items and swiping right or left to select the next or previous item, respectively. As items are selected, their onscreen text, and perhaps other information, is spoken by the screen reader voice.

Just as a sighted user can visually scan pages displayed on the screen to get to the info or functionality they want more quickly, screen reader users can navigate with gestures in a similar fashion; they are not locked into going through all the content in sequence. Additional gestures are also available for more complex interactions.

VoiceOver

VoiceOver provides the ability to traverse page elements with gestures, keystrokes, and Voice Recognition. It also supports braille displays – users can connect a Bluetooth® wireless braille display to read the output if they prefer that to the voice output.

VoiceOver offers a wider range of gestures beyond the basic swiping/tapping gestures. 

Additional VoiceOver Gestures:

  • Mute/Unmute VoiceOver: Three-finger double-tap
  • Use a standard gesture: Double-tap and hold your finger on the screen until you hear three rising tones, then make the gesture. When you lift your finger, VoiceOver gestures resume.
  • Open the Item Chooser: Two-finger triple tap
  • Rotor: Rotate two fingers on the screen as if turning a dial to pick a navigation setting (such as headings, links, or words), then swipe up or down to move by that setting
  • “Magic Tap”: Two-finger double-tap; this is the only gesture that can be programmed to perform different actions in different apps (see the sketch below)
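As an example of that programmability, an iOS app can override accessibilityPerformMagicTap() to map the two-finger double-tap to its most relevant action. Below is a minimal UIKit sketch, assuming a hypothetical media player screen; returning true tells VoiceOver the gesture was handled.

    import AVFoundation
    import UIKit

    // Hypothetical player screen: the Magic Tap toggles playback.
    class PlayerViewController: UIViewController {
        let player = AVPlayer()

        override func accessibilityPerformMagicTap() -> Bool {
            if player.timeControlStatus == .playing {
                player.pause()
            } else {
                player.play()
            }
            return true // handled; VoiceOver stops looking for another handler
        }
    }

If nothing on the current screen handles the gesture, it travels up the responder chain, so an app can also provide a single app-wide default action.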

Full gesture list for VoiceOver: https://support.apple.com/guide/iphone/learn-voiceover-gestures-iph3e2e2281/ios

TalkBack

TalkBack also provides eyes-free navigation and control of applications, allowing users to use gestures and voice commands. It supports braille as well, including an on-screen keyboard for typing braille, so users who wish to type in braille don’t need to connect a physical braille keyboard.

Additional TalkBack Gestures:

  • Mute/Unmute TalkBack: Three-finger double tap
  • Explore by touch: Drag a finger on the screen to hear the item below your finger spoken
  • Magnification settings: Triple tap
  • Move to the next/previous reading control: Three-finger swipe down/up, respectively

Full gesture list for TalkBack: https://support.google.com/accessibility/android/answer/6151827?hl=en

The use of Screen Readers can greatly benefit your accessibility testing efforts and help ensure you are reaching as much of your user base as possible. Screen Readers empower users to skip pages, control video, dismiss alerts, and even mute the screen reader voice, all with touch gestures! So consider adding Screen Reader support to bolster your accessibility testing efforts.

For more information about Perficient’s Mobile Solutions expertise, subscribe to our blog or contact our Mobile Solutions team today!


Rob Asher

Rob Asher is a QA Lead on the Mobile and Modern Web Solutions team at Perficient. He has many years of experience in the digital world and a passion for usable and accessible web and native apps.
