
Creating Accessible Android Apps: Assistive Technologies

Whenever you design an Android app, you want as many people as possible to download and use that app, but this can only happen if your app is accessible to everyone—including people who access their Android devices via assistive features, or who experience mobile apps without elements such as colour or sound.

In my last post about Creating Accessible Android Apps, I showed you how to provide the best experience for everyone who uses your app by optimizing your application for the accessibility features that are baked into every Android device. I also covered accessibility best practices, and how to really put your app’s accessibility to the test before sending it out into the world.

By the time you’ve completed this article, you’ll know how to create applications that integrate with screen readers, directional controls, and Switch devices, plus other handy Android accessibility features such as closed captions.

Supporting Assistive Technologies

An assistive technology or accessibility feature is a piece of software or hardware that makes devices more accessible. Android has a number of accessibility features built in, and there are many apps and even external devices that people can download or purchase in order to make their Android devices better fit their needs. 

In the same way that you optimize your Android apps to work well with the touchscreen and different screen configurations, you should optimize your app for these accessibility services.

Optimizing for assistive technologies is one of the most important steps in creating an accessible app, so in this section I’m going to cover all the major accessibility services and show how to optimize your app to provide a better experience for each of these services. 

Supporting Screen Readers

Users with vision-related difficulties may interact with their Android devices using a screen reader, which is a speech synthesizer that reads text out loud as the user moves around the screen. 

Recent releases of Android typically come with Google’s Text-to-Speech (TTS) engine pre-installed. To check whether TTS is installed on your device:

  • Open your device’s Settings app.
  • Navigate to Accessibility > Text-to-speech output
  • Check the Preferred engine value—this should be set to Google text-to-speech engine.

The TTS engine powers various screen readers, including Google’s TalkBack, which is the screen reader I’ll be using:

  • Download Google TalkBack from the Google Play store.
  • Navigate to Settings > Accessibility.
  • Select TalkBack.
  • Push the slider to the On position. 

If you own a Samsung device, then you may have the Voice Assistant screen reader pre-installed. Voice Assistant is a port of Google TalkBack that has many of the same features, so you typically won’t need to install TalkBack if you already have access to Voice Assistant. 

Navigating in Screen Readers

Most screen readers support two methods of navigation: 

  • Linear navigation. Delivers audio prompts as the user moves around the UI in a linear fashion, either by swiping left or right or by using a directional control (which is another accessibility service we’ll be looking at shortly).
  • Explore by Touch. The screen reader announces each UI element as the user touches it.

It’s important to test your application using both linear navigation and the Explore by Touch methods.

Note that some people may use TalkBack alongside the BrailleBack application and an external, refreshable braille display. Braille support isn’t something you can fully test without purchasing a braille display, but if you’re interested in learning more about these devices, then there are plenty of braille display videos on YouTube.

You can also use the BrailleBack app to preview how your app’s text will render on a braille display. Once BrailleBack is installed, navigate to Settings > Accessibility > BrailleBack > Settings > Developer options > Show Braille output on screen. Navigate back to the main BrailleBack screen, push the slider into the On position, and BrailleBack will then add an overlay that displays the braille cells for whichever screen you’re currently viewing.

Now that you’ve set up your screen reader (and optionally, BrailleBack) let’s look at how you can optimize your app for this accessibility service. 

Adding Content Descriptions

Text labels add clutter to the screen, so wherever possible you should avoid adding explicit labels to your UI. 

Communicating a button’s purpose using a trashcan icon rather than a Delete label may be good design, but it does present a problem for screen readers, as there’s nothing for that screen reader to read! 

You should provide a content description for any controls that don’t feature visible text, such as ImageButtons and CheckBoxes, and for visual media such as images. 

These content labels don’t appear onscreen, but accessibility services such as screen readers and braille displays will announce the label whenever the corresponding UI element is brought into focus. 

You add a content description to a static element using the android:contentDescription attribute:
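
For example, here’s a minimal sketch of a “delete” ImageButton whose purpose is communicated to screen readers through its content description (the drawable and string resources are assumptions):

    <!-- A delete button represented by a trashcan icon. The content description
         gives screen readers and braille displays a label to announce. -->
    <ImageButton
        android:id="@+id/button_delete"
        android:layout_width="48dp"
        android:layout_height="48dp"
        android:src="@drawable/ic_trashcan"
        android:contentDescription="@string/delete_item" />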

If you’re adding a content description to a control that may change during the Activity or Fragment’s lifecycle, then you should use setContentDescription() instead:
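
For instance, here’s a rough sketch of a favourites toggle whose spoken label is updated at runtime; the view ID, string resources, and isFavourite flag are all assumptions:

    ImageButton favouriteButton = (ImageButton) findViewById(R.id.button_favourite);

    // Update the spoken label whenever the button's meaning changes.
    if (isFavourite) {
        favouriteButton.setContentDescription(getString(R.string.remove_from_favourites));
    } else {
        favouriteButton.setContentDescription(getString(R.string.add_to_favourites));
    }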

Crafting the perfect content description is a tricky balancing act, as providing too much information can often be just as bad as providing too little information. If your content descriptions are unnecessarily detailed, or you add content descriptions to elements that the user doesn’t need to know about, then that’s a lot of white noise for them to navigate in order to make sense of the current screen. 

Your content descriptions need to be helpfully descriptive, independently meaningful, and provide just enough context for the user to be able to successfully navigate your app. 

To avoid overwhelming the user with unnecessary information: 

  • Don’t include the control’s type in your content descriptions. Accessibility services often announce the control’s type after its label, so your “submit button” description may become “submit button button.”
  • Don’t waste words describing a component’s physical appearance. The user needs to know what’ll happen when they interact with a control, not necessarily how that control looks.
  • Don’t include instructions on how to interact with a control. There are many different ways to interact with a device besides the touchscreen, so telling the user to “tap this link to edit your Settings” isn’t just adding unnecessary words to your content description, it’s also potentially misleading the user. 
  • Don’t add content descriptions to everything. Screen readers can often safely ignore UI elements that exist solely to make the screen look nicer, so you typically don’t need to provide a content description for your app’s decorative elements. You can also explicitly instruct a View not to respond to an accessibility service, by marking it as android:contentDescription="@null" or android:importantForAccessibility="no" (Android 4.1 and higher), as shown in the sketch after this list.
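
As a rough illustration, a purely decorative image might be hidden from accessibility services like this (the drawable is an assumption):

    <!-- A decorative divider that accessibility services can safely skip. -->
    <ImageView
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:src="@drawable/decorative_divider"
        android:importantForAccessibility="no"
        android:contentDescription="@null" />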

Users must be able to identify items from their content description alone, so each content description must be unique. In particular, don’t forget to update the descriptions for reused layouts such as ListView and RecyclerView.

Once you’re satisfied with your content descriptions, you should put them to the test by attempting to navigate your app using spoken feedback only, and then make any necessary adjustments. 

Don’t Drown Out Screen Readers

Some screen readers let you adjust an app’s audio independently of other sounds on the device, and some even support “audio ducking,” which automatically decreases the device’s other audio when the screen reader is speaking. However, you shouldn’t assume that the user’s chosen screen reader supports either of these features, or that they’re enabled.

If your app features music or sound effects that could potentially drown out a screen reader, then you should provide users with a way of disabling these sounds. Alternatively, your app could disable all unnecessary audio automatically whenever it detects that a screen reader is enabled. 
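
One possible approach, sketched below, is to query Android’s AccessibilityManager and only start non-essential audio when no spoken-feedback service is running; startBackgroundMusic() is a hypothetical helper:

    // AccessibilityManager lives in android.view.accessibility;
    // AccessibilityServiceInfo lives in android.accessibilityservice.
    AccessibilityManager accessibilityManager =
            (AccessibilityManager) getSystemService(Context.ACCESSIBILITY_SERVICE);

    boolean screenReaderActive = accessibilityManager != null
            && accessibilityManager.isEnabled()
            && !accessibilityManager.getEnabledAccessibilityServiceList(
                    AccessibilityServiceInfo.FEEDBACK_SPOKEN).isEmpty();

    if (!screenReaderActive) {
        startBackgroundMusic(); // Hypothetical helper that plays your app's music.
    }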

Don’t Rely on Visual Cues 

It may be common practice to format links as blue, underlined text, but people who are experiencing your UI as a series of screen reader prompts may be unaware of these visual cues.  

To make sure all users are aware of your app’s hyperlinks, either:

  • Phrase your anchor text so that it’s clear this piece of text contains a hyperlink.
  • Add a content description.
  • Extract the hyperlink into a new context. For example, if you move the link into a button or a menu item, then the user will already know that they’re supposed to interact with this control. 

Consider Replacing Timed Controls

Some controls may disappear automatically after a period of time has elapsed. For example, video playback controls tend to fade out once you’re a few seconds into a video. 

Since screen readers only announce a control when it gains focus, there’s a chance that a timed control could vanish before the user has a chance to focus on it. If your app includes any timed controls, then you should consider making them permanent controls when your application detects that a screen reader is enabled, or at least extend the amount of time this control remains onscreen. 
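
As a sketch, you might only schedule the auto-hide when Explore by Touch is inactive; playbackControls and HIDE_DELAY_MS are assumptions:

    AccessibilityManager am =
            (AccessibilityManager) getSystemService(Context.ACCESSIBILITY_SERVICE);

    if (am == null || !am.isTouchExplorationEnabled()) {
        // No screen reader is exploring the screen, so it's safe to fade the controls out.
        playbackControls.postDelayed(
                () -> playbackControls.setVisibility(View.GONE), HIDE_DELAY_MS);
    }
    // Otherwise, leave the controls onscreen until the user dismisses them.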

Don’t Rely on Colours

Unless you include them in your content descriptions, screen readers won’t communicate colour cues to your users, so you should never use colour as the sole means of communicating important information. This rule also helps ensure your app is accessible for people who are colour-blind, or who have problems differentiating between certain colours. 

If you use colour to highlight important text, then you need to emphasise this text using other methods, for example by providing a content description, sound effects, or haptic (touch-based) feedback when this text is brought into focus. You should also provide additional visual cues for people who are colour-blind, such as varying the font size or using italic or underline effects.

Switch Access and Directional Controls

Users with limited vision or manual dexterity issues may operate their device using directional controls or Switch Access, rather than the touchscreen. 

1. Testing Your App’s Switch Access 

Switch Access lets you interact with your Android device using a “switch,” which sends a keystroke signal to the device, similar to pressing an OK or Select button.

In this section, we’ll be creating separate ‘Next,’ ‘Previous’ and ‘Select’ switches, but it’s also possible to create a ‘Select’ switch and have Switch Access cycle through the screen’s interactive elements on a continuous loop. If you’d prefer to test your app using this auto-scan method, then navigate to Settings > Accessibility > Switch Access > Settings > Auto-scan.

Android supports the following switches:

  • The device’s hardware buttons, such as Home or Volume Up/Volume Down. This is typically how you’ll test your app’s switch support, as it doesn’t require you to purchase a dedicated switch device.
  • An external device, such as a keyboard that’s connected to your Android device via USB or Bluetooth. 
  • A physical action. You can use your device’s front camera to assign the “switch” feature to a physical action, such as blinking your eyes or opening your mouth. 

To enable Switch Access:

  • Navigate to Settings > Accessibility > Switch Access.
  • Select Settings in the upper-right corner. 
  • Select the Next, Previous, and Select items in turn, press the hardware key you want to assign to each action, and then tap Save.
  • Navigate back to the main Switch Access screen, and push the slider into the On position. 

You can disable Switch Access at any point, by navigating to Settings > Accessibility > Switch Access and pushing the slider into the Off position.

2. Testing Your App’s Directional Control Support 

Directional controls let the user navigate their device in a linear fashion, using Up/Down/Left/Right actions, in the same way you use your television remote to navigate the TV guide.

Android supports the following directional controls:

  • The device’s hardware keys.
  • An external device that’s connected via USB or Bluetooth, such as a trackpad, keyboard, or directional pad (D-pad).
  • Software that emulates a directional control, such as TalkBack gestures.

Designing for Directional Controls and Switch Access

When the user is interacting with your app using Switch Access or a directional control, you need to ensure that: 

  1. They can reach and interact with all of your app’s interactive components.
  2. Focus moves from one UI control to the next in a logical fashion. For example, if the user presses the Right button on their directional control, then focus should move to the UI element they were expecting. 

If you’re using Android’s standard Views, then your controls should be focusable by default, but you should always put this to the test. 

To check that all of your interactive components are focusable via Switch Access, use your switches to navigate from the top of the screen to the bottom, ensuring that each control gains focus at some point. 

The easiest way to test your app’s directional control support is to emulate a directional pad on an Android Virtual Device (AVD).

The downside is that this requires editing your AVD’s config.ini settings. Note that the following instructions are written for macOS, so if you’re developing on Windows or Linux, then the steps may be slightly different. 

  • Open a ‘Finder’ window and select Go > Go to Folder… from the toolbar.
  • In the subsequent popup, enter ~/.android/avd and then click Go.
  • Open the folder that corresponds to the AVD you want to use.
  • Control-click the config.ini file and select Open with > Other...
  • Select a text editing program; I’m opting for TextEdit.
  • In the subsequent text file, find the hw.dPad=no line and change it to hw.dPad=yes. Save this file. 
  • Launch your application on the AVD you’ve just edited.
  • Select the More button (where the cursor is positioned in the following screenshot). 
[Screenshot: the More button in the Android Virtual Device (AVD) toolbar]
  • Select Directional pad from the left-hand menu.
  • You can now navigate your application using an emulated directional pad.
[Screenshot: navigating the app using the emulated D-pad]

 Android’s standard UI controls are focusable by default, but if you’re struggling to focus on a particular control then you may need to explicitly mark it as focusable, using either android:focusable="true" or View.setFocusable().
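
For example, a minimal sketch for a custom control that’s being skipped during navigation (the view ID is an assumption):

    // Allow the control to receive focus from directional controls and Switch Access.
    View customControl = findViewById(R.id.custom_control);
    customControl.setFocusable(true);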

You should also check that the focus order moves from one UI element to the next in a logical fashion, by navigating around all of your app’s controls, in all directions. (Don’t forget to test in reverse!)

Android determines each screen’s focus order automatically based on an algorithm, but occasionally you may be able to improve on this sequence by changing the focus order manually. 

You can specify the View that should gain focus when the user moves in a certain direction, using the following XML attributes: android:nextFocusUp, android:nextFocusDown, android:nextFocusRight, and android:nextFocusLeft.

For example, imagine you have the following layout: 

[Screenshot: a layout consisting of a Button, an EditText, and a CheckBox]

By default, when the Button control is in focus:

  • Pressing Down will bring the CheckBox into focus.
  • Pressing Right will bring the EditText into focus. 

You can switch this order using the android:nextFocus attributes. In the following code:

  • Pressing Down brings the EditText into focus.
  • Pressing Right brings the CheckBox into focus. 
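
Here’s a minimal sketch of what such a layout might look like; the IDs, positioning attributes, and placeholder text are assumptions:

    <RelativeLayout
        xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="match_parent"
        android:layout_height="match_parent">

        <!-- Override the default focus order: pressing Down now moves to the
             EditText, and pressing Right moves to the CheckBox. -->
        <Button
            android:id="@+id/button"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Button"
            android:nextFocusDown="@+id/edittext"
            android:nextFocusRight="@+id/checkbox" />

        <!-- Positioned to the Button's right, so it would normally gain focus
             when the user presses Right. -->
        <EditText
            android:id="@+id/edittext"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_toRightOf="@id/button"
            android:hint="EditText" />

        <!-- Positioned below the Button, so it would normally gain focus
             when the user presses Down. -->
        <CheckBox
            android:id="@+id/checkbox"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_below="@id/button"
            android:text="CheckBox" />

    </RelativeLayout>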

Alternatively, you can modify the focus order at runtime using setNextFocusDownId(), setNextFocusForwardId(), setNextFocusLeftId(), setNextFocusRightId(), and setNextFocusUpId().
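
Here’s a rough equivalent of the XML sketch above, assuming the same view IDs:

    Button button = (Button) findViewById(R.id.button);

    // Reproduce the same focus order in code rather than XML.
    button.setNextFocusDownId(R.id.edittext);
    button.setNextFocusRightId(R.id.checkbox);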

Simplify Your Layouts

Simpler layouts are easier for everyone to navigate, but this is particularly true for anyone who’s interacting with your app using Switch Access or a directional control. 

When testing your app’s navigation, look for any opportunities to remove elements from your UI. In particular, you should consider removing any nesting from your layouts, as nested layouts make your application significantly more difficult to navigate. 

Don’t Neglect Your App’s Touchscreen Support

Some users with manual dexterity issues may prefer to interact with their devices using the touchscreen. 

To help support these users, all of your app’s interactive elements should be 48 x 48 dp or larger, with at least 8 dp between all touchable elements. You may also want to experiment with increasing the size of a touch target without actually increasing the size of its related View, using Android’s TouchDelegate API.
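
Here’s a rough TouchDelegate sketch that expands a small control’s touchable area; the view ID and inset values are assumptions:

    final View smallButton = findViewById(R.id.button_small);
    final View parent = (View) smallButton.getParent();

    // Wait until the parent has been laid out, so the hit rect is valid.
    parent.post(() -> {
        Rect touchableArea = new Rect();
        smallButton.getHitRect(touchableArea);

        // Grow the touch target by 24 pixels on every side.
        touchableArea.inset(-24, -24);

        parent.setTouchDelegate(new TouchDelegate(touchableArea, smallButton));
    });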

Closed Captions 

You should provide subtitles for all of your app’s spoken audio.

To enable closed captions on your device:

  • Navigate to Settings > Accessibility > Captions.
  • Push the slider into the On position. 

On Android 4.4 and higher, you add an external subtitle source file in WebVTT format using addSubtitleSource(), for example:
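
A minimal sketch, assuming a VideoView with the ID video_view and a WebVTT file saved as res/raw/subtitles_en.vtt:

    VideoView videoView = (VideoView) findViewById(R.id.video_view);
    videoView.setVideoURI(videoUri); // videoUri is an assumption.

    // Attach the WebVTT subtitle track to the video.
    videoView.addSubtitleSource(
            getResources().openRawResource(R.raw.subtitles_en),
            MediaFormat.createSubtitleFormat("text/vtt", Locale.ENGLISH.getLanguage()));

    videoView.start();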

Captions are a system-wide setting, so someone who relies on captions is likely to launch your application with captions already enabled. However, if a user doesn't have captions enabled, then it’s crucial you make it clear that your app supports closed captions and provide a way of enabling captions. Often, you can achieve both of these things by featuring a Captions button prominently in your UI—for example, adding a Captions button to your app’s video playback controls. 

Since captions are a system-wide setting, your app simply needs to forward the user to the appropriate section of their device’s Settings application (Settings > Accessibility > Captions). For example: 
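
A minimal sketch, assuming a captionsButton in your playback UI:

    // Send the user to Settings > Accessibility > Captions (Android 4.4 and higher).
    captionsButton.setOnClickListener(v ->
            startActivity(new Intent(Settings.ACTION_CAPTIONING_SETTINGS)));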

Android will change your caption’s formatting according to the user’s system-wide captions settings, located in Settings > Accessibility > Captions. To ensure your captions remain legible regardless of the user’s settings, you’ll need to test your captions across Android’s full range of formatting options.

Font Size

Users who are struggling to read onscreen text can increase the font size that’s used on their device.

You'll have to ensure that your app still works across a range of text sizes. To test this, try changing the text size device-wide:

  • Launch your device’s Settings app.
  • Navigate to Settings > Accessibility > Font size
  • Push the slider towards the large A to increase the font size, and towards the small A to decrease the font size. 

Assuming you defined your text in scalable pixels (sp), your app should update automatically based on the user’s font-size preferences.
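
For example (the string resource is an assumption):

    <!-- Text sized in sp scales with the user's Font size preference;
         text sized in dp would ignore it. -->
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:textSize="16sp"
        android:text="@string/welcome_message" />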

If you’ve designed a flexible layout, then ideally your app should be able to accommodate a range of text sizes, but you should always test how your app handles the full range of Font size settings, and make any necessary adjustments. Text that increases or decreases based on the user’s preferences isn’t going to improve the user experience if some settings render your app unusable! 

Conclusion

In this post, you learned how to optimize your app for some of Android's most commonly used assistive technologies and accessibility features.

If you’re interested in learning more about accessibility, then Google has published a sample app that includes code for many of the techniques discussed in this article. You’ll also find lots of information about mobile accessibility in general, over at the Web Accessibility Initiative website.

In the meantime, check out some of our other posts about Android app development!

Jessica Thornsby, 6 December 2017
