Manual Testing


If your goal is to check whether your mobile application is accessible, using it the way your users do is the best way to go. Although parts of the process can be automated, manual testing remains irreplaceable for making sure the application is usable by the widest possible range of users.

Some of the guidelines and accessibility features can be checked simply by using the app regularly – for example, closed captions on videos, proper error identification, or flashing content.

But to catch most of the problems users with disabilities run into, we need to either use the app the way they do (with accessibility settings turned on) or use tools that help us identify issues that are otherwise invisible. For example, we might suspect that the contrast is off or that a button is hard to tap, but we can't be sure whether it actually fails the WCAG criteria. Accessibility testing tools are of great use here.
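
For a sense of what "conforming to the WCAG standards" means numerically, here is a minimal Kotlin sketch of the WCAG 2.x contrast-ratio formula – the calculation these tools run for us and that is nearly impossible to do by eye:

```kotlin
import kotlin.math.pow

// Relative luminance of an sRGB color given as 0xRRGGBB, per the WCAG 2.x definition.
fun relativeLuminance(color: Int): Double {
    fun linearize(channel: Int): Double {
        val s = channel / 255.0
        return if (s <= 0.03928) s / 12.92 else ((s + 0.055) / 1.055).pow(2.4)
    }
    val r = linearize((color shr 16) and 0xFF)
    val g = linearize((color shr 8) and 0xFF)
    val b = linearize(color and 0xFF)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
}

// Contrast ratio between two colors. WCAG AA expects at least 4.5:1 for normal text.
fun contrastRatio(first: Int, second: Int): Double {
    val lighter = maxOf(relativeLuminance(first), relativeLuminance(second))
    val darker = minOf(relativeLuminance(first), relativeLuminance(second))
    return (lighter + 0.05) / (darker + 0.05)
}

fun main() {
    // Grey text (#9E9E9E) on a white background looks reasonable to many eyes,
    // but scores only ~2.7:1 and fails the 4.5:1 AA requirement.
    println(contrastRatio(0x9E9E9E, 0xFFFFFF))
}
```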

Testing your mobile application with accessibility hardware can also offer great insight into usability. Still, it is not mandatory – for example, the issues you find with an external keyboard will largely be the same as those you find with the Switch Control/Switch Access accessibility feature.

Let’s see how we can do it in more detail.

Testing with accessibility settings on mobile devices

Android and iOS offer a variety of accessibility settings to their users – they can be found in the settings of the mobile device. These settings cover a wide range of needs and can be split into three groups: Motor, Vision, and Hearing.

Accessibility settings are the same on all iOS devices, of course, while Android devices offer more variety. To make sure every Android device can offer the same set of features to all users, regardless of the model, Google developed the Android Accessibility Suite – a collection of accessibility apps that help users operate their Android device eyes-free or with a switch device.

Below, we will explain accessibility settings that require some adaptation from our app, what they are for, and how we can use them in our testing efforts. Keep in mind that some of these features have a learning curve – if you turn them on by accident before knowing how to use them, it might take time to figure out how to turn them off. Trust me :)

Try watching/reading a tutorial first; it’ll save you some time.

Motor

Switch Control/Switch Access – a tool created for users with dexterity impairments. It lets them operate their phone with one or more switches, with no need to use the touchscreen – for example, with an external switch device, the physical buttons, or head/face movements.

You can learn more about Switch Access (which is what it's called on Android) here, and you can get details on the usage of Switch Control (iOS) here.
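
If you are also on the development side, a small Jetpack Compose sketch shows what Switch Access and external keyboards rely on – an element has to be clickable (and therefore focusable) to show up in the scanning order at all. The ArticleRow composable and its label are made up for this example:

```kotlin
import androidx.compose.foundation.clickable
import androidx.compose.foundation.layout.Row
import androidx.compose.foundation.layout.padding
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.Role
import androidx.compose.ui.unit.dp

// A custom list row that Switch Access (and a keyboard) can scan to and activate.
// clickable() makes the row focusable, while the role and click label tell
// assistive technology what activating it will do.
@Composable
fun ArticleRow(title: String, onOpen: () -> Unit) {
    Row(
        modifier = Modifier
            .clickable(
                onClickLabel = "Open article", // hypothetical label for this example
                role = Role.Button,
                onClick = onOpen
            )
            .padding(16.dp)
    ) {
        Text(text = title)
    }
}
```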

Voice Control/Voice Access – like the previous feature, this is a tool created for users who have difficulty operating touch screens; in this case, they use their voice instead of a switch.

Find out more about Voice Control (iOS) and Voice Access (Android) and their usage.
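
For developers, the practical requirement is that every actionable element has a name the user can say – Voice Access matches spoken commands against an element's visible text or accessibility label. A small, hypothetical Compose sketch (the SearchButton composable is invented for illustration):

```kotlin
import androidx.compose.material.Icon
import androidx.compose.material.IconButton
import androidx.compose.material.icons.Icons
import androidx.compose.material.icons.filled.Search
import androidx.compose.runtime.Composable

// An icon-only button with no contentDescription has no name a Voice Access
// user can say, forcing them to fall back to grid or number overlays.
// Giving it a description lets them simply say "Tap Search".
@Composable
fun SearchButton(onClick: () -> Unit) {
    IconButton(onClick = onClick) {
        Icon(
            imageVector = Icons.Filled.Search,
            contentDescription = "Search"
        )
    }
}
```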

Vision

VoiceOver/TalkBack – a gesture-based screen reader that helps visually impaired users operate mobile interfaces without seeing the screen. It reads the screen out loud and tells users which actions they can perform on each element.

Find out more about VoiceOver (iOS) and TalkBack (Android) and their usage. Also, some basic instructions can be found here.
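
From the development side, the two adjustments TalkBack testing most often uncovers are decorative images that should be skipped and states that need a human-readable announcement. A hypothetical Compose sketch (the NotificationSetting composable is invented for illustration):

```kotlin
import androidx.compose.foundation.layout.Row
import androidx.compose.material.Icon
import androidx.compose.material.Switch
import androidx.compose.material.Text
import androidx.compose.material.icons.Icons
import androidx.compose.material.icons.filled.Notifications
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.semantics
import androidx.compose.ui.semantics.stateDescription

// A hypothetical settings row. The decorative icon is hidden from the screen
// reader, while the switch gets a state description so TalkBack announces
// "Notifications on"/"Notifications off" instead of a bare "on"/"off".
@Composable
fun NotificationSetting(checked: Boolean, onCheckedChange: (Boolean) -> Unit) {
    Row {
        Icon(
            imageVector = Icons.Filled.Notifications,
            contentDescription = null // decorative, so TalkBack skips it
        )
        Text(text = "Notifications")
        Switch(
            checked = checked,
            onCheckedChange = onCheckedChange,
            modifier = Modifier.semantics {
                stateDescription = if (checked) "Notifications on" else "Notifications off"
            }
        )
    }
}
```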

Display and Text sizes – this set of features includes options like bold text, changing the font size and style, various color adjustments/filters (which adapt the screen colors for colorblind users), and similar. These settings vary from OS to OS and, on Android, from device to device. Since they can significantly affect the look and feel of your app, cause bugs, and even make some features unusable, it's important to do at least a smoke test of your app with each of them turned on – more on this in the next chapter.
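
As a quick illustration on Android, text sized in sp follows the user's font-size setting automatically, and Compose previews can render a screen at a larger font scale before you even touch the device settings. A minimal sketch – the PriceLabel composable is invented for this example:

```kotlin
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.tooling.preview.Preview
import androidx.compose.ui.unit.sp

// Text sized in sp scales with the system "Font size" setting automatically;
// fixed-height containers and sizes hard-coded in dp are what usually break.
@Composable
fun PriceLabel(price: String) {
    Text(text = price, fontSize = 16.sp)
}

// Render the same composable at twice the font scale to catch clipped or
// truncated text early, without changing the device settings.
@Preview(fontScale = 2f)
@Composable
private fun PriceLabelLargeTextPreview() {
    PriceLabel(price = "19.99 USD")
}
```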

Hearing

Both Android and iOS offer a variety of hearing enhancements, for example sound recognition tools, hearing aid support, and live captions. These mostly work at the system level, independently of individual apps, so we will not cover testing applications with them turned on.

Testing with accessibility testing tools

Different tools can enhance your mobile accessibility testing. In this chapter, we will cover two of them – one developed by Google for Android and one built into Xcode for iOS.

Accessibility Scanner is an application created by Google that helps you test the accessibility of your app quickly and with no coding. As its name suggests, Accessibility Scanner scans your screen, finds violations of WCAG rules on it, identifies them, and suggests improvements.

Keep in mind that although it can be super useful for finding violations like too-small touch targets, low contrast, or missing labels, it will miss many others. For example, it might tell you that an element doesn't have a label, but it will not recognize that an element has the wrong label (for example, a Save button labeled "Cancel").
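
To make that limitation concrete, here is a deliberately broken, hypothetical Compose snippet: it passes the Scanner's missing-label check because a label exists, but the label is wrong, and only a human tester will notice:

```kotlin
import androidx.compose.material.Icon
import androidx.compose.material.IconButton
import androidx.compose.material.icons.Icons
import androidx.compose.material.icons.filled.Check
import androidx.compose.runtime.Composable

// The Scanner only verifies that a label is present, not that it makes sense:
// this save button has a contentDescription, so no violation is reported,
// but a TalkBack user focusing it will hear "Cancel".
@Composable
fun SaveButton(onSave: () -> Unit) {
    IconButton(onClick = onSave) {
        Icon(
            imageVector = Icons.Filled.Check,
            contentDescription = "Cancel" // wrong label that no automated check catches
        )
    }
}
```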

Nevertheless, it is a great tool to start with, and it helps you quickly find some of the most common accessibility issues in your app. You can read more about it here.
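
If you want the same family of checks to run continuously, the Accessibility Test Framework that powers the Scanner can also be enabled in Espresso instrumentation tests. A hypothetical sketch, assuming a LoginActivity and a login_button view id that exist in your app:

```kotlin
import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.accessibility.AccessibilityChecks
import androidx.test.espresso.action.ViewActions.click
import androidx.test.espresso.matcher.ViewMatchers.withId
import androidx.test.ext.junit.rules.ActivityScenarioRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.BeforeClass
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class LoginAccessibilityTest {

    @get:Rule
    val activityRule = ActivityScenarioRule(LoginActivity::class.java) // hypothetical activity

    companion object {
        @BeforeClass
        @JvmStatic
        fun enableAccessibilityChecks() {
            // Runs the Accessibility Test Framework checks (touch target size,
            // contrast, missing labels) against the whole view hierarchy every
            // time Espresso performs a view action.
            AccessibilityChecks.enable().setRunChecksFromRootView(true)
        }
    }

    @Test
    fun loginScreen_passesAccessibilityChecks() {
        // The checks run automatically as part of this interaction and fail
        // the test if a violation is found.
        onView(withId(R.id.login_button)).perform(click()) // hypothetical view id
    }
}
```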

Another useful tool is Accessibility Inspector – a tool built into Xcode that allows you to scan your app in Xcode's simulator and find similar breaches of WCAG rules as the Accessibility Scanner does.

As with the Accessibility Scanner, Accessibility Inspector can help you identify some difficult-to-notice bugs (touch targets that are too small or insufficient contrast). Still, it should not replace manual testing, as it misses the context of each element.