TESTING Q&A: What procedures should I use to test my ICT for accessibility?
This article was developed as part of
The Accessibility Switchboard Project
from the
National Federation of the Blind Jernigan Institute
January 2017, Version 1.0.1
Creative Commons License: CC BY-SA 4.0
Brief answer
What procedures should I use to test my Information and Communications Technology (ICT) for accessibility?
There are a variety of approaches available for different types of ICT, and each has its pros and cons. These pros and cons need to be assessed before adopting an appropriate procedure.
In-Depth Answer: What procedures should I use to test my ICT for accessibility?
Are you living on ‘Inaccessible Island’? If you haven’t tested your Information and Communications Technology (ICT) products, how do you know whether they are accessible?
Testing is one of the bedrocks of an accessibility program. Without testing, you can’t assess where you are now, and you won’t know when you’ve met the performance goals that you set for yourself.
This article introduces some key concepts around testing and provides a list of pros and cons for various testing approaches. Evaluate those pros and cons first; links to available test processes are provided at the end of the article.
Key testing concepts
Product development, quality assurance, and quality control
The figure above shows an overview of product development. Products are initially conceived and designed in order to meet an identified need. The product is built and tested, and then packaged and delivered to the consumer for use. During product development there are opportunities for checking the quality (suitability for use) of the product:
- Quality Assurance (QA) entails adding checks throughout the build and test phase of product development. Introducing appropriate QA checks throughout development reduces the likelihood of having a completed product that does not meet identified user needs.
- Quality Control (QC) is a final step at the end of production, performed before the product is packaged for delivery to the user (customer). A high number of QC rejections can occur as a result of poor implementation of QA throughout development. If products are delivered to customers after failing final QC checks, the number of problems experienced by end users is likely to be higher than if those products were rejected for delivery.
For any accessibility test approach, a key question is whether it can be applied both as part of QA during product development and as part of QC at the end of development. Necessary accessibility fixes are easier, less costly, and more likely to be addressed if they are caught during QA than if they are caught only when QC takes place.
Is it ‘Accessible’? (Standards conformance versus usability)
As with many other aspects of product development, there is a difference between standards conformance and usability. For example, if an electrical gadget has to work at 240 volts, it can be tested against technical standards with appropriate test tools, and if it is within acceptable tolerances, it will pass those checks. Passing such checks, however, tells you nothing about usability. Usability testing is a process whereby products are tried out by potential end users, and test methods are employed by research and development staff to gauge how well the product works for those users and whether there are major or minor usability problems that should be fixed.
Laws and standards in the US address the technical aspects of ICT accessibility; they do not address the usability of the product. For example, one US standard requires: “At least one mode of operation and information retrieval that does not require user vision shall be provided”. Note that this requirement does not say that the mode of operation must be easy to use. This gives us a separation between testing for technical standards conformance and testing for usability.
This article concerns testing for technical conformance with established standards for ICT accessibility, such as Section 508 and WCAG 2.0.
Note: We have produced a related article concerning the other aspect, usability: How do I ensure my products work for people with disabilities?
Test approaches
The different test approaches
- Manual Assistive Technology (AT). Employing AT—designed for use by people with disabilities to interact with mainstream technologies—as a testing tool. Such AT commonly includes screen readers used by people who are blind, screen magnifiers used by people who have low vision, and speech recognition used by people who are unable to use a physical keyboard. This testing involves persons skilled in the use of one or more AT, who may or may not have disabilities.
- Give to a group of users who have disabilities. Providing the ICT under test to people who have particular disabilities, such as blindness, low vision, deafness, or difficulty hearing. This involves persons familiar with the use of their particular AT as a part of their daily lives.
- Manual Code Inspection. Examining the underlying code that governs how information is rendered by the ICT: for example, the code underlying software, or the code underlying the structure of an electronic document. Testers use a number of inspection tools that interact with the ICT under test in order to reveal the code to be tested. (A minimal sketch of one such check appears after this list.)
- Automated Code Inspection. A tool or set of tools is programmed to automatically scan content, inspecting the underlying code elements. The results are tracked and errors listed at the end of the automated scan process. Manual setup and management of scan settings is required, but the scan runs without the need for human interaction.
- Semi-Automated Parsing of Code. ICT code is sent to a tool that scans it and detects errors in coding that are displayed to the human tester. Usually no manual setup or management of settings is required. (This operates like a manual one-off version of automated code inspection tools.)
- Combinations. The above test approaches can be combined to make specific test processes. In the pros and cons table below we list two fairly common combinations (Manual Code Inspection + Automated Code Inspection; and Manual Code Inspection + Manual Assistive Technology). Other combinations are possible.
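To give a flavor of manual code inspection for websites, here is a minimal sketch that could be pasted into a browser developer console to surface two common issues for a human tester to review: images with no alt attribute, and form fields with no programmatically associated label. The checks and selectors are illustrative assumptions, not part of any published test process, and they cover only a small slice of what a full inspection involves.

```typescript
// Illustrative manual-inspection helpers for a web page; the selectors and
// heuristics are simplified and only demonstrate the general idea.

// Images with no alt attribute at all (decorative images should use alt="").
const missingAlt = Array.from(document.querySelectorAll('img:not([alt])'));
console.log(`Images missing alt attributes: ${missingAlt.length}`, missingAlt);

// Form fields with no obvious accessible name: no aria-label, no
// aria-labelledby, no wrapping <label>, and no <label for="..."> pointing at them.
const unlabeled = Array.from(
  document.querySelectorAll('input:not([type=hidden]), select, textarea')
).filter((field) => {
  const hasAria = field.hasAttribute('aria-label') || field.hasAttribute('aria-labelledby');
  const hasForLabel = field.id !== '' && document.querySelector(`label[for="${field.id}"]`) !== null;
  const isWrapped = field.closest('label') !== null;
  return !hasAria && !hasForLabel && !isWrapped;
});
console.log(`Form fields without an accessible name: ${unlabeled.length}`, unlabeled);
```

A tester would then examine each flagged element in context before recording it as a failure, since some items (for example, fields named by surrounding table headers) may be acceptable.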
Note: For some of the tests listed above, the primary approach may be supplemented by individual test elements that use a secondary approach. For example, AT may be used as the primary means to gauge the accessibility of a piece of software, but for testing color contrast, a code inspection tool may be employed as a secondary tool.
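For instance, the color contrast check mentioned in the note above is defined numerically in WCAG 2.0: a contrast ratio is computed from the relative luminance of the foreground and background colors, and level AA requires at least 4.5:1 for normal-size text. The sketch below shows that calculation; the function names and sample colors are ours, for illustration only.

```typescript
// Sketch of the WCAG 2.0 contrast ratio calculation used by many code
// inspection tools; function names and sample colors are illustrative.

// Relative luminance of an sRGB color (components 0-255), per WCAG 2.0.
function relativeLuminance(r: number, g: number, b: number): number {
  const linearize = (c: number): number => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio is (L1 + 0.05) / (L2 + 0.05), where L1 is the lighter color.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: mid-gray text (#767676) on a white background is roughly 4.5:1.
const ratio = contrastRatio([118, 118, 118], [255, 255, 255]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? 'passes WCAG 2.0 AA (normal text)' : 'fails AA');
```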
Availability and approximate conformance test process scope (percentage) for different types of ICT
| Test Approach | Websites | Software | Mobile | Electronic Documents |
|---|---|---|---|---|
| Manual Assistive Technology | Available (100%) | Available (100%) | Not Applicable | Available (100%) |
| Give to a group of users who have disabilities | Available (100%) | Available (100%) | Available (100%) | Available (100%) |
| Manual Code Inspection | Available (100%) | Available (100%) | Available (100%) | Available (100%) |
| Automated Code Inspection | Available (<50%) | Not Available | Not Available | Available (<50%) |
| Semi-Automated Parsing of Code | Available (<50%) | Not Available | Not Available | Not Available |
Note: The percentages given above are intended to be indicative. The percentage given refers to use of both the primary approach and any secondary approach that is typically employed.
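The table shows automated code inspection covering less than half of the applicable checks for websites. As an illustration of what such a scan can look like in practice, the sketch below drives the open-source axe-core engine through the Playwright browser automation tool. This is one possible setup among many, not a recommendation of a particular product; the package names, tag filters, and target URL are assumptions made for the example, and a clean report does not by itself demonstrate full conformance.

```typescript
// Minimal sketch of an automated code inspection run, assuming the axe-core
// engine (via @axe-core/playwright) and the Playwright browser driver.
import { chromium } from 'playwright';
import AxeBuilder from '@axe-core/playwright';

async function scanPage(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // Limit the scan to WCAG 2.0 A and AA rules; automated rules cover only a
  // subset of the success criteria, so passing is necessary but not sufficient.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa'])
    .analyze();

  for (const violation of results.violations) {
    console.log(`${violation.id}: ${violation.help} (${violation.nodes.length} instances)`);
  }

  await browser.close();
}

// Placeholder URL for the example.
scanPage('https://example.com').catch(console.error);
```

In a QA setting, a scan like this would typically run in the build pipeline so that regressions are caught during development rather than at final QC.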
Pros and cons of different test approaches
| Test Approach | Pros | Cons |
|---|---|---|
| Manual Assistive Technology | | |
| Give to a group of users who have disabilities | | |
| Manual Code Inspection | | |
| Automated Code Inspection | | |
| Semi-Automated Parsing of Code | | |
| Combined Example 1: Manual Code Inspection + Automated Code Inspection | | |
| Combined Example 2: Manual Code Inspection + Manual Assistive Technology | | |
| Combined Example 3: All approaches together | | |
Finding test approaches
Web searches will return links to available test processes in the categories described in the table above. In addition, a number of organizations publish lists of accessibility tests. For example:
- Accessibility Evaluation Resources (from the W3C). Includes advice on selecting tools, and a link to a searchable list of tools.
- Testing Website Accessibility (from Colorado State University). A more selective list with links to test processes.
- Create Accessible Electronic Documents (from the GSA). A list for electronic document authoring that also includes a number of test tools for documents.
About this article
Authors
This article is published as part of The Accessibility Switchboard Project, an initiative of the National Federation of the Blind Jernigan Institute with support from the members of the Accessibility Switchboard Project Community Of Practice, and from the Maryland Department of Disabilities.
Suggested citation
The Accessibility Switchboard Project. TESTING Q&A: What procedures should I use to test my ICT for accessibility? January 2017, Version 1.0.1. National Federation of the Blind Jernigan Institute. Available: http://switchboard.nfb.org/
Feedback, additions and updates
The authors welcome feedback on this and other articles in the Accessibility Switchboard. Use the feedback form to provide updates, new case studies, and links to new and emerging resources in this area. The feedback form can also be used to join the mailing list for notification of new content and updates from the Accessibility Switchboard.
Copyright, use and reproduction
Accessibility Switchboard articles are published under the Creative Commons License Attribution-ShareAlike 4.0 International. You are free to share (copy and redistribute the material in any medium or format), and to adapt (remix, transform, and build upon the material) for any purpose, even commercially. This is under the following terms: (1) Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use; (2) ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. For more detail on the license, see CC BY-SA 4.0 on the Creative Commons website.
Picture credits
‘Inaccessible Island’ by Chris M. Law & The Accessibility Switchboard Project. CC BY-SA 4.0. Cropped and modified from original: ‘Map of Tristan da Cunha Group, Southern Atlantic Ocean’ (Public Domain).