Ari Ross Software Engineer
Posted on Jul 21, 2022

How to Test for Web Accessibility

#accessibility#testing

When I was an intern I got my first accessibility audit back from a 3rd-party auditor. It listed out issues with the website I was working on and provided recommendations on how to make my code more accessible. Then when it came to future features, I would copy-paste the same techniques into other parts of my code without fully understanding why or when those techniques were needed.

Since then I've learned how much context I was missing around users and how they actually use and experience the web, part of that context being how to accurately test my work. So let's talk about how we can check for accessibility!

Automated Testing Tools

Yay! Yes! Tools! Gotta love it when you can just run testing software that tells you what's wrong with your code.

Browser extensions like aXe Dev Tools and ARC Toolkit can be run by anyone: they scan a page and produce a list of code-level issues, such as an <img> missing an alt attribute, an invalid value passed to an ARIA attribute, or an element with no programmatic text. There are also linters like the eslint a11y plugin that can be used during local development.
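For instance, the "eslint a11y plugin" mentioned above is eslint-plugin-jsx-a11y. A minimal sketch of a config that turns on its recommended rules (assuming a JSX codebase) might look like:

```json
{
  "plugins": ["jsx-a11y"],
  "extends": ["plugin:jsx-a11y/recommended"]
}
```

With that in place, issues like a missing alt attribute get flagged right in your editor, before the code ever reaches a browser.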

Automated testing tools are super easy to include in your development workflow and a quick way to learn about common issues and fixes. However, while automated tools are great for correcting your implementations, they won't necessarily alert you when a feature needs some extra a11y love. That's because automated tools don't know what it is that you're trying to achieve. For example, there are certain ARIA attributes that a toggle button needs to be made accessible to all, but automated tools don't know that you're trying to build a toggle button, so they won't tell you what you're missing. But you know. And that's where manual testing comes in.
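To make that toggle button example concrete, here's a minimal markup sketch (the "Mute audio" label is made up for illustration):

```html
<!-- aria-pressed is what makes this a toggle button to assistive tech,
     and it should flip between "true" and "false" as the user toggles.
     A scanner sees a perfectly valid <button> here either way, so it
     won't complain if aria-pressed is missing entirely. -->
<button type="button" aria-pressed="false">Mute audio</button>
```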

Manual Testing Methods

Manual testing is something we do all the time. We interact with a feature or component and see if it does what we expect it to do. But as mentioned in my post on the different ways people navigate the web, there are so many different tools and methods that people use to interact with our websites and apps. So clearly we should be testing our work with as many assistive technologies (ATs) as we can get our hands on, right?! Wrong. Since most ATs make standard calls to the same accessibility APIs, we can usually say that if it works with one AT, it'll work with most others. So we can cover most cases by testing with a few tools we likely already have at our disposal.

Keyboard

A lot of us are used to using a mouse to point and click our way through the web, but that's hard to do for users who have trouble seeing where elements are on a page, or even users who have difficulties with the minute movements a mouse can require. Instead, users should have the option to navigate solely with a keyboard. When navigating with a keyboard, it's typical to scroll via Arrow keys, navigate to interactive elements using the Tab key (and Shift+Tab to reverse it), activate buttons with the Enter or Space keys (for link elements, only the Enter key is used), and use the Esc key to close menus, modals, etc. The WAI ARIA Practices Guide is a great reference for how users expect to interact on a per-component basis using a keyboard (and also a great place to view accessible versions of common components). During testing, you'll want to ask yourself the following:

  • Can I reach every interactive element using only the Tab key (and back out with Shift+Tab)?

  • Does focus move in a logical, predictable order, and can I always see which element has focus?

  • Can I activate buttons with Enter or Space, and links with Enter?

  • Can I close menus, modals, and other overlays with the Esc key?

  • Did I ever get stuck somewhere I couldn't Tab out of?
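Those keyboard expectations also translate directly into the event handling you write. Here's a minimal sketch, assuming a hypothetical custom disclosure widget (the handler and widget shape are made up, not from this post):

```javascript
// Sketch: map keydown events to the actions keyboard users expect.
// "widget" is a hypothetical object with an `open` boolean.
function handleKeydown(event, widget) {
  switch (event.key) {
    case "Enter":
    case " ": // buttons are expected to activate on both Enter and Space
      widget.open = !widget.open;
      return true; // handled
    case "Escape": // Esc should close menus, modals, disclosures, etc.
      widget.open = false;
      return true;
    default:
      // Let Tab, Shift+Tab, and arrows fall through to the browser so
      // native focus navigation keeps working.
      return false;
  }
}
```

Native elements like `<button>` give you most of this for free; it's the custom components where these behaviors have to be wired up (and tested) by hand.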

Honestly, if at any point you're trying to use a keyboard and you get frustrated, it's a good indicator that your user will get frustrated as well.

Mouse

This is the most common tool used when navigating the web, but it's often overlooked how much we use the keyboard as well. Most people without disabilities are hybrid users who seamlessly switch between mouse and keyboard. And just as there are those who only use keyboards, there are those who rely solely on "point and click" methodology. Nuance's Dragon is a popular speech recognition software that lets users navigate via dictation, and its most preferred method of interaction is "saying and clicking" (check out this Dragon NaturallySpeaking video tutorial if you need visualizations). When, say, a quick search form in the header can only be submitted by pressing the Enter key, it becomes harder for these users to perform the desired functionality. We also want to keep target size in mind. We don't all have the steady hand of a surgeon, so the general rule is to make sure that all interactive elements are at least 44 by 44 pixels. Other than that, there aren't too many out-of-the-ordinary things to test for with a mouse, but if you're looking for more rules like the keyboard list from earlier, or you deal with crazy custom interactive components, you can check out the Input Modalities section of WCAG 2.1 for related rules.
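That 44-pixel rule of thumb is easy to spot-check in script. A minimal sketch (the helper and the element selector are made up for illustration):

```javascript
// Rule of thumb from above: interactive targets should be at least
// 44 by 44 CSS pixels.
const MIN_TARGET_PX = 44;

// Takes anything shaped like a DOMRect ({ width, height }).
function meetsTargetSize(rect) {
  return rect.width >= MIN_TARGET_PX && rect.height >= MIN_TARGET_PX;
}

// In a browser console you might sweep the page like this:
// [...document.querySelectorAll("a, button, input, select")]
//   .filter((el) => !meetsTargetSize(el.getBoundingClientRect()));
```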

Magnifiers/Zooming

Making sure that your website or app is resizable and customizable ensures that users with low vision can use your product. Honestly this is a pretty big one, considering people with low vision tend to be those who are older, or even those who have lost their glasses to the sea while swimming at the beach after an earthquake...anyway, it's just to say that this helps more people than we think. Users with uncorrectable low vision might use magnification software such as ZoomText Magnifier, which comes with features like high-powered zoom and enlarging the text of webpages and apps. We can check if our products will respond correctly in a few ways:

tl;dr Resize your browser window, zoom in 200%, and resize your text. That's it. That's how you test. If anything looks wonky or unreadable, then it probably needs some fixing!
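One implementation habit that makes these checks pass far more often is sizing text in relative units, so it scales when users resize text instead of staying locked in place. A minimal sketch (the class names are made up):

```css
/* Respect the user's default font size rather than hardcoding pixels */
html { font-size: 100%; }

/* rem-based sizes scale when the user resizes text; px sizes don't */
.article-body  { font-size: 1rem; line-height: 1.5; }
.article-title { font-size: 2rem; }
```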

Screen Readers

Okay. So if you need one takeaway from this blog, it should be that you should keyboard test with a screen reader on your website. It'll cover keyboard functionality and text-to-[insert preferred sense here] accessibility, which will pretty much cover 90% of your issues (I put a random percentage in there for the drama. Please don't quote me!).

So. Screen readers.

A screen reader is software a user can use to translate written text on a page to audible speech. But it's not just the visible written text that it can announce. If a website or app is marked up correctly, screen readers will announce 3 important pieces of information:

  • Name: This is the information we commonly expect screen readers to announce, as it's often the visual information on a screen. The name of an element could be the text of a paragraph, label of a form field, alt text of an image, etc. It is the very content of our site or app.

  • Role: This lets us know the type of element so that even without visual indicators, users can understand the structure and interactivity of the page or app. Knowing whether they're on a "button" vs a "link" vs a "text input" helps users know how to interact with an element, and they can understand page hierarchy and context when the screen reader announces that the text they just heard was a "Heading level 2" or "Heading level 4". Most elements have a role announced, except for paragraphs and divs, which are usually treated as static content.

  • State: This provides information on the state of the element. This is often where ARIA comes into play; when used correctly, it can announce if content is "expanded", "collapsed", "busy", "required", "invalid", and so much more. For example, imagine an FAQ drawer. When you click on the button, the FAQ information expands, and when you click again, it collapses. Visually we know that the button did something because we saw the information block show/hide, but for users without sight, we need the button to tell us something is happening by updating and announcing its state as something like "expanded" or "collapsed"**. Not all elements will have a state; usually it's just the elements with some sort of interactivity tied to them.

If you're testing with a screen reader and you don't hear those three things (save for exceptions noted earlier), then it's a good indication that you have an accessibility issue to log. A lot of the success criteria of the WCAG relate to these three pieces of information being accessible, and if it works with one screen reader, it's likely to work with a majority of other ATs.
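As a quick illustration, here's where a screen reader finds all three pieces on a simple form field (a minimal sketch; the field itself is made up):

```html
<!-- Name:  "Email", from the associated label
     Role:  announced from the native element, e.g. "edit text"
     State: "required", and "invalid" if aria-invalid flips to "true" -->
<label for="email">Email</label>
<input id="email" type="email" required aria-invalid="false">
```

Notice that most of it comes from plain semantic HTML; ARIA only steps in for states the native element can't express on its own.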

If you're on a Mac, the VoiceOver screen reader is already installed and you can start using it right away. On Windows and other OSes, popular screen readers are JAWS (subscription-based) and NVDA (free and open-source). When using one of these screen readers, it's important to make sure you're testing with proper screen reader/browser pairings. Based on the most recent screen reader survey, you should test your desktop sites with at least one of the following:

  • JAWS with Chrome

  • NVDA with Chrome or Firefox

  • VoiceOver with Safari

Just as VoiceOver is available on Macs out of the box, it's also available on iPhones, and most Androids come preinstalled with TalkBack. For mobile testing, that means:

  • VoiceOver with Safari on iOS

  • TalkBack with Chrome on Android

You'll want to test with the most popular pairings to determine whether any bugs are implementation issues or compatibility issues; for compatibility issues, it's often better to report a bug to the AT vendor than to try to find a workaround on your end.

Other Testing

I would say that what we've covered so far will catch most accessibility issues. If you're doing dedicated accessibility testing, there are a few other methods you could use to get more coverage.

Reduced Animations

This is actually a pretty big thing to test for, but I'm putting it in this section because it only applies when you have animations, videos, or movement in general on your website or app. Users with certain cognitive disabilities may have trouble using a product with distracting movement and content, and those who use a screen reader will have difficulty hearing it over any audio content that plays automatically. To help with this, pretty much all operating systems come with a way to reduce or turn off motion on the device, and if implemented correctly, websites and apps will respect that setting too.
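On the web, that OS-level setting is exposed through the prefers-reduced-motion media query, so honoring it can be as simple as this sketch (the class and animation names are made up):

```css
/* Only animate for users who haven't asked for reduced motion */
@media (prefers-reduced-motion: no-preference) {
  .hero-banner {
    animation: slide-in 0.5s ease-out;
  }
}

/* Belt-and-suspenders: effectively kill motion when it's requested */
@media (prefers-reduced-motion: reduce) {
  * {
    animation-duration: 0.01ms !important;
    transition-duration: 0.01ms !important;
  }
}
```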

When it comes to animations, videos, and audio, make sure to check the following:

  • Animations and transitions respect the user's reduced motion setting (ideally they're toned down or removed entirely)

  • Videos and audio don't play automatically, or at least can be easily paused or stopped

  • Moving, blinking, or auto-updating content has a way to be paused, stopped, or hidden

High Contrast Mode

Windows users have a control setting called "High Contrast Mode" that enables them to set a sort of theme for their display. They have the ability to decide what color their background, buttons, links, static text, selected text, and disabled elements should be. This setting helps users with low vision and photosensitivity by allowing them to choose their own color scheme and set contrast levels to their preference. Since this mode essentially overrides the CSS of apps and websites, it's useful to check that no meaningful content is lost. Assistiv Labs is a subscription-based testing tool that emulates many of the popular software-based ATs, and it has a good description of High Contrast Mode if you'd like to learn more about it.
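If you do find that something disappears in this mode, browsers let you adapt via the forced-colors media query and CSS system color keywords. A minimal sketch (the class name is made up):

```css
/* High Contrast Mode strips backgrounds and shadows, which can make a
   button visually vanish; fall back to a real border drawn in the
   user's chosen system colors */
@media (forced-colors: active) {
  .icon-button {
    border: 1px solid ButtonText;
  }
}
```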

Next...

Phew! That was a lot, and maybe even more information than you wanted. But I swear: understanding what the issues are and how to reproduce them will help you understand the best approach to fixing them. Or at the very least, help you know what to search for. Like "how to reduce motion based on user settings," ya know? But if not, I'll talk about the most common accessibility issues on websites and how to fix them in another post. For now, thanks for reading!


Footnotes

* You might be wondering if you should care about an AAA guideline when websites often only try to meet AA guidelines. The answer is "yes," though they are more often considered "best practices" that help provide an even better experience for users. It might be helpful to think of conformance levels in this way: A = minimum needed to unblock users from using your site; AA = minimum needed to prevent users from rage quitting your site; AAA = minimum needed to provide a decent experience for users. It should be noted that it's near impossible to meet all AAA guidelines, as some may actually conflict with legacy systems, modern designs, and security practices. So it's best to consider them "best practices" and do what you can.

** I know y'all are itchin' to see some code and learn how to implement this, but that'll be in another post. Look at how much text is already on this page!!

© EF Education First