Accessibility research enables us to create products that are usable by all people regardless of ability, making it a vital part of any UX research program. Accessibility studies can be tricky to run remotely, but remote sessions also offer unexpected benefits: teams can connect with people in their natural environments, where they use their assistive tools and technologies in a realistic context.
“Accessibility is making user interfaces perceivable, operable, and understandable for people with a wide range of abilities. Accessibility also makes products more usable by people in a wide range of situations, circumstances, environments, and conditions.”
- Shawn Henry, Web Accessibility Initiative
As we continued to refine our remote research approaches this year, we conducted our own research to better understand the participant experience during remote accessibility studies. The findings and insights are helping our team improve our methods and approach for participants with disabilities.
We conducted research with participants with vision impairments who used some form of assistive technology like screen readers or screen magnifiers to access digital experiences. We conducted these sessions on both mobile and desktop devices, using UserZoom GO and Zoom as our video conferencing software.
We set out with the goal of identifying the challenges that exist when conducting remote research with participants using screen readers and screen magnifiers. Additionally, we wanted to dive further into the constraints and technical challenges of using different video conferencing platforms in conjunction with assistive technology. Ultimately, the findings would inform how we should modify our approach to remote research to make our sessions as successful, comfortable, and seamless as possible for participants, researchers, and our clients.
During the sessions, we asked participants to complete tasks on a bank website and an ecommerce website while using their assistive technology. As they navigated these tasks, we asked probing questions about their experiences using assistive technology in general and their experience participating in the research session itself.
Our findings illuminated a number of best practices for remote research with screen readers, as well as some opportunities for continued experimentation and learning.
How people use assistive technology varies widely, from which programs they use to how often they use them. To limit confusion, define the specific assistive technologies you want participants to use during research. We learned that some tools (e.g., Read Aloud, ZoomText) are technically screen readers but are designed, or used by some people, only to read the text on the screen, not the underlying markup (e.g., headings, form field states). Because most accessibility issues stem from how that underlying code is interpreted, it's ideal to recruit participants who use one of the full-featured screen readers on the market: JAWS, NVDA, VoiceOver, or Narrator on desktop, and VoiceOver or TalkBack on mobile.
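To illustrate the distinction between reading on-screen text and reading underlying markup, here is a minimal, hypothetical form-field sketch; the announced strings are representative of typical screen reader behavior, not exact output from any one tool:

```html
<!-- Properly labeled field: a full screen reader (e.g., JAWS, NVDA,
     VoiceOver) reads the markup, so it can announce something like
     "Email, edit text, required" -->
<label for="email">Email</label>
<input type="email" id="email" name="email" required>

<!-- Unlabeled field: a text-only "read aloud" tool may still speak the
     nearby word "Email", but a screen reader has no programmatic label
     to announce, leaving the user without context -->
Email
<input type="text" name="email2">
```

Both versions look identical on screen, which is why these issues surface only when testing with participants who rely on a full-featured screen reader.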
Some participants only use assistive technology for convenience (e.g., when their eyes are tired) and not as their sole method of interacting with the web. Being clear about the frequency with which you want participants to use their assistive technology is critical during recruitment. Based on our research, we recommend trying to talk with participants who use their assistive technology all of the time or nearly all of the time.
Whether you’re running your research on mobile or desktop, there are some different capabilities and tradeoffs you should be aware of.
We tested two different research platforms (Zoom and UserZoom GO) on both desktop and mobile. Participants were able to join sessions and share their screens on both platforms and devices, but several technical issues arose on mobile, including audio feedback loops, difficulty entering passwords, and trouble using dictation features while the research tools were running. Unless mobile is essential to the research goals, we recommend sticking to desktop research to limit technical difficulties and strain on participants.
Additionally, on both platforms on mobile and in UserZoom GO on desktop, the moderator was unable to hear the screen reader during the session. This made it difficult for our researcher to know when to ask a question without interrupting the participant as they listened to the screen reader. On Zoom for desktop, however, participants can select an option to share their computer audio, which allows the moderator to hear the screen reader. This setting makes Zoom on desktop the best option for this research need.
Regardless of the platform or methods you use, be sure to test your tools prior to research to make sure you’ve accounted for any technical challenges or hidden settings. There are tradeoffs and limitations to all of these tools and methods, so take some time to evaluate which is best for meeting your research needs.
We recommend adding 10 or more minutes to your session duration to address technical issues or questions. Some participants may finish in less time, but when technical difficulties arose, we needed the extra buffer. Be careful not to make the session too long, though: many of our participants were visibly fatigued by the end of a full session. Learn as you go to refine the right session length for your participants.
While many people rely on screen readers and screen magnifiers to access digital experiences, many of us who design or research those digital services and products have never experienced what it's like to use a screen reader. To see these technologies in action, we've shared a couple of videos from our research below:
This participant is using JAWS to navigate a shopping website. As you can hear, screen reader users often set the audio to a very fast speed. As they become more acquainted with the tool, users get used to this speed, but it can be almost unintelligible to those of us who don't use a screen reader regularly. Note that because the screen reader reads back the underlying code rather than just what's visible on the screen, having items in the proper order and properly tagged is very important.
In this example, a blind participant uses VoiceOver on an iPhone to receive a message and type out a reply. Pay particular attention to how VoiceOver works with the keyboard as he types the message: the user taps once to select an item, for example the letter K on the keyboard, and then double-taps to activate it. That's why you hear the repetition as he uses the keyboard.
Learn more about accessibility research and inclusive design with more resources from our team.