Establishing accessibility benchmarks for users with visual impairments

The Challenge
For many users, navigating a website feels intuitive. For people who are blind or have low vision, the experience depends heavily on compatibility with assistive technologies and predictable user journeys.
Accessibility is not a niche concern. Up to 1 in 4 Americans live with some form of disability, representing roughly 61 million people in the United States alone. When digital products fail to account for these users, organizations risk excluding a significant portion of their audience.
A global technology company partnered with AnswerLab to better understand how users with and without visual impairments experience their product and to establish accessibility benchmarks that could guide future product development.
AnswerLab’s Approach
The client's core questions were twofold: how accessible is the product today for users with visual impairments, and what baseline usability metrics can guide future improvements?
Designing a benchmarking research program for accessibility
AnswerLab partnered with the client to design a benchmarking study that combined qualitative observation with quantitative usability metrics. The goal was to create both immediate insight into accessibility challenges and a measurable framework for evaluating the product over time.
Recruiting representative accessibility user segments
The study included three participant groups: individuals who were blind, individuals with low vision, and sighted users. This structure allowed the research to compare performance across segments and surface where accessibility barriers had the greatest impact.
Observing task completion and assistive technology usage
Participants completed a series of critical product tasks during remote one-on-one sessions. Researchers captured success rates and behavioral observations while also documenting how participants interacted with assistive technologies such as screen readers and magnification software.
Key Insights & Results
Screen reader navigation created task completion barriers
Participants using the NVDA screen reader encountered inconsistent, unpredictable navigation patterns that made certain tasks difficult to complete.
Accessibility shortcuts were difficult to locate
Participants struggled to find key accessibility shortcuts designed to help them navigate the experience more efficiently.
Assistive technology significantly shaped the user experience
Users relying on screen readers and magnification software interacted with the interface differently than sighted users, revealing opportunities to improve navigation clarity and structure.
Benchmark metrics created a measurable view of accessibility
By tracking task completion and usability across segments, the study established baseline performance metrics for the product experience.
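As an illustration of the kind of baseline metric such a study produces, task completion rates can be computed per participant segment and compared across groups. The segment names below follow the study's three groups, but the pass/fail outcomes are entirely hypothetical, not the study's actual data.

```python
from statistics import mean

# Hypothetical session outcomes: 1 = task completed, 0 = not completed.
# These numbers are illustrative only; the real study's data is not public.
results = {
    "blind": [1, 0, 1, 0, 1, 1, 0, 1],
    "low_vision": [1, 1, 0, 1, 1, 1, 1, 0],
    "sighted": [1, 1, 1, 1, 0, 1, 1, 1],
}

def completion_rate(outcomes):
    """Task completion rate for one segment, as a percentage."""
    return round(100 * mean(outcomes), 1)

# Baseline benchmark: one completion rate per segment.
baseline = {segment: completion_rate(outcomes)
            for segment, outcomes in results.items()}

# The gap between sighted and blind segments shows where accessibility
# barriers have the greatest impact.
gap = baseline["sighted"] - baseline["blind"]

print(baseline)
print(f"Sighted vs. blind completion gap: {gap} points")
```

Re-running the same tasks on a future product version and recomputing these rates is what makes the benchmark actionable: improvement shows up as a shrinking gap between segments.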
Business Impact
Established baseline accessibility metrics for future development
AnswerLab defined measurable benchmarks that the product team can use to evaluate future product updates and track accessibility improvements over time.
Enabled clearer evaluation of future product iterations
With baseline metrics in place, the team can now test new versions of the product and measure whether changes improve usability for users with visual impairments.
Created a framework for competitive accessibility evaluation
The benchmarking approach also enables comparison with competitor products, helping the team understand where their experience performs well and where additional improvements are needed.
Delivered a clear path to improve accessibility for visually impaired users
The client left with a prioritized set of usability improvements that will help them better support users with visual impairments and create a more inclusive product experience.
