Sightsavers Organisational Inclusion Coordinator Kate Bennell is involved with testing our systems, processes and content to make sure they’re accessible, especially for people who are visually impaired. Here she explains Sightsavers’ approach.
I use screen-reading software called JAWS, which converts text into synthesised speech, to test online and offline content (web pages, documents, presentations). Following a talk by accessibility specialist Joshua Marshall, we’re revising our system testing approach this year and putting together a formal testing checklist that looks at all elements of accessibility, so that we can say what percentage of a product is accessible.
We started by looking at the Web Content Accessibility Guidelines (WCAG), but realised that they were too complicated for our needs, so we wrote our own checklist based on nine key principles, including:
- Images and illustrations
- Graphs and tables
- Structure and design
- Access with keyboard only
Each core element we test now has a set of questions against which I check accessibility with the screen reader. Each question gets a pass or fail, and a percentage score shows the proportion of passes. At the end of each test page we have a notes section for recommended improvements that could raise the scores.
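To make the scoring method concrete, here is a minimal sketch of how a pass/fail checklist could be turned into a percentage score. The checklist questions and results below are illustrative examples, not Sightsavers’ actual checklist.

```python
def score_checklist(results):
    """Return the percentage of checklist questions that passed."""
    passes = sum(1 for outcome in results.values() if outcome == "pass")
    return round(100 * passes / len(results), 1)

# Hypothetical results for one test page
page_results = {
    "All images have alt text": "fail",
    "Headings follow a logical structure": "pass",
    "All controls reachable by keyboard": "pass",
    "Links have descriptive text": "fail",
}

print(score_checklist(page_results))  # 50.0
```

A score like 50.0 here would be recorded alongside notes on which questions failed and what improvements could raise it.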
I’ve used the new system testing checklist this week on sample pages from our new website. Using the new approach, I was able to identify images without alt text, text boxes that were not announced as fields where the user is expected to type, buttons that were not read by the screen reader, and hyperlinks that didn’t work with the screen reader. I fed these findings back to my colleagues in the web design team, who are making changes to improve accessibility before we launch the website.
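Screen-reader testing with JAWS is a manual process, but some of the issues above can also be flagged programmatically. As an illustration only (the HTML snippet is a made-up example, not from the Sightsavers site), here is a hedged sketch of one such check: finding images without alt text using Python’s standard library HTML parser.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect the src of every <img> tag that has no alt text."""

    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # An absent or empty alt attribute means a screen reader
            # has nothing meaningful to announce for this image.
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

# Hypothetical page fragment: one accessible image, one without alt text
html = """
<img src="logo.png" alt="Sightsavers logo">
<img src="banner.jpg">
"""

checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # ['banner.jpg']
```

A check like this can only catch missing attributes; whether the alt text actually describes the image well still needs a human tester with a screen reader.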