The European Union (Accessibility of Websites and Mobile Applications of Public Sector Bodies) Regulations were signed into Irish law in September 2020, marking Ireland's first step towards full digital inclusion. The National Disability Authority (NDA) has been appointed to monitor public sector bodies' compliance with digital accessibility standards, and it published Ireland's first Monitoring Report for the EU Web Accessibility Directive in December 2021.
We reviewed the report’s key findings in a previous post; this blog now offers a breakdown of the types of reviews performed and the scoring system used.
In-depth versus Simplified Reviews
The directive called for 175 simplified, 19 in-depth, and 4 mobile application reviews to be conducted across 2020 and 2021. Because of the delay in transposing the directive in Ireland, these numbers were reduced to 50 simplified, 5 in-depth, and 2 mobile reviews.
The main difference between in-depth and simplified reviews comes down to the type of testing performed. In-depth reviews combined automated tools, such as the WAVE Evaluation Tool and the Google Chrome Developer Toolbar, with manual tools, such as the JAWS, NVDA, and VoiceOver screen readers. The NDA then presented each of the five organisations with a table marking every level A and AA success criterion in the Web Content Accessibility Guidelines (WCAG) 2.1 as a pass, a fail, or not applicable.
Simplified reviews, on the other hand, relied solely on the automated Axe Core and Axe Monitor tools provided by Deque. This allowed the NDA to test 40,373 pages across 50 different websites, but at a much higher level and with less precision than an in-depth review. As the NDA's report puts it (2021, p. 60), “an Axe Core test is sufficient to identify a failure, to a high level of probability, on a webpage of a Success Criteria, but is insufficient to test if a Success Criteria is passed.”
The NDA Accessibility Score
The 50 organisations whose websites received a simplified review were each given a full Axe scan report with detailed descriptions of every issue found. The monitoring report itself provides only a summary of their results, along with an accessibility score calculated by the NDA.
Although the directive doesn’t require a scoring system, the score gives public bodies an idea of their digital accessibility and provides the NDA with a baseline for future reports. The most important thing to note is that the accessibility score doesn’t measure compliance: it is a weighted calculation based on the number of serious, moderate, and minor accessibility issues found. The higher the final score, the better the accessibility of an organisation’s site.
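To make the idea of a weighted score concrete, here is a minimal sketch of how such a calculation might work. The NDA does not publish its exact formula or weights, so the weights and the normalisation by pages tested below are purely illustrative assumptions, not the NDA's actual method.

```python
# Hypothetical weighted accessibility score: more and more-severe issues
# lower the score. The weights and formula are assumptions for illustration
# only; the NDA's real calculation is not published in this form.

def accessibility_score(serious: int, moderate: int, minor: int,
                        pages_tested: int) -> float:
    """Return a 0-100 score where fewer (and less severe) issues score higher."""
    weights = {"serious": 5.0, "moderate": 2.0, "minor": 1.0}  # assumed weights
    penalty = (serious * weights["serious"]
               + moderate * weights["moderate"]
               + minor * weights["minor"]) / max(pages_tested, 1)
    # Clamp at zero so heavily failing sites don't produce negative scores.
    return max(0.0, 100.0 - penalty)

# A site with no issues scores 100; issues pull the score down.
print(accessibility_score(serious=0, moderate=0, minor=0, pages_tested=100))
print(accessibility_score(serious=10, moderate=40, minor=100, pages_tested=50))
```

The key design point, mirrored from the report, is that severity is weighted: one serious issue costs more than several minor ones.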
However, where in-depth reviews were able to measure compliance with all 50 level A and AA success criteria, the Axe Core engine identifies failures against only 26 of them. This number covers only checks that can be automated, such as whether alternative text exists, whether videos contain captions, whether text has sufficient colour contrast, and whether zooming and scaling are enabled.
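The alt-text check is a good example of what automation can and cannot do. The sketch below, using only Python's standard library, flags `img` elements with no `alt` attribute at all. It is an illustration of the kind of rule an engine like Axe Core runs, not Deque's actual implementation.

```python
# Minimal sketch of one automatable check: flag <img> elements that have
# no alt attribute. This illustrates the kind of rule automated engines
# run; it is not Axe Core's actual implementation.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that lacks an alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = {name for name, _ in attrs}
            if "alt" not in attr_names:
                self.missing_alt.append(dict(attrs).get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(checker.missing_alt)  # → ['chart.png']
```

Note what this check cannot do: it proves only that alt text is present or absent, not that it is meaningful, which is exactly why a missing attribute can be reported as a failure while a present one cannot be reported as a pass.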
Issues involving navigation, such as keyboard traps and reading or focus order, or those requiring human judgement, such as whether image alt text, labels, and instructions make sense in context, can't be captured using automated tools. Even if an organisation were able to achieve an accessibility score of 100% in the NDA report, it would still need to test manually for issues like these.
Conclusion
The next monitoring report is due to be published in December 2024. The NDA plans to include the 141 reviews missing from the first report, in addition to new reviews. Any public sector body given a simplified review, regardless of the accessibility score it achieved, still needs further manual testing to ensure full compliance.
Ireland’s first monitoring report plays a huge role in establishing the current state of digital accessibility in the Irish public sector, but there is more work to be done. If you would like to discuss the NDA report or need help meeting accessibility requirements, please don’t hesitate to contact IA Labs.