Five more reasons to get excited about governmentwide Section 508 assessment criteria — technology
Michael Gifford, a senior strategist for CivicActions, highlights key questions that agencies should pay close attention to in the new Section 508 assessment criteria.
This is part two of a two-part look at new Section 508 assessment criteria. You can find part one here.
People are needed to build an accessible digital government, but as with all digital processes, we must understand the technology. It is especially important in government that technology be properly deployed and documented. The "General information — Governmentwide section 508 assessment criteria" article published on Section508.gov provides several important technical questions. Given the silos in government, it is important simply to start by documenting what exists.
Better reporting of digital tools
Government clearly needs better reporting on its digital tools. The questionnaire highlights both the external internet and the internal intranet, as well as mobile applications and desktop software. Because both citizens and staff have disabilities, it is encouraging that the criteria recognize that reporting is needed on more than simply the public-facing interfaces. Documentation and training resources are also among the elements that must be identified and tracked.
For some agencies, even knowing which domains and subdomains are public can be hard. For digital accessibility, a detailed page-level understanding is required. I am confident that most agencies will have trouble accurately estimating the number of pages on their public internet and internal intranet sites. It may be easier for agencies to estimate how many pages were evaluated for Section 508 conformance.
Only by knowing the total number of internet and intranet pages does the number of pages tested for accessibility have value. Section 508 goes beyond the web, as does this questionnaire, which requires similar information on custom desktop software and native mobile applications.
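One way to get a rough page count for a public site, assuming it publishes a sitemap, is to walk sitemap.xml and count the URLs it lists. The snippet below is a minimal sketch only; the sitemap URL is a placeholder, and intranet sites or sites without sitemaps would need a crawler instead.

```typescript
// Rough page inventory from a sitemap (sketch; assumes sitemap.xml exists).
// Node 18+ provides a global fetch, so no external dependencies are required.

async function countSitemapUrls(sitemapUrl: string): Promise<number> {
  const res = await fetch(sitemapUrl);
  const xml = await res.text();

  // Pull out every <loc> entry; good enough for an estimate.
  const locs = xml.match(/<loc>[^<]+<\/loc>/g) ?? [];

  // Sitemap index files point at child sitemaps rather than pages.
  const children = locs
    .map((loc) => loc.replace(/<\/?loc>/g, ""))
    .filter((url) => url.endsWith(".xml"));

  if (children.length > 0) {
    const counts = await Promise.all(children.map((child) => countSitemapUrls(child)));
    return counts.reduce((sum, n) => sum + n, 0);
  }
  return locs.length;
}

// Hypothetical domain, for illustration only.
countSitemapUrls("https://www.example.gov/sitemap.xml")
  .then((total) => console.log(`Approximate public page count: ${total}`));
```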
I also appreciated that special attention is given to the top 10 most-viewed internal intranet and public internet pages. Because all of these questions are tied to the reporting period, we should see the answers change over time. Popular pages are a good indicator of where users are most likely to encounter problems. This approach isn’t as structured as user journeys or top tasks, but it is great that comprehensive manual testing is required on these pages and that the results must be detailed in the report. Requiring agencies to list Web Content Accessibility Guidelines (WCAG) errors alongside each page URL allows for simple spot-checking of the results.
As I mentioned in the first article, most governments overlook the accessibility needs of their own workforce. Intranet sites, internal documentation and training resources are commonly an afterthought when it comes to accessibility. These questions highlight the importance of changing this, which is in keeping with the goals of the diversity, equity, inclusion and accessibility (DEIA) executive order to build a more inclusive workforce.
This basic knowledge of what is available is key to understanding the size of the challenge that agencies face.
The importance of accessibility statements
In a previous article, How OMB can improve .gov accessibility, I laid out the importance of effective accessibility statements. It is clear from these questions that the governmentwide accessibility strategy takes them seriously.
I am glad that agencies are asked to provide more detail about their publicly posted website accessibility statements. I especially think it is important that these statements encourage feedback from users about the accessibility of the site. Ultimately, users should be encouraged to tell government about the barriers they face, and they should have confidence that those barriers will be addressed. There are agency sites where the accessibility statement itself is not fully Section 508 conformant, which is obviously problematic.
The questionnaire makes the point that the whole feedback process, not just the accessibility statement, needs to be accessible. The entire user journey for submitting feedback must be free of accessibility barriers. Furthermore, when errors are submitted, there should be a process to track and report on that feedback. Knowing the number of barriers users face is key, as is knowing how quickly they are addressed.
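As a rough illustration of the kind of tracking that makes those questions answerable, the sketch below computes two of the metrics mentioned above from a hypothetical list of feedback records: the number of barriers reported and the average time to resolve them. The record shape is invented for this example.

```typescript
// Sketch of feedback-tracking metrics; the record shape is hypothetical.
interface BarrierReport {
  url: string;                 // page where the barrier was encountered
  reportedAt: Date;
  resolvedAt?: Date;           // unset while the barrier is still open
}

function summarize(reports: BarrierReport[]) {
  const resolved = reports.filter((r) => r.resolvedAt !== undefined);
  const avgDaysToResolve =
    resolved.length === 0
      ? null
      : resolved.reduce(
          (sum, r) =>
            sum + (r.resolvedAt!.getTime() - r.reportedAt.getTime()) / 86_400_000,
          0,
        ) / resolved.length;

  return {
    totalBarriersReported: reports.length,
    stillOpen: reports.length - resolved.length,
    avgDaysToResolve,
  };
}
```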
Documented usage of automated tools and services
I love automated testing. Well-built automated accessibility tools are like training wheels that help developers and designers better understand digital accessibility. Tools like Deque’s axe are built to avoid false positives. Axe is just one accessibility engine, but it has been incorporated into Google Lighthouse, Pa11y and Microsoft’s Accessibility Insights. All of these are open source, and there are plenty of similar tools available for the desktop or the command line.
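To make that concrete, here is a minimal sketch of checking a single page with Pa11y, using axe as the test runner. It assumes the pa11y npm package is installed and uses a placeholder URL; the exact options and result fields should be confirmed against the Pa11y documentation.

```typescript
// Minimal single-page check with Pa11y using the axe runner (sketch).
// Assumes `npm install pa11y`; field names follow Pa11y's documented results.
import pa11y from "pa11y";

async function checkPage(url: string): Promise<void> {
  const results = await pa11y(url, {
    runners: ["axe"],          // use Deque's axe-core engine
  });

  for (const issue of results.issues) {
    console.log(`${issue.code}: ${issue.message} (${issue.selector})`);
  }
  console.log(`${results.issues.length} issues found on ${url}`);
}

// Placeholder URL for illustration.
checkPage("https://www.example.gov/").catch(console.error);
```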
This survey works to document the automated tools and services agencies use. Agencies need to get better at large-scale web content monitoring. Google knows of roughly 900 million .gov web pages, so this is quite a large problem. Going beyond that, asking how often comprehensive large-scale accessibility testing is done prior to deployment is a good challenge, since most government agencies only evaluate their public-facing sites. I would assume that few intranet sites have any sitewide scanning done for accessibility.
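A sitewide scan can reuse the same single-page check across every URL in the inventory. The sketch below, with the same assumptions and placeholders as the earlier examples, aggregates two figures useful for this kind of reporting: how many pages came back with no detected issues, and the average number of issues per page.

```typescript
// Sitewide scan sketch: run Pa11y over a list of URLs and aggregate results.
// Assumes `npm install pa11y`; the URL list could come from the sitemap count above.
import pa11y from "pa11y";

async function scanSite(urls: string[]) {
  let cleanPages = 0;        // pages with no issues detected by the tooling
  let totalIssues = 0;

  for (const url of urls) {
    const results = await pa11y(url, { runners: ["axe"] });
    if (results.issues.length === 0) {
      cleanPages += 1;
    }
    totalIssues += results.issues.length;
    console.log(`${url}: ${results.issues.length} issues`);
  }

  return {
    pagesTested: urls.length,
    cleanPages,
    averageIssuesPerPage: urls.length ? totalIssues / urls.length : 0,
  };
}

// Hypothetical page list for illustration.
scanSite(["https://www.example.gov/", "https://www.example.gov/contact"])
  .then((summary) => console.log(summary));
```

Automated issue counts are not the same as Section 508 conformance, of course; a page with zero tool-reported issues still needs manual validation.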
Challenging agencies to evaluate both internet and intranet pages, and to report on the number of fully conformant pages and the average number of defects found per page in the reporting period, provides a useful comparison. Government needs to be able to demonstrate progress, even if it cannot achieve perfection.
There are other questions that address training, which should help teams in government understand the results of automated testing. I will address this in more detail later, but it is worth noting that the pursuit of good accessibility scores has often distracted projects from addressing the key issues that affect their users.
Manual testing must be included
Government at scale needs automated tools to catch issues. I am also very aware that automation isn’t sufficient. Manual testing is required, both with assistive technologies and with users with disabilities. Many elements of keyboard-only functionality require human engagement to ensure that the experience is equivalent whether you navigate the page with a mouse or with a keyboard. A range of assistive technology must be used to understand the actual experience of users with disabilities.
How many pages had only automated testing in this reporting period, for both internet and intranet pages? That number should be much higher than the number of pages evaluated with manual testing, but manual testing is still needed, and it needs to be included strategically. Agencies are asked to document how often they conduct “comprehensive manual conformance validation testing for web content (internet and intranet) prior to deployment.”
Accessibility is part of the technology lifecycle
I mentioned earlier that the questionnaire asks about accessibility testing before deployment. The earlier in a product’s lifecycle accessibility is addressed, the cheaper it will be to develop and the more robust it will be for users. Ensuring that testing is done multiple times prior to deployment means that many accessibility issues will be addressed before users ever encounter them. Robust feedback tools in accessibility statements, along with ongoing monitoring of the live site, ensure that it continues to meet those requirements.
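One common way to make pre-deployment testing routine is to run the automated checks in the build pipeline and fail the build when new issues appear. The sketch below, with the same hypothetical setup as the earlier examples, exits with a non-zero status if any page reports issues, which most CI systems treat as a failed check that blocks deployment.

```typescript
// Pre-deployment accessibility gate (sketch): fail the build if issues appear.
// Assumes `npm install pa11y`; the staging URLs are placeholders.
import pa11y from "pa11y";

const pagesToGate = [
  "https://staging.example.gov/",
  "https://staging.example.gov/apply",
];

async function gate(): Promise<void> {
  let failures = 0;

  for (const url of pagesToGate) {
    const results = await pa11y(url, { runners: ["axe"] });
    if (results.issues.length > 0) {
      failures += results.issues.length;
      console.error(`${url}: ${results.issues.length} issues`);
    }
  }

  if (failures > 0) {
    // A non-zero exit code fails the CI job, blocking the deployment.
    process.exit(1);
  }
}

gate().catch((err) => {
  console.error(err);
  process.exit(1);
});
```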
The questionnaire doesn’t stop there, though. Agencies are asked to describe how accessibility is addressed throughout the technology development lifecycle. This is great. Most agencies may have hundreds if not thousands of different information and communications technology (ICT) projects, but asking these questions is key to helping agencies begin to develop an organization-wide approach.
There are a lot of similarities between how organizations manage ICT security and how they manage accessibility. One of them is simply understanding both in terms of managing risk. The internet is constantly changing, and the tools on it will never be 100% secure or accessible. Knowing how agencies identify and prioritize the risk of Section 508 non-conformant ICT barriers helps to gauge their accessibility maturity. Many of these risks are simply part of procurement.
Next steps
I look forward to seeing the first set of results from this questionnaire. Both the report to Congress and the raw data will provide insights into how agencies are addressing accessibility. I have reviewed the accessibility of a lot of government websites, and they all fall short of meeting Section 508. It will be interesting to see which agencies and which practices prove most successful. Good policies, practices and artifacts are key for agencies developing accessible sites. Hopefully we will see agencies, and their contractors, share more of what they know works.
Mike Gifford is a senior strategist for CivicActions, a digital services firm that pairs expertise in free and open source (FOSS), Drupal, and accessibility to help the government deliver high-impact public services. He also is an invited expert with the World Wide Web Consortium (W3C).