DHS is driving toward ambitious burden reduction goals by requiring components to leverage "usability tests" for any public information collections.
The Department of Homeland Security is leveraging “usability testing” to help meet its ambitious customer experience goals, and the department’s CX chief sees the potential for other federal agencies to follow DHS’ playbook for designing more effective government services.
“Usability testing” refers to a formal method of evaluating functionality by observing how real users attempt to complete tasks on a website, application or product. Such testing has been a staple of user-centered design for decades.
Dana Chisnell said she ran her first usability tests on software manuals in the 1980s. Later in her career, at the Center for Civic Design, she used usability tests to help states improve voter experiences. They also featured in some of the U.S. Digital Service’s work when Chisnell served there during the Obama administration.
When she joined DHS as the executive director of customer experience in 2022, Chisnell said such testing was largely lacking at a large department that interacts with the public more than any other federal agency. DHS employs more than 260,000 people across eight operating components and other support divisions.
“They all have different customers, they have their own agencies with their own org charts,” Chisnell said during a Dec. 19 webinar hosted by the Human Centered Design Center of Excellence. “And there wasn’t nearly enough usability testing going on, either for internal applications or for external applications. And so we wanted to make this possible.”
DHS also launched an ambitious initiative in 2022: Chief Information Officer Eric Hysen directed components to cut 20 million hours from the estimated 190 million hours the public spent every year on DHS forms, surveys and other burdens, as measured by the Paperwork Reduction Act.
Last year, DHS announced it had met the goal. And Chisnell said usability testing played a crucial part in helping DHS components cut down on paperwork.
“One of the conditions was that each of the forms that came through that initiative must be usability tested,” she said.
DHS helped its employees understand the basics of such testing through a “usability testing kit” that’s now available on its website. Chisnell said it’s generic enough for “a lot of different federal applications,” not just DHS ones.
“You’ll see that there’s a simple format,” she said. “It might look like a recipe book. And that’s because cookbooks actually work. When you tell people what the ingredients are and how to mix them together … most people can come up with something workable.”
The key is tracking how successfully participants can use a given website, application or product without any teaching or training, according to DHS’ website. The goal is to test the usability of a product and fix any issues before it’s released to a wider user base or the public.
“We talk about errors or mistakes, even if the participant corrects them. The point here, though, is not that they’re making the mistake. The point is that the design did not support them in their task or their knowledge. And so you want to track that,” Chisnell said.
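The tracking Chisnell describes — task completion without training, plus errors even when the participant recovers — can be sketched as a simple tally. The data structure and names below are illustrative assumptions, not part of DHS’ actual usability testing kit:

```python
# Hypothetical sketch of recording usability-test results: per-participant
# task outcomes and error counts. Structure and field names are illustrative.
from dataclasses import dataclass

@dataclass
class TaskResult:
    participant: str
    task: str
    completed: bool   # finished the task without any teaching or training
    errors: int = 0   # mistakes observed, counted even if later corrected

def summarize(results):
    """Aggregate attempts, completions and errors per task."""
    summary = {}
    for r in results:
        s = summary.setdefault(r.task, {"attempts": 0, "completions": 0, "errors": 0})
        s["attempts"] += 1
        s["completions"] += int(r.completed)
        s["errors"] += r.errors
    return summary

sessions = [
    TaskResult("P1", "renew-form", completed=True, errors=1),
    TaskResult("P2", "renew-form", completed=False, errors=3),
]
print(summarize(sessions))
# {'renew-form': {'attempts': 2, 'completions': 1, 'errors': 4}}
```

Counting corrected errors alongside completion rates surfaces exactly the signal Chisnell points to: where the design failed to support the user, even when the user eventually succeeded.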
In fiscal 2024, DHS is attempting to meet new burden reduction goals: eliminating an additional 10 million public burden hours and redesigning 75% of internal DHS forms. And Hysen is requiring all DHS information collection requests to go through usability testing.
Each request “must reflect that usability testing was conducted, what the main findings were, and how those findings will be addressed within the accompanying supporting statement,” Hysen wrote in a Sept. 29 memo. “Information collection requests that do not follow these best practices will be rejected by my office unless an emergency circumstance merits an exception.”
DHS is focused on documenting decisions made about a product and its associated tests, Chisnell said, to help understand gaps in usability in the future. But Chisnell said it’s also crucial for customer experience teams to get “as many stakeholders in the room” as possible to observe usability tests, to avoid surprise recommendations that aren’t feasible to incorporate into a product.
“None of this should be done in a bubble, in isolation,” she said. “You need cross-functional support and attention.”
Copyright © 2024 Federal News Network. All rights reserved.