Trust underpins evidence-based innovation, but all parties still need to agree on their definition of trustworthiness.
In the push for evidence-based decision making, data-driven studies are under increasing scrutiny at federal agencies. Long-term studies that rely on multiple stakeholders are vulnerable to environmental changes, technological barriers and shifts in personnel cooperation, all of which affect the data.
Trust underpins evidence-based innovation, but for Teri Caswell, a broadband program specialist at the National Telecommunications and Information Administration, all parties still need to agree on their definition of trustworthiness.
“Is the evidence that’s been provided or sought after trusted because it’s been tried and true? Is it trusted because of the community and the audiences of the people who are touching it or defining it, or presenting it?” she said as part of the Performance Institute’s 2022 Government Performance Summit on Wednesday. “I personally believe that if I am part of a compilation of evidence or artifact that has been and there’s a delineation there as well, but if I can present it once and then, when asked, present it again in a different labeling, packaging, compilation — however you want to phrase it — it still has to be trusted.”
In other words, context matters.
Trust is also important to Shonda Mace, a project manager in the Texas General Land Office's Community Development and Revitalization program, who has experience working with FEMA and the federal Department of Housing and Urban Development on long-term disaster recovery. Since Hurricane Harvey in 2017, her office has been conducting regionalized flood studies, which she said were purposely regionalized because local communities often do not communicate with one another, sometimes because of a lack of trust. Her team has to be the reliable go-between.
“So one big thing we’re doing is we’re working with not just the communities, but also other state agencies, and federal agencies to break down silos and work together,” Mace said. “If you don’t have the trust amongst the other agencies and your partners, if you don’t have the trust in most communities, you’re not going to get the information you need to move this project forward.”
With long-term studies, it can be difficult to keep all stakeholders engaged over time. Mace said communities want fast answers and, after a natural disaster has passed, the energy for impact studies can fade. It takes a balance, she said, of not exhausting stakeholders with outreach but also not waiting so long between contacts that they forget about the study altogether. And while multiple agencies in Texas are conducting studies similar to her team's, they must be careful not to duplicate efforts funded by federal dollars or else risk relinquishing that money.
Caswell added that knowledge management should run in the background of long-term studies. External and environmental conditions can change over the course of a study, such as budget cuts, political shifts or another entity assuming the program area.
“The positive side of that is, the more willing we are to look at evidence-based criteria to drive innovation, we should be seeking more than one or even 100 inputs to that innovation design, lest we become, a reputation of doing things in a vacuum and we didn’t consider 80% of our benefactors,” she said.
Her recommendation was to track the key words and phrases that change over the course of a multi-year study, keeping a vocabulary list or checklist of sorts, to maintain some level of consistency in the data so that the research questions are adequately answered by the end.
She also spoke to the question of whether to share information as you go, as opposed to waiting until the end of a study to show stakeholders the data. Subject-matter experts or senior advisors can add valuable anecdotal knowledge to a report headed to an elected official, but that knowledge may be hard to digitize. In this case, Caswell said, it helps to take an iterative approach, informing stakeholders and reviewing the study with them before completion.
“I am done with the days of writing a summary report before we’ve even looked at the data. Let’s get it on a paper, let’s get it on report, get people around our camera or table, whatever it takes, and start recognizing what it does look like and are we on the correct path? And if we’re not, there’s your first [knowledge management] piece, right?” she said. “We went this way to prove or disprove the hypothesis, we’ve got a course correction we need to affect, we’re notifying all parties, we’re having a conversation, and we’re building the trust in the process, not just the report.”
However, she said, a major consideration that complicates data collection from stakeholders is the technological requirements involved in partnering with federal agencies. Normalizing technology at the federal level takes a long time, and as cybersecurity requirements increase, so do the challenges stakeholders face in submitting data for studies. Mace's office encountered this on its contract with the Army Corps of Engineers to review modeling: sharing large files via Box or SharePoint was no longer an option because USACE's file sharing system was not large enough.
Yet the Texas General Land Office saw this as an opportunity to innovate. Mace said the GLO is working on a new Texas Disaster Information System with the University of Texas and Texas A&M University, "where our vendors can put models into there and USACE can go in and access them and get those models."
Copyright © 2024 Federal News Network. All rights reserved.
Amelia Brust is a digital editor at Federal News Network.