Agencies have a plan in place to make the most of their data going forward, but bringing data-centric talent on board and reskilling the current workforce raise new challenges under the first-year action plan of the Federal Data Strategy.
At an open forum sponsored by the Data Coalition and the Office of Management and Budget, stakeholders weighing in on the rollout of the data strategy said building a workforce that can keep pace with the scope of the problem and the rate of change in technology stands out as one of the most daunting challenges.
“Data management is fairly straightforward. It’s the people management part of it that’s very difficult,” said Michael Atkin, a former chair of the Treasury Department’s Financial Research Advisory Committee, now managing director of Content Strategies LLC.
Kathy Rondon, a former CIA collection management officer, now the vice president of talent management at The Reports and Requirements Company, said improving data literacy and related skills for a broad swath of employees, from managers to rank-and-file staff, should serve as the “linchpin” of the federal data strategy going forward.
“Your most highly technical employees may, in fact, be data illiterate,” Rondon said.
Jason Briefel, the Senior Executives Association’s executive director, said a baseline analysis of data literacy in the federal workforce would give agencies a better count of how many of their managers can ask the right questions of their data.
“Policy does not drive results. People drive results. While people may be highly technically proficient in some areas of their jobs, they may not know how to leverage data, how to ask good questions of their staff,” Briefel said. “If you’re putting a dashboard in front of somebody, but they don’t know how that dashboard was created, what fed up into it, they may make potentially dangerous decisions.”
Mike Fleckenstein, a principal at Mitre, said agencies will see an increasing variety of data workers, such as chief data officers, arrive in waves, much as chief information officers and chief technology officers became commonplace both in and out of government.
“More and more, we’re seeing data stewards, we’re seeing business analysts. We’re seeing data preparation folks. We’re seeing data quality analysts, and the position titles just keep on growing,” Fleckenstein said, adding that data scientists currently spend about 80% of their time preparing data for analysis.
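That 80% figure is easy to picture. Below is a minimal sketch in Python of the routine cleanup that consumes most of that time, assuming an entirely hypothetical personnel training file; the file name and every column name are invented for illustration.

```python
import pandas as pd

# Hypothetical input file; the file and its columns are illustrative only.
df = pd.read_csv("personnel_training.csv")

# The bulk of the work: cleaning before any question can be asked.
df = df.drop_duplicates()                            # remove repeated records
df["agency"] = df["agency"].str.strip().str.upper()  # normalize inconsistent labels
df["completed_on"] = pd.to_datetime(df["completed_on"], errors="coerce")
df = df.dropna(subset=["employee_id", "completed_on"])  # discard unusable rows

# The actual analysis is often a single line by comparison.
completions_by_agency = df.groupby("agency")["employee_id"].nunique()
print(completions_by_agency)
```

The point of the sketch is the ratio: several lines of preparation for one line of analysis.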
Under the Foundations for Evidence-Based Policymaking Act, agencies have until the end of July to appoint chief data officers.
Under the Evidence Act, which sets down many of the goals in the data strategy, agencies must also release data inventories to the public, striking a careful balance between data access and data security.
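Those public inventories are typically published as machine-readable catalogs, along the lines of the data.json files agencies already post under the Project Open Data metadata schema. As a rough sketch, with a hypothetical dataset, a single entry might look like the following, with the accessLevel field carrying the access-versus-security distinction.

```python
import json

# One inventory entry, loosely following the data.json (Project Open Data)
# metadata schema; the dataset and identifier here are hypothetical.
entry = {
    "title": "Employee Training Completions",
    "description": "Counts of completed data-literacy courses by component.",
    "identifier": "example-agency-training-001",
    "accessLevel": "public",  # "restricted public" and "non-public" flag protected data
    "modified": "2020-01-15",
    "publisher": {"name": "Example Agency"},
    "keyword": ["workforce", "data literacy"],
}

print(json.dumps(entry, indent=2))
```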
White House Office of Science and Technology Policy Director Kelvin Droegemeier said one of the major goals of the data strategy is “responsible liberation” of more agency data through public-private partnerships, and to lay a foundation for data standards that will allow agencies to share certain data sets and work across silos.
“Every agency is different, and that’s important. We want to try to homogenize as much as possible so that we don’t have sort of a patchwork of capability … you don’t want to have to learn a whole different set of rules and regulations every time you go to a different data set, even within the same agency,” Droegemeier said.
But the way forward isn’t just getting answers out of the data that agencies already have. The data strategy tasks agencies with developing learning agendas, which challenge them to ask open-ended, mission-centric questions and then seek the answers. And finding the flecks of valuable data that can answer those questions is difficult amid a torrent of new data created every second.
Droegemeier estimated that next year, agencies will reckon with about 44 zettabytes of data, but added that about 88% of all data produced goes unused or unexamined.
“We’re swimming in data, but we really don’t understand the data,” he said, pointing to tools like artificial intelligence and machine learning as having the potential to help sort through it. Linking those data sets across agencies, however, could yield valuable new insights.
OSTP, for example, has partnered with the Energy Department’s Oak Ridge National Laboratory, the Defense Department and the Department of Veterans Affairs on a project aimed at drilling down on data about veterans to better identify reskilling and upskilling opportunities.
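As a purely hypothetical sketch of the kind of linkage such a project depends on, the snippet below matches two invented agency extracts on a shared identifier; every name, code and value is made up for illustration.

```python
import pandas as pd

# Two hypothetical extracts from different agencies, linked on a shared key.
service_records = pd.DataFrame({
    "veteran_id": [101, 102, 103],
    "occupation_code": ["25B", "68W", "35F"],  # invented occupational codes
})
training_records = pd.DataFrame({
    "veteran_id": [101, 103],
    "course": ["Data Analytics Basics", "Cloud Fundamentals"],
})

# A left join keeps every service record, matched or not.
linked = service_records.merge(training_records, on="veteran_id", how="left")
print(linked)
```

A join like this surfaces not only who appears in both systems but who has fallen through the gap, which is often the insight the linkage was built to find.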
Vishal Kapur, a principal at Deloitte, said agencies need to look not only inward, taking stock of the data they have, but also outward, identifying opportunities to link data with other agencies and break down silos.
“I would think about what data do I need from other agencies that can be helpful in solving my mission needs, and vice versa – what data can I offer for other agencies to address their mission needs,” Kapur said.
But in some cases, barriers around data sharing exist for good reason. Statistical agencies like the Census Bureau adhere to strict protections of personally identifiable information from the decennial census.
“Some data is siloed for a reason and must remain siloed,” Rondon said. “But it’s been my experience that workforce knowledge on this subject is segmented and specialized, and many data users don’t understand the underlying requirements.”
Jory Heckman is a reporter at Federal News Network covering U.S. Postal Service, IRS, big data and technology issues.