Several agencies are looking to build up their workforce’s proficiency in data and artificial intelligence skills as part of an ongoing effort to maximize the benefits of these analytics tools.
The Department of Homeland Security, for example, is looking to create a “black belt” program to identify data and AI champions across several areas of expertise.
DHS Chief Technology Officer David Larrimore said at ATARC’s AI and Data Summit that the agency is looking to identify AI experts in areas that include fraud, biometrics and statistical modeling.
“What we’re trying to do around our black belt program is find who the experts are in the DHS organization, and we want to create a black belt for that specific concept,” Larrimore said on Nov. 17.
Larrimore said the black belt program, which is in its early stages, will rely on its champions to promote AI and data analytics across all aspects of the agency.
“Ultimately, you become a part of a larger community to help where people like you aren’t available — because there is no way that, of the 350-or-so acquisition programs going on right now, everyone has someone who could be considered an AI black belt,” he said. “Wouldn’t it be great if a black belt from [Customs and Border Protection] could go spend six months over in a FEMA program to help them get up and running?”
Larrimore said several DHS components over the past few years have made “huge strides” in data sharing. He said DHS data maturity has especially been useful in its immigration operations.
“Because of that data sharing, we have actually been able to process and help hundreds of thousands of people, so information sharing, at its core, has been really important,” he said.
Larrimore said DHS Chief Data Officer Mike Horton has served as an effective “traffic cop” for how the department shares data across its components.
“Not only is he trying to figure out how components can report up data, but also how they can share information with each other in a structured way. We’re going to see over the next several years that role drastically mature, [with] policy [and] instructions to be able to support that,” Larrimore said.
Larrimore said DHS is looking at ways to use data to improve its customer experience and reduce the burden on its customers.
But as the agency looks at emerging technologies such as AI and machine learning, Larrimore said DHS will need to take a closer look at its policies around using public data, and how DHS maximizes the use of data it already collects.
“Especially up on the management side, we have to constantly question the data we’re looking at. And it’s only through working with components, with the data providers, with the data stewards that we actually understand where the rubber meets the road … It’s not until those conversations happen that everybody comes to an agreement on what information actually provides value,” he said.
William Streilein, the CTO of the Defense Department’s Office of the Chief Digital and Artificial Intelligence Officer, said that as DoD looks to get “AI-ready” by 2025, it is also looking to improve the overall data literacy of its workforce.
Streilein said that AI readiness will vary across DoD and that the CDAO is taking steps to “right-size” AI education across a broad spectrum of positions.
“Somebody who’s in acquisition … at a high level, I would think somebody in that position needs to know that innovation is par for the course. You may not be able to actually inspect and provide requirements for how the model’s working, but you need to know what it needs to do. And so [performance] metrics are absolutely key,” he said.
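A minimal illustration of the kind of performance metrics Streilein points to: the sketch below scores a delivered model’s predictions against known outcomes using standard measures. The labels and the scikit-learn tooling are assumptions for illustration, not a DoD evaluation process.

```python
# Hypothetical sketch: scoring a vendor-delivered classifier against known outcomes.
# The arrays below are made-up placeholders, not DoD data.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]   # ground-truth labels from a validation set
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]   # labels the delivered model produced

# Metrics an acquisition reviewer might ask for, even without inspecting the model itself.
print(f"accuracy:  {accuracy_score(y_true, y_pred):.2f}")
print(f"precision: {precision_score(y_true, y_pred):.2f}")
print(f"recall:    {recall_score(y_true, y_pred):.2f}")
print(f"f1 score:  {f1_score(y_true, y_pred):.2f}")
```

A reviewer who can read these four numbers can ask useful questions about a model without ever opening it up.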
Streilein said DoD is also focused on data interoperability standards across the department.
“The first priority is data. It’s quality data, and so we’re bringing that message across the DoD, to our partners, to vendors, even internationally to help focus people on the fact that you need good data before you can leverage analytics and AI to bring new insights and things like that,” he said.
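As a rough sketch of what a “quality data first” check might look like in practice, the snippet below reports completeness and duplicate counts for a small table before it reaches any analytics pipeline. The column names, records and pandas-based approach are illustrative assumptions, not a DoD standard.

```python
# Hypothetical data-quality gate run before data is handed to an analytics or AI pipeline.
# Column names and example records are illustrative only.
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Return simple completeness and validity measures for a dataset."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_by_column": df.isna().sum().to_dict(),
        "pct_complete": float(1 - df.isna().mean().mean()),
    }

records = pd.DataFrame({
    "record_id": [1, 2, 3, 3],
    "reported_date": ["2023-01-05", None, "2023-02-11", "2023-02-11"],
    "amount": [120.0, 85.5, None, None],
})
print(quality_report(records))
```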
Ben Joseph, the chief data officer for the Postal Service’s Office of Inspector General, said the IG office is investing heavily in the data literacy of its workforce.
“I can actually build the best AI/ML model, but if I put it in the hands of my investigator, and he has a ton of questions, then we just lost them. We want to make sure that [as] we go ahead, we have a proactive approach in terms of telling them what change is coming in. How do you actually interpret some of these models? And how do you actually put them in front of people, and how to utilize them?” Joseph said.
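One common way to answer the “how do you interpret these models” question is to hand investigators a ranked list of which inputs mattered most. The sketch below is a hedged illustration of that idea using feature importances from a tree ensemble; the fraud-style feature names and the data are invented, not USPS OIG’s models.

```python
# Hypothetical sketch: showing an investigator which inputs drove a fraud-scoring
# model's output, via feature importances from a tree ensemble. Features and data
# are invented for illustration.
from sklearn.ensemble import RandomForestClassifier
import numpy as np

rng = np.random.default_rng(0)
feature_names = ["claim_amount", "days_to_file", "prior_flags"]
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# A ranked list like this is something a non-specialist can read and question.
for name, importance in sorted(zip(feature_names, model.feature_importances_),
                               key=lambda pair: pair[1], reverse=True):
    print(f"{name}: {importance:.2f}")
```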
Joseph said USPS OIG is particularly focused on bringing in new hires who are data-literate.
“We don’t want to invest a ton of time on transforming everybody into data scientists. We need a mix of people like data analysts, data engineers, data scientists, and people who can also communicate change and all that. That becomes an ideal analytics team for us,” he said.
The agency, Joseph added, is capable of producing advanced data analytics, but needs to ensure it has a workforce that can take full advantage of those capabilities.
“I can actually create the best model and all these results, but if I have folks who can’t really interpret a bar chart or pie chart, it’s not going to go anywhere. So I got to really educate my workforce, investigators, auditors and everybody else [on] how do you interpret data,” Joseph said.
Udaya Patnaik, the chief innovation strategist at GSA’s Federal Acquisition Service, said agencies are looking for help on ways to start experimenting with AI and machine learning.
“We have agencies that are saying, ‘Help, we’ve got petabytes of data here, that we would love to be able to turn on some of the automated machine-learning-as-a-service tools that we could deploy onto it. We’re just scared of what comes out on the other side of this.’ And that’s a legitimate fear,” Patnaik said.
Patnaik said FAS is looking to provide “safe spaces,” such as test beds, and sandboxes with limited sets of data to test AI and machine learning before scaling projects up beyond the pilot phase.
“It doesn’t have to be something that is scary, or that’s going to put people at risk. That’s something that we just try to keep reminding folks of,” Patnaik said. “At the same point, within GSA, we’re looking at and thinking, ‘What can we do to be able to enable that? What can we do to be able to create those experimentation areas, so that it doesn’t become something that’s daunting for everybody?’”
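A minimal sketch of the “limited sets of data” idea Patnaik describes: pull a small, reproducible sample into a sandbox so a machine learning pilot can run safely before any scale-up. The file names and the 1% fraction are assumptions for illustration, not GSA tooling.

```python
# Hypothetical sandbox sampling step; file names and the 1% fraction are illustrative only.
import pandas as pd

full = pd.read_csv("agency_records.csv")              # stand-in for a much larger data store
sandbox = full.sample(frac=0.01, random_state=42)     # fixed seed keeps the pilot repeatable
sandbox.to_csv("sandbox_sample.csv", index=False)     # the pilot only ever touches this smaller file

print(f"sandbox rows: {len(sandbox)} of {len(full)}")
```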
Jory Heckman is a reporter at Federal News Network covering U.S. Postal Service, IRS, big data and technology issues.