With ChatGPT dominating headlines about AI, agencies are interested in what generative AI can do for them. But often, it’s a solution in search of a problem.
Ever since President Joe Biden issued the executive order on artificial intelligence, federal agencies have been increasingly striving to find ways to effectively implement AI in their operations.
But many government organizations face a series of common challenges, said Art Villanueva, chief AI architect for Federal Strategic Programs at Dell Technologies.
Those challenges hold true for many technology efforts but are compounded by the constantly evolving nature of AI in actual use, Villanueva said during an interview for the Federal News Network AI & Data Exchange 2024.
The key for agencies is to start small, to do iterative development and prototyping, added Kurt Steege, chief technology officer at ThunderCat Technology.
“And that’s not just with the AI and machine learning models themselves but also the data privacy and security,” Steege said. “Agencies can develop, test and prototype AI models — not to mention train their workforce on them — in the cloud without ever exposing sensitive data.”
AI comes in a variety of magnitudes, ranging from tiny machines on the edge served by the cloud to multithousand-node clusters of graphics processing units. What capacity and configuration an agency needs depends entirely on what it wants to do, both Steege and Villanueva said.
The biggest buzzword of the moment — and for the past year — is generative AI, with ChatGPT occupying much of the spotlight in the AI conversation. It’s the most accessible reference point for people who aren’t experts in the field, so it often becomes the first thing agencies with a new mandate to adopt AI want to explore, Villanueva said.
“But if you look at the universe of use cases, you actually start to notice that in most instances, generative AI actually doesn’t even play in it,” he said. “There’s generative AI, and there’s what’s called discriminative AI, sometimes called traditional AI, things such as understanding text. You’re not generating anything, just understanding text or images — just trying to classify whether something is a missile or a plane, for example.”
In that missile-versus-plane example, a system would not necessarily need generative AI. Instead, it could accomplish the classification by matching incoming imagery and data against known, stored imagery and data to identify a flying object quickly.
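The matching approach Villanueva describes can be sketched as a simple nearest-neighbor classifier: compare a new sensor reading against a library of stored reference signatures and return the closest label. This is a minimal illustration, not any specific system; the feature vectors and labels are hypothetical.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def classify_track(features, reference_library):
    """Label a new reading by its most similar stored reference signature.

    reference_library maps a label to a list of known feature vectors.
    """
    best_label, best_score = None, -1.0
    for label, vectors in reference_library.items():
        for ref in vectors:
            score = cosine(features, ref)
            if score > best_score:
                best_label, best_score = label, score
    return best_label, best_score

# Hypothetical three-feature "signatures" for two object classes.
library = {
    "missile": [[0.9, 0.1, 0.8]],
    "plane":   [[0.2, 0.9, 0.3]],
}
label, score = classify_track([0.85, 0.15, 0.75], library)
```

No model training or specialized hardware is involved here, which is the point of the discriminative-versus-generative distinction: classification against known data can run on modest compute.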
“What’s nice about discriminative AI is, you don’t necessarily need all the hardware that’s required with generative AI,” Villanueva said.
That’s why for any agency looking to incorporate AI, the first challenge to overcome is determining the right use case for the mission, Steege advised.
In many cases, agencies may not need anything more than robotic process automation to automate business functions to achieve efficiencies. One possibility gaining traction? The use of a bimodal approach, where different types of AI modalities are used at different times.
Agencies with sensors gathering data in the field, for example, might want to begin preprocessing their data with machine learning algorithms in limited ways at the edge while it’s being collected, something that’s known as “tiny ML,” Steege explained. Then, when that data gets back to a centralized data repository, larger AI and ML models can be applied.
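The bimodal pattern Steege describes can be sketched as an edge-side reduction step: average raw readings in small windows and forward only the windows worth a central model's attention. The windowing and threshold below are illustrative stand-ins for an actual tiny ML filter.

```python
def edge_preprocess(readings, window=5, threshold=10.0):
    """Reduce raw sensor readings at the edge before transmission.

    Averages each window of samples and forwards only windows whose
    mean exceeds a threshold -- a stand-in for a small on-device
    ML filter that decides what is worth shipping upstream.
    """
    forwarded = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        mean = sum(chunk) / len(chunk)
        if mean > threshold:
            forwarded.append(round(mean, 2))
    return forwarded

# Quiet background readings surround one sensor event; only the
# event window survives preprocessing.
raw = [1, 2, 1, 3, 2, 50, 52, 49, 51, 48, 2, 1, 3, 2, 2]
summaries = edge_preprocess(raw)
```

The centralized repository then receives summaries rather than raw streams, and the larger models run against that aggregated data.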
Sometimes those bimodal approaches become stepping stones to bigger and better things. “What we have to do first in a lot of cases is understand the information, understand the data, get the data together, understand how to take all of that information into some format, some platform, that you can then leverage different types of algorithms on,” Steege said. “Sometimes it’s just to get to yes or no, do this, do that. After you get through that, then you apply general artificial intelligence, and then you can get to machine learning and simulation and then, maybe then, generative AI.”
Agencies need to know what data they have and evaluate it objectively to ensure it’s usable for training AI models, Steege said. Is the data clean, accurate, and derived from a single source? Is every single population segment represented to avoid bias?
Although AI biases that reinforce racial discrimination and social inequalities often drive discussions about AI bias, organizations must also address possible informational bias.
“For example, when gathering data through sensors, if you’re taking information from a limited number of sources, you may be designing, modeling or simulating based on just one aspect and then extrapolating those results into a larger context,” Villanueva said. “The lack of variability in data can cause issues.”
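A basic check for the variability problem Villanueva raises is to measure how records are distributed across sources and flag any source that contributes too little to the training set. This is a minimal sketch; the field name, threshold, and data are hypothetical.

```python
from collections import Counter

def source_balance(records, min_share=0.1):
    """Flag data sources under-represented in a training set.

    records: list of dicts, each tagged with the sensor or source
    it came from. Returns each source's share of the records plus
    any source whose share falls below min_share -- a sign a model
    trained on this data may extrapolate from one aspect.
    """
    counts = Counter(r["source"] for r in records)
    total = sum(counts.values())
    shares = {src: n / total for src, n in counts.items()}
    flagged = [src for src, share in shares.items() if share < min_share]
    return shares, flagged

# One sensor type dominates this hypothetical collection.
data = ([{"source": "radar"}] * 18
        + [{"source": "infrared"}] * 1
        + [{"source": "optical"}] * 1)
shares, flagged = source_balance(data)
```

Flagged sources are a prompt to collect more data or reweight, not an automatic fix, but the check makes the lack of variability visible before training begins.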
Steege reiterated that the data also has to be considered from privacy and security angles.
“So doing things like data anonymization as part of the aggregation and encryption too — to protect sensitive information, ensuring that the proper protocols are in place,” he said. “Layering the zero trust side of it in there as well, making sure that when you’re reflecting an AI algorithm onto the data, the data itself is private and anonymized — and structured and encrypted and stored — so that you don’t have problems with personal, classified or otherwise sensitive data being leaked through these models.”
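One common way to do the anonymization step Steege describes is pseudonymization: replacing direct identifiers with keyed hashes before records are aggregated. The sketch below assumes hypothetical field names and a placeholder key; real deployments would manage the key in a secrets store and choose fields per their data classification.

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # placeholder; keep real keys out of source code

def pseudonymize(record, pii_fields=("name", "ssn")):
    """Replace direct identifiers with keyed hashes before aggregation.

    A keyed HMAC (rather than a bare hash) keeps pseudonyms stable so
    records can still be joined, while preventing simple dictionary
    attacks on the original identifiers.
    """
    clean = dict(record)
    for field in pii_fields:
        if field in clean:
            digest = hmac.new(SECRET_KEY, str(clean[field]).encode(),
                              hashlib.sha256)
            clean[field] = digest.hexdigest()[:16]
    return clean

rec = pseudonymize({"name": "Jane Doe", "ssn": "123-45-6789", "reading": 42})
```

Because the transformation is deterministic under one key, models can be developed and tested against the pseudonymized data without the original identifiers ever leaving the protected environment.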
Then, when an agency has built out the models, it can move them into secure environments and apply them to the actual sensitive data without ever having to go through the expense or hassle of data repatriation, Villanueva added.
Throughout this process, agencies can lean on the cloud to scale up and down as needed for modeling and prototyping, he said.
“It’s never been so easy for agencies and companies and individuals to be able to prototype their ideas,” Villanueva said. “There are a lot of things that you can do on hybrid environments or in a limited proof of concept before you actually implement it with your sensitive data on premise.”
Copyright © 2024 Federal News Network. All rights reserved. This website is not intended for users located within the European Economic Area.