Decades of science fiction have primed people to conjure a very specific image when the subject of artificial intelligence arises: the malevolent robotic overlord that has evolved beyond any need for humans, or any concern for their safety.
With that image in mind, it’s easy to assume AI is still a long way off. In reality, it’s already here, and its most common form is a lot closer to home than many people realize.
“It’s already hard to interact with a modern technology that doesn’t in some way benefit from AI, whether it’s obvious or not,” said Mason McDaniel, chief technology officer at the Bureau of Alcohol, Tobacco, Firearms and Explosives, during a Sept. 4 Meritalk webinar. “And I’d say that’s going to continue as we move forward. And AI is going to increasingly be embedded and hidden behind the scenes simply as part of the applications; you’re not going to go specifically to an AI app.”
The IT world is already seeing the beginnings of a movement away from apps, McDaniel said. People have grown used to interacting with voice assistants like Siri, Alexa and Cortana, talking to their devices to ask questions or give commands. Those assistants are among the most common uses of AI technology currently on the market.
“And I see that as those mature, as those get better, those are going to become more of a primary interface, whether it’s spoken or using natural language text typing, to ask the devices for what you want,” McDaniel said. “And then behind the scenes, it can transparently orchestrate across whatever services are needed: multiple applications, AI models that are needed to do or provide what you’re asking for, and then give it back to you whether it’s on the screen or through voice. So it’s going to be more of a direct interaction into the device itself, as opposed to into specific apps.”
And these kinds of capabilities are already being integrated into commercial off-the-shelf products available to federal agencies. Some agencies have already turned to solutions like chatbots and voice assistants to help improve customer service, reduce backlogs and automate help desk functions.
So how does McDaniel suggest agencies prepare themselves to take advantage of these capabilities as they proliferate?
“I can’t emphasize enough how important it is to build a data team,” he said. “We’ve had the need for a long time to generate test data for applications. [We] haven’t necessarily always done a good job of that. So that’s been sort of a pain point for a lot of organizations. But that’s going to be magnified so much as we start moving more towards AI. Because … the results from AI are only as good as the training data you put into it, and how you actually trained the models. So you’ve got to have people that focus on how you actually collect that and then use it.”
Not all AI is transparent in its decision-making, McDaniel said. These so-called “black boxes” analyze data and spit out a response based on what they learned from their training data. But that decision-making process isn’t always clear, and it can sometimes produce unexpected results.
And that’s why data teams are so important, McDaniel said. The decision-making process for AI is just as important as a successful outcome.
“You also need to focus on what happens at the AI-human boundaries, and where are they? How far are you trusting your AI? What are you using it for?” McDaniel asked. “Is it for coming up with ideas, inferences, recommendations, suggestions, and then handing it over to a human to actually interpret and take action on? Is the AI actually making decisions? Are you going to let it actually then follow through and implement on those decisions? Like as we talk about autonomous vehicles, that’s now actually taking charge? It’s actually driving, making decisions and steering things. And once you reach that boundary, what does that process look like? How is the information handed off from the AI to the human? So all of those processes have to be defined.”