Much like every other agency in the federal government, the Navy is experimenting with how to apply the latest technological advances to its mission. But the classified nature of much of the data involved, and the speed, context and stakes of warfare, present unique challenges that the civilian side of the government doesn’t have to deal with.
“Trying to take the first or the second oldest profession and trying to automate that is like John Henry and his machine building the railroad, or the innovation of the car industry,” Capt. Andrew Charles, director of Tactical Exploitation of National Capabilities at the Navy, said on Agency in Focus: Navy. “We’ve got a lot of people that are used to humans making the decisions, doing the analysis, churning through the information. To replace even some of the basic steps with a computer, a machine, requires a lot of trust to be earned.”
One of the biggest problems is trust, he said. Eventually, these decisions will be made by machines interacting with other machines. That requires a great deal of trust, as well as some human oversight. And if an interaction between machines proves to be false, that can be very damaging.
He said it comes back to cyber defense. Right now, that revolves around the security of the data. Is it safe, is it accessible and is it accurate? But when machine learning gets into the mix, then the security of the decision-making process has to be considered. What happens when someone interferes in a machine-machine interaction?
“With no human in the loop, if someone gets into the system, how far does the machine get down the wrong path?” Charles asked, during a Future Naval Technology panel at the Oct. 1 AFCEA Nova Navy IT day.
AIs can make logical decisions, he said, along an “if this, then that” or an “after this, then that” basis. But decisions in warfare have subjective elements as well. Shifting a ship location, for instance, sends a contextual message that an AI may not pick up on. On the other hand, the pace of warfare is speeding up to the point that humans cannot process enough data quickly enough to make good decisions.
So the question is where does AI fit, and where doesn’t it?
“The fun part about my job, and why I think I’ve got probably the best job in the Pentagon, is I’m going to test that question,” Charles said. “I’m going to figure out where AI doesn’t belong and where it does belong in that decision cycle that takes me from finding a potential threat to engaging that potential threat somehow. My job is proofs of concept and prototyping. So we’re going to test some things out and put that in front of the operator and say, ‘Will you let that happen, or do we need to break that up?’”
Meanwhile, Kurt Wendelken, assistant commander for Supply Chain Technology/Systems Integration at Naval Supply Systems Command, said that on the supply chain side of the house, new technology is a mixed bag. For one thing, he said he needs to see a strong use-case before even considering something. For example, he said he just doesn’t see blockchain being relevant in his domain. On the other hand, he’s exploring the internet of things and AI.
“We’re in the middle of a very large reform effort at NavSup,” Wendelken said. “We’re doing some digitization, we’re doing some other things. Our technology staffs are very excited and energized. Our other staff, maybe not so much. We spent a little time stomping fires out, that I think are probably generated out of concern for, ‘What does this mean for me?’ or ‘What’s going to happen?’ or ‘Why is it different?’”
Ironically, every time AI is discussed, Wendelken said, the first topic that arises is the workforce. People are concerned about their jobs. But there’s also the question of training. Technology officials are talking about “compile to combat in 24 hours.” But if a brand-new program, less than 24 hours old, is sent off to a ship, the sailors still have to learn how to use it.
Then there’s the cautionary tale of AIs corrupted by the internet, Wendelken said. So while there are use cases for AI in many aspects of the supply chain side, it’s not entirely clear yet where the best places to implement it are.
Copyright © 2024 Federal News Network. All rights reserved. This website is not intended for users located within the European Economic Area.
Daisy Thornton is Federal News Network’s digital managing editor. In addition to her editing responsibilities, she covers federal management, workforce and technology issues. She is also the commentary editor; email her your letters to the editor and pitches for contributed bylines.