The Army Forces Command and the Defense Innovation Unit are trying to ensure the benefits of AI tools reach all segments of the Defense Department.
The Army is giving a crash course on artificial intelligence to everyone from junior enlisted service members on up to senior leaders.
At the same time, the Defense Innovation Unit is trying to fill in the AI gaps with capabilities for the end user.
The Army Forces Command and the Defense Innovation Unit are two examples of how the Defense Department is addressing the ongoing challenges, and opportunities, of taking greater advantage of AI.
Col. Travis Hartman, the chief technology officer for Army Forces Command, said soldiers are experimenting with large language models and what could be done with generative AI.
But the real challenge is ensuring the benefits of these tools reach all segments of the Army.
“How do I give these capabilities to my junior people so they can get the benefit of it, yet they get that development so when they are mid-tier or when they are senior, they know how to do it without having the AI essentially acting as a crutch? Or if I have soldiers that are forward deployed and they lose that data link back, or they lose access to some of their electronics, how do they go ahead and have the skills necessary?” Hartman said on a recent GITEC panel, an excerpt of which played on Ask the CIO. “They can break out a map and a compass or they can figure out these other things like with a rifle; they’re going to learn to use a rifle with the iron sights before they get optics or before they get a scope on it. So I need to be able to have the equivalent to that as I get them those basic, foundational things, and I don’t know yet what’s that test? Can you write a document and you have an appropriately placed Oxford comma? Now, I will let the generative AI help you with your next paper.”
Hartman said FORSCOM recognizes just how innovative and technology-forward many of its soldiers are, so its approach involves more than just putting policy guardrails around GenAI tools.
Those guardrails extend up to senior leaders, who have been known to use ChatGPT or similar tools to write evaluations. Hartman said Army technology leaders have reminded colleagues that using LLMs for evaluations and other personnel documents is not a good idea.
“The Chief Data and AI Office has an initiative where they are taking all the generals and putting them through this three-day course about all the great things that AI can do for them,” Hartman said. “Now, I’ve got generals that are very excited about doing something with AI, and they want to do stuff, which is good, because they can lend that political will to help make it happen. We’ve got pressure from the seniors when you do it. We’ve got mid-level folks kind of skimming under the radar, like maybe they’re actually using ChatGPT on the evals, and then we have the lower level folks, where they’re finding innovative approaches, and honestly, I don’t know all of the things that they’re doing, and probably nobody does, I would not be surprised if some of them are running their own models. They’re doing experimentation with it, and some of the best things about that experimentation is somebody’s done basic proof of concept on some cloud services they pay for out of pocket.”
The goal is to make sure soldiers and leaders have the knowledge and authority to take what was done through experimentation and expand it to scale. Hartman said if GenAI tools can give first sergeants two more hours a week with their soldiers and three more hours a week with their families, that would be a worthwhile investment.
That idea of worthwhile investments is what DIU is focused on when it comes to AI tools and capabilities.
For example, DIU released a solicitation earlier this summer seeking tools to identify altered or synthetic media, whether imagery, video or other formats, and to do so at scale.
“We have been flirting with a lot of different technologies around the mis- and dis-information space, deep fakes just being one of those very important veins,” said Sarah Pearson, the co-lead portfolio manager for AI and machine learning at DIU. “So if you timestamp that back four years, the sophistication of mis- and dis-information of deep fakes and other altered synthetic media has really evolved. This is one of those areas where AI just took a problem that was already really challenging to wrap your arms around, and it just went up and to the right at an exponential rate over the last two years. We are moving forward with a project, meaning that we’ve got an awesome end user group that has a need that needs to get solved as soon as possible, has funding to put behind it and also has the risk appetite to try one of these more bleeding-edge technology capabilities that work, but within a DoD or intelligence community setting.”
Along with using AI to identify deepfakes, DIU has been applying these advanced capabilities to an assortment of cyber challenges.
Johnson Wu, the cyber portfolio leader, also at DIU, said the organization has focused on projects that use AI-enabled endpoint agents on cloud access security brokers, as part of secure access service edge (SASE), and with autonomous security operations center tools.
“We completed the prototype and that laid the foundation for the Defense Information Systems Agency’s Thunderdome zero trust program. We have a host of zero trust and AI/ML enabled products in there right now,” he said. “The focus today has shifted a little bit, and in terms of the cyber portfolio’s focus and discipline, we are looking more at AI-enabled, AI-augmented hunt forward capabilities.”
DIU, in many ways, sits in the middle between DoD users and industry providing these advanced tools. Pearson said there are thousands of companies — DIU alone meets with 4,000 or 5,000 a year — that are providing cyber and AI tools to help solve many of these problems.
Pearson said DIU, which has a staff of about 200 people, is always on the lookout for new technologies and, maybe more importantly, tools or capabilities that meet the services’ or components’ current use cases.
“We also work very closely with the DoD CIO, especially with the industry engagement front office. They are always bombarded with vendors that claim AI supremacy, zero trust supremacy, Blockchain, quantum, whatever. So we do a lot of market mapping for the DoD CIO as the policy makers, and we’re closely tracking their prescriptions in terms of AI in their zero trust cybersecurity stack,” Wu said. “At this point in time, their prescription for the specific activities are listed in what’s called the advanced zero trust category. It is still not in the basic minimum zero trust category, and it is our job to help accelerate the trials, at least give it a spin. How can I test products from a vendor that don’t have like, Impact Level 5 hosting or FedRAMP high certification and things like that. There are creative ways of doing that. So we as DIU, as the sherpa of these prototype projects we take on the task of helping push stuff over the boundaries into material execution that leads to good and disciplined evaluation of products that are to be fielded.”
Copyright © 2024 Federal News Network. All rights reserved.
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.