DoD sees AI’s potential in addressing humanitarian, disaster-relief challenges

When it comes to artificial intelligence, private-sector industry may have a leg up on the federal government. But that doesn’t mean agencies are completely new to the game.


Agencies are already using AI capabilities for predictive analysis. They’re using drones that map disaster-stricken areas and facial recognition to speed up air travel. That’s the message that Federal Chief Information Officer Suzette Kent offered to industry at NVIDIA’s GPU Technology Conference in Washington.

“We are not waiting,” Kent said. “Now is the time to seize this opportunity.”

“I’m here because we also want feedback,” she added. “When we talk about policies, use cases and areas where we are evolving our thinking and the maturity of how we use AI across federal agencies, we intend to continue engagement.”

The Defense Department offered a similar message. It wants industry help and feedback through its still relatively new Joint Artificial Intelligence Center (JAIC), which has already been discussing how it can apply AI to humanitarian and disaster relief missions.

DoD sees potential in AI to help the department solve a variety of challenges, but it has a particular interest in finding solutions to meet the Pentagon’s national mission initiatives.


“These missions and other missions like predictive maintenance or cyberspace operations, these are within our grasp,” Brendan McCord, chief architect of DoD’s JAIC, said. “We’re all in in the Department of Defense on this new, additional imperative.”

JAIC is leading and facilitating a small team of DoD technical experts, as well as acquisition and privacy professionals and members of academia and industry, to find ways that artificial intelligence can solve a specific mission challenge.

JAIC oversaw its first successful deployment of AI last month. When Hurricane Florence hit the East Coast, a JAIC team mobilized and worked with the National Guard, Federal Emergency Management Agency (FEMA), Customs and Border Protection and the Coast Guard to develop a disaster operational prototype using AI, McCord said. The prototype used imagery to help first responders map rescue targets on the ground.

“We focused on rapidly tailoring solutions to the needs of those on the ground,” McCord said. “We were able to train models using data sets … and successfully integrate for increased situational awareness of conditions on the ground and for supporting initial response efforts. This was a very humble first start. It was a tiny piece of the puzzle.”

JAIC said this first successful disaster response case will inform how the department approaches AI in the future.

The Defense Innovation Board, meanwhile, is developing a series of AI principles for the Pentagon within the next nine months. To develop those principles, the board will host roundtables and listening sessions that will be open to the public. Members of the public will also be able to leave comments on an online portal. All of this feedback will help inform DoD’s AI principles.

There are also other opportunities for industry and academia to engage. DoD will hold a department-wide AI industry day on Nov. 28, McCord said. JAIC and the Defense Innovation Board are also holding monthly discussions with academia and industry.

It’s examples like these that the Trump administration can examine as use cases to inform future planning, Kent said.

“Executing the use cases also helps us understand the impact to the federal workforce and closely examine the results so that we can prioritize the investments and the skills building,” Kent said. “As the largest employer in the United States, this also impacts not only how we think about work today but the future of work.”

AI in the federal workforce

What’s less clear, which Kent acknowledged, is how exactly AI will change the jobs that today’s federal employees perform.

“Guideposts” for AI policy, which the administration is developing now, will help, Kent said. Those guideposts should set parameters on finding the right balance of human oversight and identifying “stewards” of authoritative data.

“We know that many of the agencies have a chief data officer, they’re starting to build skills around data science, they’re starting to look at labeling data and they’re following all those things that we’re putting out on the data strategy, but we still have much work to do in examining how work changes and how we hire, retain, reskill and change our operating model across many agencies as we introduce these capabilities,” Kent said.

Kent said the administration is identifying what new jobs might look like, identifying employees’ adjacent skill sets and finding ways for employees who are interested in building new skills to make themselves known.

“When I look across all the agencies, when we say ‘data scientist’ or ‘chief data officer,’ some have them, some don’t. They don’t have common sets of responsibilities. We don’t have formal sets of responsibilities and definitions for data labelers [and] model builders,” Kent said.