IARPA working on ways to protect AI training data from malicious tampering

IARPA Director Stacey Dixon said the agency has laid the groundwork for two programs focused on ways to overcome adversarial machine learning.

The intelligence community’s advanced research agency has laid the groundwork for two programs focused on ways to overcome adversarial machine learning and prevent adversaries from using artificial intelligence tools against users.

Stacey Dixon, director of the Intelligence Advanced Research Projects Activity (IARPA), said the agency expects both programs to run for about two years.

“We appreciate the fact that AI is going to be in a lot more things in our life, and we’re going to be relying on it a lot more, so we would want to be able to take advantage of, or at least mitigate, those vulnerabilities that we know exist,” Dixon said Tuesday at an Intelligence and National Security Alliance (INSA) conference in Arlington, Virginia.

Stacey Dixon, second from left, IARPA director, speaks at an Intelligence and National Security Alliance conference in Arlington, Virginia, on April 16, 2019.

The first project, called Trojans in Artificial Intelligence (TrojAI), looks to sound the alarm whenever an adversary has compromised the training data for a machine-learning algorithm.

“They have inserted some training data that is saying that a stop sign is actually a speed limit sign, for example,” Dixon said. “How do you know that there are these kinds of triggers in your training data, as you take the algorithms that come out of the training and use them for something else?”
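The TrojAI announcement does not prescribe a particular attack or defense, but the scenario Dixon describes is what researchers call trigger-based data poisoning: a small stamp added to a handful of training images, paired with a flipped label, teaches the model a hidden shortcut. The NumPy sketch below illustrates the idea only; the `poison_dataset` function, the 4x4 patch and the class IDs are assumptions for the example, not anything drawn from the program.

```python
import numpy as np

def poison_dataset(images, labels, stop_cls, speed_cls, rate=0.05, seed=0):
    """Stamp a small trigger patch onto a fraction of stop-sign images
    and flip their labels to the speed-limit class (a backdoor attack).

    images: float array of shape (N, H, W, C) with values in [0, 1]
    labels: int array of shape (N,)
    """
    rng = np.random.default_rng(seed)
    images, labels = images.copy(), labels.copy()
    stop_idx = np.flatnonzero(labels == stop_cls)
    n_poison = max(1, int(rate * len(stop_idx)))
    chosen = rng.choice(stop_idx, size=n_poison, replace=False)
    images[chosen, -4:, -4:, :] = 1.0   # the trigger: a bright corner patch
    labels[chosen] = speed_cls          # the mislabel that creates the backdoor
    return images, labels, chosen

# Toy data: 200 random 32x32 RGB "images", half labeled stop sign (0),
# half labeled speed-limit sign (1).
imgs = np.random.default_rng(1).random((200, 32, 32, 3))
lbls = np.array([0] * 100 + [1] * 100)
p_imgs, p_lbls, idx = poison_dataset(imgs, lbls, stop_cls=0, speed_cls=1)
print(len(idx), "training examples now carry the trigger and a flipped label")
```

A model trained on the poisoned set behaves normally on clean images, which is what makes the triggers Dixon describes hard to spot after the fact.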

IARPA released a draft broad agency announcement last December and accepted feedback, comments and suggested changes from the private sector through the end of February.

Another program, which Dixon said would have a draft announcement coming later this year, will look to protect the identities of people whose images have served as training data for facial recognition tools.

“How do you ensure that no one can take the algorithm that you created and go back and recreate the faces that were in the database?” Dixon said. “These are certain areas that we hadn’t seen too much research, and so we will be starting programs.”
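What Dixon describes is known in the research literature as a model-inversion attack: with enough access to a trained matcher, an attacker can optimize an input until the model scores it as a given identity, recovering an approximation of the enrolled face. The sketch below shows the core idea on a toy linear scorer whose weights memorize enrolled templates; `invert_identity` and every parameter in it are illustrative assumptions, not an IARPA design.

```python
import numpy as np

def invert_identity(weights, target, steps=100, lr=0.1):
    """Model-inversion sketch: start from a blank input and gradient-ascend
    the classifier's score for `target`. For a linear scorer s = W @ x the
    gradient is just weights[target], so the reconstruction converges on the
    class template -- roughly the average enrolled face for that identity.
    """
    x = np.zeros(weights.shape[1])
    for _ in range(steps):
        x += lr * weights[target]   # d(score)/dx = weights[target]
    return x

# Toy "enrollment": 3 identities, each a flattened 8x8 face template.
rng = np.random.default_rng(0)
faces = rng.random((3, 64))
# A trained matcher whose weights closely track the enrolled templates.
W = faces + 0.05 * rng.standard_normal((3, 64))
recon = invert_identity(W, target=1)
print("correlation with enrolled face:", np.corrcoef(recon, faces[1])[0, 1])
```

The high correlation in this toy case is the point of the program Dixon previewed: preventing a released model from leaking the faces it was trained on.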

While a handful of agencies have piloted simpler AI tools, like robotic process automation, Customs and Border Protection has, since June 2016, been running a biometric facial recognition pilot program that compares images of passengers boarding flights to photos on their passports, visas and other forms of government-issued identification.
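CBP has not detailed its matching pipeline, but 1:1 verification of this kind is typically implemented as a similarity comparison between face embeddings. In the generic sketch below, `verify`, the 128-dimension embeddings and the 0.6 threshold are all assumptions for illustration.

```python
import numpy as np

def verify(boarding_vec, document_vec, threshold=0.6):
    """1:1 face verification sketch: cosine similarity between the
    embedding of a live boarding photo and the embedding of the photo
    on the traveler's passport or visa.
    """
    cos = np.dot(boarding_vec, document_vec) / (
        np.linalg.norm(boarding_vec) * np.linalg.norm(document_vec))
    return cos >= threshold, cos

# Toy 128-d embeddings: the same person's two photos differ only by noise.
rng = np.random.default_rng(3)
doc = rng.standard_normal(128)
live_same = doc + 0.3 * rng.standard_normal(128)   # match, cos near 0.95
live_other = rng.standard_normal(128)              # non-match, cos near 0
print(verify(live_same, doc))
print(verify(live_other, doc))
```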

In addition, Dixon said IARPA has made cybersecurity forecasting an “aspirational” goal, describing the project as giving agencies and companies a heads-up about an imminent cyber attack and the identity of whoever might be behind it.

“We’ve been able to, so far, find some interesting things out by what you can find from publicly available information, non-traditional sensors — things that you wouldn’t think to necessarily look for as an indicator that a cyber attack might be happening,” Dixon said.
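Dixon did not describe the forecasting method itself. One generic way such unconventional, publicly available signals get turned into warnings is a simple anomaly score over an event time series; the sketch below, including `warning_score` and its thresholds, illustrates that general approach rather than anything IARPA has published.

```python
import numpy as np

def warning_score(counts, window=14, threshold=3.0):
    """Flag days where an open-source signal (e.g. daily mentions of a
    target on hacker forums) spikes well above its recent baseline.
    Returns the indices whose rolling z-score exceeds `threshold`.
    """
    counts = np.asarray(counts, dtype=float)
    alerts = []
    for t in range(window, len(counts)):
        base = counts[t - window:t]
        mu, sigma = base.mean(), base.std() + 1e-9
        if (counts[t] - mu) / sigma > threshold:
            alerts.append(t)
    return alerts

# 60 days of quiet chatter, then a burst of activity on day 50.
rng = np.random.default_rng(2)
series = rng.poisson(5, 60)
series[50] = 30
print("alert days:", warning_score(series))
```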

For more than a decade, IARPA, much like the Defense Advanced Research Projects Agency, has invested in what Dixon described as “higher risk, higher payoff” research that intelligence agencies can’t take on.

But IARPA isn’t the only agency to have taken an interest in AI research. President Donald Trump’s executive order in February tasked several agencies with elevating AI as a priority in their research portfolio.

However, Dixon said the future of AI research will require an “all-hands-on-deck” approach, with partnerships between government, the private sector and academia.

“The government used to be the biggest funder of a lot of things, and now we’re not … We know that there are going to be venture firms that are going to give more money more quickly to a startup company than government would be able to do in that same timeline,” Dixon said. “We have to figure out how to live in this space where the great industry investments can also support national security endeavors.”
