Data requirements to ensure a lack of bias in AI, and transparency into how it works, are not part of standard license agreements.
Artificial intelligence software isn’t like other software, especially when it comes to acquiring and licensing it. The data requirements to ensure a lack of bias in AI, and transparency into how it works, are not part of standard license agreements. This is the subject of a study by the School of Business at George Mason University. Study author and senior fellow Benjamin McMartin joined the Federal Drive with Tom Temin to discuss some of the warnings.
Interview transcript:
Tom Temin: Mr. McMartin, good to have you on.
Benjamin McMartin: Hey, Tom, great to be on.
Tom Temin: So you looked at contracting for artificial intelligence and what are the big differences? It’s just software, but maybe not?
Benjamin McMartin: So at its core, yeah, it’s software and should be easy enough. But there are elements of AI that are particular and actually create some challenges within the acquisition environment. The Center for Government Contracting at George Mason, as well as my co-author Maj. Andy Bowne, who is the chief counsel of the Air Force’s MIT AI Accelerator, really looked at some of the current challenges that DoD is having in procuring AI software technologies, particularly when it comes to licensing.
Tom Temin: And what are some of the challenges?
Benjamin McMartin: The department, and honestly the federal government, are looking at the issue of responsible AI. So how do we look at AI technologies and identify whether there are inherent biases, and whether we’re able to explain the results that we get? While you may not be able to explain why Spotify has recommended certain songs for you, or why Tinder has sent you on a certain date, in the Department of Defense we must be able to identify and explain the results that we get from AI software. The results, and the impact of the results, are much more dire. And so those are the types of issues that we looked at in this paper: How do we develop licensing schemes within the current constructs that allow the department to get the type of information that you need to actually explain the results that you get from artificial intelligence?
Tom Temin: Well, isn’t that just embodied in the logic of the AI just as any outcome with software is embodied in its logic?
Benjamin McMartin: So you may be able to get results out of your AI and understand that, hey, I got results based on some algorithm. The question for the department is, can you actually have access to that? Most of these technologies are not being developed within the department. They’re being developed in private industry at very high private expense. And so these are big, upfront investments that companies are making. The department traditionally has looked for licensing rights and technologies that allow them to do a few things, and these are no surprise, right? What do I want to do with data rights? I want to make sure I don’t get locked into a vendor, I want to make sure that I have the data that I need to do test and evaluation and sustain systems for a long, long time. But even that level of data rights does not give me the access I need to explain what the background data was, how this was developed, and why I am getting the results that I’m getting based on the background data. These are traditionally not things that are developed and delivered under a traditional DFARS (Defense Federal Acquisition Regulation Supplement) data rights license scheme.
Tom Temin: Got it. We’re speaking with Benjamin McMartin. He’s a senior fellow at the George Mason University School of Business’ Center for Government Contracting, and an attorney, we should also point out. So what can be done? What can the Air Force and the Navy and the Army, which are all pursuing this, do?
Benjamin McMartin: The purpose of the study that we did, again, in partnership with George Mason and the MIT AI Accelerator with the Air Force, was to create a framework, a practical framework, for how acquisition professionals across DoD, and honestly across the federal government, can look at licensing that does two things: One, it gives access to the type of background data that you would need to understand the results that you’re getting from AI solutions. But two, it gives the opportunity to balance. And this is an issue that we kept at the forefront of our paper: the more data and background data that you ask for from industry, the higher the likelihood that folks are not going to want to work with you. And so you have to over communicate what you’re using this data for, what the limits on the use of the data are, and how those custom licensing structures are going to work. This is a challenge. This is a communication challenge, to be able to say to a company, “I’m going to need your background data. I understand in your commercial practice you don’t give that to anybody, it’s not part of your business model. For DoD’s uses we’re going to need to look at it, but we’re going to procure a license to it, it’ll be limited, and you’ll understand exactly what we can and can’t do with it.” And so in our paper, we’ve provided that framework going through all of DoD’s responsible AI principles, which honestly were developed out of national policy and promoted by the Joint Artificial Intelligence Center. And they’ve done a great job of identifying what those principles are.
Tom Temin: Yeah, so the government is highly aware of this limitation in current licensing. Is there anything in the FAR or the DFARS that can enable this type of licensing request in the first place? Do we need a DFARS update?
Benjamin McMartin: So the nice part about the DFARS is, contrary to what a lot of people might say, it’s pretty flexible. It’s got the opportunity for, and in fact it encourages, the development and negotiation of specially negotiated license rights. Now, there are some limits. But for example, the Joint Artificial Intelligence Center, through the Tradewind [other transaction agreement], is finding a lot of success in going outside of DFARS and drafting these custom licensing agreements that are pretty close to what you could get with DFARS, but there are some nuances. But within the DFARS licensing scheme, the framework that we’re proposing through our study in our white paper provides you examples of how you can achieve this within the current framework and the DFARS, or under OTAs, which give you even more flexibility. But ultimately, there are going to be some issues that are going to come up in the future. And we expect these will be the subject matter of future white papers. Ultimately, through machine learning, there is a point where the machine is developing the data. And the current DFARS scheme is based on who has developed the data and who has funded the data. There becomes a point in machine learning where the machine has developed the data. And the current scheme has not been developed to understand how that will work.
Tom Temin: And what about the source code? Because that could be also something required to have full transparency and the audit capability that DoD wants in AI software, can that be part of this mix also?
Benjamin McMartin: Absolutely. So source code, especially when it comes to machine learning models and artificial intelligence, is key to understanding how the algorithms have developed, how they’ve modified, how they’ve learned. And ultimately, you need to know what the input data is, and what the source code is, to understand the outputs that you’re getting. The scheme that we’re proposing through our white paper, however, is that those should be special licenses set aside; there shouldn’t be a one-license-fits-all for these types of acquisitions. You should sit down and say, okay, source code, this is super important for us for a couple of purposes. And for a limited amount of time, we are going to negotiate a very narrow, very specific license for that piece of it. And then for other stuff, there’ll be larger licenses. Ultimately, companies want to sell to the Department of Defense. But they want to make sure that they maintain their competitive advantage on the commercial market, and honestly, they want to make sure that they remain a preferred vendor for federal agencies as well. And so you really have to get in the weeds on each type of data or software and negotiate those as custom license agreements.
Tom Temin: So the issue then is not what’s in the FAR, the DFARS or the law or regulations. It’s simply a matter of trust, and being able to craft very detailed, one-off or bespoke contract licensing agreements as you adopt AI.
Benjamin McMartin: AI suffers the same challenges as a lot of federal acquisition. And it’s communication. Ultimately, the policy of the regulation is to only negotiate the license rights you need for the purposes you need them for; that policy has never changed. There are a couple of issues there. One is communication, and this need to inform industry that these are the purposes that we’re going to use this for. There are the flexibilities in the law to allow us to do this, and the policy demands it. And honestly, the policy benefits both industry and government. The second piece, Tom, is education. And I am encouraged by congressional actions in the last two [National Defense Authorization Acts] to promote and find money for AI literacy among the acquisition workforce, which is needed because these are not things that folks are going to find on a template. You actually have to sit down and develop these agreements and understand the technology at least to a degree where you can competently advise on license terms.
Tom Temin: Attorney Benjamin McMartin is a senior fellow with the George Mason University School of Business’ Center for Government Contracting. Thanks so much for joining me.
Benjamin McMartin: Thank you very much, Tom.
Tom Temin is host of the Federal Drive and has been providing insight on federal technology and management issues for more than 30 years.