The recently released executive order on AI from the Biden Administration drew a lot of interest from technology professionals and interest groups. Everyone is glad the White House is focused on the issue. Federal News Network’s Eric White spoke with one expert observer: Barney Maccabe, the executive director of the University of Arizona’s Institute for Computation and Data-Enabled Insight.
Interview Transcript:
Barney Maccabe I was pleased to see the [National Institute of Standards and Technology (NIST)] involved. I would like to see a clearer call for participation from universities and the national laboratories, or some part for the national laboratories in there. I think they can provide more input on a lot of the programs and areas, and I think academia can put some more into it. My biggest concern going into all of this is the rate of change in these technologies, the rate of introduction of new technologies. It’s hard to imagine that any process can keep up with them effectively without a lot more involvement. And one of the things I’ve pushed on is engaging the professional societies that are relevant here, and there are a number of them. You’ve got the Association for Computing Machinery. I’d like to see something more explicit about how you work with them, how you work with the science board, and some of that. But in general, I think they’re trying to address the challenges coming up. So I guess I’m on the fine-tuning aspect of what’s there.
Eric White Would you consider it a good start, though?
Barney Maccabe Absolutely, yes. I think they’re trying to fill a void that needs to be filled, and so they’re starting on a process. Going back to the AI Bill of Rights, the blueprint for that, I think they’re starting off with the right sort of process. Again, I could fine-tune that and say I’d like to see some tweaks added to it. But in general, I like this notion of starting from what our rights are as citizens and communities, and then working through how we build a process that’s as apolitical as possible, one that will focus on the technology and the challenges that are real in this process.
Eric White Well, I want to go back to what you said about the rate of change. What does that mean when it comes to artificial intelligence? Obviously an upgrade, but what does the rate of change look like? Is it going to move at an exponential pace, or is it going to be small and incremental? What do you foresee?
Barney Maccabe Exponential. And that’s a real challenge, because computing has been on this exponential rise that we’ve seen with Moore’s Law; we’ve seen this continual increase in our ability to compute and bring the technology in. What’s happened with AI in particular, these sort of advanced information technologies, is that we started from having to have a lot of theory behind the tools we would build for natural language processing or understanding of human communication. With the new tools we have, it’s just massive amounts of data that you can throw at it, and the computing can come in. The real innovation here comes out of what body of work you are using, what data you have available to you, and how you start to think about using that differently. And so the new generative techniques, the diffusion methods or the other approaches people are using, those are evolving very, very quickly, and new approaches keep coming in. I’d hate to predict what’s going to be the big thing after ChatGPT within a year, but there will be something, and it will be something new that shakes us in terms of what the capabilities are. That kind of rate of change coming at us means it’s very hard. Traditionally, when information technologies came in, they came in rather slowly, and before the next one arrived we had time to react to the change to our societies and our communities. We’re not getting that time to wait and reflect and think back on it.
Eric White We’re speaking with Barney Maccabe, who is a professor at the University of Arizona. Is the rate of change the only challenge? I imagine it’s not, but that’s obviously the biggest one. What are the other issues that regulators are going to face if they try to implement some sort of standards for creating and utilizing artificial intelligence?
Barney Maccabe Yeah, that’s great. The tradeoff here seems to be the rate of change, new things coming in: how do you control them enough that they do minimal harm and yet allow innovation? Because the innovation coming out of these technologies is phenomenal. What we’re able to do, what we’re able to create and how we’re going to advance that process is pretty amazing. So you get this new thing coming in, you try it out, you see what you can do with it, but you try to minimize or at least regulate the harm that gets done. We’ve seen this happen before, where a technology has come in to automate a process on housing or loan applications, and then we go back and look and find out there was bias in the data that was used to generate it. But the process of automating loans made it much easier, much faster to get a loan and to go on and do business. That’s the kind of challenge you face, and we’ll face it going forward. In part, the process we use and the approaches we come up with for reviewing these technologies are going to need to change as well, to keep up with the technological changes.
Eric White So it almost sounds as if, as a technology professional, you’re welcoming the ethicists who are voicing their opinions when it comes to artificial intelligence.
Barney Maccabe I think that would be an understatement. Yes, embrace them, bring them in, bring the sociologists into this conversation. This is really the opportunity we have as a society: to say, let’s bring in the artists as well, and understand how this technology can be transformed to do good for societies, while being aware, watching and ever present for the challenges it brings in terms of the potential harms. Absolutely bring in the associations, bring in the humanists, and bring in all of the disciplines that have been, at least financially, I would say, marginalized. Bring them to the table in the conversation. We all need to participate in that.
Eric White Going to ask you to get a little meta here. Is the answer to regulating AI more AI itself, having software watch software?
Barney Maccabe Yes. In fact, there will be some parts of that process that involve an AI actually looking over the project. If it’s something you can automate, we will automate it.
Eric White So what do you foresee as the next steps that need to happen here? Obviously getting some more of the voices you talked about in on the process, but, as you mentioned, it’s a good start. What would you like to see happen next?
Barney Maccabe Yeah. So from the executive order, it’s going to be: how does it roll out? And what, specifically, is the role of NIST in this? I’ll go back to that. What’s the role of the Department of Homeland Security? What’s the role of the national laboratories? What’s the role of academia? What are the roles of the professional societies? And, as you brought up a moment ago, how do we get them involved? How do we get the conversation to where people in the social and human sciences understand the technology well enough to project where the challenges are going to be before they come up? So I think it’s mostly in how we roll out this executive order. I think it’s a great start. The challenge is going to come in the rollout. Having worked in a national lab for a number of years, I can assure you that the agencies are going to be pushing around to try and figure out where their place is in this. And there should be a good outcome from that. That will be a conversation in Washington as to who’s responsible for what.
Barney Maccabe Again, my biggest concern there is that the academic institutions and the professional societies, which may not have as strong a representation in that process, could easily get moved to the side, and I think that would be a bad thing. This is where the social scientists and the historians come in, to look back at the history of these technological challenges and changes. We’ve been through this before; we’ve been through these sort of industrial revolutions, and I would say we’re really in an industrial revolution at this point. How do we manage that? You’re not going to eliminate harm. People are going to get displaced from the jobs they’re doing today. Things are going to happen, but we’ve got to be observant of that, and we should probably be looking back at the previous industrial revolutions and seeing what happened there. There’s a book I’m currently reading, Blood in the Machine, which is a history of the Industrial Revolution with automated weaving, what happened there, the processes and the political aspects, and that’s the Luddites. It points out that the Luddites weren’t actually anti-technology. They were against the way technology was being used to displace them from the jobs they had, and the recognition that the technology would be part of that. That’s a conversation we need to be having more broadly: it’s not necessarily about the technology. The technology will come in and it will get advanced. How is it going to be used in our societies? Again, I think the executive order starts to address what the critical issues are, and we need to participate in how that’s actually executed, how it comes down.