"We need AI tools in place to help us plan around climate events, to help us plan around looking at ways in which we can accommodate," said Dan Pomeroy.
Artificial intelligence gobbles up gigantic amounts of electricity. So it’s both expensive and, from the General Services Administration’s point of view, decidedly un-green. GSA has an official on the case, both for the dollars and for the carbon. Dan Pomeroy, deputy associate administrator for the Office of Technology Policy, joined the Federal Drive with Tom Temin.
Interview transcript:
Dan Pomeroy With the coming AI revolution, as I’m sure many of your interviewees have talked about, the amount of electricity that we spend on compute is increasing rapidly. We have industry friends that we talk to that are building data center campuses that could pull as much as 260 megawatts out of the grid. And what we need to do as a federal enterprise is to be cognizant of our own footprint on that grid, and we need to make sure that efficiency is something that we plan for. So envision a world where we have more AI integration into not just the federal workspace, but the general workspace across the whole country, and we see not only chatbot AIs but also agent-based AIs partnering up with employees to make them more effective, of course not just in the federal government, but everywhere else. We can imagine a world where we will draw on AI resources considerably. And that’s a good thing, especially if you think about accessibility. An agent AI could do wonderful things to help people who have difficulty with our standard accessibility toolsets. I think surely the sky’s the limit. So it’s not that we’re turning away from artificial intelligence because of its draw, but we need to be cognizant of that draw.
Tom Temin In other words, you’re worried, or concerned I should say, that the government will become a big consumer of that which is a big consumer of electricity.
Dan Pomeroy Exactly. And the irony, Tom, is that we need AI tools in place to help us plan around climate events, to help us plan around looking at ways in which we can accommodate these weather events. For instance, our friends at NOAA have implemented an artificial intelligence toolset that they use to predict hurricanes and hurricane severity. Of course, they also use high performance computing to do real-time modeling. And we all benefit from that activity. We all benefit from the data that is processed and then distributed not just to us, but to our friends across the globe.
Tom Temin Well, sure. So GSA then, from your standpoint, you know the data center farm phenomenon. You just drive 15 miles out the Dulles Toll Road and you can see the data farms out there.
Dan Pomeroy That’s right. Ashburn, Virginia, and in that region. Yes, absolutely.
Tom Temin All those hotels with no windows and fences around them, are they jails or hotels? Well, no, they’re data centers. And the data center industry itself, from things I’ve read and you’ve read, is taking care of that simply by looking at nuclear power. And there’s a whole bunch of new technologies that the Energy Department is involved with approving and overseeing that are very different from plants of yesteryear, such that the grid is not a question anymore, because they have their own power. Is that the answer? What are you looking at here?
Dan Pomeroy Well, I’ll defer to my friends at the Department of Energy to go deep on that question itself.
Tom Temin We don’t want you to melt down right here.
Dan Pomeroy Right. They certainly don’t want me commenting on their posture. But what I can comment on is that it’s important for federal agencies to make appropriate plans for the impacts that they make on compute. So we’ve been pushing cloud computing as part of the data center optimization initiative for many years now, going back to 2016. And because of that, agencies can leverage the efficiencies of the cloud to scale up when needs arise, and to use tiered data storage so that readily available data is priced differently than data that’s not as useful and not accessed as quickly. And that’s a good thing. We do want agencies to invest in the cloud. And we know that those data centers in Ashburn, that’s where the cloud basically lives. If you see those large buildings, that is the cloud. A cloud, put simply, is somebody else’s computer that you’re using. And by moving to the cloud, we have been able to gain a lot of great efficiencies in the federal government. It has been the right answer for us, and it has put us in a good position to adopt AI as well, because we would not have been able to evolve into AI adoption without the cloud adoption that we’ve seen so far. At the same time, though, we can imagine a world where many agencies, or not just federal agencies but the public sector itself, start looking at doing on-premise large language model implementations using GPUs instead of CPUs. That’s a very different activity than what we’ve historically been doing with more generalized compute activities that were easy to move to the cloud. In some cases, we’ve talked to our friends at Johns Hopkins University; they’re working on building their own on-premise large language model based on government data. And they’re going through that process of accumulating the data, accumulating the hardware, training the models. It’s a difficult road, and I’d refer you to them. They’ve learned a lot; they’ve been fantastic. But we can imagine public sector entities looking at that as an option among an array of possible AI options.
Tom Temin We’re speaking with Dan Pomeroy. He’s deputy associate administrator for technology policy within the Office of Government-wide Policy at GSA. So your effort then is to try to get agencies to use whatever GPU facility they’re using, whether a commercial cloud or, in case they decide, one they build of their own. These are power intensive regardless, so the goal is to use them in an efficient way, in other words.
Dan Pomeroy Yeah, the trick is to never overbuy, either in hardware or in space with a physical location, or to overestimate what we think our loads are going to be. So it really is going to come down to aggressive modeling to truly understand what we believe our needs are going to be, and to make sure that we rightsize that activity when we go out to the commercial sector, to make sure we’re not using more than we need, to make sure we’re not overtaxing the grid more than what’s necessary.
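As a rough illustration of the rightsizing Pomeroy describes, the sketch below compares a forecast GPU need against a proposed purchase. It is a minimal sketch; the job counts, utilization rate, and 20% headroom threshold are illustrative assumptions, not GSA figures.

```python
# Hypothetical rightsizing sketch: compare forecast GPU demand to a proposed buy.
# All figures are illustrative assumptions, not actual agency data.

def rightsize_gpus(peak_concurrent_jobs: int,
                   gpus_per_job: int,
                   expected_utilization: float,
                   proposed_gpus: int) -> dict:
    """Estimate how many GPUs a workload actually needs and flag overbuying."""
    # Pad the raw peak need by the inverse of expected utilization so the
    # estimate accounts for scheduling gaps and idle time.
    needed = peak_concurrent_jobs * gpus_per_job / expected_utilization
    surplus = proposed_gpus - needed
    return {
        "estimated_need": round(needed),
        "proposed": proposed_gpus,
        "surplus": round(surplus),
        "overbuying": surplus > 0.2 * needed,  # flag more than 20% headroom
    }

# Example: 40 concurrent jobs, 2 GPUs each, 70% utilization, proposal for 160 GPUs.
print(rightsize_gpus(peak_concurrent_jobs=40, gpus_per_job=2,
                     expected_utilization=0.7, proposed_gpus=160))
```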
Tom Temin Yeah, it strikes me there could even be an economy of use cases. There’s conference after conference on AI use cases. Well, maybe some of them really don’t need all that, and they could kind of have a priority: this is what we really need. It’s the same way that science rationalized the use of supercomputing, which is still expensive. Well, you don’t run your spreadsheet through the supercomputer. The same type of thinking could come to the AI idea.
Dan Pomeroy Yeah, precisely. So if you have deployed a chatbot that’s there to answer questions, like we’re working on deploying even in my own shop, publicly, we’re looking into that, that is a different activity than having an agent-based AI that’s always on, that’s always pulling in data and working on your behalf, that you check in with and that you guide, that you shepherd in its work and in its activity. Those are two different types of generative AI that will be very commonplace in the near future. And we need to appropriately cost model out exactly what direction each agency is going to go, to cost model out what’s the right fit for their mission.
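To illustrate the kind of cost modeling Pomeroy mentions, the sketch below compares the annual energy of an on-demand chatbot against an always-on agent. It is a minimal sketch; the power draws, duty cycles, and electricity price are assumptions for illustration, not measured figures.

```python
# Hypothetical cost-model sketch: on-demand chatbot vs. always-on agent.
# Power draws, duty cycles, and the electricity price are illustrative assumptions.

HOURS_PER_YEAR = 8760
PRICE_PER_KWH = 0.12  # assumed price in dollars per kWh

def annual_energy(avg_power_kw: float, duty_cycle: float) -> tuple:
    """Return (kWh per year, dollars per year) for a workload."""
    kwh = avg_power_kw * duty_cycle * HOURS_PER_YEAR
    return kwh, kwh * PRICE_PER_KWH

# Chatbot: draws 4 kW while answering queries, active roughly 15% of the time.
chatbot_kwh, chatbot_cost = annual_energy(avg_power_kw=4.0, duty_cycle=0.15)

# Always-on agent: the same 4 kW draw, running continuously.
agent_kwh, agent_cost = annual_energy(avg_power_kw=4.0, duty_cycle=1.0)

print(f"Chatbot: {chatbot_kwh:,.0f} kWh/yr, about ${chatbot_cost:,.0f}")
print(f"Agent:   {agent_kwh:,.0f} kWh/yr, about ${agent_cost:,.0f}")
```

Under these assumed numbers, the always-on agent uses several times the energy of the on-demand chatbot, which is the gap an agency cost model would weigh against the value each workload delivers.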
Tom Temin It’s really ironic in some sense, because this expensive resource, expensive to buy and also energy intensive to use, which gets back to the climate question, takes us back to the early mainframe days of timesharing and of being able to reserve time in such a way that you had to be efficient in your use of resources. You couldn’t just do everything and dump it onto a mainframe. Now we have the capacity, but there’s a great cost.
Dan Pomeroy And to your point, we are always standing in the middle of the teeter-totter between decentralization and centralization, and there are pros and cons that go along with each one of those. If you get economies of scale with centralization, sometimes you miss out on the customization that you would get doing something more locally.
Tom Temin So do you have to be the guy that writes the policy on all of this? Because it sounds like you can’t win.
Dan Pomeroy My job is to help the Executive Office of the President, OMB, the White House, OSTP, whoever is in charge. I’m not talking from a political standpoint, but about whatever leadership needs to help agencies implement. And we do that a couple of different ways, Tom. We build out guidebooks and playbooks that are generalized, that can be applied in a lot of different instances, but maybe are not perfect for one specific mission. Another key strategy we use is communities of practice. We bring agencies together, small agencies, large agencies. Not only do we listen to what their problem spaces are, but we also listen to their successes, and we highlight their successes. And through those communities of practice, those successes can be transmitted to other agencies. I’m always in favor of never charging the taxpayer twice. So if a solution has been identified in one agency and it’s applicable to a different agency, we should leverage that learning and not invest in figuring that out twice.
Tom Temin is host of the Federal Drive and has been providing insight on federal technology and management issues for more than 30 years.