New data center journey must include optimization, modernization, hybrid cloud


As the number of data centers continues to shrink year after year, agencies must make hard decisions about how best to optimize the remaining on-premises facilities.

With another 54 closures planned in the final months of fiscal 2022, the 1,500 or so remaining facilities are primed to advance their use of energy metering tools and server virtualization technologies, and to pursue approaches that decrease the number of servers while increasing the utilization of those that remain.

Tom Santucci, the director of IT modernization within the General Services Administration’s Office of Governmentwide Policy (OGP), said the data center optimization opportunity now is more about application rationalization and less about straight data centers.


“It’s really nice to go to the IT Dashboard and see the accomplishments of the federal workforce of closing data centers. Given the interconnection between cloud and data centers, we see a migration toward that,” Santucci said on Ask the CIO. “We’re going to continue to focus on IT modernization and not physical structures, and spotlighting the administration’s priorities around cyber, climate, zero trust, IPv6 and other things that make it all work.”

The Government Accountability Office said in January that since 2010, agencies have closed more than 6,800 data centers and saved or avoided spending more than $6.6 billion.

But there is still plenty of work remaining. Not only were there more than 1,500 open data centers, but Carol Harris, the director of IT and cybersecurity issues at GAO, said there are another 4,500 data centers that have fallen off agency inventories and represent both huge closure and optimization opportunities.

“We want to make sure that agencies are including these non-tiered data centers in their inventories to make sure that they are keeping track of them, that they are making sure that they are still centers that should be utilized,” Harris said. “And if not, then they should be consolidated. Just from an IT management perspective, they, as well as OMB, should make sure that they can wrap their hands around that comprehensive list, and that’s why we continue as of today to support having these centers being reported upon in their inventories.”

As most agencies will continue to run on-premises data centers for the foreseeable future alongside commercial cloud services, experts say the goal is to find the right balance: placing applications where they run most effectively at the right cost.

Application modernization is a key feature

Tony Scott, the former federal CIO and now CEO of Intrusion, a cybersecurity company, said application modernization will drive many of the consolidation vs. data center modernization decisions.

“If you focus on that, you will do a better job of driving consolidation and optimization than any other thing you could do because of the needs of new applications,” he said. “If you think like a start up to some degree, if you are starting off serving something today you’d make a bunch of different choices than you would’ve 10-to-15 years ago. You will be more efficient, more effective and it will be easier to keep the application modern than anything you did 15 years [ago].”

Santucci said the data center consolidation and optimization initiative has continually evolved, with a 2019 memo from the Office of Management and Budget first pushing for application rationalization and modernization.

“It was getting at the application layer and not the physical structure or trying to identify what a data center is,” he said. “This means data centers and IT portfolio management are always in a constant state of refinement. There’s changes happening every day, every weekend and things come in and out of priority for the agency. Agencies may not realize it, but they’re doing application rationalization on a regular basis. Every time that you put something into production and then have to update it, it’s always something that needs to be looked at on a constant basis.”

In the meantime, the Federal IT Dashboard shows agencies have a long way to go to make on-premises data centers more efficient.

Out of 1,400 data centers, only 33% have energy metering technologies, about 1% of all servers are considered underutilized and 33% are virtualizing servers.

GAO found in March 2022 that agencies are doing a good job meeting their data center optimization goals.

Harris said GAO is analyzing agency optimization efforts and the impact it’s having. She said this work started earlier this summer.

“In terms of where we move from data centers, looking at sustainability and optimization is the right way to go. Obviously, that evolution in policy makes sense because it should mirror the progress that the agencies are making in this area,” she said. “As far as consolidation is concerned, I think that agencies have worked really hard for 10 years to consolidate those centers. I think that we can call that a success. But we still have a lot of work to do, and now the move should be to optimization, to ensuring that their operations are the most cost effective. And if that means doing something different, at least we want to make sure that agencies have done that proper analysis work to support what they’re doing next.”

Steven Naumann, a senior advisor in the Office of IT Modernization within GSA’s OGP, said data centers are tightly engineered, specialized buildings, and as agencies move applications to the cloud, they have to make sure they don’t introduce inefficiencies.

“There’s always talk about sustainability and water usage and things like that. Those are already engineered into the structure, but once that equation is changed, you’re not going to get that many more efficiencies by using less water or things like that,” he said. “This is why we’re actively encouraging people to get to the cloud, or at least take, what I’ll call, a half step to the cloud by going into a co-location site where you can use products and services with other agencies and take advantage of that innovation that’s available.”

SSA is a good example

Naumann came to GSA from the Social Security Administration, where he led a $200 million project to modernize the agency’s data center using money from the Recovery Act. Naumann said while Congress gives few agencies the opportunity to build from the ground up, the lessons learned from that effort are something others can take advantage of.

“We had GSA and the Department of Energy all working together to come up with ways to make it more energy efficient. Some of the things that the data center had was like a natural meadow around the data center so we didn’t have to cut the grass every week,” he said. “The data center was utilizing hot aisle containment, it had a heat map, where you could look at what aisles are being really, really high and then eventually, vMotion, or virtual machine migration. If you have a whole aisle that’s running too high, you can move those servers logically to another area to spread out the load.”
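That kind of thermal rebalancing is easy to picture in code. The sketch below is a minimal Python illustration, not SSA’s actual tooling: the aisle readings, the 95-degree threshold and the migrate_vm() placeholder are all hypothetical stand-ins for a real heat-map feed and a vMotion-style migration API.

```python
# Minimal sketch of heat-map-driven VM migration ("spread out the load").
# All readings, the threshold and migrate_vm() are hypothetical stand-ins
# for a real heat-map feed and a vMotion-style API.

HOT_THRESHOLD_F = 95.0  # aisle temperature that triggers rebalancing

# Hypothetical heat-map reading: aisle -> (temperature in F, VMs running there)
heat_map = {
    "aisle-1": (101.5, ["vm-a", "vm-b", "vm-c"]),
    "aisle-2": (78.0, ["vm-d"]),
    "aisle-3": (80.2, ["vm-e", "vm-f"]),
}

def migrate_vm(vm: str, dest_aisle: str) -> None:
    """Placeholder for a live-migration call (vMotion or similar)."""
    print(f"migrating {vm} -> {dest_aisle}")

def rebalance(heat_map: dict) -> None:
    """Shed load from any aisle running over the threshold to the coolest one."""
    coolest = min(heat_map, key=lambda aisle: heat_map[aisle][0])
    for aisle, (temp, vms) in heat_map.items():
        if temp > HOT_THRESHOLD_F and aisle != coolest:
            # Move half the VMs (at least one) off the hot aisle.
            for vm in vms[: max(1, len(vms) // 2)]:
                migrate_vm(vm, coolest)

rebalance(heat_map)  # prints: migrating vm-a -> aisle-2
```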

SSA also took over active energy management and moved to graphene batteries, which last much longer than typical batteries and can be charged in periods when electricity costs less and discharged during periods of high cost.
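The payoff from that charge/discharge pattern is simple time-of-use arbitrage. A minimal sketch of the arithmetic, with entirely hypothetical prices and capacity:

```python
# Time-of-use battery arbitrage: charge when power is cheap, discharge
# when it is expensive. Every figure below is hypothetical.

CAPACITY_KWH = 500.0          # usable battery capacity
ROUND_TRIP_EFFICIENCY = 0.90  # fraction of stored energy recovered

off_peak_price = 0.06  # $/kWh, e.g., overnight
on_peak_price = 0.18   # $/kWh, e.g., weekday afternoon

cost_to_charge = CAPACITY_KWH * off_peak_price
value_discharged = CAPACITY_KWH * ROUND_TRIP_EFFICIENCY * on_peak_price

print(f"Savings per full cycle: ${value_discharged - cost_to_charge:.2f}")
# -> Savings per full cycle: $51.00
```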

“There was liquid cooling that was being used, especially for the mainframes. The data center standards now are for higher temperatures where 40 years ago, you go into a data center with a raised floor and it’s 55 degrees. Now you can go into one and you’re sweating because it’s 78 degrees, and that’s what we’re looking to do; we’re trying to take advantage of all the energy efficiencies that are available,” Naumann said. “The Department of Energy has some great programs like the Federal Energy Management Program (FEMP) and they offer the data center energy practitioner (DCEP) program that agencies can take advantage of. And all their national labs are doing great things too.”

Naumann also said industry continues to come up with ways to make data centers more energy efficient, such as moving away from power usage effectiveness as a metric and instead looking at the efficiency of individual computers and how they use power and manage compute and storage capabilities.

“I believe eBay has something that’s efficiency per compute cycle and things like that. So that’s really what we’re looking for,” he said. “There’s all sorts of things that we can do in data centers to make them more efficient. They already are very efficient because they are very tightly engineered. But a lot of the older data centers can really take advantage of containment and raising temperatures, and we’ll get better numbers by doing that.”
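For readers unfamiliar with the metric, power usage effectiveness (PUE) is total facility energy divided by the energy delivered to IT equipment, so an ideal facility scores 1.0. The sketch below contrasts PUE with the kind of per-unit-of-work metric Naumann and the eBay example point toward; every figure in it is hypothetical.

```python
# Contrast facility-level PUE with a per-unit-of-work efficiency metric.
# All figures are hypothetical.

facility_kwh = 1_200_000  # total energy the building drew in a month
it_kwh = 800_000          # portion delivered to IT equipment
useful_work = 5.0e12      # e.g., compute cycles completed that month

pue = facility_kwh / it_kwh                # 1.50; an ideal facility is 1.0
kwh_per_work = facility_kwh / useful_work  # 2.40e-07

print(f"PUE: {pue:.2f}")
print(f"kWh per unit of work: {kwh_per_work:.2e}")
```

The distinction matters because PUE cannot see whether the IT load itself is doing useful work; a per-work metric rewards consolidation and virtualization directly.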
