Insight by Schneider Electric

Optimizing data centers can save agencies money, prepare for the future

This content is provided by Schneider Electric.

After ten years, the goal around government data centers has shifted from closing them to optimizing them. Federal agencies seem to understand that they will never fully get to the cloud, and that data centers will always be necessary for certain purposes. But even with fewer than half the data centers it once owned, the federal government is still the largest data center owner in the world. That means there's still a lot of optimizing left to do.

One reason agencies will never fully rid themselves of data centers is that, for certain applications, data centers have recently been moving closer to the edge. That allows agencies to process data and put it to use more quickly and efficiently in the field. That could mean a Veterans Affairs hospital with its own data center, or an OCONUS DoD forward operating base. But installing data centers in these places raises a lot of considerations.

“If you have a purpose-built data center somewhere in the D.C. metro area, it was probably housed in a building built for a data center. A lot of these edge sites are crammed into an old operating room in a hospital, for example, or a wiring closet on a military installation that has just been expanded and had walls taken down,” said Jeff Chabot, public sector segment director for Schneider Electric. “Edge data centers are typically housed in places that were never considered for a data center. They weren’t built for a data center. You have to look at everything there. Do you have the right cooling? Do you have the right software tool? Are there people housed in that data center that go to work there every day, or are they remote or hybrid?”

More than anything, what data centers of all kinds need is a data center infrastructure management (DCIM) tool. That’s a piece of software that gives data center owners a full view of the facility, including its power, and allows them to manage it remotely. It also provides physical threat detection for things like smoke, water or humidity, and helps determine where to place new equipment like server chassis or switches.

“We can have threat detection at the actual rack, such as a camera that activates when you open the rack,” Chabot said. “And then you want something that’s going to ideally tie into your building management system, so that your facilities guys can help your data center folks, too. If there’s water in the data center, you want to know whether that’s from the cooling system or from the plumbing.”

Federal agencies need to prepare for this next big trend of edge data centers; if they don’t address these requirements properly, it will lead to more issues. And agencies aren’t going to want downtime because the cooling, placement or the UPS aren’t ideal. All of these things affect the availability of the data center. The industry standard for availability, Chabot said, is known as “five nines,” or 99.999%.
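To put “five nines” in concrete terms, the availability percentage translates directly into a downtime budget. The short sketch below (illustrative only, not from the article) computes the minutes of outage per year that a given availability target allows:

```python
def annual_downtime_minutes(availability: float) -> float:
    """Minutes of allowed downtime per year for a given availability fraction."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a non-leap year
    return minutes_per_year * (1 - availability)

# "Five nines" (99.999%) leaves only about 5.26 minutes of downtime per year.
print(round(annual_downtime_minutes(0.99999), 2))
```

That roughly five-minute annual budget is why details like cooling, equipment placement and UPS health matter so much.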

But availability isn’t the only reason to optimize a data center, Chabot said. The average federal data center is inefficient, which means there are significant cost savings to be had through optimizing. For example, older federal data centers often cool the entire room rather than just the equipment that needs to be cooled; with in-row cooling, by contrast, the air conditioning is placed in the row with the server racks. And most data centers were sized before the movement to the cloud began, so they were designed to draw far more power than they actually need. Couple that with the fact that newer equipment is more energy efficient, and data center owners are paying for far more power than they need, whether they’re using it or not.

These energy savings, Chabot said, could be redirected to invest in things like cybersecurity, or even paying down money borrowed from the Technology Modernization Fund.

“There are also ways to look at alternative financing. So for example, there’s what’s called an energy savings performance contract, or ESPC for short. Schneider Electric is one of around 20 companies certified by the Department of Energy to conduct and hold this type of contract with the federal government. We would come in and provide a massive audit of a data center or a building or a campus,” Chabot said. “We would basically create a new data center for these folks in terms of equipment. Then we would make a guarantee. These are typically 20-year contracts; we would make a guarantee that at the end of the contract, you’re going to save X amount of dollars in energy costs. If not, we would actually cut them a check for the difference in terms of performance versus promised savings. The customer gets an efficient data center, brand new equipment, and they’re paying a fee, like data-center-as-a-service.”
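The guarantee mechanic Chabot describes can be sketched as simple arithmetic: if realized savings fall short of the guaranteed amount, the contractor pays the difference. The dollar figures below are purely illustrative, not from any actual ESPC:

```python
def espc_shortfall_payment(guaranteed_savings: float, actual_savings: float) -> float:
    """Check the contractor owes if realized savings fall short of the guarantee.

    Returns 0.0 when actual savings meet or exceed the guaranteed amount.
    """
    return max(0.0, guaranteed_savings - actual_savings)

# Illustrative: $2.0M guaranteed over the contract term, $1.8M realized
# in energy savings, so the contractor cuts a $200,000 check.
print(espc_shortfall_payment(2_000_000, 1_800_000))
```

If actual savings meet or beat the guarantee, no payment is owed and the agency simply keeps the savings.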
