The Office of Management and Budget’s latest attempt to tame federal data centers is receiving rave reviews from industry and government alike. But maybe more importantly, White House officials recognize changing the federal approach to data centers is no longer just a savings exercise — no matter what Congress and the Government Accountability Office say.
Instead, the draft policy recognizes that the data center consolidation and optimization initiative must fold in several ongoing priorities — from infrastructure and systems modernization, to cybersecurity, cloud-first and the greening of the government — to be more effective.
“This draft policy is a half-step forward because it’s more aggressive than the Federal Data Center Consolidation Initiative (FDCCI). It gets agencies out of the square feet discussion and it now says if you have a server, it’s a data center,” said Keith Trippie, who helped lead the FDCCI strategy and implementation in 2010 and now is president and CEO of the Trippie Group. “Part of the challenge for agencies is to go back and recount what constitutes a data center. It’s more work for agencies just to get a baseline of data centers. There are plenty of broom closets that have servers that are connected to the Internet that need to be counted and weren’t five years ago.”
The number of data centers has been rising over the last six years, mainly because of better discovery and definitions by agencies. OMB reported in November that agencies had more than 11,700 data centers, up from 9,000 in 2014 and 1,100 in 2009 when the administration initially set the goal to reduce this infrastructure.
But as one federal official told me, this draft policy goes beyond closing down data centers and achieving a big savings total.
“This is about moving to cloud and asking tough questions about where we are, and really moving to cloud and getting out of ownership of infrastructure business,” said the official, who requested anonymity to talk about the draft policy. “This isn’t about savings primarily as the original effort was. There are a lot of knobs to turn from savings to cyber to efficiencies to capabilities.”
To be clear, savings are important, but they are only one of several metrics, not the end-all, be-all metric that GAO and some lawmakers want them to be.
A March 3 GAO report on the data center consolidation initiative shows auditors’ focus on savings and across-the-board reductions. GAO said agencies reported achieving an estimated $2.8 billion in cost savings and avoidances from fiscal 2011 to 2015. GAO said the departments of Commerce, Defense, Homeland Security, and the Treasury accounted for about $2.4 billion — about 86 percent — of the total. Agencies estimated the government can save or avoid spending another $5.4 billion for a total of $8.2 billion by the end of 2019.
OMB has a goal of reducing the number of federal data centers to just more than 2,000 by 2019.
“While OMB’s optimization targets provided clear and transparent goals for agencies’ fiscal year 2015 optimization efforts, agencies made limited progress against those targets. Expeditiously implementing [the] Federal IT Acquisition Reform Act (FITARA), which includes several provisions aimed at improving the federal data center optimization effort, should improve agencies’ optimization progress,” GAO stated. “Furthermore, OMB’s implementation of our September 2014 recommendation to develop a metric for server utilization could help ensure that agencies are more efficiently using computing resources.”
Several experts pointed to FITARA and the power usage metric as some of the most positive aspects of the draft policy.
“The DCOI rolls the responsibility of data center management — including infrastructure and services — under the agency CIO. This will provide a single point of focus for an agency’s efforts and should result in more progress, more quickly,” said Chris Howard, vice president of federal for Nutanix, in an email to Federal News Radio. “Given that FITARA has also given CIOs the budget responsibility for agency IT, this will enable an overall view into what is really required in terms of procurement from a budget perspective, and what can be consolidated or optimized from a data center perspective. This might also shed some light on the fact that agencies are relying too much on legacy infrastructure and provides CIOs the ability to do something about it.”
Howard said the requirement in the draft policy for agencies to justify new data centers or significant expansions of current infrastructures also could force agencies to stop relying so much “on clunky, inefficient and expensive three-tiered data center architectures that are the root cause of this situation in the first place.”
Isaac Negusse, Iron Mountain Government Services’ federal business development executive for data centers, called OMB’s approach compelling for many of the same reasons.
“It’s a logical next step from FDCCI because it correlates the concepts of consolidation and optimization, and it fits well with what agencies already are doing,” Negusse said. “From industry’s perspective, agencies consistently have been expressing interest in optimization targets like low power usage effectiveness (PUE) through market research since the middle of 2015.”
He said one area where OMB could improve the policy is by offering more specifics on the metrics for optimization.
“Optimization can be an abstract construct unless they use widely accepted industry metrics like PUE,” Negusse said.
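For readers unfamiliar with the metric, PUE is simply the ratio of a facility’s total power draw to the power delivered to its IT equipment; a value near 1.0 means almost all electricity goes to computing rather than cooling and power distribution. A minimal sketch of the arithmetic (the figures below are illustrative, not from OMB’s draft):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power.

    A PUE of 1.0 is the theoretical ideal; higher values mean more
    overhead spent on cooling, power distribution and lighting.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,800 kW total draw, of which servers, storage
# and network gear consume 1,000 kW. The remaining 800 kW is overhead.
print(pue(1800, 1000))  # 1.8
```

Because the inputs are plain power readings, PUE gives agencies a concrete, comparable number to report against an optimization target, which is the kind of widely accepted industry metric Negusse is describing.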
Rob Stein, vice president of U.S. public sector at NetApp, said that under the draft policy, just closing physical data centers can’t be enough.
“That is why we see agencies extending consolidation of IT infrastructure down to the shared server, storage, network and application level — an approach that aligns well with an OMB focus on transitioning to cloud and data center shared services,” he said. “We are witnessing the transformation of the data center in a way that not only impacts future cost savings in 2016 and beyond but also brings operational efficiency when it comes to data mobility and security. What’s emerging is a modern ‘data fabric’ woven together and managed by intelligent software that allows agencies to control, integrate, move, and consistently manage their data across different IT environments and multiple cloud vendors.”
Stein added the “data fabric” approach opens up “a tremendous opportunity for government agencies to become more proficient at harnessing new technologies to lower costs and improve productivity. Capitalizing on innovation in IT systems, software and cloud services involves operational impacts and also includes factors that extend beyond the data center. At the same time, policy in conjunction with fostering new internal agency processes for procurement, IT processes and people enablement are required to streamline the consumption of IT resources.”
There still are several questions that the memo doesn’t answer, which is in part why OMB is releasing the draft for public comment.
Trippie said more rigorous metrics such as showing 70 percent utilization rate on all servers by 2018 or requiring all development and test environments to be in the cloud in the next two years would be good starting points.
Negusse said it may be tough for agencies to close 100 percent of their non-tiered data centers without specific guidelines for how to achieve that goal.
Howard said agencies need help to stop incorporating legacy infrastructure into procurements.
He said the fact that OMB is tasking the General Services Administration’s Federal Acquisition Service with creating and maintaining an inventory of acquisition tools and products around data center optimization “will open up the opportunity to include these disruptive technologies as part of a ‘tool set’ that agencies can consider for future deployment — pre-vetted and pre-certified — to overcome the current procurement barriers to adoption.”
The final policy still is a good three months away, but the draft version gives agencies and industry alike more momentum to shift resources and plans to have a real impact on how agencies procure, manage and deliver IT.