Agencies must ask first, cut later with data center consolidation
Greg Gardner of NetApp details four key strategies to making data center consolidation into a business and decision-making activity.
A prevailing theme running through government initiatives over the past few years has been “first.” There was “cloud first,” then “mobile first,” and then, in effect, “cut first.” Though not officially branded as such, “cut first” is perhaps the best way to describe how agencies initially approached the February 2010 Federal Data Center Consolidation Initiative (FDCCI) – eliminating as many of their data centers and server rooms as possible.
In 2013, however, we started to see a philosophical shift in how those involved with the FDCCI viewed the best path to optimizing and reducing the costs of government data centers. At a briefing that summer, former Office of Management and Budget cloud computing and FDCCI portfolio manager Scott Renda said, “Data center consolidation is a means to an end but not the end itself.”
The comment reinforced that, as important as it is to shutter non-performing data centers and consolidate physical infrastructure, taking a sledgehammer to crack a nut is not always the best approach.
The Department of Defense (DoD) was no exception to the “cut first” approach, shutting down and consolidating under-performing data centers on the heels of the FDCCI. When DoD joined the governmentwide consolidation drive, even tallying up the military’s overall data center inventory turned out to be a harder task than the Pentagon first imagined: After about a year of counting, 786 were on the books.
By the time DoD revised its definitions to include smaller rooms and closets, the list had grown to 2,100 “data centers,” and the department eventually quit counting. As reported by Federal News Radio, previous plans called for DoD to consolidate most of those into a fixed number of large government-owned “core data centers” and shutter most of the rest.
This article details four key strategies for agencies to gain a clearer view of their data management infrastructure before making policy and investment decisions around data center consolidation.
Recognize the challenge
Organizations are constantly being asked to do more with less: less budget, less manpower, less hardware, less everything. Specifically, IT administrators are being asked to create an environment that balances efficiency, utilization, performance, and cost. As data sets continue to grow in size, so too does management complexity. This is especially true for larger environments with physical and virtual resources from a heterogeneous mix of server, network, and storage solutions. With each resource within the infrastructure generating important and potentially insightful data, IT administrators need an easy way to analyze and understand that data, from existing resource consumption to potential performance problems.
Ultimately, agencies must manage the complexities of a dynamically changing IT infrastructure and address challenges common to any IT organization in three key areas: operations, cost management and service quality. Only a smooth blend of all three aspects characterizes an acceptable IT infrastructure.
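To make that kind of analysis concrete, here is a minimal sketch of how utilization data from a heterogeneous mix of resources might be rolled up to flag consolidation candidates. It assumes metrics have already been exported to a CSV file; the column names and thresholds are illustrative, not drawn from any particular monitoring product.

```python
# A minimal sketch, assuming utilization samples have been exported as CSV
# with columns: asset, type, cpu_util (utilization as a fraction, 0.0-1.0).
# The thresholds below are illustrative assumptions, not agency policy.
import csv
from collections import defaultdict
from statistics import mean

def summarize_utilization(path, low=0.20, high=0.85):
    """Group utilization samples by asset and flag under- and over-used ones."""
    samples = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            samples[(row["asset"], row["type"])].append(float(row["cpu_util"]))

    report = []
    for (asset, kind), values in samples.items():
        avg = mean(values)
        if avg < low:
            status = "consolidation candidate"   # persistently idle
        elif avg > high:
            status = "at risk of saturation"     # needs capacity, not cuts
        else:
            status = "healthy"
        report.append((asset, kind, round(avg, 2), status))
    return sorted(report, key=lambda r: r[2])
```

A report like this is only a starting point, but it turns raw resource data into something a CIO can actually weigh when deciding what to cut, what to keep, and what to grow.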
Ask first, cut later
Before embarking on a process to cut and consolidate data centers, agencies need to ask themselves, “How can I make my data storage and management infrastructure more efficient and more capable?”
Agencies can no longer tolerate infrastructures that do not optimize data storage and management and that have no real-time visibility into the quantitative metrics underpinning the operations of their servers, storage, virtual machines and applications. Moreover, agencies need to equitably apportion the costs of their data management infrastructure to organizations that consume those capabilities. In a resource-constrained environment, agency chief information officers must insist that there is “no free lunch” when it comes to data storage and management.
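As a simple illustration of what equitable apportionment can look like, the sketch below splits a shared storage bill across the organizations that consume it, in proportion to the capacity they use. The dollar figure and office names are hypothetical.

```python
# A minimal showback sketch: apportion a shared storage cost to consuming
# organizations in proportion to capacity used. All figures are illustrative.
def apportion_costs(total_cost, usage_gb_by_org):
    """Split total_cost across organizations in proportion to their usage."""
    total_gb = sum(usage_gb_by_org.values())
    return {
        org: round(total_cost * gb / total_gb, 2)
        for org, gb in usage_gb_by_org.items()
    }

# Example: a $120,000 annual storage bill split across three offices.
print(apportion_costs(120_000, {"Office A": 400, "Office B": 250, "Office C": 150}))
# {'Office A': 60000.0, 'Office B': 37500.0, 'Office C': 22500.0}
```

Whether an agency bills those amounts back or simply reports them, visibility into who consumes what is the mechanism that ends the “free lunch.”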
Go beyond self-reporting
Too often, as agencies approach the data center consolidation challenge, they rely on verbal self-reporting both from the individuals running the on-site infrastructure and from the teams at remote sites that provide backup/recovery and other capabilities. These reports are notoriously inaccurate: They are often out of date, usually based on assumptions rather than hard metrics, and subject to the interpretations of individuals charged solely with system performance. Indeed, many agency assessments conducted under the FDCCI suffered from these inaccuracies.
Recent experience across the public sector has repeatedly shown that agencies relying on asynchronous, non-automated self-reporting of their data storage and management infrastructures cannot make the granular, precise, accurate and forward-looking asset management decisions made by agencies that leverage the comprehensive, real-time reporting available through automated tools.
The real value of this ongoing assessment is that it yields the current, real-time truth about the health and usage of every aspect of the agency’s IT infrastructure. Armed with that knowledge, agency CIOs and their teams can make prudent, well-informed decisions that optimize the totality of their IT investments.
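To illustrate the difference between manual self-reporting and continuous, automated collection, here is a minimal sketch of a collector that appends timestamped utilization snapshots on a fixed interval. It uses the psutil library purely as a stand-in for whatever APIs an agency’s storage and virtualization platforms actually expose; the interval and output path are assumptions.

```python
# A minimal sketch of automated, timestamped collection in place of manual
# self-reporting. psutil is a stand-in for real platform monitoring APIs;
# the 5-minute interval and output file are illustrative assumptions.
import json
import time
from datetime import datetime, timezone

import psutil  # third-party: pip install psutil

def collect_snapshot():
    """Capture a point-in-time view of local resource usage."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

def run_collector(outfile="snapshots.jsonl", interval_seconds=300):
    """Append a snapshot every few minutes so reports reflect current state."""
    while True:
        with open(outfile, "a") as f:
            f.write(json.dumps(collect_snapshot()) + "\n")
        time.sleep(interval_seconds)
```

The point is not the specific tool but the cadence: data gathered continuously and automatically, rather than recalled from memory when an assessment comes due.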
Recognize cultural challenges
The major hurdles to automated data storage and management reporting are human, not technical. Within the public sector, the expectation that leaders are charged with – and resourced to – run their own organizations, combined with deep concern about data protection, often leads to “stovepiped,” in-house approaches to data storage and management that do not leverage best practices such as automated infrastructure reporting and cloud computing.
The agency approach to data center consolidation has grown more sophisticated over time. This is due, in part, to a keener understanding by all parties involved that focusing solely on how many server rooms and data centers can be eliminated is an incomplete strategy. Instead, gaining a better understanding of the existing agency data management infrastructure before making policy and investment decisions is critical, and ultimately essential to a successful outcome.
Greg Gardner, PhD, CISSP, is the chief architect for defense and intelligence solutions at NetApp U.S. Public Sector.