Across the federal government, every agency is under the gun to help bring the number of federal data centers down by at least one-third by 2015.
At the Pentagon, however, the latest Office of Management and Budget-directed effort is round two of a long process of IT consolidation that began in the mid-1990s. The Defense Department began its consolidation under the auspices of a massive and lengthy remodeling of the building that stretched on until this year. “We’re real proud of what we’ve accomplished,” said Donald Adcock, director of the Army Information Technology Agency, the organization in charge of providing IT services for the Pentagon and many other Defense and civilian government entities in the Washington area. “We’ve decreased the footprint of computing in the Pentagon and doubled or tripled the computing capacity for our customers. We’ve come to be what I consider a model for other entities in the federal government to follow.”
What’s now called data center consolidation started for the Pentagon as an effort to use its precious office space more wisely. Before the remodeling began, Adcock said, servers were scattered throughout the building in an ad hoc fashion: tucked under desks and stashed away in closets with telephone equipment.
Then came the Sept. 11 attacks. At that point, he said, the server reorganization changed from an effort focused on remodeling to one centered on survivability. “The whole focus became creating an infrastructure and computing capacity that was 100 percent geared toward continuity of service, staying alive in the event that something like 9/11 ever happened again,” he said. “As a result of that, people really had no choice. We had built state-of-the-art consolidated server rooms that offered them mission-critical power, mission-critical cooling, state-of-the-art security, and really enticed them that they had to move into these rooms.”
In all, ITA has taken 30,000 square feet of floor space that used to be occupied by redundant, inefficient computers and returned it to productive office use.
Today, the Pentagon has what ITA refers to as a single data center taking up about 15,000 square feet of floor space, though it occupies more than one room. To maintain continuity of operations and deliver computing services wherever they’re needed in the building, the “center” is spread across 14 separate physical rooms. If one room is damaged in an attack, mission-critical data and services can fail over to other rooms.
Additionally, the Pentagon data center has fail-over capacity to backup data sites outside the building in the event the entire building is compromised. Citing security concerns, ITA officials declined to say how many such sites exist or where they are located, but said they are in multiple locations outside of the National Capital Region.
Despite years of work that predated the modern Federal Data Center Consolidation Initiative, Adcock said the Pentagon’s work in the area is far from finished.
Consolidating computer systems into server rooms is one thing; the next steps are the same as they are with other agencies: rationalizing the number of active software applications, doing smarter enterprise licensing of software, and virtualizing systems so that one server can do the job of many. ITA says it has made progress toward those goals. Since 2006, the agency has increased virtualization capacity by 30 percent, reduced software licensing costs by 10 percent and increased processor performance by 40 percent.
Also, unlike during the Pentagon’s first round of consolidations, ITA will have to scrupulously track its cost savings as it pushes toward OMB’s IT efficiency targets.
“I’ve tasked an organization within ITA to start tracking those efficiencies, because we’ve got to be able to answer the question of how many dollars did we save,” Adcock said. “As far as the savings from the past, every time we would make an investment or we would find something, it would be reabsorbed to the tenant. As I took something out of your closet, I had to invest money to build the infrastructure to put you in a new space. If you didn’t provide me with what it cost you, I didn’t have the data. People didn’t keep good records and a lot of things got lost. So it’s very hard to figure out what was saved in the first round of consolidations.” And quantifying the savings going forward won’t be easy, he said.
“The savings is so second and third order effect,” Adcock said. “By changing the way you do electric supplies or air flow in the building, you have to start asking how much did we pay for electric before, and were we tracking that? In a building like the Pentagon where you have so many tenants that control those things, it’s very hard to pull that thread and come up with a dollar amount. It’s doable, and we’re working really hard to get our hands around those costs.”
Adcock said he would offer several pieces of advice for federal IT managers who are moving forward with data center consolidation for the first time.
“You need to understand what’s in your environment,” he said. “Inventory it well and get a good understanding of what it is, because you’re going to need to provide for it in your future environment. You need to have robust policy throughout the organization that others can use, and say either ‘this is the mission, you have to move to it,’ or defer to and ask, ‘how do I get there?’ You have to have strong policy. You have to have a strong understanding of what it is you’re trying to do for your customer and make sure you can deliver on it. That means strong service level agreements. And you’ve got to understand the requirements. Consolidating into an environment is one thing, but it’s another thing to know how you’re going to do it and how you’re going to put it in place. Proper planning up front can help drive some of that.”