IT officials in DoD and the military services have made no secret that their organizations run many overlapping systems holding duplicate data, and that identifying an authoritative data source is often difficult.
Now, the department is making a concerted effort to identify and consolidate that duplicate data — an effort DoD thinks might wind up eliminating between $10 billion and $20 billion in wasted IT spending over the next five years.
Terry Halvorsen, the department’s acting chief information officer, told reporters last week that the Business Process and Systems Review is being led jointly by his office and Dave Tillotson, the department’s acting deputy chief management officer.
“Our tasking from the deputy secretary of Defense is to go through every agency and principal staff agent in the DoD and look for both process and systems improvements,” he said. “On both the process and systems side, you really should lead by example, so we started with the DCMO and the CIO. Within our own organizations, we have already found some places where we have some duplicate data, where we are not using the most authoritative data. That’s going to follow through in almost every place we review.”
In the end, the department wants to settle on a smaller universe of databases that hold high-value, accurate data, operate at a reasonable cost, and maintain good data integrity. In some instances, that might mean acquiring new systems rather than simply decommissioning older ones.
“In some cases, we’re going to look at what we have in our legacy systems, decide that they’re not the answer, and decide it is more effective and efficient to kill those legacy databases and start new,” Halvorsen said.
This post is part of Jared Serbu’s Inside the DoD Reporter’s Notebook feature. Read more from this edition of Jared’s Notebook.