Bill Curtis, the executive director of the Consortium for IT Software Quality (CISQ), said new standards will help agencies better understand the cost of IT modernization.
Agencies are facing an increasing mound of debt, but not the kind of debt you may be thinking of.
By some estimates, agencies will accumulate more than $7 billion in technical debt over the next five years: old hardware and software that needs to be replaced or updated.
Part of buying down that debt is understanding where to start, and that’s where new standards from the Consortium for IT Software Quality (CISQ) could help.
Bill Curtis, the executive director of CISQ, said his organization worked with experts to develop a new set of standards that could help agencies begin to lower their technical debt by focusing on fixing a broad set of known problems in software code.
“There really weren’t any other standards for that. We looked at ISO, but they really never got to the level of defining standards for measuring the structural problems in source code,” Curtis said in an interview with Federal News Radio. “We created those standards. They are out there. They are in use and being referenced in some federal contracts as options for what ought to be used as acceptance criteria or service level agreements about the quality of the system in various areas like security, reliability and performance. Because these standards are built from counting known problems in the code that you have to fix, it’s not every potential reliability or security problem, but the severe ones that you know you have to get out of the code. Those represent a form of technical debt.”
Then, Curtis said, CISQ asked software experts to estimate how long it would take to fix those kinds of defects at the most basic architecture level. Finally, CISQ uses a static analysis tool to gauge the complexity of the code, adjusts the estimated level of effort accordingly, and arrives at a cost to fix each defect.
“What you get, looking at the quality of the code and the amount of effort it will take to fix the most severe problems, is an estimate of the technical debt,” he said. “If someone delivers a system to me, I can analyze it and look at the amount of technical debt they just delivered to me to see what it will cost me to maintain this system.”
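Conceptually, that calculation is a roll-up: for each severe defect a static analysis tool finds, estimate the base hours to fix it, adjust for the complexity of the surrounding code, and multiply by a labor rate. The sketch below is a minimal illustration of that idea in Python; the defect categories, adjustment factor and hourly rate are hypothetical placeholders, not CISQ's published figures.

```python
from dataclasses import dataclass

# Hypothetical base fix-time estimates (hours) per defect category.
# Illustrative placeholders only, not CISQ's actual numbers.
BASE_FIX_HOURS = {
    "security": 6.0,
    "reliability": 4.0,
    "performance": 3.0,
}

@dataclass
class Defect:
    category: str        # e.g. "security", "reliability", "performance"
    complexity: float    # complexity score reported by a static analysis tool

def fix_effort_hours(defect: Defect) -> float:
    """Adjust the base estimate by the complexity of the surrounding code."""
    base = BASE_FIX_HOURS[defect.category]
    # Assumed adjustment: defects buried in more complex code cost proportionally more.
    return base * (1.0 + 0.1 * defect.complexity)

def technical_debt_cost(defects: list[Defect], hourly_rate: float = 100.0) -> float:
    """Roll up the estimated cost to remediate all known severe defects."""
    return sum(fix_effort_hours(d) for d in defects) * hourly_rate

if __name__ == "__main__":
    findings = [
        Defect("security", complexity=8.0),
        Defect("reliability", complexity=3.0),
        Defect("performance", complexity=5.0),
    ]
    print(f"Estimated technical debt: ${technical_debt_cost(findings):,.2f}")
```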
He said over the last 18 to 24 months, about 24 organizations, including MITRE and the Software Engineering Institute, met to identify the most critical problems in software and then developed an approach to estimating the cost to remediate them.
The concept of technical debt has been around since the 1990s. But only in the last 18 months or so, when former federal Chief Information Officer Tony Scott asked agencies for data on the amount of legacy IT they were supporting, did it come to the forefront in government.
Scott said initial data showed about $3 billion in technical debt coming due over the next three years, but as the Office of Management and Budget collected more information, it found agencies had more than $7.5 billion in technical debt. That number is only increasing as agencies struggle to move off legacy technology.
The Trump administration has picked up the call for agencies to modernize their systems, releasing a new strategy, creating Centers of Excellence to help with the heavy lifting and drafting an executive order to further illuminate CIOs’ authorities.
Curtis said agencies should start understanding their technical debt with a policy change or improvement rather than jumping right into standards implementation.
“They can use these measures by setting policies to say, ‘we are going to do an analysis prior to any lift-and-shift just to see what the costs will be, so we really understand what we are buying into here,’” he said.
Curtis said these standards, for example, could help agencies decide whether a certain application should be moved directly to the cloud or optimized first either through an update of the underlying software, or by moving off the current version altogether and onto a new one.
“You really have to pay a lot of attention to how you make sure you are securing your application, its performance and its data before you move it to the cloud,” he said. “That’s just one of the issues. I’m going to be paying for space. I’m going to be paying for time. I want to make sure my performance efficiency is sufficient to the point where I’m not wasting a lot of extra money because this thing isn’t efficiently designed. There are all of these issues that give an IT executive in the government or industry a better picture of what their likely future costs will be if they can go look at the structure of their system, analyze it and get some feedback on the potential costs. Then they can make some decisions about what they want to do. How much they want to repair, what is an acceptable level of performance or security, or what kinds of risks they are willing to take. It gives them the data to make those trade-off decisions.”
These standards also complement the administration’s push for agencies to use Technology Business Management (TBM) standards. Under this approach, agencies can show the value of IT investments by detailing specific business cases for each area of commodity IT.
Like TBM, Curtis said the technical debt standards are implemented through a tool provided by vendors. In this case, static analysis software.
Curtis said static analysis tools look at structural aspects of software to ensure the code is well constructed, securable, scalable and represents good engineering approaches.
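As a rough illustration of the kind of structural check such tools automate, the short Python sketch below walks a source file's syntax tree and flags two assumed indicators of weak construction: calls to eval (a common security finding) and deeply nested functions. It is a toy example under those assumptions, not a substitute for the commercial analyzers Curtis describes.

```python
import ast
import sys

MAX_NESTING = 3  # assumed threshold for "too deeply nested"

def check_structure(source: str, filename: str = "<source>") -> list[str]:
    """Flag a couple of simple structural red flags in Python source code."""
    findings = []
    tree = ast.parse(source, filename=filename)

    def visit(node: ast.AST, depth: int) -> None:
        # Flag eval() calls, a frequent static-analysis security finding.
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name) and node.func.id == "eval":
            findings.append(f"{filename}:{node.lineno}: use of eval()")
        # Flag functions nested more deeply than the assumed threshold.
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if depth >= MAX_NESTING:
                findings.append(f"{filename}:{node.lineno}: function nested {depth} levels deep")
            depth += 1
        for child in ast.iter_child_nodes(node):
            visit(child, depth)

    visit(tree, 0)
    return findings

if __name__ == "__main__":
    path = sys.argv[1]
    with open(path) as f:
        for finding in check_structure(f.read(), path):
            print(finding)
```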
CISQ will hold a free webinar on the technical debt standards on Jan. 16 at 11 a.m. EST.
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.