“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.
Submit ideas, suggestions and news tips to Jason via email.
The Cybersecurity Maturity Model Certification (CMMC) program recently reached an important milestone, naming the first several certified third-party assessment organizations.
Kratos and Redspin made it through the CMMC maturity level 3 (ML3) assessment gauntlet performed by the Defense Contract Management Agency’s Defense Industrial Base Cybersecurity Assessment Center (DIB CAC), as well as other administrative and personnel requirements.
“Reaching this step in getting the CMMC ecosystem up and running is a significant milestone and we look forward to authorizing additional C3PAOs in the coming days and weeks,” said CMMC-AB chief executive Matthew Travis in a release.
The naming of C3PAOs is the first step toward getting companies CMMC certified. The question now is whether vendors will decide it’s not worth the time or cost.
This potentially is the case among electronics manufacturers.
A new survey from the IPC, an industry association representing electronic manufacturers, found nearly a quarter of all respondents said the cost and burden of CMMC may force them out of the defense industrial base (DIB).
About half of IPC’s 3,000 members are located in the U.S. and many are serving the DoD market.
Chris Mitchell, vice president for global government relations at IPC, said in an interview with Federal News Network that CMMC may lead to further contraction of an industrial base that has been shrinking over the last 20 years.
“This is important because we’ve already seen a considerable contraction and reduction in the number of electronics manufacturers here in the United States. To give you a sense of the kind of trajectory that we’ve been on as a country, over the last 20 years or so we have dropped from more than 2,000 printed circuit board manufacturers in the United States to fewer than 200. And that number is expected to decline further,” Mitchell said. “We were hearing from so many of our members that they were having anxiety about CMMC. It’s important to understand that electronics manufacturing generally is a thin-margined business, so even small incremental cost increases can really affect a company’s competitiveness. As companies are beginning to undertake the assessments and do the other stuff necessary for certification, we were hearing from many of them that the costs were much larger than they had anticipated, and that there was continuing to be a lack of clarity about the requirements and what the timelines were.”
He added that the combination of a shrinking sector and the costs and burden of CMMC could leave the Defense Department facing a much weakened industrial base.
Taking this one step further, nearly every weapon system, every back-office process and every communication tool relies on the sector.
DoD’s January report to Congress on its industrial base capabilities underscored this problem.
“The dependence on foreign sources for semiconductor products continues to represent a serious threat to the economic prosperity and national security of the U.S., as much of the critical infrastructure is dependent on microelectronic devices,” the report stated. “This threat will become more pronounced as emergent technology sectors, such as Internet of Things (IoT) and AI, require commodity quantities of advanced semiconductor components.”
DoD also recognized the contraction in the market. The Pentagon said in the report that in the aerospace and defense sector, electronic equipment contributed 23% of total mergers and acquisitions’ deal value in the first half of fiscal 2020, about $15.4 billion. The most noteworthy of these mergers and acquisitions were BAE Systems Inc.’s acquisition of the Collins Aerospace Military Global Positioning System business, and Teledyne Technologies Inc.’s acquisition of Photonics Technologies SAS.
Mitchell said the potential impact isn’t just on the prime contractors, but the flow down to the subcontractors too.
“When it comes to the supply chain, there are already great strains on it. We had a call with an industry representative, not related to CMMC, and a big part of that discussion was the fact that we already are having a hard time sourcing parts, components, materials,” he said. “I think CMMC without some adjustments is likely to exacerbate these concerns.”
More than a third of the respondents say that CMMC will weaken the DIB, and 41% say the requirements will cause other problems in their supply chain. IPC received 108 responses from contract manufacturers, printed circuit board fabricators, original equipment manufacturers and suppliers who self-reported they are planning to undergo a CMMC assessment in the next five years.
Despite their concerns, IPC found some of its members, including original equipment manufacturers (OEMs), prime contractors and others already are beginning to implement CMMC.
Cost is another obstacle for electronics manufacturers. The survey found most suppliers say they expect, and are willing, to spend upwards of $50,000 on CMMC readiness. Nearly one-third (32%) report it will take them one to two years to prepare to undergo a CMMC assessment. IPC found more than half of the suppliers say that if implementation costs more than $100,000, CMMC would be too expensive.
“DoD’s own cost analysis estimated the cost of a CMMC Maturity Level 3 (ML3) certification to be more than $118,000 in the first year. This means DoD’s own estimate of CMMC compliance costs is too high for 77 percent of the IPC survey respondents,” IPC found.
But Mitchell said even that $118,000 estimate seems to be low.
“Those companies that are going through that process are reporting much, much higher cost estimates in excess of $300,000 in some cases, and these are not large companies that we’re talking about,” he said. “I think the fear on our part is that as companies go through this process, the cost estimates are likely to increase, and as a result, the inclination to leave the defense market may increase as well.”
What the survey didn’t answer is just how big the DoD market is for these electronic manufacturers, and is it a big enough market for them to spend money on CMMC? For instance, the Center for Strategic and International Studies (CSIS) estimated that the Army would spend more than $5.6 billion on communications and electronics equipment last year. Overall, CSIS projected funding for communications, sensors and electronics to increase by 21% by 2022.
Is a $10-15 billion market big enough for these firms to spend a few hundred thousand dollars each to play? Or is the potential not as attractive because the globalization of the electronics sector means hundreds of billions of dollars elsewhere, and DoD isn’t worth the trouble?
While IPC can’t necessarily answer it, it’s clear the dwindling number of contractors is concerning for both DoD and the industry at large.
The Defense Advanced Research Projects Agency (DARPA), for example, initiated the Electronics Resurgence Initiative (ERI) in 2017 as a response to several technical and economic trends in the microelectronics sector.
Through the program, DARPA is funding work across seven areas, including accelerating innovation in artificial intelligence hardware to make decisions at the edge faster, mitigating the costs of electronic design and overcoming security threats in the hardware lifecycle.
Mitchell said IPC would like to see DoD provide more clarity and transparency around CMMC, particularly by addressing reciprocity with existing industry standards.
“There are many existing industry standards in place that have actually been doing a pretty good job of strengthening the security of the industrial base. IPC, in fact, has worked very closely with the Defense Department to establish IPC- 1791, which is a trusted supplier standard that also integrates into it cybersecurity requirements. Companies have now been working for more than two years in order to meet that standard and be validated. As a result, the printed circuit board and printed circuit board assembly industries are more robust today, are more secure today than they were two years ago,” he said. “We would love to see whether it’s in the context of CMMC, or apart from it. We would love to see DoD place greater emphasis on leveraging these standards. I think that they reflect an industry commitment to ensure that our industrial base is secure, both physically as well as cyber.”
Interestingly enough, DoD even refers to the IPC-1791 standard in its January report to Congress, saying “A strategy is currently under development and will require implementation by January 2023.”
Mitchell said IPC has shared the survey results with DoD, as well as lawmakers.
He said the goal is to use the data to help convince DoD to work more closely with industry to figure out how companies can earn the CMMC certification in a way that isn’t too burdensome or too costly. He said the other issue is clarifying how to gain compliance beyond hiring consultants.
“Let’s take every opportunity to try to leverage existing standards that are already in use by industry to figure out if we can defray some of the costs that way as well,” he said. “My understanding is that there is a desire to bring some uniformity across the entire industrial base. In many respects, if you talk to the industry, they think it’s a laudable goal. I think the challenge, of course, is that it isn’t just in the case of security, but both in terms of security and quality as a whole, as well as a whole number of other areas. These companies are expending tremendous resources in order to have operations that are validated by one measurement or another. CMMC adds tremendous costs to businesses that are operating on thin margins. So to the degree that we can leverage existing standards, we think that that’s a really good approach.”
IPC members’ concerns over CMMC aren’t limited to one sector. While DoD has done a good job of talking about CMMC, the number of unanswered questions about what the path forward looks like is growing. DoD needs to make public how it will update its plan for CMMC based on Deputy Secretary Kathleen Hicks’ review, which was completed in May, and quash some of the silly rumors that have started to gain traction.
The deadline for agencies to fully implement the Technology Business Management (TBM) framework is technically three months away, about the time when initial budget requests for fiscal 2023 will go to the Office of Management and Budget.
Despite working on implementing TBM for the better part of four years, agencies continue to struggle with the data that is required to drive decisions and compare their costs with private and public sector experts.
“I couldn’t do benchmarking without having that four years of data, having the ability to have that trending and the comparison and that understanding of the data. But when you look federalwide, some agencies that have really matured around using the taxonomy are definitely looking at benchmarking,” said Maria Roat, the deputy federal chief information officer, during a webinar sponsored by ACT-IAC. “I want to caution around benchmarking. You just can’t say ‘Hey, I’m going to start with benchmarking,’ without having a pretty good read on your data because you don’t want to have flawed data and try to do benchmarking.”
And too many agencies still have flawed data, making it harder to get the expected value of TBM.
Roat said the governmentwide maturity around TBM and data isn’t quite there yet.
“While we’re continuing to make adjustments to the fiscal 2022 [budget] and looking at cutting down on the data requirements and what we’re getting at the Office of Management and Budget level, you can’t put the cart before the horse on the benchmarking because you don’t want to use flawed data. But perfection is the enemy of good enough. You want to have good enough data to be able to start doing benchmarking, and figure out in what areas that you want to compare yourself to,” she said. “There’s a lot of levers and a lot of moving parts on it, and when you make the decision and the timing on starting to do benchmarking, because the Community of Practice said we’re going to benchmark against industry, well, how do you know you have good data, and it’s not flawed? Can we compare agency to agency? Sometimes you can, sometimes you can’t, depending on the size of the agency, and what parameters you’re looking at. There’s a lot of factors to consider in the benchmarking. I think there’s value in it.”
Kelly Morrison, director of digital transformation and management at Grant Thornton Public Sector and a former performance analyst in OMB’s Federal Chief Information Officer’s office, said without a baseline it will be difficult for agencies to improve management, governance, strategic budgeting and oversight of IT spending.
“Agencies have to orient investments in IT around IT services/solutions being provided and be able to see how new investments or projects enhance existing IT services or create new services and how those services are enabling mission and business objectives — providing a clear line of sight,” she said in an email to Federal News Network.
Some of the changes to IT project data OMB is considering include removing 17 fields and adding seven new ones to the IT data management reporting requirements that help feed capital planning and investment control (CPIC) efforts.
An email obtained by Federal News Network from March outlines some of the possible changes OMB proposed to the CIO community. The data proposed for removal includes the requirement to outline how an investment maps to the Federal Enterprise Architecture’s Business Reference Model (BRM); the requirement to offer alternatives to cloud computing; and the detailing of total spend on infrastructure-, platform- and software-as-a-service and managed services.
The new data fields under consideration include questions focused on how projects are adequately implementing incremental development methodologies, including how often the agency is delivering new capabilities.
“As part of the larger federal IT dashboard modernization effort, GSA will be transitioning IT management data collection to a new approach called ‘IT Collect Application Programming Interface,’” the email stated. “The new application will take advantage of a modernized architecture, simplified coding language, longitudinal data collection and a ‘flat ledger’ designed to improve both the usability of the data being provided as well as reduce future costs when making data collection changes.”
The IT Collect API also will support OMB’s final requirements for the development of the fiscal 2023 budget request.
The data, both under TBM and more broadly for budget development, has been inconsistent over the years.
Keith Bluestein, the Small Business Administration’s CIO, said there was a time when a CIO’s office would send data to senior leadership to present to the President’s Management Council and find out later that the PMC received totally different data from OMB.
Bluestein said OMB and the data are much more consistent today than four or five years ago.
“The challenge was making sure that everybody was going to the same data source and it was being provided to the same place,” he said. “It’s tough to kind of get your arms around what that true data set is, and to make sure that’s the only one that’s going out of here.”
Dan York, the director of IT Spending Transparency at GSA, said this is why data standards and data quality are so important.
“The sooner we have actual standards by which we can capture IT in the acquisition systems or the financial management systems, budget execution systems, the sooner we can begin the process of automating those data pulls,” he said. “When we automate, we take the human error out of it. We take the fat finger errors out. We take the program manager guessing what their budget is and what they spent. That really allows the program managers to focus on the schedule, the performance, the risks of their program, and less of the manual data entry because we can pull those from auditable authoritative systems of record. Understanding what data we need, creating the standards by which to collect it in a system of record, and then automating that collection in such a way where we can really have the humans focus on human work, and have the machine focus on machine work. That’s really going to clean up data quality throughout the federal space.”
Roat said at the event OMB started focusing on data standards in 2019 with a focus on realigning product and service codes (PSC). OMB and the CIO Council removed 20 PSCs and realigned 40 others to abide by the TBM taxonomy overall.
“This past year the CIO Council took on a project to take a holistic look at the IT portfolio of the federal government, not just looking at CPIC, major investments and the cost, but really look at holistically what that portfolio is. I challenged the team to say, what are the core requirements that agencies have to report on? What does OMB need? What will help across the federal government?” she said. “We went back to basics and tore it down all the way to what’s required in law, the definition of CPIC, what’s required for budget submissions because even as we try to do multiyear approaches to modernization, if you don’t have good data to support that even trying to look forward, that’s going to be really hard.”
She said these changes will eventually be in Circular A-11.
Grant Thornton’s Morrison said it’s great that the TBM taxonomy is being baked into the Federal Integrated Business Framework (FIBF) and Product Service Codes (PSCs) but the taxonomy alone doesn’t provide the full value potential TBM offers.
“Policy and or legislation is needed to maintain the focus for agencies to implement TBM holistically, and for the federal government to continue baking TBM into the way the federal government does business/operates — for instance — updates to sub budget object class codes in financial management systems, incorporating the enterprise program inventory required by the Taxpayer Right to Know Act into TBM so that all IT costs can be aligned with the program inventory — both of these examples require engagement from communities outside of IT, thus reinforcing the focus of the panel,” she said. “TBM is not just an IT initiative, it requires the full CXO suite; policy or legislation is likely needed to hold agencies accountable.”
Morrison said that’s why it’s important for OMB to reinforce, and agencies to understand, that TBM isn’t just the CIO’s job.
“There was previously a TBM executive steering committee with OMB and agency CXO leadership spanning IT, budget, finance, acquisition, human capital and performance. Unless this is happening at OMB and the CXO Councils, it is going to be up to every agency to push the boulder up the hill to affect the desired changes,” Morrison said in an email to Federal News Network. “This is why TBM was included in the prior President’s Management Agenda and as a cross-agency priority goal. TBM needs to be a holistic management priority tracked at the PMC level and for which agency deputy secretaries/administrators are responsible.”
GSA’s York said more and more agencies understand the potential of TBM, as the community of practice grew to more than 650 members from 20 when the CIO Council launched it in 2017.
“There’s a real community approach to choosing projects that they want to implement and help with their maturity. Sharing the results of those projects across the entire government helps all boats rise together, which is really our main mission within the TBM project management office,” York said. “There’s certainly a lot of challenges that exist, particularly around change management and how you get large organizations to see the value in future-looking endeavors. But with the help of agencies and OMB, we really accomplished a lot over the last few years.”
The question is when will all of that work turn into widespread value for all agencies, instead of just a select few who jumped into TBM with both feet.
Since agencies began to talk about moving to the cloud in 2010, public and private sector advocates played up the idea that programs could “pay by the drink” or buy services on a consumption basis.
The fact is, few if any agencies truly achieved this model.
After almost two-and-a-half years of work, the General Services Administration is about ready to unleash a new way to buy cloud services.
GSA released its second draft policy to industry in May that would let agencies buy cloud services “by the drink” through the schedule contract.
The draft policy, created by Jeff Koses, GSA’s senior procurement executive, outlines how this buying approach would work under the schedule contract, including not requiring the Price Reduction Clause, which mandates vendors give the government their lowest price at all times, and what type of contract and funding approach would be used.
“GSA anticipates purchasing cloud computing on a consumption basis will increase competition, as the move towards commercial practices will encourage new entrants to the FSS program,” Koses wrote in the draft policy, which Federal News Network obtained. “With a contract structure more closely tied to real time demand, this approach also provides greater flexibility to take advantage of technology improvements and better support cyber security. Tying cloud computing procurements to commercial market prices will also provide cost transparency without burdening contractors with additional transactional price reporting requirements. Plus, this approach promotes cost efficiency as it reduces the need to lock into long term contracts in markets where falling prices are reasonably anticipated.”
“We hope the policy lays out a clear way to execute the pay by the drink execution strategy using the schedules,” said Nick West, GSA’s deputy director of the Office of Policy, Integrity and Workforce, during a panel at the recent Coalition for Government Procurement spring conference. “We hope to have some sort of language in the schedule contracts by the fall or maybe earlier, hopefully. We really are looking to build something that the CIOs will use and [industry] will offer solutions for them to use.”
GSA gave industry its first look at how it wanted to change this policy in January 2020.
West said the pandemic delayed the work on the final policy, but GSA did incorporate comments into this second draft policy.
Keith Nakasone, who recently retired after 32 years in government, including the last four-plus as the deputy assistant commissioner for acquisition in GSA’s Office of IT Category, said adding the pay-by-the-drink model to the schedules is another way the Federal Acquisition Service is evolving the cloud special item number (518210C — previously 132-40).
The second draft memo offers more specifics than the first one. For instance, agencies would buy off of cloud service provider pricelists and receive any discounts as prices change. Agencies can incrementally fund task orders for cloud services instead of putting all of the money on contract at one time.
“The ordering activity contracting officer will use a requirements task order. This task order type provides for filling all actual purchase requirements of an ordering activity for cloud computing services during a specified contract period, with performance by the contractor being scheduled by activating and funding individual contract line items (CLINs) and sub-CLINs under this task order,” the draft policy states. “The ordering activity contracting officer must state a realistic estimated total quantity in the task order solicitation and the resulting task order. All CLINs within the task order must include a defined scope with all items priced at time of award. The ordering activity may obligate funds as the bona fide need arises for predefined and established fixed-priced procurement requirements on individual CLINs and sub-CLINs.”
GSA also says it will analyze metrics such as cost transparency, increased competition and better cybersecurity to gauge how this pay-by-the-drink model is working.
Agency customers haven’t exactly been beating down the door to use the schedules to buy cloud services. Of the $6.8 billion agencies spent on cloud services in fiscal 2020, according to Bloomberg Government, about $400 million of that came through the schedules. Cloud buys accounted for about 7% of all spending on the IT schedule last year.
The new policy would meet two of GSA’s goals: Making it easier to buy cloud services—a common call from industry and agency customers—and driving more business to its vehicle.
“GSA should be applauded for trying to make governmentwide policy in this vitally important area for federal IT modernization and service delivery. The hope is that they spend even more time consulting with industry leaders in this space, as well as key agencies who are leveraging the cloud most effectively, and only adopt guidance that is necessary for the bulk of government — and not simply trying to nibble around edge use cases,” said one industry expert, who requested anonymity in order to speak about the draft policy. “The scope and focus of the memo take a very ‘government-centric’ view of a very common issue in the commercial market, which is how to price the consumption of cloud services. Rather than complicate the federal buying guidance on this, GSA should require agencies to adopt commercial pricing that cloud providers offer to their non-government clients. These prices can be made transparent and be updated in real time through standard cloud service provider tools that can make data available on a GSA catalog.”
One way the draft policy takes this “government-centric” view of cloud buying is the requirement for task orders to have ceilings and for vendors to alert the customer if the total amount will reach 85% of the cap within the next 30 days.
Industry experts would argue these requirements still do not match how the private sector buys, which is based on demand and not funding.
“GSA seems to acknowledge that the cost per unit of usage for cloud generally goes down over time, but they feel the need to orient the memo to solve for the fact that there might be some random need for a spike in usage at a certain instance,” the industry expert said. “It seems difficult to understand how GSA will constantly monitor for this across all agency cloud contracts and there is worry that they are trying to solve for a problem that is already being addressed. Think of how agencies operated with increased compute needs during the coronavirus response.”
In GSA’s defense, it may never be able to achieve pure commercial parity, but the draft policy takes an important step.
“[C]onsumption purchasing may not be the best fit for every requirement. Cloud service providers offer multiple pricing models, including pay-as-you-go (e.g. on-demand and spot instances) and paying upfront (e.g. reserved instances, subscriptions). The pay-as-you-go models are the most popular in the private sector because of their efficiency and flexibility. However, other models may be more appropriate in certain circumstances,” Koses wrote. “For example, upfront payment plans, while inflexible, are often highly discounted and may offer the best value for users with predictable needs. As such, the private sector routinely leverages combinations of these pricing models and the government should replicate this approach as appropriate.”
This isn’t the first time GSA has tried to make it easier for agencies to buy cloud services. In 2016, the agency began work to change the Federal Acquisition Regulations and even considered legislative remedies, but it’s unclear if anything came from those initiatives. A year later, an interagency working group developed a best practices guide for buying cloud services.
Let’s hope after five years of fits and starts, GSA completes the promise made more than a decade ago that agencies will be able to buy cloud services like they buy electricity.
The Biden administration’s first complete budget request was light on technology and cyber policy and process changes. But it was definitely full of hope.
The hope that Congress will fund the Technology Modernization Fund (TMF) at another $500 million for fiscal 2022 after putting $1 billion into it as part of the American Rescue Plan Act.
The hope that agency workforces will grow by as much as 9.6% at the Department of Housing and Urban Development (HUD) or 7.4% at the Environmental Protection Agency (EPA), after years of being stagnant or seeing reductions.
And the hope that the largely flat Defense Department request and the decision to do away with the Overseas Contingency Operations (OCO) account will not be overtaken by the defense hawks, lobbyists and the Pentagon itself, all continuing to bang the drum over near-peer competition and losing the next great power competition.
Like every White House budget request, the Biden version is full of ideas and concepts that begin the long conversations with Congress. The goal, of course, is that these efforts will culminate not with the threat of yet another partial government shutdown or multiple continuing resolutions, but agencies knowing where they stand with enough time left in fiscal 2022 to hire employees, improve technology and continue to prove the value of federal programs and policies.
“The budget makes these investments in a way that’s responsive to both the near- and medium-term economic landscape, as well as the long-term challenges our country faces. In the near-term, the decades-long, global trend of declining interest rates, even as publicly held debt has increased, gives us the fiscal space to make necessary upfront investments,” said Shalanda Young, acting director of the Office of Management and Budget (OMB), during a call with reporters on May 28. “Under the budget policies, the real cost of federal debt payments will remain below the historical average through the coming decade, even as the budget assumes that interest rates will rise from their current lows, consistent with private-sector forecasts. Low real debt service payments show that the cost of these upfront investments is not burdening the economy. To the contrary, failing to make these investments at a time of such low interest costs would be an historic missed opportunity that would leave future generations worse off. This budget does not make that mistake, and it invests — its investments will pay dividends for generations to come. Over the long run, when we face larger fiscal challenges and more uncertainty about interest rates, the budget will reduce the deficit and improve our nation’s finances.”
Anyone who has paid attention to this annual exercise over the last two decades knows that the budget request has been more of a policy document than a funding effort based on reality. This is especially true for technology, cybersecurity and government reform efforts.
This year, however, the analytical perspectives chapter, where most of these policy ideas usually live, is full of “mom and apple pie” quotes like, “Cybersecurity is an important component of the administration’s IT modernization efforts….” or “Federal agencies’ ongoing efforts to modernize their IT will enhance mission effectiveness and reduce mission risks through a series of complementary initiatives…”
So to find the real policy ideas and see where OMB wants agencies to move, you have to dig a little deeper.
Here are the budget items that help demonstrate the Biden administration’s IT and cyber policy goals. These are not in any specific order.
Usually any discussion about IT and cyber starts with the TMF. But for this one, let’s start with the other piece of the Modernizing Government Technology (MGT) Act that has the potential to be more impactful: working capital funds. Over the last few years, several agencies have asked for this authority from Congress, but only the Small Business Administration (SBA) has received it.
The Labor Department is asking for authority to create an IT working capital fund (IT WCF) using MGT Act authority. “This IT WCF would include all activities currently financed through the WCF, as well as the development and operational costs for agency-specific applications currently funded directly by agencies. Shifting these activities into an IT WCF has no impact on total spending at the department.”
Labor is asking for $37.2 million for IT modernization efforts in 2022, up from $29 million in 2021 and $26 million in 2020.
The Office of Personnel Management (OPM) also seems to be requesting authority for an IT WCF. The budget says it plans to “transfer salaries and expenses extra funding into the IT WCF.”
OPM’s CIO’s office asked for $73 million in 2022, up from $39 million in 2021 and $46 million in 2020.
The U.S. Agency for International Development also is seeking IT WCF authority for at least the third year in a row. It is asking to transfer 5%, or up to $30 million, to the fund.
Congress added a provision in the 2021 budget that says no more than 3% of the salaries and expenses and business loans program accounts may be transferred to the IT WCF. SBA says it expects to have $2 million in the IT WCF in 2022, down from $7 million in 2021.
The SBA’s CIO’s office asked for $30 million in 2022, down from $48 million in 2021, but up from $29 million in 2020.
Interestingly, the Environmental Protection Agency has run a working capital fund since 1997 and says the MGT Act gives it additional authority to use the money for IT modernization efforts, such as its enterprise human resources IT services and regional information technology service and support, managed by EPA Region 8. EPA expects its WCF to have $354 million in 2022.
Few, if any, other agencies have taken a similar interpretation of the MGT Act, which is why Sen. Maggie Hassan (D-N.H.), chairwoman of the Homeland Security and Governmental Affairs Subcommittee on Emerging Threats and Spending Oversight, plans to add a technical amendment to the MGT Act to fix this challenge.
Meanwhile, the Department of Homeland Security says it is shutting down its non-MGT Act working capital fund. In 2020, the WCF had $424 million.
“DHS and the Working Capital Fund (WCF) governance board decided to dissolve the WCF in 2021. This decision was reached after conducting strategic reviews of the WCF governance criteria and discussions within the Management Directorate on their business strategy for providing services to their customer base,” the budget documents state. “As a result, no funds are included in the 2022 Budget. All activities were removed from the WCF with base transfers in 2021. DHS components will transfer funds to the servicing management lines of business for fee-for-service and governmentwide mandated services.”
Finally on a related WCF note, the Transportation Department (DoT) plans to spend $93 million from its $726 million agencywide central account “to continue the department’s implementation of a shared services environment for commodity information technology (IT) investments. The IT shared services initiative will modernize IT across the department and improve mission delivery by consolidating separate, overlapping, and duplicative processes and functions.”
The CIO’s office is asking for $17.7 million in 2022.
The budget also says DoT will continue consolidating commodity IT services across operating administrations with a focus on investment-level commodity IT, as well as IT security and compliance activities. It will use shared services to enable the department to improve cybersecurity, increase efficiencies and improve transparency in IT spending.
The Biden administration is asking for $9.8 billion for federal civilian cybersecurity in 2022. This would be a 14% increase over 2021. The Defense Department says its cybersecurity budget request in 2022 is $10.4 billion, bringing total cyber spending above $20 billion governmentwide for the first time.
Included in the overall request is $20 million for a new Cyber Response and Recovery Fund (CRRF) run by the Cybersecurity and Infrastructure Security Agency (CISA) to improve national critical infrastructure cybersecurity response.
“In the first year, the administration proposes to pilot the CRRF, limiting funding during the pilot phase to supporting non-federal entities in responding to, and recovering from, a critical cyber incident,” the budget documents state. “The CRRF would be purpose restricted to carrying out CISA’s existing statutory authorities for cyber response and recovery in support of critical infrastructure and during a significant cybersecurity incident as defined in Presidential Policy Directive (PPD 41): United States Cyber Incident Coordination. Funds would only be available if all criteria were met and if the President had approved use of the funds.”
Sens. Gary Peters (D-Mich.) and Rob Portman (R-Ohio) introduced a bill on May 12 to create the fund, which would enable CISA to more easily coordinate federal and non-federal responses to a major cyber incident. The bill authorizes $20 million for the fund.
The Biden request also includes $15 million to support the Office of the National Cyber Director, which Congress established in the National Defense Authorization Act of 2021.
In all, the Biden administration is asking for $1.7 billion in total funding for CISA, including $913 million for cybersecurity. Of that, $350 million would be for the procurement, construction and improvements account, down from $439 million in 2021 and $481 million in 2020. This account typically funds the continuous diagnostics and mitigation (CDM) program as well as national cybersecurity protection system initiatives like EINSTEIN and automated information sharing. The budget did not break out CDM or other programs specifically.
Beyond CISA, several agencies are seeking increases in cybersecurity funding, including those impacted by the SolarWinds attack.
The Treasury Department, for example, is asking for $137 million for its cybersecurity enhancement account (CEA), which is $114 million more than usual to provide “resources to strengthen Treasury’s cybersecurity posture and address the impacts of the SolarWinds incident.”
The CEA supports departmentwide and bureau-specific investments in high-value assets and provides more enterprisewide services.
The Energy Department’s CIO office is asking for $68 million in 2022 after making no specific line item requests in 2021 or 2020. While the request doesn’t say what the money would be used for, it’s easy to assume that at least some of the funding is for cybersecurity. Energy wrote in the 2022 request that “significant investments will address cyber vulnerabilities identified as a result of SolarWinds incident of December 2020.”
Energy also is looking to merge two cyber offices with the Defense Critical Energy Infrastructure (DCEI) Energy Mission Assurance functions moving into the Office of Cybersecurity, Energy Security and Emergency Response (CESER). The merger would bring the cybersecurity sharing and support efforts with the electric utility industry under CESER’s purview.
Energy is requesting $201 million for CESER, up from $157 million in 2021 and $158 million in 2020.
Beyond the SolarWinds attack, agencies are seeking more funding for cyber defense efforts.
The FBI is seeking $15.2 million to defend itself from cybersecurity threats.
The Agriculture Department CIO requested $101.1 million, including $56 million for cybersecurity requirements. The overall request is up from $67 million in 2021 and $65 million in 2020.
The Commerce Department is asking for $126.9 million for technology modernization projects. This funding would be divided up with $20 million for business application system modernization and $106.9 million for cybersecurity risk mitigation.
The federal civilian IT budget request reaches $58.4 billion, a 2.4% increase over 2021. The Biden administration, for reasons unknown, didn’t include the Defense Department’s IT or cyber budget requests in the analytical perspectives. So it’s hard to get a sense of how much the overall federal IT budget is increasing.
The Federal IT Dashboard says DoD will spend about $37.1 billion on IT in 2021, up from $36.6 billion in 2020. In its 2022 request, DoD says it’s moving funding around to meet needs in artificial intelligence, 5G and other emerging technologies.
Using these numbers as the baseline, total federal IT spending would be more than $95.5 billion.
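The arithmetic behind that estimate is simply the sum of the civilian request and the DoD figure above; a quick sketch (the variable names are just labels for the figures cited in this article):

```python
# Figures cited above, in billions of dollars.
civilian_it_2022_request = 58.4  # federal civilian IT budget request for 2022
dod_it_2021_spending = 37.1      # DoD IT spending per the Federal IT Dashboard

total = civilian_it_2022_request + dod_it_2021_spending
print(f"Estimated total federal IT spending: ${total:.1f} billion")
```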
As mentioned earlier, the administration wants another $500 million for the Technology Modernization Fund (TMF). What’s interesting with this request, which comes under the General Services Administration’s budget because GSA manages the fund, is that GSA expects to carry over $811 million of its $1.086 billion in 2021 and 2020 funding.
This seems to indicate that OMB and the TMF Board expect to hand out less than $200 million from the TMF over the next four months.
The two other funds that help with IT modernization also are seeing solid support.
GSA is asking for an increase to $59.2 million for its Federal Citizen Services Fund, up from $55 million in 2021.
OMB’s IT Oversight and Reform (ITOR) fund seeks $10.4 million in 2022, up from $5 million in 2021 and $6 million in 2020.
OMB says the additional money will be used for the Federal Acquisition Security Council (FASC), including implementing the sharing of supply chain risk information and exercising the council’s authority to recommend removal and exclusion orders to address supply chain security risks within agencies.
The administration also offered more details about its plans to use the $200 million for U.S. Digital Service that was in the American Rescue Plan Act.
USDS plans to increase its full-time employees to 271 in 2022, up from 161 this year. It says the larger number of employees will enable “USDS to quickly address technology emergencies, ensure access and equity are integrated into products and processes, and help agencies modernize their systems for long-term stability.”
USDS, however, expects fewer employees, 60 compared to 63, to be reimbursable through agency fees.
The Justice Department (DoJ) also is seeking a hefty increase in its IT modernization account.
DoJ says it wants $141 million for its Justice Information Sharing Technology fund. That is up from $84 million in 2021 and $68 million in 2020.
“IT transformation is an ongoing commitment to evolve DoJ’s IT environment by driving toward shared commodity infrastructure services and seeking simplified design and implementation of tools to advance the mission. These efforts allow DoJ to shift from custom, government-owned solutions, to advanced industry-leading offerings at competitive pricing. The OCIO recognizes modernization as an ongoing activity, requiring IT strategies to adapt as technology changes,” the budget document stated.
The decision to ban Kaspersky Lab products and services from federal agency networks and systems may just have been a shot across the bow.
The Justice Department is considering rolling out the big guns against companies owned and operated by Russian nationals.
John Demers, the assistant attorney general for National Security in DoJ, said in light of the SolarWinds attack, Justice, along with the FBI and the intelligence community, launched a new effort to see where there may be supply chain vulnerabilities of companies that are Russian or are doing business in Russia.
“This is not meant punitively, but meant protectively,” Demers said at the recent Justice Department Cyber Symposium. “Where there’s a critical pieces of software, if there’s back end software design and coding being done in a country where we know that they’ve used sophisticated cyber means to do intrusions into U.S. companies, then maybe the U.S. companies shouldn’t be doing work with those companies from Russia or from other untrusted countries. That’s something that we’re going to be looking closely at.”
Demers said Justice, the FBI and ODNI will share the information they collect with the Commerce Department, which will then decide how to use its authorities under the May 2019 executive order signed by President Donald Trump to determine whether to prohibit use of technologies that pose a risk to agencies or critical infrastructure.
“We are evaluating the risks of using information and communications technology and services (ICTS) supplied by companies that operate or store user data in Russia or rely on software development or remote technical support by personnel in Russia,” a Justice Department spokesman added after the event in an email to Federal News Network. “Unlike sanctions, which punish individuals and entities for bad behavior in the past, this review is focused on risk management: Which companies, or classes of transactions, pose a heightened threat to national security because of the vulnerabilities they introduce or the consequences, should they be exploited in the future.”
The spokesman offered no timeline for when DoJ and its partners would complete the review.
This move from the Justice Department, FBI and the intelligence community follows Congressional requirements for agencies to stop using products and services from telecommunications companies owned and operated by Chinese nationals.
The recent SolarWinds compromise and other cyber attacks are driving DoJ’s review of Russian companies. Only Kaspersky Lab has been officially banned from federal networks and systems.
Concern over foreign ownership or influence of technology companies isn’t just about ownership and location; it’s also about espionage and data leaks.
The Defense Department is taking steps to shore up its industrial base against these long-standing problems.
The DoD chief information officer is expanding its defense industrial base (DIB) cybersecurity information sharing program.
“Although this program was designed to share indicators of compromise and malware analysis services with cleared defense contractors—those members of the industrial base that have security clearances and access to classified information—the DoD CIO is working to amend relevant regulations to expand the program to include non-cleared defense contractors, thus enabling small- and medium-sized contractors to receive important information, including the same signatures, malign IP addresses and threat advisories that the larger cleared primes receive as part of the program,” said Rear Adm. William Chase III, the deputy principal cyber advisor to the Secretary of Defense and director of protecting critical technology task force, in written testimony before the Senate Armed Services Subcommittee on Cybersecurity. “The Defense Cyber Crime Center (DC3) is also expanding the services available to the DIB, piloting efforts such as penetration testing to address contractors’ external-facing vulnerabilities and an adversary emulation program.”
Chase also told lawmakers about a cyber threat intelligence sharing program called Crystal Ball, which is an “outside looking in” type of program.
It helped identify and notify 13 DIB partners about the Microsoft Exchange attacks from Chinese actors.
Another is the DIB vulnerability disclosure program, which helps companies improve their cyber hygiene.
DoD is looking to expand these pilots to those companies that do not have security clearances in order to more broadly share the threat data from just 800 contractors to tens of thousands.
Additionally, the National Security Agency is running pilots to share unique, actionable threat information and cybersecurity guidance with members of the DIB and their service providers. Another pilot provides unique cybersecurity capabilities to the DIB, among the most promising of which is the provision of free and secure Domain Name System (DNS) lookup services to the DIB.
“The NSA is offering this cybersecurity service — called Protective DNS, or pDNS — in partnership with an advanced commercial DNS provider and is currently enrolling members of its industrial base,” Chase wrote. “This capability combines a commercial DNS sensor architecture with real-time analytics to quickly understand malicious activity targeting the DIB and to deploy immediate countermeasures. The efficacy of this service has been widely demonstrated—it does not require access to internal contractor networks and has the potential to prevent or disrupt adversary cyber exploitation activities.”
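Conceptually, a protective DNS service sits between users and the public DNS, screening each lookup against threat intelligence before it resolves. A minimal sketch of the idea, with a made-up blocklist and a stubbed upstream resolver (illustrative only, not NSA's or its commercial partner's implementation):

```python
# Illustrative sketch of a protective DNS (pDNS) check, not NSA's actual
# implementation: lookups for known-malicious domains are sinkholed before
# resolution, and everything else resolves normally.

# Hypothetical blocklist; a real service draws on commercial and government
# threat intelligence feeds that are updated in real time.
BLOCKLIST = {"malicious.example.net", "c2.badactor.example.org"}

def resolve(domain: str, upstream) -> str:
    """Return an IP for `domain`, sinkholing known-bad lookups."""
    if domain.lower().rstrip(".") in BLOCKLIST:
        # Sinkhole address; a real service would also log the attempt
        # so analysts can investigate the requesting host.
        return "0.0.0.0"
    return upstream(domain)

# Example with a stubbed upstream resolver:
print(resolve("malicious.example.net", upstream=lambda d: "93.184.216.34"))
# prints 0.0.0.0; the query never reaches the real resolver
```

Because the check happens at the resolver, it needs no access to internal contractor networks, which is the property Chase highlights above.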
And, of course, there is the Cybersecurity Maturity Model Certification (CMMC) program.
Jesse Salazar, the deputy assistant secretary of Defense for industrial policy, told Senate lawmakers DoD moved CMMC under his office earlier this year.
He said the final CMMC acquisition rules should be in place by the end of 2021 after reviewing about 850 comments and the recommendations from the deputy secretary’s review.
“As part of our look, we are trying to assess how we bring clarity to the requirements that we are asking, looking at the barriers to small businesses and then making sure we have trust in the ecosystem,” Salazar said.
A source with knowledge of CMMC confirmed the deputy secretary’s 30-day review was completed in late April.
Salazar said in his written testimony that one of DoD’s biggest challenges with CMMC is to deconflict and streamline multiple cyber requirements to avoid requiring duplicative efforts.
“This includes providing clear guidance on the alignment of the NIST SP 800-171 DoD Assessment Methodology and CMMC, as they pertain to safeguarding controlled unclassified information (CUI), as well as the requirements and assessment approach for contractors that use cloud service provider offerings,” he said. “Moreover, the department is committed to working with our allies and international partners to better understand how the CMMC framework compares with other nations’ cybersecurity requirements and better align these requirements to help protect the department’s mission critical supply chain.”
Chase said many of these pilots provide direct cybersecurity services to industry instead of depending on companies’ ability to use tools to detect threats and vulnerabilities.
“This approach institutionally buys down cybersecurity risk across entire industry segments rather than relying on individual small- and medium-sized businesses to defend their networks as if they were large prime contractors,” he said.
In a matter of days, the federal contracting community will learn whether the General Services Administration avoided and learned from the mistakes it made in November 2019, or if history will repeat itself.
GSA’s Federal Acquisition Service flipped the switch on May 24 for the new SAM.gov portal, bringing together 1.5 million users across six acquisition websites under one umbrella and integrating the data to reduce the burden on agency and industry customers alike.
The new SAM.gov removes the “beta” from the web address, retires the previous version of the site and aims to create a common look and feel across acquisition systems under the Integrated Acquisition Environment (IAE).
“One of the visions we’ve had now for many years in integrating all of these capabilities together is to create for you and contracting officials a single workspace,” said Judith Zawatsky, the assistant commissioner of the Office of Systems Management in FAS, during the recent Coalition for Government Procurement spring conference. “If you already have a log-in and an account, you will not need to do anything but come in and authenticate yourself. You will not need to create a new account and you will not have any issues. When you do authenticate yourself, you are going to find a workspace where you can do all the things for which you have a role for your entity.”
Zawatsky said the user’s workspace will include common searches the user performs and the reports they typically run.
“We’ve changed the design and layout to make it more readable. The design in compliance with the 21st Century IDEA Act. The design is already very much ahead of the USDS web standards and brings in a lot of functionality. We are very excited about the look and feel, and the ability to see more data that you are looking for and understand what you are looking at so you have to go through less pages,” she said. “We’ve also added a whole new set of data analytics into the system so we actually will be able to get a better understanding of how people are interfacing not only with the site, but with each page, and that, along with feedback, continue to drive changes and improvements.”
The constant feedback — there is a link on every page of SAM.gov to submit comments — will be the telltale sign of success.
If you remember, when GSA first launched this site as beta.sam.gov in November 2019 and shut down FedBizOpps.gov, industry was none too happy with the results.
Some of the common complaints back then focused on search parameters, reduced functionality and a lack of data standards. The Professional Services Council outlined its complaints in a 22-page letter sent two months after the launch.
Zawatsky said in an interview with Federal News Network that GSA received more than 35,000 pieces of feedback already from customers, and the landing pages have gone through 50 different iterations to improve design and functionality through user experience and feedback mechanisms.
“We really, really, really listened. It’s taken us more than a small amount of time to move from that first rollout of beta.sam.gov and then the FBO integration, and we’ve taken a lot of firepower through our business and product owners to really review all of the input that we got, suss it out, organize it and do feedback sessions,” she said. “We do recognize that we have users who are very large businesses that have 100 entities under them and they are trying to manage all of that, and we have people who are just trying to apply for an American Rescue Plan Act grant and just are trying to get through the process. We are trying to accommodate all of those people across an intense amount of data.”
GSA isn’t blind to the fact that agency and industry customers will have to adjust to the new site and there will be some challenges.
Zawatsky said she believes GSA opened up its testing and focus groups to as many agency and industry customers as possible. GSA also believes the workspace concept will give users more control over the specific data they are looking for.
“We really encourage people to log-in so they can have their profile and workspace. They can follow opportunities. They can look at their searches and create their own experience. That is our iterative goal. We bring all this data together. We make it 21st Century IDEA Act compliant. We make it secure. We keep the data clean and protected, and people create their own experiences,” she said.
To log in to the system, users will need a login.gov account.
Another big change GSA is getting industry and agencies ready for is the move away from the DUNS number to the Unique Entity Identifier (UEI), which will become the standard starting in April 2022.
Every vendor organization will receive a UEI at the launch of the new portal.
Amber Hart, co-founder of the Pulse of GovCon, a market intelligence firm, said GSA has an opportunity with the new SAM.gov to fix some of the frustrating aspects of beta.sam.gov.
“A majority of industry spends thousands of dollars so that a commercial firm can string together GSA’s own data to figure out the historical context. GSA could remedy this situation and did make a slight attempt at doing this with ‘history’ and ‘related notice’ data entry points on the new SAM.gov but that feature seems to be missing the point as it still is a manual entry process,” Hart said in an email to Federal News Network. “This is a massive undertaking that I don’t think anyone could ever get fully right, and I know GSA had their own reasons for combining all of these systems – like making internal processes easier, more secure, etc. – but to industry, it just seems like the contract opportunity functionality of SAM.gov was an afterthought based on the outcome.”
Hart and others will have plenty of opportunity to comment and offer feedback over the coming weeks.
FAS Commissioner Sonny Hashmi said at the Coalition’s event that the site is more responsive and more scalable.
“We decoupled the front end from the back end. For you, that means that we are able to roll out more capabilities more quickly and in a more decentralized way,” he said. “We know SAM isn’t always easy to use, and we know you all have identified some pain points. One thing I can commit to you is that we will listen.”
And if there are problems, industry will not be shy about speaking up.
Soon after the specifics about the SolarWinds attack came to light, the Department of Homeland Security went to work to limit the damage.
Among the first things it did was put the attack signatures into the EINSTEIN toolset that is used by nearly every agency.
“As part of the SolarWinds campaign, EINSTEIN was extremely useful in terms of identifying suspicious network traffic from a handful of federal civilian agencies that upon further investigation by those agencies helped identify additional victims of this campaign. It’s worth noting that EINSTEIN didn’t prevent the intrusion nor was it able to detect the intrusion until, in this case, we received threat information from private sector partners to inform our detection and prevention mechanisms,” said Matt Hartman, the deputy executive assistant director for cyber at CISA, in an interview with Federal News Network. “As soon as CISA received indicators of this activity from industry partners, we immediately leveraged EINSTEIN to identify and notify agencies of anomalous activity on their networks, which helped accelerate response, remediation and recovery activities.”
Hartman said it also helped CISA as part of the Unified Coordination Group to provide asset response and remediation of the attacks.
“Without EINSTEIN, we may have departments today that still did not know they were victims of this campaign. Through the EINSTEIN 1 NetFlow capability to — after the fact — look at indicators, identify potential indications of compromise has proven extremely useful,” he said. “This is just one example over the last few months of CISA being alerted via EINSTEIN of a potential compromise of a federal agency’s network. We are consistently flagging this sort of anomalous activity to agencies, which then kicks off further investigation and incident response activity, as appropriate.”
Hartman said EINSTEIN provided insights into specific indicators or call-outs to internet protocol addresses or domains that were known to be part of this campaign at other agencies or the private sector.
He added that EINSTEIN helped confirm to multiple agencies that they were victims of the SolarWinds attack.
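The retrospective lookup Hartman describes amounts to re-checking historical NetFlow records against indicators of compromise that arrive later. A simplified sketch, with invented field names and sample records rather than EINSTEIN's actual schema:

```python
from datetime import datetime

# Simplified sketch of after-the-fact indicator matching over NetFlow
# records: traffic logged months earlier is re-examined once new
# indicators of compromise (IOCs) arrive from industry partners.
# All records and addresses here are invented for illustration.
flows = [
    {"agency": "Agency A", "dst_ip": "203.0.113.7",  "ts": datetime(2020, 12, 1)},
    {"agency": "Agency B", "dst_ip": "198.51.100.9", "ts": datetime(2020, 12, 3)},
    {"agency": "Agency A", "dst_ip": "192.0.2.44",   "ts": datetime(2020, 12, 5)},
]

# Indicators received later from industry partners.
iocs = {"203.0.113.7", "192.0.2.44"}

def find_possible_victims(flows, iocs):
    """Return agencies whose historical traffic touched a known-bad address."""
    return sorted({f["agency"] for f in flows if f["dst_ip"] in iocs})

print(find_possible_victims(flows, iocs))  # ['Agency A']
```

The key point is the same one Hartman makes: the match happens after the fact, so the value lies in accelerating victim notification, not preventing the intrusion.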
The value EINSTEIN demonstrated during SolarWinds is overlooked by many in the federal community. Part of the reason is the Homeland Security Department’s poor communication and lack of transparency about EINSTEIN’s capabilities over the last 15 years.
Suzanne Spaulding, the former undersecretary of the National Protection and Programs Directorate at DHS and now the senior adviser for homeland security and director of the Defending Democratic Institutions project at the Center for Strategic and International Studies, said NPPD, now known as CISA, could’ve done a better job educating the public, Congress and the media about what EINSTEIN was designed to do.
“I can remember these conversations post-OPM breach about what EINSTEIN allowed us to do once we detected malicious activity. We took the signature information and loaded it into EINSTEIN. It provided protection to other agencies who deployed EINSTEIN. It was valuable in that sense,” she said. “I remember having these conversations with folks on the Hill. It missed the initial attack because it was something we hadn’t seen before, but the important value of EINSTEIN was that it prevented the same attack from being used against others.”
Spaulding and other former DHS cyber officials readily admit EINSTEIN’s limitations can be frustrating. Among the complaints about EINSTEIN over the years is that it is only reactive to known problems and doesn’t help agencies address threats in real time.
Additionally, CISA has been slow to evolve the tools and capabilities in EINSTEIN. The technology that EINSTEIN uses doesn’t work well with cloud services and causes latency in networks.
For example, E2, a network intrusion detection tool that looks for malicious or potentially harmful network activity, has become less effective over the last few years because so much of the data is encrypted.
Hartman said CISA recognizes those shortcomings and is trying to move E2 toward the end points.
“We have been working on a pilot or proof of concept for the better part of 18 months now. We’ve seen some great successes, really pairing CISA’s threat hunting analysts, who have an intelligence-driven focus, with the agency security operations center analysts, who have a tremendously rich understanding and context of their environments, to help rapidly detect anomalous activity and potentially malicious activity at the end point to include lateral movement,” he said.
Tom Bossert, the former homeland security advisor to President Donald Trump and now president of Trinity Cyber, said he was encouraged by CISA moving its tools closer to the end points and to cloud environments.
He said agencies understood the push to remote work over the last 15 months created a broader attack surface, but only now do they realize they have to do more to protect their employees, data and applications.
“The future of cybersecurity is a different architecture. We put active sensors at the internet-facing edge of network where a department or company connects to the internet and that usually goes through things like firewalls or intrusion detection and prevention tools,” he said. “But if you aren’t going through a central access point and going straight to the cloud, your protections are more limited. The only thing that is preventing the agency or company from being hacked is that web gateway which aren’t necessarily that good. If a hacker accesses a machine that is a pathway to cloud services. Any internet facing access should run through a break-and-inspect type of service that fully interrogates all traffic. You have to decrypt it first, but if you apply that standard you will end up with a better option to protect your networks.”
Despite EINSTEIN’s limitations over the years, cyber experts agree agencies wouldn’t be as safe without it.
“The value of EINSTEIN is old exploits and signatures are still valuable if you are not looking for them,” said Matt Coose, the former director of Federal Network Security Branch at DHS and now CEO and founder of Qmulos. “The real goal is to move the timelines to the left so we get updated signatures more often so we can detect what is going on more quickly. Without the early warnings, we still are in reactive and monitoring mode against old threats.”
Spaulding added that because 98% of the user population across civilian agencies uses EINSTEIN, it gives CISA a level of visibility into what’s happening on agency networks that is essential for other cyber tools and activities to work well.
CISA fully recognizes that EINSTEIN needs to change more quickly, especially as remote work remains so widespread.
“We are evolving all our core programs and capabilities to provide the protections at the network, at the host levels and anywhere else we can secure the civilian enterprise while increasing CISA’s ability to rapidly detect threats,” Hartman said. “We also are working with OMB and the inter agencies to drive toward more sophisticated architectures, including zero trust concepts that are focused on identifying and securing the federal government’s highest value data.”
Back in 2005, the head of the National Security Agency broke out his red marker and circled a section of a white paper written by cybersecurity experts and gave them a two-month deadline to bring this idea to bear.
The concept the experts detailed to Gen. Keith Alexander would let NSA use technology to identify adversary tradecraft in flight, outside the wire so to speak, and treat it as a network problem.
Alexander thought the technology would be a game changer — maybe not a silver bullet — but something that would give the Defense Department a head start against ever-increasing threats before they made their way into the network.
Now, 16 years later, experts say this type of technology would’ve gone far to prevent, or at least limit, the damage from most of the major cyber breaches federal agencies have suffered since 2005.
“It was a big deal because we brought intelligence and defensive folks under one roof. The results were profound. We created a rich contextual threat intelligence about what adversaries were doing to DoD and we used it to warn incident responders and others,” said Steve Ryan, a former deputy director of NSA’s Threat Operations Center who coauthored the aforementioned white paper.
“We set out to do something big and bold. We created classified capabilities that were largely tuned to interfere with cyber outside the network,” said Ryan, who is now the CEO and co-founder of Trinity Cyber. “We were leveraging our knowledge of the adversaries.”
Ryan said his team fielded a pilot by the end of calendar year 2005 and presented it to Alexander. By December 2008, the capability was protecting all of DoD.
Generally speaking, the capability is focused on deep packet inspection and the ability to reroute traffic that is potentially a threat to the network. The tool would stop remote code execution and find malicious software to make it more difficult for hackers to get inside an organization’s network. Some experts said capabilities like this, especially now 13 years later, would have limited the impact of the SolarWinds attack.
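The core idea of signature-based deep packet inspection can be sketched in a few lines. This is an illustrative toy only, not the NSA or Trinity Cyber implementation; the signatures and payloads below are hypothetical:

```python
# Hedged sketch of signature-based deep packet inspection: scan each
# payload for known-bad byte patterns and divert matches for analysis.
# Signatures and payloads here are hypothetical examples.

KNOWN_BAD_SIGNATURES = {
    b"\x4d\x5a\x90\x00": "suspicious PE executable header",  # hypothetical rule
    b"cmd.exe /c": "possible remote code execution attempt",  # hypothetical rule
}

def inspect_packet(payload: bytes):
    """Return (verdict, reason) for a single packet payload."""
    for signature, description in KNOWN_BAD_SIGNATURES.items():
        if signature in payload:
            return "reroute", description  # divert for deeper inspection
    return "forward", None                 # pass clean traffic through

verdict, reason = inspect_packet(b"GET /dl?run=cmd.exe /c whoami HTTP/1.1")
print(verdict, reason)
```

A real inline system would also reassemble TCP streams and decrypt TLS before matching, which is what makes “break-and-inspect” far harder than this sketch suggests.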
As DoD rolled out the capability, NSA started talking to the Department of Homeland Security about adding the technology to the EINSTEIN program.
John Felker, a former Coast Guard and DHS cyber official, said NSA was set to implement a version of these capabilities in EINSTEIN.
“They got all the way to be ready to pull the trigger and the deputy secretary at the time decided to stop it, and decided that DHS could do something similar on their own,” said Felker, who now is president of Morse Alpha Associates. “That was unfortunate. There was an idea that DHS could do it themselves and folks were selling a program that would have a positive impact, but they oversold it. That may be a reason so many people don’t understand what EINSTEIN is or was supposed to do.”
That lack of understanding of what the EINSTEIN tools are and are not came to a head over the last few months as lawmakers and misinformed “experts” questioned why the more than $1 billion investment over the last 16 years didn’t stop the SolarWinds attack.
The fact is the EINSTEIN program was never designed to stop SolarWinds or the Microsoft Exchange hack or even the Office of Personnel Management hack.
Current and former DHS and White House officials said the investment in the EINSTEIN tools made agencies safer and met the initial goals laid out in 2004: To stop known attack vectors or signatures through intrusion detection and prevention tools.
“There seems to be the misconception that EINSTEIN should block every sophisticated cyber threat. Unfortunately, that is a false narrative,” Matt Hartman, the deputy executive assistant director for cyber at the Cybersecurity and Infrastructure Security Agency, said in an interview with Federal News Network. He called EINSTEIN just one component of a layered defense.
“It’s a key piece and its success relies on information provided by commercial and intelligence community partners,” he said. “But it’s not going to pick up a novel supply chain attack that was designed for many months and executed in a matter of hours. For that reason, it must be complemented with other tools like those through the continuous diagnostics and mitigation (CDM) program and through cybersecurity shared services.”
Understanding those layered defense concepts became more critical with the new cybersecurity executive order that President Joe Biden signed May 13. With dozens of new and expanded initiatives, lawmakers and agency leaders should heed the lessons learned from EINSTEIN and other governmentwide cyber programs: the need for flexible, iterative tools and capabilities. The White House needs to break down the DoD, intelligence community and civilian agency silos by abandoning the old “not invented here” mindset.
Karen Evans, the former administrator of e-government and IT at the Office of Management and Budget and former DHS chief information officer, said the combination of EINSTEIN, CDM, Automated Indicator Sharing (AIS) and other capabilities were laid out in the Comprehensive National Cybersecurity Initiative (CNCI) in 2008 during the waning days of the Bush administration.
“Our goal was to connect the security operations center initiatives across government,” Evans said. “DoD, NSA, DHS and others were supposed to bring them all together.”
Evans is referencing paragraph 39 of the CNCI that called for a whole-of-government incident response capability.
Of course that never happened, so tools like EINSTEIN were left to fend for themselves.
A former DHS cybersecurity official, who requested anonymity because they didn’t get permission to speak to the press from their current private sector job, said a common choke point for EINSTEIN was the inability of DHS to get consent from agencies to monitor their internet traffic.
“Even once consent was reached, then it took time to schedule the cut overs onto the services. EINSTEIN versions 1 and 2 were easy. E3A was complex because it was inline and blocking. However, once those legal, privacy and technical hurdles were crossed, the later onboarding of agencies could move rapidly,” the former official said. “The big gap during my time was the lack of internal monitoring data – security alerts from applications, servers, desktops and other endpoints. EINSTEIN can only see what data it is receiving. Cyber is about the whole picture. I don’t think the question is about EINSTEIN as much as it is about whether the National Cybersecurity and Communications Integration Center (NCCIC) and U.S. Computer Emergency Readiness Team (CERT) are authorized and able to receive a much more comprehensive set of security event data.”
Felker, the former DHS and Coast Guard cyber official, said creating a signature for a malicious code, testing that signature and putting it into EINSTEIN is not as easy as some would like to think.
“We’ve had this before where signatures ended up blocking mission critical activities even with testing. That made signatures challenging,” he said. “Add to that the fact that agency networks are not homogeneous, and it becomes a balancing act and a risk management act.”
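The testing step Felker describes can be approximated by replaying a candidate signature against a corpus of known-good traffic before deployment, to estimate how much legitimate activity it would block. A minimal sketch, with a hypothetical benign corpus:

```python
# Hedged sketch: estimate a candidate signature's false-positive rate
# against known-good traffic before pushing it into a blocking tool.
# The corpus below is a hypothetical stand-in for real agency traffic.

def false_positive_rate(signature: bytes, benign_corpus: list[bytes]) -> float:
    """Fraction of known-good payloads a candidate signature would block."""
    if not benign_corpus:
        return 0.0
    hits = sum(1 for payload in benign_corpus if signature in payload)
    return hits / len(benign_corpus)

benign = [b"POST /payroll/submit", b"GET /status.cgi", b"POST /cgi-bin/report"]

# An overly broad signature matches benign traffic and should be rejected;
# a narrower one does not.
print(false_positive_rate(b"cgi", benign))          # matches 2 of 3 samples
print(false_positive_rate(b"/etc/passwd", benign))  # matches none
```

Because agency networks are not homogeneous, a signature that tests clean against one agency’s corpus can still break mission-critical traffic at another, which is the risk-management act Felker describes.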
Once again, another executive order will try to break down those systemic barriers that prevented the NSA from adding its capability to EINSTEIN. Congress and other administrations have attempted to do this through policy and laws, but few have succeeded in making real progress.
CISA’s Hartman said it’s clear no entity can know everything. This was never more true than with SolarWinds, an instance in which FireEye alerted the intelligence community, CISA and others about the attack.
“EINSTEIN is only as good as the information it is receiving, from both the intelligence community and commercial partners, to enable partners to build and load signatures into the system that can detect or prevent similar attack techniques,” he said. “There is a capability that is part of EINSTEIN 3A known as the logical response aperture. This capability was developed as an initial attempt to use artificial intelligence and machine learning techniques with network-based data to identify suspicious or malicious data without prior intelligence, which could then be deployed in a signature-based detection and prevention capability. It is now deployed at two internet service provider locations within the National Cybersecurity Protection System EINSTEIN 3 architecture. It’s been a valuable analytics platform, and, quite frankly, it is limited in its ability to detect verifiable malicious, network-based activity.”
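The idea behind detection without prior intelligence, as Hartman describes it, is to baseline normal behavior and flag sharp deviations rather than match known signatures. The actual logical response aperture capability is not public; this is only a minimal statistical sketch of the concept, with hypothetical traffic numbers:

```python
# Hedged sketch of anomaly detection without signatures: baseline normal
# behavior statistically and flag observations that deviate sharply.
# Not the actual "logical response aperture" capability, which is not public.
import statistics

def is_anomalous(history: list[float], observation: float, threshold: float = 3.0) -> bool:
    """Flag an observation more than `threshold` standard deviations from baseline."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against a flat baseline
    return abs(observation - mean) / stdev > threshold

# Hypothetical: kilobytes per minute leaving a host over the last ten minutes.
baseline = [120.0, 115.0, 130.0, 125.0, 118.0, 122.0, 128.0, 119.0, 124.0, 121.0]

print(is_anomalous(baseline, 126.0))   # ordinary fluctuation
print(is_anomalous(baseline, 5000.0))  # exfiltration-sized spike
```

The limitation Hartman concedes follows directly from this design: statistical outliers are suspicious, not verifiably malicious, so analysts still have to confirm each flag.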
Hartman said CISA’s plan is to focus this capability as a component of its analytical environment, providing one more toolset to review and determine potential and real threats.
Experts say what Hartman is describing is more about how CISA is changing than any one tool or capability.
Evans said CISA is a service provider with authorities from Congress and OMB to manage the results of the technology versus managing the tools themselves.
“This is a culture change. What Congress is asking CISA is ‘what do you need?’ and holding the DHS secretary accountable for delivering results,” she said. “The question that Congress and OMB have to answer is how far they want CISA to go to enforce and manage federal networks. That is the question.”
Tom Bossert, the former Homeland Security advisor to President Donald Trump and now president of Trinity Cyber, said there are new capabilities, similar to the one NSA implemented so many years ago, that could provide greater benefit to agencies.
“[Expanding NSA’s tools] wasn’t necessarily a missed opportunity by government or the private sector, but a reflection of where we stand today. We have mismatched capabilities and defenders have not made the necessary changes as offenders are far more nimble. There are major developments in how we access networks, the diversification of edge and cloud services and a significant amount of innovative technology that could be applied in a different way to prevent cyber breaches,” Bossert said. “We must find happy medium ground within our collective cyber industry. There is a resistance to innovation and there is a strong risk aversion to change because we are worried about unintended consequences.”
Bossert added the latest cyber attacks have shown there is better interagency coordination and clarity of purpose, and that must continue as the threats evolve.
CISA’s Hartman said he believes that EINSTEIN has met its original intent and much more. The capability routinely finds instances of anomalous activity that are confirmed and stopped.
“We are constantly modernizing our portfolio of capabilities,” he said. “We are thinking about EINSTEIN, CDM, the cyber quality service management office (QSMO) and how to evolve all of those capabilities. The evolution is underway, and it will accelerate in the coming months as [a] result of new authorities under [the] 2021 defense authorization act and the funding from the American Rescue Plan Act.”
With the release last week of new guidance for agencies to submit proposals for some of the $1 billion in the Technology Modernization Fund, Federal Chief Information Officer Clare Martorana issued her first major policy decision, one whose implementation her fellow CIOs, industry and especially Congress will be watching closely.
“I have called on the TMF board to prioritize modernization proposals that replace legacy IT systems that waste millions of taxpayer dollars each year, as well as threaten our government’s cybersecurity and fail to provide the customer service experience that the American taxpayer deserves,” said Sen. Maggie Hassan (D-N.H.), chairwoman of the Homeland Security and Governmental Affairs Subcommittee on emerging threats and spending oversight, in a statement to Federal News Network. “This announcement is a positive step forward that will help encourage more agencies to take advantage of the Technology Modernization Fund and replace their costly legacy IT systems to ensure that agencies can meet the needs of the 21st century.”
Congress handed Martorana, who became Federal CIO in March, the golden IT modernization ticket, something the men and women who previously held her position didn’t have, but desperately wanted.
Martorana offered some insights through email responses to Federal News Network questions about the TMF guidance, the board’s plans and why agencies should apply for the additional money.
Federal News Network: How will the repayment model work in terms of deciding which model makes the most sense?
Clare Martorana: As agency CIOs, CFOs and leadership partner together to submit projects to the TMF, they will determine the level of repayment most appropriate to request for each project, based on the agency’s financial posture and risk management strategy. Additionally, the TMF board will decide based on the new model outlined today, which factors in project prioritization and repayment flexibility.
Federal News Network: How do you think the changes make applying for the TMF more appetizing for agencies?
Clare Martorana: The repayment flexibility model was designed to make it easier for agencies to access the $1 billion appropriated to the TMF to meet the urgent cybersecurity and IT modernization demands we face. Now, agencies can apply for projects that previously were out of their reach.
Federal News Network: Please comment on the four priority investment areas: High priority systems, cyber, public facing and shared services — how did you choose those four?
Clare Martorana: These four priority investments areas were chosen because these categories are what typically make up an agency’s IT portfolio.
Federal News Network: Is it up to agencies to make their case to the TMF board for what is a high priority system or public facing system?
Clare Martorana: Agencies are encouraged to submit their proposal to the board because they are their own best advocate. No one knows their IT portfolio and mission better than them, therefore they are best positioned to make the case and explain their proposals to the board.
Federal News Network: What will be the TMF board’s biggest challenge to get the money out the door? Time, people resources?
Clare Martorana: This new model is designed to remove those challenges. We anticipate an influx of initial project proposals, but the TMF board and program management office (PMO) are prepared and ready to add PMO resources as needed.
The idea that Martorana espoused, that the expanded TMF lets agencies apply for funding for projects that were previously out of reach, is both the biggest opportunity and the biggest challenge.
Gordon Bitko, a former FBI CIO and now senior vice president for policy at the IT Industry Council, said there is a sense of urgency to get the money out the door, but it will require a lot of thought, too.
“I think we will see agencies that are well positioned because they have modernization plans that are already in flight, and this is an opportunity to tweak or accelerate them,” he said. “We will also see others who haven’t really been as effective in thinking about modernization and may be starting from scratch to get proposals to the TMF board.”
He said another real challenge is the five agencies whose political CIOs haven’t been named yet. The acting CIOs may be hesitant to commit to taking on a “loan” without the backing of a permanent leader.
Matt Cornelius, the executive director of the Alliance for Digital Innovation, an industry association, and a former senior technology and cybersecurity advisor at OMB, said in an email to Federal News Network that the TMF program management office shouldn’t wait to expand its staff because the proposals will be coming fast and furious.
“The board must have access to the best information possible to guide their decision making. This means they can’t just meet periodically and look at the paper forms submitted for their consideration. They must actively — as a body — work with industry, Congress, agencies and any other participants to get a better understanding of the real investment opportunities throughout the government and constantly work to refine and improve their bureaucratic processes to ensure that the most dollars get to the best projects, and produce the best results,” Cornelius said. “They need to communicate the successes of any projects they have funded very aggressively, rather than the way the previous administration operated, which was to say very little about the actual outcomes of any TMF investments — a major missed opportunity. Success begets success.”
Mike Hettinger, president and founding principal of Hettinger Strategy Group and a former Hill staff member, said quickly increasing the staff and resources of the PMO would be an important signal to Congress.
“I think it’s primarily a staff bandwidth issue. They just don’t have the bandwidth right now to effectively review the influx of expected proposals. When we were talking to the Hill about this we encouraged them to set aside a percentage of the $1 billion to ensure they had the administrative resources necessary to properly administer the fund,” he said. “The other hang up is likely to be the quality of proposals. The better the proposal, the easier it’ll be to get them approved. From that standpoint, the June 2 date is a tight turnaround.”
Bitko agreed that the board should expand not only its staff but how they work with industry and the instructions it gives to agencies.
“I’d like to see more specificity about a number of things, including how the repayment terms will be modified, what constitutes a priority area, what do those look like and how will the board evaluate the proposals? Those are important steps to give agencies in the guidance,” he said.
The challenge of getting the money out the door isn’t a short term effort either. OMB’s guidance said it would give expedited consideration to proposals from agencies submitted by June 2. But the board also will be reviewing proposals on a rolling basis until the money runs out.
Dave Wennergren, the CEO of the American Council for Technology and Industry Advisory Council (ACT-IAC) and a former Defense Department deputy CIO and vice chairman of the CIO Council, said investments must be made and measurable outcomes achieved quickly, or there won’t be any appetite for future significant additional funding.
“It will be crucial that the process for identifying, approving and successfully implementing projects that use the funds be accelerated and streamlined,” he said. “With less than $100 million cumulatively awarded in TMF projects, the process must be optimized to ensure the timely flow of funds for new projects.”
Almost five years after launching the Transactional Data Reporting (TDR) pilot, the General Services Administration is reporting success.
Jeff Koses, GSA’s senior procurement executive, said the TDR pilot proved its value and showed it is a worthy replacement for the dreaded Price Reduction Clause (PRC).
“GSA has successfully demonstrated the value of TDR under the existing scope of the pilot. It has shown steady progress over the past four years, met most of the pilot’s objectives in the most recent year, and has made the necessary investments to leverage TDR’s potential in the years to come,” Koses wrote in an April 27 blog post. “We will continue to make improvements, especially in contracting officer usage.”
But multiple sources said that although TDR may work on paper, the reality is it’s unclear if any contracting officers are using the data to make decisions or even if the data is valuable enough for acquisition professionals.
“I have not experienced any negotiations based on TDR data in order to form an opinion,” said an industry expert, who requested anonymity because they work closely with GSA. “My clients who are TDR-covered have only been subject to contract-level price comparisons.”
Other sources said GSA’s three-year pilot proved more that it’s possible to use different and possibly better data to make price determinations, but the data is incomplete at best.
One source said customer agencies were hesitant to provide GSA with data on how much they paid for products and services. These issues may come to light in a new GSA inspector general report on the TDR pilot, expected in the coming weeks.
Koses wrote in the blog post that GSA didn’t use the data in 2019 and made no mention of using the TDR information in 2020. He wrote:
“Looking at historical data, the pilot’s overall performance based upon a documented evaluation plan showed steady progress. This includes:
While industry experts said the results of the TDR pilot and their individual experiences led them to see the effort in a new light, GSA still needs to be more transparent about the data and how it’s being used.
Larry Allen, president of Allen Federal Business Partners and a long-time GSA observer, was an outspoken critic of TDR when it started, writing a column with the headline, “Run, don’t walk from Transactional Data Reporting rule.”
But Allen said now he has come around on TDR.
“I think TDR has grown into a viable option for many businesses. I will admit to having been a skeptic, but I think that, so long as a company keeps good records on what they provide to their GSA contracting officer, TDR can be a good way to obtain a schedule contract for some companies that otherwise might not be able to,” Allen said in an email to Federal News Network. “GSA may have even been a little ahead of the curve here in terms of TDR attracting non-traditional contractors, something that is very sought after now in the Defense Department and even civilian agencies.”
Other experts said it may be time for TDR to move on from the pilot stage and into real production.
Alan Thomas, the former commissioner of the Federal Acquisition Service and now chief operating officer at IntelliBridge, said it seems TDR is bearing fruit, but how much is unclear.
“The claims in the blog post are great but will be even more compelling with accompanying data,” he said in an email to Federal News Network. “Stepping back from the pilot and thinking about full implementation, GSA is going to need a strategy for working with the inspector general on TDR. The IG has a lot invested in the current price reduction clause regime and won’t move away easily from it.”
The IG’s acceptance and use of TDR remains a huge challenge. Auditors regularly issue audit reports that highlight problems with GSA’s pre-award and post-award audit efforts. In April 2020, the IG issued a report that said GSA may have missed out on potentially $1.1 billion in savings through pre-award audits of prices.
Jennifer Aubel, a principal consultant with Aronson, a consulting firm, wrote in a July 2019 blog post that as more companies participate in TDR, the IG’s ability to audit prices before an award is made is becoming harder.
“Under the TDR pilot, the population of auditable contracts has ostensibly been cut in half. When you remove the major resellers and integrators, what remains are largely professional service contractors and products companies under Schedules 84 (Law Enforcement), 71 (Furniture), and 66 (Scientific),” Aubel wrote. “The audit threshold for annual sales is also reduced due to the smaller pool of contracts from which the OIG is selecting. Small businesses who would have never been a blip on the OIG’s radar are now at much higher risk of pre-award audit.”
Aubel wrote that GSA’s systems, in 2019, were not equipped to produce a “dynamic market driven pricing model.”
Now almost two years later, it’s unclear if the systems are able to provide dynamic pricing, and more companies have moved to TDR, making the PRC less valuable.
Another complicating factor for the IG in moving away from the price reduction clause is that it has invested heavily in people and processes to perform those audits, so refocusing those resources would be a challenge.
Thomas said the biggest issues with TDR data when he was FAS commissioner were quality and access, and both of these have been addressed.
“We put a small team of FAS leaders (Stephanie Shutt, Judith Zawatsky and Mark Lee) working with Jeff’s team in OGP to improve data quality and broaden access to the data. As a result, I believe the data being collected is now cleaner. Vendors are better trained on how to enter it and are more comfortable with the reporting regime,” Thomas said. “Likewise, regarding access, GSA erred initially on the side of caution. Pricing data is sensitive, but there were only a handful of people who had access to the TDR data. If you are a contracting officer, you can’t use what you don’t have! The team developed and implemented a reasonable set of safeguards that enabled more people to access TDR data.”
Thomas added GSA’s next steps should be to get the data in the hands of contracting officers, who can use the pricing information along with analytic tools and process automation software to make more strategic decisions and improve mission success.
One complicating factor in all of this is GSA’s move toward unpriced contracts under Section 876 of the 2018 National Defense Authorization Act.
This makes the price reduction clause and even TDR less necessary because it puts the burden on vendors to provide the lowest prices as part of contract negotiations.
Industry sources said GSA also can’t move away from TDR easily because of the investment companies have made into the systems to collect the data.
GSA’s Koses wrote that now, with five years of the pilot behind it, GSA will train contracting officers on the benefits of having access to more granular prices-paid information and support those efforts with management guidance, as necessary.
He said GSA will also refine and consider:
Thomas said vendors need to keep these changes in mind and potentially invest in back-office systems to more easily collect and report pricing data.
“When this is the case, vendors need to make sure the government understands the art of the possible with current systems and, if an investment is required, what’s the size and scope of that investment,” he said. “Sometimes the government asks for things and doesn’t understand the full implication of the ‘ask.’ Leaders at GSA are reasonable but not clairvoyant, so keep an open line of communication.”