Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

How CISA limited the impact of the SolarWinds attack

Soon after the specifics about the SolarWinds attack came to light, the Department of Homeland Security went to work to limit the damage.

Among the first things it did was put the attack signatures into the EINSTEIN toolset that is used by nearly every agency.

“As part of the SolarWinds campaign, EINSTEIN was extremely useful in terms of identifying suspicious network traffic from a handful of federal civilian agencies that upon further investigation by those agencies helped identify additional victims of this campaign. It’s worth noting that EINSTEIN didn’t prevent the intrusion nor was it able to detect the intrusion until, in this case, we received threat information from private sector partners to inform our detection and prevention mechanisms,” said Matt Hartman, the deputy executive assistant director for cyber at CISA, in an interview with Federal News Network. “As soon as CISA received indicators of this activity from industry partners, we immediately leveraged EINSTEIN to identify and notify agencies of anomalous activity on their networks, which helped accelerate response, remediation and recovery activities.”

Hartman said it also helped CISA as part of the Unified Coordination Group to provide asset response and remediation of the attacks.

“Without EINSTEIN, we may have departments today that still did not know they were victims of this campaign. Using the EINSTEIN 1 NetFlow capability to — after the fact — look at indicators and identify potential indications of compromise has proven extremely useful,” he said. “This is just one example over the last few months of CISA being alerted via EINSTEIN of a potential compromise of a federal agency’s network. We are consistently flagging this sort of anomalous activity to agencies, which then kicks off further investigation and incident response activity, as appropriate.”

Hartman said EINSTEIN provided insights into specific indicators or call-outs to internet protocol addresses or domains that were known to be part of this campaign at other agencies or the private sector.

He added EINSTEIN helped confirm to multiple agencies that they were victims of the SolarWinds attack.
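
To make that retrospective workflow concrete, here is a minimal sketch in Python of matching stored NetFlow-style records against newly received indicators of compromise. The record format, field names and indicator values are illustrative assumptions for this column, not EINSTEIN’s actual design.

    # Illustrative sketch only -- not EINSTEIN's actual implementation.
    # Once new indicators arrive from partners, replay stored flow
    # records against them to find past connections to known-bad
    # infrastructure, then group the hits by agency for notification.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class FlowRecord:                 # hypothetical NetFlow-style record
        timestamp: datetime
        src_ip: str
        dst_ip: str
        dst_port: int
        agency: str

    def find_possible_compromises(flows, bad_ips):
        """Return flows whose destination matches a known-bad indicator."""
        hits = [f for f in flows if f.dst_ip in bad_ips]
        by_agency = {}
        for f in hits:
            by_agency.setdefault(f.agency, []).append(f)
        return by_agency

    # Indicators received "after the fact" from industry partners
    # (documentation-range IPs, used here as placeholders).
    bad_ips = {"203.0.113.7", "198.51.100.23"}
    flows = [
        FlowRecord(datetime(2020, 6, 1), "10.1.2.3", "203.0.113.7", 443, "Agency A"),
        FlowRecord(datetime(2020, 6, 2), "10.9.8.7", "192.0.2.10", 80, "Agency B"),
    ]
    for agency, matches in find_possible_compromises(flows, bad_ips).items():
        print(f"{agency}: {len(matches)} suspicious flow(s) to notify about")

The point is the sequence Hartman describes: the indicators arrive after the intrusion, and the stored flow data makes it possible to look backward and notify the affected agencies.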

The value EINSTEIN demonstrated during SolarWinds is overlooked by many in the federal community. Part of the reason is the Homeland Security Department’s poor communication and lack of transparency about EINSTEIN’s capabilities over the last 15 years.

Limitations well known

Suzanne Spaulding, the former undersecretary of the National Protection and Programs Directorate at DHS and now the senior adviser for homeland security and director of the Defending Democratic Institutions project at the Center for Strategic and International Studies, said NPPD, now known as CISA, could’ve done a better job educating the public, Congress and the media about what EINSTEIN was designed to do.

“I can remember these conversations post-OPM breach about what EINSTEIN allowed us to do once we detected malicious activity. We took the signature information and loaded it into EINSTEIN. It provided protection to other agencies who deployed EINSTEIN. It was valuable in that sense,” she said. “I remember having these conversations with folks on the Hill. It missed the initial attack because it was something we hadn’t seen before, but the important value of EINSTEIN was that it prevented the same attack from being used against others.”

Spaulding and other former DHS cyber officials readily admit EINSTEIN’s limitations can be frustrating. Among the complaints about EINSTEIN over the years is that it is only reactive to known problems and doesn’t help agencies address threats in real time.

Additionally, CISA has been slow to evolve the tools and capabilities in EINSTEIN. The technology that EINSTEIN uses doesn’t work well with cloud services and causes latency in networks.

For example, E2, a network intrusion detection tool that looks for malicious or potentially harmful network activity, has become less effective over the last few years because so much of the data it sees is encrypted.

Hartman said CISA recognizes those shortcomings and is trying to move E2 toward the end points.

“We have been working on a pilot or proof of concept for the better part of 18 months now. We’ve seen some great successes, really pairing CISA’s threat hunting analysts, who have an intelligence-driven focus, with the agency security operations center analysts, who have a tremendously rich understanding and context of their environments, to help rapidly detect anomalous activity and potentially malicious activity at the end point to include lateral movement,” he said.

Future of cyber at the end points

Tom Bossert, the former homeland security advisor to President Donald Trump and now president of Trinity Cyber, said he was encouraged by CISA moving its tools closer to the end points and to cloud environments.

He said agencies understood the push to remote work over the last 15 months created a broader attack surface, but only now do they realize they have to do more to protect their employees, data and applications.

“The future of cybersecurity is a different architecture. We put active sensors at the internet-facing edge of the network where a department or company connects to the internet, and that usually goes through things like firewalls or intrusion detection and prevention tools,” he said. “But if you aren’t going through a central access point and going straight to the cloud, your protections are more limited. The only thing that is preventing the agency or company from being hacked is that web gateway, which isn’t necessarily that good. If a hacker accesses a machine, that is a pathway to cloud services. Any internet-facing access should run through a break-and-inspect type of service that fully interrogates all traffic. You have to decrypt it first, but if you apply that standard you will end up with a better option to protect your networks.”
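
The break-and-inspect service Bossert describes is, in essence, a proxy an organization runs against its own traffic: terminate the encryption, interrogate the decrypted content, then pass it along or block it. As a rough sketch only, the following Python addon for the open-source mitmproxy tool shows the shape of the idea; the blocked host and payload marker are invented placeholders, and a real deployment would need certificate management, policy and privacy controls this sketch ignores.

    # Minimal break-and-inspect sketch using the open-source mitmproxy
    # addon API (run with: mitmproxy -s inspect_addon.py). The proxy
    # terminates TLS, so both hooks below see decrypted traffic.

    from mitmproxy import http

    BLOCKED_HOSTS = {"malicious.example.com"}  # hypothetical indicator
    SUSPICIOUS_BYTES = b"TVqQ"  # base64 of an executable's 'MZ' header

    def request(flow: http.HTTPFlow) -> None:
        # Block outbound requests to known-bad infrastructure.
        if flow.request.pretty_host in BLOCKED_HOSTS:
            flow.response = http.Response.make(
                403, b"Blocked by inspection policy",
                {"Content-Type": "text/plain"},
            )

    def response(flow: http.HTTPFlow) -> None:
        # Interrogate the content itself, not just the headers,
        # before it ever reaches the user's machine.
        if flow.response.content and SUSPICIOUS_BYTES in flow.response.content:
            flow.response = http.Response.make(
                403, b"Payload failed content inspection",
                {"Content-Type": "text/plain"},
            )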

Despite EINSTEIN’s limitations over the years, cyber experts agree agencies wouldn’t be as safe without it.

“The value of EINSTEIN is old exploits and signatures are still valuable if you are not looking for them,” said Matt Coose, the former director of Federal Network Security Branch at DHS and now CEO and founder of Qmulos. “The real goal is to move the timelines to the left so we get updated signatures more often so we can detect what is going on more quickly. Without the early warnings, we still are in reactive and monitoring mode against old threats.”

Spaulding added that because 98% of the user population across civilian agencies is covered by EINSTEIN, the program gives CISA a level of visibility into what’s happening on agency networks that is essential for other cyber tools and activities to work well.

CISA fully recognizes that EINSTEIN needs to change more quickly, especially as remote work remains this widespread.

“We are evolving all our core programs and capabilities to provide the protections at the network, at the host levels and anywhere else we can secure the civilian enterprise while increasing CISA’s ability to rapidly detect threats,” Hartman said. “We also are working with OMB and the interagency to drive toward more sophisticated architectures, including zero trust concepts that are focused on identifying and securing the federal government’s highest value data.”


CISA’s EINSTEIN had a chance to be great, but it’s more than good enough

Back in 2005, the head of the National Security Agency broke out his red marker and circled a section of a white paper written by cybersecurity experts and gave them a two-month deadline to bring this idea to bear.

The concept the experts detailed to Gen. Keith Alexander would let NSA use technology to identify adversary tradecraft in flight, outside the wire so to speak, and treat it as a network problem.

Alexander thought the technology would be a game changer — maybe not a silver bullet — but something that would give the Defense Department a head start against ever-increasing threats before they made their way into the network.

Now, 16 years later, experts say this type of technology would’ve gone far to prevent, or at least limit, the damage from most of the major cyber breaches federal agencies have suffered since 2005.

“It was a big deal because we brought intelligence and defensive folks under one roof. The results were profound. We created a rich contextual threat intelligence about what adversaries were doing to DoD and we used it to warn incident responders and others,” said Steve Ryan, a former deputy director of NSA’s Threat Operations Center who coauthored the aforementioned white paper.

“We set out to do something big and bold. We created classified capabilities that were largely tuned to interfere with cyber outside the network,” said Ryan, who is now the CEO and co-founder of Trinity Cyber. “We were leveraging our knowledge of the adversaries.”

Ryan said his team fielded a pilot by the end of calendar year 2005 and presented it to Alexander. By December 2008, the capability was protecting all of DoD.

Generally speaking, the capability is focused on deep packet inspection and the ability to reroute traffic that is potentially a threat to the network. The tool would stop remote code execution and find malicious software to make it more difficult for hackers to get inside an organization’s network. Some experts said capabilities like this, especially now 13 years later, would have limited the impact of the SolarWinds attack.
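
As a generic illustration of the deep packet inspection concept, and assuming nothing about the classified capability itself, the Python sketch below uses the open-source scapy library to watch traffic and flag packets whose payloads match known-bad byte signatures. An inline system of the kind Ryan describes would sit in the traffic path and reroute or neutralize matching flows rather than merely alert, as this passive sketch does.

    # Passive signature-based deep packet inspection sketch using the
    # open-source scapy library (needs root privileges to sniff).
    # The signatures are invented placeholders, not real tradecraft.

    from scapy.all import IP, TCP, Raw, sniff  # pip install scapy

    SIGNATURES = {
        b"\x4d\x5a\x90\x00": "possible Windows executable in transit",
        b"cmd.exe /c":       "possible remote command execution",
    }

    def inspect(pkt):
        # Only TCP packets that actually carry a payload are inspected.
        if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw):
            payload = bytes(pkt[Raw].load)
            for sig, label in SIGNATURES.items():
                if sig in payload:
                    # An inline system would reroute or neutralize the
                    # flow here; a passive observer can only alert.
                    print(f"ALERT: {label} ({pkt[IP].src} -> {pkt[IP].dst})")

    sniff(filter="tcp", prn=inspect, store=False)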

Stopped in its tracks

As DoD rolled out the capability, NSA started talking to the Department of Homeland Security about adding the technology to the EINSTEIN program.

John Felker, a former Coast Guard and DHS cyber official, said NSA was set to implement a version of these capabilities in EINSTEIN.

“They got all the way to be ready to pull the trigger and the deputy secretary at the time decided to stop it, and decided that DHS could do something similar on their own,” said Felker, who now is president of Morse Alpha Associates. “That was unfortunate. There was an idea that DHS could do it themselves and folks were selling a program that would have a positive impact, but they oversold it. That may be a reason so many people don’t understand what EINSTEIN is or was supposed to do.”

That lack of understanding of what the EINSTEIN tools are and are not came to a head over the last few months as lawmakers and misinformed “experts” questioned why the more than $1 billion investment over the last 16 years didn’t stop the SolarWinds attack.

The fact is the EINSTEIN program was never designed to stop SolarWinds or the Microsoft Exchange hack or even the Office of Personnel Management hack.

Current and former DHS and White House officials said the investment in the EINSTEIN tools made agencies safer and met the initial goals laid out in 2004: To stop known attack vectors or signatures through intrusion detection and prevention tools.

“There seems to be the misconception that EINSTEIN should block every sophisticated cyber threat. Unfortunately, that is a false narrative,” Matt Hartman, the deputy executive assistant director for cyber at the Cybersecurity and Infrastructure Security Agency, said in an interview with Federal News Network. He called EINSTEIN just one component of a layered defense.

“It’s a key piece and its success relies on information provided by commercial and intelligence community partners,” he said. “But it’s not going to pick up a novel supply chain attack that was designed for many months and executed in a matter of hours. For that reason, it must be complemented with other tools like those through the continuous diagnostics and mitigation (CDM) program and through cybersecurity shared services.”

Understanding those layered defense concepts became more critical with the new cybersecurity executive order that President Joe Biden signed May 13. With dozens of new and expanded initiatives, lawmakers and agency leaders should heed the lessons learned from EINSTEIN and other governmentwide cyber programs: The need for flexible, iterative tools and capabilities. The White House needs to break down the DoD, intelligence community and civilian agency silos by not adhering to the old, “not invented here” mindset.

Common choke points

Karen Evans, the former administrator of e-government and IT at the Office of Management and Budget and former DHS chief information officer, said the combination of EINSTEIN, CDM, Automated Indicator Sharing (AIS) and other capabilities were laid out in the Comprehensive National Cybersecurity Initiative (CNCI) in 2008 during the waning days of the Bush administration.

“Our goal was to connect the security operations center initiatives across government,” Evans said. “DoD, NSA, DHS and others were supposed to bring them altogether.”

Evans is referencing paragraph 39 of the CNCI that called for a whole-of-government incident response capability.

Of course that never happened, so tools like EINSTEIN were left to fend for themselves.

A former DHS cybersecurity official, who requested anonymity because they didn’t get permission to speak to the press from their current private sector job, said a common choke point for EINSTEIN was the inability of DHS to get consent from agencies to monitor their internet traffic.

“Even once consent was reached, then it took time to schedule the cut overs onto the services. EINSTEIN versions 1 and 2 were easy. E3A was complex because it was inline and blocking. However, once those legal, privacy and technical hurdles were crossed, the later onboarding of agencies could move rapidly,” the former official said. “The big gap during my time was the lack of internal monitoring data – security alerts from applications, servers, desktops and other endpoints. EINSTEIN can only see what data it is receiving. Cyber is about the whole picture. I don’t think the question is about EINSTEIN as much as it is about whether the National Cybersecurity and Communications Integration Center (NCCIC) and U.S. Computer Emergency Readiness Team (CERT) are authorized and able to receive a much more comprehensive set of security event data.”

Felker, the former DHS and Coast Guard cyber official, said creating a signature for a malicious code, testing that signature and putting it into EINSTEIN is not as easy as some would like to think.

“We’ve had this before where signatures ended up blocking mission critical activities even with testing. That made signatures challenging,” he said. “Add to that the fact that agency networks are not homogenous, it becomes a balancing act and a risk management act.”

Testing new capabilities

Once again, another executive order will try to break down the systemic barriers that prevented the NSA from adding its capability to EINSTEIN. Congress and past administrations have attempted to do this through policy and laws, but few have succeeded in making real progress.

CISA’s Hartman said it’s clear no entity can know everything. This was never more true than with SolarWinds, an instance in which FireEye alerted the intelligence community, CISA and others about the attack.

“EINSTEIN is only as good as the information it is receiving, both from the intelligence community as well as from commercial partners, to enable partners to build and load signatures into the system that can detect or prevent similar attack techniques,” he said. “There is a capability that is a part of EINSTEIN 3A, that is known as logical response aperture. This capability was developed as an initial attempt to utilize artificial intelligence and machine learning techniques with network-based data in an attempt to identify suspicious malicious data without prior intelligence that could be deployed in a signature-based detection and prevention capability. This is now deployed at two internet service provider locations within the National Cybersecurity Protection System EINSTEIN 3 architecture. It’s been a valuable analytics platform, and, quite frankly, it is limited in its ability to detect verifiable malicious, network-based activity.”
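
Signature-less detection of the kind Hartman describes is broadly the territory of unsupervised anomaly detection. The sketch below, which trains scikit-learn’s IsolationForest on made-up network-flow features, is a generic example of that class of technique, not the actual logical response aperture capability. It also hints at why Hartman calls the approach limited: statistical outliers are not necessarily verifiable malicious activity.

    # Generic illustration of signature-less anomaly detection on
    # network-flow features -- in the spirit of, but not actually,
    # the "logical response aperture" capability described above.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Synthetic baseline: [bytes sent, duration (s), distinct ports].
    normal = np.column_stack([
        rng.normal(5_000, 1_500, 1_000),  # typical transfer sizes
        rng.normal(30, 10, 1_000),        # typical session lengths
        rng.integers(1, 4, 1_000),        # few ports per session
    ])

    model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

    # New observations: one ordinary flow, one exfiltration-like outlier.
    new_flows = np.array([
        [5_200, 28, 2],       # looks like the baseline
        [900_000, 3, 40],     # huge burst, short session, many ports
    ])
    labels = model.predict(new_flows)  # +1 = normal, -1 = anomalous
    for flow, label in zip(new_flows, labels):
        status = "anomalous" if label == -1 else "normal"
        print(f"{flow} -> {status}")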

Hartman said CISA’s plan is to focus this capability as a component of its analytical environment, providing one more toolset to review and determine potential and real threats.

Experts say what Hartman is describing is more about how CISA is changing than any one tool or capability.

Evans said CISA is a service provider with authorities from Congress and OMB to manage the results of the technology versus managing the tools themselves.

“This is a culture change. What Congress is asking CISA is ‘what do you need?’ and holding the DHS secretary accountable for delivering results,” she said. “The question that Congress and OMB have to answer is how far they want CISA to go to enforce and manage federal networks. That is the question.”

Portfolio of cyber services

Tom Bossert, the former Homeland Security advisor to President Donald Trump and now president of Trinity Cyber, said there are new capabilities, similar to the one NSA implemented so many years ago, that could provide greater benefit to agencies.

“[Expanding NSA’s tools] wasn’t necessarily a missed opportunity by government or the private sector, but a reflection of where we stand today. We have mismatched capabilities and defenders have not made the necessary changes as offenders are far more nimble. There are major developments in how we access networks, the diversification of edge and cloud services and a significant amount of innovative technology that could be applied in a different way to prevent cyber breaches,” Bossert said. “We must find happy medium ground within our collective cyber industry. There is a resistance to innovation and there is a strong risk aversion to change because we are worried about unintended consequences.”

Bossert added the latest cyber attacks have shown there is better interagency coordination and clarity of purpose, and that must continue as the threats evolve.

CISA’s Hartman said he believes that EINSTEIN has met its original intent and much more. The capability routinely finds instances of anomalous activity that are confirmed and stopped.

“We are constantly modernizing our portfolio of capabilities,” he said. “We are thinking about EINSTEIN, CDM, the cyber quality service management office (QSMO) and how to evolve all of those capabilities. The evolution is underway, and it will accelerate in the coming months as [a] result of new authorities under [the] 2021 defense authorization act and the funding from the American Rescue Plan Act.”


Federal CIO Martorana: $1B TMF lets agencies ‘apply for projects that previously were out of their reach’

With the release of new guidance last week for agencies to submit proposals for a share of the $1 billion in the Technology Modernization Fund, Federal Chief Information Officer Clare Martorana issued her first major policy decision, one whose implementation her fellow CIOs, industry and especially Congress will be watching closely.

“I have called on the TMF board to prioritize modernization proposals that replace legacy IT systems that waste millions of taxpayer dollars each year, as well as threaten our government’s cybersecurity and fail to provide the customer service experience that the American taxpayer deserves,” said Sen. Maggie Hassan (D-N.H.), chairwoman of the Homeland Security and Governmental Affairs Subcommittee on emerging threats and spending oversight, in a statement to Federal News Network. “This announcement is a positive step forward that will help encourage more agencies to take advantage of the Technology Modernization Fund and replace their costly legacy IT systems to ensure that agencies can meet the needs of the 21st century.”

Congress handed Martorana, who became Federal CIO in March, the golden IT modernization ticket, something the men and women who previously held her position didn’t have, but desperately wanted.

Martorana offered some insights through email responses to Federal News Network questions about the TMF guidance, the board’s plans and why agencies should apply for the additional money.

Federal News Network: How will the repayment model work in terms of deciding which model makes the most sense?

Clare Martorana: As agency CIOs, CFOs and leadership partner together to submit projects to the TMF, they will determine the level of repayment most appropriate to request for each project, based on the agency’s financial posture and risk management strategy. Additionally, the TMF Board will decide based on the new model outlined today, which factors in project prioritization and repayment flexibility.

Federal News Network: How do you think the changes make applying for the TMF more appetizing for agencies?

Clare Martorana: The repayment flexibility model was designed to make it easier for agencies to access the $1 billion appropriated to the TMF to meet the urgent cybersecurity and IT modernization demands that we need. Now, agencies can apply for projects that previously were out of their reach.

Federal News Network: Please comment on the four priority investment areas: High priority systems, cyber, public facing and shared services — how did you choose those four?

Clare Martorana: These four priority investments areas were chosen because these categories are what typically make up an agency’s IT portfolio.

Federal News Network: Is it up to agencies to make their case to the TMF board for what is a high priority system or public facing system?

Clare Martorana: Agencies are encouraged to submit their proposal to the board because they are their own best advocate. No one knows their IT portfolio and mission better than them, therefore they are best positioned to make the case and explain their proposals to the board.

Federal News Network: What will be the TMF board’s biggest challenge to get the money out the door? Time, people resources?

Clare Martorana: This new model is designed to remove those challenges. Initially we anticipate an influx of initial project proposals, but the TMF board and program management office are prepared and are ready to add program management office (PMO) resources as needed.

Challenges ahead for agencies, industry

The idea that Martorana espoused, that the expanded TMF lets agencies apply for funding for projects that were previously out of reach, is both the biggest opportunity and the biggest challenge.

Gordon Bitko, a former FBI CIO and now senior vice president for policy at the IT Industry Council, said there is a sense of urgency to get the money out the door, but it will require a lot of thought, too.

“I think we will see agencies that are well positioned because they have modernization plans that are already in flight and this is an opportunity to tweak or accelerate it,” he said. “We will also see others who haven’t really been as effective in thinking about modernization may be starting from scratch to get proposals to the TMF board.”

He said another real challenge is the five agencies whose political CIOs haven’t been named yet. The acting CIOs may be hesitant to commit to taking on a “loan” without the backing of a permanent leader.

Matt Cornelius, the executive director of the Alliance for Digital Innovation, an industry association, and a former senior technology and cybersecurity advisor at OMB, said in an email to Federal News Network that the TMF program management office shouldn’t wait to expand its staff because the proposals will be coming fast and furious.

“The board must have access to the best information possible to guide their decision making. This means they can’t just meet periodically and look at the paper forms submitted for their consideration. They must actively — as a body — work with industry, Congress, agencies and any other participants to get a better understanding of the real investment opportunities throughout the government and constantly work to refine and improve their bureaucratic processes to ensure that the most dollars get to the best projects, and produce the best results,” Cornelius said. “They need to communicate the successes of any projects they have funded very aggressively, rather than the way the previous administration operated, which was to say very little about the actual outcomes of any TMF investments — a major missed opportunity. Success begets success.”

Mike Hettinger, president and founding principal of Hettinger Strategy Group and a former Hill staff member, said quickly increasing the staff and resources of the PMO would be an important signal to Congress.

“I think it’s primarily a staff bandwidth issue. They just don’t have the bandwidth right now to effectively review the influx of expected proposals. When we were talking to the Hill about this we encouraged them to set aside a percentage of the $1 billion to ensure they had the administrative resources necessary to properly administer the fund,” he said. “The other hang up is likely to be the quality of proposals. The better the proposal, the easier it’ll be to get them approved. From that standpoint, the June 2 date is a tight turnaround.”

Industry input is important

Bitko agreed that the board should expand not only its staff but also how it works with industry and the instructions it gives to agencies.

“I’d like to see more specificity about a number of things, including how the repayment terms will be modified, what constitutes a priority area, what do those look like and how will the board evaluate the proposals? Those are important steps to give agencies in the guidance,” he said.

The challenge of getting the money out the door isn’t a short-term effort either. OMB’s guidance said it would give expedited consideration to proposals from agencies submitted by June 2. But the board also will review proposals on a rolling basis until the money runs out.

Dave Wennergren, the CEO of the American Council for Technology and Industry Advisory Council (ACT-IAC) and a former Defense Department deputy CIO and vice chairman of the CIO Council, said investments must be made and measurable outcomes achieved quickly, or there won’t be any appetite for future significant additional funding.

“It will be crucial that the process for identifying, approving and successfully implementing projects that use the funds be accelerated and streamlined,” he said. “With less than $100 million cumulatively awarded in TMF projects, the process must be optimized to ensure the timely flow of funds for new projects.”


GSA claims success of TDR pilot, but industry experts not sold

Almost five years after launching the Transactional Data Reporting (TDR) pilot, the General Services Administration is reporting success.

Jeff Koses, GSA’s senior procurement executive, said the TDR pilot proved the approach is valuable and a worthy replacement for the dreaded Price Reduction Clause (PRC).

“GSA has successfully demonstrated the value of TDR under the existing scope of the pilot. It has shown steady progress over the past four years, met most of the pilot’s objectives in the most recent year, and has made the necessary investments to leverage TDR’s potential in the years to come,” Koses wrote in an April 27 blog post. “We will continue to make improvements, especially in contracting officer usage.”

But multiple sources said that although TDR may work on paper, the reality is it’s unclear if any contracting officers are using the data to make decisions or even if the data is valuable enough for acquisition professionals.

“I have not experienced any negotiations based on TDR data in order to form an opinion,” said an industry expert, who requested anonymity because they work closely with GSA. “My clients who are TDR-covered have only been subject to contract-level price comparisons.”

Other sources said GSA’s three-year pilot proved it’s possible to use different, and possibly better, data to make price determinations, but the data is incomplete at best.

One source said customer agencies were hesitant to provide GSA with data on how much they paid for products and services. These issues may come to light in a new GSA inspector general report on the TDR pilot, which is expected in the coming weeks.

Koses wrote in the blog post that GSA didn’t use the data in 2019 and made no mention of using the TDR information in 2020. He wrote:

“Looking at historical data, the pilot’s overall performance based upon a documented evaluation plan showed steady progress. This includes:

  • FY 18 results revealed that overall price position was maintained and burden lowered. However, data remained questionable, and no buying strategies resulted.
  • FY 19 results revealed substantial improvement in data completeness and in small business performance. However, we saw that the data hadn’t been used and data policy gaps existed.
  • FY 20 results revealed that data completeness, contract-level pricing and small business metrics all exceeded targets.”

While industry experts said the results of the TDR pilot and their individual experiences led them to see the effort in a new light, GSA still needs to be more transparent about the data and how it’s being used.

Larry Allen, president of Allen Federal Business Partners and a long-time GSA observer, was an outspoken critic of TDR when it started, writing a column with the headline, “Run, don’t walk from Transactional Data Reporting rule.”

But Allen said now he has come around on TDR.

“I think TDR has grown into a viable option for many businesses. I will admit to having been a skeptic, but I think that, so long as a company keeps good records on what they provide to their GSA contracting officer, TDR can be a good way for some companies to obtain a schedule contract that otherwise might not be able to,” Allen said in an email to Federal News Network. “GSA may have even been a little ahead of the curve here in terms of TDR attracting non-traditional contractors, something that is very sought after now in Defense Department and even civilian agencies.”

IG acceptance still a big obstacle

Other experts said it may be time for TDR to move on from the pilot stage and into real production.

Alan Thomas, the former commissioner of the Federal Acquisition Service and now chief operating officer at IntelliBridge, said it seems TDR is bearing fruit but how much is unclear.

“The claims in the blog post are great but will be even more compelling with accompanying data,” he said in an email to Federal News Network. “Stepping back from the pilot and thinking about full implementation, GSA is going to need a strategy for working with the inspector general on TDR. The IG has a lot invested in the current price reduction clause regime and won’t move away easily from it.”

The IG’s acceptance and use of TDR remains a huge challenge. Auditors regularly issue audit reports that highlight problems with GSA’s pre-award and post-award audit efforts. In April 2020, the IG issued a report that said GSA may have missed out on potentially $1.1 billion in savings through pre-award audits of prices.

Jennifer Aubel, a principal consultant with Aronson, a consulting firm, wrote in a July 2019 blog post that as more companies participate in TDR, the IG’s ability to audit prices before an award is made is becoming harder.

“Under the TDR pilot, the population of auditable contracts has ostensibly been cut in half. When you remove the major resellers and integrators, what remains are largely professional service contractors and products companies under Schedules 84 (Law Enforcement), 71 (Furniture), and 66 (Scientific),” Aubel wrote. “The audit threshold for annual sales is also reduced due to the smaller pool of contracts from which the OIG is selecting. Small businesses who would have never been a blip on the OIG’s radar are now at much higher risk of pre-award audit.”

Aubel wrote that GSA’s systems, in 2019, were not equipped to produce a “dynamic market driven pricing model.”

Impact of unpriced contracts unclear

Now almost two years later, it’s unclear if the systems are able to provide dynamic pricing, and more companies have moved to TDR, making the PRC less valuable.

Another complicating factor in moving away from the price reduction clause is that the IG has invested heavily in people and processes to perform those audits, so refocusing those resources would be a challenge.

Thomas said the biggest issues with TDR data when he was FAS commissioner were quality and access, and both of these have been addressed.

“We put a small team of FAS leaders (Stephanie Shutt, Judith Zawatsky and Mark Lee) working with Jeff’s team in OGP to improve data quality and broaden access to the data. As a result, I believe the data being collected is now cleaner. Vendors are better trained on how to enter it and are more comfortable with the reporting regime,” Thomas said. “Likewise, regarding access, GSA erred initially on the side of caution. Pricing data is sensitive, but there were only a handful of people who had access to the TDR data. If you are a contracting officer, you can’t use what you don’t have! The team developed and implemented a reasonable set of safeguards that enabled more people to access TDR data.”

Thomas added GSA’s next steps should be to get the data in the hands of contracting officers, who can use the pricing information along with analytic tools and process automation software to make more strategic decisions and improve mission success.

One complicating factor in all of this is GSA’s move toward unpriced contracts under Section 876 of the fiscal 2019 National Defense Authorization Act.

This makes the price reduction clause and even TDR less necessary because it puts the burden on vendors to provide the lowest prices as part of contract negotiations.

Industry sources said GSA also can’t move away from TDR easily because of the investment companies have made into the systems to collect the data.

GSA’s Koses wrote that now, with five years of the pilot behind it, GSA will train contracting officers on the benefits of having access to more granular prices-paid information and will support these efforts with management guidance, as necessary.

He said GSA will also refine and consider:

  • The ability of Federal Supply Schedule contracting officers to leverage transactional data for price negotiations in lieu of commercial sales practices (CSP) and price reduction clause (PRC) disclosures;
  • The impact of an expanded data collection on GSA’s ability to leverage the data it currently collects;
  • Impacts on current and future GSA schedule contractors;
  • Communication to industry partners ahead of changes;
  • Training and tools for category managers that are currently not impacted by TDR; and
  • Potential impacts on other FAS initiatives, such as MAS Consolidation and implementation of Section 876 of the FY 2019 National Defense Authorization Act.

Thomas said vendors need to keep these changes in mind and potentially invest in back-office systems to more easily collect and report pricing data.

“When this is the case, vendors need to make sure the government understands the art of the possible with current systems and, if an investment is required, what’s the size and scope of that investment,” he said. “Sometimes the government asks for things and doesn’t understand the full implication of the ‘ask.’ Leaders at GSA are reasonable but not clairvoyant, so keep an open line of communication.”


Sen. Hassan’s technical fix of MGT Act is a major step to overhaul latest IT modernization challenges

The U.S. Agency for International Development asked for the authority to establish a working capital fund for IT modernization in its budget request in 2019, again in 2020 and again in 2021. But as Congress finalized each fiscal year’s budget, appropriators ignored USAID’s request.

Jay Mahanand, USAID’s chief information officer, told members of the House Oversight and Reform Subcommittee on Government Operations during an April 16 hearing that despite support from the Trump administration and the Office of Management and Budget, it was unclear why the working capital fund request failed each year.

USAID isn’t alone. The departments of Education and Commerce also failed to make it through the appropriators’ gauntlet. So far, only the Small Business Administration has persuaded lawmakers to grant it the authority to set up a working capital fund and transfer unexpired funds into it. SBA said in its fiscal 2021 budget request that it expects to have $4 million in the fund in 2020 and another $2 million in 2021.

Sen. Maggie Hassan (D-N.H.), chairwoman of the Homeland Security and Governmental Affairs Subcommittee on Emerging Threats and Spending Oversight, plans to fix this long-standing problem.

At the subcommittee’s April 27 hearing on legacy IT, Hassan said she plans to introduce a technical amendment to fix the Modernizing Government Technology (MGT) Act, which initially authorized every agency to create an IT modernization working capital fund. Through the WCF, agencies can bank saved money, which then can be used for future IT modernization projects.

A Hassan aide told Federal News Network that expanding working capital fund authorities was one of the main pieces of feedback from the agencies that responded to Hassan’s legacy IT oversight letters last summer, and is one of the pieces that Hassan is looking at for potential legislation. The aide didn’t say how the Senator planned to fix the MGT Act.

“This hearing is the first in a series of hearings that Senator Hassan will hold on legacy IT as chairwoman of the subcommittee,” the aide added. “Our office is working to coordinate another hearing focused on specific ways federal agencies and Congress can work to advance IT modernization and save taxpayer dollars.”

In many ways, the WCF, rather than the much-heralded and closely watched Technology Modernization Fund (TMF), is the more powerful authority granted to agencies under the MGT Act.

Had to take money from somewhere

Max Everett, the former Energy Department CIO, told the Senate subcommittee having a working capital fund would take care of long-term budget planning challenges many agencies face.

“Much of my experience, to be very frank, was robbing Peter to pay Paul. In most cases to do those modernizations, you are going to have to take money from somewhere,” Everett said. “I know that there is long held concerns about WCF turning into slush funds and things of that nature. But I think that simply means they need to have the appropriate oversight, but they would allow that level of longer term planning.”

The lack of that long-term planning is another area where agencies have fallen short.

Kevin Walsh, the Government Accountability Office’s director of IT and cybersecurity, said it was disheartening that when his office looked at 10 agencies in 2019, three didn’t have a long-term IT modernization plan, five had some aspects of a plan and only two had a firm idea of what needed to be done.

“Having these plans is valuable, just getting agencies to think about it. Agencies that don’t have a documented plan, we aren’t sure what kind of resources they are able to throw at it, what kind of timeframes, and even the scope of the project,” he said. “Having some idea of what needs to be done is the most fundamental step.”

Walsh pointed back to a draft 2016 OMB memo that would’ve required agencies to create and follow these plans, but which was never finalized, as one reason for the problem.

2016 policy never finalized

Tony Scott, the former federal CIO, who authored that draft policy, said in an email to Federal News Network that the goal “was to institutionalize a set of practices that would, at budget formulation time, identify for agency leadership and for appropriators top priority systems for upgrade and replacement. What I was looking for was to force deliberate decision making at budget time to either A.) accept the risk that legacy systems presented, or, B.) put money in the budget to do something about it.”

He said the annual review of top-tier systems would’ve examined cybersecurity and privacy risk; whether the system was still serving the mission well; its cost and whether there was a way to reduce what an agency was spending to support it; and whether the system depended on people or software that was harder and harder to find.

“The idea was to force this exercise on a periodic basis (i.e. every budget year or two), so that no one could hide and say, ‘we didn’t know,’ or ‘no one told us’ about these inherent risks,” he said.

Matthew Cornelius, the executive director of the Alliance for Digital Innovation, an industry association, and a former senior technology and cybersecurity advisor at OMB, said in an email to Federal News Network that the draft policy received hundreds of comments, but OMB hadn’t finished reviewing them when Donald Trump won the election.

“With a change in administration came a change in focus and priorities, including around IT,” he said. “Rather than take the focus the draft memo outlined, the Trump administration chose a different set of priorities to tackle first, including their own Report to the President on IT Modernization, which included lots of tasks and outcomes that drove modernization in the first half of his term. Then, with the MGT Act passing and the TMF getting stood up, OMB and GSA realized that the highest priorities for agencies in applying for TMF dollars wasn’t necessarily whole-cloth legacy system replacement, so they decided to give more flexibility to agencies in the application process (as provided in M-18-12).”

The Congressional budget process makes planning to move away from legacy IT even more difficult, former agency CIOs told lawmakers.

Hassan said she advocated for biennial budgeting, where Congress makes funding decisions in year one and does oversight in year two.

“The current one-year cycle often leads to hasty decision making and neglects capital investments that take several years to implement,” she said.

Goes back to convincing the appropriators

Renee Wynn, the former NASA CIO, said every time an agency crosses a fiscal year with an ongoing IT project, the risk increases because of possible loss of time and/or people.

“Now you’ve disrupted your project and most likely extended when you will get that project done. That extension, if it goes on too long, means you are potentially using software that is no longer considered modern, available or could reach end of life by the time you get that system back in operation after it has been modernized,” Wynn said. “I would take my total budget and create a reserve. That reserve would be used to make sure the most critical or highest risk projects would get funding for sure going into the secondary years of their project. That way I knew they could be able to continue. If I didn’t do that, I’d run the risk of work stoppage, and then I could lose the talent of my staff, of staff from other mission areas or mission support or even contractor staff, and that would again start to slow down and add more risk to my project.”

With little expectation that Congress will move to a biennial budgeting cycle, the WCF authority becomes more critical.

Cornelius said one thing Hassan should consider as she tries to fix the MGT Act is the need to convince the appropriators why the transfer authority is so important.

“The main thing agencies need to do is to keep making the case to their appropriations subcommittees on what their IT modernization priorities are, how they would ensure appropriate oversight and execution of projects funded by an IT working capital fund, and to provide appropriations staff opportunities to provide their own recommendations,” he said. “Building this trust is key to getting Congress to provide agencies the flexibility they need to manage their finances in a way that best help achieve their IT and security outcomes.”

It’s clear former Rep. Will Hurd (R-Texas), Rep. Gerry Connolly (D-Va.), Sen. Jerry Moran (R-Kan.) and former Sen. Tom Udall (D-N.M.) — the co-authors of the MGT Act — wanted to give all agencies the authority to create WCFs without appropriators’ approval. With a working capital fund, IT modernization planning becomes more critical, meaning agencies would have a clear path to solving two of the biggest remaining challenges to moving off legacy systems.


Is CISA’s third cyber emergency directive in five months a sign that things are getting worse?

When the Cybersecurity and Infrastructure Security Agency released its third emergency cyber directive in the last five months, agencies were once again on notice to fix yet another critical vulnerability.

Last week’s directive detailed a potential major problem with the virtual private network software from Pulse Secure. CISA gave agencies until April 23 to identify all instances of the software and run the Pulse Connect Secure Integrity Tool. Before this latest directive, CISA told agencies to patch Microsoft Exchange servers in March and issued another directive for the SolarWinds vulnerability in December.

This type of fire drill is becoming far too common for agencies, and really every business, as the cyber threats seem to be ramping up, particularly against companies with global install bases.

“In thinking like an attacker, they go after Microsoft Windows because everyone has Windows. Now they are saying, who else has the biggest market share of infrastructure or products and let’s go after them,” said John Pescatore, a director at SANS. “With one exploit, they can get into 70% of the networks. That is a big target. ServiceNow is another one that we have been warning about.”

The large install base combined with the greater reliance on technology, specifically software, means agencies aren’t necessarily facing more cyber attacks, but the potential for serious harm is much greater. This is especially true as agencies rely on connected devices and internet of things sensors or control systems that connect to the network or the public internet.

Pescatore and other cyber experts agreed that the current cyber threats are no worse today than they have been, but with the sharp increase of supply chain attacks combined with the pandemic forcing employees to work from home, CISA and agency chief information security officers seem to be constantly on high alert.

“Vulnerabilities have been and will continue to be a long-standing problem. There aren’t more vulnerabilities than before, but there is more software, and as our dependence on it grows those vulnerabilities are more widespread. SolarWinds is a perfect example where a single vulnerability created a massive exposure,” said John Banghart, the senior director of technology risk management at Venable and the former National Security Council’s director for federal cybersecurity during the administration of President Barack Obama. “We saw the same thing with the Heartbleed vulnerability in 2014. That was the first time the government, and really everyone, had to think about cybersecurity attacks at this kind of scale. We knew there was a vulnerability, but we didn’t know who was or wasn’t vulnerable.”

Banghart said CISA is in better shape today than in 2014 with the authority to scan civilian agency networks and issue directives. At the same time, however, the agency’s insight into civilian agency networks remains limited.

“DHS is more effective in recognizing and sharing what the vulnerabilities are and how to fix them. But currently their only course of action right now is the ‘hair on fire’ approach where they push out this directive and rank it high because they don’t know how vulnerable agencies are so they just have to push out because it’s severe and everyone is in this worse-case scenario,” he said. “That is why we need a lot of effort to get to the fundamental problem of who is vulnerable and who isn’t and what the potential impact is. We need to be able to score it in a way that is more nuanced and more applicable to a specific organization. We have a lot of different scoring systems today and the problem isn’t just for the U.S. government, but a problem across the entire internet.”

Banghart said he is organizing a group of private sector companies and other experts to come up with a new standard vulnerability scoring system.

“Our goal is to help ensure that end user organizations have the ability to influence the standards on which they depend. If you look at a lot of the work around the Common Vulnerabilities and Exposures database or the Common Vulnerability Scoring System (CVSS), a lot of it is being done by academics and security tool vendors and government folks, but other critical sectors are all underrepresented. How do we ensure they are getting valuable and meaningful information?” he said. “We need a better refined and more nuanced scoring system that is not just a number that says this is a 9.5 out of 10. That isn’t super helpful, but that is what we have today.”
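
For context on the single-number problem Banghart raises: the widely used CVSS v3.1 base score reduces a handful of categorical judgments to one value between 0 and 10. The Python sketch below implements the published formula for the scope-unchanged case, with weights taken from the public FIRST specification; the example at the bottom produces the familiar 9.8 “critical” rating.

    # CVSS v3.1 base score for the scope-unchanged case, per the
    # public FIRST specification -- the kind of single-number score
    # Banghart argues is not nuanced enough on its own.

    import math

    AV = {"network": 0.85, "adjacent": 0.62, "local": 0.55, "physical": 0.20}
    AC = {"low": 0.77, "high": 0.44}
    PR = {"none": 0.85, "low": 0.62, "high": 0.27}  # scope-unchanged weights
    UI = {"none": 0.85, "required": 0.62}
    CIA = {"high": 0.56, "low": 0.22, "none": 0.0}

    def roundup(x):
        # CVSS-defined rounding: ceiling to one decimal place.
        return math.ceil(x * 10) / 10

    def base_score(av, ac, pr, ui, c, i, a):
        iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
        impact = 6.42 * iss
        exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
        if impact <= 0:
            return 0.0
        return roundup(min(impact + exploitability, 10))

    # A remotely exploitable, no-privilege, no-interaction flaw with
    # total loss of confidentiality, integrity and availability:
    print(base_score("network", "low", "none", "none",
                     "high", "high", "high"))  # -> 9.8

Everything about the affected organization, its mission and its compensating controls drops out of that final number, which is the nuance Banghart’s group wants to restore.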

Suffering from threat fatigue?

This type of scoring system would help address other challenges these emergency directives highlight.

Frank Cilluffo, the executive director of the McCrary Institute at Auburn University, said there is some concern over alert or threat fatigue.

“The threats are dictating the pace, but from another perspective we need to be able to walk and chew gum because other shoes may have dropped that we are unaware of or could drop soon enough,” he said. “In a weird way, we are letting our adversaries define our strategy. We are shaping our strategy around them, and it should be the other way around. To do that, it’s partially a matter of greater awareness, partially more clarity around incidents because of situational awareness improvements and partially the adversary has a vote in how they are acting.”

Banghart added any new directive will interrupt a security office’s workflow and force them to make resource decisions that may have unintended consequences later on.

“A good chief information security officer is ensuring the mission of their agency is able to function. That can sometimes mean making a decision to patch something or not,” he said. “When you get back to back to back directives, on top of other vulnerabilities that don’t just come from an emergency directive, you have to decide what else doesn’t get done today or this week.”

Pescatore said the directives highlight the workforce challenge agencies, and really every organization, faces for cyber talent.

The Center for Strategic and International Studies says in 2019, CyberSeek, an initiative funded by the National Initiative for Cybersecurity Education (NICE), estimated the United States faced a shortfall of almost 314,000 cybersecurity professionals. CSIS also says according to data derived from job postings, the number of unfilled cybersecurity jobs has grown by more than 50 percent since 2015.

1 in 3 cyber jobs go unfilled

In the public sector, the workforce challenge is even bigger. The Cyberspace Solarium Commission says more than one in three cybersecurity jobs in the public sector go unfilled.

“Private industry was hit hard having to apply the same patches, but the government, in particular, is suffering from brain drain with skilled security people leaving. They have not invested in hiring or training people. They have spent a lot of money that has been budgeted for cyber buying products for initiatives under the Continuous Diagnostics and Mitigation (CDM) or EINSTEIN programs, which focused on detecting the bad guys but not focused on fixing the computers. The patching is something an IT organization does, and the government has been slow to address the patching side of the problem because patching needs skilled people and agencies don’t have enough of them,” Pescatore said.

Cilluffo said the directives are forcing agencies to improve their situational awareness. While CDM and other tools have helped over the years, the urgency of these threats gives agencies a more granular view.

“You have to understand what is the real intent behind the adversary’s attack. Is it IP theft or secrets from espionage or criminal enterprises using for ransomware? You’ve got to look at it through these lenses and then decide how to respond,” he said. “From an adversarial perspective, what is the cost and the consequences of an attack that will induce change? People have been getting away with cyber murder and I’m hoping to start to see actions and steps that are not only reacting, but proactively shaping our deterrence mechanisms.”


Lawmakers want to end 8-year debate over the definition of data centers

The Oversight and Reform Committee on Friday celebrated the 11th version of the Federal IT Acquisition Reform Act (FITARA) scorecard, which it released in December.

The good news abounded, from cost savings and avoidance — $22.8 billion from PortfolioStat — to no agency receiving a grade below a “D,” to the retirement of the software licensing category because every agency received an “A.”

It’s clear that FITARA is having the intended impact Reps. Gerry Connolly (D-Va.), Darrell Issa (R-Calif.) and others had hoped for when Congress passed and President Barack Obama signed the bill into law in 2014.

The latest hearing on April 16 highlighted some of the ongoing challenges that agencies, Congress and the Government Accountability Office haven’t been able to solve in the six-plus years of FITARA.

Here are three takeaways from that hearing:

The enduring disagreement over the definition of data centers

It wouldn’t be a FITARA hearing unless some member — usually Connolly — moaned about the Office of Management and Budget’s decision in 2019 to, once again, change the definition of data centers.

It’s been eight years and two administrations, and the Office of Management and Budget and Congress still haven’t seen eye to eye on the definition of data centers. A quick history reminder: At the heart of the matter, over the last eight years under the data center consolidation and optimization initiative, OMB has shifted agencies’ focus toward savings and optimization, and it removed the requirement to track and close non-tiered, or smaller, data centers. Meanwhile, Congress and GAO have wanted agencies to close down these facilities to both save money and address potential cyber vulnerabilities.

It’s unclear where the Biden administration’s OMB, and Federal Chief Information Officer Clare Martorana, will come down on this topic. She was scheduled to testify on Friday, but was unable to because of a family emergency. Martorana’s record on data center consolidation and optimization at her former agency, the Office of Personnel Management, provides little insight. OPM has only one data center, so consolidation isn’t necessarily an option, but optimization has resulted in savings of more than $36 million, according to the IT Dashboard.

If the Biden administration comes down in favor of closing more data centers along with optimization and bringing back the requirement to track and close non-tiered data centers, then a bill from Rep. Katie Porter (D-Calif.) may not be necessary.

Porter said the subcommittee may have to consider legislative solutions to ensure OMB is following through on Congress’s intent under FITARA.

Connolly, who has been banging the data center drum for several years, piled on.

“You don’t get to come into compliance with FITARA by redefining what a data center is and you don’t get to come into compliance by substituting a word in the law with another that suits your purposes better and gets you off the hook,” he said. “We will insist with compliance with the law. If we have to further refine legislative language to make it very clear and unfortunately more restrictive, we will.”

Kevin Walsh, the director of Information Technology and Cybersecurity Issues at GAO, told the subcommittee agencies saved $5 billion since 2015 by closing and optimizing data centers.

He said GAO continues to encourage OMB to track and emphasize the need to close non-tiered data centers. Porter went a step further and asked whether OMB is even paying attention to GAO’s recommendations.

“It’s a push-pull. We work as collaboratively as we can but sometimes it does feel like it’s more us talking and them not listening,” Walsh said. “There are times when we have worked very collaboratively, and I don’t want to disrespect OMB or the good work they do, but on certain issues we don’t see eye-to-eye.”

It seems it’s time for OMB to have a hard conversation with the committee and GAO about the definition it will use going forward, and about what can be done beyond legislation if that definition isn’t the one lawmakers and GAO believe is correct. The fact that this debate has dragged on for almost eight years may be good fodder for reporters, but it’s ludicrous that a compromise can’t be struck.

CIO authorities debate settled?

Eight agencies, including the departments of Justice, Labor and State, still do not meet the spirit or intent of the Clinger-Cohen Act of 1996. These eight do not have their chief information officer report directly to the head of the agency.

Lawmakers often point to this shortcoming as a reason why agencies struggle with managing technology.

While GAO said five agencies improved their reporting structure over the history of the FITARA scorecard, only the Department of Health and Human Services improved its CIO reporting structure between the 10th and 11th versions.

Connolly said 21 of 24 agencies still have not established policies detailing the role of their CIO, as required by law and guidance.

Despite this limited progress, GAO’s Walsh said FITARA has given CIOs a larger voice in the oversight and spending of the IT budget. He said there is no better example than the accumulated savings or cost avoidance of $22.8 billion from the PortfolioStat program.

“We also have seen incremental progress with CIO authorities. It’s harder to measure which have a seat at table that they didn’t have before, but five who are now reporting to the head of the agency is the most important metric,” Walsh said.

Connolly asked whether GAO has seen any backsliding in CIOs’ standing. Walsh said he was not aware of any, crediting in large part the committee’s attention.

If the last year made anything clear, it’s that technology is what keeps agencies running smoothly.

“The coronavirus pandemic has highlighted that CIOs are more essential now than ever before,” Connolly said. “Nearly every federal program, service, and function relies on IT to work. It is among the duties of the CIO to plan for agency IT needs, including the resources required to accomplish the mission. Outdated legacy systems, software and hardware, however, continually prevent agencies from providing the service the American public expects and deserves.”

But the question, then, has to come back to whether agencies have outgrown the need to mandate that CIOs have a seat at the table.

Labor CIO Gundeep Ahluwalia doesn’t report directly to the secretary, but he’s managed to revamp the agency’s approach to IT modernization, save more than $70 million and centralize common functions.

He said during the pandemic 95% of Labor’s workforce moved to remote work without interruption, and the agency onboarded 1,500 new staff virtually. Labor also closed 73 data centers, is tracking tiered and non-tiered data centers, and disconnected 70% of its telecommunications and network circuits from the expiring Networx contract.

“Under the agency CIO authority enhancements category, the department has ensured IT projects use an incremental development methodology. This facilitates a regular cadence of new functionality and continuous feedback to ensure the desired outcome of program areas during the development stage,” he said in his written testimony. “The goal is to deliver value over time. In the Transparency and Risk Assessment category the department re-evaluated and refined its risk assessment criteria for major IT investments to allow the department to apply an appropriate level of focus based on the varying level of risk.”

It’s true that Labor may be an outlier when it comes to CIO authorities, but the pandemic proved the critical role technology plays and it’s hard to believe anyone will soon forget it.

Another attempt to drive performance metrics

Congress is ready to take another bite at getting agencies to measure the performance of their programs. This attempt has a distinct IT modernization flavor.

Connolly and Rep. Jody Hice (R-Ga.), the ranking member of the Subcommittee on Government Operations, introduced the Performance Enhancement Reform Act on April 16. It would be at least the third major bill to attempt to get agencies to develop metrics and success factors in a new way.

“This important piece of legislation would require agencies’ performance goals to meet the demands of the ever-changing performance management landscape and include data, evidence, and IT in their performance plan,” Connolly said. “The bill would also require agencies to publish their technology modernization investments, system upgrades, staff technology skills and expertise, and other resources and strategies needed and required to meet these performance goals.”

The bill also would:

  • Require agencies’ chief performance officers, where applicable, to work in consultation with the chief human capital officer, the CIO, the chief data officer, and the CFO to prepare the annual performance plans;
  • Require performance plans to include descriptions of human capital, training, data and evidence, information technology and skill sets needed for the agency to meet the agency’s performance goals.

“You’d think the things this bill requires would be common sense — when making an agency performance plan, use the people with the right expertise and take into account what it’s going to take to set realistic performance goals and make the plan work,” said Hice in a statement. “But that’s not always the case. This costs valuable resources, and it has to change if we want to stop wasting time and money. [W]ith the Performance Enhancement Reform Act, we are taking a step forward in bringing the federal government into the modern era by requiring agencies to coordinate better with key agency leaders and best utilize resources when creating annual performance plans. This will help maximize agency human capital, technology, and time in order to better serve American families and businesses.”

The bill comes as the debate over the value of performance metrics has heated up over the last few months. First OMB, under the Trump administration, removed the section of Circular A-11 that required these measures. Then OMB, under the Biden administration, put the section back into A-11.

OMB launched Performance.gov in 2011 to hold agencies accountable and issued a playbook in 2016 to help agencies make more progress.

Despite all of these efforts, GAO found in 2018 that the use of performance information to make policy decisions hadn’t changed much between 2013 and 2017.

Another bill doesn’t sound like the answer to driving performance. It sounds like a combination of training, data and consistent oversight by Congress and OMB would be more effective.


Exclusive

OMB, OPM to set up new hiring assessment line of business as part of IT modernization push

On the surface, the difference between the Biden administration and the three previous ones’ focus on IT modernization may seem like $1 billion — the amount Congress approved for the Technology Modernization Fund as part of the American Rescue Plan.

But take a step back and dig beneath the TMF’s pot of gold, and you’ll see how the last 15 years of this journey toward IT modernization are culminating under the Biden administration.

Take, for example, digital signatures. This technology has been around since the late 1990s, but only in the last year did agencies fully realize its potential. Now the Office of Management and Budget is telling agencies in the budget passback, which Federal News Network obtained, to “accelerate the adoption and utilization of electronic signatures for public facing digital forms to the fullest extent practical in alignment with OMB Memorandum M-19-17 and OMB Memorandum M-00-15.”

Keep in mind, M-00-15 is from 2000 — 21 years ago. That is how long agencies have been looking at electronic signatures. But only since the COVID-19 pandemic began has the government realized, “hey, this digital signature thing actually works and we, for the most part, already own the technology.”
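
For readers who want the mechanics, here is a minimal sketch of how signing and verifying work, using Python’s open source cryptography package. The form contents and key handling are illustrative assumptions, not any agency’s implementation.

    # Minimal signing/verification sketch using the "cryptography" package
    # (pip install cryptography). Illustrative only; real deployments would
    # manage keys under an agency PKI, such as PIV credentials.
    from cryptography.hazmat.primitives.asymmetric import ed25519

    private_key = ed25519.Ed25519PrivateKey.generate()  # signer's secret key
    public_key = private_key.public_key()               # shared with verifiers

    document = b"Form ABC-123, completed 2021-04-16"    # hypothetical form bytes

    # The signature binds the signer's key to these exact bytes.
    signature = private_key.sign(document)

    # Verification raises InvalidSignature if the bytes were altered.
    public_key.verify(signature, document)
    print("signature verified")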

Electronic signatures are one of four areas OMB wants agencies to focus on when it comes to modernizing websites and digital services. The others include:

  • Digitizing forms and government services;
  • Electronic consent and access to individual’s records; and
  • Interoperability of data to benefit public-facing services by using tools like application programming interfaces (APIs); a minimal sketch of such an API follows this list.
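
To make that last item concrete, here is a minimal sketch of a public-facing, read-only records endpoint, using Python’s Flask framework. The route, record store and field names are hypothetical, not drawn from any OMB guidance.

    # Minimal public-facing records API sketch using Flask (pip install flask).
    # Route and data are hypothetical.
    from flask import Flask, jsonify, abort

    app = Flask(__name__)

    # Stand-in for an agency records store; a real service would query a database.
    RECORDS = {"1001": {"id": "1001", "status": "approved", "updated": "2021-04-16"}}

    @app.route("/api/v1/records/<record_id>")
    def get_record(record_id):
        record = RECORDS.get(record_id)
        if record is None:
            abort(404)  # unknown record
        return jsonify(record)  # machine-readable JSON other systems can consume

    if __name__ == "__main__":
        app.run()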

“Agencies should continue to prioritize and identify funding for the modernization requests on websites and services that are highly utilized by the public, or have been identify as a high impact service provider (HISP), or are otherwise important for public engagement, as outlined in the 21st Century Integrated Digital Experience Act, (P.L. 115-336),” the passback document states.

The push for digital signatures is illustrative of how the IT modernization pieces are coming together for the Biden administration. Without the pandemic, there is no reason to believe agencies would implement digital signatures on a broader scale given the rebuffs of the last two decades.

IT modernization is a bull market

The experience of the pandemic, combined with the financial support from Congress and the continued evolution of technologies like cloud, robotic process automation and data analytics, opens the door wider for the Biden administration than for any of the others in the last 20 years.

The TMF and the even more powerful, but less celebrated, working capital funds from the Modernizing Government Technology Act are among the most important tools Congress has given agencies in the last 25 years. This is especially true if OMB relaxes the payback model for the TMF so the loans become more like grants.

These are all reasons why IT modernization is such a bull market across agency and vendor communities.

Maria Roat, the deputy federal chief information officer, said at the recent CompTIA webinar that the TMF is about accelerating projects and enabling multi-year funding.

Maria Roat is the deputy federal CIO.

She said the $1 billion in the TMF coffers today will go for a variety of projects including those that focus on protecting “high value assets, improving public citizen services across the federal government, as well as improving and balancing some of the foundational technical maturity across the federal government and those common scalable services that can really drive cost efficiencies across the federal government.”

Roat pointed to one of the TMF funded projects for the Department of Housing and Urban Development as an example of the type of projects the board is looking for. HUD received $13.8 million to modernize five legacy mainframe systems.

“HUD mainframe modernization, there’s a playbook coming out of that. So other agencies, they’re going through their mainframe modernization, they can take lessons learned from HUD and apply that,” she said. “As we look to scale and accelerate the board, there’s a lot of things that we’ve already done over the last three years, as we’ve matured, that we can apply to the future funding.”

The maturity Roat talks about accelerated during the pandemic, when non-IT leaders realized the importance of online services, applications and systems.

Roat said the TMF is not a CIO program. It’s one for CXOs who want to “drive the success of the program.”

Required to fund customer service efforts

That becomes clear in the passback language.

OMB doesn’t just highlight technology initiatives, but agency goals that are underpinned by technology.

Another hot-button issue that arose during the pandemic was customer service, specifically the way agencies reach citizens.

Improved customer service is another longtime goal, dating back at least to the Government Performance and Results Act of 1993, which aimed to “improve federal program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction.”

While OMB has made customer service a cross-agency priority goal and issued more than a half-dozen memos, directives, playbooks and executive orders, and Congress has passed the GPRA Modernization Act and the E-Government Act, progress toward real improvements has come in fits and starts.

The pandemic, however, helped drive home that the old ways agencies dealt with citizens just weren’t cutting it anymore: coming into the office, massive call centers and even the old Pueblo, Colorado, public service announcements from the 1980s.

So OMB is building off the digital transformation COVID-19 forced many agencies to undertake by requiring agencies to spend money on these services.

“The administration is implementing a comprehensive approach to improving the equity, access, and overall delivery of federal services, which includes improving customer experience management,” the passback stated. The “levels included in passback support your High Impact Service Providers (HISPs) implementing the actions identified in their CX Action Plan. As a HISP, [your agency] should leverage funding provided to prioritize the alignment of customer feedback efforts with governmentwide measures and other activities that increase their programs’ ability to design and deliver services for the American public, consistent with the maturity model and governmentwide measures provided in Section 280 of OMB Circular A-11.”

You can talk all you want about technology and customer experience, but the pandemic, once again, provided a stark reminder that agencies are only as good as their people. It’s trite to say “people are our greatest asset” when time and again actions taken by Congress or the administration undercut that idea.

Speeding up the hiring process

This is another opportunity for the Biden administration to build on the Trump administration’s upskilling and reskilling programs, pick up the Obama administration’s hiring reform effort and put its own mark on a process many point to as one main reason the government struggles in so many ways.

To this end, the Biden administration is set to take the first steps by setting up a new Hiring Assessment Line of Business and requiring agencies to fund it in 2022.

“The LoB will support the implementation efforts of effective assessments and related efforts including governmentwide hiring actions and shared certificates. These contributions must not come from existing HR budgets,” the passback stated.

The Office of Personnel Management and OMB will jointly manage the Hiring Assessment LoB and the program office will reside in OPM.

Along with the new LoB, OMB wants agencies to spend money on “rebuilding their HR workforce to support recruitment and hiring efforts.”

“Agencies are expected to allocate funds in FY 2022 for: (1) dedicated employees to form talent teams (ideally at the agency component level); (2) tools to improve hiring assessment processes; and (3) internship and Pathways Program improvements,” the passback stated. “By June 30, 2021, agencies are required to send a plan to OPM and OMB that includes milestones and expectations, broken out by components, as appropriate, for how they will form a dedicated team responsible for transitioning to using effective hiring assessments for all competitive actions as well as steps to improve their internship/pathways program.”

Few would dispute that these policy directives are a good first step. But the real test will be how OMB and its new deputy director for management, Jason Miller (I know, it’s weird to write that), once the Senate confirms him, supports, oversees and encourages agencies not just to build on the lessons of the pandemic, but to normalize these and other efforts that worked so well over the last year without taking a step backward.


Inglis, Easterly tapped for cyber leadership roles at White House, CISA


The White House finally moved to fill two of the remaining and most important technology leadership roles in government.

President Joe Biden said yesterday he plans to nominate Chris Inglis, the former deputy director of the National Security Agency, to be the new national cyber director in the White House, and Jen Easterly to lead the Cybersecurity and Infrastructure Security Agency in the Department of Homeland Security.

Chris Inglis is the former deputy director of the National Security Agency and has been nominated to be the new national cyber director.

These two positions, along with the Energy Department’s Office of Cybersecurity, Energy Security, and Emergency Response (CESER), were among the most prominent unfilled roles in the Biden administration.

Energy also announced yesterday Puesh Kumar will serve as acting principal deputy assistant secretary for CESER.

The president also plans to nominate John Tien to be deputy secretary and Robert Silvers to be the undersecretary for strategy, policy and plans at DHS.

Both Inglis and Easterly come from the private sector, but have deep ties in government. The Senate has to confirm both to their respective roles.

“If confirmed, Chris and Jen will add deep expertise, experience and leadership to our world-class cyber team, which includes the first-ever Deputy National Security Advisor for Cyber and Emerging Technology Anne Neuberger, as well as strong, crisis-tested professionals from the FBI to ODNI to the Department of Homeland Security to U.S. Cyber Command and the National Security Agency. I’m proud of what we are building across the U.S. government when it comes to cyber,” said Jake Sullivan, the National Security Advisor, in a statement. “We are determined to protect America’s networks and to meet the growing challenge posed by our adversaries in cyberspace — and this is the team to do it.”

Along with Senate confirmation, the next step is for the White House to complete its 60-day review of the new cyber director position.

Rep. Jim Langevin (D-R.I.) said in an interview with Federal News Network that the White House’s review will help better outline the scope of the new cyber position.

“Looking at this holistically, what I know is important to Chris and to me, is we now have a point person in charge that will coordinate our cyber defensive strategies and a person who now has the policy and budgetary authority to reach across government and compel departments and agencies to step up their game on cybersecurity and make sure they are addressing vulnerabilities,” he said. “The closest we had before was a cyber coordinator, but they lacked policy and budget authority or any teeth. It was more of a coordinating role. This is a position with the right authorities and will make a big difference for the country writ large.”

Langevin praised both pending nominations, having worked with Inglis in the past both during his time at NSA and as a commissioner on the U.S. Cyberspace Solarium Commission.

Jen Easterly has been nominated to lead the Cybersecurity and Infrastructure Security Agency in the Homeland Security Department.

Easterly also worked at NSA as the deputy for counterterrorism, as well as at the White House during the Obama administration as the special assistant to the president and senior director for counterterrorism.

Before returning to federal service, Easterly was the head of firm resilience and the fusion resilience center at Morgan Stanley, where she was responsible for ensuring preparedness and response to operational risks to the firm.

She also served as the cyber policy lead for the Biden-Harris Transition Team.

Easterly takes over for Chris Krebs, who was forced out by the White House in November.

She will run a CISA organization that has transformed over the last decade. It now has more authorities to oversee federal networks through tools like Binding Operational Directives (BODs) and threat hunting, an authority that came to CISA in January as part of the defense authorization bill.

CISA also is the beneficiary of congressional support to the tune of $650 million in new funding from the American Rescue Plan. The Biden administration also is asking for another $110 million for CISA in the fiscal 2022 budget.

Even with Inglis’ and Easterly’s nominations and expected confirmations, the question of who is ultimately responsible for federal cybersecurity remains unanswered.

Lawmakers in February called the current approach clunky and inadequate.

“I am encouraged to see the intent to nominate a director for the helm of our nation’s lead federal civilian cybersecurity agency. Ms. Easterly brings substantial credibility and a reputation of working productively between government and the private sector to increase the cybersecurity resilience of the nation,” said Rep. John Katko (R-N.Y.) ranking member of the Homeland Security Committee, in a release. “On the heels of the SolarWinds cyber campaign and the compromise of the Microsoft email server, CISA has found itself at the forefront of two significant, national cyber incidents in just the last few months. As a nation, we are at a crossroads in our strategy to defend and secure the .gov cyber space, and strong leadership is essential. I continue to call on President Biden to support desperately-needed changes to allow CISA more centralized, real-time visibility into the entirety of the civilian .gov and put CISA on a much needed path to becoming a $5 billion agency.”

Langevin said the White House cyber director is a position he has been trying to create for more than a decade, in part to address the lack of centralized oversight and authority. He said Inglis will be the point person for cyber threats and challenges both internal and external to the government.

He said Easterly needs to focus on building CISA’s cyber capabilities.

“Right now they rely heavily on defense support of civil authorities, which is fine because they get the support they need,” Langevin said. “But going forward, we can’t and we should not rely on defense support civil authorities. For CISA to effectively fulfill their mission, it needs to grow its own cyber capabilities and we look forward to supporting that.”


GSA, DHS making big push to address shortcomings in contractor assessments

If the federal acquisition workforce is ever going to make contractor evaluations meaningful, it’s going to happen this year.

The General Services Administration and the Department of Homeland Security are pursuing two different but equally important initiatives that will either prove the federal community cares about past performance as a key evaluation factor or show that it has merely been paying lip service to the issue since 2009.

Over the past 11 years, successive memos from the Office of Federal Procurement Policy encouraging agencies to do more research and evaluation of contractor performance on contracts have had little impact.

“We think there is a clear appetite for Contractor Performance Assessment Reporting System (CPARS), but contracting officers and industry also know the current CPARS process is broken. I think OFPP hears it from contracting officers that it’s burdensome, and they hear from contractors that it’s not resulting in fair and accurate ratings,” said Mike Smith, a former DHS director of strategic sourcing and now executive vice president at GovConRx, which has been leading the effort to revamp CPARS for much of the past two years or more. “Agencies can use CPARS data to strategically manage procurements, but there needs to be wholesale relook at it. We need to make sure it results in good information and the information is more strategic and tactically used.”

DHS and GSA will see firsthand this year whether that appetite is strong enough to address the systemic problem with CPARS: too many contracting officers rate a vendor’s performance as merely satisfactory, either for lack of time to explain why the contractor was outstanding or exceptional, or to avoid any lengthy back-and-forth if a rating is below average or poor.

AI pilot in phase 2

DHS is trying to address these shortcomings by applying artificial intelligence tools to the CPARS process.

Its program is in the middle of phase 2, in which five companies are building production-ready software tools. DHS awarded these companies — IBM, CORMAC, TrueTandem, Strongbridge and Hangar — $125,000 to demonstrate their technologies this year.

Polly Hall is the director of DHS’s Procurement Innovation Lab (PIL).

“The user community will take a look during these demos to make sure they feel like they are trustworthy solutions. We want the value to be proved out,” said Polly Hall, director of DHS’ Procurement Innovation Lab (PIL). “The demos are focusing on harder issues. They built these to be commercial solutions and using software-as-a-service (SaaS). We don’t want the federal government to buy the AI and own it. We want to buy licenses and for the tools to ingest the information and present it to us in [a] way that is useful.”

DHS and nine agency partners (the departments of Commerce, Energy, Interior, Veterans Affairs, and Health and Human Services, as well as GSA, NASA, the Air Force and the U.S. Agency for International Development) are reviewing the pilot. The agencies gave the five companies 50,000 anonymized procurement records to help train the AI.
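
DHS hasn’t published the pilot’s internal methods, but as a purely speculative sketch of the kind of model such a tool might use, a simple text classifier can learn to suggest a rating from past-performance narratives. The sample data, labels and model below are invented for illustration.

    # Speculative sketch: suggest a CPARS-style rating from narrative text,
    # using scikit-learn (pip install scikit-learn). Data is invented; the
    # DHS pilot's actual models and features are not public.
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    narratives = [  # stand-ins for anonymized past-performance writeups
        "Delivered early with proactive risk management and zero defects",
        "Met all milestones and responded quickly to change requests",
        "Missed two delivery dates and needed repeated government follow-up",
        "Staffing gaps caused schedule slips across the period of performance",
    ]
    ratings = ["exceptional", "satisfactory", "marginal", "marginal"]

    # TF-IDF converts free text to weighted term features; logistic regression
    # learns which terms correlate with each rating level.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(narratives, ratings)

    # A contracting officer would still review any suggestion (human in the loop).
    print(model.predict(["Contractor delivered ahead of schedule with no issues"]))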

Hall said by July, DHS and its partners will decide which of the technologies should move into phase 3, and will get the software tools an authority to operate in time to launch in January.

“If we can solve some of the challenges with policy and security accreditation, we will move into phase 3 where the agency partners will test the technologies on actual solicitations. They still will do a human review, but also bring in the AI solution and compare them on real procurements to validate and compare,” she said. “The final phase would be to move into full production, and maybe create a governmentwide contract so agencies can choose which tool they want to use.”

Hall said offering the AI tools as a shared service is another possibility. She said the more agencies that use the tools, the lower the cost will be and the more value it will provide all agencies.

She said the contracting officers who have tested out the AI tools have found them valuable.

“We are cautiously optimistic and we believe everyone will see the value. This is the year where our hard work comes to bear and we either get it or not,” she said. “We need our partners and OFPP to step up and work with us to make this happen. We feel good that there has been a lot of discussions with agency CIOs and at the governmentwide level about getting through the challenges of the ATO and about addressing the hard policy issues.”

New memo promoting self-assessments

For GSA, it’s a matter of whether contracting officers pick up on the option of having vendors provide self-assessments on specific projects.

GSA senior procurement executive Jeff Koses issued a memo in February promoting the use of vendor self-assessments as one step in the overall CPARS process.

This is something Smith and GovConRx have been promoting for the past few years.

Jim Williams, a former federal executive with GSA, the IRS and the Department of Homeland Security — now a principal with Williams Consulting LLC and an advisor for GovConRx — said contractors feel they aren’t being judged fairly and have no input into the process. He said the GSA memo is a permission slip for contracting officers to start asking for a self-assessment as part of the broader CPARS process.

“We believe this will give CPARS more balance because of the input by contractors, and it will alleviate [the] burden on contracting officers. It will produce a more accurate and fair rating,” he said.

Williams and Smith said the self-assessment would be just one piece of the puzzle, but it would open the door to a wider conversation, similar to an employee writing a self-assessment for their boss. Smith said this approach is common in the human resources field, and there is no reason the acquisition workforce can’t use it too.

“Contractor self-assessments can save time while allowing contractors the opportunity to make their case about their performance. Getting the contractor’s point of view early on in the process may reduce the back and forth during the 60-day period contractors have to respond to a CPARS notification following the assessing official’s evaluation in the system,” Koses wrote in the memo. “A contractor actively tracking its performance may have fewer performance issues. If nothing else, editing someone else’s work is much easier and faster than creating an evaluation from scratch.”

OFPP needs to get more involved

GSA recommended contracting officers use the contract kickoff meeting after award to have initial discussions about self-assessments so the contracting officers can track performance during the full life of the program and correct any issues on an ongoing basis.

Mike Smith is a former DHS director of strategic sourcing and now executive vice president at GovConRx.

“The memo is a good first step for GSA. We would like to see OFPP issue something on a more governmentwide basis that encourages the use of contractor self-assessments,” Smith said. “You wouldn’t believe how many contracting officers refuse to take input from industry because they think they aren’t allowed to. As a contracting officer, I’d rather have a back and forth at least by midyear, if not before, so we can adjust course and have a common understanding at the end of the performance period and there are no surprises about ratings and the basis of that rating.”

Williams added that good contractors will jump at the opportunity to do a self-assessment because they will finally be able to have input into the process.

“We think this will help small businesses particularly because when contracting officers see they have done larger jobs and done them well through relevancy search and high CPARS, then they are more likely to feel comfortable with awarding them a contract,” he said. “It also will help contracting officers because they will make better decisions through data, use it as a tool to have discussions that can also be used at the task order level.”

If both initiatives prove successful over the next year, it’s time for OFPP to not just issue another memo, but to mandate these approaches and actively promote them through the frontline forum, at industry events and on Capitol Hill. And it shouldn’t wait until there is a confirmed OFPP administrator.

