Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.


Protest is ‘last resort’ to get GSA’s commercial platforms program to comply with JWOD Act

It’s a shame when the only way to get an agency’s attention on an acquisition problem is through a lawsuit.

Unfortunately, it’s an all too common occurrence with draft solicitations, final solicitations and post-award debriefings.

The latest example from the National Industries for the Blind, the Association for Vision Rehabilitation and Employment and the National Association for the Employment of People who are Blind against the General Services Administration epitomizes this problem.

NIB and its partners say they were left with no other option, a step they call unfortunate, but to file a protest with the Court of Federal Claims over GSA’s commercial platforms request for proposals issued in December.

Despite the groups submitting concerns during the draft solicitation phase and writing a letter to GSA Administrator Robin Carnahan asking for her help to change the RFP, it took legal action from NIB, AVRE and NAEP to get GSA to act.

Six days after the groups filed their protest, GSA issued a request for information asking industry for feedback on how to accomplish the goals NIB, AVRE and NAEP asked for in their initial comments.

“Filing this lawsuit is a last resort for NIB as we seek to protect the jobs of thousands of people who are blind working in our associated nonprofit agencies — jobs that are threatened by GSA’s Commercial Platforms program as it is currently structured,” NIB said in a statement. “NIB and GSA have a strong partnership that has created many employment opportunities for people who are blind over the course of decades. However, GSA’s Commercial Platforms program — which does not require platform providers to ensure compliance with purchasing laws related to the AbilityOne Program even though it requires compliance with other federal purchasing laws — threatens the jobs of thousands of people who are blind working in our associated nonprofit agencies.”

Compliance has been missing for years

At the heart of the protest and the problem with the RFP is what NIB says is noncompliance with the Javits-Wagner-O’Day (JWOD) Act. The law, originally enacted in 1938 and expanded in 1971, mandates that the AbilityOne Commission publish a procurement list identifying commodities and services the commission has determined are suitable to be furnished to the government by nonprofit agencies that employ people with disabilities. Agencies must buy these specific products and services unless specific circumstances warrant an exception.

NIB and its co-plaintiffs filed their lawsuit Feb. 2 just prior to the proposal due date of Feb. 3.

In the protest, NIB says GSA’s RFP does not require the same compliance with mandatory source requirements of the JWOD Act. Instead, the commercial platforms solicitation lets offerors either mark or restrict “essentially the same” (ETS) items on the platform, or “have the ability to identify, highlight and promote AbilityOne products offered by authorized distributors.”

“In other words, the RFP will allow prohibited essentially the same items to be listed for sale on the commercial platforms and does not prevent their purchase by federal agencies in violation of the mandatory source requirements of the JWOD Act,” the protest stated.

NIB, AVRE and NAEP are seeking an injunction to stop GSA from going forward with the RFP as it’s currently constructed and to require GSA to modify the solicitation so that offerors must demonstrate the ability to block the sale of all ETS items.

“Since 2020, we’ve worked hard to resolve this issue with GSA. Having exhausted our options and with GSA now seeking additional providers to be part of the Commercial Platforms program, we have no choice but to ask the courts to require that providers in the Commercial Platforms program comply with the JWOD Act just as they are required to comply with other federal purchasing laws,” NIB said in its statement. “While bringing this action was a difficult decision, we believe it is necessary to protect the employment of people who are blind in the AbilityOne Program.”

Congress required new models

An industry source, who requested anonymity because they do business with GSA and its customers, said the protest isn’t surprising as NIB has been concerned about the treatment of AbilityOne items since GSA kicked off the commercial platform program in 2020.

“NIB is technically correct. The micro-purchase threshold rules are very specific in saying that mandatory sources are still mandatory, even at the MPT level,” the executive said.

An email to GSA seeking comment about the lawsuit was not immediately returned.

NIB said it has been working with GSA for several years to address its concerns about the commercial platform initiative. Three years ago, in June 2020, GSA made three awards to Amazon Business, Fisher Scientific and Overstock Government to test out the commercial online platform concept. The three vendors developed or modified existing platforms to meet the government’s requirements. The goal is to better manage and capture the data around online spending that is already occurring so agencies can better understand where this money is going. The other goal that Congress focused on when it included the authority and mandate to set up the commercial platforms in the 2018 Defense authorization bill is to make buying commercial products better, faster and cheaper — or more like Amazon.

During what GSA calls the proof of concept phase, agencies using the online platforms could buy commercial products below the micro-purchase threshold of $10,000. GSA initially estimated the overall market to be $6 billion, but scaled it back to $500 million in 2021.

Congress in the 2022 defense authorization act required GSA to test out other approaches beyond the current one, leading to the new acquisition effort, which started in March 2022 with a request for information, followed by a draft RFP in September and a final one in December.

NIB says despite its efforts to work with GSA, the evidence showing the impact of the commercial platform effort on nonprofit agencies that employ people who are blind or visually impaired is clear.

NIB told Carnahan in its October letter that the proof of concept confirmed that the new program was not complying with the JWOD Act’s mandatory sourcing requirements. AbilityOne purchases accounted for 2% to 4% of the total dollar value of sales under GSA Advantage! and 9% to 10% of the total value of sales under all GSA purchase vehicles, but AbilityOne purchases were less than 1% of the total value of sales on the commercial platforms.


“This loss of sales is compounded by an increase in sales of ETS items. NIB estimates that less than 1% of the value of sales through GSA controlled contracts (e.g., GSA Advantage!, GSA Global Supply and GSA eBuy) result in non-compliant purchases of ETS products rather than the mandatory AbilityOne products, i.e. lost sales for AbilityOne qualified nonprofit agencies (NPAs),” NIB and the others wrote in the letter to GSA. “Under the current commercial platforms, however, NIB estimates the percentage of non-compliant sales in direct violation of the JWOD Act is at least an order of magnitude higher at approximately 12% of the total dollar value of sales on the platforms.”

In both the letter to Carnahan and in the protest, NIB highlighted the fact that GSA’s other portals, GSA Advantage! and the schedules’ eBuy, could block and substitute if agencies tried to buy non-AbilityOne, or ETS, items.

In fact, in GSA’s response to NIB’s letter, it acknowledged this capability as part of how it ensures compliance with the JWOD Act. But instead of ensuring the commercial platforms were doing the same, GSA explained that it instead “relies on ongoing education, training, and enforcement by agencies” for JWOD Act compliance.

In the end, GSA’s RFI asks exactly how industry can block ETS products from appearing, block and substitute AbilityOne products, or whether other capabilities exist to meet the same end goal. Vendors have until Feb. 23 to respond to the RFI.

“The purpose of this RFI is to understand the feasibility and legality of certain changes to the solicitation that are being sought by the plaintiffs. Responses to this RFI will be used to assist the government in determining whether the current RFP should be left unchanged, amended, or canceled,” GSA wrote. “This RFI does not promise or commit the government to any particular course of action or to make award to any party, nor will it impact any existing proposals received.”

If GSA ends up modifying or even canceling the procurement, it would be an all too common result across the federal community: an issue the agency should have addressed early in the process instead cost industry time, money and frustration. And it all could have been avoided if someone at GSA had taken a more pragmatic approach to solving this challenge rather than, as it appears, digging in the agency’s heels.


What to expect from the new IT/cyber/innovation House subcommittee

Throughout the federal technology community, the reconstitution of the House Oversight and Accountability Committee’s subcommittee on federal IT, cybersecurity and government innovation prompted small celebrations alongside an undercurrent of dread.

The dichotomy of these feelings wasn’t just limited to one specific group. Agency chief information officers and their staffs, contractors and long-time federal observers felt the push/pull of what’s to come.

The celebration:

“From a committee structure standpoint, in many ways the new structure goes back to a proven model that we had when I was on the committee staff in the early 2000s and that more recently proved effective the last time Republicans held the majority with IT separate from more traditional government operations,” said Mike Hettinger, a former House Oversight and Reform Committee staff member and now president of Hettinger Strategy Group. “Given the criticality of the IT modernization and cybersecurity issues currently facing the federal government, from systems modernizations and customer experience to ransomware and zero trust, this targeted focus is the right structure at the right time to make sure they are being addressed.”

The sense of dread:

As my colleague Drew Friedman wrote last week, the House Oversight and Accountability Committee made clear it’s primed and ready to turn up the oversight heat. This mindset surely will trickle down to the subcommittee level.

“If you think about a shift from reform to accountability, that suggests a more adversarial posture. It is going to be less about finding legislative solutions to a particular issue, and more about bringing accountability to people who have potentially done something that Congress thinks is wrong,” said Andy Wright, former staff director and counsel for the committee’s National Security and Foreign Affairs Subcommittee from 2007 to 2011 and now a partner at K&L Gates. “It portrays a deeper change in the posture between the executive branch and the new Congress.”

Taking a closer look at the new chairwoman of the subcommittee, Rep. Nancy Mace (R-S.C.), there aren’t a lot of past signals to pull from.

Rep. Nancy Mace is the new chairwoman of the House Oversight and Accountability Subcommittee on federal IT, cybersecurity and government innovation. (AP Photo/Meg Kinnard)

Julie Dunne, the former staff director and senior counsel for the Oversight committee’s Government Operations Subcommittee from 2017 to 2019, pointed to one example of Mace’s past performance: She co-sponsored, with Rep. Ro Khanna (D-Calif.), the newly enacted Quantum Computing Cybersecurity Preparedness Act.

Another former Hill staff member, who requested anonymity because they didn’t get permission to speak to the press, highlighted Mace’s support for the Federal Rotational Cyber Workforce Program Act, which passed in May 2022, as another example of her bipartisan, technology-focused previous work.

Rep. Gerry Connolly (D-Va.) took over as ranking member of the subcommittee after serving as chairman of the government operations subcommittee for the last two years, where federal IT, cyber and similar issues were folded in back in 2020. Some observers expressed surprise that Connolly took the ranking member role on the federal IT and cyber subcommittee instead of the government operations and federal workforce panel.

“It’s also great to see Rep. Connolly appointed to the subcommittee as ranking member. As we all know, there’s no one in Congress that’s done more to support the federal IT community than Rep. Connolly and I look forward to his ongoing leadership on federal IT and cybersecurity issues,” Hettinger said.

Going through the bills Mace sponsored or was an original co-sponsor of during the last Congress offers a bit more insight into some of her interests. Some of these include:

  • Veterans’ Cyber Risk Awareness Act — The bill would have directed VA to run a communications and outreach campaign to educate veterans about cyber risks, including disinformation, identity theft, scams and fraud spread via the internet or social media.
  • Whistleblower Protection Improvement Act of 2021 — The bill would have established, modified and expanded certain whistleblower protections for federal employees, including with respect to petitions to Congress, whistleblower identity, and protected disclosures.
  • All Economic Regulations are Transparent (ALERT) Act of 2021 — The bill would have required each agency to submit a monthly report to the Office of Information and Regulatory Affairs (OIRA) for each rule the agency expects to propose or finalize during the following year, including information about the objectives and legal basis for the rule, as well as whether the rule is subject to periodic review based on its significant economic impact.

At the same time, she also signed on to bills that many believe are anti-federal worker, or at least reflect questionable support for federal employees, such as one to create Schedule F in law, one to require the Office of Personnel Management to report how much official time federal employees are using and for what purposes, and legislation to prohibit agencies from requiring federal contractors to receive the COVID-19 vaccine.

So what does this all mean for federal IT, cybersecurity and government innovation?

Mace offered a little insight in her press release announcing her new chairwomanship: “Securing our nation’s data, protecting our cyber infrastructure, and studying emerging technologies of the future like artificial intelligence, quantum computing, and blockchain integration is more important today than ever.”

Dunne, now a principal at Monument Advocacy, said the fact that Rep. James Comer (R-Ky.), chairman of the full committee, brought back the subcommittee reflects the importance of these issues.

“The portfolio for this new subcommittee is broad and incredibly important given the importance of addressing cybersecurity risk and the critical need to innovate and promote IT modernization across the federal government,” she said. “I think she and Ranking Member Connolly will work well together on these important issues.”

The former Hill staff member said Mace’s history of working across party lines is an important sign for the future of the subcommittee.

“Connolly’s interest and focus on data centers and the cloud is well known. Mace has demonstrated the ability to work across party lines to address current and future challenges federal agencies face. Legislation such as the Federal Cyber Workforce Act and the Quantum Computing Cybersecurity Preparedness Act shows that she is forward looking and serious about problem solving,” the former staff member said.

And as we’ve seen over the past few years, the subcommittee has plenty of fertile ground to plow and plenty of seeds of change to plant.

There are several questions that the federal IT community will follow around the future of the Federal IT Acquisition Reform Act (FITARA) scorecard, whether the modernization of the Federal Information Security Management Act (FISMA) can finally get across the finish line and whether the number of federal executives with the title of CIO will reemerge as an issue for the subcommittee.

The other big question is how Mace and Connolly will work together. Typically, the subcommittee chairman and ranking member have had a close relationship on these types of issues, as federal IT and cybersecurity are mostly nonpartisan. Connolly worked closely with Rep. Will Hurd (R-Texas) and Rep. Darrell Issa (R-Calif.) on several important federal technology laws, including FITARA and the Modernizing Government Technology Act.

Mace’s history suggests she will be similarly bipartisan on these issues, and that’s a good sign as the subcommittee gets off the ground.

 


Navy CDO Sasala jumps ship to the Army

It’s barely February and there already are big federal technology changes happening among several agencies, including FEMA, the Department of the Navy, the Department of Health and Human Services and the General Services Administration.

Let’s start with a major change in the data community.

The Navy is losing its chief data officer. Tom Sasala is joining the Army as the deputy director of the Office of Business Transformation.

“In that capacity I will help drive process optimizations, reform business practices, help ensure the Army business community is connected to the mission,” Sasala wrote in an email sent to colleagues and obtained by Federal News Network.

Tom Sasala is leaving after almost three years as the Navy’s chief data officer to return to the Army.

Additionally, Sasala said he will be leading the Army’s business system transformation, business process reengineering and overseeing the business system modernization efforts.

OBT, which also directly supports the Army chief information officer’s office, says it drives business transformation initiatives, performs Army business system portfolio management, works toward an integrated management system, develops enterprise architecture and ensures continual business process reengineering. It focuses specifically on improving the business processes and information technology that drive the institutional Army.

Sasala has been the Navy CDO since October 2019 after coming over to the service in April of that same year as the director of data strategy.

He previously worked for the Army for more than two years as the director of the architecture integration center and CDO.

“I came to the Navy to help propel the department forward, and during that time we accomplished more than I could have imagined. We established a comprehensive data management program, created an enterprise data management and analytics platform that is open to all, and standardized data management roles and responsibilities across the department. And those are only the wave tops — we did so much more on a day-to-day basis and our successes are undeniable. All our successes were a team effort and everyone should celebrate the accomplishments. Kudos to everyone and my deepest personal thanks to everyone involved,” Sasala wrote. “Amongst all the highs, there were certainly lows, but we never dwelled on the problems. Instead, we focused on the opportunities. We wanted to positively impact the mission and make every sailor, marine and civilian a true data citizen. While I will admit we have not — yet — accomplished that audacious goal, we have come a long way.”

It’s unclear who will replace Sasala, even on an acting basis. An email to the Navy’s CIO’s office seeking comment was not immediately returned.

During his tenure as the DoN’s CDO, Sasala helped push forward the Jupiter data platform, addressed data duplication across the service and helped improve the culture around using data to drive decisions.

“As I move on in my career, the Department of the Navy has built a solid and unwavering foundation as an institution. The best I could ever hope for as a leader is to leave something more important, and more permanent, behind in my wake. I believe that is what I have done,” Sasala wrote. “Though the mission is not complete, and much work remains, I am confident you will continue pressing forward, pushing boundaries, and making a difference.”

HHS, FEMA hire new executives

Another move inside the government is happening at HHS.

Karl Mathias, the HHS CIO, continues to fill out his staff, bringing Jennifer Wendel over from the FBI to be the deputy CIO. Wendel, who started on Jan. 30, was the acting FBI deputy CIO for three years and worked at the bureau for 17 years.

Mathias became CIO last March and brings over Wendel to fill a key role that has been vacant for some time.

Jennifer Wendel is the new deputy CIO at HHS.

During her tenure at the FBI, Wendel also worked as the section chief for enterprise IT governance in the Office of the CIO where she directed end-to-end IT and data resource operations to streamline delivery to users and mission compliance, according to her LinkedIn bio. She also wrote that she facilitated “data-driven insights impacting enterprise direction on a technical scale through new scaled governance framework. Reengineer[ed] existing programs enabling long-term sustainability and operational capacity. Foster[ed] rapport with personnel as highly reliable leader who inspires confidence and exceeds performance requirements.”

And speaking of technology executives in new roles, you may have heard that FEMA has a new CIO.

Charlie Armstrong is returning to federal service after six years in the private sector. Armstrong spent more than 33 years at Customs and Border Protection, including the last seven-plus as the CIO and assistant commissioner of the Office of Information Technology, before retiring in 2016.

“I owe a lot to Eric Leckey and Tami Franklin [the associate and deputy associate administrators for mission support at FEMA, respectively] for having the confidence in me to bring me back. Make no mistake, Eric can be very persuasive!” Armstrong wrote on LinkedIn. “I’m very appreciative of how welcoming everyone at FEMA has been especially Administrator [Deanne] Criswell and Deputy Administrator [Eric] Hooks.”

Armstrong replaces Lytwaive Hutchinson, who retired last spring after 41 years in government.

Navy, GSA reduce vacancies

The Department of the Navy also filled a long-vacant position, bringing in Jane Rathbun as the DoN’s new principal deputy CIO. The position had been vacant for several years, possibly since Kelly Fletcher moved to the Homeland Security Department in 2018.

The DoN CIO’s office posted on LinkedIn that Rathbun has ascended to this new role.

Previously, she created the chief technology officer’s office and led the DoN’s reorganization of the Program Executive Office for Enterprise Information Systems (PEO EIS) into the PEO for Digital and Enterprise Services (PEO Digital) and the PEO for Manpower, Logistics and Business Solutions (PEO MLB).

Jane Rathbun is the new deputy CIO for the Department of the Navy.

Additionally, she played a key role in drafting the information superiority vision and oversaw implementation of the DoN cloud policy. She also led the development of DoN CIO’s capstone design concept for information superiority and many other strategic guidance documents.

“Jane’s superpower is her undeniable ability to get stuff done. From Flank Speed to the DevSecOps Task Force to her current contributions to the Cyber Ready effort. Jane is a fierce champion and a source of competitive advantage for the DoN,” the office stated in its post.

Over at the General Services Administration, the Federal Acquisition Service tapped a former DoD executive to run its Enterprise Technology Solutions office in the Office of Information Technology Category.

Jake Marcellus will start as the executive director of ETS on Feb. 12, a GSA spokesperson confirmed.

Marcellus, who spent 12 years with DoD before coming to GSA, will oversee FAS’s telecommunications and network services contracts, including the Enterprise Infrastructure Solutions (EIS) vehicle.

Tracey Malick has been acting executive director since January 2022, when previous acting executive director Amy Haseltine moved to a new role as deputy assistant commissioner for the Information Technology Category.

Marcellus came to GSA in May 2021, where he led the DevSecOps effort of the Integrated Award Environment.

While at DoD, Marcellus led initiatives in network management, enterprise service operations and cyber-tool development. He also deployed the department’s first full-service classified mobile tablet service.

Even as these agencies received a new influx of talent, the Veterans Affairs Department lost a key technology executive.

Todd Simpson, VA’s deputy assistant secretary for information and technology and deputy CIO, moved to industry after 13 years of federal service, including the last two-and-a-half at VA.

Simpson joined Carelon as a staff vice president for technology, engineering and operations in November. Carelon provides technology and services for healthcare providers.

At VA, the areas Simpson focused on included moving applications and systems to the cloud and implementing DevSecOps across the department.

Before coming to VA, Simpson served as the Food and Drug Administration’s CIO and chief product officer at HHS.


NSF joins a growing list of agencies reconfiguring their CIO offices

The role of the agency chief information officer is undergoing yet another evolutionary step.

The National Science Foundation is joining a small but growing number of agencies remaking their CIO’s office.

The agency is planning on establishing a new Office of Business Information Technology (BIT) Services that will be led by an executive who will be both the CIO and chief technology officer.

An NSF spokesperson said current CIO Dorothy Aronson will remain a principal advisor to the agency’s director and other senior management on all matters involving information technology, but will no longer hold the title of CIO.

“Information technology, technology innovation, and data are critical to NSF’s mission. These business areas are especially critical as we anticipate significant growth from the CHIPS and Science Act. NSF is expanding quickly and needs to position itself with the right structure and resources so we can continue to provide outstanding information technology services to our staff and the external research community,” the spokesperson said in an email to Federal News Network.

Aronson has been the NSF CIO since December 2017. She has led the agency on an IT modernization journey, building governance and customer experience structures and addressing underlying infrastructure issues.

Three roles into one

NSF released a job announcement for the new combined position on Jan. 24 with applications being due on Feb. 27.

In the opening, NSF describes the new head of BIT as someone who will establish “strategic direction and has overall responsibility on matters involving leadership and direction in the formulation, development and execution of NSF’s IT management program. The incumbent ensures the agency maximizes the use of technology, information and data systems to improve agency mission delivery and performance and provides regular reports to the Office of the Director. The incumbent is also responsible for formulating and articulating the agency’s policy, position or response on current or emerging information technology and its relationship to the NSF and its mission.”

NSF also describes the CIO and CTO roles that align with what a typical position description would entail.

The spokesperson said the Office of Business Information Technology Services came from recommendations made by several groups, including the Evidence and Data Governance Steering Committee and the Business and Operations Advisory Committee subcommittee.

The evidence and data committee recently submitted a proposal for centralized data analytics, and it makes sense to have an integrated IT and data management strategy under one organization.

The business and operations subcommittee recommended NSF review the current IT operational structures for leadership, governance, delivery, operations and oversight to identify opportunities to further streamline processes or realign responsibilities of the CIO’s office. The subcommittee said this would improve overall visibility and effectiveness, and create closer linkages between organizational capability and business objectives.

“Since the individual responsible for strategic IT direction, NSF’s CIO and the individuals responsible for the implementation of technical capabilities and day-to-day operations are in different parts of the organization, ensure that processes defined support maintaining alignment between strategic goals and project and operational activities,” the report stated.

Long history of boosting CIOs

NSF’s decision follows a similar one by the National Institutes of Health earlier this month. NIH decided to separate its CIO from the director of the Center for Information Technology and create two distinct positions after almost 25 years of combining the roles.

And if you add to those two internal changes the push by Congress in the 2023 omnibus bill to ensure CIOs at certain agencies have more authority, it’s easy to see how this evolution is slowly happening.

Both the NSF and NIH decisions to reconfigure their CIO and technology oversight offices are the latest steps in the 25-plus-year evolution of agencies’ lead technology role.

Beginning in 1996 with the Clinger-Cohen Act, Congress established agency CIO positions with designated roles and responsibilities for the first time in law. Lawmakers continued to build up the positions with the E-Government Act of 2002 and the creation of the Office of E-Government and IT in the Office of Management and Budget, giving agency CIOs White House level leadership.

A third law, the 2014 Federal Information Technology Acquisition Reform Act (FITARA), further enhanced CIOs’ authorities and responsibilities.

In between each of those points in time in law, cyber and data breaches, assorted OMB policies and presidential executive orders — the most recent coming in 2018 — have tried to push these positions further into the executive suite.

The pandemic is the latest forcing function to drive change in how CIOs are viewed, respected and included, given the “discovery” of how important technology, networks and data became to meet agency mission needs.

Public vs. private CIOs

But what’s different here is NSF isn’t just hiring a CIO or CTO, but someone with a broader range of skill sets, a person who looks more like a private sector CIO than a typical public sector one.

A September 2022 report from the Government Accountability Office looking at the similarities and differences between private sector and public sector CIOs further shows how change is coming to the role. GAO surveyed private sector CIOs and held a panel discussion with former federal agency CIOs.

What they found helps put a finer point on this continued evolution.

For instance, GAO reported private sector CIOs say a critical factor for a CIO’s success is an ability to bridge gaps between the technical and business parts of the company and promote two-way information exchange.

“Private sector panelists also stated that the concept of shared accountability was a key part of their business culture. Participants also viewed collaboration between the CIO and other business-centric senior executives as essential, and stated that cross-functional teams often must work together to drive business outcomes, such as increasing revenue and customer satisfaction,” the report found.

The changes NSF is ushering in jibe with what GAO’s report highlighted. Both private and federal CIOs discussed the importance of having IT knowledge, but stressed the value of business knowledge and experience.

“The high portion of CIOs with company or industry knowledge or previous CIO experience was explained by expert panel CIOs, who stated that communication and project management skills, as well as knowledge of the business of their organization, were more central to their success,” the report stated. “[I]n order to better ensure that CIOs are able to align IT to the organization’s business goals, panelists stated that it was important for CIO candidates to have an understanding of their organization or industry. The current federal CIO also stated that managerial skills were key to the success of federal CIOs. Managerial skills mentioned by the federal CIO included a detailed understanding of both IT and non-IT aspects of the organization’s mission, an ability to communicate the relevancy of IT initiatives to other leaders, and an ability to build relationships and create partnerships.”

Finally, another key finding from the GAO survey that drives this home concerned shared accountability versus shared responsibility as part of the business culture.

Like what NSF seems to be trying to do, the private sector CIOs said cross-functional teams often work together to drive business outcomes, such as increasing revenue and customer satisfaction.

For example, one private sector CIO stated, “To me, a CIO today is really that individual that can sit at the table and really advise the business or advise the mission organizations around the strategic use of technology and information and help them to do better.”

Another private sector CIO stated, “We are involved in virtually everything that goes on inside the company, one way or the other.”

Between the technology enlightenment that occurred during the pandemic and the ubiquitous nature of data, systems and software in almost everything the government does, from border security to food inspections to buying pens and pencils, this next evolution in the role of the agency CIO seems about ready to take off and, most agree, none too soon.

 


Air Force’s corrective action fails to satisfy unsuccessful bidders for EITaaS contract

Lauren Knausenberger, the Air Force’s chief information officer, hedged her bets back in early December about when the service’s enterprise IT-as-a-service (EITaaS) contract would get off the ground.

“My money is on January to be able to move forward with an award. But I understand that there could be a second round of protests that put us into April,” Knausenberger said after speaking at the AFCEA Nova Air Force IT Day. “That’s just how the process works. I do think that it will be January or April, but I’m going to leave it to our acquisition professionals and our lawyers to do the right thing.”

Turns out Knausenberger read the tea leaves well, as two of the unsuccessful bidders for the 10-year, $5.7 billion EITaaS contract filed a second round of protests on Jan. 3.

Peraton and Accenture Federal Services claim the Air Force’s corrective action after their initial protests of the award to CACI fell short. CACI won the Wave 1 contract in August to provide IT service management, end user devices and various support services, including operating a centralized help desk for the Air Force and Space Force.

Both Peraton and Accenture Federal Services declined to comment on their second set of protests.

But Federal News Network has learned Peraton and Accenture both allege in their protests that:

  • The Air Force misevaluated the strengths and weaknesses of their bids, as well as the strengths and weaknesses of CACI’s bid.
  • The Air Force used evaluation criteria not specified in the solicitation to evaluate bids. Additionally, the agency’s evaluation reflected disparate treatment, and the Air Force’s best-value selection decision was flawed and unreasonable.
  • Peraton and Accenture again have raised conflict of interest challenges stemming from CACI’s allegedly having hired former Air Force employees. The companies also allege that these individuals provided CACI with inside knowledge of, and access to, non-public competitively useful information. Through that information, Peraton and Accenture allege that CACI gained an unfair competitive advantage and therefore CACI should be excluded from the competition.

The Government Accountability Office has until April 13 to decide the case, assuming the Air Force doesn’t again take corrective action or the vendors withdraw their protests.

The Air Force and CACI didn’t immediately return requests for comment.

Peraton, Accenture Federal Services and SAIC protested the Air Force’s first award to CACI in September. GAO dismissed the protest when the Air Force said it would take corrective action shortly after the companies filed the protests.

Same award decision as before

Federal News Network has learned that the Air Force told GAO its corrective action would include investigating the alleged conflicts, based on claims that they resulted in an unfair competitive advantage for CACI. The Air Force also would reevaluate all of the proposals with regard to the competitors’ prior experience and make a new best-value tradeoff and award determination.

The Air Force, once again, awarded the contract to CACI in late December, leading to Peraton and Accenture filing a second protest. SAIC did not file a protest with GAO.

Whether or not Knausenberger’s prediction comes to fruition with an April kickoff, the Air Force is moving toward EITaaS and is preparing in other ways beyond the basic contract.

Lauren Knausenberger is the CIO of the Air Force.

She said the Air Force is looking at some of the underlying efforts that will support EITaaS, like the IT service management platform, workflow processes and enterprise license agreements.

“How do we look at our processes end-to-end? What are the services that we want to go into it as wave one? How do we communicate that to the workforce? What does affect our current workforce and what does not? And then also, what are the things that we have to do to prepare? And that’s something where the Air Combat Command is the lead command or the Cyberspace Capabilities Center or our program executive office up in Boston, they’re looking at the very fine details of how do we execute this starting on day one,” Knausenberger said. “I know that our prime vendors are also going back through the proposal, the transition plan and taking advantage of the extra time to really make sure that they can hit the ground running.”

Few expected the Air Force’s plan to outsource the day-to-day management of its technology network and devices to go smoothly. Like previous attempts, whether the Department of the Navy’s Navy Marine Corps Intranet (NMCI) or the Army’s decision not to proceed with its version of EITaaS, these programs come with a boatload of questions and challenges.

The Air Force tried to address several of those questions with a pilot of EITaaS through an other transaction agreement. Accenture Federal Services was one of four vendors chosen under the OTA. AFS provided compute and storage services to six bases, including Maxwell in Alabama, Offutt in Nebraska and Joint Base Elmendorf-Richardson in Alaska.

But as with any large, long-term contract like EITaaS, losing rarely goes down easy for unsuccessful bidders.

The good news is the Air Force recognizes it needs more than one vendor, unlike NMCI in the late 1990s and early 2000s. Knausenberger said Wave 2 will be multi-award, with a goal of bringing all the services together from Waves 1 and 2.

“With Wave 1, even though we [will] have one prime, there is a lot of content there for small business, I want to say north of 40% for small business. The way that we’ve structured it, it’s such that no one business has all of the skill sets that we need for that work, so a successful prime would be a prime that finds the best of all of those services in the market, and brings them together in a seamless way,” she said.

Consolidation, rationalization top priorities

While the future of EITaaS remains a bit murky, Knausenberger’s plate remains full with assorted other initiatives.

One big initiative is consolidating and better managing enterprise licenses for cloud and software-as-a-service offerings like ServiceNow, VMware and Salesforce.

“With a lot of our large enterprise licenses, we’ve recognized that we have multiple licenses. So I’m going to just make up a use case, let’s say with ServiceNow, that we have 40 different contracts and that we’re spending X millions of dollars. In a lot of cases, the vendors come to us and say, ‘Hey, do you guys know how much you’re spending with us? It’s X millions of dollars and we have 40 different contracts, and we think that we could actually do an enterprise deal with you for the same price, or for only this small amount more, we could do an enterprise deal and anybody new that comes in can get this capability,’” Knausenberger said. “It is a win-win for us and for the vendor if we can take a look at our whole spend and if we can figure this out instead of doing this 40 times where we have to issue 40 contracts and they have to manage 40 contracts with different capability sets across the 40 contracts. We can do it once for the enterprise. And then instead of paying separately, we just have lines that go to whatever mission area is using it.”

Knausenberger said this will make it easier for folks to use the SaaS in the future because they will know it’s accredited and ready to use.

Another priority is application rationalization to consolidate the hundreds of applications in each portfolio. She said each portfolio is on a multi-year journey to upgrade, replace, consolidate and/or move to the cloud. The Air Force has shut down hundreds of applications in the last few years.

Knausenberger said other priorities include improving the Air Operations Center’s bandwidth and resiliency, as well as rolling out enterprise identity and access management services.

“We have a software-defined wide area network (SD-WAN) decision coming up in February. We have data mesh and compute pilot going on in the Pacific with all of those things converging around June. I see that June pilot as a capstone for all of the things that we are doing in that digital space,” she said. “Meanwhile, we are really getting the ducks in a row for IT portfolio management, rolling out EITaaS Wave 1. We are making sure that our zero trust strategy is just really, really tight as we move into fiscal 2024.”

 


First Look

How one Russian group exposed the soft underbelly of federal cyber defenses

In early November, at least two agencies fell victim to a cyber attack from a group based in Russia.

The hacking group Killnet took responsibility on Twitter for taking down sites run by the Commerce Department and the Cybersecurity and Infrastructure Security Agency in the Department of Homeland Security.

While the distributed denial of service (DDoS) attack was more of a headache than anything else, it spurred the Office of Management and Budget to take another look at agency protections against a type of attack that experts say is attempted often but rarely succeeds any longer, yet has re-emerged on the threat landscape, poking at federal IT’s soft underbelly.

The Federal Chief Information Security Officer’s Office and CISA issued a data call about 10 days after the DDoS attack asking for details on how agencies are protecting against these threats and asking agency chief information officers to validate the domains they are protecting.

Federal and industry cyber executives say the concern isn’t so much this one successful attack against Commerce and CISA, but understanding how well agencies are prepared against DDoS attacks that have been rising in Eastern Europe at an alarming rate and against soft targets like airports in the U.S.

“Our view is it did appear that the activity was starting to target the U.S. government, so if that’s the case, let’s be prudent and take this opportunity to fast track some efforts that were underway already to start doing automated discovery and confirmation of content delivery networks and DDoS mitigation protections across all federally owned websites,” an OMB official told Federal News Network. “We have a general view of the current state, and there’s pretty good coverage across federal government in this space. We know that anecdotally from prior data calls and from conversations with CISOs and CIOs at CIO and CISO council meetings. But this is new and different. CISA has created a tool and capability so let’s get, like, full veracity here and make sure agencies have a central viewpoint of all the websites that you own, operate and are responsible for.”

Impact on sites was minimal

The November attack that Commerce and CISA faced resulted in no operational impact, federal executives say.

Commerce’s main website came down for several hours, government sources say. A Commerce official said they were still doing a root cause analysis and didn’t offer any further details about the attack.

At CISA, the DDoS attack took down the front end of the Protected Critical Infrastructure Information (PCII) Program user website for a few hours, according to internal emails obtained by Federal News Network.

Other agencies, including the National Nuclear Security Administration and the Treasury Department, also faced attempts by Killnet to take down their websites with DDoS attacks, but those attempts were unsuccessful.

A CISA official told Federal News Network that the DDoS attack had no operational impact and only affected the external-facing site.

“The application was still up. The PulseSecure server that sits in front of [the site] went down because it couldn’t handle the traffic load,” the official said.

Another CISA official described the attack as a “general resource exhaustion attack,” where the bad actor is sending too much traffic for the site to handle at once.
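
To illustrate the mechanics, here is a minimal, purely illustrative sketch of the rate-based idea behind spotting that kind of flood: count requests per source over a sliding window and flag anything that exceeds what the site can reasonably handle. It is not drawn from CISA’s or any vendor’s actual tooling, and the window and threshold values are made-up assumptions.

```python
# Minimal illustrative sketch of rate-based flood detection; the window and
# threshold values are hypothetical, not drawn from any agency's tooling.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # sliding window length (assumed)
THRESHOLD = 1000      # requests per window considered suspicious (assumed)

recent_requests = defaultdict(deque)  # source IP -> timestamps of recent requests


def record_request(source_ip, now=None):
    """Record one request and return True if the source appears to be
    sending more traffic than the window allows."""
    now = time.time() if now is None else now
    window = recent_requests[source_ip]
    window.append(now)
    # Discard timestamps older than the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > THRESHOLD


if __name__ == "__main__":
    # A single source sending 1,500 requests within a few seconds trips the check.
    flagged = any(record_request("203.0.113.7", now=i / 100) for i in range(1500))
    print("possible resource exhaustion source:", flagged)
```

In practice, mitigations like this sit in front of the application, typically in a content delivery network or a dedicated DDoS scrubbing service, but the principle of shedding excess traffic before it exhausts a front-end device is the same.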

The problem of growing technical debt

The attack highlights a bigger issue agencies continue to face: old technology that either isn’t patched because agencies don’t know it exists, known as shadow IT, or that they can’t afford to replace because of budget limitations.

OMB said in 2016 that agency technical debt totaled more than $7 billion. OMB, agency CIOs, industry and some lawmakers continue to ring the alarm bells about this amount of technical debt because it raises the risk agencies face from cyber attacks.

CISA the operational agency, as opposed to CISA the policy shop that puts out binding operational directives and other guidance, is no different.

Early indications from CISA internal emails obtained by Federal News Network showed that the attack found success because the agency was using legacy servers that were unpatched.

“CISA support has identified the Pulse server is currently running on old codes. Issue has been escalated to Akamai engineers to replace CISA old code on Pulse Secures,” according to a Nov. 4 internal email.

The first CISA official said later analysis determined the root cause was not old code running on the servers, and that the agency had all configurations up to date. The official didn’t explain what had happened, except to say the attack overwhelmed the external-facing site.

But one federal cyber official familiar with the incident questioned why CISA would send out an email stating what the problem was if it wasn’t sure or was guessing. The official said in the first hours of a cyber incident, either you know what the problem is and you tell your team, or you say you don’t know, because guessing could cause other problems.

“I don’t think a CISO would send that note unless it’s not true, as communications during an incident are key. Everyone is fine with not knowing what happened, but if you know you should state it. I think they knew exactly what was wrong and their network and security operations center confirmed it,” said the official who requested anonymity in order to talk about the cyber attack. “Agencies were told to get off PulseSecure more than a year ago because they had a known vulnerability. I think they were shocked it was even there because they should’ve gotten rid of it because they knew about it being a problem.”

The official and other experts called the Killnet attack basic and not much better than a “script kiddie.”

The federal official said Killnet was probably surprised the DDoS attack worked at all.

“DDoS are not complicated attacks. They are script driven. The cloud has enabled DDoS attacks with scale and speed that we haven’t seen before. It’s not highly technical. It’s low rent and this was an opportunity Killnet took advantage of,” the official said. “This will happen again. I’d expect Killnet to come back.”

To be fair, it was the weekend before the midterm election, so many experts give CISA credit for raising awareness at the time.

The data indicates Killnet, or whoever, will be back, which is part of the reason for OMB’s data call. In fact, CISA sent out a governmentwide email on Nov. 4, the day of the successful attack against agencies, warning of a possible spike in DDoS activity.

“We wanted to highlight increased DDoS activity being reported against federal agencies. Several agencies have confirmed impact from this activity. If you are experiencing any DDoS activity, we ask that you please report it to CISA via standard reporting mechanisms and share any relevant information from that activity,” the email, which Federal News Network obtained, stated.

Earlier in October, CISA also updated its DDoS guidance for agencies: Understanding and Responding to Distributed Denial-of-Service Attacks and Capacity Enhancement Guide (CEG): Additional DDoS Guidance for Federal Agencies.

Spike in DDoS attacks in Eastern Europe

OMB reported in the fiscal 2021 Federal Information Security Management Act (FISMA) report to Congress, released in September, that attrition attacks, the category DDoS attacks fall under, accounted for about 1% of all attacks agencies suffered last year. The report states there were 440 known attrition-type attacks last year out of more than 32,000 total incidents.

In the meantime, DDoS activity has picked up massively in Eastern Europe since Russia’s invasion of Ukraine. Eastern Europe is typically the target of about 1% to 2% of global DDoS attacks; since the invasion, it has been the target of up to 30% of global DDoS attacks, said Patrick Sullivan, the chief technology officer for security strategy at Akamai.

“This represents more than a 1,100% increase in DDoS attacks in Eastern Europe compared to trend lines prior to February 2022. We’ve also seen some of these attacks pack quite a punch, with records broken for the greatest number of packets per second of DDoS attacks in Europe,” he said. “There have been DDoS attacks targeting organizations in the U.S. that have been attributed to organizations that have publicly pledged loyalty to Russia in this conflict. U.S. state and local governments, airports and other industries have been targeted by Killnet. Overall, these are more isolated; we are not seeing the massive increase in activity directed at targets in the U.S. like what we have observed in Eastern Europe.”

And hacking groups will continue to look for soft spots, like an unpatched PulseSecure server.

John Pescatore, the director of emerging security trends at the SANS Institute, said agencies should also be concerned about bad actors using DDoS attacks as a distraction to get the security and network defenders focused on one thing and then attacking somewhere else with something more serious like ransomware.

He said DDoS and other basic attack approaches take advantage of old technology that either agencies don’t know exist or haven’t had the resources to upgrade.

“With the PulseSecure vulnerabilities over the past two years, agencies have been super slow to patch them, and the government was a big chunk of their business. The vulnerabilities are getting exploited in variety of ways,” Pescatore said. “Earlier this year, Akamai put out warning about a middle box reflection attack where something vulnerable is used for dual purposes. It became a man-in-the-middle attack doing TCP reflection attack but also a DDoS attack against the organization that owned that machine. We have no idea why the government has been so slow to patch the PulseSecure servers. The patches were coming out and it was not like they were zero days.”

A more scalable cyber defense

This brings us back, again, to OMB’s data call.

OMB wanted agencies to identify and validate all domains by Nov. 18, after which CISA would provide agencies with data from its content delivery network (CDN) reporting and mitigation database.

Agencies were then to “review the findings and provide feedback to CISA on how they will mitigate the risk of a sophisticated distributed denial of service (DDoS) attack against sites not showing a known CDN provider.”

A third CISA official said they are working on a more scalable approach to determine where protections may be lacking and how they could provide visibility to agencies so they can implement mitigations.

“We generally feel that the federal civilian executive branch enterprise is well protected against DDoS, which is why you’re referencing a couple of attacks over a few weeks when there are 10s of 1000s of web apps across government. But we want to make sure there are always stragglers, there is some low hanging fruit. We want to make sure that agency CIOs and CISOs are aware of where protections may be lacking so they can work urgently to put needed protections in place,” the official said about the data call.

The OMB official added CISA’s tool will help identify gaps where maybe agencies thought they had turned coverage on but for whatever reason it isn’t there today.

“It could be something they’ll be able to make a quick fix on or maybe they may have made an intentional decision at one point in time, 10 years ago or whenever, to not have the coverage and maybe in 2022, they should be making a different decision,” the official said. “The interesting thing about these exercises is you learn a lot as you get the data and as you have agencies discover what is going on. From an automated perspective, this is the first time we’re doing it federalwide, so it’s a big exercise. We’re really excited about it because it just feels like, why not take this moment to fast track that and put it on the front burner?”

Whether or not this was a case of CISA, Commerce and possibly other agencies getting caught with their proverbial shields down because of outdated technology or shadow IT, the fact is that the ongoing challenge of old and outdated technology, combined with the ease and low cost of launching DDoS and other basic attacks, makes agencies more vulnerable. While the impact of DDoS and similar attacks may not be great, they do cost federal executives a great deal of time, energy and heartburn. This serves as just another reminder that getting out from under technical debt should be the top priority for the administration and agency leaders.

 


DHS, VA, GSA, OMB hire new federal technology executives

Like the changing leaves, federal executives heading into retirement, moving to the private sector or finding new opportunities in government is an annual rite of fall.

The executives on the move range from NASA’s chief data officer to the Federal Communications Commission’s head of cybersecurity to the Marshals Service’s chief technology officer.

This is by no means a complete list of every executive in the federal technology sector who moved to a new role over the last few months. If I missed any big moves, feel free to let me know.

Let’s start with some folks in new roles in government.

The CTO at the Marshals Service, Christine Finnelle, is now the new director of enterprise architecture at the Department of Homeland Security.

Christine Finnelle left the Marshals Service for DHS in November.

DHS CTO David Larrimore confirmed Finnelle’s move on LinkedIn on Nov. 26.

She has been with the Marshals Service since 2016 and worked for the Justice Department since 2003. She became CTO for the Marshals Service in 2019 to help create the agency’s long-term technology roadmap.

Finnelle is the second high ranking technology executive to leave the USMS this year. Karl Mathias, the USMS chief information officer, moved to the Department of Health and Human Services in March.

Over at the Department of Veterans Affairs, Kim Pugh announced she is the new director of software-as-a-service (SaaS) in the CIO’s office, coming over from the Veterans Health Administration.

Pugh has worked at VA for 18 years, including the last three years as VHA’s director of investment governance services.

The General Services Administration announced Dan Lopez is the new director of Login.gov. Lopez comes to GSA after serving the last three years as the director of software engineering for the Philadelphia city government.

“Dan holds an extensive background in engineering and leadership, and he hails from Philadelphia. Login.gov program is excited to enter this new chapter with Dan at the helm,” Login.gov tweeted on Nov. 28.

According to his LinkedIn, Lopez started in September.


During his time in Philadelphia, Lopez said he was responsible for phila.gov, critical applications ranging from payments middleware (handling more than $1 billion a year) to the city’s campaign finance system, geographic information system applications like atlas.phila.gov and property.phila.gov, and much more.

GSA advertised for the position in February after it was vacant for the better part of two years. Amos Stone had been acting since August 2019.

The Office of the Federal CIO also filled a long-vacant role. Mitch Hericks became the permanent director of federal cybersecurity after serving as acting director since January.

“A year ago, I took a big leap in moving (back) from New York City to DC for a job working with our Federal CIO Clare Martorana and Federal CISO Chris DeRusha. It’s been a wild ride, but driving forward our implementation of the Executive Order on Improving the Nation’s Cybersecurity has been a once in a lifetime opportunity,” Hericks said on LinkedIn in early October. “I’m grateful to continue working with this team, driving security outcomes across the federal government as director for federal cybersecurity.”

Hericks replaces Steve McAndrews, who left in January to be the deputy CIO at the Energy Department’s National Nuclear Security Administration.

DoD, NASA, IRS, FCC looking to fill holes

While these agencies brought in new technology executives, several moved on, leaving agencies to fill key holes.

Two long-time federal technology experts retired.

Kevin Dulany, the director of cyber policy and partnerships in the Defense Department CIO’s office, retired after 38 years of federal service, including 20 years in the Marine Corps.

“I am truly appreciative for all of the support I have received over my career from leadership, peers, and especially the staff who supported me that really made things happen!” Dulany wrote on LinkedIn in September. “I am also blessed with meeting such great people from not only internally within the department, but throughout the federal space and industry … keep charging and PLEASE — when you are pushing a capability to our warfighters, cybersecurity has to be ‘baked in, not bolted on;’ and last but not least, please keep those military members who are ‘forward deployed in not-so-nice neighborhoods’ in your thoughts and prayers!”

Dulany also worked at the Defense Information Systems Agency and led the Defensewide Information Assurance Program for 11 years.

Over at NASA, Ron Thompson, the space agency’s CDO, also called it a federal career on Sept. 30. He retired after over 30 years of federal service, including the last three at NASA and stints at the Agriculture Department, the Census Bureau and HHS.

“Since 1984, when I put on a uniform, I have been serving this great nation as a public servant. I had the opportunity to serve in seven agencies rising through the ranks to the senior executive service — I have been truly blessed. It has been my privilege to serve this wonderful nation and making lifelong friends, benefiting from amazing mentors and leaders throughout the journey,” Thompson wrote on LinkedIn. “As I reflect back on my time, it is amazing how quickly the time passed and my hope is my contributions made the mission better than I found it and I helped others as I have been helped along the way.”

Thompson recently joined Quantum Space as its chief data officer and executive director.

The IRS and the FCC also saw key executives move on.

Ray Coleman, who had been the IRS’s executive program director of the cloud management office for the last year, became the CIO for Koniag Government Services, an Alaska Native-owned IT services firm.

Coleman spent the last 12 years in federal service, serving as CIO for the Defense Contract Management Agency and USDA’s Natural Resources Conservation Service.

Andrea Simpson, the FCC’s chief information security officer, is taking her talents to academia. She joined Howard University in Washington, D.C. in the same capacity.

Simpson had been the FCC’s CISO for two years, including the last six months as acting CIO, and joined federal service in 2013.


CISA’s signature federal cyber program warrants more than a passing anniversary nod

The continuous diagnostics and mitigation (CDM) program turned 10 years old last month. And what a long strange trip it has been.

As agencies move toward zero trust and continue to face an ever-changing cyber threat, it’s clear CDM has hit its stride.

Now the Cybersecurity and Infrastructure Security Agency is positioning the program to bring a level of visibility and proactive response the original framers of CDM only dreamed of back in 2012.

“CDM was built on continuous monitoring that had been mandated under the Federal Information Security Management Act (FISMA) of 2002. Continuous monitoring was a thing. People talked about it. They did it. But they did it in lots of different ways across the civilian agencies. They did it with very little automation. There was certainly no central visibility. Agencies did it in different ways within their components or their elements,” said Betsy Kulick, the deputy program manager for CDM, at the recent FCW CDM Summit. “Most people relied on manual inventories at the end of the year and spreadsheets that offered a picture in time. There was not much beyond that in terms of accuracy, to say nothing of telling you how well protected that particular endpoint or device was. So there were people at the State Department, wise people at the Office of Management and Budget as well as in Congress, who thought that automating it would be the smart way to move and that the state of the industry was such that the tools existed that would allow us to do that. We were funded in 2012 to begin to try to standardize continuous monitoring as the first effort, in terms of device management, but ultimately to go through the whole NIST (National Institute of Standards and Technology) Special Publication 800-53 controls to automate that, to the extent possible, to provide a far more secure way of securing the federal civilian networks. It was an ambitious program; we knew we’d be working at this for 10 years.”

And 10 years later, the CDM program, warts and all, is widely considered a success.

The Department of Homeland Security launched the program in 2012, making awards worth a combined $6 billion to 17 companies.

The idea was borrowed from the State Department, which had set up a system of continuous monitoring and alerting for hardware and software vulnerabilities.

DHS updated the program in 2017 to its current approach, which focuses on using system integrators to help groups of agencies with similar needs, or in similar places, implement approved products to fill specific cyber gaps.

Since 2017, agencies have received, at no charge, a series of tools and capabilities that give them more visibility into their networks through asset, identity, data security and network protection management tools. CISA also provides dashboards at the agency level, plus one that rolls agency data up into a governmentwide picture for CISA. It also helps small and micro agencies through a shared services platform.

Strong support from Congress

The decade of CDM has been far from smooth. Industry protested task orders. Agencies expressed frustration on several occasions about delays in getting key toolsets. DHS ran into bureaucratic, regulatory and legislative obstacles that needed to be cleared. And then there is the ever-present culture change aspect of trusting CISA to help, but not judge individual agency cybersecurity efforts.

But despite a dozen years of challenges, CDM has consistently found support from multiple administrations and from Congress.

Congress has been unusually supportive of CDM and really CISA more broadly when it comes to federal cyber networks. Since 2012, DHS has received more than $2.36 billion specifically for CDM, which also included a sizable chunk of the $650 million CISA received from the American Rescue Plan Act. CISA hopes to receive another $4 billion through 2033 to continue to run and evolve the program.

(Funding figures source: CISA 2021 report to Congress.)

So what did all that money get?

CISA says the foundation for a better, more proactive cyber defense is falling into place.

Richard Grabowski, the deputy CDM program manager, said agencies are seeing real value from some of the work that CISA has led over the last year plus.

“Everything that we’ve been doing over the last 16 months, and in the near term, is about building that collaborative defensive posture. So you see what we see, we can make very helpful recommendations that you can triage and take back at machine data speed,” Grabowski said. “We’ve made investments in the Elastic search tool, in technologies for endpoint detection and response (EDR), helping you get in front of mitigation coverage, making sure that any and all shadow IT has some amount of spotlight on it, and then bringing in other asset classes like mobile.”

The CDM toolset has come in handy during every cyber threat and incident agencies have faced over the last five years. Whether it was the WannaCry ransomware attack, Log4j or any number of other threats, agencies and CISA can turn to the dashboard, built on Elastic, to find more complete data more quickly.

Dashboard expansion coming

Judy Baltensperger, the project manager for the CDM dashboard at CISA, said the dashboard has come in especially handy in addressing requirements in recent binding operational and emergency directives that CISA issued to agencies over the last two years, such as after the SolarWinds and Log4j cyber incidents.

“We were able to share with them what CDM data is actually available, and what kind of automated reporting can we feasibly do. I don’t think people realize how expansive the dashboard is,” she said. “We have about 89 dashboards deployed, 78 of them reporting data. We do have a large amount of coverage across the network now, and we were now at the point where that synergy came together.”

Baltensperger added the dashboard has impacted agencies’ ability to meet specific compliance requirements and address long-standing cyber hygiene challenges such as patching and asset management.

There are several new capabilities coming to agencies from CDM to improve this proactive and collaborative defense posture.

Baltensperger said one of them is cross-cluster search, which will give CISA an even deeper look at the health of agency networks. The capability came in handy during a recent cyber threat involving OpenSSL 3, considered a high-risk vulnerability.

“What that gives us here, at the federal level, is object-level data visibility into the dashboards. So as of this moment, we have about 20 dashboards out of the 78 where we have this object-level data visibility,” she said. “Starting last Friday (Oct. 28), we were able, with that object-level data, to deep dive down to what was being scanned. Within the ecosystem, we have more visibility than we have ever had in the past. That’s expanding with our implementation and enablement of cross-cluster search. It still needs to improve, but that’s a significant improvement.”

Baltensperger added that CISA expects to expand this cross-cluster capability to more agencies in 2023 because it automates information collection, which accelerates how quickly agencies learn whether they have a vulnerability so they can remediate it and reduce their risk.
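CISA hasn’t published how its dashboards implement this, but in the Elastic stack, cross-cluster search generally works by letting a coordinating cluster query indices on remote clusters through a configured alias prefix. Below is a minimal illustrative sketch of such a query over the Elasticsearch REST API; the endpoint, cluster alias, index pattern and field names are assumptions, not the CDM dashboard’s actual configuration.

```python
import requests

# Placeholder coordinating cluster behind the federal-level dashboard.
FEDERAL_DASHBOARD = "https://localhost:9200"

# Prefixing an index pattern with "<remote_alias>:" tells Elasticsearch to run the
# search against that remote cluster and merge results locally. The alias
# "agency_a", the index pattern and the field names are all assumptions.
query = {
    "query": {
        "bool": {
            "filter": [
                {"term": {"vulnerability.id": "CVE-2022-3602"}},  # hypothetical OpenSSL 3 finding
                {"range": {"@timestamp": {"gte": "now-7d"}}},
            ]
        }
    },
    "size": 100,
}

resp = requests.post(
    f"{FEDERAL_DASHBOARD}/agency_a:assets-*/_search",
    json=query,
    timeout=30,  # authentication and TLS settings omitted for brevity
)
resp.raise_for_status()
hits = resp.json()["hits"]["hits"]
print(f"Matching records returned from the remote cluster: {len(hits)}")
```

The appeal of the approach is that the agency’s data never has to be copied into a central store; the federal-level dashboard simply reaches out to the remote cluster at query time.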

Additionally, CISA will upgrade all agency dashboards to version 6 and add a new service under the dashboard to help agencies identify when they are using products that are at, or approaching, end of life so they can replace them and reduce cyber risks.

More agencies moving to shared services

Finally, Baltensperger said another capability that is gaining momentum is around dashboard-as-a-service.

“If an agency does not have their own hosting environment and they would like to pass that on to us, our team can do that for a significant cost savings of about $80,000 to $100,000 per dashboard. What we can do on our side is provide you access to that dashboard. So that means the product gets paid for, all the infrastructure gets paid for, the storage gets paid for, because we are managing a similar type of product and we’re able to repurpose our labor,” she said. “We’ve gotten much more efficient with the number of people that it takes to operate, maintain and upgrade that particular solution because we basically built a shared service off on the side, and we can offer it to all the agencies.”

Currently, five CFO Act agencies are using the dashboard-as-a-service and another seven or eight plan to join in 2023.

“What that means is their dashboard is moving out of their system boundary and we are hosting it on their behalf. Now the data still belongs to them. They’re still responsible for the data. But all of the burden of operating, maintaining, patching, keeping up with the operating system patches, figuring out if you are susceptible to OpenSSL, all of that work is coming over to our team, and we’re already doing it for ourselves,” she said. “What we’re doing is just extending it to the agencies. But it means that we’re funding the infrastructure. And because we’re funding the infrastructure out in the cloud in a shared service manner, we’re able to realize cost savings.”

Shared services, cost savings and, most importantly, better cybersecurity: Those were the initial goals and vision for CDM. No one would claim it was an easy path, and CDM is far from perfect, but it’s clear agencies are better off because DHS, the State Department, OMB and a host of visionaries took a collective leap into the cyber unknown.

It’s not often the government celebrates program successes, especially cybersecurity initiatives. But CISA, OMB and every agency should take a moment, offer a smile or two and delight in what they have accomplished through CDM over the last decade.

And I hope CISA at least had some cake to mark the anniversary and all that is good about the continuous diagnostics and mitigation program.

 


50,000 companies on hold because of GSA’s UEI validation problems


Editor’s Note: Updated on Nov. 2 with comments from GSA about the UEI validation service backlog.

It’s now November, and the General Services Administration still hasn’t fixed the problems with the new Unique Entity Identifier (UEI) validation service that arose when the transition began in April.

And as many as 50,000 companies and grantees are still waiting to resolve validation issues that are causing delays in awards and in getting paid.

The almost six-month old problem continues to cause broad concern across industry and on Capitol Hill.

“GSA briefed our subcommittee staff and is working to recover from a transition to new federal contractor system, which experienced many challenges and shortcomings. GSA did not predict or foresee many of the critical responsibilities of a system designed to identify and certify our private sector partners so they can work with agencies to improve how government operates,” said Rep. Gerry Connolly (D-Va.), chairman of the Oversight and Reform Subcommittee on Government Operations. “While GSA addresses these problems, they must work to reduce the unacceptable backlog. I will continue to work with our private sector partners to ensure that companies get certified efficiently and do not risk financial collapse because GSA underestimated the complexity of the system needed to undergird government’s engagement with federal contractors.”

Connolly wrote to GSA in July asking for a briefing and an update on its progress in fixing the validation service.

A GSA spokesperson said the 50,000 backlog represents a “snapshot in time” of vendor problems that are currently under manual review.

The spokesperson said only a fraction of those 50,000 trouble tickets have been in the manual review process for more than two months.

“The numbers do reflect that we have a high volume of tickets coming in, and we remain focused on completing each review as quickly as possible to minimize burden on businesses and other entities, while maintaining the rigor and integrity of the entity validation process overall,” the spokesperson said.

But while the agency is making progress in reducing the backlog and fixing the problems, Connolly and industry representatives remain frustrated.

Stephanie Kostro, the executive vice president for policy for the Professional Services Council, an industry association, said member companies remain stuck in limbo. While GSA has resolved some of the companies’ issues, PSC has member companies who have been facing significant problems since the summer.

“They can’t submit bids. They can’t get paid from work they completed,” Kostro said in an interview with Federal News Network. “There should be a systemic solution, but all we’ve heard from GSA is that they are working on it. But there is no evidence that the end is near, or any light at the end of the tunnel.”

Grantees impacted on a larger scale

One industry executive, who requested anonymity because they didn’t get permission to speak to the media, said the continued delays in getting validated are frustrating as much for the process as for the time it takes. The executive said, for instance, GSA tells vendors to log in every five days for an update, even if there is no update. And if the vendor doesn’t log in after five days, they are kicked out of the line.

“That’s just insane. Shouldn’t GSA have an email notification or something if there is a change in your status?” the executive said.

Cynthia Smith, director of government affairs and advocacy at Humentum, a global nonprofit working with humanitarian and development organizations to improve how they operate and to make the sector more equitable, accountable and resilient, said her members have not seen much, if any, improvement since April.

“Humentum continues to hear from our members on a weekly or bi-weekly basis with questions and concerns about the delayed processing of their and their local partners’ UEI tickets. We continue to hear about protracted delays, concerns about the seemingly arbitrary closing of tickets prior to their resolution, and local partners’ lack of responsiveness from the GSA’s Federal Service Desk to their inquiries,” Smith said in an email to Federal News Network. “The continued impacts are delays in contracting, programming, and funding local partners to do the work they are best positioned to do – particularly for those foreign local partners who are not able to avail themselves of USAID’s temporary exception, because they are working with funding from State or another US federal granting/contracting agency.”

The continued problems aren’t just impacting vendors, but agencies too. The Defense Department issued a deviation to its acquisition regulations in September, allowing the services and defense agencies to do business with companies who aren’t fully registered in the governmentwide acquisition system.

Kostro said given the Biden administration’s desire to bring more small disadvantaged businesses as well as new entrants into the federal market, it would make sense for GSA to have a little more urgency and communication on how they are fixing this problem.

“I can’t think of a greater disincentive for a company to want to participate in the federal market if you can’t register or get paid,” she said. “I think that would have lit a fire under GSA but we have not seen any evidence of that yet.”

Smith added that as the validation process continues to be hamstrung and with GSA not offering a “clear and viable plan” to remedy the situation, agencies are “working at cross-purposes to its own stated objectives of engaging new and local partners around the world to advance our foreign assistance and national security priorities.”

GSA reducing need for manual reviews

The GSA spokesperson said to date, more than 373,000 entities have successfully completed the validation process. GSA launched the move from the old Dun & Bradstreet DUNS number to the UEI in April after years of planning. The validation piece of the transition became a problem almost immediately.

PSC’s Kostro said there are plenty of examples of simple problems that took weeks to fix, such as an extra space on one side of an “&” or a missing plus-four on a company’s ZIP code.
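These are exactly the kinds of mismatches that basic data normalization can catch before a record ever reaches a manual reviewer. The sketch below is purely illustrative of that idea, with made-up examples; it is not GSA’s actual entity validation logic.

```python
import re

def normalize_entity_name(name: str) -> str:
    """Collapse repeated whitespace and standardize spacing around ampersands."""
    name = re.sub(r"\s*&\s*", " & ", name)    # "Smith &Jones" -> "Smith & Jones"
    return re.sub(r"\s+", " ", name).strip()

def _zip5(zip_code: str) -> str:
    """Return the five-digit portion of a ZIP or ZIP+4, or '' if malformed."""
    match = re.match(r"(\d{5})(-\d{4})?$", zip_code.strip())
    return match.group(1) if match else ""

def zip_codes_match(submitted: str, on_file: str) -> bool:
    """Treat a bare five-digit ZIP as matching the same ZIP with a plus-four suffix."""
    return bool(_zip5(submitted)) and _zip5(submitted) == _zip5(on_file)

print(normalize_entity_name("Acme  Widgets &Sons"))   # Acme Widgets & Sons
print(zip_codes_match("20405", "20405-0001"))         # True
```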

“Roughly 80% of these entities did not need a manual review and therefore proceeded without delay. For the roughly 20% of entities that require a manual review, GSA is surging support to the program and prioritizing accordingly,” the spokesperson said. “We are seeing positive results. Due to GSA’s improved workflow, communications and stakeholder outreach and education, the number of entities able to complete their validation with the first manual review increased by 30 percentage points. Once an entity is successfully validated, they are unlikely to face a similar problem again because their validated information is in the new database for future annual renewals.”

The spokesperson added GSA is continuing to take steps to resolve issues, improve response times and provide immediate relief to companies.

“GSA’s biggest challenge has been the volume of tickets submitted, which has far exceeded expectations,” the spokesperson said. “Right now we are focused on managing the unanticipated volume by surging support and prioritizing entities most at risk of financial impacts, while also making changes to help us better understand and support the needs of our users overall. Additionally, we implemented an automatic, 30-day extension for any existing SAM.gov entity registration with expiration dates between April 29, 2022, and April 28, 2023, and a 60-day extension for all registrations expiring in August and September.”

Other sources confirmed GSA’s efforts to fix the problem, including encouraging agencies and vendors to quickly escalate any problems, particularly around getting paid, and surging people, data and automation to accelerate the ticket resolution process.

Sources also say GSA is trying to address system and workflow processes to help vendors trying to validate their information.

While it sounds like GSA is doing as much as possible, including surging contracting officers and other senior leaders to the problem, it’s clear industry feels more transparency is needed. To GSA’s credit, it has held virtual listening and Q&A sessions as well as keeping its Interact site updated. But at the same time, its alert on SAM.gov hasn’t been changed since mid-September. That’s a long time without any word on the hub of federal procurement.

 


The government’s Section 508 transparency problem

Agencies have a transparency problem when it comes to Section 508. It’s not that agencies are ignoring the law Congress passed 24 years ago to ensure federal technology is accessible to people with disabilities. It’s the lack of discussion, data, evidence or even reporting of progress that is causing concern on Capitol Hill and among other experts.

The Biden administration’s clarion call for diversity, equity, inclusion and accessibility (DEI&A) is ringing hollow unless agencies do more to show and tell how they are meeting both the spirit and intent of Section 508.

“Given the current absence of public, governmentwide evaluations of federal technology accessibility, it is critical that the General Services Administration’s timely data and analysis be made available to Congress so that we may better evaluate compliance with and the effectiveness of existing accessibility laws and programs,” wrote a bipartisan group of senators in an Oct. 7 letter to GSA Administrator Robin Carnahan. “Accessible websites and technology are extremely important to these populations—and the federal employees who provide them services—yet there is mounting evidence the government is not meeting its obligations as required by Section 508.”

Sen. Bob Casey (D-Pa.), chairman of the Special Committee on Aging, is leading the charge to bring some sunlight onto agency 508 efforts.

Efforts that, for the most part, have been in the dark for the better part of a decade.

For example, the Justice Department hasn’t issued a governmentwide report on 508 compliance since 2012. The 1998 updates to the Rehabilitation Act of 1973 required DOJ to issue an annual report.

GSA, as another example, hasn’t made governmentwide 508 compliance summary data public since 2018. The letter from Casey; Sens. Tim Scott (R-S.C.), ranking member of the Aging Committee; Gary Peters (D-Mich.), chairman of the Homeland Security and Governmental Affairs Committee; Rob Portman (R-Ohio), the committee’s ranking member; and Patty Murray (D-Wash.), chairwoman of the Health, Education, Labor and Pensions Committee, pointed out that despite the Office of Management and Budget’s 2013 strategic plan to improve federal technology accessibility requiring GSA to collect this data, nothing has been published for four years.

“One of the things that became clear in the committee’s work looking at the VA was that a lot of the enforcement mechanisms, whether it was GSA or DOJ, were not being carried out in the way Congress had expected or anticipated,” said a committee source, who requested anonymity in order to talk to the press. “A lot of times these types of reports and functions wither on the vine in the absence of attention from Congress. This is about generating some action on the executive branch side.”

The bipartisan letter to GSA requested data about agency 508 compliance by Nov. 14.

A GSA spokeswoman didn’t address the senators’ request directly, but pointed to the “Governmentwide Strategic Plan to Advance Diversity, Equity, Inclusion, and Accessibility in the Federal Workforce,” which highlights GSA’s work with the Office of Management and Budget, the U.S. Access Board and the Federal CIO Council’s Accessibility Community of Practice to review existing accessibility guidance and best practice resources and make updates as necessary to help agencies build and sustain an accessible federal technology environment.

Federal Chief Information Officer Clare Martorana said in an email to Federal News Network that the administration recognizes why accessibility is critical to digital service delivery, customer experience and DEI initiatives.

“We are continuing to track agencies’ progress on accessibility to make sure they are prioritizing accessibility, remediating existing accessibility issues, and are on a path to deliver more accessible IT from the beginning,” she said.

Additionally, OMB says it’s working with agencies to prioritize IT accessibility in performance and budget conversations — especially when an agency is modernizing a website or digitizing a form that impacts service delivery.

“OMB is working with General Service Administration (GSA), U.S. Access Board, and the Department of Justice (DOJ) to improve public reporting and expand automated data collection so that the public can better see government’s progress on accessibility,” an OMB official said. “Currently, OMB requires agencies to report twice per year information about IT accessibility including web accessibility as well as the maturity of their Section 508 program. OMB analyzes these reports, tracks agency progress and uses these to engage in performance management conversations with agencies.”

More than just data that is lacking

Of course, for some agencies like the Department of Veterans Affairs, it’s more than just a lack of transparency. Casey wrote to VA in June and held a hearing in July highlighting consistent problems with VA and other agency websites not meeting accessibility standards.

It’s true some agencies are better than others, but 24 years into the requirements, and especially as technology has made it easier to ensure people with disabilities can access services, federal efforts remain tepid. A June 2021 report by the Information Technology and Innovation Foundation (ITIF) found 50 of the 72 federal websites tested (70%) passed the accessibility test for their homepages. But as ITIF went deeper into those sites, the pass rate dropped to 52% for the three most-visited pages of a given site.

To be clear, not all the blame can be put on agencies’ shoulders. Congress hasn’t really been paying attention either for most of the last decade.

Casey’s hearing in July was the first one since at least 2011. The senator’s request in August for the Government Accountability Office to review agency compliance with Section 508 would be its first major study since at least 2016 if not well before that, the committee source said.

The source confirmed that GAO has accepted the request to review 508 compliance.

“In the course of the committee’s oversight work, it became clear that with DOJ not doing their report, GSA not making the information public and the absence of Congressional oversight, there are some agencies who are doing well, but others who aren’t and this needs more attention,” said the committee source. “We are figuring out whether this nearly 25-year-old law may need some updates to refresh what executive branch agencies are doing.”

While websites and other agency efforts may be falling short in some cases, it’s clear agencies are not ignoring the law altogether. At last week’s Annual Interagency Accessibility Forum, sponsored by GSA, the accessibility initiatives from more than 20 civilian, defense and intelligence community agencies were on display.

“I hope that you’ve seen over the last two days, as well as the last two years, that accessibility is not an afterthought for President [Joe] Biden and all of us who work for this administration. We know that government only works if it works for everyone, and we believe that if it’s not accessible, it’s not equitable, both for the people who are public servants and for the Americans that we serve, which includes about 16 million who have disabilities,” said Katy Kale, GSA deputy administrator, at the event. “Our government and our democracy have a responsibility to ensure that people with disabilities, both visible and invisible, can participate in public life.”

Expanding governmentwide efforts

GSA continues to be at the center of ensuring agencies have the tools and knowledge to meet and exceed 508 standards through its Office of Governmentwide Policy and through its Federal Acquisition Service.

Andrew Nielsen, the director of governmentwide IT accessibility programs in GSA’s OGP, said at the event there are several ongoing initiatives to help agencies meet 508 requirements, including a new tool called the open accessibility conformance report (ACR).

“The benefit that we see from this open ACR is to develop and to define a data schema for a machine-readable version of an accessibility conformance report. So rather than a report produced in a Word document that is then typically in a PDF, we are encouraging people to use the open ACR data schema, the definition for how that data should be relayed in a machine-readable fashion,” Nielsen said at the event. “We can then benefit from the ability to more readily share accessibility conformance information right alongside other product information, specifications and descriptions. So when we’re reviewing other information and making purchasing decisions, we can more readily include accessibility conformance information as part of that.”
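To make the idea concrete, here is a hypothetical, simplified illustration of what a machine-readable conformance record could look like and why it beats a PDF. The field names and values below are invented for demonstration and are not the actual open ACR schema.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CriterionResult:
    criterion: str   # e.g., a WCAG success criterion identifier
    level: str       # e.g., "supports", "partially-supports", "does-not-support"
    remarks: str = ""

# A made-up conformance record; field names are illustrative only.
product_acr = {
    "product": "Example Web App",
    "standard": "WCAG 2.0 AA",
    "results": [
        asdict(CriterionResult("1.1.1 Non-text Content", "supports")),
        asdict(CriterionResult("1.4.3 Contrast (Minimum)", "partially-supports",
                               "Some legacy pages fall below the 4.5:1 ratio.")),
    ],
}

# Because the report is structured data, a buyer's tooling can filter it directly,
# for example surfacing anything that is not fully supported before a purchase decision.
gaps = [r for r in product_acr["results"] if r["level"] != "supports"]
print(json.dumps(gaps, indent=2))
```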

Additionally, GSA plans to create an ACR repository in the coming months to make it even easier for agencies to find and use this information to ensure the products they buy meet or exceed the 508 standards.

Nielsen said a second initiative is to update GSA’s accessibility requirements tool for procurements. He said this, too, will be easier to use and will be published in an open source repository so others can customize it or bring it behind a firewall to use on classified systems.

“We also have a tool, available for federal employees only, that is available for review of solicitations posted on SAM.gov. We are still developing that solicitation review tool. It will use artificial intelligence, machine learning and natural language processing to scrape the information posted in solicitations to identify and then flag those that don’t include any accessibility-related requirements,” he said. “The intent there is to train up our machine learning tool to improve the logic and actually interface with, or reuse, the logic from the accessibility requirements tool. In the future state, it not only will flag solicitations for the owners, but also, using the logic from the accessibility requirements tool, give them recommendations for which language to include in the solicitation. Our hope is to improve our approach to procurement of accessible products.”
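GSA’s tool relies on machine learning and natural language processing; a much simpler keyword-based sketch of the same flagging concept looks like the code below. The term list and sample solicitation text are assumptions for illustration, not GSA’s model or rules.

```python
import re

# Hypothetical accessibility-related terms; not GSA's actual rule set.
ACCESSIBILITY_TERMS = (
    r"section\s*508",
    r"accessib\w*",
    r"\bwcag\b",
    r"accessibility\s+conformance\s+report",
    r"\bvpat\b",
)

def missing_accessibility_language(solicitation_text: str) -> bool:
    """Return True when none of the accessibility-related terms appear in the text."""
    return not any(re.search(p, solicitation_text, re.IGNORECASE) for p in ACCESSIBILITY_TERMS)

# Made-up sample solicitation text for demonstration.
sample = "The contractor shall provide cloud hosting and help desk support services."
if missing_accessibility_language(sample):
    print("Flag: no accessibility-related requirements found in this solicitation.")
```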

Improving testing consistency

In addition to those two acquisition focused tools, GSA and the U.S. Access Board developed an information communications and technology baseline for websites to reduce testing ambiguity and increase consistency of results.

Dan Pomeroy, the deputy associate administrator in the Office of Information Integrity and Access in GSA’s OGP, said at the accessibility forum that the baseline describes how to evaluate conformance to the 508 standards, which align with Web Content Accessibility Guidelines (WCAG) 2.0.

“It’s organized by categories to help users easily identify applicable requirements. It’s important to note that the baseline is not a test process in and of itself, but rather a tool that should be used to create an accessibility testing process,” he said. “While other baselines such as the ICT baseline for software are in the works, the ICT testing baseline for the web is live. It can be found under the testing section at section508.gov.”
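The baseline itself is guidance, not software, but some of the checks a testing process built on it would cover can be automated in a few lines. For example, WCAG 2.0 success criterion 1.1.1 requires text alternatives for non-text content. Below is a purely illustrative sketch, assuming the third-party beautifulsoup4 package and a made-up HTML sample.

```python
from bs4 import BeautifulSoup

# Made-up HTML sample: one image with alt text, one without.
sample_html = """
<html><body>
  <img src="seal.png" alt="Agency seal">
  <img src="chart.png">
</body></html>
"""

soup = BeautifulSoup(sample_html, "html.parser")
missing_alt = [img.get("src", "(no src)") for img in soup.find_all("img")
               if not (img.get("alt") or "").strip()]

for src in missing_alt:
    print(f"Image missing a text alternative: {src}")
```

Automated checks like this catch only a slice of the standards, which is why the baseline emphasizes a repeatable manual test process rather than a single scanning tool.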

Pomeroy said another related effort from the governmentwide IT accessibility team is the creation of an accessibility policy framework.

This guidance aims to assist agencies with assessing accessibility policies across functions like finance or procurement.

“The intent of the framework is to help agencies prioritize which policy documents they should review and update to improve the accessibility information they contain, with the overall goal of improving the digital accessibility of the products or services covered by the policy,” Pomeroy said. “The accessibility framework is currently in development and is expected to be released on Section508.gov later this fiscal year.”

 

 

