Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

3 takeaways from DISA’s forecast to industry day

BALTIMORE — The Defense Information Systems Agency’s annual forecast to industry day on Oct. 29 was a sure sign the federal market is returning to normal.

A nice crowd of maybe 200 industry executives showed up at the Baltimore Convention Center to better understand where DISA is heading in 2022 and beyond. And beyond the usual positioning for meetings and jockeying for a competitive edge, DISA took a more strategic approach than its typical tactical one focused on contracts and timelines.

Throughout the seven-hour day, DISA officials covered technology priorities ranging from satellite communications to Thunderdome — insert “Mad Max” joke here — to cloud and data centers to small business contracting success and struggles.

Unlike in previous years, DISA did more than brain dump upcoming contracts; it detailed how the technology, its priorities and its efforts fit into the Defense Department’s broader modernization strategy.

Here are three major takeaways from DISA’s forecast to industry day:

Enough already

Throughout the day, DISA speakers followed the basic rules of communication: Tell the audience what you are going to say; say it; and then tell them what you said.

In this case, five different speakers drove home the same message: For most products and services we already provide and with the expected budget tightening, we don’t need more of the same.

Air Force Lt. Gen. Robert Skinner, the DISA director, started off with this theme and it continued until mid-afternoon.

“We have too many tools. So how do we optimize what we have first at minimal cost?” Skinner told the audience to lead off the forecast day. “We have to do more with what we have. We need your help so we can take advantage of the capabilities that exist.”

Skinner told industry to make sure the technology is scalable because DoD likes to break things in some form or fashion.

A few speakers later, Steve Wallace, DISA’s newly named chief technology officer and head of the emerging technology office, echoed Skinner’s comments.

“We want you to help us better tune our systems and make sure they don’t drift,” Wallace said. “The capability is only as good as it is implemented.”

Other speakers followed suit. Don Means, the director of the operations and infrastructure center, said industry has to help them “optimize what we have out there,” and help them figure out what they aren’t using to its full potential.

Caroline Bean, the acting director of the Joint Enterprise Services Division, told industry that DISA needs help “optimizing what we have out there that is not being used to its fullest potential. We want to use capabilities to be more proactive and preventative to provide a more seamless customer experience.”

She added DISA wants to use the data to make decisions faster and earlier in the process.

As you can tell, DISA executives followed their talking points around taking advantage of current technologies and tools, especially as the military’s budget is expected to tighten over the next year or two.

DISA’s fiscal 2022 budget request to Congress in May asked for $2.7 billion, down from $3.4 billion in 2021.

DISA also receives money, just under $12 billion, from the services and defense agencies for enterprise services and acquisition support through its working capital fund.

Skinner and other leaders expect those figures to remain flat or drop as DoD’s budget is under debate on Capitol Hill.

Industry should heed DISA’s warnings. Current providers need to optimize what DISA already is using, and new ones better bring the innovation and target specific requirements because DISA is tightening up its spending.

Pilots, MVPs, OTAs, oh my!

If you’ve been paying any attention to DISA over the last 20 years, one thing is clear: They do love their pilots or proofs of concept.

During the industry day, DISA officials mentioned at least five different pilots that are underway or will launch in 2022.

The pilots, along with the increased use of other transaction agreements (OTAs), signal a bigger shift in how DISA buys technology products and services.

Take Thunderdome, its zero trust prototype. DISA said it will soon award an OTA to prove out the concepts that make up a zero trust architecture.

“This is a radical rearchitecting of the DoD’s information networks. That pilot is going to prove out and we will listen to the work being done by other military services including the Air Force to make sure [we’re] doing [the] right things in that space,” Hermann said. “The pilot activity is going to help us inform exactly how we implement this across the entire department.”

The Hosting and Compute Center started a web server container pilot as a key piece of its strategy to modernize its data centers. The center’s director, Sharon Woods, said she wants the data centers to become key enablers of the hybrid cloud environment and is starting to see how that would work through the pilot of one web server.

“We will achieve significant efficiencies, which will let us reinvest our workforce into more complexity going after different problems. It is a team that cuts across our entire organization,” she said.

Woods emphasized this pilot, like all the ones that will come from her shop, must deliver a minimum viable product in less than six months.

“That is rule one when we triage and look at what projects we want to undertake. This product is well underway and, for sure, will deliver in six months,” Woods said. “I would offer to industry when you come to us and offer us different ideas, I don’t want to boil the ocean. We are not interested in doing that. We are interested in identifying bite-sized things we can hit as a team, in partnership, in less than six months, and then go from there and use the momentum from those micro successes to really get after the bigger, fundamental global challenges.”

DISA’s move toward OTAs is another example of this strategic goal of increasing speed to capabilities.

Since 2018, DISA has awarded nine OTA prototypes, moved three into production contracts and has two more in process to be awarded. All of DISA’s OTAs go through Dreamport, a cybersecurity collaboration, innovation and prototyping lab created by U.S. Cyber Command with the Maryland Innovation and Security Institute.

“When a program manager comes to us with an OTA requirement, we will schedule a pitch meeting, and we will have the PM describe to us what this requirement is to make sure it’s a good candidate for an OTA. We have turned several back, saying these need to be [Federal Acquisition Regulation] -based contracts because they don’t fit the criteria,” said Vanessa McCollum, a contract specialist at DISA.

In fact, Jason Martin, DISA’s Digital Capabilities and Security Center director, said the spectrum coordination systems procurement is going through that process now and could be shifted back from OTA to a traditional procurement.

“We’ve seen great success with some of our other OTAs. DISA is really evolving. DISA is using new techniques. DISA is doing things to try to become quicker at delivery,” Martin said. “We are leveraging these different more agile ways of thinking and doing this across the board with experts who are [on] loan to us for a day or two or permanent. Thunderdome will be fundamentally different because of how we are stacking it. When you impact the entire Defense Information Systems Network and you need resources from across the entire agency, you have to think differently.”

The pilots, the OTAs and the proofs of concept all demonstrate how DISA’s thinking is evolving.

Beyond the Thunderdome

Without a doubt, the most excitement during industry day came from DISA’s soon-to-be-awarded OTA for Thunderdome, its test to create a zero trust architecture. DISA is reviewing industry white papers, which were due Sept. 7, and expects to make its choice “soon.”

But beyond the Thunderdome excitement, DISA is taking a lead role in a host of cybersecurity efforts that officials hope are as game changing as Thunderdome could be.

First off, DISA will complete its implementation of the Joint Regional Security Stacks (JRSS) initiative this year, and if Thunderdome proves out, it will start to transition those stacks to the zero trust architecture. DoD launched the JRSS project in 2014 to improve cyber protections. It struggled with implementation during the first several years and the Defense inspector general recommended DoD look for alternatives to JRSS, including zero trust.

At the same time, Martin said DISA will release version 2.0 of its zero trust reference architecture in the coming months.

While JRSS is potentially on the downside of its lifecycle, public key infrastructure (PKI) continues to be a foundation DoD is building on. DISA is looking to take its legacy PKI and modernize it in a hybrid cloud environment.

“We have been running this on-premise for years, but we think we can do a better job, more efficient, more effective, if we move that to a hybrid cloud environment,” Hermann said. “This is very technically complex what we do as it relates to PKI. We need to have really smart people doing this work. I think it’s fair to say within the government we have lost this technical skill set and it’s a niche capability even in industry. So I’m looking for help about how we modernize and move to a hybrid cloud in this space.”

DISA expects to release a request for proposal for PKI modernization support services in the second quarter of fiscal 2022 and make an award in fiscal 2023.

As part of the PKI modernization, Wallace said DISA continues to look for new ways to make identity and access management easier, without losing any of the security rigor.

“One of the side effects of the Commercial Virtual Remote effort is it proved that username and password, and multi-factor authentication components, whether it was biometrics or tokens, worked,” Wallace said. “I see a heavy focus on that as we move forward.”

Endpoint security is another focus area, with an RFP for third-party tools integration help coming toward the end of 2022 and an award in early 2023.

Wallace added DISA plans to build on its success with the cloud-based internet isolation program with a reverse browser isolation effort.

“The cloud based internet isolation program is when a trusted end point [is] talking to untrusted data on the internet. The reverse browser isolation effort flips that and it is untrusted machines or workstations talking to trusted data sources. How do we create that separation and look beyond web application firewalls?” he said.
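Stripped of product names, the two efforts differ only in which side of the session is trusted. Here is a minimal sketch of that routing decision; the mode names and policy logic are mine for illustration, not DISA’s implementation:

```python
from enum import Enum

class IsolationMode(Enum):
    CLOUD_INTERNET_ISOLATION = "render untrusted web content off the endpoint"
    REVERSE_BROWSER_ISOLATION = "expose trusted data as rendered pixels only"
    NO_BROKER = "direct connection, no isolation layer"

def route_session(endpoint_trusted: bool, resource_trusted: bool) -> IsolationMode:
    # CBII case: a trusted endpoint browsing untrusted internet content,
    # so the browsing happens in the cloud, away from the endpoint.
    if endpoint_trusted and not resource_trusted:
        return IsolationMode.CLOUD_INTERNET_ISOLATION
    # Reverse case: an untrusted workstation reaching a trusted data source,
    # so the data is rendered remotely and never lands on the machine.
    if not endpoint_trusted and resource_trusted:
        return IsolationMode.REVERSE_BROWSER_ISOLATION
    return IsolationMode.NO_BROKER

print(route_session(endpoint_trusted=True, resource_trusted=False).name)
print(route_session(endpoint_trusted=False, resource_trusted=True).name)
```

Either way, the web application firewall question Wallace raises is the same: the separation comes from brokering the session, not from inspecting it.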

Two other long-term efforts include breach and attack simulation capabilities and a cyber asset inventory management program.

Wallace said DISA worked with the Joint Force Headquarters-DoD Information Networks on this challenge to ensure vendor capabilities do what they say they do when it comes to a cyber attack.

“What this category of product does for us is it simulates breaches and attacks, and makes sure the white papers that we implemented capabilities on and the Visio drawings actually hold up in a real scenario. It will help us better tune our systems and make sure things don’t drift because as good as a capability can be, it’s only as good as it’s implemented,” he said. “The cyber asset inventory management program is to get an inventory of all the devices and systems that we have out there. I thought we would end up with a network scanner. But what we ended up with was a product that plugs into all the other infrastructure that we already have and all systems and repositories, then does a comparative analysis and looks for gray space. Why does something appear in my Active Directory, but not in my anti-virus product? We are really excited as we start to deploy that. We are working with the Joint Service Provider to help us move that one forward.”
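The “gray space” check Wallace describes boils down to set arithmetic across inventories. A toy sketch, with invented repository names and hostnames:

```python
# Toy version of the "gray space" comparison: pull asset inventories from
# several repositories and flag devices that appear in one but not another.
# Repository names and hostnames are invented for illustration.

inventories = {
    "Active Directory": {"ws-0142", "ws-0198", "srv-mail-01", "srv-db-02"},
    "Anti-virus console": {"ws-0142", "srv-mail-01", "srv-db-02"},
    "Vulnerability scanner": {"ws-0142", "ws-0198", "srv-db-02", "printer-07"},
}

# Union of every asset any repository has ever seen.
all_assets = set().union(*inventories.values())

# Gray space: assets missing from at least one repository.
for repo, assets in inventories.items():
    missing = sorted(all_assets - assets)
    if missing:
        print(f"Missing from {repo}: {', '.join(missing)}")
# Here ws-0198 shows up in Active Directory but not in the anti-virus
# console -- exactly the mismatch the program is meant to surface.
```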


Are 2 associations’ questions to GSA about cloud efforts premature or discerning?

The General Services Administration is facing new scrutiny over two separate cloud computing issues.

Two industry associations are raising concerns about specific cloud ideas that GSA is either considering or discussing.

The Coalition for Government Procurement wrote to Laura Stanton, GSA’s assistant commissioner in the Office of Information Technology in the Federal Acquisition Service, expressing concerns about a recent request for information for a new blanket purchase agreement for cloud services.

At the same time, CompTIA wrote to Skip Jentsch, GSA’s cloud products manager and enterprise architect, about public comments he made about buying cloud services during industry outreach sessions in September.

Kudos to both CGP and CompTIA for trying to get ahead of potential problems. But at the same time, one has to wonder if both are making a mountain out of a molehill.

Cloud BPA questioned

Let’s start with the facts of both situations before we decide how serious these concerns are or need to be.

In CGP’s case, it is asking whether GSA, which sought industry feedback on whether it should create a new governmentwide BPA on top of the schedules for cloud services, is going down a path that would be duplicative and wasteful.

GSA released the RFI in May as part of its market research to see how it can make buying cloud services more efficient.

CGP said if GSA launches this new contract, it would “create an additional layer of cost, administration and process that increases costs for all. The use of generic, governmentwide BPAs simply increases costs and complexity in the Schedules program resulting in new barriers to entry for commercial cloud solutions. The result will be reduced competition, efficiency and value for customer agencies, GSA and its industry partners, including potential new entrants to the market. As such, the approach is inconsistent with the administrator’s goal of streamlining the schedules program.”

The coalition, which hosts Off the Shelf on Federal News Network, details five areas of concern about the potential BPA, including the possible competition and scope of the vehicle and draft concepts on how the competition and task order process would work.

“The RFI and previous generic, governmentwide BPAs also raise questions regarding how GSA accounts for additional fees to customer agencies when those agencies already can compete their unique requirements directly against the top-level FSS contracts, rather than second-level generic BPAs,” the coalition wrote. “The use of a generic, governmentwide BPA also prompts concerns regarding the underlying FSS contract terms. To the extent that GSA is seeking to include BPA terms and conditions that would apply governmentwide, the appropriate avenue for consideration of those terms is at the FSS contract level. To that end, GSA should engage with its industry partners and agency customers to determine whether supplemental BPA terms should be negotiated at the FSS contract level. Many times, an added feature from a BPA could have been included at the contract level.”
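The cost-layering argument is easy to see with rough numbers. Schedule prices already carry the 0.75% Industrial Funding Fee; the sketch below shows how a second-level fee would stack on top. The BPA rate here is purely hypothetical, since GSA has proposed no figure:

```python
# Back-of-the-envelope version of the coalition's fee-layering argument.
# The 0.75% Industrial Funding Fee (IFF) on Schedule sales is real; the
# second-level BPA fee rate below is hypothetical, and treating both as
# additive surcharges is a simplification.

order_value = 1_000_000      # notional cloud task order, in dollars
iff_rate = 0.0075            # GSA Schedule Industrial Funding Fee
bpa_fee_rate = 0.0075        # hypothetical added fee for a generic BPA layer

schedule_cost = order_value * (1 + iff_rate)
layered_cost = schedule_cost * (1 + bpa_fee_rate)

print(f"Buying directly off the schedule: ${schedule_cost:,.2f}")
print(f"Through a second-level BPA:       ${layered_cost:,.2f}")
print(f"Cost of the extra layer:          ${layered_cost - schedule_cost:,.2f}")
```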

Comments concerning or ill-phrased?

CompTIA’s concerns, meanwhile, are more focused on comments Jentsch made at industry days, where he is reported to have said vendor lock-in for cloud services may be justified or even desirable.

The industry association says Jentsch also may have said brand name justifications and single-award BPAs for cloud services may be possible to obtain specific services from a cloud provider.

These comments “may undermine federal procurement law, while promoting anti-competitive practices that stifle price competition and innovation,” the association wrote in its letter.

CompTIA highlights both policy from the Office of Management and Budget and the Federal Acquisition Regulations Council, and federal law, such as the Competition in Contracting Act (CICA), that prohibits or severely limits several of the ideas Jentsch brought up.

“Few markets are more vibrant and subject to rapid change than the cloud computing market,” CompTIA wrote. “No assurance exists that today’s reseller of cutting-edge cloud services will have access to the leading cloud services of tomorrow. Single award BPAs insulate the reseller from competition, ending the need for competitive pricing in a market where prices trend downward and technology improves rapidly.”

Both associations are raising these separate cloud-related issues ahead of any more serious problems, but one has to wonder whether they also are jumping to conclusions too fast.

History is on GSA’s side

Let’s look at the RFI. As most know, an RFI is just trying to understand more about an idea or concept. Technically, no decisions have been made, and the responses from vendors should influence an agency’s direction. So are CGP’s concerns justified?

On one hand, CGP would be remiss if it didn’t offer feedback, but on the other hand, GSA hasn’t announced anything further about its plans with the cloud BPA, so CGP’s issues may be much ado about nothing.

GSA’s own history has shown that specific cloud BPAs don’t work well — see the email-as-a-service BPA from 2011. Few, if any, agencies used it and GSA eventually let it expire.

Agencies are buying cloud services with integration help, not as standalone instantiations. This is why that initial email-as-a-service BPA failed.

This exact issue is one CGP brought up back in May, soon after GSA released the RFI, when it asked whether GSA was repeating history or had learned how to create a successful contract for cloud services.

“BPAs that include sound, fulsome requirements enhance competition, reduce costs, increase value, and lead to positive business and mission outcomes,” wrote Roger Waldron, CGP’s president, in that blog post.

Similarly, CompTIA’s letter could be seen as raising a red flag prematurely. While Jentsch’s comments are disconcerting to say the least, keep in mind he’s also not a contracting officer. If you look at his background via LinkedIn, it’s more technology than acquisition, so it may be better to give him the benefit of the doubt that he was inartful rather than advocating for a new GSA policy.

GSA should respond to CompTIA’s concerns and clarify Jentsch’s comments, even though the agency has worked hard over the last two decades to demonstrate its support of multiple-award and competitive cloud acquisitions.

As for the cloud BPA, it all comes down to the strategy GSA chooses. If the agency decides to run a BPA on top of the schedules — which is always a bad idea in my mind — then it needs to ensure there are defined requirements and expectations versus just a broad-based cloud procurement. If it chooses to develop a new governmentwide acquisition contract or add a new pool or functional area under, say, the new services multiple-award contract currently under development, that may solve some of CGP’s concerns.


Labor makes the case that its CIO reporting structure works despite the IG’s doubts

There seems to be a lot of confusion over whether an agency chief information officer must report directly to the head of the agency as required by law.

Since the Clinger-Cohen Act became law in 1996, the “should they or shouldn’t they” debate has been a central part of many IT modernization discussions.

So let’s set the record straight once and for all: Yes, the Clinger-Cohen Act does require the CIO to report directly to the head of the agency.

To understand this, I turned to Mark Forman, who helped write the bill and is a former administrator for e-government and IT at the Office of Management and Budget.

Forman, who is now executive vice president for Dynamic Integrated Services, found the reference in the Paperwork Reduction Act section of the bill, 44 U.S.C. 3506(a)(2)(A), if you are keeping score at home:

Ҥ3506. Federal agency responsibilities

(a)(1) The head of each agency shall be responsible for—

(A) carrying out the agency’s information resources management activities to improve agency productivity, efficiency, and effectiveness; and

(B) complying with the requirements of this subchapter and related policies established by the Director.

(2)(A) Except as provided under subparagraph (B), the head of each agency shall designate a Chief Information Officer who shall report directly to such agency head to carry out the responsibilities of the agency under this subchapter.

(B) The Secretary of the Department of Defense and the Secretary of each military department may each designate Chief Information Officers who shall report directly to such Secretary to carry out the responsibilities of the department under this subchapter. If more than one Chief Information Officer is designated, the respective duties of the Chief Information Officers shall be clearly delineated.

(3) The Chief Information Officer designated under paragraph (2) shall head an office responsible for ensuring agency compliance with and prompt, efficient, and effective implementation of the information policies and information resources management responsibilities established under this subchapter, including the reduction of information collection burdens on the public. The Chief Information Officer and employees of such office shall be selected with special attention to the professional qualifications required to administer the functions described under this subchapter.”

Forman admitted that it’s a little hard to understand the way Congress wrote the final version. But when you read the above language, there shouldn’t be any doubt.

Yet, 25 years later, despite the requirement being in black and white, despite several executive orders and policies from the Office of Management and Budget, the debate lingers.

IG, agency leadership disagree

The latest example comes from the Labor Department. The Labor inspector general and the department’s associate deputy secretary Nikki McKinney had a fascinating back and forth about the reporting structure of the agency’s CIO.

The IG found Labor’s IT governance structure “does not appropriately align authority and responsibility,” and is “ambiguous, ad hoc and reliant on personnel to fulfill their duties without codified policies or procedures.”

At the heart of the IG’s report is the fact that the CIO doesn’t report directly to the secretary or deputy secretary, and instead reports to the assistant secretary for administration and management (ASAM).

“We found DoL’s organizational structure limited the CIO’s ability to execute IT governance, resulted in infrequent contact with the most senior level leadership and left the ASAM to represent IT issues and concerns for several DoL governing boards instead of the CIO,” the report states.

Sounds pretty damning for Labor, right? CIO not reporting to the agency senior leaders is a recipe for IT disaster stew.

That is, until you read McKinney’s response and dig deeper into the situation, where the debate takes an unexpected turn.

First, McKinney points out the flaws in the IG’s report, specifically how the auditors focused on process instead of outcomes. She said the IG has a lack of understanding about how Labor’s leadership functions, where one-on-one meetings don’t result in major changes.

“Goals and outcomes are achieved by marshalling the organization’s component parts to achieve success,” McKinney wrote. “One might imagine, in the auditor’s view, the CIO is both the quarterback and the receiver, throwing the ball and then catching it. Instead, the department’s career CIO is empowered by DoL leadership, including the ASAM, to implement the vision of the CIO.”

It’s rare to see such an aggressive response to an IG report.

If it’s not broken, why fix it?

Former and current federal officials familiar with how the Labor Department is set up said the issue the IG highlighted has been a sticking point between auditors and the leadership for decades.

“I think at Labor, the current model has been working well. There is value to moving the CIO to report directly to the secretary, but there are also cons like running to the secretary for everyday blocking and tackling. The model is rarely good or bad, it’s really about execution. And the current execution is working well,” said a government official with knowledge of Labor’s CIO office, who requested anonymity because they didn’t get permission to speak to the press. “If the CIO reports to the secretary then it could also become a political position and the secretary brings in their own CIO, and then you get temporary ones who may not have as much success in longer term enterprise reform of IT that is so needed in all of these departments.”

Current and former officials said if the current model isn’t broken, then why fix it?

One former Labor Department official, who requested anonymity in order to talk about the inner workings of the agency, said the leadership believes the current model meets the spirit and intent of the Clinger-Cohen Act, the Federal IT Acquisition Reform Act (FITARA) and all the executive orders and policies different administrations issued over the last 20 years.

“The CIO does have direct access to [the] secretary and deputy secretary whenever he feels the need to talk to them, and they both have reached out directly to [Labor CIO] Gundeep [Ahluwalia] to ask a question or ask for something to be fixed. Gundeep can walk directly into the secretary’s office or call directly,” said a former Labor Department senior executive. “The bottom line is the CIO has the authority when he needs it. I think what the IG misses is the CIO is the IT expert not the secretary or deputy secretary or the assistant secretary for administration and management. When Labor’s leadership is looking to the CIO for input, direction, advice and recommendations, we aren’t telling him what to do, he’s telling us and other leaders what we need to know.”

A former deputy secretary at a cabinet agency with knowledge of Labor said there is no barrier to communication between the CIO and senior leaders but when managing a large department, it doesn’t always make sense to go to the secretary or deputy secretary for every issue.

The former deputy secretary pointed out that no CIO talks to the secretary every day. These departments are too big, too complicated for that kind of constant interaction not just for the CIO, but for any of the senior leaders.

The former and current officials said it’s about effectiveness rather than dotted or solid lines on an org chart.

Effective CIOs are the ultimate goal

The sources said Ahluwalia is at every budget formulation meeting, not just for technology projects. He’s at every IT acquisition board meeting and included in almost every strategic conversation across the mission and program areas.

The fact that budget, human resources, acquisition and department administration all fall under the ASAM means that the CIO is in the meetings and has a connection to all the areas that make the department function, the former officials say.

In the end, Forman and other former federal CIOs said that is what Clinger-Cohen aimed for — having an effective CIO who transforms the agency’s mission.

“From my perspective when I was at OMB, the key to success became the requirement for the agency head to promulgate guidance on the CIO role and authorities for that agency,” Forman said. “It is important to understand the governance model at the time [of the Clinger-Cohen Act] was to hold the secretary/agency head accountable with the support of the CIO. There are few if any hearings today where the secretary/agency head are actually called to task for not properly using the CIO and too many where the CIO is called to testify about their lack of authority, which doesn’t help the situation because it avoids the accountability by a secretary/agency head that doesn’t use the CIO role for the purposes outlined in the Clinger-Cohen Act.”

Suzette Kent, the former federal CIO during the Trump administration, said the laws and policies aimed to ensure there was a clear direction of accountability for execution.

“Today, the direct connection between agency leadership and its CIO is even more critical given the highly digital environment for both workforce and mission activities, resiliency demands and increased security requirements. Direct CIO access structure drives the most effective, timely actions. Less direct interaction structures can dilute the urgency and clarity of communications,” Kent said. “The quality of the outcomes become more dependent on the interpretations of people participating. I have observed teams achieving effective outcomes in less direct reporting structures, but I attribute that to the individuals’ skills and willingness to work in a collaborative nature. Recent times have demanded urgent decisions between agency heads and CIOs and strong pre-existing working relationships were important to those outcomes.”

Maybe it’s time to revisit the true role of the CIO. Few doubt the importance of reporting structures, but maybe a better measure of a CIO’s impact is how the agency transforms. Labor, with its current reporting structure, seems to be doing quite well despite its decision to not exactly meet the Clinger-Cohen Act. We’ve seen CIOs who report directly to the agency head struggle to make a difference. The discussion should center more on the type of person in the role rather than who they report to.


When it comes to supply chain risks, agencies need to know when to hold ‘em, know when to fold ‘em

When it comes to supply chain risk management, the Federal Acquisition Security Council (FASC) is channeling country music superstar Kenny Rogers and his hit song “The Gambler”:

“You’ve got to know when to hold ’em
Know when to fold ’em
Know when to walk away
And know when to run…”

This advice for card playing applies just as well today to federal technology and to the agencies managing their supply chain risks. The more agencies learn about the products and services they are buying, the more they will know when to hold ’em and when to fold ’em.

“The last year and a half educated the world that every company and every country needs to care about their extended supply chain. So when we think about physical and digital IT, an effort to make a router is a physical supply chain, and at the same time, it’s digital because there is information that passes through it,” said Jennifer Bisceglie, founder and CEO of Interos, an artificial intelligence-based third-party risk management company. “If you look at the executive order coming out of the White House, I’m looking at my relationship with China, so larger trade concerns. You have concentration risk based on the lack of raw materials. And then you also have the software supply chain bill of materials that they want to get ahead of which is more of a digital supply chain.”

The Office of Management and Budget and the FASC are putting on their cowboy hats and boots to help agencies know when to walk away and when to run when it comes to both their physical and digital IT supply chains. Chris DeRusha, the federal chief information security officer, told Senate lawmakers in late September that OMB and the FASC will release new guidance in the coming months to help agencies make better decisions about the risk of technology products and services.

“[The FASC] primarily is focused on supply chain risks that have a nexus to national security, foreign threats and others. There is an acute focus by the FASC to make recommendations of exclusion and removal orders for the federal government,” DeRusha said at the Sept. 28 Senate Homeland Security and Governmental Affairs Committee hearing.

Bisceglie said the FASC and other efforts, including a new cyber supply chain risk management (C-SCRM) strategic plan from the General Services Administration, are part of how the government is building its supply chain risk management muscles.

An official with GSA’s Office of Governmentwide Policy said in an email to Federal News Network that the strategy focuses on addressing the agency’s cyber risks within its most important information systems and programs, and on improving the capabilities of their workforce.

“While the primary audience of the plan is directed to GSA’s internal operations, GSA has cast a wider net to socialize the plan. We have begun to communicate to both internal and external stakeholders the efforts outlined by the GSA C-SCRM plan,” the official said. “While GSA is not formally seeking feedback at this time, we recognize this is a highly evolving area, and will revise the plan as needed.”

2012 report from the Senate

For some in government, this muscle building exercise traces back nearly a decade.

In 2012, the Senate Armed Services Committee released a report showing a two-year investigation found more than 1,800 instances of likely counterfeit parts in the Defense Department’s supply chain. Agencies have been well aware of real problems with the security of their supply chains, but have been slow to take real action. There now are more than 30 different supply chain risk management efforts ongoing, from the FASC to the National Institute of Standards and Technology to the Defense Department’s Cybersecurity Maturity Model Certification (CMMC) program.

DeRusha said the FASC has efforts to engage industry and other committees across Congress to address the ever-growing risks to the supply chain.

Bisceglie said supply chain risk management is well past the “hype cycle.”

“We are to the point where things need to be implemented and you are seeing that not just based on the executive orders, but the money being pushed to the Cybersecurity and Infrastructure Security Agency (CISA) and the Commerce Department to actually do something about it,” she said. “It has raised itself in priority, and the pandemic, the multiple examples of ransomware like the [Colonial Pipeline] and SolarWinds, and the problems in the Suez Canal, made this something that is being invested in and that people are responsible for.”

GSA’s strategy shows just how implementation could work at one agency. Under the Federal Acquisition Supply Chain Security Act of 2018, agencies are to establish a formal SCRM program and to conduct supply chain risk assessments. Additionally, the law requires GSA to take actions to provide better assurance that the products, services and solutions it offers and provides to its customer agencies appropriately address supply chain risks.

GSA’s efforts are focused across three strategic objectives:

  • Address GSA’s highest enterprise-level supply chain risks
  • Further mature GSA’s acquisition workforce’s awareness of and capabilities to manage supply chain risks
  • Standardize GSA’s key operational (Tier 2) C-SCRM plans

“This plan focuses on the integration of C-SCRM at GSA’s organizational (enterprise) level, discussing the core functions, roles and responsibilities, and the approach GSA will take to implement C-SCRM controls, processes, governance and compliance across the agency,” GSA stated in its strategy. “To date, GSA has taken some actions at both the enterprise and business line levels, including the creation of some Tier 2 plans. Tier 2 plans are focused on subcomponent organizations or programs within GSA (e.g., Federal Acquisition Service and the Public Building Service’s SCRM plans) and Tier 3 plans will address system-level C-SCRM controls. Both Tier 2 and Tier 3 plans will include metrics, as appropriate.”

Through this strategy, GSA says it is updating policies and running pilots to test out C-SCRM capabilities.

For instance, GSA said in March it was revising its CIO-IT Security-06-30, Managing Enterprise Cybersecurity Risk, and CIO-IT Security-09-48, Security and Privacy Requirements for IT Acquisition Efforts and the C-SCRM incident response for GSA IT systems policies. The GSA official said the agency updated the CIO-IT Security-06-30 in May and the CIO-IT Security-09-48 in April.

An interim acquisition policy is making its way through the formal rulemaking process, and a proposed rule is currently under OMB review; it appears in GSA’s regulatory agenda as GSAR Case 2016-G511, Contractor Requirements for GSA Information Systems.

Vendor risk assessment pilot underway

GSA detailed several pilots it plans or is undertaking including one to create a “risk-based, on-demand device testing to detect potential counterfeit or compromised products,” a “vendor risk assessment tool to illuminate ICT supply chains for select critical programs” and the “software security testing technique for select software products.”

The GSA official said it has not yet awarded the device testing pilot to a vendor but is planning to move forward with establishing this capability.

“The vendor risk assessment tool pilot is nearing the one-year mark, and GSA plans to continue testing these tools to augment our third-party risk management related to internal GSA infrastructure,” the official said. “The software security testing pilot demonstrated that developers tend to focus more on security best practices when required to submit a software bill of materials.”
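For readers who haven’t run into the term, a software bill of materials is just a machine-readable parts list for a piece of software, which helps explain the behavior change GSA observed: every stale dependency becomes visible. A minimal sketch, loosely patterned on the CycloneDX JSON format; the application and component versions are invented and this is not GSA’s actual submission template:

```python
import json

# Minimal, hand-rolled software bill of materials, loosely patterned on
# the CycloneDX JSON format. Names and versions are invented.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {"type": "application", "name": "example-app", "version": "2.1.0"},
        {"type": "library", "name": "openssl", "version": "1.1.1k"},
        {"type": "library", "name": "log4j-core", "version": "2.17.1"},
    ],
}

# A reviewer (or scanner) can now check each listed component against
# known-vulnerable versions instead of guessing what the software contains.
print(json.dumps(sbom, indent=2))
```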

A lot of GSA’s initial effort is focused on its four high-impact systems, adding new requirements as necessary.

Another focus area will be on the workforce where OGP and the Federal Acquisition Service’s Technology Transformation Service will create a C-SCRM journey map for contracting officers and other acquisition professionals.

“The journey map will also be an ongoing resource for GSA’s acquisition workforce as it will breakdown C-SCRM considerations during various milestones throughout the acquisition life cycle. Using a human-centered design process, the journey map will be based on insights gathered from a diverse set of GSA acquisition workforce members across service and staff offices,” the strategy stated. “GSA will leverage information identified in the journey map process and develop or identify workforce training to further invest in long-term GSA acquisition workforce SCRM skills, resulting in an acquisition workforce that is better equipped to address supply chain risks with additional training, certifications, and learning programs across function areas and program offices related to SCRM, including C-SCRM.”

Interos’ Bisceglie said it’s a positive sign that GSA has moved beyond working groups and task forces to actually take steps to secure the supply chain.

“When we move past the strategy and actually put in metrics for what success looks like, then industry will know where to invest and how best to align their efforts,” she said.


What’s in a name? For the Department of the Navy, it signifies resolve

You have to admit, when an initiative or program includes a creative name, there is something more alluring to the effort.

The Air Force with Kessel Run, the Space Force with Kobayashi Maru or the Defense Department’s JEDI — Joint Enterprise Defense Infrastructure — attracted a lot more attention than if they were called what they actually were: faster software development, an operational platform and secure public cloud services.

The Navy may be taking this fun and clever effort one step further.

From Operation Cattle Drive to Operation Flank Speed to Super Nova, the Navy is adding the fun back to IT modernization.

“We have ideas. I think we had a previous undersecretary who was from Oklahoma so the idea of a cattle drive really appealed to him, and we just ran with it,” said Aaron Weis, the Department of the Navy’s chief information officer, after he spoke at the recent AFCEA Northern Virginia Navy IT day in Arlington, Virginia. “We have some great names like Black Pearl [a software delivery initiative] and I think the coolest name in DoD for any program is Kessel Run because I’m a ‘Star Wars’ nerd so I get it. It’s a measure of time not distance. It’s very cool.”

Creativity and fun naming conventions aside, the impact of these programs on the Department of Navy’s IT modernization efforts is becoming clear.

Weis, who has been the DoN CIO since October 2019, has chipped away at what he describes as an infrastructure that is 15-to-20 years behind industry.

“We are making great strides, but we have an infrastructure, much like our fleet, that needs to be recapitalized, and we need to modernize and reinvigorate it in a new way. We have infrastructure that, for the most part, is not supporting the mission. We have an environment that has not done an exemplary job of defending our information, whether that is in our internal networks or in the networks of our partners in the industrial base. We have a lot to do, but we have accomplished a lot over the last couple of years.”

And each of those accomplishments is tied to one of those creative names.

Operation Flank Speed

This is the DoN’s move to Office 365 after the success of the Commercial Virtual Remote (CVR) initiative launched across all of the Defense Department during 2020 at the height of the pandemic.

Weis said they are driving the implementation with “vigor.”

“Across the Department of the Navy, we are at about 400,000 seats deployed into a universe of about 650,000 seats. The Marine Corps has finished its rollout and is on to its next step. The Navy is making great strides and is about halfway through, with a plan to be done in March,” he said. “That is a wholesale reimagining of how we enable sailors, marines and civilians to get their job done and do it from wherever they are. Little things that I took for granted when I was a CIO in industry, like I can do all of that from my phone, I can access my documents from my phone and I can log in from anywhere from my remote desktop and be completely productive in a matter of minutes.”

Weis said this wasn’t the case two years ago as the Navy was living in a technology world that pre-dated 2008. Now, Weis said, the DoN is closer to 2017 and getting closer to 2021 every day.

“We also articulated a cloud vision and strategy and released an enterprise cloud strategy that talked about how we wanted to leverage the cloud and drive innovative solutions and do that at scale,” he said. “Importantly, we started down the path of a naval identity service. We are rolling out identity and access management across the Department of the Navy as an enterprise service that leverages the cloud. We didn’t start with a pilot. We connected the Navy enterprise resource planning (ERP), and that Navy ERP system has 75,000 users and all will be live imminently on the Navy’s identity service.”

The identity and access management enterprise service is part of a new effort across the DoN to give commands access to subscription based technology capabilities.

Weis said the memo detailing this enterprise services effort should be finalized in the coming weeks.

“Where it makes sense, we will put in place underlying capabilities that multiple organizations and systems could use,” he said. “It’s instructional for acquisition. The acquisition community are the ones that will largely benefit from this.”

Operation Cattle Drive

Weis called this a “mysterious program,” which is an acknowledgement that the Department of the Navy is bad at turning off old systems and needs to get rid of them.

“Operation Cattle Drive is an acknowledgement that we need to put all those cattle on a path to market. If you know anything about a cattle drive, the cattle don’t fare well at the end of a cattle drive,” he said. “We are doing that. We’ve already gone through the financial management portfolio. We’ve combined forces with the financial management community because the DoN CIO doesn’t have a lot of superpowers, but one that it does have is it can de-authorize any systems, take away its authority to operate (ATO). Once that happens, Fleet Cyber Command or Marine Forces Cyber Command will snip it off from the network. So combine that with our ability through the assistant secretary of the Navy (ASN) Financial Management and Comptroller (FM&C) to stop the flow of money, we can drive those cattle to market, and that is what we are doing.”

So far, the CIO and financial management community have identified more than $100 million annually of old systems that will go away.

“They wanted to be the first ones and try it themselves and it was a success,” Weis said. “We work with the stakeholders to understand where the opportunities are and work with acquisition and FM&C and help them drive to that end.”

Weis said next up is the logistics IT portfolio.

“We spend more than $1 billion a year to sustain the logistic IT portfolio and it’s full of cattle. We will go after that,” he said. “Along with our own world as we modernize this IT infrastructure, we have all kinds of boat anchors we want to cut off as we retire this legacy infrastructure.”

Weis said Super Nova — another creative name — is the program looking at the assortment of analytics tools and systems across the Department of the Navy.

“The number is rarely or ever one. But it’s something south of what we have today,” he said. “We really don’t need 300 logistics systems. We probably need a number less than that. Is it one? No. Is it 5, 6, 10 or 12? I don’t know. But that is the spirit of Operation Cattle Drive. We will do that for analytics systems under Super Nova.”


Fast and Furious: The Biden administration’s cybersecurity series

If the Biden administration’s cybersecurity effort was a movie, it would be “The Fast & the Furious” series.

Chapter one of the epic was the May executive order where we understood the premise of fast cars, and the cat-and-mouse game of cops and robbers. By the summer, we saw episodes two and three drop through memos around incident response and critical software. Seeing the reaction of the “fans” — or in this case the federal community — the White House doubled down with more action and more drama by releasing the draft zero trust strategy last month.

Just last week, the Office of Management and Budget came through with its latest installment in the series — consider this the “Fast Five,” where the street racing crew must buy their freedom from a drug lord and a federal agent gone bad.

But in the Biden administration’s version, agencies must find their freedom from cyber attackers through the improved use of end point detection and response tools. The new end point detection and response memo details a series of deadlines for agencies and the Cybersecurity and Infrastructure Security Agency (CISA) over the next 90-to-120 days.

OK, I may be stretching it a bit here, but the fourth memo since August, along with the final remote user use case under the Trusted Internet Connections (TIC) 3.0 initiative, probably makes agency chief information officers and chief information security officers feel like they are riding in Dominic Toretto’s (Vin Diesel) 1970 Dodge Charger, which is rumored to have a 900 horsepower engine. The administration hit the pedal, and they are holding on, trying not to throw up.

“EDR will improve the federal government’s ability to detect and respond to increasingly sophisticated threat activity on federal networks,” stated the Oct. 8 memo from acting OMB Director Shalanda Young. “EDR combines real-time continuous monitoring and collection of endpoint data (for example, networked computing devices such as workstations, mobile phones, servers) with rules based automated response and analysis capabilities. Compared to traditional security solutions, EDR provides the increased visibility necessary to respond to advanced forms of cybersecurity threats, such as polymorphic malware, advanced persistent threats (APTs), and phishing. Moreover, EDR is an essential component for transitioning to zero trust architecture, because every device that connects to a network is a potential attack vector for cyber threats.”
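Stripped to its essentials, the “rules based automated response” the memo describes pairs a stream of endpoint telemetry with detection rules that trigger an action. A toy sketch, with invented event fields, rule and response rather than any real EDR product’s API:

```python
# Toy sketch of the telemetry-plus-rules pattern OMB's memo describes.
# Event fields, the rule and the response are invented for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class EndpointEvent:
    host: str
    process: str
    parent_process: str

@dataclass
class Rule:
    name: str
    matches: Callable[[EndpointEvent], bool]
    response: str

RULES = [
    # An Office document spawning a shell is a classic phishing follow-on.
    Rule(
        name="office-spawns-shell",
        matches=lambda e: e.parent_process == "winword.exe"
        and e.process in {"cmd.exe", "powershell.exe"},
        response="isolate host and alert the SOC",
    ),
]

def evaluate(event: EndpointEvent) -> None:
    """Run every detection rule against one piece of endpoint telemetry."""
    for rule in RULES:
        if rule.matches(event):
            print(f"[{rule.name}] {event.host}: {rule.response}")

evaluate(EndpointEvent(host="ws-0142", process="powershell.exe",
                       parent_process="winword.exe"))
```

The visibility OMB touts comes from the telemetry side of that pairing: because every process event is collected, a rule can match behavior, not just known-bad file signatures.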

Meeting the long-term goal of IT modernization

Like each of the previous memos or strategies, the EDR policy lets OMB check off another item on its cybersecurity executive order to-do list.

OMB and agencies have 23 different mandates from the May order.

Steven McAndrews, the director of federal cybersecurity at OMB, said last week at an event sponsored by ACT-IAC and the U.S. Cyber Challenge that the EDR memo, like nearly everything coming from the executive order, is an attempt to drive the right conversations across the government.

“Guidance needs teeth and must have strict deadlines, so collaboration across the government is key,” McAndrews said on Oct. 6. “The President’s Management Council and the budget side of OMB is how we are driving change and ensuring accountability. We are working with our budget colleagues to really articulate this cyber work is incredibly important. The SolarWinds incident kicked this into hyper drive with the budget side about how important cyber and IT modernization is.”

These memos and strategies aren’t just about cybersecurity, just like each “Fast & Furious” movie isn’t just about fast cars and action scenes. OMB is emphasizing the long-term goal of IT modernization through these cyber efforts, from EDR to zero trust to TIC 3.0.

Chris DeRusha, the federal chief information security officer, said at the same event that through these memos and other efforts, including updating the Federal Information Security Management Act (FISMA), OMB is driving a new way of looking at security as part of a modernization strategy.

“It’s a new way of looking at it for an organization. You have to do a lot of business process communication and planning inside an organization to be able to do these new security measures well,” DeRusha said. “I think it’s that maturation that you will see, us doing things to try to push on getting there as fast as we can to adopting the latest security measures. We will move away from very important stuff like adopting the NIST security controls, nothing has changed there, but where we will focus our main attention on and ask about is going to more about tested security measures.”

DeRusha said both the FISMA reform bill from Congress and OMB’s upcoming annual FISMA guidance will focus on tested security measures, whether in application security, pen testing or vulnerability disclosure platforms.

“We want to engage security researchers. We want the help with responsible security disclosures,” he said. “These are tricky to launch. I’ve built and launched these before and it’s a lot.”

OMB’s latest memo gives agencies 120 days to assess their current end point detection and response capabilities, identify gaps and then work with CISA to “enable proactive threat hunting activities and a coordinated response to advanced threats, and to facilitate, as appropriate, network access to CISA personnel and contractors supporting implementation of the EDR initiative.”

Final TIC 3.0 remote user use case

Meanwhile, CISA and the CIO Council have 90 days to develop an EDR technical reference model and develop recommendations to OMB to accelerate the use of these tools governmentwide. Additionally, CISA and the council have 120 days to develop a playbook to implement these tools.

Agencies also received more specifics from CISA about ensuring remote workers are securely connecting back to the agency network or to the cloud to access applications and data. The interim TIC 3.0 remote user use case came about in April 2020 right as the pandemic began. Over the last year, CISA received more than 70 comments on how to improve the guidance.

“[T]he finalized Remote User Use Case provides significantly more depth and detail. The new TIC use case considers additional security patterns that agencies may face with remote users and includes four new security capabilities: User awareness and training, domain name monitoring, application container and remote desktop access,” wrote Eric Goldstein, CISA’s executive assistant director for cybersecurity, in an Oct. 7 blog post. “The final TIC 3.0 Remote User Use Case is aligned to complement CISA’s ongoing efforts to modernize federal networks and support security initiatives driven by the president’s cyber executive order. Ensuring protected and resilient remote user connections to agency-sanctioned cloud services and internal agency services is paramount for CISA and we expect the security guidance will help agencies improve application performance and reduce costs through reduction of private links.”

Additionally, CISA released the TIC 3.0 capabilities catalog and, along with the CIO Council, the pilot process handbook for agencies to test out the remote use case and other use cases.

Each of these episodes continues to detail the action and adventure story around federal cybersecurity. And like the “Fast & Furious” saga, the tale will continue to get bigger, better and more expensive.


VA has had more acting CIOs than permanent ones since 2009

Since 2009, the Department of Veterans Affairs has had more acting chief information officers than permanent ones. And the House Veterans Affairs Subcommittee on IT Modernization is worried about all of this turnover.

After the Senate confirmed Roger Baker in 2009 to be the assistant secretary in the Office of Information and Technology and CIO, VA has had only two other permanent CIOs: LaVerne Council and James Gfrerer. The average tenure of an acting CIO is more than 10 months, including one who lasted almost two years, according to the committee’s research.

Dr. Neil Evans, the chief officer for Connected Care, who is performing the delegable duties of the assistant secretary for information and technology and CIO, became the sixth acting technology leader since 2009 at the end of August.

Rep. Matt Rosendale (R-Mont.), ranking member of the Subcommittee on IT Modernization, said this is one of the main reasons why VA continually struggles with major IT projects.

“VA cannot afford to stand still. Operating under an acting chief information officer is no excuse to tread water,” Rosendale said. “Information technology has nothing to do with ideology or party. It is about results. In the VA’s case, it is about continually improving the quality and accessibility of care and benefits for our veterans. I know government bureaucracies tend to value titles and they resist change. Senate confirmation confers authority. However, it would be unacceptable to merely keep the seat warm until the administration puts forward a nominee and the Senate gets around to acting on it.”

It’s unclear where the Biden administration stands on its plans to nominate a permanent assistant secretary and CIO.

It took the Trump administration about 19 months to nominate Gfrerer, and it took the Obama administration more than two years to nominate Council after Baker served about five years as CIO.

It’s not unusual to have acting CIOs, especially at a place like VA where the assistant secretary requires Senate confirmation. But it’s also why Rosendale is so concerned about the impact of having another acting CIO and whether that will slow down critical programs to serve veterans.

Don’t just tread water

Rosendale wants to make sure Evans has the authority, support from senior leadership and a plan to make progress on IT modernization programs.

“The worst thing for OIT to do … would be to tread water until a new assistant secretary is confirmed. So what are your goals? What have you lined out to achieve as the new head of OIT, during the interim?” he asked Evans.

After some back and forth, Evans said he’s focused on improving the digital experience for veterans, ensuring the infrastructure is reliable and modern and delivering services via platforms to ensure they are agile and can be upgraded as needed.

Rosendale’s concerns come on the heels of a successful past few years for VA.

Evans and his staff released a mid-year update on IT modernization progress during a Sept. 23 press briefing.

“We’ve moved 133 applications to the cloud; we have 82 in progress. There’s roughly 400 in-house-developed apps still floating around out there, but we’ve reduced our custom development from 57% in 2019 to 45% in 2021,” said Todd Simpson, VA’s deputy assistant secretary of DevSecOps, during the briefing. “We continue to move toward that model, where I think we’re getting that security inheritance through the software-as-a-service products and through the commercial-off-the-shelf (COTS) products and through the cloud.”

VA also upgraded its network infrastructure, which has supported more than 12.5 million telehealth appointments over the last 20 months.

“We now have over 95% of our primary care providers, mental health providers and high numbers of our specialists who can deliver care through telehealth. It’s now something that they can do; it is something that is part of their therapeutic choices, what they can offer to patients,” Evans said. “I think we’re going to continue to see growth there. We’re going to continue to see growth in inpatient applications of telehealth and engagement of veterans through technologies closer to where they are.”

Hearing scheduled for October

This momentum is exactly why Rosendale and other lawmakers are worried about VA’s decade-long challenge of retaining a permanent CIO.

Committee aides, however, did not support the idea of a term-appointed CIO as one solution to this problem.

Instead, the committee would like to make sure Secretary Denis McDonough empowers Evans, or whomever the acting CIO is, to do more than tread water on IT modernization.

This issue may come up during the committee’s scheduled hearing in October with VA Deputy Secretary Donald Remy.

VA is one of five cabinet-level agencies without a permanent CIO. The White House nominated John Sherman to be the Defense Department’s CIO, but the departments of Treasury, Interior and Health and Human Services remain under acting CIO leadership. Of those positions, only the DoD and VA CIOs require Senate confirmation.

The good news for VA is that the OIT staff bring a wealth of experience, and many were CIOs themselves, so they know the challenges of keeping the ship moving forward. Add to that Evans’ experience as chief of Connected Care and a practicing physician, and it seems clear that VA’s technology leadership understands the pain that slowing down IT modernization efforts would cause.

Rosendale and other lawmakers just want assurances that McDonough and Remy understand the issue in the same way.


Lawmakers directing ire at VA over another struggling IT project

While attention to the Department of Veterans Affairs’ electronic health record modernization initiative never seems to end, another major IT initiative is struggling and starting to grab the interest of Congress, and not in a good way.

VA is two years into a supply chain modernization effort that is teetering for a variety of reasons, including a court ruling that shut down part of the initiative and the lack of an overall strategy to address standards and the typical technology and culture complexities.

Rep. Mark Takano (D-Calif.) is the chairman of the Veterans Affairs Committee.

“For me as chairman, modernizing VA’s supply chain systems is a high priority, and it does not appear that things are going well with the supply chain system even with the extra CARES Act money that we provided,” said Rep. Mark Takano (D-Calif.), chairman of the Veterans Affairs Committee, during a Sept. 30 hearing on VA IT modernization efforts. “I’ve got to tell you that failure to modernize the system is not an option. I know Dr. Evans you are new to your position as acting CIO, but this is a priority and I hope that you all will focus in on it. I know this is a long-standing problem, but we have to get our arms around it.”

VA has been working on this effort, which some estimate to be a $2 billion project, since 2019, when it agreed to move off its own 30-year-old supply chain logistics management system and onto a 20-year-old system from the Defense Logistics Agency called the Defense Medical Logistics Standard Support (DMLSS).

VA’s goal for this initiative is twofold: One part is to modernize its inventory management system, and the other is to move to DLA’s existing contract vehicle and away from its current set of vendors.

While a Court of Federal Claims decision in July all but stopped the move to the DLA’s medical and surgical contracts for the near future, it’s the other piece of the initiative that is causing deep concern among lawmakers and industry.

In more than a year, VA has implemented the DLA inventory management system at only one site, the Captain James A. Lovell Federal Health Care Center in North Chicago, Illinois.

Takano and ranking member Mike Bost (R-Ill.) and Sens. Jon Tester (D-Mont.) and Jerry Moran (R-Kan.), chairman and ranking member of the upper chamber’s Veterans Affairs Committee, respectively, wrote to the VA in July, after the court’s ruling, asking for more details about the supply chain modernization plan.

“VA must explain how it plans to adapt to this new reality. Bifurcating the VA medical surgical supply chain between VISNs 20 and 6 and the rest of the United States is not a viable long-term strategy,” the lawmakers wrote. “A patchwork of supply chains would undermine standardization, present management challenges, increase waste and create unnecessary complexity. There is not a clear path forward to widespread adoption of the DLA medical surgical prime vendor contracts, and substantial changes must be made to the MSPV 2.0 solicitation.”

Hearing scheduled for November

Roger Waldron, the president of the Coalition for Government Procurement, an industry association whose members are part of VA’s medical supply chain, said this modernization effort has been challenging for many years.

“The VA’s industry partners look forward to engaging with VA leadership on the MSPV program and the IT logistics systems that support it,” Waldron said in an email to Federal News Network. “Stakeholder engagement will ensure the broad input necessary to fully understand and address the technical, management, logistical, and procurement considerations facing the program.”

Since that letter, a committee aide for the majority said they have not heard much from VA about their plans.

The aide said while VA responded to the letter, Takano remains concerned.

“We are getting into the phase of getting deeper into it, and it feels like they are having to do a restart,” the aide said. “The committee plans to have [a] hearing in November on this program.”

During the Sept. 30 hearing, Todd Simpson, the deputy assistant secretary of DevSecOps, said VA is planning to further deploy the inventory management system at the medical centers in the Pacific Northwest, known as VISN 20, in mid-fiscal 2022.

“Our main focus with DMLSS is obviously to support the common core technologies to enable a successful DMLSS implementation. We are focusing on a DMLSS cloud enclave that is going to provide testing and training capabilities to practitioners and users, and that is really where most of our emphasis is right now in our DMLSS journey,” Simpson said.

Two-step implementation plan

A committee aide for the minority added that VA’s decision to continue to implement DLA’s inventory management system, DMLSS, remains disconcerting.

The committee aide for the minority said VA’s current plan is to move fully to DMLSS by 2027, three years longer than initially planned. There are some discussions about accelerating the timeline to complete the move by 2025.

Part of the reason why the DMLSS implementation will take so long, the committee aide said, is that VA will first implement it in on-premises data centers and then move it to the cloud.

The aide said VA could instead skip the initial on-premises implementation and move directly to the cloud through the Defense Health Agency’s system, called LogiCole, when it becomes available in 2025.

“If an inventory management system is the most important thing, which we agree with, let’s go out and identify the best one and buy it,” the aide said. “We know VA struggles with multi-year, multi-billion dollar big bang IT programs. We believe we can avoid doing another one of those and meet the requirements for a modern system in a better, faster, more agile way.”

In many ways, the DMLSS effort seems stuck in VA’s old, waterfall-style way of thinking about IT modernization. Given the challenges of the pandemic, it would make sense for VA to change gears and look for a more modern, agile, cloud-based system, since medical supply chain management is hardly a challenge unique to VA.


GSA loses 3 technology execs; DHS, Air Force, FDA gain new ones

The General Services Administration saw two long-time career executives head to the private sector and a third executive take a job with a new agency.

At the same time, the Air Force and the Homeland Security Department filled open technology executive positions. And the number of acting agency chief information officers continues to shrink.

Welcome to another edition of As the Technology World Turns, where the drama happens in the boardroom, not the bedroom, and when folks reappear it’s usually after a brief vacation and not coming back from the dead.

The latest movements are centered on GSA, for some reason.

Stacy Riggs, GSA’s former lead of the human resources Quality Service Management Office (QSMO), retired from federal service after more than 15 years.

Dominic Sale, the deputy assistant commissioner for General Supplies and Services in GSA’s Federal Acquisition Service, left after 15 years in government.

Dominic Sale left GSA after seven years to join the private sector.

Riggs and Sale both are joining Dynamic Integrated Services, a service-disabled veteran-owned business. They join Mark Forman, the former administrator of e-government and IT at the Office of Management and Budget, John Condon, who co-founded the Ambit Group and was president of Touchtone Consulting, and several other long-time government contracting veterans to lead this IT and management consulting firm.

Riggs and Sale bring a broad range of experience to DIS.

Sale worked at the Transportation Department as a program analyst before heading over to OMB for six-plus years where he focused on data analytics, federal policy development and oversight of agency IT budgets. He joined GSA’s Office of Governmentwide Policy in 2014 and also worked for the Technology Transformation Service.

During his time at GSA, Sale helped lead OGP’s efforts to improve the Federal Identity, Credential, and Access Management (FICAM) process, oversaw DotGov domain management, ensured agencies had modern IT accessibility guidance and tools, and pushed forward the data center optimization initiative. At TTS, Sale helped start successful initiatives using emerging technologies like robotic process automation, and supported the organization’s drive to promote innovation across the IT modernization spectrum.

Riggs spent two years at the Agriculture Department in the office of the chief information officer before joining GSA in 2009 also as part of OGP.

She served as an associate CIO for enterprise planning and governance for the agency before joining FAS. Along with her work at GSA, Riggs served on the board of AFFIRM, including as president of the volunteer, non-profit organization in 2020-2021.

During her tenure at GSA, Riggs worked on several high-profile initiatives, including category management and the human resources QSMO.

The third person to leave GSA over the last month or so is Bryan Lane, the director of data and artificial intelligence.

Industry sources confirm Lane moved to the Federal Deposit Insurance Corporation last week as its new chief of Business Intelligence Services.

Bryan Lane left GSA to join the FDIC.

Lane also was the co-founder of the Artificial Intelligence Center of Excellence, part of the IT Modernization Centers of Excellence. In that role, he helped the CoE work with the Defense Department’s Joint AI Center to design an agile acquisition strategy for AI, provided unified program management and infrastructure support, and developed an AI Capability Maturity Model and AI Workforce Model.

Lane came to GSA in 2019 and previously worked in industry and for the Defense Department, including serving five years in the Marine Corps.

Along with Riggs and Sale leaving federal service, another long-time technology executive, Stacy Dawn, retired after more than 24 years of federal service. Dawn, who was the senior advisor to the CIO for cybersecurity and privacy at the Department of Housing and Urban Development before leaving in August, joined CGI on a part-time basis working on cybersecurity issues.

She also held similar roles at the Export-Import Bank and the Securities and Exchange Commission during her career.

Portia Crowe, who served as the Army Futures Command’s Network Cross Functional Team chief data officer for two years, left in July to join Accenture Federal Services as its new chief data strategy director for Defense.

Crowe also served as the chief of Cyber Engineering and Operations and as CIO for the Army’s Program Executive Office Command, Control, Communications-Tactical (PEO C3T).

Not all of the people on the move are leaving government, though.

The Department of Homeland Security named David Larrimore as its new chief technology officer. He replaced Brian Teeple, who left in October 2020 to join industry.

Larrimore joined DHS headquarters after serving as a solutions engineer at Salesforce for the past two years. This is his third stint at DHS. He served as the CTO for the Immigration and Customs Enforcement directorate for two-plus years and worked in the CFO’s office for two years starting in 2009. Larrimore also worked at GSA and the Agriculture Department.

Along with DHS, the Air Force filled its vacant CTO role. Jay Bonci joined the service in August after spending the last 14 years with Akamai Technologies.

Bonci replaced Frank Konieczny, who retired in February 2021 after more than 10 years in the CTO role.

You also may have missed that two agencies filled key CIO roles.

The Biden administration tapped another state and local technology executive to take over as an agency CIO. The Transportation Department named Cordell Schachter to replace Ryan Cote, who left in January, as its CIO. Schachter comes to Transportation after spending the last 13 years as the New York City Department of Transportation’s chief technology officer and CIO. Before joining city government, Schachter worked at several large companies, including IBM and Siemens. He inherits a $3.5 billion IT budget and 31 major projects at DoT.

The Food and Drug Administration removed the “acting” title from Vid Desai’s CIO title and put him in charge of its new Office of Digital Transformation.

Vid Desai is the new CIO at the FDA, where he leads the Office of Digital Transformation.

Desai has been acting CIO since April 2021 and joined the agency in 2019 as its CTO.

Desai has worked in industry for 30 years, including as a technology executive with several medical device and services companies.

As the permanent CIO, Desai inherits an agency in the middle of an IT modernization journey. Over the past few years, the FDA has been moving applications to the cloud, developing software-defined networking capabilities and taking more advantage of its data.

If you know of a long-time federal executive who recently joined your agency or left government, let me know via email.


DoD planning to create big data platform to better understand supply chain risks

While the future of the Defense Department’s Cybersecurity Maturity Model Certification (CMMC) initiative is in “wait and see mode,” the Pentagon is far from sitting still when it comes to protecting its supply chain.

Publicly, DoD announced a new supply chain resiliency working group on Sept. 3, “to address systemic barriers currently limiting supply chain visibility, conduct resiliency assessments and develop effective mitigation actions.”

And privately, Federal News Network has learned DoD is asking vendors for feedback about how to establish a new blanket purchase agreement for supply chain data and information sharing.

In late July, DoD’s Office of Acquisition and Sustainment sent out a request for information asking for feedback on how best to “provide DoD and affiliated federal agencies with illumination of critical defense industrial base (DIB)-related technology and other sector supplier networks (private and publicly-held companies) along with single network illuminations on affiliated companies and personnel deemed critical to the federal government, on an on-going basis. Data should be collected on the suppliers, their capabilities, their financial and operational health, among other factors deemed relevant by the federal government.”

Industry sources say DoD is collecting recommendations from different defense agencies and military services, including the Defense Contract Management Agency, the Defense Counterintelligence and Security Agency, the assorted military department and agency chief information officers, and others about what they would want in a BPA vehicle.

“These would be pre-vetted suppliers and vendors around supply chain risk data. Part of the deliverables of these commercial providers will be artifacts that can be collected once and shared across the military,” one industry source said. “The goal is to make sure the Army, the Navy, the Air Force and the defense agencies are not paying for the same thing over and over again.”

Eight industries under analysis

The RFI is seeking broad information from eight different industries, including pharmaceuticals, aerospace and defense, semiconductors, biotechnology and others.

It wants the information to live in the cloud and to include artificial intelligence and machine learning tools to perform risk analysis of about 100,000 firms, including the Fortune 1,000.

DoD wants all of this data in “a commercial due diligence software platform for automated vendor vetting, supply chain vendor vetting, and affiliated entity vetting to continuously and dynamically inform supplier health. The software must be immediately deployable, ready to immediately run industrial health assessments and supplier vetting at the execution of the award, for enterprise use in vetting the vendors, associated personnel, and supplier networks associated with companies that will provide services, supplies, goods, and materials under this authority. The software must compile, process and display information of relevance based on pre-configured risk events relevant to the supply chain risk management (SCRM) use case. All content returned must have its provenance and date/time captured and fully auditable.”

More specifically, among the capabilities the Pentagon wants the platform to provide is the ability to (a notional code sketch of such a risk record follows the list):

  • Identify any companies with foreign ownership, control and influence (FOCI) concerns to include adversarial finance risk indicators, and be able to vet and continuously monitor foreign personnel to inform FOCI risk.
  • Continuously monitor for supply or production shortages within the supply chain.
  • Automatically flag and compile in one report industrial health risk and derogatory issues on companies and people to include, but not limited to, criminal proceedings, civil offenses, reputation/brand issues.
  • Continuously monitor companies and associated entities and people for industrial health risk or derogatory flags impacting trustworthiness and eligibility for continued access to government information on up to a daily basis.
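To make those requirements concrete, here is a minimal, purely notional sketch of how one of the RFI’s asks, a continuously monitored supplier record with provenance, date/time capture and an auditable trail, might be modeled. Every class, field and risk category below is a hypothetical illustration, not DoD’s actual data model or any vendor’s product.

# Notional sketch only: the RFI describes requirements (pre-configured risk
# events, provenance and date/time capture, full auditability); the names
# here are hypothetical illustrations, not DoD's data model.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class RiskEvent(Enum):
    """Pre-configured risk events drawn from the RFI's capability list."""
    FOCI_CONCERN = "foreign ownership, control or influence"
    SUPPLY_SHORTAGE = "supply or production shortage"
    DEROGATORY_ISSUE = "criminal, civil or reputational issue"

@dataclass
class Finding:
    """One piece of returned content, with the provenance the RFI requires."""
    event: RiskEvent
    detail: str
    source: str  # where the content came from (its provenance)
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class SupplierRecord:
    """A supplier under continuous monitoring."""
    name: str
    findings: list[Finding] = field(default_factory=list)
    audit_log: list[str] = field(default_factory=list)

    def flag(self, finding: Finding) -> None:
        """Record a finding and append to a fully auditable trail."""
        self.findings.append(finding)
        self.audit_log.append(
            f"{finding.captured_at.isoformat()} {finding.event.name} "
            f"via {finding.source}")

    def daily_report(self) -> list[str]:
        """Compile flagged issues into one report, as the RFI asks."""
        return [f"{f.event.value}: {f.detail}" for f in self.findings]

if __name__ == "__main__":
    supplier = SupplierRecord("Example Widgets Inc.")  # hypothetical company
    supplier.flag(Finding(RiskEvent.FOCI_CONCERN,
                          "minority stake held by a foreign investor",
                          source="commercial ownership data feed"))
    print("\n".join(supplier.daily_report()))

The point of the sketch is the shape of the problem: each automated flag carries its source and timestamp so the record stays auditable, which is the property the RFI stresses repeatedly.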

Christine Michienzi, the chief technology officer for the Deputy Assistant Secretary of Defense (DASD) for Industrial Policy in the Defense Department, said the need for an enterprise view of the risk and resiliency of the defense supply chain was part of the reason to establish the new working group.

“The services have their efforts. [The Office of the Secretary of Defense] has their efforts. But there needs to be this collaborative, coordinated response,” Michienzi said at the recent Intelligence and National Security Summit sponsored by AFCEA and the Intelligence and National Security Alliance. “The supply chain resiliency working group is going to be looking at things like how do we get greater visibility into the supply chain? How do we better identify risks and issues before they happen? How can we be proactive? How can we put remedies in place? And so that activity is ongoing for the next two years. And the tools and the data are going to be a big focus of that activity.”

Supply chain task force recommendation

This RFI and the potential blanket purchase agreement are trying to address what Michienzi said is the big problem for DoD: a lack of visibility across the supply chain.

The BPA likely is an outgrowth of DoD’s supply chain task force recommendation.

“From a DoD perspective, we have to understand the interdependencies because a certain company may know who’s in their supply chain, but they don’t understand which other companies are, which other systems are also using that same supply chain, and therefore makes it more vulnerable than they realize,” she said. “At the DoD level, at the Office of the Secretary of Defense level, we have that visibility into all the systems that use all these suppliers, if we could just get the data down to the lower levels of the supply chain. So we’re starting with some of these illumination tools that use AI. Those are a good starting point, but they’re not the end-all-be-all. That information needs to be there verified and validated. And then we need to understand, okay, what are their capabilities? What are their issues? Are they financially healthy? How much capacity and capability do they have, et cetera, before we can do a complete risk assessment? So we are definitely working toward that goal.”

