Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

Obituary: Jeff Koch was a ‘renaissance man’ for federal IT

The federal technology community is mourning the loss of Jeff Koch.

You may not know the name and that’s OK. But you’ve probably been impacted by Koch’s creative and practical work on federal technology and management issues over the past 20 years.

Jeff Koch, who passed away suddenly earlier this month, served in both the Bush and Trump administrations.

Koch, who served as the Labor Department’s deputy assistant secretary for administration and management for the last year, passed away suddenly Nov. 3 from liposarcoma, a rare form of cancer that begins in the fat cells. He had been battling the disease since 2015, going in and out of remission several times.

Koch was 55 years old and is survived by Patty Stolnacker Koch, his wife of seven years. The couple is expecting their first child in January.

“Jeff’s sudden passing shocked and saddened his many colleagues and friends at the Department of Labor,” said Pat Pizzella, deputy secretary of Labor, in an email to Federal News Network. “Those of us who worked with Jeff at DOL during the Bush administration and the past year will miss his keen intellect and sharp sense of humor. Jeff’s combined expertise in classical music, personal computers and guns made him always fun to be part of any conversation.”

Koch, who was known for his twin passions of classical music and the Boy Scouts, was a true public servant. After a short time in the private sector, Koch found his third passion – good government. He came to Washington as chief of staff for Rep. Pete Sessions (R-Texas) in 1998, and moved to DOL as its associate chief information officer in 2002.

“It was a shock that we are here but we’ve come to say Jeff Koch is worthy of the accolades he will receive in heaven,” Sessions said at the funeral service on Nov. 10 in Alexandria, Virginia. “Jeff excelled in the exuberance of life and shined in the light of other people.”

Koch made his biggest impact on federal service during his time as an e-government portfolio manager at the Office of Management and Budget, where he worked on government-to-government projects.

Tim Young, who was the deputy federal CIO during the Bush administration and an e-government portfolio manager at OMB, said Koch had an “unwavering commitment” to improving federal technology.

“Jeff was successful in getting so much done because of his poise, persistence, and persuasion,” said Young, who now is a principal with Deloitte, in an email to Federal News Network. “Jeff was the colleague you went to when you had a large, complex, politically-sensitive challenge to solve. You went to Jeff because his response was always ‘Yes, and … ,’ followed by numerous (emphasis on ‘numerous!’) probing questions, some light-hearted humor and refreshing optimism and enthusiasm to get to a solution.”

Koch worked on the internal efficiency and effectiveness portfolio, which included projects such as e-payroll, e-travel and the electronic official personnel file (eOPF).

“In several contentious E-Gov governance board meetings, Jeff showed his distinctive ability to cut through tense moments through his wit, ‘unconventional’ sense of humor, and self-deprecation,” Young said. “He had this amazing ability to lead change by simply being his authentic self.”

As several colleagues said, Koch was the last one to turn out the lights at OMB when the Bush administration ended, sending emails to agencies 30 minutes before Barack Obama was sworn in as president.

“Jeff was a true public servant, whom I had the privilege of serving alongside at OMB for five years. He was an inspiration to those around him, dedicated to his work and achieving results. His loss is not only a loss for the community, but for the nation,” said Karen Evans, assistant secretary of the Department of Energy’s Office of Cybersecurity, Energy Security, and Emergency Response.

On a personal note, I covered Koch during his time at OMB and at Labor, and kept in touch with him over the last decade since he left federal service. He never criticized the new administration, offering only thoughtful insights, historical context and direct questions about federal management issues.

Koch wore his passions on his sleeve and never wavered in his belief that a little hard work from a group of people with shared goals made anything possible.

Time and again, he showed resilience in pushing federal IT and management issues up a steep hill, whether dealing with grumpy political appointees or frustrated contractors.

Outside of work, Koch enjoyed life. He played the cello in the community orchestra, led a Boy Scout troop, and entertained the neighborhood with a super-spooky haunted house for Halloween and a mega slip-n-slide on July Fourth. He also was an Eagle Scout, a ham radio operator, a rare arms collector, a competitive cyclist and an Ultimate Frisbee player.

His friends and relatives called Koch a “renaissance man” for his varied interests and his ability to feel comfortable in a tuxedo or covered in mud.

“For Jeff, it was less about the activity and more about enjoying the companionship of the people around him,” said his long-time friend Brian Carlson at the service.

Koch may not have been a household name in the federal technology community, but his impact will continue to be felt for decades to come and his legacy is one we all should aspire to.

There are few truer public servants who grace the IT community the way Koch did. For that, we are thankful and will miss him.

GSA’s IT shuffle, ODNI tour ends

In other personnel news, the General Services Administration is losing one technology executive to the private sector and gaining one back at the same time.

Navin Vembar, the GSA chief technology officer since 2016, is leaving to join CollabraLink as its chief technology officer. Vembar joined GSA in 2011 as an enterprise data manager, became the director of the IT Integrated Award Environment (IAE) in 2013 to rescue the failing site, and eventually became CTO.

CollabraLink is an IT services and consulting firm providing systems development and integration, technology infrastructure support and program/project management services.

Meanwhile, Beth Killoran, the former chief information officer at the Department of Health and Human Services, also found a new job, as GSA’s deputy CIO. She updated her LinkedIn page Nov. 12 confirming the rumored move.

Killoran spent the last two-plus years as the HHS CIO before the agency moved her into a new role in August. At GSA, she replaces Steve Grewal, who left for the private sector in January.

Finally, Tonya Ugoretz, the director of the Cyber Threat Intelligence Integration Center (CTIIC), is heading back to the FBI after serving for two years with the Office of the Director of National Intelligence (ODNI). Ugoretz will return to the FBI as the deputy assistant director for intelligence in FBI’s cyber division. She is a career FBI intelligence analyst who joined CTIIC as its first director in 2016 under a two-year detail.

She entered the government in 2001 as a Presidential Management Fellow and as an all-source analyst with the FBI’s counterterrorism program. In 2003, she became the first analyst to serve as the FBI director’s daily intelligence briefer.

Can DHS get financial shared services right by following OMB’s refreshed strategy?

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The Homeland Security Department laid out its plans in late October for a fourth attempt since 2003 to modernize its financial management system.

After failing twice with the private sector and once with a federal shared service provider, DHS told industry in an Oct. 24 notice that its market research revealed a two-pronged approach that may just work this time.

“At this point, the government plans to conduct two procurements for the Financial Procurement and Asset Management Systems (FPAMS): 1) software, and 2) system integration support services,” the notice states. “The government anticipates the award of the software procurement in June 2019. The government anticipates the award of the system integration support services in August 2019. The government expects to issue a draft statement of work, evaluation factors, and price schedule in December 2018.”

DHS’s decision to buy software first and then integration support services isn’t surprising. It is, however, a glimpse into the future of federal shared services.

When the Office of Management and Budget releases the December update to the President’s Management Agenda, the shared services cross-agency priority goal — Number 5 of 14 if you are keeping score at home — will have a refreshed strategy.

Suzette Kent, the federal chief information officer, offered a small glimpse into what we should expect at the Nov. 1 Shared Service Summit sponsored by the Association of Government Accountants, ACT-IAC and the Shared Services Leadership Coalition.

“The way that we are going about the journey is in three pieces. The first is we are looking at services that are already fairly widely used and there is a lot of agreement. Maybe some of those are smaller services … like fleet management. We will be elevating those to the model that matches the target state of how the services are provided, which includes a focus on continuous innovation, a priority for customer service and shows ways we can get some quick wins,” she said. “The second thing we are focusing on are in areas [such as human resources or financial management] where we are driving out the standards and defining the journey around that set of solutions.”

Kent said that could mean coming to agreement across the government on the standards and then moving out for quick wins in those areas.

Continuous innovation necessary

Finally, the third piece of the strategy is continuous innovation.

“Some of the barriers in the past have been once a service is rolled out that connection to continued improvement, leveraging new technologies, changing the operating model, continuing to build, grow and identify other services, that is a commitment we have to make,” Kent said. “It’s that ongoing commitment that we just don’t get to a place and stop, and get to that place and it’s just a starting point in the journey. And the agency is continually involved in defining new requirements, enhancements and leveraging those innovations so we continue to drive value, benefit and use modern technology.”

Kent said the refreshed strategy helps meet agencies where they are, but gives them an idea of what success looks like today and in the future. The administration is focused on 14 areas that it believes are ripe for shared services and which could save the government $2 billion over the next decade.

Other near-term shared services opportunities are emerging around contract close-out and records management.

“When we look at the full scale of what we need to modernize, there are many systems that are decades old. It’s a huge agenda. Shared services and the areas we are focusing on, they appear on many agencies’ agendas. They are some of the oldest sets of applications. They are some of the applications that are most in need of updates, and some of the applications that create more substantial threats,” Kent said. “How do we protect the data and the information that are in those systems? I see the shared services agenda absolutely linked with IT modernization because it’s a way we can pick [a] common set of solutions and move a large group quickly to a more modern, more secure, better service platform.”

Beth Angerman, the acting principal deputy associate administrator in the Office of Governmentwide Policy at the General Services Administration and executive director of the Unified Shared Services Management office, put a finer point on the forthcoming update.

She said the PMA CAP goal for shared services includes 10 goals, including the requirement for agencies to participate in the development of standards so agencies can have a big say in what the future capabilities will be for HR or financial management or any area where shared services could work.

One size doesn’t fit all for shared systems

Another one is an acknowledgement that the government doesn’t need to build its own IT systems any more.

“There are commercial systems that exist that can help us drive better processes in government because those commercial solutions already have incorporated so many of those best practices that exist in industry,” she said at the summit. “The second one is one size doesn’t fit all. Hopefully, the new strategy really does give every agency the opportunity to declare some level of success, whether it’s through the participation of standards or through the adoption of existing services, and the goal will point out what those are, or whether it’s thinking through the plan to adopt new centralized services.”

The third piece of the strategy is that competition is key and the government needs to be a smarter buyer. Angerman said both of these factors came through clearly in the market research the USSM office did over the last year or so.

“That is the role you will see start to emerge for the service management offices to help us prevent the proliferation of instances, to help us make sure that we have smart contracts that give us the opportunity to bring innovation and to make sure we actually don’t have vendor lock-in,” she said. “All of the things that we’ve heard are the concerns of customers over time.”

There are plenty of examples over the last 15 years where these problems arose. On the problem of too many instances, just look at the E-Travel program. GSA Administrator Emily Murphy said every agency uses the same travel management system, but each has a different version, totaling more than 40 across government.

With vendor lock-in, the Labor Department’s experience shows why this has been a concern. Labor moved to a private sector provider for financial management services in 2010 only to have to buy back the software and the interfaces in 2014 for more than $20 million when the vendor went bankrupt.

And around innovation, the USSM office is emphasizing the “as-a-service” approach so agencies can buy services and not systems, which tended to be static while cloud services can be dynamic.

This brings us back to DHS and its seemingly never-ending effort to modernize its financial management systems. In its request for information from March, DHS wanted information on cloud and non-cloud systems. It wanted to know about application programming interfaces (APIs), and it wanted to know how the software uses or plans to use artificial intelligence, blockchain and/or robotics process automation.

If DHS writes the solicitation using the tenets of the administration’s new strategy, then there may be hope for it, and for many other agencies like it, that shared services may actually gain some real traction this time. Over the last 15 years, beyond payroll services, there have been only a handful of success stories and too many failures.

“We are making it a little better every time the baton is passed [from one administration to another],” Angerman said. “We are excited about the future.”

Read more of the Reporter’s Notebook

Move aside sports betting, let’s wager on FAR rules

Nine states already allow for sports betting and 19 others have some sort of legislative action underway to make it legal, according to ESPN’s Oct. 30 sports betting tracker.

Betting on sports is so popular that the American Gaming Association estimates that all four major sports leagues would earn a collective $4.2 billion from legalized sports betting.

So what does this all have to do with federal government management issues?

Well, thanks for asking.

The Office of Management and Budget’s Office of Information and Regulatory Affairs (OIRA) recently released its semi-annual regulatory agenda and it got me thinking about the proposed and final Federal Acquisition Regulation (FAR) rules.

What are the chances of any of these rules getting past the finish line?

Well, given that the number of FAR rules either proposed or finalized over the last two years was scarce, and there doesn’t seem to be any change on the horizon, I thought we might have some fun with federal acquisition by putting odds on how likely the most significant of the 36 proposed and nine final FAR rules are to come to fruition.

The oddsmakers should keep in mind that George Washington University professor Bridget Dooling found significant regulatory activity has fallen 74 percent since the Trump administration took office. During the first 18 months of the administration, agencies launched fewer than 250 big rules, compared to 807 in the first year of Barack Obama’s administration and more than 700 in George W. Bush’s first year.

I brought in my own version of a sports oddsmaker in Larry Allen, the president of Allen Federal Business Partners and a long-time federal acquisition observer and expert, to help me explain the odds we set:

Proposed rules

1.  Determination of fair and reasonable prices on orders under multiple award contracts

Odds: 5 to 1

Rationale: This one has a pretty good chance because several agencies already wrote deviations to the FAR that direct contracting officers to determine price reasonableness on their own. Allen said the next phase would be to bring this concept down to the task order level.

2.  Use of Acquisition 360 to encourage vendor feedback

Odds: 25 to 1

Rationale: This proposed rule goes back to 2016 under the Obama administration so the likelihood of it getting through is not good. At the same time, Allen said it hasn’t gone away either in almost three years. “How would you regulate the feedback? There are a lot of things to get a 360 view of a transparent acquisition that doesn’t require a new rule, but changes in the processes and reminders to follow the rules on the books, including encouraging vendor feedback would be helpful,” he said. “I’m not sure there is a really strong regulatory case for it.”

3. Section 508-based standards in information and communication technology

Odds: 6 to 1

Rationale: The Access Board recently finalized updated Section 508 standards so this FAR update is almost an important formality. Allen said agencies still struggle to get Section 508 right in contracts so changing the FAR shouldn’t be too difficult.

4. Incremental funding of fixed-price contracting actions

Odds: 30 to 1

Rationale: This 2016 proposal isn’t likely to break through after almost three years. Allen said there has been a big push for much of the last decade to bring some uniformity to fixed-price contracts, as there already are regulations on the books for incremental funding of cost-plus type contracts. “This rule may end up being superseded by other happenings with the Section 809 panel and the Defense Department going back to the drawing board on incremental funding for its contracts,” Allen said.

5. Definition of a “commercial” item

Odds: 15 to 1

Rationale: This one comes out of the 2018 Defense authorization bill where lawmakers wanted to expand and clarify what it means for agencies to buy commercial products and services. Allen said while this is an important proposal, there will be a lot of interest and that could slow down the process. This is why the odds of the council finalizing it in the next year are low.

6. Increasing task-order level competition

Odds: 7 to 2

Rationale: This is another one coming from the NDAA, but the most recent one signed into law in August. The proposed rule is a key piece to the General Services Administration’s goal of moving to unpriced schedules, which is why the odds are lower than most others. Allen said while the concept is limited to services contracts, which do account for a majority of federal acquisition spending, the broad goal is for agencies to get better pricing at the time of purchase. “It could create more burden on contracting officers, but it would mean they get more real time pricing based on scope of work,” he said.

7. Governmentwide and other interagency contracts

Odds: 10 to 1

Rationale: The goal of this rule would be to do away with requirements for DoD to need a written determination and finding before using non-Defense contracts. Allen said this requirement is a huge stumbling block for military services and Defense agencies. “Getting a D&F to use GSA’s Alliant or the schedules slows things down and requires more paperwork. Even when the services have an agreement to use Oasis or Alliant,” he said. “Section 875 is being read by industry as eliminating that requirement. It could streamline DoD acquisition and improve the use of non-DoD contracts across the govt.” At the same time, Allen said the odds are lower than some might think as the possibility of push back from DoD is real while the Pentagon waits for Section 809 panel recommendations and/or they want more analysis on how DoD is using non-Defense contracts.

Final rules

1. Set-asides under multiple award contracts

Odds: 50 to 1

Rationale: The chances of this final rule finally crossing the finish line remain long, particularly considering the council has been sitting on it since 2014. Allen said recent court decisions — the 2016 Kingdomware case requiring the Veterans Affairs Department to abide by the “rule of two” for small veteran-owned firms — have slowed down the progress of the proposed rule, causing the council to rethink whether it has to apply the “rule of two” to all task order contracts. “The odds are low and the rule may become irrelevant as we get Section 846 [e-commerce marketplace pilot] up and running,” he said. “It is more likely that the FAR case will be closed and a new one will be opened up that reflects all of these changes. This is not to say this isn’t important, but time may have overcome the current rule.”

2. Effective communication between government and industry

Odds: 3 to 1

Rationale: Of the 47 final and proposed rules, this one is the most likely to make it across home plate. Allen said this is one of those cases where OMB issued guidance, but until the FAR regulations change, there are a host of government acquisition people who are more conservative and need regulations to change. There is a lot of support across the government and industry acquisition communities for the use of tools such as reverse industry days, the “show, don’t tell me” approach to bids and other “innovations,” thus making this rule popular and an easy one to agree upon.

3. Prohibition on certain telecommunications and video surveillance services or equipment

Odds: 4 to 1

Rationale: This one also gets good odds because it’s part of the ongoing and increasingly strong focus on supply chain risk management. The FAR Council will implement the 2019 NDAA provision that prohibits agencies from buying products from China-based companies ZTE and Huawei Technologies. “Prohibiting agencies from buying from these companies because of the potential and real impact on the security of their supply chains is a big deal for industry and an even bigger deal for the government,” Allen said. “The rule is putting industry on notice saying technology from these companies are walled off to you whether you can save money or not. The biggest use of this is around the training on supply chain security.”

Prop bet

What are the chances of the Trump administration naming a permanent administrator in the Office of Federal Procurement Policy over the next 12 months?

Odds: 250 to 1

Rationale: It’s been more than two years since OFPP has had a permanent administrator and so far four candidates haven’t made it through the process for a variety of reasons. Allen said he doesn’t see any change on the horizon, either. “The administration has gone this long so they may be saying ‘why do we need one?’” he said. “And even if you named someone, it would be nine to 12 months before they got confirmed and then would only be in the position for nine to 12 months, so why measure for new curtains?”

CIO Council restocks committee shelves after personnel changes

The federal Chief Information Officers Council is restocking its committees after a wave of agency volunteer leaders moved to new positions across government or left government altogether.

Steve Hernandez, the Education Department’s chief information security officer, Dorothy Aronson, the National Science Foundation’s CIO, and Ron Bewtra, the Justice Department’s chief technology officer, stepped up to take on new leadership roles.

Hernandez now is the co-chairman of the federal CISO committee, joining Federal CISO Grant Schneider.

Aronson will join Education CIO Jason Gray as the head of the workforce committee. She replaces Beth Killoran, who moved to a new role in August.

Bewtra joins Maria Roat, the Small Business Administration’s CIO, to co-lead the innovation committee.

Along with these changes in the CIO community, a few others caught my eye.

Chris Lowe, the former Agriculture Department CISO, started a new position as CISO at USDA’s Agricultural Research Service (ARS).

The rumor mill heated up back in March that USDA leadership wanted to reassign Lowe to a new position. It’s unclear whether that happened or if Lowe found a new role on his own. Lowe recently updated his LinkedIn page with the new ARS role.

Over at the Department of Housing and Urban Development, former acting CIO Chad Cowan received a promotion to acting assistant secretary for administration and principal deputy assistant secretary for administration.

Cowan was acting CIO for six months after Johnson Joy suddenly resigned and was senior advisor to the CIO from August to October. HUD Secretary Ben Carson named David Chow as the new CIO in August.

Over at the General Services Administration, Rob Coen now is the new One Acquisition Solution for Integrated Services (OASIS) program manager, moving over after spending the last two years as the FEDSIM and Express strategy director.

Additionally, GSA named Penny Grout to be the Federal Acquisition Service regional commissioner in Region 8 and Tom Meiron to be the regional commissioner in Region 4, according to an Oct. 29 email from FAS Commissioner Alan Thomas obtained by Federal News Network.

Finally, GSA announced on Oct. 31 that Administrator Emily Murphy named Jeffrey Post as the associate administrator for its Office of Congressional and Intergovernmental Affairs (OCIA) where he will serve as the chief policy advisor.

Over at the Labor Department, Dennis Johnson has been selected as the director of the Office of the Assistant Secretary for Administration and Management’s (OASAM) Performance Management Center. He had been acting in the role for the past 23 months, during which he developed a new four-year strategic plan, expanded the department’s continuous process improvement (CPI) program, and strengthened the connections between the department’s budget and performance functions. That’s according to an email from Bryan Slater, Labor’s assistant secretary for administration and management, obtained by Federal News Network.

One retirement of note has come to our attention: Mike Butler, who spent the last eight years with the Defense Manpower Data Center and has been a leader in the federal identity management community for the last 18 years, decided to try out the private sector.

Butler now is a director with Dignari, LLC where he will lead the emerging technology team to create new and innovative capabilities for clients. Dignari is a woman-owned small business serving national security and defense agencies by providing biometrics, identity management, emerging technologies and data analytics services and technologies.

Butler spent 15 years in the Navy before retiring in 1995. He joined DoD as a civilian, helping to stand up the Defense Access Card Office. He also worked at GSA to begin the implementation of Homeland Security Presidential Directive-12 (HSPD-12), served on a seven-month detail with the Office of Management and Budget’s e-government office and spent just over a year at the National Institute of Standards and Technology working on smart grid cybersecurity before going back to DMDC.

OMB loosening the reins on major cyber programs for 2019

If you read through the new Federal Information Security Modernization Act (FISMA) guidance for fiscal 2019 and the letter from Suzette Kent, the federal chief information officer, to the Senate Homeland Security and Governmental Affairs Committee, and listen to what the Office of Management and Budget has been saying about the upcoming Trusted Internet Connections (TIC) memo, the message to agencies is clear.

Agencies can no longer point to the Homeland Security Department as the excuse for why improvements to the security of their networks and data aren’t happening fast enough.

OMB is giving agencies a stronger voice and driving accountability back to CIOs, chief information security officers and deputy secretaries.

In the FISMA guidance and letter to the committee, which Federal News Network obtained, OMB is adding flexibilities in how agencies meet the requirements of governmentwide programs like the continuous diagnostics and mitigation (CDM) program and the intrusion detection and protection program called EINSTEIN. At the same time, OMB seems to be telling agencies that what’s most important is not adhering to a specific method or approach, but achieving the final result of using advanced tools and techniques to secure their systems and data.

“The Office of Management and Budget (OMB) acknowledges that there is a need to enhance existing capabilities and programs to better safeguard federal information systems and data, and we plan to convey this vision as part of the President’s 2020 Budget,” Kent writes in a Sept. 14 letter to the Senate committee. “In order to inform future investment decisions, the Department of Homeland Security’s National Protection and Programs Directorate (NPPD) is working on a threat-based security architecture assessment. This threat-based security approach, adopted from the Department of Defense, will provide a holistic assessment of existing federal cybersecurity capabilities and creates a common framework to discuss and assess cybersecurity capabilities related to threats. The results are being used to inform DHS’ cybersecurity investment priorities across federal civilian departments and agencies in order to enhance enterprise cybersecurity and reduce risk.”

The underlying message in the letter is clear. The fiscal 2020 budget request will propose more money for agencies to implement cyber capabilities more quickly and effectively whether through CDM or EINSTEIN or in other ways.

FISMA guidance continues CDM evolution

OMB goes even further in the FISMA guidance, released Oct. 25, around the need for more flexibility and accountability at the agency level.

The administration is opening the door for agencies to acquire continuous monitoring tools and capabilities outside of CDM. The memo is part of the continued evolution of CDM.

“[H]owever, they are required to provide sufficient justification should they pursue acquisition of tools with continuous monitoring capabilities that are not aligned with current or future CDM acquisition vehicles (includes CDM Dynamic and Evolving Federal Enterprise Network Defense [DEFEND], GSA IT Schedule 70 CDM Tools Special Item Number, etc.). Prior to purchasing these tools, a justification memorandum must be sent from the agency CISO to the CDM PMO, the respective OMB Resource Management Office (RMO), and the Office of the Federal Chief Information Officer (OFCIO) Cybersecurity Team,” the guidance states.

Additionally, OMB is telling agencies they can continue to use existing tools or capabilities that meet CDM requirements, but were purchased outside the contracts run by the General Services Administration.

Then, if you add in what we know about the upcoming TIC guidance, the theme of moving toward more flexibility and accountability continues.

Margie Graves, the federal deputy CIO, said at the 2018 ELC conference in Philadelphia on Oct. 15 that the TIC policy will move toward a risk-based approach built on the cyber framework from the National Institute of Standards and Technology.

“The policy doesn’t push us all the way to the right in terms of mandating the use of controls. It opens up the aperture in terms of what commercial cloud services already are built into the environments that are meeting the controls. If it’s like-for-like, we’re not going to prescribe how as long as it’s meeting the security requirements,” Graves said. “We are doing the same thing for CDM as well. If we can get to the point where we are doing continuous authorization through automated controls and automated use of data, then suddenly all the authority to operate (ATO) paperwork and approach becomes totally different. There is more veracity and it is more accurate because it’s based on data in the environment. That’s where we are going.”

All of these changes signal a major shift in how OMB views, and is involved with, federal cybersecurity.

During most of the Obama administration, OMB passed to DHS the responsibility and some of the authority for federal cybersecurity efforts.

Part of the reason OMB is increasing its oversight and giving agencies more flexibility may be agency frustration with the slow rollout of CDM tools and capabilities, as well as the perceived ineffectiveness of EINSTEIN.

EINSTEIN must be operationally relevant

In Kent’s letter to the Senate committee, she said the “National Cybersecurity Protection System (NCPS) detected 379 of the 39,171 incidents across federal civilian networks via the EINSTEIN sensor suite from April 2017 to present.” That is less than a 1 percent detection rate across all cyber incidents. This doesn’t mean EINSTEIN is ineffective, but it does mean the program isn’t the proactive tool once envisioned.
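The arithmetic behind that figure is straightforward:

```python
detected = 379          # incidents EINSTEIN detected, per Kent's letter
total = 39_171          # all incidents across federal civilian networks
rate = detected / total * 100
print(f"{rate:.2f}%")   # 0.97%, i.e., less than 1 percent
```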

Jeanette Manfra, DHS assistant secretary in the Office of Cybersecurity and Communications, said the goal this year and next is to make sure the tools under EINSTEIN are operationally relevant.

“We have been working with agencies to better understand challenges they may have in making sure how best to use the tools under the NCPS,” Manfra said in an interview on Ask the CIO. “Two areas we have been looking at for some time is can we implement some behavior analytics, looking at developments in non-signature based detection capabilities. We’ve had some success in that, what I would call a limited deployment so we will be expanding that.”

She said the goal of the non-signature-based detection capabilities is to look for abnormal behavior against a baseline of normal behavior.
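In its simplest form, that approach means building a statistical baseline of normal activity and flagging observations that deviate sharply from it. Below is a minimal, hypothetical sketch of the general idea; the traffic numbers and threshold are invented for illustration and this is not DHS’s actual capability:

```python
import statistics

def is_anomalous(baseline, observation, threshold=3.0):
    """Flag an observation that deviates sharply from the baseline.

    baseline: historical measurements of "normal" behavior
    (e.g., outbound megabytes per hour for a host).
    """
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline)
    if stdev == 0:
        return observation != mean
    # z-score: how many standard deviations from normal?
    return abs(observation - mean) / stdev > threshold

# Hypothetical host whose traffic normally hovers around 100 MB/hour.
baseline = [95, 102, 98, 101, 99, 103, 97, 100]
print(is_anomalous(baseline, 101))   # False -- within normal range
print(is_anomalous(baseline, 450))   # True -- large deviation, no signature needed
```

Unlike signature matching, nothing here requires knowing the attack in advance; the flag comes purely from the deviation.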

She said DHS also is looking at how EINSTEIN’s on-premise model, similar to the TIC policy, integrates with cloud services.

The question that emerges from all of these changes is how OMB and DHS can ensure CDM, EINSTEIN and other cyber initiatives continue to push agencies down a similar path toward fewer cyber breaches, fewer unpatched vulnerabilities and a better understanding of the government’s overall cyber risk, while at the same time not letting the inertia of government prevent real progress.

Read more of the Reporter’s Notebook

What’s driving federal IT, acquisition in 2019 and beyond? PSC has the forecast

Agencies may just remember fiscal years 2017 through 2019 as the best of times. Money was flush — generally speaking — with some agencies actually not being able to spend everything they received in 2018. Congress and the president actually got spending bills done almost on time and not six months into the fiscal year. The threat of shutdown was minor.

So as agencies finalize their 2020 budget requests—agency passback guidance usually is ready by Thanksgiving—there is a real expectation that the “do more with less” mantra will return in force.

The evidence, at least for now, is coming from multiple places. First, President Donald Trump announced he would ask each agency for a 5 percent budget cut in 2020. Deputy Defense Secretary Pat Shanahan said last week at the Military Reporter’s Conference that the Defense Department is developing two budget requests, one without the 5 percent cut and one with it.

“The way I would think about those two budgets and the approach — there are certain things that you can’t change. There are near term costs that we are going to expend in the next year that are on contract and for all intents and purposes are fixed,” Shanahan said. “There are other investments that we will make in science and technology and procurement and we have knobs in terms of timing. The exercise we are going through is there is prioritization we can make. We have a number of options going on with hypersonic missiles. In these projects we can decide to do them or to defer them.”

Shanahan said he is working with the DoD comptroller and the Office of Cost Assessment and Program Evaluation (CAPE) team on what projects could be deferred, and then Secretary James Mattis will make a decision based on those trade offs.

Second is the feeling on the ground. For that, just look at the comments and expectations coming from the Professional Services Council’s 54th annual Vision Forecast. In interviews with hundreds of federal technology and acquisition officials and in analyzing spending data, PSC’s team of industry volunteers found 2019 is likely to be the “high water mark” for spending.

“There are a couple of things that caused the team to look at that. Number one is the sheer magnitude of the federal budget and the challenges we are facing in the next few years in terms of deficits, interest payments and the outlook for economic growth coming up here all tend to indicate we have about as much headroom in the budget as we can possibly stand right now,” said Lou Crenshaw, a Vision volunteer and team lead for the DoD topline and macroeconomic research. “We are starting to see pressure from OMB and other places for people to begin to reduce spending. I think part of that is the realization that we have some real serious problems we will have to deal with. I think the topline will stay the same and there may be movement between defense and non-defense because of the security situation.”

Now, of course, all of this good feeling about budget and shutdown threats could change in November if the House and/or Senate switches parties. Oh and that nasty “s” word — sequestration — could return in 2020 and beyond if Congress doesn’t raise the spending caps.

The PSC Vision Forecast — for those of you who can still make it, the annual conference takes place Monday and Tuesday in Falls Church, Virginia — offered several other significant trends around technology and acquisition for 2019 and beyond. Here are just a few that stood out:

Services remain king

The PSC team found agencies expect to continue increasing spending on knowledge-based services, and IT services spending continues to grow steadily. But the biggest difference between this year and past surveys is the discussion around mission priorities.

“In the past we’ve seen a lot of emphasis on support services, but not necessarily driving toward how they support the overall mission for the agency,” said Kirste Webb, the Vision civilian chairwoman. “One of the biggest messages we are hearing across the board is that all of the agencies are now shifting to everything is about their mission, and if procurement or acquisition is not directly supporting that mission they are taking a second look at how it’s being procured and what’s being done with it.”

Webb said agencies are looking at alternative contracting practices such as Other Transaction Authority (OTA) agreements, best-in-class contracts and sole source awards with a goal of getting to the market faster and bringing in innovation to help meet mission goals better.

Interestingly, the use of shared services did not come up as an alternative or even as an option agencies are seriously considering.

O&M equals operations and modernization

The Office of Management and Budget is entering its fourth year with IT modernization as its top priority because everything from cybersecurity to citizen services to the workforce branches off from it.

The PSC team found the discussion on IT modernization has shifted from straight numbers highlighting technical debt and continued support of legacy systems to how agencies manage IT modernization.

“It’s really about Technology Business Management. We’ve seen that changes are being attempted at an unprecedented scale across the federal enterprise. It’s going to improve the quality of the data. These are culture challenges that are extremely daunting but the outlook is promising,” said Steve Vetter, one of the two federal IT and budget Vision chairmen.

Greg Lobbin, the other federal IT and budget Vision chairman, said agencies see an opportunity, because of cloud computing, to use operational expenditures (OpEx) for modernization efforts.

This may be part of the reason the impact of the Modernizing Government Technology (MGT) Act is slow to materialize. Agencies are finding ways to modernize without applying for a loan from the Technology Modernization Fund or setting up a working capital fund.

A changing industry, government relationship

One of the most positive changes that came from the discussions with agencies is the desire for a better working relationship with contractors.

Webb said the cautious message coming from the government is about how industry and government can work to evolve the mission together.

“I think what we are seeing is agencies are trying to get industry involved earlier to avoid what’s been happening which are a lot of protests during the acquisition process. We are seeing an increased exchange across the civilian agencies in terms of industry days, industry exchanges, one-on-one opportunities well in advance of a final solicitation coming out as part of the critical steps in trying to partner more with industry across the board,” she said. “Once the acquisition is complete and a contractor is in place, agencies are recognizing working together is far better to achieve the mission.”

Webb said partnership agreements, such as those used extensively by the Department of Energy, are becoming more common.

“What we are seeing is rather than trying to fight against each other, we are seeing more going toward working with each other to resolve challenges that may occur and trying to identify potential risks and how to solve those risks before they even occur,” she said.

The best and most well-known examples of the change that PSC is highlighting are the IRS’ reverse industry days, the Homeland Security Department’s Procurement Innovation Lab’s efforts and the General Services Administration’s Interact site.

4 forces driving federal acquisition

Of all the trends that emerged from the 22 study teams, which conducted more than 300 interviews, the drivers of federal acquisition became clear.

Alan Chvotkin, PSC’s executive vice president and general counsel, said the President’s Management Agenda cross-agency goals continue to be the North Star that agencies are heading toward. But that also means a few other things, including a tightening of the market for some vendors and an increased set of opportunities for others, particularly those in cybersecurity and IT modernization.

“There is no doubt increased focus on IT. Cybersecurity is clearly a high risk and high spend area so we think there will be a lot of business opportunities,” he said.

Agencies are continuing to emphasize and push toward commercial services and nontraditional contractors. But, Chvotkin said, there is a mixed message because agencies also want to push government-unique requirements, such as supply chain security and security clearances, down into the supply chain.

Finally, the competition for workforce talent will remain strong among industry and government alike.

IT modernization starting to leave its mark on federal procurement

PHILADELPHIA — The Federal Acquisition Service at the General Services Administration spends $100 million a year on systems that are outdated, disliked by their users and arduous to use. There are something like 70 applications that interface with the contract writing system alone.

The IRS wants to replace systems that require tens of thousands of manual hours to process basic procurement actions like contractor determinations.

The debate over the use of Other Transaction Authority remains strong: Is it just another tool in the procurement toolbox, or has the Defense Department discovered the “Holy Grail” of contracting because Congress gave it production authority?

All of these examples really are just symptoms of the larger disease — the need to reimagine the entire job of a contracting officer. With the move to automation happening more quickly, contracting officers soon may finally take on the business acumen and partnership role that has long been talked about.

The good news is the change is starting to happen. One of the major themes that emerged from the 2018 ImagineNation ELC conference sponsored by ACT-IAC was the evolution of the federal acquisition process.

GSA Administrator Emily Murphy said reconceiving how a contracting officer works is one of her main goals for FAS as it modernizes its schedules program by reducing the overall number of contracts, moving toward an unpriced schedule and pushing competition down to the task order level.

“When I’ve talked to our 1102 community and when I talk to our vendor community, one of the questions I always ask is, ‘What value are we driving from setting ceiling prices?’” Murphy said after her speech at ELC. “When we are reimagining how the schedules work, if we incorporate things like the e-commerce platform, which deals with the very low dollar value purchases, and we focus on services being an unpriced contract where we actually focus on pricing at the task order level, that frees up our 1102s. They no longer are negotiating the same ceiling prices again and again. They are instead focusing on how to make sure task order competition is real, vigorous and it’s dynamic.”

Murphy said that with the combination of technology, such as robotics process automation and machine learning, and business process reengineering, contracting officers can spend more time finding the right solutions based on data and business needs.

“Think about when we awarded OASIS, it required labor hour prices per category, but it really focused a lot more on the technical qualifications of the vendors,” she said. “This gives us the ability to focus on those technical qualifications, what makes each vendor successful, unique and what can they bring to the table as a solution, instead of focusing on that contract hour price. That contract hour price becomes relatively meaningless until you get to an actual scope of work. Then you have dynamic competition at the task order level and drive down prices with a real solution behind those prices.”

GSA CIO helping with acquisition modernization

To get where Murphy wants to go, FAS needs better technology that runs its assorted contracting systems.

Alan Thomas, the FAS commissioner, said internal systems such as E-Buy, GSA Advantage, FSS 19 and many others that make up the service’s core business systems are expensive to maintain and not customer friendly.

Thomas and David Shive, the GSA chief information officer, are co-leading an effort to modernize and consolidate systems. He said FAS will lean on GSA’s CIO application maintenance, enhancements and operations (CAMEO) re-compete. The agency held an industry day in early August and plans to issue a request for information and hold a reverse industry day.

“We have picked the capabilities we want to have like the ability to write, modify and manage contracts or to manage catalog information, instead of modernizing system by system,” Thomas said after his panel at ELC. “Within six months, we will have the requirements for our new contracting writing system out to industry, and by the back half of 2019, we expect to begin delivering new capabilities.”

Thomas emphasized that the business system modernization effort is a multi-year strategy and it integrates with other initiatives such as schedules consolidation to reach its full potential.

Like GSA, the IRS is facing antiquated systems as well as a shrinking workforce. The tax agency spends about $2.6 billion a year on 10,000 transactions.

Tim Shaughnessy, a senior program analyst at the IRS, said a new strategic framework for the agency includes the procurement process for the first time.

“We are recognized as partners who create and buy emerging technologies and do acquisition planning,” he said at ELC. “We spend a lot of time at the end of the fiscal year trying to secure dollars early enough to set up a process to buy emerging technologies.”

IRS testing first bot

Shaughnessy said for the staff of 300 procurement professionals to do that more effectively, the IRS is turning to robotics process automation to reduce the amount of time spent on basic transactions.

The IRS awarded its first contract for RPA at the end of September for a bot to do contractor responsibility determinations.

“We do about 10,000 of those actions a year and the RPA is our way of dipping our toe in the RPA water,” he said. “We think this will save contracting officers about 10,000-to-15,000 FTE hours a year.”

The bot will go to public-facing websites such as Dun & Bradstreet to analyze data on vendors, reviewing overall contractor financial resources, integrity and business ethics, and anything that would exclude a vendor from bidding.

“The bot gives us the ability to quickly look at the System for Award Management (SAM) and other systems and give the contracting officer a report back on a vendor’s status,” Shaughnessy said. “The contracting officer can analyze the data that the bot brings back and pivot off that in case they need to do more investigations. We also don’t have to wait until a proposal comes in or there is an apparent winner to do a contractor responsibility determination. The bot could bring back data on all the companies who proposed.”
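In code terms, the bot’s core pattern is pulling a vendor record, checking it for disqualifying conditions and handing the contracting officer a summary to pivot off. The sketch below is hypothetical: the field names, score and `check_vendor` helper are invented for illustration and do not reflect the IRS’ implementation or any real SAM or Dun & Bradstreet API.

```python
def check_vendor(record):
    """Summarize responsibility red flags from a single vendor record."""
    flags = []
    if record.get("excluded"):                        # active exclusion (SAM-style)
        flags.append("listed as excluded in SAM")
    if record.get("registration_expired"):            # lapsed registration
        flags.append("registration expired")
    if record.get("financial_stress_score", 0) > 80:  # D&B-style risk score
        flags.append("elevated financial stress score")
    return {"vendor": record["name"],
            "flags": flags,
            "needs_review": bool(flags)}

# The bot can run determinations on every offeror up front,
# not just the apparent winner.
report = [check_vendor(v) for v in [
    {"name": "Acme Corp", "excluded": False, "financial_stress_score": 20},
    {"name": "Widget LLC", "excluded": True},
]]
for row in report:
    print(row["vendor"], "->", row["flags"] or "no flags")
```

The contracting officer then reviews only the flagged records, which is where the projected 10,000-to-15,000 hours of annual savings would come from.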

Along with RPA, Shaughnessy said the IRS procurement shop also is testing out a new program under Parts 12 and 13 of the Federal Acquisition Regulation to pilot emerging technologies. The IRS doesn’t have OTA authority so this is the next best thing.

Under FAR Parts 12 and 13, agencies can use streamlined evaluation procedures as long as the awards are under $7 million and are used only to test and pilot, not to deploy, new systems.
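Those conditions reduce to a simple two-part eligibility test. A hypothetical sketch — the function name is invented, while the $7 million ceiling and the test-and-pilot restriction are as described above:

```python
def streamlined_eligible(award_dollars, purpose):
    """True if an award fits the streamlined authority: under $7 million
    and used to test or pilot a technology, not to deploy it."""
    return award_dollars < 7_000_000 and purpose in {"test", "pilot"}

print(streamlined_eligible(2_500_000, "pilot"))    # True
print(streamlined_eligible(2_500_000, "deploy"))   # False -- deployment excluded
print(streamlined_eligible(9_000_000, "test"))     # False -- over the ceiling
```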

“We haven’t picked which programs we will use this for yet. We are socializing and working with stakeholders as well as partnering with the CIO’s organization,” he said. “One of the things we are trying to do is develop, along with the CIO, a capability for us to take a concept from a white paper to initial deployment.”

Shaughnessy said one possible option is with document imaging and data capture through optical character recognition. The IRS has as many as 27 different document imaging systems. Using this pilot program, it could consolidate and modernize the entire document imaging effort.

The IRS, GSA and other agencies are making it clear that the status quo around acquisition isn’t working, and that working within the system to change it is not only possible, but happening every day. This is why the aggressive move to OTAs is disturbing to so many: Instead of fixing the procurement system, as GSA, the IRS and others are trying to do, agencies are looking for a way around it.

Data center metrics are a prism to watch the continued evolution of federal IT

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

If you wanted to track the government’s progress on IT modernization, there may not be a better approach than following the bouncing metrics of the data center consolidation initiative.

The Obama administration started out with a goal to reduce the overall number of data centers across the government. Then the Office of Management and Budget said optimization of existing data centers was as important as reducing the overall number.

Then somewhere in there, OMB changed the definition of what a data center is, causing a huge increase in the overall number and a nosedive in success, followed by a quick rebound once closing a 3×3 closet with two servers counted.

Now OMB is expected to release yet another memo around data centers that, once again, will move the goal posts — whether they are forward, back or sideways it’s unclear. The data center memo is one of several expected in the coming weeks or months from OMB, which also is working on new Trusted Internet Connections (TIC) requirements and new guidance for protecting high value assets.

The good news is that through the Centers of Excellence (CoE) initiative with the Agriculture Department and the move to Technology Business Management (TBM) standards, this may be the last data center memo for a while.

Dan Pomeroy, the acting deputy associate administrator in the Office of Governmentwide Policy at the General Services Administration, said OMB recognized the government needed a new and better way to calculate costs and therefore savings when it came to data centers.

Pomeroy, who led the data center optimization initiative as well as the infrastructure optimization CoE before taking on this new role in September, said GSA worked with USDA to come up with eight categories to calculate the costs of data centers.

“We are looking at things like the cost of labor that will continue, but maybe it will be less as you reduce the number of data centers,” Pomeroy said. “We are asking what can you save across multiple parameters? Based on the square footage of a data center, there are different levels of savings. If you shut down a closet, there will be less savings than if you shut down a tier 3 data center.”
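The tiered logic Pomeroy describes, where larger facilities yield larger savings, can be sketched as a simple lookup. The per-square-foot rates below are invented for illustration; the article does not disclose GSA’s or USDA’s actual figures or the eight cost categories’ weights:

```python
# (minimum square footage, hypothetical annual $ saved per sq ft)
SAVINGS_PER_SQFT = [
    (10_000, 250),   # large tier 3 facility
    (1_000, 150),    # mid-size server room
    (0, 50),         # closet-sized installation
]

def estimated_annual_savings(sqft):
    """Estimate yearly savings from closing a data center of a given size."""
    for min_sqft, rate in SAVINGS_PER_SQFT:
        if sqft >= min_sqft:
            return sqft * rate
    return 0

print(estimated_annual_savings(9))        # 450 -- closing a closet saves little
print(estimated_annual_savings(20_000))   # 5000000 -- a tier 3 closure saves far more
```

The point of the real model is the same as this toy one: savings claims are scaled to what was actually shut down, instead of counting every closure equally.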

At the same time, GSA is ensuring the data center metrics integrate with the TBM cost towers process. OMB is requiring agencies to implement TBM by 2022 under the President’s Management Agenda and as part of its effort to improve the capital planning and investment control (CPIC) processes.

Pomeroy said agencies needed a tool set to calculate savings and return on investment, and whose data would easily fit into the TBM structure.

USDA will further test out these new metrics as part of its effort to close 39 data centers under the CoE initiative. The agency already closed 21 data centers and expects to save $6.9 million to $8.5 million a year.

SSA bringing in industry best practices

On the other side of the IT modernization spectrum is the Social Security Administration. While SSA remains dogged by antiquated systems and processes, IT modernization is happening in some pockets.

For example, SSA expects most states to move to its new and improved Disability Case Processing System (DCPS) by the end of fiscal 2019. The DCPS rollout stands out as a major project in SSA’s five-year, $700 million IT modernization strategy, which it launched last year.

Rajive Mathur, the SSA CIO, said he’s borrowing an approach from his industry days where IT capabilities are based on a business-centric view.

To that end, Mathur said he’s implementing a product management and product manager approach across SSA’s IT efforts.

Mathur said a product owner asks the business or program managers questions such as: What is the strategy? How do we deliver on value? What are the planned product versions?

“We are not investing in a one-year, one product program. It will always be a multi-year view where we are creating value and delivering new capabilities early and often,” Mathur said at the conference. “There is a big culture change I’m asking for by moving to product management.”

Mathur said under this approach, the program office and CIO’s office will develop a one-page outline of the project plan, which includes funding, current spend rate, team structure, timeline for delivery of capabilities, any market research and other data that will lead to a buy-or-build decision.

This concept is not necessarily new. CIOs over the years have moved toward having business or program connections in their offices. But what Mathur is doing comes more from the venture capital world, where companies are held to specific metrics to produce results.

Mathur said the product manager is like a mini-CEO who knows everything about their program.

“This is IT modernization at different levels where we are changing the relationships with the business offices,” he said. “The product function is in the CIO shop today, but over time I’d like to migrate it to the business shops.”

Fraud is not a four-letter word, a new playbook is striving to prove that

PHILADELPHIA — “The deceptive nature of fraud makes it difficult to measure in a reliable way.”

“[GAO’s] work has shown that opportunities exist for federal managers to take a more strategic, risk-based approach to managing fraud risks and developing effective anti-fraud controls.”

Both of these comments, which came from the 2015 compendium, “A Framework for Managing Fraud Risks in Federal Programs” from the Government Accountability Office, are just as true today as they were three years ago, or for that matter 20 years ago.

When it comes to measuring, mitigating and combating fraud in federal programs, agencies are struggling.

The struggle comes despite the fact that GAO issued the framework three years ago, detailing four broad steps for agencies to implement to combat fraud. Few agencies have made any real, measurable progress.

In July, GAO testified before the House Ways and Means Committee that there is no reliable measurement of fraud in the Medicare program despite more than $52 billion in improper payments. The same is true for Medicaid, for the Social Security Administration’s retirement, survivors and disability insurance program, and for nearly every other one of the 16 high-priority programs.

Add to that the fact that Congress and the White House have made stopping fraud a major focus area over the last decade, and the shortcomings are even more disheartening. Lawmakers passed multiple pieces of legislation, including the Fraud Enforcement and Recovery Act of 2009 and the Fraud Reduction and Data Analytics Act of 2015. Meanwhile, the Office of Management and Budget updated Circular A-123 to focus on enterprise risk management, including controls to mitigate fraud and approaches to use data analytics.

What all of this means is that the real impact of fraud on federal programs is unknown and agencies are unclear about how to stop it, so billions of dollars are going to people and organizations that don’t deserve them.

For all of these reasons, and many others, the CFO Council and the Treasury Department’s Bureau of Fiscal Service took a small effort with the Veterans Affairs Department and turned it into a governmentwide initiative to help agencies start identifying and reducing fraud in programs.

The council and BFS released the anti-fraud playbook on Oct. 18 to provide practical guidance, leading practices and helpful resources.

“We have been working over the course of the last year with VA to look at how we could improve their anti-fraud efforts. As we were doing that, we were taking those lessons learned and talking to the other agencies about what are their needs, what’s working well and what are the gaps?” said Tammie Johnson, a program and management analyst at the Bureau of Fiscal Service, in an interview with Federal News Network at the ACT-IAC 2018 ImagineNation ELC conference. “We also collaborated with OMB, GAO and the inspectors general community to see what their thoughts were so we could build an actionable playbook that agencies can use to build their program out. They don’t need to start from beginning to end. They can actually pick and choose based on where they are in the anti-fraud journey and take action on those items.”

The playbook breaks down 16 plays across four areas:

  • Create a culture
  • Identify and assess
  • Prevent and detect
  • Insight into action

Johnson said the playbook is meant to be actionable and based on current resources agencies have on hand. She said BFS and the CFO Council make it clear in the playbook that part of the implementation is to share what you know, borrow tools and practices from others, and take advantage of tools and capabilities agencies already are using.

Playbook serves as GPS to GAO fraud framework map

Linda Miller, a director and the fraud risk management practice lead with Grant Thornton, said the playbook can be considered the GPS to the GAO fraud framework map.

Miller, who helped draft the GAO fraud framework, said the playbook helps bring the concepts down to an implementation level.

“A lot of agencies were struggling in assessing their fraud risk, where would they use data analytics and where would they start,” Miller said in an interview. “The great thing about the playbook is it really has a paint-by-numbers approach. We really broke out by plays, which are written in a vernacular that is easy to understand and has a lot of graphics and checklists. We really wanted agencies to pick up the playbook, look at one play and say, ‘I’m going to do these three or four things. This is what I’m going to try to get accomplished in the next quarter,’ and not try to bite off a huge amount of work that there is no way they could get done and feel overwhelmed.”

Johnson said the bureau plans on holding training sessions so other agencies can understand how to use the playbook and learn from VA’s experiences. BFS and other agencies have tools they also can share, which eventually will be listed on a single website, and agencies can always hire contractor services.

“The plays all apply, but it’s how you apply them to particular types of fraud,” she said. “At the beginning, you are assessing what your fraud is, what your fraud risk is and where you are in that journey, and where you need to be.”

Miller said part of the challenge is agencies tend to think they don’t have any fraud, or if their programs do, they think it’s the IG’s responsibility.

“One of the big plays we really focus on is building your fraud awareness. I think the play is called ‘Fraud is not a four-letter word,’” she said. “Agencies are starting to realize it is their responsibility.”

Johnson added another play that is trying to change agency perspective: “Think like a fraudster.”

“Generally, when you ask people if they have fraud, they think of what they would do, but not what a fraudster would do. So we give them a map of how you think like a fraudster to see where those entry points are. That will be key to developing their action,” she said.

Miller said agencies shouldn’t get overwhelmed by the playbook and should just get started looking for and mitigating fraud.

“The key is you start somewhere. Ideally, you’d start with a fraud risk assessment, because that’s really where all the exciting work like analytics can come from and guide your investment of resources,” she said.

Read more of the Reporter’s Notebook

DHS to use federal procurement to further reduce risks to the supply chain


The Homeland Security Department’s initiatives over the past year to address supply chain risks aren’t even close to hitting a crescendo. But the pace and volume of the drumbeat are distinctly mounting.

If the efforts to ban Kaspersky Lab, ZTE and Huawei products were just the prelude to the symphony, then the National Risk Management Center’s initial sprint topics, the business due diligence request for information and the latest effort to use the power of federal procurement are the opening sonata.


“There is a growing awareness and understanding to this issue. Our biggest challenge today is not having a national strategy around it while other countries do,” said Jennifer Bisceglie, president and CEO of Interos Solutions, which provides risk assessment services. “Until we have a national strategy, you will have pop-up policies or programs or studies, like the one from MITRE. The time is beyond here to have a national strategy.”

The White House’s National Cyber Strategy gave a brief mention to supply chain risk management, saying the government should “improve awareness of supply chain threats and reduce duplicative supply chain activities within the United States government, including by creating a supply chain risk assessment shared service.” But it offered no specific details or initiatives.

Only now are those starting to emerge through a series of DHS-led efforts.

Chris Krebs, the DHS undersecretary of NPPD, offered further insights at several events over the last few weeks, setting up bigger expectations for 2019.

The National Risk Management Center seems to be one major hub of activity for many of the supply chain initiatives.

Among the first sprints the NRMC is undertaking is one on information and communications technology (ICT), led by a new task force. Krebs said the kick-off meeting is this week, when the group will convene under the Critical Infrastructure Partnership Advisory Council. He said the task force will be the government’s nexus for addressing supply chain risks.

A fact sheet on the task force provided by DHS details some of its initial goals and plans.

DHS said the group will “examine and develop consensus recommendations for action to address key strategic challenges to identifying and managing risk associated with the global ICT supply chain and related third-party risk.” It also will “focus on potential near- and long-term solutions to manage strategic risks through policy initiatives and opportunities for innovative public-private partnership.”

DHS formally announced plans for the task force in July. Without a doubt, one major focus area in 2019 will be reducing risk in federal acquisition.

“On the one hand, we have to make sure in the procurement cycle we are enabling the contracting officers to write the contracts the right way with cybersecurity in mind. But also as the decision process comes through it can be intelligence and threat informed so that we can knock off the bad options if and when they are presented,” Krebs said at the CyberNext conference. The event was sponsored by the Coalition for Cybersecurity Policy & Law, the Cyber Threat Alliance, and the National Security Institute at George Mason University’s Antonin Scalia School of Law in Washington on Oct. 4. “We also are looking at when we are in the deployment phase and something is out there, how do we operationalize what we know so if we have information about a compromise or some other sort of actions, how can we take the appropriate risk management steps to protect federal networks.”


Krebs said DHS wants to get out of reactive mode when it comes to addressing these real and potential risks. The effort to ban Kaspersky Lab products, which several cyber experts have said DHS and the intelligence community knew were a problem for years, took nearly a year to get the software off federal networks and left the government embroiled in a lawsuit.

“I don’t ever want to be in a position to have to issue a [binding operational directive] like that ever again. We want to stop those deployments from happening in the first place, so how do we operationalize intelligence, how do we get it into the procurement cycle as early as possible to write smart contracts and inform the decision makers,” Krebs said. “We must have good options on the table when [we] take bad ones off the table. One of the things the ICT task force will consider is what are those incentives to drive more trustworthy options? The federal government has a great incentive package through the procurement cycle and the power of the purse.”

New details on DHS RFI

The idea of writing smarter procurements is behind the request for information DHS released Aug. 17. The agency recently made public the questions and answers from its Sept. 27 industry day.

In the RFI, DHS wants to see what capabilities exist to provide ICT information through “due diligence” research based on publicly and commercially available unclassified data.

“DHS seeks information about capabilities that address risk as a function of threat, vulnerability, likelihood, and consequences, and aggregate multiple data sets into structured archives suitable for analysis and visualization of the relationships of businesses, individuals, addresses, supply chains, and related information,” the RFI states. “The information generated through the due diligence capability will be shared between organizations and may be used in combination with other information to broadly address supply chain risks to federal, state, local, tribal and territorial governments, and critical infrastructure owners and operators.”
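The RFI does not prescribe a scoring model, but its framing of risk as a function of threat, vulnerability, likelihood and consequences can be sketched as a simple aggregation. The following Python sketch is purely illustrative: the factor names, the 0-to-1 normalization and the multiplicative combination are assumptions for demonstration, not DHS’s or any vendor’s actual methodology.

```python
# Illustrative only: a toy supply chain risk score combining the four
# factors named in the DHS RFI. The multiplicative model and 0-1 scales
# are assumptions, not the RFI's prescribed approach.
from dataclasses import dataclass


@dataclass
class SupplierAssessment:
    name: str
    threat: float         # chance a capable adversary targets this supplier (0-1)
    vulnerability: float  # exploitable weakness in products or processes (0-1)
    likelihood: float     # probability the weakness is actually exploited (0-1)
    consequence: float    # mission impact if exploitation succeeds (0-1)

    def risk_score(self) -> float:
        """Combine the four factors into a single 0-1 score."""
        return self.threat * self.vulnerability * self.likelihood * self.consequence


def rank_suppliers(assessments: list) -> list:
    """Order suppliers from highest to lowest composite risk."""
    return sorted(assessments, key=lambda a: a.risk_score(), reverse=True)
```

In practice, a due diligence capability like the one the RFI describes would feed such factors from aggregated public and commercial data sets, and the weighting would be calibrated to the end user’s risk tolerance, as DHS notes in its answers.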

The General Services Administration ran a similar effort several years ago, but it didn’t get a lot of traction.

Interos’ Bisceglie said the recent RFI is addressing many of the same issues as the GSA pilot, but what’s changed is the understanding of the supply chain risks agencies and industry are facing. Interos ran four of the pilots under the GSA effort in 2016 and 2017. GSA also tried to stand up a business due diligence shared service for agencies, but it didn’t get consistent long-term support.

“Several civilian agencies used it, and those that did made defendable acquisition or market decisions based on the GSA pilot. The challenge was we couldn’t get executive leadership support or get the program resourced correctly,” she said. “There is a clear need and clear void for a due diligence program. I think DHS will see how the market has matured in four years, and then put out a larger multi-year contract for these services. It will be interesting to have a multi-year program that is shared between DHS, GSA, NASA SEWP, the National Institutes of Health’s acquisition organization and others. That would get a lot of the large IT acquisition buying under one program where you could collect once and share often.”

DHS said in the questions and answers that it has not yet determined if there will be a solicitation in 2019.

“The Commerce, Justice, and Science Appropriations Act has a requirement that certain agencies (e.g. Commerce, Justice, NASA and National Science Foundation) conduct supply chain risk assessments for all of their FIPS high and moderate IT purchases. DHS is engaged with these stakeholders and reached out to them for help when drafting the RFI,” DHS states in its answers. “There is no way to ingest all data feeds but the desired outcome is to improve awareness. DHS wants to be able to calibrate the risk assessment to the risk tolerance of the end user/company.”

DHS said one less rigorous example of this type of effort already in place is with the continuous diagnostics and mitigation (CDM) program. In August 2017, DHS and GSA updated the CDM cyber supply chain risk management plan, requiring vendors to answer some basic questions related to manufacturing and tracking of the product before being added to the approved products list.

DHS states that it is working with agencies this year to discover “actionable information” that would be shared across government.

“For each risk indicator, we need to figure out what the appropriate shelf life is. Continuous data monitoring will also have an impact. Veracity: we want data from an authoritative source,” DHS states.

Connected to National Cyber Strategy

And both the business due diligence effort and the NRMC supply chain sprint tie back to the National Cyber Strategy. In the document, the White House makes a specific point to say DHS will have greater insight and oversight of contractor systems from a cyber perspective if they hold federal data, particularly high value assets.

Krebs said while it’s still too early to determine the exact direction of this effort, there are several open questions and facets to consider.

“This is a longer-term cycle where we have to look at whether GSA has the appropriate authorities. Do we have the appropriate authorities under FISMA? Do we need other federal acquisition authorities to ensure the supply chain is secure? We have a suite of tools and capabilities at NPPD, things like cyber hygiene scanning, things like Automated Indicator Sharing (AIS), so what sort of umbrella can we extend across the contractor base, particularly those who touch high value assets,” Krebs said. “Alternatively, what are the security outcomes we really want to achieve through contracting and expect of our contractors, not just in the first tier but the second, third and fourth tiers, and how do they attest to that. There is a lot more to come here. This is a significant opportunity space.”

It’s been over a year since agencies, and DHS more specifically, started to apply a much finer and public focus on supply chain risks. The signs are clear from the White House, from DHS and from Congress that contractors and agencies can no longer be passive participants in this effort.

