“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.
Submit ideas, suggestions and news tips to Jason via email.
The end of fiscal 2019 brought plenty of excitement to the federal acquisition community over the last week.
We saw protests—of course—much-anticipated contract awards and new acquisition strategies that set the stage for 2020.
All of this culminated in what the data will likely show was one of the busiest fourth quarters in a long time.
Let’s start with the big contract awards.
Late on Friday, the Army took a major step toward the goal of getting out of the day-to-day management of its IT services.
The Army awarded three contracts worth a total of more than $33 million under its enterprise IT-as-a-service (EITaaS) program to AT&T ($5.6 million), Verizon ($9.7 million) and Microsoft ($18.2 million).
Using the other transaction agreement (OTA) process, the Army will pilot this concept under a fixed-price contract to install contractor-operated networks on small-, medium- and large-sized bases.
And because it’s an OTA, the Army turned the awards around in about six weeks. The service released the OTA announcement in July.
At the same time, because it’s an OTA, it’s unclear how many other bids the Army received, and the likelihood of a protest is minimal.
The awards also put the Army on a similar path as the Air Force and the Navy in outsourcing the management of all IT services.
Army CIO/G6 Gen. Bruce Crawford has said in the past that the goal is to modernize the service’s network and IT services, and keep them updated as technology and needs change.
The Army estimated earlier this year that 70 percent of the servers, routers and end-user devices on its 288 worldwide facilities are at or near the end of life. The figure is even higher for the equipment that handles voice communications — about 90 percent.
While the Army is just getting started, the Air Force announced on Sept. 27 it is expanding its pilot with Microsoft for EITaaS.
Microsoft won a $44.9 million modification to its previously awarded contract for network-as-a-service.
The Air Force says that under this contract adjustment, Microsoft will provide WiFi, public cellular connectivity, base area network transformation and dual-path wide area network connectivity at the three bases—Cannon Air Force Base, New Mexico; Hurlburt Field, Florida; and Maxwell Air Force Base, Alabama—that are part of this pilot.
This new work extends the pilot effort to Sept. 30, 2021 and means the total OTA now is worth more than $109 million.
Bill Marion, the Air Force’s deputy chief information officer, said at the recent Billington Cybersecurity Conference that the EITaaS pilot will pick up steam over the next month or so.
Meanwhile, the Defense Health Agency issued a request for information seemingly following the services’ path for EITaaS.
While the RFI doesn’t specifically mention the as-a-service construct, DHA is asking for everything from IT operations support and lifecycle management to IT asset planning and management, identity management services, help desk support, end-user device support and local area network administration.
“The DHA Deputy Assistant Director Information Operations/J-6 (DAD IO/J-6) has identified a significant and pressing need to refine its operational IT service delivery model,” the RFI states.
The delivery model focuses on having a “standardized, consistent and repeatable processes for delivery of IT services across the DHA enterprise to optimize use of shared resources. A data-driven, evidence based and measured improvement of operational processes and IT services through the adoption of industry best practices. A continuous assessment of value and on-going operational optimization that drives efficiencies through redesign, redirection and/or automation.”
Responses are due Oct. 4.
DHA says it will review the RFI responses and ask selected companies to discuss specifics between Oct. 15-17 in San Antonio, Texas.
While DHA didn’t discuss its contracting strategy, it wouldn’t be surprising if it followed the Army and Air Force’s approach and used an OTA.
The question, however, continues to come up: if OTAs are supposed to be used to bring in innovative or non-traditional government contractors, then why are the services hiring companies like AT&T, Microsoft and Verizon? If there are innovative companies working as subcontractors, then it’s incumbent on the Army and the Air Force to be transparent about who those non-traditional vendors are and what services they are providing.
Otherwise, it just looks as though the Army and Air Force went around the federal procurement process and exploited the OTA process.
One reason DoD, and now potentially other agencies, are so attracted to OTAs is the fear of bid protests.
The Air Force experienced this with its recently awarded $728 million contract to SAIC to run its common cloud environment (CCE), now known as Cloud One.
Leidos filed a protest on Sept. 20 with the Government Accountability Office. GAO has until Dec. 30 to decide.
In the meantime, the Air Force’s plan to expand this enterprise commercial cloud platform for applications is delayed.
“The first CCE application went live in March 2018. To get there, the program has had to make initial investments. Each migration costs the U.S. Air Force approximately $446,000, and the total cost for the CCE program since 2015 is $136 million,” the Air Force states in an April 2019 release. “This shift frees up money and manpower for other requirements, like building better apps and improving network security.”
The goal for the Cloud One program is to create a standardized platform that uses zero trust architecture concepts to host legacy and new applications while providing the Air Force with better security and more agility to change the application as required by mission needs.
The Army is doubling down to ensure it’s making the most of its troves of data. As part of the Program Executive Office for Enterprise Information Systems (PEO-EIS) reorganization, the Army named a new project manager for data.
Chérie Smith, the program executive officer for PEO-EIS, said last week that Col. R.J. Mikesh will be the Project Manager for Army Data and Analytic Platforms. He will focus on improving Army information readiness and data agility to facilitate fact-based and informed decisionmaking.
The new organization is still in the planning phase, but the goal is to align the PM AR-DAP under the assistant PEO for Business Mission Area within the next year. Mikesh comes to the new PM for data after serving as the PM for Army Enterprise Systems Integration Program.
The Federal CIO Council is making it easier to find and understand federal technology policies. The council launched a redesigned website on Friday with a new policy and priorities catalog detailing 19 items, including A-130 and data center optimization.
Federal CIO Suzette Kent wrote in a blog post that the council took a customer-focused approach to the content and information architecture as part of the redesign.
She said that by interviewing stakeholders, conducting user testing and analyzing visitor data, the council determined what information users were looking for and how to approach the design so they could find it quickly and easily.
This is the first update for CIO.gov since 2017 when it moved to a new hosting site and fixed some back-end challenges.
Over the last 18 months, the Office of Management and Budget painstakingly went over nearly every federal technology policy that came out in the last 20 years with an eye toward modernization.
Every new policy that came from that effort, whether for cloud, identity management or data center consolidation and optimization, included a list of rescinded regulations that had grown outdated and no longer held any value.
The most recent update to the Trusted Internet Connections (TIC) initiative rescinded four policies, some dating back 12 years.
The identity management policy from May terminated five policies, of which some went back to 2004.
As you can see, OMB did more than a little spring cleaning.
This exercise over the last two years or more has led to a new way of thinking about policymaking. OMB and other agencies are taking a page out of the agile software development playbook and applying this methodology to policymaking.
“Because our environment is changing so quickly many of the things we’ve done was create methods, whether it was as simple as a timer to reevaluate every six months or every year and evaluate if this is still effective, or approaches and partnerships like the one with the [the Department of Homeland Security’s] Cybersecurity and Infrastructure Security Agency (CISA) on TIC. If there is a better way or a better idea, we are not changing any of our security expectations and we continue to raise the bar, but we are creating a pathway to ask the question, ‘Is there a better way?’ And make that happen very quickly versus a decade or a long study,” said Suzette Kent, the federal chief information officer, at the CISA Cybersecurity Summit in September. “That is the kind of agility and nimbleness we have to have in this space because cybersecurity is a perpetual state of hyper vigilance. We have to constantly be evaluating what are we seeing, how do we act and what’s the next step?”
For the TIC policy, OMB worked with CISA, the Defense Department and several other agencies that tested potential new approaches to securing internet gateways between the public internet and agency networks.
Jeanette Manfra, the assistant secretary for cybersecurity at CISA, said her office worked with OMB in coming up with both the key policy priorities and implementation guidance.
“The concept is to continue to be able to move fast as technology or the threat changes,” Manfra said in an interview. “We are just now in the last six or seven months realizing the benefits of that.”
OMB and DHS recognized that policies can’t be so broad that they can’t measure what successful implementation looks like, and at the same time implementation guidance can’t be so prescriptive that the policy is not effective.
“The concept is you can potentially renew the implementation guidance on a faster basis than the policies. We are still developing this,” Manfra said. “It also means getting everybody on the same page of what we are going to focus on and it provides a more enduring framework as well.”
A senior administration official, who provided answers to Federal News Network questions, said the feedback loop is critical to achieving the right balance in agile policymaking.
At the same time, DoD is mirroring federal civilian efforts on its networks.
Jack Wilmer, the deputy CIO for cybersecurity and chief information security officer for DoD, said at the CISA event that this agile approach to policymaking and implementation is a key piece of the defend-forward notion around cybersecurity.
“How do I say ‘here is an interesting approach that one of the agencies wants to do in terms of how to connect to cloud or something else’ so let’s get the right set of people together to assess the risk of what that is, to look at the results, to look at how it works, and if it seems like it worked well and it’s a good approach, let’s go ahead and modify the policy to say any other federal agency can use that model,” he said. “The intent is as the threat evolves if we find out that we were letting people do this but now we understand it’s not a good idea, we should be able to rapidly evolve our policy so no new connections use that model that we know now is not the right approach. I am absolutely trying to figure out how do I bring that into the DoD so the policies that I write and we update are things we can evolve in a more agile manner.”
Wilmer said the goal is to increase the cost to hackers for trying to attack federal systems.
The concept of agile policymaking for IT actually started several years ago when OMB began releasing draft policies for industry and other expert comments.
Dan Chenok, a former OMB branch chief and now executive director for the IBM Center for the Business of Government, said putting out draft policies is one step toward the move to agile because it’s a good way to get customer or stakeholder feedback.
“GitHub lets OMB or any agency see the comments, and comments on other people’s comments, and then iterate, instead of having to receive 30,000 comments and figure out what everyone is saying,” he said. “It’s not just policymaking, but policy execution as well. If through this iterative process OMB or the government gets more buy-in, you have a far lower cost of compliance because there are fewer people you have to chase, and you can move to the next policy faster.”
In fact, IBM and the National Academy of Public Administration want to expand the concept of agile beyond technology policy to all parts of government.
In a blog post from July, Ed DeSeve, a visiting fellow for both IBM’s center and for NAPA, wrote that government reform must adopt the concept of agile software development.
“It is critical that we develop a reform agenda to make governments at all levels more agile. For example, we should work to identify key agile government principles; identify instances of agile government around the country and around the world in order to develop ‘best practices’ that can be available to governments and researchers; and collaborate with governments that wish to use agile principles in their projects, programs, and overall organizational design to assist with strategy and implementation,” DeSeve wrote. “Success will require a new mindset in government and new organizational models.”
Terry Gerton, the president and CEO of NAPA, said the change the government has to make is toward a more proactive rather than reactive policy process.
“It is more response to the environment we are in now. We know some of the regulations are outdated, and there can be volumes and volumes of them. We have a sense that these regulations are very controlling and they may not advance government,” Gerton said. “There are new ways to do government so we are citizen or customer responsive, more timely and using a more cross-functional approach. Virtually no problem we have can be solved by one branch or one agency any more so how do we help users of the regulations be more successful so we have better government.”
The senior administration official said there are several changes needed to move to an agile policy mindset.
“Agencies are well on their way to embracing this cultural mindset and OMB is best positioned to act as an enabler by removing barriers that have long plagued the modernization journey,” the official said. “As we continue our journey, we are looking across the board at opportunities to take a more agile approach to policy development and service delivery. As evidenced through our work with the Technology Modernization Fund and shared services, iterative approaches will enable the federal government to more rapidly improve the digital service experience provided to the American public.”
Kshemendra Paul spent six of his 13 years in federal service focused on data and information sharing.
Now in many ways, he’s going back to that world.
Paul, who was the program manager for the Information Sharing Environment from 2010 to 2017, is the new chief data officer at the Department of Veterans Affairs.
He joined VA earlier this month after spending almost three years with the Department of Homeland Security as its cloud action officer and deputy director of mission and strategy in the Office of the Chief Information Officer.
Paul replaced Dat Tran, who was the interim CDO and returns to his role as deputy assistant secretary for data governance and analysis.
The Foundations for Evidence-Based Policymaking Act required every agency to name a chief data officer by July 13.
In his new role, Paul will have a great opportunity and huge challenge. VA collects mounds of data from assorted mission areas, and given the complex nature of its network, making the information more valuable will not be easy.
VA launched an open data portal in 2013 and now lists more than 1,500 data sets on the Data.gov site.
“Open Data is an initiative that seeks to advance government transparency and promote innovation by making data accessible to the public. Using machine-readable data that the public can access, use and share, federal agencies can promote a more open and efficient government, identify creative solutions that can address existing challenges, and spur economic growth,” VA states on its open data portal. “VA’s Open Data team is working to establish a new and robust portal where users can access data, application programming interfaces (APIs), tools and resources that can be used to develop web and mobile applications, design data visualizations, and create stories directly from VA resources. When VA establishes this new tool, a more comprehensive Open Data Portal will be made available.”
Before coming to VA and working at DHS and as the PM-ISE, Paul also worked as the chief architect at the Office of Management and Budget and at the Justice Department.
Paul was one of several federal executives on the move over the last month.
Joining in the migration to new agencies are Bill Hunt, Nicholas Andersen and Shila Cooch.
Hunt, who spent the last two years as the cloud policy lead at OMB as part of the U.S. Digital Service, is the new chief architect at the Small Business Administration.
In joining SBA, Hunt becomes another impressive addition to the team CIO Maria Roat continues to build.
Over the last two years, SBA has been on an IT modernization journey. Hunt will help the agency rationalize more than 50 applications, continue its move to the cloud and ensure the security of its data through innovative approaches.
“I’m looking forward to applying the lessons I’ve learned working on the Cloud Smart, Data Center Optimization, and Application Rationalization policies — and discovering where I got it wrong,” Hunt tweeted earlier this month.
While Hunt left OMB, the Federal CIO’s office started to fill some key openings.
Andersen joined as the federal cybersecurity lead in OMB and Cooch is the new senior policy adviser.
Andersen joined OMB after serving as the chief information security officer for the state of Vermont for the last nine months. He replaces Josh Moses, who left in November to join the private sector.
In addition to his time in Vermont, Andersen also is a Marine Corps veteran and worked as a civilian for the Navy and Coast Guard in cybersecurity roles.
Meanwhile, Cooch also fills a key position at OMB. She comes to the Federal CIO’s Office from the Homeland Security Department where she was the chief of staff for the CIO for the last four years. In all, she worked at DHS for the last 15 years.
In her new role, Cooch will lead the development and implementation of IT policy across the government.
And speaking of DHS, the U.S. Immigration and Customs Enforcement is looking for a new CIO.
Michael Brown left ICE after four years to join Gartner as a senior director analyst.
Irfan Malik is the acting ICE CIO. He joined the agency in 2015 as the chief of IT Operations Division where he oversaw the operations and maintenance of IT across the agency.
Brown worked in government for more than 24 years, including for DHS components since 2000, before Congress created the agency, and for the Navy and Marine Corps.
Finally, Lou Charlier is the Labor Department’s new deputy CIO.
He has worked at the agency for 13 years, including time as its director of infrastructure services, where he was the principal adviser to CIO leadership, departmental executives and key agency managers on large-scale IT initiatives. He also helped plan, direct and administer a comprehensive IT program for the department that provided tactical day-to-day leadership, organizational stability and technical expertise.
Michael Wooten became the 15th administrator of the Office of Federal Procurement Policy about six weeks ago. More importantly, he became the first permanent head of federal procurement during the Trump administration.
In his first two public speeches last week, Wooten hit all the expected notes that an OFPP administrator is supposed to reach—building on existing efforts like category management, upskilling the workforce, unlocking technology to create innovation and harnessing acquisition data to turn it into business intelligence.
“There is a considerable alignment that supports what we are doing. This is a good time to be the administrator. There are a lot of people who are cheerleading, saying ‘go, go go and do things for us.’ It’s a good time to cut regulations that get in our way and I’m happy about that,” Wooten said at the Tech Trends conference sponsored by the Professional Services Council in Washington, D.C. “This is a good time to skill up the workforce. I have a mindset that we need to help the workforce shift to the right or use those human judgement skills as opposed to the rote stuff that software can do.”
That concept of using software — take your pick of buzzword: artificial intelligence, machine learning or robotic process automation — to address manual processes and compliance requirements would be a huge accomplishment, and maybe the biggest of any OFPP administrator in the last two decades.
One of the main reasons federal procurement has a bad reputation for being too slow, inflexible and lacking innovation is that contracting officers and acquisition workers are rightly concerned about auditors and congressional overseers scrutinizing their processes. If every “T” isn’t crossed and every “I” isn’t dotted, the consequences are severe, which tends to cause the risk-averse contracting approach we’ve come to know so well.
The Section 809 Panel looking at Defense procurement found in its compendium of recommendations this concept of risk aversion cuts across much of the procurement process.
“In many cases, the Federal Acquisition Regulations (FAR) and other regulations allow for more interaction with industry than is common practice,” the panel states in volume three of its recommendations. “The recommendations … work together in an effort to foster behavior that values interaction with industry and reduces fear of missteps and risk-taking normally associated with interacting with marketplace.”
The panel also says the FAR and even the Defense regulations make it “difficult to effectively navigate and understand the regulations, which prevents acquisition personnel from leveraging the flexibilities, methods and authorities available to maximize speed in the acquisition process and encourage innovation, competition, and investment by the private sector.”
This is why Wooten, who was nominated in February and confirmed in August, wants to, and should, focus on using AI/ML and/or RPA to reduce some of the risks that are inherent in contracting.
“It is time for procurement leadership to engage in conversations with industry and government [about automated technologies],” Wooten said. “We will be in alignment with the federal chief information officer. We need to make sure that we share a common understanding on what AI is. We need to understand government’s AI requirements and we need to understand industry’s AI capabilities. We need to spark innovation in AI in a manner that includes small businesses to the maximum extent practicable. If we get this right, our AI acquisition is not outpaced by obsolescence and is not outpaced by our near peers.”
Wooten said the acquisition system can’t get in the way of delivering on mission needs in a faster, more flexible and more efficient way.
Speaking at the FedInsider event on IT modernization and moving the cloud last week as well, Wooten said one way to do that is to apply AI or other automated technologies to the flow chart processes of acquisition.
“If you have a job that lays out step-by-step-by-step what is prescribed, if you have a flow-chartable job, then I can replace you with software or at least that part of your work that is done with software. We don’t want to replace workers. What we want to do is augment workers and relieve them of the burden of these step-by-step, tedious types of jobs,” he said. “That is one of the things that is on the horizon. This is not the big evil plan of Dr. Wooten to move workers off to the side. I think the leverage of AI into doing those mundane processes faster than we can, cheaper than we can and very regularly. The prospect of that makes the shifting to the right necessary and supplanting workers out of these mundane positions make it inevitable.”
He said he wants to empower contracting officers to solve problems, and to do that, they need a different set of skills than they have today. He said he wants contracting officers to spend more of their time making business decisions rather than following a flow chart of processes.
At the same time, Wooten admitted that OFPP needs to do more to get the word out about his plans and goals, and ensure contracting officers know his office can help them.
“We have to get out to the folks in the acquisition workforce to help them understand it’s our responsibility to shape the rules, the tools and the schools for them to be successful,” he said. “We have a branding problem. We need to do a better job making sure they get the content and handle the communications to ensure people understand what OFPP does for them.”
The best thing OFPP could do for them is go beyond talking about the promise of AI/ML or automation: look at agencies like the Department of Health and Human Services, the Department of Homeland Security, the General Services Administration and others, and expand the work they are already doing in these areas to more agencies. Wooten would be well served to move quickly on implementing automation, as his time in office may be limited to 17 months, and that’s not a lot of time to get things done.
Agencies finally have a basic understanding of the threat landscape around the federal technology supply chain.
And chief information officers, acquisition executives and others shouldn’t feel good about what they’ve learned.
The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency’s Information and Communications Technology (ICT) Supply Chain Risk Management Task Force identified 190 threats across nine groups, including counterfeit parts, cybersecurity and economics.
The task force highlighted the supply chain topography as part of its interim report and four recommendations released last week.
“I think it’s important to make one note on the scope, the group focused on threats as opposed to risks. There was a lot of discussion in the group on that topic because it’s not necessarily immediately clear where a threat might end and a risk might begin. I think the easiest way to kind of explain about how the group thought about the interplay there is they defined risk as the intersection of threats with assets and vulnerabilities,” said John Miller, co-chairman of the task force and the vice president of policy and senior counsel for the IT Industry Council. “Through that lens, you can see why this group’s work was so foundational.”
Along with the threats, the task force outlined 40 scenarios mapped to each of the nine groupings, covering everything from ransomware attacks to contractor compromise challenges to supplier ownership changes to natural disasters.
“In building out those scenarios, several categories were considered by the group, including the interplay of particular vulnerabilities in that context: business impacts, potential business mitigation strategies and controls,” Miller said. “It was a very contextual analysis for each of them.”
All of this information comes at a time when the focus and concerns about supply chain threats are rising.
The two most obvious examples are the banning of Huawei and ZTE products in federal and contractor networks earlier this summer, and the prohibition on Kaspersky Lab products and services in 2018.
But it’s more than just a few well-known risks.
Jeanette Manfra, the DHS assistant secretary for cybersecurity at CISA, said supply chain is one of four priorities for an interagency working group focused on increasing collaboration and coordination to better secure industrial control systems.
“It can’t be your solution to say ‘I’m air-gapped.’ We all know you are not air-gapped,” Manfra said at the CISA cybersecurity conference last week. “You have to make sure you understand both the hardware and software chain of those systems that you are putting into play, and you understand things like access.”
Manfra added it’s more than just understanding the prime supplier of the hardware or software, but getting to know the tier 2, 3 and 4 providers as well as business relationships and ownership.
“Sometimes that’s hard to completely understand, but it’s really important when you are buying a really expensive piece of equipment or system that you make it clear to whomever is selling that to you that you want that level of visibility,” she said. “That can go a long way to solving what I would say are individual supply chain issues.”
The task force’s report tries to give agencies and industry more insight into all levels of the supply chain.
Bob Kolasky, the deputy director of the National Risk Management Center in CISA, said the task force’s recommendations focus on strategic and tactical aspects of supply chain risk management.
On the tactical side, the group suggested agencies buy IT products only from authorized resellers or original equipment manufacturers (OEMs). It also recommended agencies rely on a trusted vendor or product list when the risk is greatest.
“There is a higher likelihood in the analysis we’ve seen that if you are not buying from OEMs or authorized resellers, there’s an increased risk of getting counterfeit products in the system, and with counterfeit systems comes a whole level of technical risk within that,” Kolasky said after a panel at the CISA conference. “We thought that this was a risk mitigation strategy that makes sense and there is an opportunity with federal acquisition policy to push that.”
The General Services Administration is considering rescinding the IT schedule special item number for refurbished or used products because of supply chain concerns. The Defense Department also adopted this policy in 2016.
Kolasky said the Federal Acquisition Security Council also is looking at this issue and may make additional recommendations.
“The more you buy from OEMs or authorized resellers, you have the ability to actually monitor their practices and make some judgements around that. There may be some source of concern with the original equipment manufacturer for different reasons, but it raises the bar of trust,” he said. “There was a general consensus that this was an issue we should take on as a task force. We talked about prospects of this and the premise of this was not that controversial.”
The use of an approved products or qualified bidders list came from research around the DHS continuous diagnostics and mitigation (CDM) program, GSA’s IT schedule and NASA’s SEWP contract.
The working group laid out 11 factors where the use of a qualified bidders or approved products list may make sense.
“That group probably didn’t go as far in the initial rounds as one might have thought. We didn’t come in with the recommendation that you have to establish as a qualified bidder or a qualified manufacturer list. Instead, we worked as a task force to come up with the characteristics you should consider if you do that,” Kolasky said. “I think that’s a little bit of a risk management approach to understand the qualified bidder or qualified manufacturer may be the right solution in certain cases, but not in all cases.”
He added industry was supportive of using this approach when appropriate, especially in light of the additional costs that using an approved products or manufacturer list could incur.
On the strategy side, the task force recommended agencies and contractors understand the cyber threats they face, and share information about those risks more broadly and more quickly.
Kolasky said the goal is to improve private-to-private information sharing and how to get that information into the broader ecosystem, including the government. He said that brings in a whole set of legal challenges, including liability.
In the report the working group states that it “concluded that legal analysis and guidance are a prerequisite to developing a framework for any systematic, omni-directional information-sharing system relating to suspect suppliers. The result of these legal considerations could set forth the guidelines for addressing the process, operational and financial barriers that restrict effective implementation.”
The second strategic recommendation identified 190 threats across nine groups and ways to mitigate them through tools and controls.
Kolasky said the supply chain task force will figure out its next areas of focus, including helping small and medium-sized businesses manage their risks and connect with other critical infrastructure providers about how they are managing ICT challenges.
He said the task force will finalize its year two plan at the end of October or in early November.
Buried in an Aug. 27 blog post on identity management by Bill Zielinski, the General Services Administration’s assistant commissioner for the Office of Information Technology Category (ITC) in the Federal Acquisition Service, is a nugget of important news.
GSA announced that the Veterans Affairs Department is moving its smart card identity management program to GSA's USAccess shared service.
By finally convincing VA to become a customer, GSA is nearly doubling the number of users of its identity credentialing service. Nearly 14 years after GSA launched the managed service, USAccess finally snagged its white whale, so to speak.
“This was a decision I wish I had made when I was there,” said Roger Baker, the former VA chief information officer during the Obama administration. “This was an ongoing project that never caused as much pain that the decision had to be made. VA decided to do its own PIV card system and that was a very complex and massive program. Anything at VA is like that because of the scale.”
And it’s that scale that makes this newsworthy, especially at a time when the Trump administration is strongly encouraging shared services and some agencies such as the Department of Health and Human Services are pulling back their shared services offerings, and the Interior Department is considering a similar move.
Historically, GSA provided managed services for smart identity cards for most small and micro agencies and several larger ones including USDA, Interior and Commerce.
But getting VA to join is a huge, long-time-in-coming win for the program.
“If you think about the system and the requirement to keep it modernized, the decision to move to GSA may have just come back to the modernization challenge and the fact that VA is better off just letting GSA deal with it,” Baker said. “These cards are more critical today than they were seven years ago because now they are required to access the network, to provide medical care and so much more. It’s not ‘just an access card,’ it does a lot of things.”
Since 2005, when President George W. Bush signed Homeland Security Presidential Directive-12 establishing the requirement to issue and use smart identity cards, VA has provided these services to its employees, operating and maintaining its own personal identity verification (PIV) card management program. VA created an Office of Identity, Credential and Access Management (OICAM) to provide program management and oversight for the system, which the VA's Office of Information and Technology (OIT) maintains.
Baker said at one time during the mid-2010s the agency considered adopting the Defense Department’s Common Access Card, but decided against it in the end.
A VA spokeswoman said the decision to move to GSA came down to ensuring IT resources are more focused on serving veterans and their families.
“While the current VA card management system is hosted at a VA data center, USAccess will be hosted outside of VA’s infrastructure. By outsourcing this system to GSA, VA leadership will be able to focus VA IT resources to improve the Veteran Benefits and Health systems,” the spokeswoman said. “Veterans, taxpayers and VA employees will benefit from this move in numerous ways, including strengthened security at VA facilities, and reduced VA IT resources — both personnel and IT infrastructure. The new GSA equipment and the option to use GSA PIV card issuing facilities (PCIFs) will enable VA to focus on providing support to our Veterans instead of producing PIV badges. It also enhances VA’s capabilities to interoperate with other federal agencies by using the same PIV card that over 100 other agencies are using. Lastly, it stabilizes PIV badge costs for the VA, while eliminating the requirements to manage acquisition and maintenance contracts, freeing up resources across VA funding, acquisition, and contracting to focus on delivering support to our veterans.”
The spokeswoman said the move to GSA should take about four months starting in October to install new smart card facilities through USAccess. By January 2020, VA employees who need new or need to renew their PIV cards will go through the shared service.
Another reason for VA's move is that the Office of Management and Budget's new identity management policy makes it easier for agencies to adopt current and emerging technologies for authentication and verification of users. While the requirements under HSPD-12 are not going away, per se, agencies now have a lot more flexibility in how to meet them. By getting out of the issuance and management of cards, VA OIT can focus its time and resources on making identity access and verification more convenient.
CLARIFICATION: GSA and USDA updated their statement about the future of their CoE partnership on Oct. 4.
The Agriculture Department is so confident in its ability to modernize everything from its call centers to its infrastructure to how it uses and analyzes data that it’s sending most of its contractor and government experts home.
After receiving less than a year of assistance, USDA made the surprising decision to end its long-term relationship with the General Services Administration's IT Modernization Centers of Excellence, as well as with many of the 12 vendors to which GSA awarded six contracts last October to support the effort in three of the five areas.
On Oct. 4, GSA and USDA released an updated statement on their plans to work together going forward:
“USDA and GSA are proud to have completed their CoE workstreams of data analytics, infrastructure optimization, and cloud adoption ahead of schedule and on budget. Our work together will continue on implementing the ASK USDA Contact Center and the creation of ‘one front door’ for USDA customers. USDA and GSA’s partnership will further be highlighted by Secretary Perdue and Administrator Murphy at GSA’s IT Modernization event on Tuesday, Oct. 8,” the agencies said in a joint statement.
A GSA spokesman said the partnership between the two agencies was scheduled to end at the close of fiscal 2019, and confirmed that all Phase II CoE contracts will end on or before Oct. 15.
“The agreed upon plan is that USDA will take ownership at that time and sustain what has been delivered from GSA to USDA. It was a business decision to honor the original agreement. This has been a successful partnership,” said a USDA spokesman in a statement to Federal News Network. “Three of the five CoEs have been transitioned to USDA. USDA is currently transitioning the remaining 2 CoEs to the appropriate organization in USDA. We are implementing our contingency to have USDA sustain these COEs and institutionalize practices learned from the COEs.”
So while both agencies, and even the Office of Management and Budget, are saying all the right things publicly, a few questions arise.
First, if the plan all along called for only one year of support from those 12 vendors, for whom GSA and USDA pulled out all the stops by developing the request for quotes, holding public industry days and aggressively announcing the awards, then why did four of the five contracts include option years?
Industry sources say it’s almost unheard of for an agency not to pick up an option on a contract after just a year without a good reason, such as poor vendor performance or a major change in policy.
Since the Trump administration continues to support and expand the CoE concept and industry and government officials say USDA has been pleased with the support contractors provided, there must be another reason.
That leads us to the second question about the agency's decision: multiple sources say USDA didn't have the money to renew the contracts, and that's why it is taking over the CoEs.
The USDA spokesman didn’t directly respond to the question about funding, saying the decision to take over the CoEs is “not a matter of whether USDA has the money or not,” and there was no need to ask for more money in fiscal 2020.
“Due to where we are in our engagement with the CoEs, there has not been a business need to work with OMB or Congress to request additional funds for the COEs,” the spokesman said.
And the third question over USDA's decision focuses on the contractor support and how much progress could've been made in a year, given that a typical engagement usually needs at least that long to begin to show significant change. It's clear there is still plenty of work around cloud adoption, data analytics and the other areas that may need contractor support in the near future. By not picking up the options, USDA may have to go through another series of RFQs and awards that could delay progress.
For instance, sources say USDA didn’t issue any task orders against the infrastructure optimization and cloud adoption contract. It also means the 10 contractors who won spots on the contract received nothing for their bid and proposal investments.
“Contractors are not happy, particularly small businesses, who feel they were left not winning any work,” said one industry source, who requested anonymity in order to speak candidly about the program. “And large businesses thought they would make themselves invaluable and didn’t expect USDA to have them embedded with employee teams and do knowledge transfer. Once that happened, USDA is saying goodbye.”
The source said USDA edged out contractors by working alongside them every day and learning from them.
“You don’t see this very often, and it’s a model of not having to rely on contractors,” the source said. “It doesn’t say a lot about the contractors either and possibly puts fear in them.”
A GSA spokesman said USDA employees were detailed to the CoE teams to understand how to drive adoption.
“Training, outreach and community of practice development activities were also used to both ensure fiscal 2020 CoE self-sustainment and continued adoption,” the spokesman said. “Training activities were used to address individual mission area adoption of data analytics, increased agile adoption, cloud adoption alternatives, DevSecOps approach and other CoE related topics. Both data analytics and cloud CoEs have initiated communities of practice that will exist long after GSA CoEs leave USDA.”
Another industry source said USDA and GSA constructed the support contracts in a way that required the vendors and/or the agency to “sell” or “market” these services to other parts of the agency, and that never happened.
“There wasn’t as much interest or clarity in that process,” the source said. “This was all a learning process for how to use these vehicles. I think it’s less about money and more about the clarity of the process in terms of how the agency could use the contracts with existing procurements and programs. That’s the biggest thing CoEs had to find out as they went through these efforts.”
The second source said despite these challenges, the CoEs were successful as a vehicle for moving faster and bringing innovation into the federal IT modernization process.
“The execution of the CoE idea was challenging. I think getting the first two going was hard, but by the third or fourth one, it will get easier,” the second source said. “As a business model for industry, that’s a different issue. Vendors are putting a lot of bid and proposal dollars into getting these contracts and aren’t getting much in return. I think this is part of the learning process for both government and industry.”
Since the beginning of the CoE effort, GSA and OMB wanted to disrupt the federal market.
Federal CIO Suzette Kent said in a statement to Federal News Network that OMB supports the concept that the CoEs are an accelerator to drive modernization and transformation across the government.
“We are thrilled at the success agencies and GSA have had with CoEs and look forward to continuing expanding the administration’s efforts for a 21st century government,” she said.
USDA and GSA say all five CoEs met their goals in almost all respects.
“The USDA and GSA CoE partnership met or exceeded all IT modernization implementation objectives,” said the GSA spokesman. “The CoE partnership measured its success in terms of adoption and its impact for both taxpayer citizen and the many stakeholders in the U.S. agricultural supply chain.”
USDA said the five CoEs are providing value and delivered the technology ahead of schedule and on budget. Below are some of the results USDA says it achieved over the last year:
Data Analytics CoE – Modernization has enabled USDA to become more customer-focused, data-driven and fact-based in providing results for the American people.
Cloud and Infrastructure CoE – USDA is becoming more secure and efficient on behalf of American taxpayers.
Voice of the Customer CoE – USDA is improving the overall customer experience through digital modernization and service delivery to citizens.
Contact Center CoE – USDA is improving the overall customer experience through digital modernization and service delivery to citizens.
Customer Experience CoE – USDA is improving the experience of its public-facing customers.
GSA also says it’s using the lessons from the USDA experience with other agencies, including the Department of Housing and Urban Development, which is preparing to move into Phase II, and the Office of Personnel Management, which is in the discovery or Phase I of the CoE program.
“There were multiple lessons learned that will be applied at future CoE engagements. First, change management is a bigger challenge than technology complexity. Coordination within leadership is essential to ensure strategic alignment around the work and components,” the GSA spokesman said. “A second lesson learned was to use agency resources and leverage previously started IT modernization initiatives to drive increased interest and adoption. A key lesson learned was the importance to identify these capable in-house resources to leverage prior progress and also leverage their organizational understanding to drive increased adoption. The CoE also brings highly skilled IT resources that an agency would not normally have.”
The National Security Agency’s new cybersecurity directorate is less than a month away from reaching initial operating capability (IOC) and three-and-a-half months from full operational capability (FOC).
In the meantime, NSA is reorganizing some of its mission areas to fit better under the new directorate, and along with that comes the shifting of people and resources.
Anne Neuberger, the director of cybersecurity at NSA, said at the 10th annual Billington Cybersecurity Summit that the new organization will bring together four cyber communities, including the threat intelligence and vulnerability assessment offices.
She said the two other offices are more on the operational side.
“Our traditional keys and codes mission … that builds a million plus keys a year that are at the root of all secure communications across the armed forces and allies,” she said after her speech in a briefing with reporters. “Our operational mitigation teams that generate the various indicators that we tip to partners across the U.S. government and others. We want our folks to see that the directorate coming together gives them a way to have diversity in their careers and to really learn from those other communities to have that more unified, holistic impact.”
NSA also is preparing the workforce for both the Oct. 1 initial operating capability and Dec. 31 full operational capability milestones by addressing some typical and necessary administrative changes, as well as creating work space so the different communities can work more closely together.
“There are certain priorities we are changing, and there are certain ways we are massing resources on particular problems. So if you are a vulnerability researcher, we will change the way we do vulnerability research by, for example, doing it more in an unclassified space and bringing different kinds of people together to do that mission,” Neuberger said. “But other than that, we want people to have that stability within the confines of the changes we are making to deepen our impact.”
NSA also will be opening up new jobs for current employees or other cyber experts to apply for as part of this reorganization.
“In our traditional security mission, the security and cryptographic standards and cryptographic systems, we are really investing in that mission again,” Neuberger said. “In the broader national security shift, we are moving from our counter terrorism fight, though we are still focused on it, but we also are recognizing that nation states are key adversaries today and we have to make shifts to ensure we are keeping up on that.”
The shift Neuberger is talking about isn’t just with people, but in the strategy and operational areas too.
Neuberger said over the course of the next few months she is focused on unifying the cyber organization, focusing on the hardest problems and enhancing collaboration across the public and private sectors.
“We want to deepen the collaboration between our threat analysis community, our vulnerability assessment community and our mitigations communities, and most importantly the people in those communities,” Neuberger said. “NSA generates hundreds of threat intelligence reports on cybersecurity. In those we detail adversary capabilities and threats. We also have a defensive mission that builds the cryptographic algorithms, cryptographic solutions and provides security advice for the nation’s most sensitive systems. They work together, but we need to deepen that and generate one product, ideally unclassified and quickly, to make it really usable.”
She said by concentrating on these areas, NSA will bring offensive and defensive capabilities closer together, and share threat analyses and offer more tactical intelligence to partners.
“There is a shift because we’ve heard a lot of feedback that some of the information we would share, for example IP addresses or domain names, are temporary and by the time they are shared they are no longer useful,” Neuberger said. “And when we share threat information at the unclassified level, there needs to be more context. What are the overall goals of the actor? How do they pull together those goals using an exploit or a particular infrastructure against a particular set of targets? We want to change from the more tactical elements being shared to pictures that help cybersecurity individuals who work the mission each day use that information each day to better impact.”
The ranks of cybersecurity experts who rely on NSA are growing. The Homeland Security Department is relying on DoD for threat and vulnerability information in a much larger way.
Chris Krebs, the director of the Cybersecurity and Infrastructure Security Agency (CISA), said at the Billington event that working with NSA and other agencies, such as the Energy Department, to improve the security of the nation’s critical infrastructure and federal networks is vital.
“It’s almost like a concept that is widespread in the military where there is a supporting command and a supported command. We are the supported command and NSA is providing us with information to help us execute our mission — elections is just one example — but broader critical infrastructure,” Krebs said after he spoke at the summit. “There is no overlap [with the NSA]. This is understanding the lanes in the road and being able to execute in the same direction.”
Over the course of one-and-a-half days at the 10th annual Billington Cybersecurity Summit, more than 70 speakers hit upon nearly every unclassified topic you could imagine. Attendees even heard from Israeli and U.K. cyber executives, who helped make the world a little smaller by demonstrating their challenges are no different than the ones faced by U.S. federal agencies.
The one common theme that permeated across nearly every keynote, panel and breakout session was the safe discussion about the cyber workforce.
And while discussing the cyber workforce is both a nonthreatening and easy topic for most of the industry moderators, who are worried about making a current or future customer mad by asking more pointed questions, the panelists actually offered some real updates about how they are addressing this long-term challenge.
Let’s start with a little background.
The Center for Strategic and International Studies found in a survey of IT decisionmakers across eight countries that 82% of employers report a shortage of cybersecurity skills, and 71% believe this talent gap causes direct and measurable damage to their organizations. According to CyberSeek, an initiative funded by the National Initiative for Cybersecurity Education (NICE), the U.S. faced a shortfall of almost 314,000 cybersecurity professionals as of January.
There are dozens of studies and surveys that add to the cyber workforce shortage narrative, and the gap is only going to get bigger as agencies and private sector organizations add data scientists and software coders to the mix.
In the federal sector, the Office of Management and Budget, the CIO Council, the Department of Homeland Security and the National Institute of Standards and Technology have all launched initiatives to tackle the cyber workforce problem on a grand scale: think of the cyber workforce reskilling program and the executive order creating a rotational program for public and private sector experts.
“The workforce work under OMB has been incredible in that we’ve actually divided it up amongst the CIO and CISO councils, where we are now taking work streams and putting people in work groups to develop an approach to developing data analysts to make sure it’s the same whether at the departments of Energy, Veterans Affairs or Treasury,” said Paul Cunningham, the VA chief information security officer, at the Billington event. “What’s really important about that, when they get categorized, and their level and coding are done correctly, we can now move them across the federal space and we will know where they are at, what we are getting and what they need to move to the next branch. While it’s important to have the historical side of cybersecurity in a federal organization, it’s also beneficial when we can leverage what is being done in other federal elements.”
For our purposes, let’s delve deeper into a few examples of how specific agencies are adding more cyber firepower to their workforce.
Sometimes agencies have to take some risks with their workforce and while these aren’t the typical risks for CIA employees, the changes make sense for the overall direction of the agency over the last decade.
Sean Roche, the associate deputy director for Digital Innovation at the CIA, said the spy agency changed its pay scale, altered the way it hired by putting some of their business and mission leaders in the field to recruit new employees and decided promotions don’t necessarily have to mean management.
“We did something that hadn’t been acknowledged before, which was we now promote people up through the Senior Executive Service as experts and they don’t have to manage. They are better with machines than people and we want to keep it that way,” Roche said. “To be promoted up to an SES, they have to maintain the skills, but they have to be mentoring and bringing on others. It’s a significant portion of the people we promote to SES every year. That has really given people a path.”
While the CIA transformed its human resources approach when it launched its digital innovation directorate in 2015, the lessons it offers can be applied to the cyber workforce.
The Defense Department’s implementation of its Cyber Excepted Service has been slower than many would’ve liked. The recent decision by Congress to reject the Pentagon’s request to reprogram $4.8 million for this program tells you a little bit about lawmakers’ frustration with the military’s efforts.
Still, Tom Michelli, the vice director of command, control, communications and computers (C4)/cyber and deputy CIO for the Joint Chiefs of Staff/J6, said the initiative is picking up steam.
DoD has converted 2,500 people in the Cyber Excepted Service and reduced time to hire at the U.S. Cyber Command to 80 days from 111 days.
“We can hire folks at higher grades than we would normally hire and through direct hire. We are able to bring in military folks at different grades than we would normally bring them in at,” Michelli said. “Once they are in, we have the ability to provide additional education and training and a higher pay scale on the civilian side and bonuses on the military side.”
Even though Congress provided DoD with the authorities under the Cyber Excepted Service, there is enough evidence that every agency would benefit from similar authorities. The Office of Personnel Management gave all agencies the ability to hire cyber workers directly in October 2018.
DHS, like DoD, has been out in front of addressing cyber workforce shortages.
DHS used retention bonuses of up to 25% of an employee’s pay back in 2016. The department also held cyber and technology job fairs where it made on-the-spot offers to 150 candidates. And it has been developing a new cyber talent management system for the better part of two years.
John Zangardi, the DHS CIO, said the goal is not just to find people who know cybersecurity, but to focus on the skills and abilities they bring to the agency.
“We have to make salaries more comparable to what industry earns. It’s about flexibility. It’s about using technology. And it’s about creating an environment where people can move back and forth [between government and industry],” he said. “How can I actually get on board the right technical skills that can help me with mission? Being in government, I cannot match the salaries of industry so I have to work some unique ways. I have to appeal to their sense of mission and their patriotism.”
Zangardi said the new talent management system should help create more automation in how DHS hires people. He also said a new cyber internship program, which ran this summer with 10 individuals, will help create a pipeline of qualified workers.
“You have to help the team deal with the growth in data, and we have to face up to the unique challenge the government has in hiring,” he added.
One way DHS is taking advantage of the skills and abilities of its workforce is through new training for cloud computing, which includes some cybersecurity aspects.
Zangardi said the Cloud Stand Down effort is about training and educating technology and non-technology workers about how cloud works and what they need to consider as they buy, manage and use these services.
All three of these agencies have added authorities that others don’t, but it’s clear there are steps every department can take whether it’s asking mission leaders to recruit new employees or investing in training and education resources. It would be nice if we could stop talking about the cyber workforce at every panel as this is a fixable problem.
Over the last few weeks, several lesser known, but significant changes came to the federal IT and acquisition ranks.
While these federal executives may not be known as well as some of their chief information officer colleagues that we usually write about in this space, they nonetheless have a big impact.
Let’s start with Kamela White, who left the Office of Management and Budget in August after 19 years. White joined the Senate Appropriations subcommittee on Homeland Security as a professional staff member.
Many of you may not recognize White’s name, but you’ve been impacted by her efforts. She was a senior program examiner at OMB starting in 2000 where she initially worked on some of the e-government initiatives around cybersecurity and later around shared services.
She later worked on homeland security issues, including immigration and visas.
Since July 2017, White had been the director of enterprise analytics at OMB, where she helped accelerate the adoption of advanced analytics to support more data-driven policy, budget and operational decisions.
White is one of those people who made OMB work, putting her head down and drawing little attention to her successes.
Jimmy Jones is another person in the same mold of making the trains run on time and helping agencies find success.
Jones left the Transportation Department, where he was a program analyst in the CIO’s office, after four years working on a host of issues from creating an open source repository to working on emergency response for hurricanes on behalf of the agency.
In a note posted on LinkedIn, Jones writes, “I have decided to take a job offer with Pinellas County’s Tax Collectors Office as a senior project manager around the first of September. This job opportunity came to me as a total surprise, but it is the right position for me at the right time in my life. Therefore, I am moving from a federal position into a county position. As many of you already know, I have been commuting back and forth from D.C. to Florida for over the last four years. Please note that I have been provided a lot of wonderful experiences and challenging assignments during my career.”
Jones started his federal career on Capitol Hill where he worked for the National Republican Senatorial Committee. He moved over to the Education Department shortly after and spent five years developing IT business cases.
He joined the Interior Department in 2006 and then the Recovery Accountability and Transparency Board in 2010.
“I was able to improve and enhance my knowledge while working on so many areas that allowed me to grow at each of my positions,” Jones writes.
Over at the Homeland Security Department, Beth Cappello joined as the new deputy CIO replacing Stephen Rice.
Cappello comes to headquarters after spending nearly three years at Immigration and Customs Enforcement as its deputy CIO and acting CIO.
She also worked at Customs and Border Protection for five years as head of its Enterprise Networks and Technology Support office.
Rice left in June to be the deputy CIO at the Navy Federal Credit Union.
Also at DHS, but on the procurement side, Milton Slade is a new industry liaison. He comes to the headquarters office of the chief procurement officer after spending the last nine years as a contract specialist at DHS.
“As I transition, I look forward to this challenge and the opportunity to engage with many of you on strategy, innovation, outreach and better communication in order to build a stronger, more robust DHS,” Slade writes on LinkedIn.
Over at the Office of the Director of National Intelligence (ODNI), Benjamin Huebner joined as the new chief of the Office of Civil Liberties, Privacy, and Transparency (CLPT).
He also will be the Intelligence Community’s (IC) Civil Liberties Protection Officer, a position established by the Intelligence Reform and Terrorism Prevention Act of 2004, and as ODNI’s Chief Transparency Officer.
Huebner replaces Alex Joel, who held the position since 2005 and left in July.
And finally, Terryne Murphy, who left in August as the Commerce Department’s acting CIO, announced her new position as CIO of the U.S. Railroad Retirement Board in Chicago. She replaces Ram Murthy, who had held that position since 2013.