Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

Application rationalization playbook: A prescriptive AND flexible approach to the cloud

The Small Business Administration may be the perfect case study for why agencies needed the new application rationalization playbook from the Chief Information Officers Council.

SBA began the process of rationalizing its 50-plus applications one at a time as it moves more and more to the cloud.

But Deputy CIO Guy Cavallo quickly put a stop to that approach.

Instead, Cavallo said SBA is using the Agriculture Department’s methodology to determine the priority list of applications that need to be modernized or eliminated.

“If you have app ‘x,’ should you do one of four things with it: Turn it off; keep it where it is; move it to the cloud in an infrastructure-as-a-service platform; or should you rewrite it?” Cavallo said during the ATARC cloud summit on June 25. “We have a process and it will be documented. We are in the process of doing that.”
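Cavallo’s four options amount to a simple triage rubric. Purely as an illustration — SBA and USDA have not published their decision logic, and the inputs and thresholds below are invented — a minimal sketch of that kind of disposition decision might look like this:

```python
# Illustrative only: SBA's actual methodology (borrowed from USDA) is not
# public in this form. This sketches the four dispositions Cavallo lists.
from enum import Enum

class Disposition(Enum):
    RETIRE = "turn it off"
    RETAIN = "keep it where it is"
    REHOST = "move it to the cloud (IaaS)"
    REWRITE = "rewrite it"

def triage(business_value: int, technical_health: int) -> Disposition:
    """Toy rule: low-value apps are retired; higher-value apps are kept,
    rehosted or rewritten depending on technical health. Thresholds are
    invented, not SBA's."""
    if business_value < 3:
        return Disposition.RETIRE
    if technical_health >= 7:
        return Disposition.RETAIN
    if technical_health >= 4:
        return Disposition.REHOST
    return Disposition.REWRITE

print(triage(business_value=8, technical_health=2))  # Disposition.REWRITE
```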

SBA isn’t alone in its experience with application rationalization. Agencies have been talking about this effort for several years, but few have made real progress.

Three-pronged approach to the cloud

This is why the CIO Council released its playbook, and why it comes at a perfect time.

“The playbook is one of three interconnected policy strategies that all work together. The Data Center Optimization Initiative (DCOI), Cloud Smart and Application Rationalization feed into, and off of, each other,” said a spokeswoman for the General Services Administration, which is where the CIO Council lives. “The application rationalization process supports optimization and/or closure of federal data centers, as agencies methodically parse through their enterprise IT portfolios and make informed business decisions on where to house applications and data. Agencies can determine whether the existing ‘as-is’ environment, or a proposed ‘to-be’ configuration is the best fit for their core mission, based on cost, business resiliency and service delivery. This rationalization process feeds directly into Cloud Smart decisions. As we streamline existing data centers, efficiency decreases, eventually leading to closure.”

The Office of Management and Budget released both the Cloud Smart strategy and the updated data center policy about a week after making the playbook public.

The council actually sent the playbook to the agencies several months ago after developing it through the Cloud and Infrastructure Community of Practice (C&I CoP). GSA says the application rationalization guide draws on enterprise portfolio management and on migration and implementation strategies based on the Technology Business Management (TBM) framework.

“Two agencies, the departments of Energy and Justice, are piloting the playbook, and OMB and GSA will make the results available in the near future,” the GSA spokeswoman said. “Lessons we learn through the pilots will be added to the playbook, and we’re actively engaging with the CFO Act agencies to offer assistance with application rationalization.”

Margie Graves, the federal deputy CIO, said at the ATARC event that the playbook gives agencies a more prescriptive approach to understanding where to start and what applications they need to modernize first.

She said it also helps agencies create their business cases for modernization through the use of TBM standards.

“The biggest mistake that is often made is thinking of cloud migration as simply as replatforming. That’s not what it is. It’s an opportunity for you to reimagine the way you are delivering your services in a digital age,” Graves said. “The Application Rationalization playbook is a practical primer and walks you through all the elements you need to address. It gives you checklists and questions to ask. It makes sure you are covering all your bases.”

The playbook provides a six-step rationalization lifecycle that runs from identifying a need to determining a migration strategy, including hosting alternatives.

Source: CIO Council’s Application Rationalization playbook

Graves said this lifecycle should help reduce the risks agencies face as they modernize their systems.

“You should prioritize mission-critical systems for customer service and mission delivery,” she said. “There is a huge opportunity to incorporate cloud technologies and existing commercial services into this equation. It’s not simply automating existing ways of doing business but reimagining that digital world. This is one reason why at the outset you need to have the most critical mission owners with in-depth mission knowledge at the table.”

Focus on the customer

Chris Cairns, managing director at Skylight Digital and a co-founder of GSA’s 18F organization, said starting from the perspective of the end user is one of the most important factors in the playbook.

“If you don’t understand what their needs, behaviors and pain points are from start to finish (a journey which often traverses the siloed structures of government), you fall into the trap of making assumptions about what’s best for them, which are often wrong, and ultimately designing a disjointed experience,” Cairns said by email. “I think this causes a glut of useless, unusable and duplicative applications that don’t support the end-to-end journey a user has to go through. I always hear from CIO organizations that ‘we implemented it, but the business/customer doesn’t use it.’ That’s usually because it wasn’t implemented in a user-centric way. The need for application rationalization is really just a symptom of the root-cause problem: A lack of focus on the user experience. We need more investment on providing guidance on how to do that well.”

Cairns said the next logical step would be a playbook around creating a cohesive user experience as part of application rationalization.

“I would like to see more investment from federal IT leadership in thinking about how to ‘rationalize’ the disjointed services that citizens, businesses, etc. experience on a daily basis. The reason these services are so disjointed is because their design reflects the siloed structures of government, when they should be designed to reflect the end-to-end task or transaction that needs to be performed, regardless of structure,” he said. “Let’s rationalize that. I also think such an outside-in view, as opposed to an inside-out view, would lead to even smarter rationalization of applications because you’re looking at things from the perspective of the actual users.”

The CIO Council fully intends to update the playbook as best practices and new requirements emerge.

“Every agency has always performed at least some application rationalization, which begins whenever a new application is launched. We’re now building upon this initial process to gather all the (apparent and not-so-apparent) information about cost, risk, business resiliency and service delivery,” the GSA spokeswoman said. “We speak with all components involved with an application, gathering as much information as possible to support informed business decision-making. We aim to track the total cost of operation over time, to clearly see where efficiencies can be gained. The Application Rationalization Playbook is intended to be a living document, and as we collaborate with, and learn from, our communities of practice and centers of excellence, the lessons we learn will be added to the playbook. Since every agency has a different core business mission, the playbook is a knowledge-based reference, and each agency can pick and choose their rationalization decision points, as it makes sense for their environment.”


Updated: Industry group asks Senate Appropriations Committee to rein in FFRDCs

CORRECTION: This story has been updated to reflect the fact that Noblis is not connected to MITRE or FFRDCs. A Noblis spokesman provided a statement from Noblis CEO Amr ElSawy: “Noblis strictly adheres to all government regulations regarding conflicts of interest and has implemented a systematic process to identify and avoid all OCIs in the work we do. Our objectivity is the key to Noblis value to our clients and is the basis of the excellent reputation for ethics and integrity that we have worked hard to build and have earned with our clients.”

Updated July 5, 4:15 p.m.: This story now includes clarifications from MITRE.

The House version of the fiscal 2020 defense authorization bill tells the Defense Department to work with federally funded research and development centers (FFRDCs) on 11 different projects.

These range from a study of the barriers to entry into the Armed Forces for English learners, to an independent assessment of the force structure and roles and responsibilities of special operations forces, to a study on how to improve competitive hiring at DoD.

Over the last half century, FFRDCs have played an important role in helping agencies address some of their biggest challenges. In 2015, the latest year for which data is available, 12 agencies awarded 42 FFRDCs more than $11 billion in research and development funding, which accounted for almost 9% of all R&D funding across the government.

FFRDCs are attractive to agencies for several reasons: The government establishes and approves these organizations, they provide capabilities that do not exist elsewhere, and they are expected to give independent advice to agencies. These organizations also are supposed to provide advice and studies, but not bid on services to implement their findings.

Over the last decade, the line between FFRDC and service provider has blurred too often, leaving some concerned that these organizations have an unfair advantage when bidding on work that grows out of their own studies.

The Professional Services Council (PSC), an industry association, raised these concerns to Senate Appropriations Defense Subcommittee lawmakers in a letter last month.

“DoD benefits from FFRDC efforts in basic research for which there is insufficient return available to support private sector investment. DoD also benefits from certain ‘trusted agent’ support from FFRDCs. These are among the legitimate functions for which FFRDCs were established,” PSC writes in the letter obtained by Federal News Network. “However, too often, these same FFRDCs are being awarded sole-source, non-competitive contracts by DoD to perform work that private sector, for-profit U.S. companies can do equally well or better, while saving scarce funds through full and open competition. This violates the intent of the Competition in Contracting Act (CICA) and undermines the real and valid purposes for which FFRDCs exist.”

At the same time, other experts say FFRDCs are restricted from bidding on work resulting from their research, but can serve as a technical resource to the government and to contractors during the implementation phase.

Conflict of interest concerns raised

David Berteau, the president of PSC, said in an interview that the appropriations committee increased DoD funding for research and development by 3% in 2019, but didn’t give the department or the FFRDCs any additional employees. That is not the case for 2020: Berteau said the House defense appropriations bill included an increase in the number of employees that FFRDCs can hire, which he said could exacerbate the current blurring of conflict-of-interest lines.

“The immediate issue we see is that, with the return of the threat from China and other nations and the emphasis on innovation across DoD from Under Secretary of Defense for Research and Engineering Michael Griffin, DoD needs technical expertise to address those investments in innovation,” Berteau said. “We believe there is that capacity in the FFRDCs if they stop doing the other things that they weren’t meant to do.”

John Weiler, the CEO of the IT Acquisition Advisory Council, said MITRE is the biggest culprit in this blurring of lines, pointing to several instances where the FFRDC provided research and then bid on the services to implement that research, creating a conflict of interest.

MITRE operates FFRDCs in areas such as homeland security systems, national cybersecurity and national security engineering.

Weiler claimed there are several examples of MITRE double dipping — doing the research and then winning the technical production contract.

Weiler said the Army, Air Force and Navy’s distributed common ground system is one of those instances: MITRE supported both the systems engineering effort and the oversight by the services’ acquisition shops.

“Congress directed the Army to review commercial-off-the-shelf (COTS) data analytics and cloud offerings in the wake of the failed DCGS-A, and the Program Executive Office, Intelligence, Electronic Warfare and Sensors (PEO IEW&S) also hired MITRE,” Weiler said. “MITRE did a study which again concluded that no commercial products could meet the need, following a similar study years ago that led to the failed Army Red Disk program. Palantir filed suit in federal court and proved there were commercial products.”

Weiler also said there are other examples of conflicts of interest where MITRE seemed to have played both sides of the effort — the Defense Information Systems Agency’s secure mobility program and the Centers for Medicare and Medicaid Services’ healthcare.gov development.

A MITRE spokeswoman said in an email to Federal News Network, “As the nation faces an ever increasing set of national security challenges, we thought it would be good context to explain that the work we are assigned to do for the DoD is the result of a strong governance process run by the DoD to focus MITRE on a unique set of national security challenges facing this country. This DoD process has been independently reviewed by the Government Accountability Office in the past with favorable results.”

The spokeswoman added that MITRE’s efforts are closely aligned with Griffin’s priorities.

“We constantly strive to ensure our work programs are consistent with the Federal Acquisition Regulations and the mission, purpose and scope of the specific FFRDC,” she said. “The taxpayers and those defending this country deserve nothing less.”

FFRDCs need to stay in their lanes

Berteau said PSC, which didn’t mention MITRE directly, fully understands the value FFRDCs bring to the table.

“They matter a lot, particularly around their ability to do science and technology research that is not dictated by always getting a return on investment. It is critical to have a trusted agent side-by-side with the government as it develops requirements,” he said. “It’s about FFRDCs stopping the things they shouldn’t be doing to free up resources so they can tackle the nation’s science and technology challenges and stay ahead of other nation states.”

PSC told the committee that FFRDCs are encroaching into technical work and other functions that should be open only to for-profit companies.

“This encroachment into legitimate private sector business opportunities by government-protected non-profits hurts small businesses by prohibiting them from even competing,” the letter states. “In addition, the cost of FFRDC personnel is much higher. While this might make sense when providing unique talent to DoD, it does not make sense when other companies can do the work with the same proficiency at much lower costs.”

PSC said Congress has taken steps in the past to constrain FFRDCs’ alleged encroachment.

“Most recently, section 8024 of the fiscal 2019 Department of Defense appropriations conference report included language to prohibit FFRDCs from certain activities and unauthorized growth,” the letter states. “PSC respectfully requests that these provisions remain in the 2020 Defense appropriations act and that those provisions be coupled with reductions in staff years of technical effort (STEs) from the levels in 2019’s Section 8024(d) and with decreases in appropriations from the levels in Section 8024(f) back to 2018 levels.”

Additionally, PSC asked lawmakers to direct DoD to report to the committees on the “proper roles of FFRDCs in providing essential, unique support to DoD, particularly as they have expanded their engagement well beyond the core missions they were created to perform.”

The Senate defense 2020 appropriations bill is not yet out of committee. The House’s version, which passed out of the Appropriations Committee in May, has two FFRDC provisions. One “prohibits the use of funds appropriated in this Act to establish a new federally funded research and development center (FFRDC), pay compensation to certain individuals associated with an FFRDC, construct certain new buildings not located on military installations, or increase the number of staff years for defense FFRDCs beyond a specified amount.”

The other provision would prohibit funding “to establish a new Department of Defense FFRDC, either as a new entity, or as a separate entity administrated by an organization managing another FFRDC, or as a nonprofit membership corporation consisting of a consortium of other FFRDCs and other nonprofit entities.”


Senate committee details cyber deficiencies at 8 agencies, but is that the whole story?


On the surface, the Homeland Security and Governmental Affairs Committee’s report on cybersecurity at eight agencies is damning. It highlights systemic and profound problems with how some of the largest agencies are protecting their data and systems.

“During the [investigations] subcommittee’s review, a number of concerning trends emerged regarding the eight agencies’ failure to comply with basic National Institute of Standards and Technology (NIST) cybersecurity standards,” the report states. “In the most recent audits, the inspectors general found that seven of the eight agencies reviewed by the subcommittee failed to properly protect personally identifiable information (PII). Five of the eight agencies did not maintain a comprehensive and accurate list of information technology (IT) assets.”

As one of the investigators told Federal News Network, the report showed fundamental problems with federal cybersecurity that auditors have consistently highlighted and agencies have failed to address for a decade.

Basically, the report portrays the eight agencies as having done little to no work on cybersecurity over the last decade.

While the report highlights consistent shortcomings across these eight agencies, current and former federal cyber executives say the subcommittee didn’t capture the entire picture and maybe even does a disservice to all the progress made over the last four years—since the 2015 Office of Personnel Management cyber breach.

“It is a look back in history and it doesn’t capture the responses and things that have changed. We do value the information from all of these reports and it has guided us in some of our actions in response to what they found,” said one federal chief information security officer, who requested anonymity because they didn’t have permission to talk to the press about the report. “When the committee goes back to 2009, we didn’t have our contemporary cyber laws in place. If you start in 2014 or 2015 and look forward, we do have a lot more direction from the Office of Management and Budget and the Homeland Security Department, and a lot more legislative requirements and mandates, which have helped generate guiding principles. What DHS, OMB and GAO are auditing and the areas they have targeted, all produce information on how we can improve, which is an ever evolving and constant thing.”

Response to WannaCry

One example of that progress is the response to, and lack of impact from, the WannaCry malware. In 2017, when the virus infected hundreds of thousands of computers around the world, agencies suffered little to no problems.

It goes beyond just the one-off attack. Former executives say the government is leading industry in many areas of cybersecurity and the report fails to acknowledge any of these areas.

A former senior cybersecurity official, who requested anonymity because they didn’t get permission to talk to the press from their current employer, said that from continuous monitoring, to the use of the Domain-based Message Authentication, Reporting and Conformance (DMARC) standard — a protocol that authenticates an organization’s emails — to identity credentialing and access management (ICAM), the government is ahead of the curve compared to most industry sectors.
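DMARC itself is straightforward: A domain publishes a DNS TXT record telling receivers what to do with mail that fails authentication. As a minimal sketch — the domain and report address below are made up — here is what such a record looks like and how its tags parse:

```python
# Illustrative only: parse the tag-value pairs in a DMARC DNS TXT record.
# The record below is a made-up example for a hypothetical agency domain.
RECORD = "v=DMARC1; p=reject; rua=mailto:dmarc-reports@agency.example.gov; pct=100"

def parse_dmarc(record: str) -> dict:
    """Split a DMARC record like 'v=DMARC1; p=reject; ...' into a tag dict."""
    tags = {}
    for field in record.split(";"):
        field = field.strip()
        if not field:
            continue
        key, _, value = field.partition("=")
        tags[key.strip()] = value.strip()
    return tags

policy = parse_dmarc(RECORD)
# 'p' is the policy receivers apply to mail that fails authentication:
# 'none' (monitor only), 'quarantine' or 'reject'. DHS Binding Operational
# Directive 18-01 required federal domains to reach p=reject.
print(policy["p"])  # -> 'reject'
```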

“I wasn’t surprised by the committee’s findings, but my overall sentiment is that I expected net new findings and recommendations once you aggregate all the information together,” the former federal official said. “If you find that current direction isn’t working or recommendations are not met, what is the discernable activity that will move [the] needle versus just saying we will follow through on agency activities? To spend 10 months and write 100 pages, which is a considerable effort, and not offer any recommendations to agencies or to OMB that were markedly different than the path they are on was surprising.”

The subcommittee made nine recommendations to OMB, DHS and the agencies. And to the former official’s point, the only two in the realm of new or different are reestablishing the CyberStat process and requiring each agency to have a dashboard showing open cyber recommendations from auditors, closure rates, accomplishments and a plan for mitigating these problems. OMB would send the dashboard to Congress twice a year.

Ross Nodurft, a senior director for cybersecurity services at Venable and a former chief of OMB’s cyber office, said OMB initially used the CyberStat process to gain regular insight into agency activities. But over the last few years, OMB and DHS have evolved in their oversight.

“As those processes have matured and partnerships between agencies and OMB have grown, the oversight has evolved and grown with it,” he said. “I’m not sure agencies need to have as strict a CyberStat process any more.”

Systemic problems put agencies at risk

The federal CISO agreed. The executive said they talk with DHS and OMB almost daily, if not more often if there is a real or potential threat.

“Everything we are doing right now is a result of all the relationships we’ve built up over the last few years,” the CISO said. “We also have tools and techniques today for cybersecurity that weren’t available even a year or two ago. We are automating a lot of what we are doing, and getting out of the manual processes that made change so much harder. We can do analytics and predictive measures much more now so we are less reactive to threats.”

Nodurft added that the work under the continuous diagnostics and mitigation (CDM) program, the focus on protecting high-value assets and the continued implementation of the NIST cyber framework have shifted the scope and elevated the visibility of the security function in agencies. He said that work was not as clearly reflected in the findings and recommendations as he would’ve expected.

At the same time, the subcommittee’s investigators said both OMB and some of the agencies they talked to for their research readily agreed that these are systemic problems that are putting the agencies at risk.

The investigators said even some of the fixes agencies put in place were inadequate. They pointed to the Education Department as one example: The agency added a capability to restrict unauthorized devices, which had been a problem since 2011, but the tool still took 90 seconds to take effect.

“In its 2018 audit, the IG found the agency had managed to restrict unauthorized access to 90 seconds, but explained that this was enough time for a malicious actor to ‘launch an attack or gain intermittent access to internal network resources that could lead to’ exposing the agency’s data. This is concerning because that agency holds PII on millions of Americans,” the report states.

The investigators said the goal of the report was to demonstrate to these eight agencies and OMB that there are consistent cyber issues across the government, and that most involve fundamental practices — patching, keeping an IT asset inventory, updating legacy systems — that are not getting done.

CIO authorities still key to cyber improvements

The subcommittee plans to continue to follow up with agencies on their progress in mitigating these cyber vulnerabilities.

Nodurft said he would like to see the subcommittee use the report as a starting point for future investigations or hearings that dig deeper into the areas where the report fell short of producing a fuller picture of what’s happening across the government.

The former federal cyber official said the subcommittee needs to do more than point out problems that everyone knows exist.

“I hope the report is a precursor to something bigger because I think Congress missed the mark here in some ways. If you look at agencies whether it’s Transportation, or State, or Health and Human Services and look at their appropriations, you have monster components that hinder the CIO’s ability to get the kind of meaningful action we are talking about here. Whether it’s the FAA or Diplomatic Security or the Centers for Medicare and Medicaid Services, lawmakers continue to throw money at these components and there is not a centralized authority to oversee that money. When we can move the conversation to centralizing the CIO’s power, then we can do something about many of these cyber challenges in a real way. And that is where this report missed the mark.”


GAO’s top 10 legacy IT systems again confirms the slow progress of modernization

The Government Accountability Office’s third report on legacy technology didn’t just highlight the lack of progress on 10 of the most critical systems across government since 2016. It also laid bare Congress’ continued failure to recognize the urgency: Agencies need funding and leadership to deal with these systems and the billions of dollars in technical debt behind them.

“There is an unmet need for Congress to get this, particularly the authorizers and the appropriators,” said Tony Scott, the former federal CIO during the Obama administration. “They do not have a full appreciation of the size and the nature of the problem. They really need to make this a priority.”

This lack of prioritization comes through loud and clear in the House Appropriations Committee’s fiscal 2020 Financial Services and General Government bill, which allocates $35 million to the Technology Modernization Fund — well off the Trump administration’s $150 million request — and only $15 million for the IT Reform and Oversight Fund, down from $28.5 million in 2019 and $3 million less than the administration requested.

Scott, who is now CEO of the Tony Scott Group, said while there are a handful of members who do understand the need to modernize technology, such as Reps. Gerry Connolly (D-Va.), Will Hurd (R-Texas) and Robin Kelly (D-Ill.), the TMF funding for 2020 shows that most lack an understanding of the role they need to play.

“There is a lack of urgency. The question about how agencies are spending money and how do you know it is working are not hard questions to answer and ones that need to be answered,” Scott said. “But Congress can’t sit on their hands because then nothing will happen. Too many folks in Congress are sitting on their hands and not recognizing the severity of the issue.”

Connolly and the rest of the Oversight and Reform Subcommittee on Government Operations will get their chance to show how much they care on June 26 when they release the latest Federal IT Acquisition Reform Act (FITARA) scorecard and hold a hearing on agency progress.

It’s unclear which agency will testify, but Connolly is working through which agencies have struggled and which have started to make real progress over the last year.

And it’s those struggles and that lack of progress that GAO, once again, highlighted in its legacy IT report.

Carol Harris, a director in GAO’s Information Technology and Cybersecurity team, said GAO asked the 24 CFO Act agencies to update the 2017 and 2016 list of 65 legacy IT systems to see the status of those systems.

Source: GAO Report 19-471

Harris said GAO assigned point values to attributes such as the age of the system, operating and labor costs, security risks, and vendor warranty and support status, then ranked the systems and selected the 10 with the highest point totals.
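GAO does not publish its rubric in this form, so the following is only a hedged sketch of the kind of weighted attribute scoring Harris describes; the point scales, weights and sample systems are invented for illustration:

```python
# A minimal sketch of attribute-based scoring for legacy systems. The
# attribute names come from the article; the scales and weights do not
# reflect GAO's actual methodology.
from dataclasses import dataclass

@dataclass
class LegacySystem:
    name: str
    age_years: int           # older systems score higher
    annual_om_cost_m: float  # operations and maintenance, $ millions
    security_risk: int       # 0 (low) .. 5 (critical)
    vendor_supported: bool   # lost warranty/support scores higher

def score(s: LegacySystem) -> float:
    points = 0.0
    points += min(s.age_years, 50) / 10        # up to 5 points for age
    points += min(s.annual_om_cost_m, 50) / 10 # up to 5 points for cost
    points += s.security_risk                  # up to 5 points for risk
    points += 0 if s.vendor_supported else 3   # penalty for lost support
    return points

systems = [
    LegacySystem("Benefits system", 45, 32.0, 3, False),
    LegacySystem("HR portal", 8, 4.5, 1, True),
]
for s in sorted(systems, key=score, reverse=True)[:10]:
    print(f"{s.name}: {score(s):.1f} points")
```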

“From last year to this year, things remained the same, generally,” Harris said in an interview with Federal News Network. “We did take a look at the investments more holistically. We weren’t just focused on what are the most costly legacy systems to maintain or the ones that are at the highest risks. There is a good mix in the top 10. There were some that didn’t necessarily have the highest security risks, but did have some very old hardware or software that either the manufacturers weren’t able to maintain or the agency had a major challenge in identifying technical staff that were able to support COBOL and other aging programming languages.”

The top 10 range from 8 to 51 years old and cost about $337 million annually to maintain.

Harris said agencies will continue to face rising procurement and operating costs for these systems, meaning their technical debt is growing faster than that of the average system.

The Social Security Administration, for instance, is struggling with a 45-year-old system that GAO deemed critical to its mission and has a moderate security risk.

Harris said for SSA to maintain this system, which provides benefits to eligible people and collects detailed information from the recipients, the agency must pay a premium to get contractors to maintain the system.

Other systems, like those at FEMA and the Interior Department, have significant cyber vulnerabilities. For example, FEMA reported about 249 cyber vulnerabilities, of which 168 were considered high or critical risks to the network.

“What really surprised me about the top 10, a majority of the agencies lack complete plans for modernizing their systems. Of the 10, there were three that didn’t have plans in place: the departments of Education, Health and Human Services, and Transportation,” she said. “The other seven had modernization plans, but only DoD and Interior’s were considered complete.”

Cultural and internal dynamics need to be addressed

The fact that eight of the 10 agencies didn’t have complete modernization plans in place, nearly two years after GAO’s first report and three years after the data breach at the Office of Personnel Management, is both disheartening and a sign of how deep the problem goes.

Scott said GAO’s findings show that to make this type of major change agencies need to address culture and internal dynamics, which takes a lot of attention from auditors and Congress alike.

“The first thing is to create an awareness of the problem. Every CIO role I’ve ever had, had some element of legacy IT where the company had a set of systems that haven’t been looked at for some time, and now are critical for us to address and change,” said Scott, who also was CIO for General Motors and Disney. “The first thing a CIO needed to do was make sure there was great visibility of the opportunity as well as the risk and challenge that legacy systems present. The second step, now that you have visibility, is what do you want to do about it? So you must figure out who will do the work and what the new strategy will be to move off those legacy systems. It seems obvious, but it takes leadership to do that. It’s the proper role of OMB and CIOs in each agency who need to take this one on as a leadership issue. Without these things, getting out of this technical debt will not happen any time soon.”

OMB guidance needed

Harris said the lack of direction from OMB also continues to hamper this effort. She said GAO recommended last year that the administration issue guidance requiring agencies to identify legacy systems that need to be modernized.

“We found that, in part, the reason why agencies weren’t doing so is because they weren’t being required to modernize systems by OMB,” she said. “Until OMB requires agencies to modernize all legacy systems, the government will continue to run the risk of having to maintain these aging investments that have outlived their effectiveness. From the agency standpoint, some of the things that they told us is their modernization planning for the top 10 has been delayed due to budget constraints. While we can appreciate that they are operating in a resource-constrained environment, we also maintain that it will be vitally important for them to prioritize funding for the modernization of these very critical systems.”

Harris said GAO made eight recommendations to eight agencies to make sure they document their modernization plans, and seven described plans to address those recommendations.

“For the top 10, I think the agencies do recognize the criticality and risk they are facing in maintaining these legacy systems,” Harris said. “I think the challenge they are running into is that some believe they have budget constraints and have continually pushed off modernization plans. We continue to maintain they will have to prioritize funding to modernize these very critical legacy systems.”


Why DoD’s decision to make cybersecurity an ‘allowable cost’ matters


Cybersecurity is now an allowable cost on certain types of contracts.

Let that sink in for a second.

The Defense Department is telling its vendors that the government, in some cases, will pay for cybersecurity.

That is huge. And if — and it’s a very big “if” — the Pentagon follows through on its promise by not making it arduous to allocate costs, by not making the allowable percentage so small that it’s not worth it, and by making it a true incentive, this will be one of those moments in procurement history that we all remember.


Katie Arrington, the special assistant to the Assistant Secretary of Defense for Acquisition for Cyber in the Office of the Under Secretary of Acquisition and Sustainment in DoD, made this bold statement before a roomful of vendors.

“I need you all now to get out your pens and you better write this down and tell your teams: Hear it from Katie Arrington, who got permission to say it from Mr. [Kevin] Fahey [the assistant secretary of Defense for Acquisition in the Office of the Under Secretary of Acquisition and Sustainment]: Security is an allowable cost. Amen, right?” Arrington said during an acquisition conference sponsored by the Professional Services Council in Arlington, Virginia. “Now what you need to do as industry is help me, help you. I’m not the enemy. I’m literally the one person in government who said, ‘Hi, I’m here to help,’ and I’m legit here to help.”

Arrington is here to help because she is leading the DoD effort to develop and institutionalize the new Cybersecurity Maturity Model Certification (CMMC) standard for vendors.

“We have a great deal of standards for cybersecurity. What we are lacking is a unified standard,” Arrington said June 12 during a webinar sponsored by Government Executive. “It is a major undertaking, but just like we got to ISO 9000, we need to get there with cybersecurity. If we were doing all the necessary security controls, we wouldn’t be getting exfiltrated to the level that we are. We need to level set because a good portion of our defense industrial base doesn’t have robust cyber hygiene. Only 1% of [Defense Industrial Base] companies have implemented all 110 controls from the National Institute of Standards and Technology. We need to get to scale where the vast majority of DIB partners can defend themselves from nation state attacks.”

And DoD is not taking aim at just the 20,000 prime contractors that it spends more than $250 billion a year with, but the approximately 300,000 vendors that make up its entire supply chain.

That is what the CMMC really is — a supply chain risk management approach for DoD and its industrial base.

Arrington, who came to DoD in January, has been working with the Johns Hopkins University Applied Physics Lab and Carnegie Mellon University’s Software Engineering Institute to create the initial requirements.

The draft standard details five maturity levels, and DoD will require vendors to be certified through third-party assessment organizations. The standard incorporates many of the existing requirements from NIST, the Federal Risk and Authorization Management Program (FedRAMP) and other existing models.

“We will put the certification requirements for each contract in Sections L and M, and it will be a go or no-go decision,” she said. “It will not be used as a source selection tool.”
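That go/no-go framing means certification works as a pass/fail gate rather than a graded score. A minimal sketch, assuming a hypothetical solicitation and invented vendor data (the five-level scheme comes from the draft standard, but this is not DoD’s evaluation code):

```python
# Illustrative pass/fail gate: CMMC certification as a go/no-go criterion,
# not a scored source selection factor. The vendor data is hypothetical.
def meets_cmmc_gate(vendor_certified_level: int, required_level: int) -> bool:
    """A vendor is 'go' only if its certified maturity level meets the
    solicitation's required level; there is no partial credit."""
    return vendor_certified_level >= required_level

bidders = {"Vendor A": 3, "Vendor B": 1}
required = 3  # level stated in Sections L and M of the solicitation
for name, level in bidders.items():
    print(name, "go" if meets_cmmc_gate(level, required) else "no-go")
```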

12 listening sessions to start

Arrington said DoD will hold 12 listening sessions across the country over the summer to get feedback and insights about the standard from industry and other experts.

She said the goal is to have the final draft standard out this summer with third-party assessors beginning to certify vendors in January 2020. DoD will begin adding the CMMC requirements in requests for information in June 2020 and by September 2020, it will add the standard to solicitations.

“We welcome having a standard because it’s a substitute for every contracting officer making a decision about what is most important,” said Alan Chvotkin, senior vice president and general counsel for PSC. “The long pole in the tent for me is how fast can they move to get the standard in place and then get the body or group of people in position to begin certifying contractors? This will be a very competitive discriminator in the marketplace. A lot of people are nervous about whether DoD will only do the big six contractors or where are we going to be as both a prime and a subcontractor.”

Arrington said DoD recognizes the standard can’t be so burdensome or costly that vendors will choose not to participate. She also said moving to CMMC, like ISO 9000 and other similar certifications, will take time and have some fits and starts.

Congress also is recognizing that trading off security is no longer viable. In the Senate version of the 2020 National Defense Authorization Act, lawmakers included a provision requiring DoD to move to a comprehensive cybersecurity standard with its contractors.

“The committee is concerned that contractors within the defense industrial base are an inviting target for our adversaries, who have been conducting cyberattacks to steal critical military technologies. Currently, the Department of Defense mandates that defense contractors meet the requirements of NIST Special Publication 800–171 but does not audit compliance to this standard.

“The committee is concerned that prime contractors are not overseeing their subcontractors’ compliance with these cybersecurity requirements through the entire supply chain and that the Department lacks access to information about its contractors’ subcontractors,” the committee states in its report on the bill. “The committee believes that prime contractors need to be held responsible and accountable for securing Department of Defense technology and sensitive information and for delivering products and capabilities that are uncompromised. Developing a framework to enhance the cybersecurity of the defense industrial base will serve as an important first step toward securing the supply chain.”

Public incentive to secure the supply chain

SASC says DoD should provide direct technical assistance to contractors, tailor the framework for small firms based on risk so it doesn’t shrink the industrial base, and evaluate both incentives and penalties tied to non-compliance and vendors’ cyber performance.

Lawmakers want DoD to provide the Senate Armed Services Committee a briefing by March 2020 and provide quarterly briefings on how it and its vendors are implementing the standard.

For these and many other reasons, DoD expressly stating its willingness to pay for cybersecurity as an allowable cost is so important.

Some may say security has always been an allowable cost as part of the basic overhead vendors can charge the government on time and materials and cost-plus type contracts.

The difference, however, is DoD is not only saying this publicly, but using it as an incentive to get vendors to buy into the CMMC more quickly.

Chvotkin said by making security an allowable cost, DoD is acknowledging there is a cost that vendors bear and therefore the government must bear.

“This is an incentive not to force companies to trade off security with other expenses so the government is willing to reimburse some share of that,” Chvotkin said. “It may not be 100%, but it is better than eating the entire cost. That share will likely be based on the contract. The goal is to elevate everyone up above a basic hygiene level and this is why DoD is acknowledging there is a cost of going beyond basic hygiene.”

Chvotkin said cyber would not be an allowable cost in the same way under firm fixed price contracts. He said typically a vendor with a firm fixed price contract adds general overhead as part of the final cost to the government.

Arrington said that for too long, DoD talked about cost, schedule and performance, and the Pentagon and its contractors viewed security as a tradeoff against one of those three.

“Cost, schedule and performance are only effective in a secure environment. We cannot look at security and be willing to trade it off to get a lower cost, better performing product or to get something faster. If we do that, nothing works and it will cost me more in the long run,” she said.


Why contractor past performance data is becoming both more and less valuable

When the General Services Administration released the $65 billion Alliant 2 governmentwide acquisition contract for IT services back in June 2016, it was one of the first major procurement efforts to put a premium on past performance.

As part of the self-scoring approach to Alliant 2, GSA determined that 40% of the evaluation points would be based on how vendors have performed on previous and similar work with the government.

At the time, industry analysts praised GSA for this self-scoring approach.

Now three years later, the self-scoring approach where past performance is a significant evaluation factor has grown in popularity.

At the same time, vendors and federal procurement officials alike sound a common refrain: The current approach to obtaining past performance data is problematic and needs to be rethought.

“As the government moves to buying more services and solutions, things are getting more subjective and you want to be able to do a better job of motivating contractors to do more,” said Jim Williams, a former federal executive with GSA, the IRS and the Homeland Security Department, and now a principal with Williams Consulting, LLC and an advisor for GovConRx. “The forces are coming together to say vendor past performance is a valuable tool. But it’s not working as it should because it’s too burdensome and not as accurate as it should be. No one wants to throw it out. We just want to fix it.”

Experts say the current system, the Contractor Performance Assessment Reporting System (CPARS), takes too much time to fill out, lacks important details and isn’t as accurate as it needs to be.

“The time and effort to draft CPARS evaluations is too much, and with most agency contracting staffs pressed to do more with less, that gets pushed out further, especially if there is a need to justify ratings over satisfactory,” said Mike Smith, a former DHS director of strategic sourcing and now executive vice president at GovConRx. “There needs to be some way to justify anything over satisfactory that is not so hard to track down all the information so it becomes less cumbersome. Contractors also are not actively engaged in the documentation and support of the ratings. That has to change too, especially as agencies are paying a lot more attention to ratings. We have to figure out how to make sure past performance ratings are not done in haste.”

Smith said on average, depending on the type of procurement, it could take a contracting officer a few minutes to a few hours to write an in-depth CPARS review.

New data compiled by GovConRx shows the value of CPARS is dropping dramatically.

Between 2014 and 2018, across the four areas of CPARS — quality, management, schedule and cost control — the number of satisfactory ratings has consistently increased, while the number of exceptional ratings bottomed out and the number of very good ratings is also on a downward trajectory.

Source: GovConRx analysis of CPARS.gov.

“We’ve seen some changes and a bigger emphasis in using CPARS for source selection decisions, but we don’t think the government is getting the same value for what the evaluations were designed for in the first place,” said Ken Susskind, founder and CEO of GovConRx. “We talked to quite a few agencies and chief procurement officers and we are hearing there is not enough detail or accuracy in CPARS for them to rely on to make educated and informed decisions during the source selection process. And then we hear from a lot of contractors that they believe they are performing at an exceptional and very good level, but they are getting satisfactory ratings in CPARS.”

Getting satisfactory ratings in CPARS, combined with the increased importance of past performance, makes a bad recipe for both agencies and vendors — especially because the real value of the data is in the comments written by contracting officers. Experts say contracting officers don’t believe they have the time to justify ratings above or below satisfactory, so it’s easier just to give everyone an average rating.

Lesley Field, the deputy administrator in the Office of Federal Procurement Policy, said the chief acquisition officers’ community is recognizing that the value of CPARS is diminishing.

“I do think it’s time to reimagine what that system could do and how it could support our acquisition workforce,” Field said in an interview with Federal News Network. “There is a lot of data in the system, but it may be a little difficult for folks to pull out what they need. We want to make sure the data is accessible and relevant. So we are thinking about ways to start a modernization effort for CPARS so the system can produce information that is more actionable and usable for our contracting officers.”

Field said the data needs to be more specific and better reflect how vendors are performing.

OFPP is beginning a new “in-reach” effort to talk to the frontline workers to ensure they have the tools and training to do their jobs better.

Soraya Correa, the chief procurement officer at DHS, said her office is part of a group of agencies starting to work with OFPP on reinventing CPARS.


She said it’s clear CPARS has become tedious and there can be too much back and forth with vendors over the ratings and comments. Correa added past performance needs to be simplified and automated to some extent.

“If we use something like artificial intelligence or automation tools to simplify data and sort the data so we can present it in a more streamlined fashion, CPARS would be more valuable for contracting officers,” Correa said in an interview. “I would love to see a more commercial past performance approach, almost like a Yelp approach. Vendors can add comments to the government’s ratings, but there wouldn’t be a back and forth. The contracting officers could have a tool so they can search for projects similar in size, scope and complexity. That is one of the things we look for. They can run a report and it tells them what ratings came back at and key comments from vendors and the government.”
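As a rough sketch of the search-and-report idea Correa describes — the record fields, schema and data here are invented for illustration and are not the actual CPARS data model:

```python
# Hypothetical illustration of searching past performance records for
# projects similar in size, scope and complexity, then reporting ratings
# and key comments. Not the real CPARS schema.
from dataclasses import dataclass

@dataclass
class PastPerformanceRecord:
    vendor: str
    project_size_m: float  # contract value, $ millions
    scope: str             # e.g. "IT services"
    complexity: str        # e.g. "low", "medium", "high"
    rating: str            # exceptional / very good / satisfactory / ...
    key_comment: str

def find_similar(records, scope, complexity, size_m, tolerance=0.5):
    """Return records in the same scope and complexity whose size falls
    within +/- tolerance (as a fraction) of the target size."""
    lo, hi = size_m * (1 - tolerance), size_m * (1 + tolerance)
    return [r for r in records
            if r.scope == scope and r.complexity == complexity
            and lo <= r.project_size_m <= hi]

records = [
    PastPerformanceRecord("Acme IT", 12.0, "IT services", "high",
                          "very good", "Delivered ahead of schedule."),
    PastPerformanceRecord("Beta Corp", 2.0, "IT services", "low",
                          "satisfactory", "Met requirements."),
]
for r in find_similar(records, "IT services", "high", 10.0):
    print(r.vendor, r.rating, "-", r.key_comment)
```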

Field said agencies need to keep up with the times, so a “Yelp for government” isn’t necessarily a bad idea; it’s a model to explore, possibly by piloting changes to CPARS that bring it more in line with commercial practices.

“What we would like to play with is if there are AI tools out there or emerging technologies that could look across the evaluations, maybe help us come up with more insight,” she said. “Would it be possible to look at the Federal Procurement Data System data and figure out what other awards are out there and if anything interesting pops up? It’s really a holistic reimagining of what that past performance system could do. Obviously, the contracting officers are the ones who will have to make that assessment, take all that information and evaluate it and use their judgement.”

Field said it’s still early in the process to reimagine CPARS, especially as GSA transitions it to a new platform, beta.sam.gov, in the coming months. Field said OFPP is working with a few agencies to figure out what the future of vendor past performance could look like.

Correa said emerging technologies, like AI, will promote both the use of past performance data and the need to ensure CPARS information is of higher quality.

One common complaint is CPARS doesn’t necessarily include or have room for private sector past performance data.

Both Correa and Field agreed that obtaining that information would be helpful, especially as agencies strive to find non-traditional contractors.

“I do think CPARS is a good tool. We’ve come a long way with that tool. I remember when we first implemented it and it was much more difficult to use. They continue to make improvements on CPARS, the system itself,” Correa said. “I just want to take advantage of automation and the technologies that are available to help us cull the data, simplify it and bring it back in a fashion to the contracting officers that is simple, straightforward and easy to use. If along the way, we can tailor the data requirements, how we get data into the system, how the CORs, program managers and contracting officers interact with the system to get their data in the system and pull it back, then we will have an ideal tool.”


Justice CIO Klimavicz to take on additional duties as the chief data officer

In the coming weeks, the Office of Management and Budget will release new guidance for how agencies should implement the Foundations for Evidence-Based Policymaking Act, including the requirement to name a chief data officer.

Some agencies like the Transportation Department, the U.S. Agency for International Development and several others already have CDOs and are well ahead of the curve.

The Justice Department is the latest agency to get on the CDO bandwagon, but is taking a bit of a different approach than most agencies.

DOJ named Joe Klimavicz, its chief information officer, as its chief data officer last week.


DOJ’s decision to give Klimavicz the CDO title differs from how most agencies, large and small, have handled it over the last four years as the data position has come into vogue.

The debate typically is around whether the CDO answers to the CIO or sits at the same level as the CIO, reporting to the deputy secretary or an assistant secretary for management. Each agency has handled this decision differently, with some, like the Centers for Medicare and Medicaid Services, placing their CDO outside the CIO’s office. Others, such as USAID, have their CDO report to the CIO. As of August 2018, a PricewaterhouseCoopers survey found 64% of federal CDOs report to the CIO.

“No two federal CDOs share the same portfolio of responsibilities, as each one has adapted the role to the unique needs of his or her agency,” Jane Wiseman, a senior fellow at the Ash Center for Democratic Governance and Innovation at the Harvard Kennedy School, wrote in an IBM Center for the Business of Government report on federal CDOs from September. “Federal CDOs view themselves as enablers of data-driven decision-making capacity in their organizations and execute on that in different ways, ranging from being centralized providers of ‘analytics as a service’ to creating the tools and platforms that enable employee self-service across their departments.”

Wiseman said the private sector model is much different: Most corporate CDOs report directly to the COO or CEO of the company.

Information vs. data: Different sides of the same coin?

But at DOJ, the decision seems to bring back the debate about data versus information. If the CIO, Klimavicz in this case, is responsible for information, then why not make him responsible for data, too, as the two are intrinsically linked?

As one chief data officer told me recently, the government is in the information business and the business and mission side of the agency have to value it.

So to some it may make sense to dual-hat the CIO as the CDO, too, because they, through the Federal IT Acquisition Reform Act (FITARA), are already working closely with other C-level officers and mission program managers.

One recommendation from Wiseman in the IBM report is for agencies to have a data leader who has “the authority, executive support and mandate to advance data-driven government. Agency leaders should hire a data leader such as a CDO who demonstrates competency across the domains of infrastructure, innovation and delivery.”

Sound like a job for a CIO?

Interestingly enough, the FBI hired a chief data officer more than two years ago and that person reports to the CIO. The bureau is taking a different approach than headquarters, but maybe that’s why making Klimavicz the CDO makes sense too — he can connect the technology with the information and the data.

On the other hand, one former agency CIO told me that the Justice Department’s decision to dual-hat Klimavicz signals to them that the agency doesn’t respect or appreciate the impact the CDO could have. Among the reasons for this belief, the former official said, is that a CDO with a singular focus can have a bigger impact on the agency as a whole than a CIO who is simply given more responsibility. They also said a stand-alone CDO can effect change in a much different way than a legacy CIO, around whom there can be real or perceived culture challenges.

Source: IBM Center for the Business of Government September 2018 report.

The IBM report addressed this challenge, too. It highlighted the reason the Transportation Department created a separate CDO who reports not to the CIO, but to the agency’s chief technology officer.

“Frustrated that there wasn’t anyone in the department whose full-time job was data, he decided to create the role of a CDO,” Wiseman writes. “In choosing to create the role, the CIO also decided that it should be a career job — that was a way to signal that the department was making an investment in data as an agency. The mandate for the CDO at DOT is open data, data governance, and self-service data infrastructure.”

As another CDO told me recently, they describe the position as the grease, but the data is the fuel for the agency’s decision-making engine.

To me, that means no matter where the CDO sits or who wears the hat, the person in that role must ensure the business and mission folks have the data and information they need to make better decisions.

In Justice’s case, if Klimavicz can grease the skids, then give him the extra hat. The good thing is it’s not permanent, and DOJ can change its approach if it’s not working.

What will be interesting over the next few months is how prescriptive OMB’s guidance will be about the CDO’s reporting structure, and how other agencies approach it. As with chief privacy officers back in the mid-2000s, agencies initially dual-hatted CIOs with the job. And slowly over the past decade, agencies have created stand-alone roles, mostly inside the CIO’s office.

DHS’s Kneidinger heading to Energy

In other people news, Mark Kneidinger’s six-year run at the Department of Homeland Security ends June 10.

Kneidinger is now a senior adviser for policy and IT transformation in the Energy Department’s Office of the CIO. He changed his LinkedIn profile to reflect the new job.


While it’s unclear what Kneidinger will do in the senior adviser role, it’s clear that DOE CIO Max Everett has been pushing the IT modernization ball quickly forward.

The Energy Department received a $15 million “loan” from the Technology Modernization Fund to accelerate its effort to move email to the cloud.

Everett told me in January that the cloud email consolidation effort is part of how Energy is trying to get a better understanding of the cost of technology.

Additionally, the department has been out front in figuring out how to meet the requirements of the Trusted Internet Connections (TIC) initiative in a new way. DOE worked with DHS over the past 15 months to create a more flexible use case for securing internet connections.

Kneidinger comes to DOE after spending the last six months as the deputy director of the DHS National Risk Management Center (NRMC), which the agency stood up last summer. It’s a little surprising that he is leaving after six months, given the administration’s focus on issues like supply chain and cyber risk management.

Before coming to the NRMC, Kneidinger had led the DHS Federal Network Resilience office since 2015 and had been with the office since 2013. Before that, he worked for private sector companies, including CSC and CACI, and was a CIO for state offices in New York and Virginia.


VA regulatory change driving a ‘sustained organizational commitment’ to customer service

Trust is at the center of every program and project across the Veterans Affairs Department. The goal, over the last five-plus years, has been to rebuild the trust of veterans, particularly in how the agency delivers healthcare.

VA’s most recent data shows great strides, but it’s the institutionalizing of those efforts that will produce long-term change.

Lee Becker, the chief of staff of VA’s Veterans Experience Office, said the department not only expects to exceed its long-term goal of improving customer service, but also to make it a permanent part of its employees’ expectations and actions.


Secretary Robert Wilkie recently signed an order to change the Code of Federal Regulations (38 CFR) to add customer service principles in part zero and to measure customer experience by how effective and easy it is to provide care to veterans.

“We are making sure we are providing care with emotional resonance by treating every veteran, family member, caregiver and survivor with the utmost respect. In the end, that is what drives trust,” Becker said in an interview with Federal News Network. “This shift to really that true culture of customer experience really takes many, many years. We have been taking these bold moves to reinforce our focus around our customers, our veterans, to ensure that the overall experience is the highest that it can be.”

Becker said changing the Code of Federal Regulations is a more significant change than just issuing a new policy or writing a memo.

“We are saying for everything we do, it’s going to be with that lens in how we approach providing the best care, benefits and services,” he said. “Under 38 CFR, part zero is where our VA core values are codified. We created an amendment to that and added customer experience principles. What it really does is hold us accountable to those principles.”

Becker said that by changing the CFR, VA now has more permanent, long-term rules in place, so future administrations can set goals and objectives based on that policy.

The decision to change the Code of Federal Regulations to include customer experience is part of the reason why VA won a 2019 Service to Citizen Award earlier this month.

This codification of customer service comes about a year after Wilkie signed VA’s first customer service policy to further sustain long-term efforts.

The Trump administration has made customer service a cross-agency priority goal, with VA as one of the program’s co-leaders. This focused effort helped VA, among other agencies, improve customer service, according to a May 2018 survey by Forrester Research. The most recent data from the President’s Management Agenda shows agencies should have a customer experience dashboard to track their progress against governmentwide and agency-specific metrics sometime in 2019.

But VA has its own goals it’s striving toward, including reaching the 90% mark on the veterans trust scale.

Becker said the Veterans Health Administration, for example, developed a patient experience program.

“It is a full suite of actions to address people, process, technology and engagement to enable that ultimate patient experience to happen,” he said. “At every veterans medical center, there are employees who are red coat ambassadors because veterans have told us it’s hard to navigate medical centers. That provides a very warm connection and high touch with our veterans to make sure they can navigate the facility properly.”

VA also has been rolling out “own the moment” training to make employees aware of, and empower them to own, each moment so the experience of veterans and their families is the best it can be.

“We’ve seen the facilities that have been implementing it fully have actually increased trust and have an improved experience. Overall, we’ve seen a 2% increase over the past year in customer experience,” Becker said. “We have trained over 50,000 employees with this concept. This training is based off of some of the best practices in the private sector. We’ve also taken some of the best practices of medical centers, who have been doing a great job in how they address customer experience, and we’ve used that for this training.”

As for that 90% goal, Becker said it’s an aspirational goal that is achievable by September 30.

“When we think about how we have been able to get to where we are right now and the progress we’ve made, it’s really been through partnerships internally and externally. When you talk about real culture change, that occurs when you have a common mission and there is no competition about who is completing that mission,” he said. “Customer experience takes time. It’s not something that happens overnight. As we’ve demonstrated some of our early successes, and we are seeing even more successes in how we are improving experience, as agencies look at us and we look at other agencies, we realize that it’s a journey, and through working together, we will get there together.”

 


DISA eyes $170M in savings from Fourth Estate consolidation program

BALTIMORE, Md. – Outside of the drama of the $10 billion cloud procurement known as JEDI and the excitement over the almost $9 billion cloud procurement known as DEOS, there is the Fourth Estate consolidation program in the Defense Department.

It’s as big, worth about $10 billion today, but likely much less over the course of the next decade.

It’s not as controversial: There is no dramatic court case or battle between system integrators, as with JEDI and DEOS, respectively.

But the Fourth Estate consolidation and optimization effort may have more impact, be more significant and, most importantly, show DoD the path forward in its move to the cloud.

Tony Montemarano, the executive deputy director of the Defense Information Systems Agency, said at the AFCEA TechNet conference that over the next decade the agency will bring together the networks and commodity IT of the 14 defense agencies, including the Defense Logistics Agency, the Defense Finance and Accounting Service and the Defense Health Agency.

“We are taking the commodity IT of 13 other Fourth Estate organizations and bringing them together with DISA, not mission IT, but the desktops, the business applications, and trying to bring them together, the contracting and personnel,” he said. “Close to 1,000 new employees are coming to DISA effective the first of October. We have to come to grips with taking these independent, commodity environments and bringing them together. It’s a major undertaking when it comes to coming to grips with contracting, coming to grips with personnel, you can imagine the nightmare dealing with the whole thing, and everyone is cooperating.”

Montemarano drew an uneasy laugh from the audience with that last comment, but there is actually a lot of truth to what he said.

Drew Jaehnig, the chief of the Fourth Estate optimization program and chief of the defense enclave services, said at TechNet that there are two main oversight bodies for the consolidation. The first is a senior working group, led by Danielle Metz, the principal director to the acting deputy CIO for information enterprise at the Pentagon, which meets weekly and provides governance, structure and direction. The second is the IT Procurement Request board, which handles the change management process for any of the 14 agencies that want to change their current technology or contracts.


“That goes through our office, basically for a quality check, for lack of a better description, and then it goes up to DoD CIO for adjudication,” Jaehnig said. “For the most part, it’s pretty team oriented. The requirements for the request for information that you see on the street were developed by all 14 agencies together. We had two summits, one partly in person and the rest virtual. There has been very little uncooperative behavior from the Fourth Estate in every sense of the word. Big agencies such as DFAS, DLA and others deserve a big shout out for helping to drive this and to stabilize the project. They have been very cooperative and we have nothing but good things to say about the folks in the Fourth Estate at this point.”

Jaehnig said there was some initial concern about the impact on personnel, but now most of the noise around the Fourth Estate program comes from the vendor community.

And that’s where the new RFI comes in. DISA released the notice on May 10, asking for input to create the Defense Enclave Services, a major piece of the broader Fourth Estate consolidation effort.

Jaehnig said DISA hopes industry submits comments to help ensure the agency can meet its goals of cost savings, improved services and a consolidated, hardened network.

“The department thinks we should be able to save a significant amount of money and return that to the lethality for the department by combining these networks and reducing the footprint to the tune of about $170 million a year,” he said. “I like to say my deliverable to the department is not the new service, but the savings.”

The RFI is calling for comments on the Defense Enclave Services, which are a baseline set of services managed by the contractor and are “flexible, scalable, reliable, accessible and secure” from the desktop level through the Local Area Networks and provide “high assurance of connectivity to the data centers, native internet and government/industry cloud services.”

For example, when it comes to the DEOS contract, DISA will buy the back-office and desktop collaboration tools for the rest of the 14 agencies instead of each organization buying them separately.

DoD CIO memo coming soon

Jaehnig said the DoD CIO will sign a memo designating DISA as the sole service provider for back-office, commodity IT services and network infrastructure for these 14 agencies.

DISA and the 13 Fourth Estate agencies have been working on coming up with a common definition of commodity IT services over the last eight months.

“We tried to figure out where the dividing line is. There are some gray areas around which side of the fence some of these things fall on. We still are working on a few of the tiny details, and it also inserts some interesting complexity in regards to accreditation from the cyber perspective,” Jaehnig said.

Jaehnig said DISA and its partners have not yet decided on the acquisition strategy, which is what the RFI responses will help with. Responses to the RFI are due June 3.

The strategy could include the General Services Administration’s Enterprise Infrastructure Solutions (EIS) telecommunications modernization contract.

No matter what the acquisition strategy looks like in the end, Jaehnig said the potential savings will come from reducing duplication of functions and of the more than 630 contract vehicles currently in place across the 14 agencies.

“The amount of management overhead for these contracts is pretty staggering,” he said. “When you look at the thread that runs through this, we see advantages and where we can get cost savings and improve services.”

Over the short term, DISA will move the first seven networks into DoDNet version 1 over the next year. The agency has a longer-term goal of moving the remaining nine agencies to DoDNet 2, and adding more emerging capabilities, in 2021 and beyond.

Jaehnig said the current projects around IT modernization will continue, but those agencies will have to go through the change management process to ensure they are interoperable with the future state.


Why OMB is ushering us into the second golden age of acquisition reform


Welcome to the second golden age of federal acquisition reform.

The frustration and the technology are aligning for the Trump administration, the Congress and industry to come together to make the first set of significant, almost seismic changes since the 1990s.

“We are in that rare moment when we have the combination of factors: the customer demand for speed and agility, the congressional receptiveness for acquisition reform legislation, the strong push from OFPP on the importance of innovation, such as their creation of acquisition innovation councils, category management memos and the fourth ‘myth-busters’ memo, and strong actions from a number of agencies really propelling acquisition innovation,” said Jeff Koses, the senior procurement executive at the General Services Administration, at the Coalition for Government Procurement’s spring conference in Falls Church, Virginia. “Across GSA we have a number of things that I regard as innovation plays in the contracting and policy domain, in the communication domain and in the technology domain.”

To that end, the Office of Federal Procurement Policy sent six legislative proposals to Congress at the end of April to clean up a few things, but more importantly to ask for permission to test and spread innovative acquisition concepts across government.

The most significant proposal would create an Acquisition Modernization Test Board, which would modernize OFPP’s statutory authority for governmentwide acquisition testing, an authority that has been in place since Congress created the office in 1974.

The board would develop “test programs that promote incremental improvement of acquisition practices, including through new, innovative or otherwise better business processes and applications of technology, and identifying candidate agencies to conduct tests,” the proposal states.

Through the board, the OFPP administrator would approve waivers of one or more acquisition laws as part of a pilot program to evaluate how changing the statutory requirement(s) might make the procurement process more efficient.


Matt Blum, the associate administrator in OFPP, said the board also would help ensure agencies had a place to go to find out what innovations exist and who is trying them out.

“We believe the best way to accomplish the goals [of the President’s Management Agenda] is to accelerate the pace of transformation through smart piloting, where we learn a little through testing and getting feedback from all of you, making adjustments based on what we learn and doing additional testing. There are a lot of benefits from doing testing,” Blum said at the conference. “We think it’s critical to an innovative ecosystem—any practice that creates new value for the customer. Testing allows us to constantly change and challenge ourselves to do better, to disrupt the environment in a manageable way. But also, equally important, it helps us to manage risk, especially in initiatives that have multiple dimensions.”

The board also gives OFPP and agencies a congressionally approved place to fail. Too often, agencies are risk averse because of concerns about being called out by lawmakers or by auditors.

Greg Giddens, the former executive director of the Veterans Affairs Department’s Office of Acquisition, Logistics and Construction and now a partner with Potomac Ridge Consulting, said the acquisition board is the type of top cover agencies need.

“People want to talk about being innovative, but taking that risk is hard to do. We are in an environment where, if you do something well, that is like a tree falling in the woods with no one there to hear it. But if you take a risk and it doesn’t go well, everyone is there waiting to call you out,” he said. “The board isn’t calling for a big bang approach, but for agencies to try some things. It’s almost like bringing the idea of agile or DevOps to acquisition reform.”

OFPP’s Blum said whatever pilots the board approves will be proven by the results and the data. He said the acquisition environment has become so much more complex over the last 20 years that there is widespread agreement that testing and piloting is one of the best ways to innovate.

For the board to be successful, Larry Allen, the president of Allen Federal Business Partners, said OFPP needs to ensure the members are not just acquisition people, but come from a variety of backgrounds, including finance, technology and oversight.

“Waiving rules for pilots could be useful. How about starting with the Schedules Price Reductions Clause?” Allen said. “If you want to attract more small and innovative businesses to your largest commercial acquisition program, that’s a good place to begin. Eliminating the [Schedules Price Reductions Clause] definitely lowers the compliance burden and could attract more innovative companies.”

The White House signaled its desire to create the acquisition modernization test board in its fiscal 2020 budget request sent to Congress in March.

Good government approach to management

The goal of all six of the proposals is straightforward. Russ Vought, the acting director of the Office of Management and Budget, writes that the ideas are “designed to help the administration achieve its goal of a more nimble and responsive acquisition system. The proposal would transform a statutory framework for governmentwide acquisition testing that has remained unchanged for more than 40 years and fails to adequately support an environment where continual and timely process improvement is an imperative.”

David Grant, a former associate administrator of the Mission Support Bureau at FEMA and now a partner with Potomac Ridge Consulting, said the proposals are part of the administration’s good government approach to management.

“Either there is an opening or there is a sense that there is an opening to make some real changes in the federal acquisition process,” he said. “OFPP wants to have a constructive dialogue with Congress to make some changes.”

OFPP also is asking Congress to make a few other changes, including ending the Defense Cost Accounting Standards Board, increasing the micro-purchase threshold to $10,000 for task order contracts, and standardizing the minimum threshold for bid protests of task order contracts at $25 million for all agencies, instead of $10 million for civilian agencies and $25 million for the Defense Department.

Grant said the standardization of bid protest thresholds makes sense given how much agencies go through to create the multiple award contracts.

Moving past the ‘get it right’ mindset

OMB has tried several of these proposals previously or is borrowing them from the Section 809 panel, including reducing the number of cost accounting standards boards, decoupling the threshold for using cost accounting standards from the threshold for Truth in Negotiations Act applicability, and increasing the basic threshold for the standards’ applicability from $2 million to $15 million.

In the past, Congress also has given OFPP limited test or pilot authority for things like share-in-savings, but nothing as broad as the acquisition board.

And the request for the board underscores how OMB, the agencies, industry and, hopefully, Congress are starting to view acquisition innovation, and why many believe we are entering this second golden age of acquisition.

GSA Administrator Emily Murphy put the current environment in some historical perspective.

When she was at GSA in the early 2000s, it was all about “getting it right,” which may have put too much emphasis on following the rules.

“It was an important age of acquisition … but it was so focused on the rules that it lost track of the solution part sometimes and trying to figure out how to use the rules,” Murphy said. “It’s GSA’s job to help agencies find a compliant way to get to the solution they need, not come up with a roadblock. As IT is evolving, as our understanding of service contracting is evolving and how all the pieces fit together, the innovation that is taking place across the government has led to a lot of movement. The IT is enabling a lot of changes and advances in acquisition. It’s just a great collaborative time where people are not afraid to ask the question of how can we do it better.”

That is how federal acquisition improves — not by going around the rules, but by asking what is possible within the rules and then taking advantage of all that is possible without worry of punishment.

