The Government Accountability Office’s third report on legacy technology didn’t just highlight the lack of progress on 10 of the most critical systems across government since 2016. The report also lays bare Congress’ continued failure to recognize that agencies urgently need funding and leadership to deal with these systems and billions of dollars in technical debt.
“There is an unmet need for Congress to get this, particularly the authorizers and the appropriators,” said Tony Scott, the former federal CIO during the Obama administration. “They do not have a full appreciation of the size and the nature of the problem. They really need to make this a priority.”
This lack of prioritization comes through loud and clear in the House Appropriations Committee’s fiscal 2020 Financial Services and General Government bill. It allocates $35 million to the Technology Modernization Fund, well off the Trump administration’s $150 million request, and only $15 million for the IT Reform and Oversight Fund, down from $28.5 million in 2019 and $3 million less than the administration requested.
Scott, who is now CEO of the Tony Scott Group, said while there are a handful of members who do understand the need to modernize technology, such as Reps. Gerry Connolly (D-Va.), Will Hurd (R-Texas) and Robin Kelly (D-Ill.), the TMF funding for 2020 shows that most lack an understanding of the role they need to play.
“There is a lack of urgency. The question about how agencies are spending money and how do you know it is working are not hard questions to answer and ones that need to be answered,” Scott said. “But Congress can’t sit on their hands because then nothing will happen. Too many folks in Congress are sitting on their hands and not recognizing the severity of the issue.”
Connolly and the rest of the Oversight and Reform Subcommittee on Government Operations will get their chance to show how much they care on June 26 when they release the latest Federal IT Acquisition Reform Act (FITARA) scorecard and hold a hearing on agency progress.
It’s unclear which agencies will testify, but Connolly is working through which agencies have struggled and which have started to make real progress over the last year.
And it’s those struggles and that lack of progress that GAO, once again, highlighted in its legacy IT report.
Carol Harris, a director in GAO’s Information Technology and Cybersecurity team, said GAO asked the 24 CFO Act agencies to update the 2017 and 2016 list of 65 legacy IT systems to see the status of those systems.
Harris said GAO assigned point values to attributes such as the system’s age, operating and labor costs, security risks, and vendor warranty and support status, then built the list from the 10 systems with the highest point totals.
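GAO does not publish its exact point values, so the numbers below are invented purely for illustration. The sketch only shows the general shape of the weighted attribute scoring Harris describes: assign points per risk attribute, sum them, and keep the highest-scoring systems.

```python
# Hypothetical sketch of GAO-style legacy-system scoring. All point
# values, attribute names and systems here are invented for illustration;
# GAO's actual methodology and weights are not public at this level of detail.

def score(system):
    """Sum points across risk attributes; a higher total means riskier/costlier."""
    points = min(system["age_years"] // 10, 5)               # older systems score more
    points += min(system["annual_cost_millions"] // 10, 5)   # costlier upkeep scores more
    points += system["security_risk"]                        # e.g. 0 = low .. 3 = critical
    if not system["vendor_supported"]:                       # unsupported hardware/software
        points += 2
    return points

systems = [
    {"name": "Benefits system", "age_years": 45, "annual_cost_millions": 40,
     "security_risk": 2, "vendor_supported": False},
    {"name": "New portal", "age_years": 8, "annual_cost_millions": 5,
     "security_risk": 3, "vendor_supported": True},
    {"name": "COBOL ledger", "age_years": 51, "annual_cost_millions": 30,
     "security_risk": 1, "vendor_supported": False},
]

# Rank by total points, highest first (GAO kept the top 10 of 65 systems).
ranked = sorted(systems, key=score, reverse=True)
```

Note how a relatively new system can still rank high on security risk alone, while an old, unsupported system accumulates points across several attributes, which matches Harris’ point that the top 10 is a mix rather than a pure security-risk list.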
“From last year to this year, things remained the same, generally,” Harris said in an interview with Federal News Network. “We did take a look at the investments more holistically. We weren’t just focused on what are the most costly legacy systems to maintain or the ones that are at the highest risks. There is a good mix in the top 10. There were some that didn’t necessarily have the highest security risks, but did have some very old hardware or software that either the manufacturers weren’t able to maintain or the agency had a major challenge in identifying technical staff that were able to support COBOL and other aging programming languages.”
The top 10 range from 8 to 51 years old and cost about $337 million annually to maintain.
Harris said agencies will continue to face rising procurement and operating costs for these systems, meaning their technical debt is growing faster than the average system’s.
The Social Security Administration, for instance, is struggling with a 45-year-old system that GAO deemed critical to its mission and has a moderate security risk.
Harris said for SSA to maintain this system, which provides benefits to eligible people and collects detailed information from the recipients, the agency must pay a premium to get contractors to maintain the system.
Other systems like those at FEMA and the Interior Department have significant cyber vulnerabilities. For example, FEMA found about 249 reported cyber vulnerabilities, of which 168 were considered high or critical risks to the network.
“What really surprised me about the top 10, a majority of the agencies lack complete plans for modernizing their systems. Of the 10, there were three that didn’t have plans in place: the departments of Education, Health and Human Services, and Transportation,” she said. “The other seven had modernization plans, but only DoD and Interior’s were considered complete.”
The fact that eight of the 10 agencies didn’t have completed modernization plans in place, nearly two years after GAO’s first report and three years after the data breach at the Office of Personnel Management, is both disheartening and a sign of how deep the problem goes.
Scott said GAO’s findings show that to make this type of major change agencies need to address culture and internal dynamics, which takes a lot of attention from auditors and Congress alike.
“The first thing is to create an awareness of the problem. Every CIO role I’ve ever had, had some element of legacy IT where the company had a set of systems that haven’t been looked at for some time, and now are critical for us to address and change,” said Scott, who also was CIO for General Motors and Disney. “The first thing a CIO needed to do was make sure there was great visibility of the opportunity as well as the risk and challenge that legacy systems present. The second step, now that you have visibility, is what do you want to do about it? So you must figure out who will do the work and what the new strategy will be to move off those legacy systems. It seems obvious, but it takes leadership to do that. It’s the proper role of OMB and CIOs in each agency who need to take this one on as a leadership issue. Without these things, getting out of this technical debt will not happen any time soon.”
Harris said the lack of direction from OMB also continues to hamper this effort. She said GAO recommended last year that the administration issue guidance requiring agencies to identify legacy systems that need to be modernized.
“We found that, in part, the reason why agencies weren’t doing so is because they weren’t being required to modernize systems by OMB,” she said. “Until OMB requires agencies to modernize all legacy systems, the government will continue to run the risk of having to maintain these aging investments that have outlived their effectiveness. From the agency standpoint, some of the things they told us is their modernization planning for the top 10 has been delayed due to budget constraints. While we can appreciate that they are operating in a resource-constrained environment, we also maintain that it will be vitally important for them to prioritize funding for the modernization of these very critical systems.”
Harris said GAO made eight recommendations to eight agencies to make sure they document their modernization plans, and seven described plans to address those recommendations.
“For the top 10, I think the agencies do recognize the criticality and risk they are facing in maintaining these legacy systems,” Harris said. “I think the challenge they are running into is that some believe they have budget constraints and have continually pushed off modernization plans. We continue to maintain they will have to prioritize funding to modernize these very critical legacy systems.”
Cybersecurity is now an allowable cost on certain types of contracts.
Let that sink in for a second.
The Defense Department is telling its vendors that the government, in some cases, will pay for cybersecurity.
That is huge. And if — and it’s a very big “if” — the Pentagon follows through on its promise by not making it arduous to allocate costs, by not making the allowable percentage so small that it’s not worth the effort, and by making it a true incentive, this will be one of those moments in procurement history that we all remember.
Katie Arrington, the special assistant to the Assistant Secretary of Defense for Acquisition for Cyber in the Office of the Under Secretary of Acquisition and Sustainment in DoD, made this bold statement before a roomful of vendors.
“I need you all now to get out your pens and you better write this down and tell your teams: Hear it from Katie Arrington, who got permission to say it from Mr. [Kevin] Fahey [the assistant secretary of Defense for Acquisition in the Office of the Under Secretary of Acquisition and Sustainment]: Security is an allowable cost. Amen, right?” Arrington said during an acquisition conference sponsored by the Professional Services Council in Arlington, Virginia. “Now what you need to do as industry is help me, help you. I’m not the enemy. I’m literally the one person in government who said, ‘Hi, I’m here to help and I’m legit here to help.’”
Arrington is here to help because she is leading the DoD effort to develop and institutionalize the new Cybersecurity Maturity Model Certification (CMMC) standard for vendors.
“We have a great deal of standards for cybersecurity. What we are lacking is a unified standard,” Arrington said June 12 during a webinar sponsored by Government Executive. “It is a major undertaking, but just like we got to ISO 9000, we need to get there with cybersecurity. If we were doing all the necessary security controls, we wouldn’t be getting exfiltrated to the level that we are. We need to level set because a good portion of our defense industrial base doesn’t have robust cyber hygiene. Only 1% of [Defense Industrial Base] companies have implemented all 110 controls from the National Institute of Standards and Technology. We need to get to scale where the vast majority of DIB partners can defend themselves from nation state attacks.”
And DoD is not taking aim at just the 20,000 prime contractors that it spends more than $250 billion a year with, but the approximately 300,000 vendors that make up its entire supply chain.
That is what the CMMC really is — a supply chain risk management approach for DoD and its industrial base.
Arrington, who came to DoD in January, has been working with the Johns Hopkins University Applied Physics Lab and Carnegie Mellon University’s Software Engineering Institute to create the initial requirements.
“We will put the certification requirements for each contract in Sections L and M, and it will be a go or no-go decision,” she said. “It will not be used as a source selection tool.”
Arrington said DoD will hold 12 listening sessions across the country over the summer to get feedback and insights about the standard from industry and other experts.
She said the goal is to have the final draft standard out this summer with third-party assessors beginning to certify vendors in January 2020. DoD will begin adding the CMMC requirements in requests for information in June 2020 and by September 2020, it will add the standard to solicitations.
“We welcome having a standard because it’s a substitute for every contracting officer making a decision about what is most important,” said Alan Chvotkin, senior vice president and general counsel for PSC. “The long pole in the tent for me is how fast can they move to get the standard in place and then get the body or group of people in position to begin certifying contractors? This will be a very competitive discriminator in the marketplace. A lot of people are nervous about whether DoD will only do the big six contractors or where are we going to be as both a prime and a subcontractor.”
Arrington said DoD recognizes the standard can’t be so burdensome or costly that vendors will choose not to participate. She also said moving to CMMC, like ISO 9000 and other similar certifications, will take time and have some fits and starts.
Congress also is recognizing that this tradeoff is no longer viable. In the Senate version of the 2020 National Defense Authorization Act, lawmakers included a provision requiring DoD to move to a comprehensive cybersecurity standard with its contractors.
The Senate Armed Services Committee says DoD should provide direct technical assistance to contractors; tailor the requirements for small firms based on risk so the standard doesn’t shrink the industrial base; and evaluate both incentives and penalties for non-compliance, as well as vendors’ cyber performance.
Lawmakers want DoD to provide the Senate Armed Services Committee a briefing by March 2020 and provide quarterly briefings on how it and its vendors are implementing the standard.
For these and many other reasons, DoD specifically expressing its willingness to pay for cybersecurity as an allowable cost is so important.
Some may say security has always been an allowable cost as part of the basic overhead vendors can charge the government on time and materials and cost-plus type contracts.
The difference, however, is that DoD is not only saying this publicly, but also using it as an incentive to get vendors to buy into the CMMC more quickly.
Chvotkin said by making security an allowable cost, DoD is acknowledging there is a cost that vendors bear and therefore the government must bear.
“This is an incentive not to force companies to trade off security with other expenses so the government is willing to reimburse some share of that,” Chvotkin said. “It may not be 100%, but it is better than eating the entire cost. That share will likely be based on the contract. The goal is to elevate everyone up above a basic hygiene level and this is why DoD is acknowledging there is a cost of going beyond basic hygiene.”
Chvotkin said cyber would not be an allowable cost in the same way under firm fixed price contracts. He said typically a vendor with a firm fixed price contract adds general overhead as part of the final cost to the government.
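A toy calculation makes Chvotkin’s distinction concrete. All dollar figures and the fee rate below are invented for illustration, not from the article: on a cost-reimbursable contract the security spend is billed to the government as an allowable cost, while on a firm-fixed-price contract it comes out of the vendor’s margin unless it was built into overhead when the price was set.

```python
# Illustrative-only numbers: how a $100K cybersecurity expense flows to the
# government under a cost-reimbursable vs. a firm-fixed-price contract.
# Amounts and the fee rate are hypothetical, not from the article or from FAR.
direct_costs = 1_000_000    # labor, materials priced into the job
security_cost = 100_000     # cybersecurity compliance spend incurred later
fee_rate = 0.08             # negotiated fee/margin rate

# Cost-plus: security is an allowable direct cost, so the government
# reimburses it (and here, pays fee on the total allowable cost).
cost_plus_price = (direct_costs + security_cost) * (1 + fee_rate)

# Firm-fixed-price: the price was fixed at award, before the security
# spend, so the extra cost erodes the vendor's margin instead.
ffp_price = direct_costs * (1 + fee_rate)
ffp_vendor_margin = ffp_price - (direct_costs + security_cost)
```

Under these made-up numbers the cost-plus vendor is made whole (plus fee), while the fixed-price vendor’s margin goes negative, which is exactly the tradeoff Chvotkin describes the government now being willing to share.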
Arrington said for too long DoD talked about cost, schedule and performance and the Pentagon and its contractors viewed security as a tradeoff with one of those three.
“Cost, schedule and performance are only effective in a secure environment. We cannot look at security and be willing to trade off to get lower cost, better performing product or to get something faster. If we do that, nothing works and it will cost me more in long run,” she said.
When the General Services Administration released the $65 billion Alliant 2 governmentwide acquisition contract for IT services back in June 2016, it was one of the first major procurement efforts to put a premium on past performance.
As part of the self-scoring approach to Alliant 2, GSA determined that 40 percent of the evaluation points would be on how vendors have performed on previous and similar work with the government.
At the time, industry analysts praised GSA for building on this self-scoring approach.
Now three years later, the self-scoring approach where past performance is a significant evaluation factor has grown in popularity.
At the same time, vendors and federal procurement officials alike share a common refrain: The current approach to collecting past performance information is problematic and needs to be rethought.
“As the government moves to buying more services and solutions, things are getting more subjective and you want to be able to do a better job of motivating contractors to do more,” said Jim Williams, a former federal executive with GSA, the IRS and the Homeland Security Department, and now a principal with Williams Consulting, LLC and an advisor for GovConRx. “The forces are coming together to say vendor past performance is a valuable tool. But it’s not working as it should because it’s too burdensome and not as accurate as it should be. No one wants to throw it out. We just want to fix it.”
Experts say the current system, the Contractor Performance Assessment Reporting System (CPARS), takes too much time to fill out, lacks important details and isn’t as accurate as it needs to be.
“The time and effort to draft CPARS evaluations is too much, and with most agency contracting staffs pressed to do more with less, that gets pushed out further, especially if there is a need to justify ratings over satisfactory,” said Mike Smith, a former DHS director of strategic sourcing and now executive vice president at GovConRx. “There needs to be some way to justify anything over satisfactory that is not so hard to track down all the information so it becomes less cumbersome. Contractors also are not actively engaged in the documentation and support of the ratings. That has to change too, especially as agencies are paying a lot more attention to ratings. We have to figure out how to make sure past performance ratings are not done in haste.”
Smith said on average, depending on the type of procurement, it could take a contracting officer a few minutes to a few hours to write an in-depth CPARS review.
New data compiled by GovConRx shows the value of CPARS is dropping dramatically.
Between 2014 and 2018, across the four areas of CPARS—quality, management, schedule and cost control—the number of satisfactory ratings has consistently increased, while the number of exceptional ratings has bottomed out and the number of very good ratings is also on a downward trajectory.
“We’ve seen some changes and a bigger emphasis in using CPARS for source selection decisions, but we don’t think the government is getting the same value for what the evaluations were designed for in the first place,” said Ken Susskind, founder and CEO of GovConRx. “We talked to quite a few agencies and chief procurement officers and we are hearing there is not enough detail or accuracy in CPARS for them to rely on to make educated and informed decisions during the source selection process. And then we hear from a lot of contractors that they believe they are performing at an exceptional and very good level, but they are getting satisfactory ratings in CPARS.”
And satisfactory ratings in CPARS, combined with the increased importance of past performance, make a bad recipe for both agencies and vendors, especially because the real value of the data is in the comments written by contracting officers. Experts say contracting officers don’t believe they have the time to justify ratings above or below satisfactory, so it’s easier just to give everyone an average rating.
Lesley Field, the deputy administrator in the Office of Federal Procurement Policy, said the chief acquisition officer’s community is recognizing the value of CPARS is diminishing.
“I do think it’s time to reimagine what that system could do and how it could support our acquisition workforce,” Field said in an interview with Federal News Network. “There is a lot of data in the system, but it may be a little difficult for folks to pull out what they need. We want to make sure the data is accessible and relevant. So we are thinking about ways to start a modernization effort for CPARS so the system can produce information that is more actionable and usable for our contracting officers.”
Field said the data needs to be more specific and better reflect how vendors are performing.
OFPP is beginning a new “in-reach” effort to talk to the frontline workers to ensure they have the tools and training to do their jobs better.
Soraya Correa, the chief procurement officer at DHS, said her office is part of a group of agencies starting to work with OFPP on reinventing CPARS.
She said it’s clear CPARS has become tedious and there can be too much back and forth with vendors over the ratings and comments. Correa added past performance needs to be simplified and automated to some extent.
“If we use something like artificial intelligence or automation tools to simplify data and sort the data so we can present it in a more streamlined fashion, CPARS would be more valuable for contracting officers,” Correa said in an interview. “I would love to see a more commercial past performance approach, almost like a Yelp approach. Vendors can add comments to the government’s ratings, but there wouldn’t be a back and forth. The contracting officers could have a tool so they can search for projects similar in size, scope and complexity. That is one of the things we look for. They can run a report and it tells them what the ratings came back at and key comments from vendors and the government.”
Field said agencies need to keep up with the times, so a “Yelp for government” isn’t necessarily a bad idea; it is a model to explore and possibly pilot to bring CPARS more in line with commercial practices.
“What we would like to play with is if there are AI tools out there or emerging technologies that could look across the evaluations, maybe help us come up with more insight,” she said. “Would it be possible to look at the Federal Procurement Data System data and figure out what other awards are out there and if anything interesting pops up? It’s really a holistic reimagining of what that past performance system could do. Obviously, the contracting officers are the ones who will have to make that assessment, take all that information and evaluate it and use their judgment.”
Field said it’s still early in the process to reimagine CPARS, especially as GSA transitions it to a new platform, beta.sam.gov, in the coming months. Field said OFPP is working with a few agencies to figure out what the future of vendor past performance could look like.
Correa said DHS is one of those agencies working with OFPP on modernizing CPARS. She said emerging technologies, like AI, will promote both the use of past performance data and the need to make sure CPARS information is a higher quality.
One common complaint is CPARS doesn’t necessarily include or have room for private sector past performance data.
Both Correa and Field agreed that obtaining that information would be helpful, especially as agencies strive to find non-traditional contractors.
“I do think CPARS is a good tool. We’ve come a long way with that tool. I remember when we first implemented it and it was much more difficult to use. They continue to make improvements on CPARS, the system itself,” Correa said. “I just want to take advantage of automation and the technologies that are available to help us cull the data, simplify it and bring it back in a fashion to the contracting officers that is simple, straightforward and easy to use. If along the way, we can tailor the data requirements, how we get data into the system, how the CORs, program managers and contracting officers interact with the system to get their data in the system and pull it back, then we will have an ideal tool.”
In the coming weeks, the Office of Management and Budget will release new guidance for how agencies should implement the Foundations for Evidence-Based Policymaking Act, including the requirement to name a chief data officer.
Some agencies like the Transportation Department, the U.S. Agency for International Development and several others already have CDOs and are well ahead of the curve.
The Justice Department is the latest agency to get on the CDO bandwagon, but is taking a bit of a different approach than most agencies.
DOJ named Joe Klimavicz, its chief information officer, as its chief data officer last week.
DOJ’s decision to give Klimavicz the CDO title is dissimilar to how most agencies, large and small, have handled it over the last four years as the data position has come in vogue.
The debate typically is around whether CDO answers to the CIO or is on the same level as the CIO, reporting to the deputy secretary or an assistant secretary of management type of role. Each agency has handled this decision differently, with some like the Centers for Medicare and Medicaid Services sitting their CDO outside the CIO’s office. Others, such as USAID, have their CDO report to the CIO. As of August 2018, a PricewaterhouseCoopers survey found 64% of federal CDOs report to the CIO.
“No two federal CDOs share the same portfolio of responsibilities, as each one has adapted the role to the unique needs of his or her agency,” Jane Wiseman, a senior fellow at the Ash Center for Democratic Governance and Innovation at the Harvard Kennedy School, wrote in an IBM Center for the Business of Government report on federal CDOs from September. “Federal CDOs view themselves as enablers of data-driven decision-making capacity in their organizations and execute on that in different ways, ranging from being centralized providers of ‘analytics as a service’ to creating the tools and platforms that enable employee self-service across their departments.”
Wiseman said the private sector model is much different where most CDOs report directly to the COO or CEO of the company.
But at DOJ, the decision seems to bring back the debate about data versus information. If the CIO, Klimavicz in this case, is responsible for information, then why not make him responsible for data, too, as the two are intrinsically linked?
As one chief data officer told me recently, the government is in the information business and the business and mission side of the agency have to value it.
So to some it may make sense to dual-hat the CIO as the CDO, too, because they, through the Federal IT Acquisition Reform Act (FITARA), are already working closely with other C-level officers and mission program managers.
One recommendation from Wiseman in the IBM report is for agencies to have a data leader who has “the authority, executive support and mandate to advance data-driven government. Agency leaders should hire a data leader such as a CDO who demonstrates competency across the domains of infrastructure, innovation and delivery.”
Sound like a job for a CIO?
Interestingly enough, the FBI hired a chief data officer more than two years ago and that person reports to the CIO. The bureau is taking a different approach than headquarters, but maybe that’s why making Klimavicz the CDO makes sense too — he can connect the technology with the information and the data.
On the other hand, one former agency CIO told me that the Justice Department’s decision to dual-hat Klimavicz signals to them that the agency doesn’t respect or appreciate the impact the CDO could have. Among the reasons for this belief, the former official said, is that a CDO with a singular focus can have a bigger impact on the agency as a whole than just giving the CIO more responsibility. They also said a stand-alone CDO can effect change in a much different way than a legacy CIO can, particularly where there are real or perceived culture challenges.
The IBM report addressed this challenge, too. It highlighted the reason the Transportation Department created a separate CDO who reports not to the CIO, but to the agency’s chief technology officer.
“Frustrated that there wasn’t anyone in the department whose full-time job was data, he decided to create the role of a CDO,” Wiseman writes. “In choosing to create the role, the CIO also decided that it should be a career job — that was a way to signal that the department was making an investment in data as an agency. The mandate for the CDO at DOT is open data, data governance, and self-service data infrastructure.”
As another CDO told me recently, they describe the position as the grease, but the data is the fuel for the agency’s decision-making engine.
To me, that means no matter where the CDO sits or who wears the hat, the person in that role must ensure the business and mission folks have the data and information they need to make better decisions.
In Justice’s case, if Klimavicz can grease the skids, then, give him the extra hat. The good thing is it’s not permanent and DOJ can change its approach if it’s not working.
What will be interesting over the next few months is how prescriptive OMB’s guidance will be about the CDO’s reporting structure, and how other agencies approach it. It may follow the path of chief privacy officers back in the mid-2000s: Agencies initially dual-hatted CIOs with that job, and slowly over the past decade have created stand-alone roles, mostly inside the CIO’s office.
In other people news, Mark Kneidinger’s six-year run at the Department of Homeland Security ends June 10.
Kneidinger is now a senior adviser for policy and IT transformation in the Energy Department’s Office of the CIO. He changed his LinkedIn profile to reflect the new job.
While it’s unclear what Kneidinger will do in the senior adviser role, it’s clear that DOE CIO Max Everett has been pushing the IT modernization ball quickly forward.
The Energy Department received a $15 million “loan” from the Technology Modernization Fund to accelerate its effort to move email to the cloud.
Everett told me in January that the cloud email consolidation effort is part of how Energy is trying to get a better understanding of the cost of technology.
Additionally, the department has been out in front with figuring out how to meet the requirements of the Trusted Internet Connections (TIC) initiative in a new way. DOE worked with DHS over the past 15 months to create a more flexible use case to secure internet connections.
Kneidinger comes to DOE after spending the last six months as the deputy director of the DHS National Risk Management Center (NRMC), which the agency stood up last summer. It’s a little surprising that he is leaving after six months given the focus of the administration around issues like supply chain and cyber risk management.
Before coming to the NRMC, Kneidinger led the DHS Federal Network Resilience office since 2015 and has been with the office since 2013. Before that, he worked for private sector companies including CSC and CACI, and was the CIO for state offices in New York and Virginia.
Trust is at the center of every program and project across the Veterans Affairs Department. The goal, over the last five-plus years, has been to rebuild the trust of veterans, particularly in how the agency delivers healthcare.
VA’s most recent data shows great strides, but it’s the institutionalizing of those efforts that will produce long-term change.
Lee Becker, the chief of staff of VA’s Veterans Experience Office, said the department not only expects to exceed its long-term goal of improving customer service, but make it a permanent part of its employees’ expectations and actions.
Secretary Robert Wilkie recently signed an order to change the Code of Federal Regulations—38 CFR—to add customer service principles in part zero and to measure customer experience through how effective and easy it is to provide care to veterans.
“We are making sure we are providing care with emotional resonance by treating every veteran, family member, caregiver and survivor with the utmost respect. In the end, that is what drives trust,” Becker said in an interview with Federal News Network. “This shift to a true culture of customer experience really takes many, many years. We have been taking these bold moves to reinforce our focus around our customers, our veterans, to ensure that the overall experience is the highest that it can be.”
Becker said changing the Code of Federal Regulations is a more significant change than just issuing new policy or writing a memo.
“We are saying for everything we do it’s going to be with that lens in how we approach providing the best care, benefits and services,” he said. “Under 38 CFR, part zero is where our VA core values are codified. We created an amendment to that and added customer experience principles. What it really does is hold us accountable to those principles.”
Becker said by changing the CFR, VA now has more permanent and long-term rules in place so future administrations, based on that policy, can set the goals and objectives.
The decision to change the Code of Federal Regulations to include customer experience is part of the reason why VA won a 2019 Service to Citizen Award earlier this month.
This codification of customer service comes about a year after Wilkie signed VA’s first customer service policy to further sustain long-term efforts.
The Trump administration has made customer service a cross-agency priority goal, with VA serving as one of the program’s co-leaders. This focused effort helped VA, among other agencies, improve customer service, according to a May 2018 survey by Forrester Research. The most recent data from the President’s Management Agenda shows agencies should have a customer experience dashboard to track their progress against governmentwide and agency-specific metrics sometime in 2019.
But VA has its own goals it’s striving toward, including reaching the 90% mark on the veterans trust scale.
Becker said the Veterans Health Administration, for example, developed a patient experience program.
“It is a full suite of actions to address people, process, technology and engagement to enable that ultimate patient experience to happen,” he said. “At every veterans medical center, there are employees who are red coat ambassadors because veterans have told us it’s hard to navigate medical centers. That provides a very warm connection and high touch with our veterans to make sure they can navigate the facility properly.”
VA also has been rolling out “Own the Moment” training to make employees aware of, and empower them to own, the moment, so the experience of veterans and their families is as good as it can be.
“We’ve seen the facilities that have been implementing it fully have actually increased trust and have an improved experience. Overall, we’ve seen a 2% increase over the past year in customer experience,” Becker said. “We have trained over 50,000 employees with this concept. This training is based off of some of the best practices in the private sector. We’ve also taken some of the best practices of medical centers, who have been doing a great job in how they address customer experience, and we’ve used that for this training.”
As for that 90% goal, Becker said it’s an aspirational goal that is achievable by September 30.
“When we think about how we have been able to get to where we are right now and the progress we’ve made, it’s really been through partnerships internally and externally. When you talk about real culture change, that occurs when you have a common mission and there is no competition about who is completing that mission,” he said. “Customer experience takes time. It’s not something that happens overnight. As we’ve demonstrated some of our early successes and we are seeing even more successes in how we are improving experience, as agencies look at us and we look at other agencies, we realize that it’s a journey and through working together, we will get there together.”
BALTIMORE, Md. – Outside of the drama of the $10 billion cloud procurement known as JEDI and the excitement over the almost $9 billion cloud procurement known as DEOS, there is the Fourth Estate consolidation program in the Defense Department.
It’s as big, worth about $10 billion today, but likely much less over the course of the next decade.
It’s not as controversial: there is no dramatic court case like JEDI’s and no battle between systems integrators like DEOS’s.
But the Fourth Estate consolidation and optimization effort may have more impact, be more significant and, most importantly, show the DoD path forward in its move to the cloud.
Tony Montemarano, the executive deputy director of the Defense Information Systems Agency, said at the AFCEA TechNet conference that over the next decade the agency will bring together the networks and commodity IT of the 14 defense agencies, including the Defense Logistics Agency, the Defense Finance and Accounting Service and the Defense Health Agency.
“We are taking the commodity IT of 13 other Fourth Estate organizations and bringing them together with DISA, not mission IT, but the desktops, the business applications, and trying to bring them together, the contracting and personnel,” he said. “Close to 1,000 new employees are coming to DISA effective the first of October. We have to come to grips with taking these independent, commodity environments and bringing them together. It’s a major undertaking when it comes to coming to grips with contracting, coming to grips with personnel, you can imagine the nightmare dealing with the whole thing, and everyone is cooperating.”
Montemarano drew an uneasy laugh from the audience with that last comment, but there is a lot of truth to what he said.
Drew Jaehnig, the chief of the Fourth Estate optimization program and chief of defense enclave services, said at TechNet that there are two main oversight bodies for the consolidation. The senior working group, led by Danielle Metz, the principal director to the acting deputy CIO for information enterprise in the Pentagon, meets weekly and provides governance, structure and direction. Then there is the IT Procurement Request board, which handles the change management process for any of the 14 agencies that want to change their current technology or contracts.
“That goes through our office, basically for a quality check for lack of a better description, and then it goes up to DoD CIO for adjudication,” Jaehnig said. “For the most part, it’s pretty team oriented. The requirements for the request for information that you see on the street were developed by all 14 agencies together. We had two summits, one partly in person and the rest virtual. There has been very little uncooperative behavior from the Fourth Estate in every sense of the word. Big agencies such as DFAS, DLA and others deserve a big shout out for helping to drive this and to stabilize the project. They have been very cooperative and we have nothing but good things to say about the folks in the Fourth Estate at this point.”
Jaehnig said there was some initial concern about the impact on personnel, but now most of the noise around the Fourth Estate program comes from the vendor community.
And that’s where the new RFI comes in. DISA released the notice on May 10 asking for input to create the Defense Enclave Services, which is a major piece to the broader Fourth Estate consolidation effort.
Jaehnig said DISA hopes industry submits comments to ensure they can meet its goals of cost savings, improved services and a consolidated and hardened network.
“The department thinks we should be able to save a significant amount of money and return that to the lethality for the department by combining these networks and reducing the footprint to the tune of about $170 million a year,” he said. “I like to say my deliverable to the department is not the new service, but the savings.”
The RFI is calling for comments on the Defense Enclave Services, which are a baseline set of services managed by the contractor and are “flexible, scalable, reliable, accessible and secure” from the desktop level through the Local Area Networks and provide “high assurance of connectivity to the data centers, native internet and government/industry cloud services.”
For example, when it comes to the DEOS contract, DISA will buy the back-office and desktop collaboration tools for the rest of the 14 agencies instead of each organization buying them separately.
Jaehnig said the DoD CIO will sign a memo designating DISA as the sole service provider for back-office, commodity IT services and network infrastructure for these 14 agencies.
DISA and the 13 Fourth Estate agencies have been working on coming up with a common definition of commodity IT services over the last eight months.
“We tried to figure out where the dividing line is. There are some gray areas about which side of the fence some of these things fall on. We still are working on a few of the tiny details, and it also inserts some interesting complexity in regards to accreditation from the cyber perspective,” Jaehnig said.
Jaehnig said DISA and its partners have not yet decided on the acquisition strategy, which is what the RFI responses will help with. Responses to the RFI are due June 3.
The strategy could include the General Services Administration’s Enterprise Infrastructure Solutions (EIS) telecommunications modernization contract.
No matter what the acquisition strategy looks like in the end, Jaehnig said the potential savings will come from reducing duplication of functions and of the more than 630 contract vehicles currently in place across the 14 agencies.
“The amount of management overhead for these contracts is pretty staggering,” he said. “When you look at the thread that runs through this, we see advantages and where we can get cost savings and improve services.”
Over the short term, DISA will move the first seven networks into DoDNet version 1 over the next year. The agency has a longer-term goal of moving the remaining nine agencies to DoDNet version 2 and adding more emerging capabilities by 2021 and beyond.
Jaehnig said the current projects around IT modernization will continue, but those agencies will have to go through the change management process to ensure they are interoperable with the future state.
Welcome to the second golden age of federal acquisition reform.
The frustration and the technology are aligning for the Trump administration, Congress and industry to come together to make the first set of significant, almost seismic changes since the 1990s.
“We are in that rare moment when we have the combination of factors, the customer demand for speed and agility, the congressional receptiveness for acquisition reform legislation, the strong push from OFPP on the importance of innovation such as their creation of acquisition innovation councils, category management memos and the myth busters four memo and strong actions from a number of agencies really propelling acquisition innovation,” said Jeff Koses, the senior procurement executive at the General Services Administration, at the Coalition for Government Procurement’s spring conference in Falls Church, Virginia. “Across GSA we have a number of things that I regard as innovation plays in contracting and policy domain, in the communication domain and in the technology domain.”
To that end, the Office of Federal Procurement Policy sent six legislative proposals to Congress at the end of April to clean up a few things, but more importantly to ask for permission to test and spread innovative acquisition concepts across government.
The most significant proposal would create an Acquisition Modernization Test Board, modernizing OFPP’s statutory authority for governmentwide acquisition testing that has been in place since Congress created the office in 1974.
The board would develop “test programs that promote incremental improvement of acquisition practices, including through new, innovative or otherwise better business processes and applications of technology, and identifying candidate agencies to conduct tests,” the proposal states.
Through the board, the OFPP administrator would approve waivers of one or more acquisition laws as part of a pilot program to evaluate how changing the statutory requirement(s) might make the procurement process more efficient.
Matt Blum, the associate administrator at OFPP, said the board also would help ensure agencies had a place to go to find out what innovations exist and who is trying them out.
“We believe the best way to accomplish the goals [of the President’s Management Agenda] is to accelerate the pace of transformation through smart piloting, where we learn a little through testing and getting feedback from all of you, making adjustments based on what we learn and doing additional testing. There are a lot of benefits from doing testing,” Blum said at the conference. “We think it’s critical to an innovative ecosystem—any practice that creates new value for the customer. Testing allows us to constantly change and challenge ourselves to do better, to disrupt the environment in a manageable way. But also equally important, it helps us to manage risk, especially in initiatives that have multiple dimensions.”
The board also gives OFPP and agencies a congressionally approved place to fail. Too often, agencies are risk averse because of concerns about being called out by lawmakers or by auditors.
Greg Giddens, the former executive director of the Veterans Affairs Department’s Office of Acquisition, Logistics and Construction and now a partner with Potomac Ridge Consulting, said the acquisition board is the type of top cover agencies need.
“People want to talk about being innovative but taking that risk is hard to do. We are in an environment where if you do something well, it is like a tree falling in the woods and no one is there to hear it. But if you take a risk and it doesn’t go well, everyone is there waiting to call you out,” he said. “The board isn’t calling for a big bang approach, but for agencies to try some things. It’s almost like bringing the idea of agile or DevOps to acquisition reform.”
OFPP’s Blum said whatever pilots the board approves will be proven by the results and the data. He said the acquisition environment has become so much more complex over the last 20 years that there is widespread agreement that testing and piloting are among the best ways to innovate.
For the board to be successful, Larry Allen, the president of Allen Federal Business Partners, said OFPP needs to ensure the members are not just acquisition people, but come from a variety of backgrounds, including finance, technology and oversight.
“Waiving rules for pilots could be useful. How about starting with the Schedules Price Reductions Clause?” Allen said. “If you want to attract more small and innovative businesses to your largest commercial acquisition program, that’s a good place to begin. Eliminating the [Schedules Price Reductions Clause] definitely lowers the compliance burden and could attract more innovative companies.”
The White House signaled its desire to create the acquisition modernization test board in its fiscal 2020 budget request sent to Congress in March.
The goal of all six of the proposals is straightforward. Russ Vought, the acting director of the Office of Management and Budget, writes that the ideas are “designed to help the administration achieve its goal of a more nimble and responsive acquisition system. The proposal would transform a statutory framework for governmentwide acquisition testing that has remained unchanged for more than 40 years and fails to adequately support an environment where continual and timely process improvement is an imperative.”
David Grant, a former associate administrator of the Mission Support Bureau at FEMA and now a partner with Potomac Ridge Consulting, said the proposals are part of the administration’s good government approach to management.
“Either there is an opening or there is a sense that there is an opening to make some real changes in the federal acquisition process,” he said. “OFPP wants to have a constructive dialogue with Congress to make some changes.”
OFPP also is asking Congress to make a few other changes, including ending the Defense Cost Accounting Standards Board, increasing the micro-purchase threshold to $10,000 for task order contracts and standardizing the minimum threshold for bid protests of task order contracts at $25 million for all agencies, instead of $10 million for civilian agencies and $25 million for the Defense Department.
Grant said the standardization of bid protest thresholds makes sense given how much agencies go through to create the multiple award contracts.
OMB has tried several of these proposals previously, or is borrowing from the Section 809 panel, including reducing the number of cost accounting standards boards, decoupling the threshold for using cost accounting standards from the threshold for Truth in Negotiations Act applicability, and increasing the basic threshold for the standards’ applicability from $2 million to $15 million.
In the past, Congress also has given OFPP limited test or pilot authority for things like share-in-savings, but nothing as broad as the acquisition board.
And the request for the board underscores how OMB, the agencies, industry, and hopefully Congress, are starting to view acquisition innovations and why many believe we are entering this second golden age of acquisition.
GSA Administrator Emily Murphy put the current environment in some historical perspective.
When she was at GSA in the early 2000s, it was all about “getting it right,” which may have put too much emphasis on following the rules.
“It was an important age of acquisition … but it was so focused on the rules that it lost track of the solution part sometimes and trying to figure out how to use the rules,” Murphy said. “It’s GSA’s job to help agencies find a compliant way to get to the solution they need, not come up with a roadblock. As IT is evolving, as our understanding of service contracting is evolving and how all the pieces fit together, the innovation that is taking place across the government has led to a lot of movement. The IT is enabling a lot of changes and advances in acquisition. It’s just a great collaborative time where people are not afraid to ask the question of how can we do it better.”
That is how federal acquisition improves: not by going around the rules, but by asking what is possible within them and then taking full advantage, without fear of punishment.
The Technology Modernization Fund still has more than $35 million to “loan” to agencies, and the board is inching closer to making a third round of awards.
Maria Roat, the Small Business Administration’s chief information officer and a member of the board, said about 12 projects are in the draft phase, where the agencies are working with the program management office to finalize their proposals.
“Several others also have come in for phase two where they are pitching their proposals to the board,” she said after the CFO-CIO Summit sponsored by the Association of Government Accountants and the Association for Federal Information Resources Management (AFFIRM).
Roat said during the panel discussion that the board received 50 proposals worth more than $500 million over the last year plus, reviewed 37 and funded seven so far.
She said the board receives agency project details on Fridays and meets on Mondays to deliberate.
One of those projects that received funding was the mainframe modernization effort at the Department of Housing and Urban Development.
HUD deputy CIO Kevin Cooke said the agency received its second tranche of funding from the board in the last few weeks, another $5 million to go with the same amount it received last fall.
“All of the architecture work is done. We’ve built a prototype or pilot already to make sure there were no issues with the hundreds of thousands of lines of code and that it would work seamlessly. That was a big part of the proof of concept,” Cooke said in an interview after the panel. “The money will allow that jump into the actual projects, not just the proof of concept.”
Cooke said the first $5 million of the $20 million loan went toward making sure the mainframe modernization plan was sound.
“In order to get this done, there is some reverse engineering of the applications that are there to make sure as we change the platform the programs do not lose any functionality,” he said. “That was a big agreement that we had with them that during this period of time they would be up and running the whole time with their current systems, and this replatform would not cause them to lose any functionality. From that standpoint, it’s a good process, but it’s a slow process. It’s not like we are starting from scratch and you get to decide all the new interfaces and all the new APIs and ways of doing this. We are working closely with them. This allows us to continue on our trajectory on the project.”
HUD is modernizing seven mainframe systems, all of different sizes. Cooke expects the agency to complete about 20% of the project by the end of 2019.
“You get smarter as you go along so it gets faster as you go,” he said.
And HUD needs to get smarter and faster because that loan is coming due. HUD has to start paying back the money to the TMF in mid-2020.
Energy won $15 million in June to accelerate its move to email in the cloud and has received $2.2 million so far.
Bryan Long, Energy’s deputy CIO, said the project is a little behind schedule because of a protest of its CIO Business Operations Support Services (CBOSS) contract, which it plans to use to move some 65 disparate email systems from laboratories and offices to the cloud.
CBOSS is a $2 billion single award blanket purchase agreement for a host of IT services. The Government Accountability Office rejected ActioNet’s protest and backed Energy’s decision to award the contract to Accenture Federal Services.
Accenture’s team includes Unisys, General Dynamics Information Technology (GDIT) and Red River.
“We are behind where we had hoped to be at this point, but we do have the project awarded now under our new IT project and it will be moving out,” he said. “There is no doubt this certainly will accelerate our shift to cloud email for the remaining on-premise email systems.”
While Roat declined to name which agencies are in the final stages, it’s clear agencies are interested in the TMF loans.
Officials from two of the current awardees, HUD and Energy, said they have other project proposals they want to send or resend to the board.
Cooke said the agency submitted three total proposals to the board last year and may send one of those back for a second review.
“One of the ones we looked at was our enterprise data management. There were so many different programs involved in that, and one of the things we didn’t do is show the direct impact, not just more efficiencies, but that it will be easier. When we talk about the ease around data analytics, business intelligence and reporting, those were outcomes that meant something to the department, but I didn’t think we did a good enough job of showing to the outside how much better that would be in terms of being able to look at our data more holistically across all of the 19 different programs that we had.”
Cooke said the project also would’ve taken too long to show a return on investment.
Long said Energy submitted four proposals to the board last year, including the idea to move to a desktop-as-a-service and an application rationalization effort. He said Energy has learned valuable lessons from its experience with the board.
“As you are looking at projects, you need to factor in what are the savings, where are they coming from and how quick are you going to accumulate them, and what’s that overall return on investment look like,” he said. “Will it take you five years or 10 years to recoup that money that you used to do the project. Those are some of the key things you have to evaluate.”
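Long’s evaluation test amounts to simple payback arithmetic; a minimal sketch is below, with purely hypothetical figures that are not Energy’s or any agency’s actual numbers:

```python
# Back-of-envelope payback calculation for a TMF-style modernization loan.
# All figures are illustrative examples, not any agency's real numbers.

def payback_years(loan: float, annual_savings: float) -> float:
    """Years until accumulated annual savings repay the up-front loan."""
    if annual_savings <= 0:
        raise ValueError("a project with no savings never recoups its cost")
    return loan / annual_savings

# A $15 million loan that frees up $3 million a year pays back in 5 years;
# one that frees up only $1.5 million a year takes 10.
print(payback_years(15_000_000, 3_000_000))   # 5.0
print(payback_years(15_000_000, 1_500_000))   # 10.0
```

By Long’s own framing, the second project would likely be too slow a recoupment to be an attractive pitch to the board.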
While Cooke and Long both praised the benefits of the TMF because the loan is helping each agency move more quickly on their respective projects, both said they expect more oversight and attention to their programs.
Cooke said HUD’s inspector general recently began an audit of the project.
“You want to make sure you have enough people and the right people to support these projects. It’s very high profile. There are a lot of external eyes on it, and we chose it because of how important it is to the agency, so everybody has got to be focused on it,” Cooke said. “That’s something we would have to consider when we look at what we have in front of us right now that we are doing, what spare capacity would we have” for an additional TMF-funded project?
One of the most talked about and highest profile pieces to the TMF is how agencies will pay back the loan.
Long said Energy runs a fee-for-service model for its enterprise cloud email so that’s the way it will pay back the loan.
Cooke said HUD expects the costs of moving off the mainframe and into a modern architecture will pay for itself quickly.
It took nearly a year for the Federal Emergency Management Agency to find a new chief information officer.
A year after Adrian Gardner joined a long list of CIOs who were reassigned, FEMA’s search ended with a Defense Department veteran.
The agency quietly named Lytwaive Hutchinson as its new CIO in early May. Hutchinson came to FEMA after serving as the vice director of the Joint Service Provider in the Defense Information Systems Agency (DISA).
This is Hutchinson’s first civilian agency assignment. She spent 21 years as an active duty member of the Army and then joined Washington Headquarters Services in DoD in 2002 after retiring.
Patsy Garnett, who had been acting FEMA CIO since Gardner left, reassumes her previous position as the agency’s deputy CIO. Gardner retired in April after 30 years in government.
As the vice director of the JSP, Hutchinson developed, maintained and facilitated the implementation of the organization’s IT infrastructure across JSP’s customers. She oversaw an annual IT budget of more than $500 million, managed enterprisewide IT programs and initiatives, and served as the executive authority on the utilization of IT capabilities, resources, and systems.
As FEMA’s CIO, Hutchinson inherits major challenges to upgrade the agency’s IT infrastructure. In October, the Homeland Security Department’s inspector general told the House Homeland Security Subcommittee on Emergency Preparedness, Response and Communications that since 2005 auditors have found FEMA’s outdated IT systems and infrastructure did not enable it to effectively carry out disaster response and recovery efforts. The IG found significant and longstanding deficiencies that continue to hamper emergency support operations particularly around its ability to manage and track disaster funding and share information with external partners.
The IG also told lawmakers the CIO’s office lacks budget authority, formalized governance and oversight over IT investments and is missing an overall IT strategic plan.
FEMA is attempting to address some longstanding IT challenges. In its fiscal 2020 budget request, FEMA asked for funding to reduce its IT complexity, such as $18.3 million to upgrade its network, $42.1 million to modernize its grants system and $8.1 million to address its aging financial management system.
It also asked for $9.1 million to continue “a multiyear effort to enable the agency to work smarter through data analytics and ultimately deliver better outcomes for survivors and communities. [The] Enterprise Data and Analytics Modernization Initiative will enable FEMA to streamline the work necessary to stay ahead of emergencies and deliver swift, effective assistance in times of greatest need,” DHS writes in its 2020 budget document.
Along with FEMA, the General Services Administration is bringing in some new IT expertise.
Reynold Schweickhardt joined the agency as a senior technology advisor to “provide technology and cyber management perspective for GSA related technology initiatives,” according to his LinkedIn page.
This is Schweickhardt’s first experience in the executive branch after spending the last 24-plus years working for the legislative branch. He was the House of Representatives’ Director of Technology Policy for eight years before joining GSA. He also served as CIO and chief technology officer for the Government Printing Office.
While FEMA gained a technology executive, the U.S. Citizenship and Immigration Service lost one.
Eric Jeanmaire, the division chief for Identity, Records and National Security Delivery at USCIS, left government to become the CEO of Finality, a security engineering firm.
He spent nearly 10 years in government, including the last seven at USCIS working to modernize the E-Verify program.
Here are a couple of other interesting job openings in government:
The Federal Deposit Insurance Corporation (FDIC) is looking for a chief innovation officer for its Tech Lab (FDiTech). “The CINO will facilitate the transformation process by partnering with other FDIC divisions/offices, the Chief Information Officer organization, and the Office of the Chief Information Security Officer to strategically address challenges through the adoption of innovative technologies. Creates an environment that fosters and enables technological innovation and transformation within the FDIC, by building partnerships and serving as a change agent,” FDIC writes in the job posting.
The agency is accepting applications through May 29.
The Army’s Futures Command is looking for a director of futures integration within the Futures and Concepts Center.
The director will apply analysis to threats “through the lens of the unifying concept and recommends the Army Futures Command (AFC) ‘top-down’ requirements for inclusion in the Army Modernization Strategy and Annual Mission Guidance,” according to the job posting on USAJobs.gov.
The director also will direct the execution of plans and programs to ensure the Future Force Modernization Enterprise maintains consistency across warfighting functions, including synchronization of doctrine, organization, training, materiel, leader development, personnel, facilities and policy development actions, guiding the requirements efforts of the nine FCC capabilities development and integration directorates (CDIDs).
Hurry up and apply; applications are due by May 17.
Oracle fired another salvo at the Defense Department’s $10 billion cloud procurement. The soap opera that is the Joint Enterprise Defense Initiative (JEDI) procurement took another dramatic turn last week with a new court filing by Oracle, which alleges former Defense Department employees have been “caught in a web of lies, ethics violations and misconduct” in the development of the JEDI solicitation.
The Court of Federal Claims filing — 128 pages that reads much like a paperback novel — reiterated and added more details about the potential role of two DoD officials who the software giant had already claimed had direct influence in the development of the JEDI solicitation while having job offers in hand from Amazon Web Services.
One of the officials, Deap Ubhi, has been at the center of this controversy for most of the year-long battle. Federal News Network now has confirmed through multiple sources that the third person in the latest redacted filing and who DoD found may have violated acquisition policy and laws is Victor Gavin, the former deputy assistant secretary of the Navy for command, control, communications, computer systems and intelligence. Gavin is currently head of federal technology vision and business development for Amazon Web Services. The second person who figures into this controversy is Anthony DeMartino, who served as a consultant for AWS through January 2017 and became deputy chief of staff for the Office of the Secretary of Defense.
DoD’s internal investigation found JEDI didn’t suffer any prejudice from the participation of the three officials who are alleged to have had conflicting connections with AWS, although it did refer potential ethical violations to the department’s inspector general.
But Oracle says in the filing that the contracting officer didn’t interview Ubhi or anyone from AWS or anyone on the JEDI solicitation team, and that there were additional inconsistencies about Ubhi’s claim that he firewalled himself from the JEDI solicitation.
The complaint also details Gavin’s role in JEDI and what Oracle says are clear conflicts of interest and mistakes. Oracle alleges Gavin, whose name is redacted throughout the complaint for unknown reasons, “began employment discussions in the late summer of 2017 and continued the discussions throughout JEDI. Like Ubhi, [Gavin] continued participating on JEDI even after accepting an employment offer from AWS. For instance, in [Gavin’s] final JEDI meeting, held three days after [Gavin] accepted an offer to serve as a principal in AWS’ [federal technology] division, [Gavin] participated in and received access to the DoD source selection sensitive draft acquisition strategy.”
Oracle says DoD has determined both Ubhi and Gavin violated Federal Acquisition Regulation section 3.101-1 [improper business practices and personal conflicts of interest] and possibly 18 U.S.C. § 208 and its implementing regulations about taking actions that directly benefit their financial interests.
An AWS spokesman declined to comment on Oracle’s filing and, citing ongoing litigation, declined to make Gavin or Ubhi available to answer questions.
AWS, however, has over the last few months pointed to three separate investigations, one by the Government Accountability Office and two by DoD, that found no conflicts of interest that would’ve affected the JEDI procurement.
“Rather, Oracle seeks to engage in a broad fishing expedition primarily to find support for its claim that the solicitation at issue is tainted by alleged conflicts of interest involving two former Department of Defense employees and defendant-intervenor, Amazon Web Services, Inc.,” the government wrote in its January filing in response to Oracle’s initial filing.
AWS also has called Oracle’s amended complaint “wildly misleading and a desperate attempt to smear” the company by distorting the facts.
DoD said in its January response that Oracle is creating “unnecessary delays, burdensome information requirements, and excessive documentation” in order to conduct a detailed review of Ubhi’s actions.
At the same time, Oracle’s revised complaint appears to bring new details to light about the roles both Ubhi and Gavin played in JEDI.
“Indeed, the record also makes clear that AWS failed to take the necessary steps to firewall Ubhi and [Gavin] fully and adequately when they joined AWS and the contract officer’s suggestion to the contrary contradicts existing law,” Oracle writes. “As previously discussed, the contracting officer knows that the affidavit submitted by [Gavin] was inaccurate. For example, [Gavin] averred in his original affidavit that he ‘had no access to the DoD’s acquisition plan, source selection procedures, or any other information that could provide a competitor an unfair advantage.’ But the contracting officer knew this statement was inaccurate given that she attended the JEDI Cloud meeting with [Gavin], during which the participants discussed the source-selection-sensitive draft acquisition plan. Significantly, the contracting officer determined this much without conducting any search of the JEDI records related to [Gavin].”
While DoD, Oracle and AWS joust over the facts in court over the next few months (the judge told the parties to expect a ruling by early-to-mid July), the software giant’s latest filing calls into question whether JEDI is even viable anymore. The court announced on May 9 that oral arguments over the protest would take place July 10 in Washington, D.C.
“Like their previous pleadings, Oracle’s supplemental complaint eloquently paints a damning picture of a deeply flawed process. My guess is that, for casual followers who have been quick to dismiss Oracle’s prior filings as sour grapes, reading this document would be a real eye-opener,” said Steve Schooner, a Nash and Cibinic Professor of Government Procurement Law at The George Washington University in Washington, D.C., in an email to Federal News Network. “This is not business as usual, nor should it be the way DoD conducts its business generally. And it surely shouldn’t be the way DoD awards its largest, most important, highest profile contracts.”
Many long-time federal procurement experts said if the details in the Oracle complaint are true, or even mostly true, the JEDI procurement starts to rise to the level of the Air Force’s tanker procurement, in which Darlene Druyun, the principal deputy undersecretary for acquisition, went to jail for inflating the price of the contract to favor her future employer, Boeing, and for passing information about the competing contractors.
Schooner said he would hope that DoD would do all it can to ensure it resolves even the optics of a conflict of interest around JEDI.
“DoD would be particularly inclined to do the right thing, send a strong message to the community and take immediate, bold, clear, and definitive action to ensure that a contract decision of this size and institutional significance was not tainted,” he said.
Oracle’s court filing also confirms something Federal News Network reported in March: that the FBI is investigating the JEDI procurement.
While Oracle offers no further details about the FBI’s involvement, the fact that its lawyers raised it twice in the complaint reinforces the seriousness of the concerns about JEDI.
Oracle said in its filing that the problems are not just with Gavin, Ubhi or DeMartino, but also with AWS’s actions.
“The contracting officer likewise treats AWS as somehow blame-free despite its heavy hand in the misconduct,” Oracle states in its amended complaint. “For instance, AWS necessarily knew both that AWS had entered employment discussions with Ubhi and that Ubhi was serving as the JEDI lead product manager. Yet, AWS did not advise DoD of the employment discussions or even require Ubhi to provide an ethics letter to support Ubhi’s simultaneous participation in employment discussions with AWS while serving as the JEDI lead product manager. Instead, AWS purportedly relied on Ubhi’s statement that he had no restrictions on his conduct notwithstanding that AWS necessarily knew that to be false.”
Oracle also contends that AWS knew it had offered Ubhi a job and that he didn’t recuse himself from JEDI.
Schooner said Oracle’s complaint should be a wake-up call for the Pentagon.
“In addition to the pathologies evident in the original acquisition strategy, the current conflicts narrative, as painstakingly laid out in the supplemental complaint, offers DoD a much needed — even if not initially welcome — lifeline to reassess and reevaluate their original approach to the procurement, and start again with a clean slate,” he said. “Frankly, DoD would do well to grab the rope, escape to safety and start from scratch on this procurement. Sadly, I fear that the level of investment to date may be too high to permit DoD’s leadership to come to the right conclusion at this point.”
Oracle is asking the Court of Federal Claims to either find that AWS is ineligible for award or require DoD to further investigate and resolve the conflict of interest claims.