“DoD Reporter’s Notebook” is a biweekly feature focused on news about the Defense Department and defense contractors, as gathered by Federal News Network DoD Reporter Jared Serbu.
In April 2018, an Air Force KC-135 tanker landing in Rota, Spain, suffered a failure on one of its hydraulic pumps. No spares were available at the base, so instead of performing its missions the tanker sat on the tarmac for five days waiting for repairs.
That same scenario, involving the same pump, repeated itself two dozen times over the past four years, and the Air Force estimates the downtime costs from failures of that single part totaled $6.6 million.
It’s the sort of problem officials think they can now start to put behind them using a practice commercial airlines have been employing for a decade: condition-based maintenance (CBM). Instead of replacing parts after they’ve failed or relying on fixed schedules, air carriers have figured out that they can predict failures with a high degree of accuracy and get ahead of problems when maintenance makes the most operational sense.
The Air Force and DoD brand their variant “CBM+.” It’s still in its infancy, but officials have high hopes for how it might cut maintenance costs and boost aircraft readiness. The Air Force’s program differs from the commercial ones that inspired it in at least one major respect: Many of the military airframes the service operates are decades old, and aren’t outfitted with the same number and quality of sensors that spit out detailed data about which components are coming due for service.
Because of that, at Air Mobility Command, about 80% of the CBM program will rely on what officials there call Enhanced Reliability Centered Maintenance (eRCM). Instead of depending on data feeds from an individual aircraft’s sensors, algorithms will crunch through detailed records the Air Force already has about how a particular part has historically performed across the fleet — and on a particular airplane — and determine the ideal time to replace or repair it.
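In spirit, eRCM is a survival-analysis problem: use the fleet’s historical failure records to pick a proactive replacement point. The sketch below is a minimal illustration of that idea, not the Air Force’s actual algorithm; the part history, threshold and function names are all invented for the example.

```python
# Illustrative sketch only -- not the Air Force's actual eRCM algorithm.
# Given the flight hours at which a part historically failed across the
# fleet, choose a proactive replacement interval such that only a small
# fraction of observed failures occurred earlier than that interval.

def recommend_replacement_interval(failure_hours, acceptable_failure_fraction=0.10):
    """Return a flight-hour interval for proactive replacement, chosen as
    an empirical quantile of historical failure times."""
    if not failure_hours:
        raise ValueError("need at least one historical failure record")
    ordered = sorted(failure_hours)
    k = int(acceptable_failure_fraction * len(ordered))
    k = min(k, len(ordered) - 1)  # guard the upper edge
    return ordered[k]

# Hypothetical fleet history for a hydraulic pump (flight hours at failure).
history = [1450, 1620, 1710, 1830, 1905, 2040, 2110, 2230, 2390, 2550]
print(recommend_replacement_interval(history))  # -> 1620
```

A production system would fit a proper reliability distribution (e.g., Weibull) and weigh the cost of early replacement against the cost of downtime, but the empirical-quantile version captures the core tradeoff: replace before most historical failures would have occurred.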
The AI algorithms that analyze that data are largely being developed by the Air Force Lifecycle Management Center. AFLCMC is also working on ways the Air Force can take advantage of the performance data it does have from its newer airframes that are outfitted with more modern sensors.
Downs said that sensor data will feed into a separate line of effort — the other 20% of the CBM+ approach — called Predictive Algorithm Development (PAD).
“We’re focusing on eRCM just because it’s going to get us our biggest return on investment, but eventually we will go back to the PAD side, because that completes the holistic view of CBM+,” he said. “But it’s going to be very challenging for some of the aircraft that do not have those onboard diagnostics. The KC-135, for example, was built in the mid-fifties, so there’s not a lot of bells and whistles on that aircraft. But what we can do is look at things like the flight data recorder that’s typically used for safety investigations. We’re looking for some things that we may be able to tie back to the maintenance side.”
And even on some of its newer platforms that are outfitted with sensors, the data they collect is often encoded or encrypted by the original equipment manufacturer, because the Air Force wasn’t thinking about CBM at the time it signed the acquisition agreements for those systems.
But now that the service is starting to get a better handle on what sorts of data are useful for maintenance purposes, it’s beginning to use those lessons to inform its intellectual property policies on the front-end of the acquisition process.
“Right now we’re in some significant conversations with the B-21 program about things that they need to be writing into their contracts as the aircraft is developed to capture all of these lessons learned,” Downs said. “And as we go through some of our other platforms, as contracts come up for renewal with our OEM partners, we’re having those same conversations just to make sure that we’ve got on-ramps and off-ramps going through as CBM+ matures over the next number of years.”
The Air Force’s gradual move toward condition-based maintenance also has major implications for its supply chain.
On one hand, the government and industry systems and processes that deliver parts to where they’re needed will have to adjust to a cadence that ensures replacements are ready to install well before the parts they replace have failed.
But if all goes according to plan, it also means the Air Force and Defense Logistics Agency will be able to reduce the total number of spare parts they keep on hand just in case of unexpected problems.
“Our data point for this is Delta Air Lines,” said Brig. Gen. Steven Bleymaier, AMC’s director of logistics, engineering and force protection. “They were able to reduce or remove $500 million from their supply inventory by better predicting failures and reducing that ‘just in case’ inventory. So we would expect to see similar results in the Air Force supply system as we are able to pinpoint where we need parts, when we need them.”
The Defense Department has an ambitious schedule for a serious overhaul of the way it monitors and enforces cybersecurity within its industrial base. If all goes as planned, vendors could start to see the new model showing up in formal solicitation documents less than a year from now.
The Cybersecurity Maturity Model Certification (CMMC), in development since March, is the department’s attempt to create a simpler, more consistent framework for the cyber demands it imposes on companies. Rather than insisting they self-certify that they meet a long list of National Institute of Standards and Technology security controls, all contractors and subcontractors — whether they deal with sensitive information or not — will have their cyber acumen scored on a scale of 1 to 5. Likewise, every Defense contract will use the same scale to stipulate which companies are allowed to bid.
The CMMC certifications will begin to show themselves in contract documents next June, when they’ll be reflected in requests for information for upcoming contracts, said Katie Arrington, the special assistant for cyber in the Office of the Assistant Secretary of Defense for Acquisition. The first requests for proposals that will insist on only CMMC-certified vendors will most likely appear in September or October.
“We cannot afford not to do this,” she said during a teleconference organized by the Professional Services Council last week. “[The U.S. is] losing $600 billion a year to our adversaries in exfiltrations, data rights, R&D loss. If we were able to institute good cyber hygiene and we were able to reduce, let’s just say email phishing schemes by 10%, think of the amount of money that we could save to truly reinvest back into our partners in the industrial base that we need to stay on the competitive edge. And the only way that we saw fit to do this was to create this CMMC so we can ensure that we are doing everything we can do to buy down the risk of our adversaries stealing our hard work.”
DoD won’t be issuing the new cyber certifications directly. Instead, companies will have to have their IT systems and practices audited by a third-party assessor. The Pentagon wants those assessors to be independent and unbiased, so the firms doing the certifications won’t be allowed to sell other cyber services to the companies they assess. All of the assessors will be overseen by a single nonprofit entity that will manage the CMMC program.
Standing up that superstructure of third-party assessors also represents a time crunch. The department said it plans to pick the management nonprofit by January, about the same time it intends to publish the first draft version of the CMMC model, detailing the sorts of steps firms will need to take to achieve each level of the certification program.
Once the nonprofit is picked, DoD wants it to be able to help continually update the model while also providing alerts and warnings to the Defense industry about new cyber threats and exfiltrations.
“I need to be able to educate the community at large. I may need to dial up certain areas and make changes in the next year’s certification process to ensure that we are doing our best to protect not just the U.S. government, but our vendor community as well,” Arrington said.
As for the companies that will do the on-the-ground assessment work, the Pentagon is optimistic that there’s enough expertise in the existing pool of private sector cyber auditors to handle the task, even though the entire industrial base of 300,000 contractors — from shoemakers to IT service firms — will have to be certified in order to continue doing business with DoD. And contractors will be allowed to seek reimbursement from the government for achieving their CMMC certifications as an “allowable cost” in their contracts.
“There are a great deal of companies out there that do NIST 800-171 compliance work as a service, and they do a great deal of the healthcare and the financial sector certifications. We see that marketplace taking the CMMC on as another avenue for their businesses,” she said. “The defense sector is a little bit slow to get to this point, but we’re not unique in the U.S. marketplace.”
Also, the department does not plan on converting all of its contracts to CMMC overnight. Arrington said DoD is planning a “crawl, walk, run” approach to ensure a smooth rollout. Long before the first RFPs go out, it’s also planning a series of nationwide “listening sessions” with industry to help refine the plan.
But before it starts inserting the new certification demands into contract language, DoD realizes it needs to train its own acquisition workforce on the intent behind the model and how to apply it. Otherwise, Arrington said, contracting officers may have a tendency to insist on top-tier “Level 5” vendors for every RFP they release.
“If you’re on a contract for boots and you’re the subcontractor who’s sewing the eyelets for the laces, you may not need state of the art cybersecurity,” she said. “We want them to have good cyber hygiene. We want them to protect their employees, their IP, but as far as the government, we should not be sending them anything more than the instructions on how to make the eyelet, and a level-one certification would be good enough. The prime contractor may need a level three, because they’re receiving controlled unclassified data that has to do with where the boots need to be shipped. The contract will have specific areas of work that will have specific levels of maturity that will be needed. That’s why we’re doing an entire reeducation of our contracting officers and program managers. We want them to really understand what security is going to cost, and why you need it.”
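Arrington’s boot-eyelet example amounts to a simple scoping rule: the certification level a contract demands should track the sensitivity of what a vendor actually receives. The sketch below is a hypothetical illustration of that logic, not DoD’s actual CMMC assignment criteria; the thresholds are assumptions drawn from her example.

```python
# Hypothetical illustration of contract-level CMMC scoping -- not DoD's
# actual assignment rules. Levels run 1 (basic cyber hygiene) to 5.

def required_cmmc_level(handles_cui: bool) -> int:
    """Map what a vendor receives under a contract to a minimum CMMC level."""
    if handles_cui:
        # e.g., a prime receiving controlled unclassified shipping data
        return 3
    # e.g., a subcontractor receiving only instructions to make an eyelet:
    # basic cyber hygiene is still the floor for every vendor
    return 1

# The boot contract from Arrington's example:
print(required_cmmc_level(handles_cui=False))  # subcontractor -> 1
print(required_cmmc_level(handles_cui=True))   # prime -> 3
```

The point of the reeducation effort she describes is exactly this branch: contracting officers should not default every vendor to the top of the scale when the data flowing to them doesn’t warrant it.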
The Pentagon’s new personnel system for cyber employees is still in an experimental stage after having taken years to get off the ground, but it does appear to be achieving at least one of its intended objectives: speeding up the federal hiring process.
So far, new employees coming in under the Cyber Excepted Service (CES) are being hired in less than half the time it took to hire them within the traditional competitive service, according to Gen. Paul Nakasone, the commander of U.S. Cyber Command.
Nakasone told members of the House Armed Services Committee last week the average time-to-hire under CES is about 44 days, compared to 111 days before CYBERCOM implemented the excepted service.
“We have done over 21 different fairs. We’ve interviewed over 2,700 people. We’ve provided over 90 acceptances for job applications,” he said. “My perspective, early phase, is I’m a supporter of it. I look forward to continuing to utilize it.”
When Congress authorized CES in 2015, it gave DoD wide discretion to recruit employees into the new personnel system by any means it chooses. The department can opt to advertise positions on the government’s USAJobs website, but can also bypass it entirely and recruit candidates directly.
CYBERCOM says it’s used the authority extensively at job fairs, where it can give candidates job offers on the spot, usually after having pre-screened their written applications.
But Cyber Command, the Joint Force Headquarters-DoD Information Network (JFHQ-DoDIN) and the DoD chief information officer’s office are the only organizations the department allowed to use the excepted service during its first phase.
A broader rollout — to other Defense agencies and to the cyber components of the military services — has been slow to take place, partly because the Pentagon offices in charge of implementing CES have been underresourced.
Defense officials testified last month that only five full time employees in the DoD CIO’s office were working on policy and other implementation work. Kenneth Rapuano, the department’s principal cyber advisor, said last week that DoD had recently added two more staff members.
“But we need to supplement them going forward, and we believe we have a path to resources to do that in the relatively near term,” he said. “This is a priority. A challenge for the department is that we have a lot of priorities, but everyone acknowledges there’s no higher priority than this.”
In his written testimony, Rapuano said the department currently plans to convert about 15,000 of its existing civilian positions into the excepted service, a significant increase from the 3,000 DoD targeted when it began the first phase in 2016. But only 403 jobs have been converted so far.
The conversion of “positions” does not necessarily mean existing employees have moved from the Title 5 personnel system into the new excepted service.
When DoD established the rules for CES, it said that current employees would be given the option to stay grandfathered into the competitive service if they chose, even if their agency had decided to convert their positions to the new system. Employees have a one-time opportunity to decide to move to CES, and have to do so within 15 to 30 days after their agency converts their position.
But in its promotional materials for the new personnel system, the department has been telling people that there is no downside to moving into CES.
DoD said that for most existing employees, all of the civil service protections and appeal rights of the Title 5 system still apply, but they’ll be eligible for potentially higher, market-based salaries and possibly speedier promotions, since CES doesn’t require civil servants to spend a set time in one pay grade before moving up to the next.
But the rules are somewhat different for first-time federal employees who are hired directly into the excepted service. For example, unlike longtime civilians who are converting into the system, they’ll stay in a probationary status that makes them much easier to fire for the first three years of their careers.
After several years in which the Pentagon knowingly scrimped on facility upkeep while it scrounged for operation and maintenance dollars to put toward military readiness, the Defense Department’s 2020 budget is finally beginning to approach the funding levels its own models say are needed to keep its infrastructure in decent shape.
For 2020, the Pentagon told the military services to peg their facility sustainment, restoration and modernization (FSRM) accounts to at least 85 percent of what DoD’s facility sustainment model says is necessary (the department’s longstanding goal has been 90 percent). Each service met or exceeded that mandate, and put billions of new dollars into FSRM.
Funding levels reached a low ebb in 2015, when the allocations were as low as 70 percent of the model, adding to a growing maintenance backlog and lengthening the list of facilities that have fallen into poor or failing condition.
The Air Force has planned the largest year-over-year increase of any of the military services: Its $4.1 billion FSRM proposal for 2020 would be a 46 percent boost over what it received for 2019. Along with the funding increase, the service is shifting its philosophy for how to allocate its facility investments.
Officials said they would prioritize maintenance projects based on mission needs and where the funds would deliver the biggest bang-for-the-buck, abandoning an earlier strategy of fixing its worst buildings first.
The previous method seems to have been a losing battle, since facilities were deteriorating faster than the Air Force could fix them at recent funding levels. The service now has a backlog of $33 billion in deferred maintenance, officials told Federal News Network.
And it is not alone. Navy budget officials also told reporters last week that the Navy has $14 billion in deferred maintenance and repairs on its bases; the Marine Corps has $9 billion.
But both services also plan sizable increases in their FSRM budgets for 2020. They’re budgeting to 87 and 88 percent of DoD’s facilities model, respectively, up from only about 80 percent this year. The Navy’s FSRM budget would increase about 25 percent compared to 2019 and the Marines would get a 43 percent boost, partly to help deal with damage caused by Hurricanes Florence and Michael last year.
“This is an area where we’ve taken some risk in recent years,” Rear Adm. Randy Crites, deputy assistant secretary of the Navy for budget, said. “This investment is going to arrest the degradation of shore facilities, and it makes targeted investments in mission-critical infrastructure. And I think the increased funding is absolutely going to help with our material condition.”
The Army, meanwhile, would see about a 22 percent increase in FSRM funding, a level that would pay for about 85 percent of the spending suggested by DoD’s model.
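The percent-of-model figures quoted throughout this section are simple ratios of budgeted FSRM dollars to what DoD’s facility sustainment model says is required. A quick sketch of the arithmetic, using a hypothetical model requirement (only the percentages come from the reporting):

```python
# Back-of-the-envelope check of the percent-of-model metric. The model
# dollar figure below is hypothetical; only the percentage thresholds
# (DoD's 85 percent floor and 90 percent goal) appear in the reporting.

def percent_of_model(budgeted_billions: float, model_billions: float) -> float:
    """FSRM funding as a percentage of DoD's facility sustainment model."""
    return round(100 * budgeted_billions / model_billions, 1)

# Hypothetical: a service budgets $4.0B against a $4.7B model requirement,
# clearing the 85 percent floor but falling short of the 90 percent goal.
print(percent_of_model(4.0, 4.7))  # -> 85.1
```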
The FSRM funds are separate from the Base Operating Support accounts that pay for day-to-day services, and from the Military Construction (MILCON) spending that funds new or replacement facilities. Each of the spending lines plays a role in ensuring base infrastructure is adequate.
The department said it was requesting $36 billion in combined FSRM and MILCON funding, including nearly $3 billion to replace facilities that were destroyed or damaged by last year’s hurricanes.
The MILCON budget also includes $3.6 billion in funds the department is setting aside just in case President Trump decides to use emergency authorities to spend military construction money on his proposed border wall in 2020, just as he is preparing to do this year. Officials said they made that allocation to avoid having to take money away from projects Congress will have already decided to fund by that time.
But it’s too late to set aside similar funding for 2019, and any MILCON money the president diverts to the wall this year will have to come from projects Congress has already explicitly funded. So the 2020 request also includes another $3.6 billion to “back-fill” those diversions.
On Monday, the Pentagon repeated an earlier promise that it would not divert any funding for contracts that have already been obligated. Rather, officials said, wall construction would only be paid for by deferring some 2019 MILCON projects until next year.
The department also released a full listing of the 2019 projects Congress has funded but for which no contracts have been signed yet. The list represents a rough approximation of the ones that could be vulnerable to delays this year if the president prevails in his legal and political fight with lawmakers over the emergency declaration.
At a hearing of the Senate Armed Services Committee last Thursday, Patrick Shanahan, acting defense secretary, said he would release the list by that afternoon — a commitment the department did not meet until Monday.
But even then, some senators were incensed that they had not already received any of the MILCON details from DoD, and that the data would arrive only after a previously scheduled vote to disapprove the president’s emergency declaration.
“I feel completely sandbagged,” Sen. Tim Kaine (D-Va.) told Shanahan. “The service secretaries have had that list … they have been willing to share the list of their unobligated MILCON projects, but they have been told that they cannot do that, it has to come through the OSD … I think we’re entitled to know where the money might come from, especially since you just said this is a multi-year declaration that opens up a spigot into the MILCON budget. I don’t think you giving us that list after the vote, when we’ve been asking for it for a month, is a good faith response to the request of this committee.”
The Defense Department is just starting its second year of full-scale financial audits, and it’s likely to take many more before those efforts yield a clean opinion. But the process is already having at least one beneficial effect: It’s pushed the military services to account for tens of millions of dollars in government property they’d lost track of.
According to DoD’s auditors, property accountability issues are still among the most serious problems preventing it from passing an audit. In the first year of the full-scope examination, auditors issued more than 170 separate findings and recommendations detailing the military services’ shortcomings in tracking their small-item inventory and real estate.
But David Norquist, DoD’s CFO and comptroller, said progress along those lines has already delivered concrete proof that the audit is not merely a paperwork drill.
“We discovered there are certain facilities where what they thought they had in inventory did not match what they had in inventory. And if your responsibility is spare parts for airplanes, the accuracy of that inventory matters,” he told the Senate Armed Services Committee last week.
One example: At Utah’s Hill Air Force Base, a stockpile of missile motors was erroneously listed as unserviceable even though the motors were in perfectly good condition. Putting them back into circulation instead of ordering new ones saved the Air Force $53 million.
“In other places, if you go to Osan and Kadena [air bases in South Korea and Japan], they had 14,000 munitions worth $2.2 billion, and 100 percent were accounted for — not a single exception,” Norquist said. “What we’ve learned is there are some places that are doing this quite well, and there are others where we need to help them fix their processes, but the commanders in the field recognize the direct connection to mission and readiness. They saw the tangible value, and I think as we move forward, the accuracy of the data and adopting more businesslike practices will be tremendously helpful.”
Instances of bad or missing data about entire warehouses worth of parts came up more than once during the course of the 2019 audit.
Thomas Modly, the undersecretary of the Navy, said the Navy found something similar when its auditors began examining a facility in San Diego.
“When we went out and actually started counting inventory and understanding where our stuff was, they found a warehouse that no one knew existed, and it had $26 million worth of parts for the E-2 and the F-18,” he said. “It was not categorized. It did not sit on any inventory system that we had in the whole Department of the Navy. Once that was identified, we were able to requisition $19 million worth of parts to aircraft that were waiting for them and were down because we didn’t even know we had those parts. This is a serious problem for us that we really have to get after, because at the end of the day, it impacts our ability to perform the mission, and our costs.”
The DoD Inspector General reported similar issues in its summary of the 2019 audit findings: more than 100 Black Hawk helicopter blades that were listed as available for use but were actually damaged, fuel injectors stored in warehouses with no documentation to show which military service owned them, and entire facilities that were demolished years ago but are still listed as active on the military’s property books.
The IG reported 20 overall material weaknesses after the first audit, and then refined the list down to six that auditors thought were most concerning. Two of the six had to do with property — one encompassed spare parts and other inventory, while the other dealt with bigger-ticket items like real estate.
“We’ve gone out and said, ‘Give us a list of a certain asset and how many you have and where they’re located.’ And when we go, we either find that they have more than they thought, or the ones on their lists don’t exist,” said Carmen Malone, the deputy assistant inspector general for audit. “If you have something in your inventory records that actually can’t be used, you’re not going to order something, because you think you already have it. From an inventory standpoint, that is a big deal.”
Malone said one of the reasons the IG considers the property issue so serious is that it has a direct bearing on military readiness.
“It’s not just from a financial statement standpoint,” she said. “We are out talking to the everyday operating people and making sure that they understand that what they do impacts not just the financial statements. This information will be used as a central location for decision makers across the department from a readiness and logistics standpoint as well. If the information is accurate for financial statements, it’s going to be accurate for the decision makers, which ultimately affects the operations and readiness of the department.”
At last week’s hearing, Norquist declined to predict when the department will finally earn a clean opinion on its full financial statement, but he said he expected that either the Army or the Marine Corps would pass an audit of a small portion of their individual statements — namely, their working capital funds — within the “next couple years.”
But Modly said his department has major, systemic challenges it still needs to solve with its accounting systems before audit passage is a reasonable probability — at least on an ongoing, repeatable basis.
“We have nine current general ledger systems. They’re not connected, and they create all kinds of disparities in our ability to truly understand our financial information,” he said. “We have business systems that are even more complicated that require interfaces that cause breaks in data security. Because of all those problems, we’re doing a lot of estimating, a lot of hand-jamming of information that most modern industrial corporations never have to do. Most modern industrial corporations can push a button and generate a financial report. We are not even close to that, and we have to get better.”
The Navy has some new theories about how its bases should connect to each other, and with the internet. If they come to fruition, they could begin to displace the Defense Information Systems Agency’s longstanding role in providing the wide-area network backbone the military services depend on.
As part of a prototype project set to begin this spring, the Navy is testing whether it might make sense to bypass DISA as its main provider for long-haul telecommunications services and outsource them to one or more commercial providers. The project would both connect the Navy’s users to the commercial cloud, and connect its bases with one another.
The concept, which the Navy calls Network-as-a-Service, would fundamentally alter the logic of how data flows across Navy networks, and comes as the service aims to move 100 percent of its IT systems to public and private clouds.
As part of that vision, a large number of applications will be hosted by commercial cloud providers. So Navy leaders are wondering if the current model — where off-site users’ traffic is first funneled through a secure connection to a Navy facility, then back out through a government-operated internet access point — makes any sense.
“We have folks in the acquisition community who say, ‘Well, maybe we can do 100 percent of our job from the commercial cloud,’” Andrew Tash, the technical director for the Navy’s Program Executive Office for Enterprise Information Systems, told an audience at a Navy industry day in San Diego last week. “If that’s the case, then why shouldn’t we have the most efficient access to those services and not be forced to actually log into an on-premise network and then be routed over? We really want to take advantage of direct access to those services.”
For the prototype, the Navy wants vendors to help prove or disprove its current working theory: that cloud service providers (CSPs) and telecom companies can deliver more seamless, less expensive routes between its users and the commercial cloud, do a better job of interfacing with the public internet than DISA’s current Internet Access Points, and connect Navy bases with one another.
“We have a lot of decisions to make in the Department of Navy with respect to network architecture, and many of those decisions are based on assumptions, not on quantitative information about performance,” said Will Stephens, who leads business and technology strategy for PEO-EIS. “So the purpose is to set up alternative connection methodologies to allow our users to get to the cloud through the CSP’s own internet access point, and also to get connectivity from one base to another using the CSP’s services rather than our current way and services, which we know are a little bit difficult — they’re not dynamic to allocate and adjust.”
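The comparison Stephens describes boils down to measuring two candidate routes with different hop counts. As a toy model of the two patterns (hop names and latency numbers are invented for illustration, not Navy measurements):

```python
# Toy model of the two routing patterns under comparison. Hop names and
# latencies are invented for illustration, not Navy measurements.

def path_latency_ms(hops):
    """Total latency of a route, as the sum of its per-hop latencies (ms)."""
    return sum(ms for _hop, ms in hops)

# Today: off-site user -> VPN into a Navy facility -> DISA internet
# access point -> commercial cloud (the "hairpin" route).
hairpin = [("user->navy_facility", 35),
           ("navy_facility->disa_iap", 20),
           ("disa_iap->cloud", 15)]

# Prototype: off-site user -> commercial provider's own internet access
# point -> cloud (the direct route the Navy wants to measure).
direct = [("user->csp_iap", 12),
          ("csp_iap->cloud", 8)]

print(path_latency_ms(hairpin), path_latency_ms(direct))  # -> 70 20
```

The prototype’s dual network paths would replace these invented numbers with the real quantitative performance data Stephens says the Navy currently lacks.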
The Navy is soliciting the work through its Information Warfare Research Project, a $100 million Other Transaction Authority vehicle it established last year for rapid IT and cyber prototypes.
Officials expect to make an award for the Network-as-a-Service experiment by April 26; the Navy wants a working prototype up and running by July 26.
“It will be a connection to our production environment, and we’ll have two network paths from that point of presence: one across our current network path, and one across the new network path that we’re setting up as part of this Network as a Service architecture,” Stephens said. “We’ll also get non-binding cost estimations so that we can determine whether or not this is feasible from a cost perspective.”
With the Defense Department’s JEDI Cloud contract at the center of bid protests, a new conflict of interest investigation and now a separate criminal probe, the most important elements of a cloud computing strategy DoD published only a month ago have been essentially frozen in amber.
The key features of the strategy were the concepts of “general purpose” and “fit-for-purpose” clouds. In it, the department said it wanted to move most of its applications and data to the former — JEDI — while also making decisions about which of the mission-specific clouds being built by defense components should be allowed to survive.
But in testimony to the House Armed Services Committee last week, Dana Deasy, DoD’s chief information officer, said all of that work is effectively on hold until the dust settles around JEDI.
“The longer we delay standing up a JEDI capability, the military services are going to need to go solve for mission sets, and they’re going to continue to stand up their own individual environments. I don’t see that as being beneficial over the long term to the department,” he said. “The fine line we’re walking right now is to not impede the need for mission success — where people are standing up [their own] clouds — and as soon as we can, provide clarity to the DoD on when the enterprise cloud will be available and then redirect those activities onto JEDI.”
The strategy the department released on Feb. 4 envisions a universe in which an overwhelming majority of the military’s systems and data are housed in the JEDI cloud, partly because officials believe that is the only reasonable approach to eliminating DoD’s existing IT stovepipes and making its vast data holdings available to artificial intelligence algorithms.
At the same time, the DoD CIO is supposed to comb through the roughly 300 cloud projects various DoD components have already begun, and decide which are candidates for the “fit-for-purpose” clouds that won’t fit within JEDI.
But Deasy made clear that neither of those things can happen until the JEDI matter is resolved.
He said he currently believes that up to 90 percent of the new applications the military develops going forward should be designed for cloud architectures, and should be able to operate within the “general purpose” cloud.
“But the big thing hanging out there right now is until we know what that architecture and that cloud’s going to look like, it’s very difficult to start estimation exercises.”
As for the strategy’s promise to begin determining which existing clouds will be allowed to continue operating, that work is also on hold.
“That is something we still have to do,” he said. “Right now, obviously, our focus is to make sure we know what the architecture is going to look like for our general purpose, which will help inform us on things that will stay fit-for-purpose, or move over. I would be surely guessing as to a certain percentage of a number of those 300 that will be migrated onto general versus fit-for-purpose until we understand the overall architecture.”
Deasy said his office believes it will have a clear enough picture of JEDI’s eventual architecture about 60 days before it comes online, and at that point, will be able to start making decisions about which applications can transition to the general purpose cloud.
But the “go-live” date is still highly uncertain.
DoD had initially planned to make a JEDI award by April of this year. But in a legal disclosure last month, Chandra Brooks, the project’s contracting officer, said she is conducting a new investigation into allegations of conflict of interest involving Amazon Web Services and a former DoD employee, Deap Ubhi.
In an affidavit to the Court of Federal Claims, Brooks said even after the investigation is complete, she will wait another 90 days before making an award.
The military services are proceeding on the assumption that something resembling DoD’s JEDI vision will eventually exist, but are not waiting around to begin at least some large-scale transitions to commercial cloud providers.
The Navy, for example, is already in the process of moving some of its largest business systems — including its main enterprise resource planning system and several personnel databases — to the cloud, using its own contracts.
It remains unclear whether those will eventually be deemed “fit-for-purpose” clouds that will survive the JEDI transition, or whether they will be subsumed into JEDI.
“Defining what those are will be key as we move forward,” said Ruth Youngs Lew, the Navy’s program executive officer for enterprise information systems. “We have a couple of Navy cloud contracts right now, but those are intermediate steps. Our plan is to fully transition to JEDI at some point in the future, when they get it awarded.”
The Air Force spends north of $40 billion on acquisition and R&D each year, a sum that, in theory, ought to be plenty to sustain a healthy and competitive industrial base.
But one of the service’s big problems, according to its acquisition chief, is that those dollars tend to be concentrated among only a relative handful of major systems. Meanwhile, the big investment decisions the Air Force makes to introduce new technologies come several years apart.
“It takes decades between fighters, decades between tankers,” said Will Roper, the assistant secretary for acquisition, technology and logistics. “When the Air Force started off, it was years — in the small single digits. A fundamental flaw that we have in working with the industry base is that we don’t do enough big ideas, enough prototypes, enough diversifying, because the frequency of our awards is too slow.”
Among the disadvantages of the Air Force’s habit of making large, infrequent spending decisions is that there aren’t many opportunities for new competitors to gain a foothold in the market, Roper said.
So, as a remedy, he said the service will create a “big idea pipeline” in which it will commit itself to funding a greater number of technologies and at a faster clip. And he suggested the Air Force’s funding commitments will go beyond small prototypes: They’ll need more institutional commitment behind them to carry new technologies across the “valley of death.”
“There’s no valley in a pipeline, right? A pipeline keeps flowing. It’s important that we adopt this mentality, because a pipeline is inherently competitive,” Roper told the Air Force Association’s annual winter symposium in Orlando. “We can’t predict the future. If I told you what 2030 would be like … the future is so uncertain that it doesn’t make sense to predict the adversary at that timeframe and then build the Air Force that beats them. That’s too risky and it’s too prescriptive. We need to be creating new concepts constantly that challenge what our opponents think about us, that impose cost, that make them react to us, that pull them off their game plan. And right now there’s not a concerted effort to do it.”
The Air Force is already in the midst of an expansive review of its existing science and technology policies and spending. Heather Wilson, the Air Force secretary, announced the study in September 2017, saying she wanted the service to find ways to expand its science and tech partnerships in ways that would be directly relevant to warfighting in the coming decades.
Wilson told last week’s conference the study is “very close” to being finished, and Roper suggested the service planned to use the results to inform its “big idea pipeline.”
As one example of how the new approach might feed into real-world systems, he offered the forthcoming Advanced Battle Management System.
ABMS is the Air Force’s notional replacement for J-STARS, the airborne surveillance system it uses to track enemy forces on the ground. Instead of replacing one aging surveillance airplane with a newer airframe, the service has decided to try to deliver the same capabilities via a networked system of sensors that it hopes will be more survivable in future combat scenarios.
“We are going to have to break it up and put part of it in space, part of it in air, potentially have an attritable layer, and have the networking and data sharing amongst all of it so that it works as one organism, even though it really is a separate diversified family of systems,” he said. “We’ve never done anything like that. So [it] has to be shepherded differently, because it will bring in next gen technology, but in a framework that will move forward into a program of record.”
Feeding the pipeline will require some money beyond the $2 billion the Air Force already spends on science and technology. And Roper said he hopes to find much of it by squeezing savings out of the service’s existing weapons system sustainment costs, which currently make up about 70 percent of the total lifecycle spending for an average weapons platform.
Cutting those costs by as little as 5-10 percent would go a long way toward making room for additional “big ideas” that would not otherwise see the light of day in the Air Force’s current funding portfolio, he said.
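Roper’s arithmetic is worth spelling out. A back-of-the-envelope sketch (the $100 billion lifecycle total is a made-up round number for illustration; the 70 percent sustainment share and 5-10 percent savings range are the figures from his remarks):

```python
# Back-of-the-envelope illustration of the sustainment-savings math.
# The lifecycle total is a hypothetical round number, not an Air Force figure.
lifecycle_total = 100.0       # notional total lifecycle cost, in $B
sustainment_share = 0.70      # sustainment is about 70% of lifecycle cost
sustainment = lifecycle_total * sustainment_share

for cut in (0.05, 0.10):      # the 5-10% savings range Roper cited
    freed = sustainment * cut
    print(f"A {cut:.0%} sustainment cut frees ${freed:.1f}B "
          f"({freed / lifecycle_total:.1%} of total lifecycle spending)")
```

In other words, because sustainment dominates the lifecycle bill, even a modest percentage cut there frees a meaningful share of the total portfolio for new starts.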
And Roper said there’s good reason to believe there are a significant number of dollars to be wrung from sustainment, because even though the Air Force’s logistics workforce is “defying gravity” with the quality of their work, the depots they work in are not always equipped with state-of-the-art technology and business processes.
But increasing the Air Force’s uptake rate for new technologies isn’t just a matter of making room for them in its budget.
Roper said the service realizes it also needs to make itself a much more attractive customer for smaller, non-traditional vendors, and that its acquisition workforce needs to become comfortable working with multiple types of companies — from huge defense primes to small startups.
As one way to address the latter group, the service is planning an event in New York later this week where it will sign $40 million in contracts with small firms that make dual-use products. At the “Pitch Day,” each company the Air Force selects out of a group of 60 will be given a contract that’s no longer than one page, and the service will pay the firms on the same day, using government purchase cards.
“Pitch day will be our first entree to have an open door to nontraditional companies with dual-use technologies,” Roper said. “But it needs to be more than that. Prototyping is helping our defense industry base, but we need to do more than that. We will shepherd through the first generation of big ideas, but we need to be discussing those at every AFA.”
A federal court has agreed to temporarily delay any further proceedings in a lawsuit challenging the Pentagon’s JEDI Cloud contract while the Defense Department conducts its own investigation into whether the procurement was compromised by conflicts of interest.
In a one-paragraph ruling on Tuesday, Court of Federal Claims Judge Eric Bruggink ordered a stay in the case, filed by Oracle. The court had previously been scheduled to hear oral arguments on the company’s bid protest during an April 4 hearing.
But the government filed a motion on Tuesday asking that the case be put on hold. That motion was filed under seal, but according to Bruggink’s order, DoD requested it because it is “reconsidering whether possible personal conflicts of interest impacted the integrity of the JEDI Cloud procurement.”
Oracle has long contended that at least two Defense officials with ties to Amazon Web Services helped shape the procurement in a way that favored AWS from the outset, including by insisting that the contract, worth up to $10 billion, be awarded to only one vendor.
In particular, the company has pointed to Deap Ubhi, a former member of the Defense Digital Service who it claims had a “personal and substantial” role in the early days of JEDI procurement planning. Ubhi at that time was a former AWS employee, and has since returned to the company. Oracle alleges he was also in negotiations for AWS to purchase his startup company during his tenure at DDS.
As one indicator of Ubhi’s role, Oracle, in a court filing last week, cited an internal September 2017 message to him from Sharon Woods, DDS’s general counsel, saying that she was “nervous” about the single-award approach he had been advocating, and asking for a more detailed explanation for why it made sense.
Later, “Ubhi drafted the ‘Problem Statement’ for DoD leadership that ‘explains the problem we are solving with this initiative,’ including ‘why can’t we solve the problem with multiple clouds,’ and ‘why is only one cloud a truly necessary requirement,’” according to last week’s court filing. “Ubhi also contributed to the Business Case Analysis, a document that ‘serv[ed] as a foundation’ for JEDI.”
The company claims there was a similar conflict of interest on the part of Anthony DeMartino, who served as Deputy Defense Secretary Patrick Shanahan’s chief of staff during the early stages of the JEDI planning effort.
DeMartino had previously done consulting work for Amazon, and because of that, department ethics officials had advised him not to participate in any matters involving AWS without prior approval from the Standards of Conduct Office. But Oracle, again, citing DoD records, claims he ignored those directions by directly involving himself in the JEDI procurement.
The conflict of interest claims are only one of the complaints Oracle made in its lawsuit.
In last week’s motion, which asked the court to issue a summary judgment in Oracle’s favor, the company said the Pentagon had settled on a single-vendor strategy for its JEDI Cloud contract from the very outset, but sought to hide that fact in order to avoid public controversy.
Citing internal records it obtained as part of the suit, it said DoD’s Cloud Executive Steering Group decided on the single-vendor approach at its very first meeting. That meeting was held just one day after Deputy Defense Secretary Patrick Shanahan signed a memo establishing the CESG in September 2017.
Lawyers for the tech firm argued that the early date is relevant, because it is one indication that the legal analyses DoD eventually prepared for Congress and contractors to justify its single-award decision were “one sided” and designed to support a conclusion its officials had already reached months earlier.
They also alleged that the government had withheld documents about those and other parts of the planning process from the Government Accountability Office during an earlier protest, in which GAO ruled in DoD’s favor.
“The [contracting officer’s] July 17, 2018 memorandum (prepared months after DoD chose the administrative convenience of a single award) prejudicially violates the law,” attorneys wrote. “The CO did not meaningfully consider the benefits of competition, arbitrarily inflated the cost of competition, and undermined Congressional policy.”
Although federal law tells agencies to give preference to multiple-award ID/IQ contracts whenever they can, Oracle claims DoD took several steps to restrict competition from the early stages of the procurement.
Among the decisions it’s challenging in the suit is the Pentagon’s insistence that cloud competitors have their services authorized by the government’s FedRAMP program in at least three physical data centers that are at least 150 miles apart.
The company claims those prerequisites have more to do with reducing the number of potential bidders than actual performance on the JEDI program, since the contract will use its own cybersecurity plan, not FedRAMP. Another of the “gate criteria” Oracle criticizes as irrelevant is the requirement that the winning firm’s JEDI revenue not be larger than the commercial cloud business it had as of January 2018.
“The gate criteria served their improper purpose,” attorneys wrote. “When first announced, JEDI sparked intense interest in the contracting community: Hundreds of companies attended the industry day, and more than sixty companies submitted detailed RFI responses identifying extensive, sometimes unique capabilities. But following the issuance of the final RFP and its restrictive gate criteria, only four companies attended DoD’s in-person question and answer session and ultimately submitted proposals.”
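Whatever its merits as a gate, the 150-mile separation requirement itself is a simple geometric check. A sketch using the haversine great-circle formula (the coordinates below are hypothetical examples, not actual vendor data center locations):

```python
# Illustrative check of a "three data centers, at least 150 miles apart" rule.
# Coordinates are made-up examples, not real vendor facilities.
from itertools import combinations
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MI = 3958.8

def haversine_miles(a, b):
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MI * asin(sqrt(h))

def meets_separation_rule(sites, min_miles=150, min_sites=3):
    """True if there are at least `min_sites` centers, all pairwise
    separated by at least `min_miles`."""
    return (len(sites) >= min_sites and
            all(haversine_miles(a, b) >= min_miles
                for a, b in combinations(sites, 2)))

# Hypothetical sites: northern Virginia, central Ohio, Portland, Ore.
sites = [(38.95, -77.45), (39.96, -83.00), (45.52, -122.68)]
print(meets_separation_rule(sites))
```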
In its earlier decision, GAO acknowledged the preference in federal law for multiple-award contracts, but found that DoD has wide discretion to take advantage of exceptions to the rule for national security reasons.
GAO did not rule on Oracle’s conflict of interest claims, because it said any conflicts wouldn’t be relevant until and unless AWS actually wins the contract.
But at the Court of Federal Claims, Oracle has maintained that the alleged conflicts had a direct bearing on how the Pentagon structured the contract in the first place.
Later this year, the Navy will make billions of dollars in awards for the next phase of its Next Generation Enterprise Network contract. But the longer-term future of the Navy’s ashore networks might be determined by a series of smaller, upcoming other transaction agreements.
Next week, the Navy plans to issue three separate problem statements to pave the way for what it’s calling Modern Service Delivery: A cloud-centric concept that posits that its IT users ought to be able to access roughly the same services, no matter whether they’re physically located on a Navy installation or are remotely connected via a mobile device.
“Today we have capabilities available to us when we’re on the network, and then a different set when we’re mobile. We want to drive parity for access to services and systems and data, whether we are at work, at home or on the go,” said Andrew Tash, the technical director for the Navy’s Program Executive Office for Enterprise Information Systems (PEO-EIS).
Speaking at the Navy CIO’s annual San Diego conference, Tash said the three OTA projects PEO-EIS is launching represent its latest approach to a rapid technology development process it first began in 2015 called the “innovation cell.” It plans to manage the awards through the Information Warfare Research Project, a $100 million OTA vehicle the Navy’s Space and Naval Warfare Systems Command established last summer.
As a starting point for ubiquitous data access, the Navy thinks it needs a new approach to identity and access management. So the first problem statement will ask vendors for an “integrated suite” of identity management capabilities that can work across the service’s systems.
“Identity is the new boundary. It assumes breach and focuses our security in a different place: it focuses on the data, and it focuses on user behavior as opposed to the network boundary,” Tash said. “That’s really the main thrust of our strategy, getting away from tightly-coupled services that force people to be on a network to access them. It’s just users connecting to data and services. It may seem pretty simple, but it removes a lot of silos and bad IT behavior that we’ve grown up with over the past 15 or so years.”
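The identity-centric model Tash describes can be sketched in a few lines: the access decision keys entirely on who the user is and what they are touching, with no network-location check anywhere in the logic. All of the names and the policy table below are illustrative assumptions, not Navy APIs:

```python
# Sketch of an identity-centric (rather than network-boundary) access check.
# Roles, resources and the policy table are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRequest:
    user_id: str
    roles: frozenset       # claims carried in a verified identity token
    resource: str
    sensitivity: str       # e.g. "public", "internal", "restricted"

# Which roles may read which sensitivity tiers.
POLICY = {
    "public": {"basic-user", "analyst", "admin"},
    "internal": {"analyst", "admin"},
    "restricted": {"admin"},
}

def is_allowed(req: AccessRequest) -> bool:
    """Grant access if any of the user's roles clears the resource's tier.
    Note there is no network or IP check anywhere in the decision."""
    allowed_roles = POLICY.get(req.sensitivity, set())
    return bool(req.roles & allowed_roles)

req = AccessRequest("jdoe", frozenset({"analyst"}), "/personnel/db", "internal")
print(is_allowed(req))   # the same answer whether jdoe is on base or mobile
```

The point of the sketch is the absence: because nothing in the decision depends on where the request originates, the same policy yields the same answer at work, at home or on the go.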
The Navy is already beginning to move some of its largest business IT systems — including its main enterprise resource planning system and some of its personnel databases — into the cloud.
And since its key data assets are already beginning to live in a mixture of on-premises systems and commercial hosting environments, giving location-agnostic access to all the data a particular user might be entitled to isn’t just a matter of creating a secure tunnel to the Navy-Marine Corps Intranet.
So the second problem statement will aim to build out the concept of Network-as-a-Service — a structure that would virtualize the Navy’s networks in the same way cloud computing lets it virtualize its servers.
“If all of our business systems are moving to commercial cloud, then shouldn’t we have the most efficient connectivity to the commercial cloud, to the point where I can do 100 percent of my job from commercial cloud services? So now we’ve got to rethink what the [DoD Information Network] means in that context,” Tash said. “Yes, the data is part of the DoDIN, but how we consume that and how we inter-operate with the rest of the DoD community is up for debate. But we look at the business systems as being the primary opportunity right now to leverage things like network as a service.”
The third problem statement will ask the IWRP consortium’s members for technologies the Navy could use to manage its users’ access to the systems it’s moving to commercial cloud environments.
That approach, called Cloud Access Security Broker (CASB), aims to monitor users’ interactions with cloud services and enforce an organization’s security policies even when users connect directly to the cloud provider from an off-premises location, such as a home network or a mobile device.
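In essence, a CASB sits in the path between users and cloud services (or consumes the provider’s audit feed) and applies policy to each event. A minimal sketch of that enforcement idea, with made-up event fields and policy rules:

```python
# Minimal sketch of CASB-style policy enforcement: classify each cloud-usage
# event as allow, alert or block. Event fields and rules are illustrative
# assumptions, not any vendor's schema.
def evaluate_event(event: dict) -> str:
    """Return 'allow', 'alert' or 'block' for one cloud-usage event."""
    if not event.get("mfa_verified"):
        return "block"                 # example org policy: MFA everywhere
    if (event.get("action") == "download"
            and event.get("classification") == "restricted"):
        if not event.get("managed_device"):
            return "block"             # restricted data + unmanaged device
        return "alert"                 # managed device: allow, but log it
    return "allow"

events = [
    {"action": "read", "mfa_verified": True},
    {"action": "download", "classification": "restricted",
     "mfa_verified": True, "managed_device": False},
]
print([evaluate_event(e) for e in events])   # → ['allow', 'block']
```

Because the decision is made per event rather than per network connection, the same rules apply whether the user is on the Navy-Marine Corps Intranet or at home.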
Many of the long-term details of the Navy’s cloud-dependent vision will depend on the final disposition of DoD’s forthcoming Joint Enterprise Defense Infrastructure (JEDI) contract.
The highly contentious solicitation is still tied up in federal court, but in its new cloud computing strategy earlier this month, the Pentagon made clear that it expects the military services and Defense agencies to use JEDI as their primary solution for cloud services.
But Navy officials say they are not waiting for JEDI before beginning some of their major cloud transitions. Navy ERP is the first major application it’s moving to the cloud, but the service is also in the process of consolidating nine separate manpower and personnel databases into a single authoritative data source; it’s looking, for now, to put that information in Amazon Web Services’ GovCloud.
Notionally, the DoD strategy allows the military services to keep their data in non-JEDI clouds, called “fit-for-purpose” clouds, but only by special permission from the DoD CIO.
“Defining what those are will be key as we move forward,” said Ruth Youngs Lew, the Navy’s Program Executive Officer for Enterprise Information Systems. “We have a couple of Navy cloud contracts right now, but those are intermediate steps. Our plan is to fully transition to JEDI at some point in the future, when they get it awarded.”