DoD Reporter’s Notebook

New DoD personnel system hires cyber workers faster, but numbers still small

The Pentagon’s new personnel system for cyber employees is still in an experimental stage after having taken years to get off the ground, but it does appear to be achieving at least one of its intended objectives: Speeding up the federal hiring process.

So far, new employees coming in under the Cyber Excepted Service (CES) are being hired in less than half the time it took to hire them within the traditional competitive service, according to Gen. Paul Nakasone, the commander of U.S. Cyber Command.

Nakasone told members of the House Armed Services Committee last week the average time-to-hire under CES is about 44 days, compared to 111 days before CYBERCOM implemented the excepted service.

“We have done over 21 different fairs. We’ve interviewed over 2,700 people. We’ve provided over 90 acceptances for job applications,” he said. “My perspective, early phase, is I’m a supporter of it. I look forward to continuing to utilize it.”

When Congress authorized CES in 2015, it gave DoD wide discretion to recruit employees into the new personnel system via whatever means it chooses. The department can opt to advertise positions on the government’s USAJobs website, but it can also bypass the site entirely and recruit candidates directly.

CYBERCOM says it’s used the authority extensively at job fairs, where it can make on-the-spot job offers to candidates, usually after pre-screening their written applications.

But Cyber Command, the Joint Force Headquarters-DoD Information Network (JFHQ-DoDIN) and the DoD chief information officer’s office are the only organizations the department allowed to use the excepted service during its first phase.

A broader rollout — to other Defense agencies and to the cyber components of the military services — has been slow to take place, partly because the Pentagon offices in charge of implementing CES have been under-resourced.

Defense officials testified last month that only five full-time employees in the DoD CIO’s office were working on policy and other implementation work. Kenneth Rapuano, the department’s principal cyber advisor, said last week that DoD had recently added two more staff members.

“But we need to supplement them going forward, and we believe we have a path to resources to do that in the relatively near term,” he said. “This is a priority. A challenge for the department is that we have a lot of priorities, but everyone acknowledges there’s no higher priority than this.”

In his written testimony, Rapuano said the department currently plans to convert about 15,000 of its existing civilian positions into the excepted service, a significant increase from the 3,000 DoD targeted when it began the first phase in 2016. But only 403 jobs have been converted so far.

The conversion of “positions” does not necessarily mean existing employees have moved from the Title 5 personnel system into the new excepted service.

When DoD established the rules for CES, it said that current employees would be given the option to stay grandfathered into the competitive service if they chose, even if their agency had decided to convert their positions to the new system. Employees have a one-time opportunity to decide to move to CES, and have to do so within 15 to 30 days after their agency converts their position.

But in its promotional materials for the new personnel system, the department has been telling people that there is no downside to moving into CES.

DoD said that for most existing employees, all of the civil service protections and appeal rights of the Title 5 system still apply, but they’ll be eligible for potentially higher, market-based salaries, and possibly speedier promotions, since CES doesn’t require civil servants to spend a set time in one pay grade before moving up to the next.

But the rules are somewhat different for first-time federal employees who are hired directly into the excepted service. For example, unlike longtime civilians who are converting into the system, they’ll stay in a probationary status that makes them much easier to fire for the first three years of their careers.

Read more of the DoD Reporter’s Notebook


After years of neglect, military facility upkeep gets attention in 2020 budget

After several years in which the Pentagon knowingly scrimped on facility upkeep while it scrounged for operation and maintenance dollars to put toward military readiness, the Defense Department’s 2020 budget is finally beginning to approach the funding levels its own models say are needed to keep its infrastructure in decent shape.

For 2020, the Pentagon told the military services to peg their facility sustainment, restoration and modernization (FSRM) accounts to at least 85 percent of what DoD’s facility sustainment model says is necessary (the department’s longstanding goal has been 90 percent). Each service met or exceeded that mandate, and put billions of new dollars into FSRM.

Funding levels reached a low ebb in 2015, when the allocations were as low as 70 percent of the model, adding to a growing maintenance backlog and lengthening the list of facilities that have fallen into poor or failing condition.

The Air Force has planned the largest year-over-year increase of any of the military services: Its $4.1 billion FSRM proposal for 2020 would be a 46 percent boost over what it received for 2019. Along with the funding increase, the service is shifting its philosophy for how to allocate its facility investments.

Officials said they would prioritize maintenance projects based on mission needs and where the funds would deliver the biggest bang-for-the-buck, abandoning an earlier strategy of fixing its worst buildings first.

The previous method seems to have been a losing battle, since facilities were deteriorating faster than the Air Force could fix them at recent funding levels. The service now has a backlog of $33 billion in deferred maintenance, officials told Federal News Network.

And it is not alone. Navy budget officials also told reporters last week that the Navy has $14 billion in deferred maintenance and repairs on its bases; the Marine Corps has $9 billion.

But both services also plan sizable increases in their FSRM budgets for 2020. They’re budgeting to 87 and 88 percent of DoD’s facilities model, respectively, up from only about 80 percent this year. The Navy’s FSRM budget would increase about 25 percent compared to 2019 and the Marines would get a 43 percent boost, partly to help deal with damage caused by Hurricanes Florence and Michael last year.

“This is an area where we’ve taken some risk in recent years,” Rear Adm. Randy Crites, deputy assistant secretary of the Navy for budget, said. “This investment is going to arrest the degradation of shore facilities, and it makes targeted investments in mission-critical infrastructure. And I think the increased funding is absolutely going to help with our material condition.”

The Army, meanwhile, would see about a 22 percent increase in FSRM funding, a level that would pay for about 85 percent of the spending suggested by DoD’s model.

The FSRM funds are separate from the Base Operating Support accounts that pay for day-to-day services, and from the Military Construction (MILCON) spending that funds new or replacement facilities. Each of the spending lines plays a role in ensuring base infrastructure is adequate.

The department said it was requesting $36 billion in combined FSRM and MILCON funding, including nearly $3 billion to replace facilities that were destroyed or damaged by last year’s hurricanes.

The MILCON budget also includes $3.6 billion the department is setting aside in case President Trump decides to use emergency authorities to spend military construction money on his proposed border wall in 2020, as he is preparing to do this year. Officials said they made that allocation to avoid having to take money away from projects Congress will have already decided to fund by that time.

But it’s too late to set aside similar funding for 2019, and any MILCON money the president diverts to the wall this year will have to come from projects Congress has already explicitly funded. So the 2020 request also includes another $3.6 billion to “back-fill” those diversions.

On Monday, the Pentagon repeated an earlier promise that it would not divert any funding for contracts that have already been obligated. Rather, officials said, wall construction would only be paid for by deferring some 2019 MILCON projects until next year.

The department also released a full listing of the 2019 projects Congress has funded but for which no contracts have been signed yet. The list represents a rough approximation of the projects that could be vulnerable to delays this year if the president prevails in his legal and political fight with lawmakers over the emergency declaration.

At a hearing of the Senate Armed Services Committee last Thursday, Patrick Shanahan, acting defense secretary, said he would release the list by that afternoon — a commitment the department did not meet until Monday.

But even then, some senators were incensed that they had not already received any of the MILCON details from DoD, and that the data would arrive only after a previously scheduled vote to disapprove the president’s emergency declaration.

“I feel completely sandbagged,” Sen. Tim Kaine (D-Va.) told Shanahan. “The service secretaries have had that list … they have been willing to share the list of their unobligated MILCON projects, but they have been told that they cannot do that, it has to come through the OSD … I think we’re entitled to know where the money might come from, especially since you just said this is a multi-year declaration that opens up a spigot into the MILCON budget. I don’t think you giving us that list after the vote, when we’ve been asking for it for a month, is a good faith response to the request of this committee.”

Read more of the DoD Reporter’s Notebook


DoD audit uncovers millions of dollars in unaccounted-for spare parts

The Defense Department is just starting its second year of full-scale financial audits, and it’s likely to take many more before those efforts yield a clean opinion. But the process is already having at least one beneficial effect: It’s pushed the military services to account for tens of millions of dollars in government property they’d lost track of.

According to DoD’s auditors, property accountability issues are still among the most serious problems preventing it from passing an audit. In the first year of the full-scope examination, auditors issued more than 170 separate findings and recommendations detailing the military services’ shortcomings in tracking their small-item inventory and real estate.

But David Norquist, DoD’s CFO and comptroller, said progress along those lines has already delivered concrete proof that the audit is not merely a paperwork drill.

“We discovered there are certain facilities where what they thought they had in inventory did not match what they had in inventory. And if your responsibility is spare parts for airplanes, the accuracy of that inventory matters,” he told the Senate Armed Services Committee last week.

In one example, at Utah’s Hill Air Force Base, a stockpile of missile motors was erroneously listed as unserviceable even though the motors were in perfectly good condition. Putting them back into circulation instead of ordering new ones saved the Air Force $53 million.

“In other places, if you go to Osan and Kadena [air bases in South Korea and Japan], they had 14,000 munitions worth $2.2 billion, and 100 percent were accounted for — not a single exception,” Norquist said. “What we’ve learned is there are some places that are doing this quite well, and there are others where we need to help them fix their processes, but the commanders in the field recognize the direct connection to mission and readiness. They saw the tangible value, and I think as we move forward, the accuracy of the data and adopting more businesslike practices will be tremendously helpful.”

Facilities ‘no one knew existed’

Instances of bad or missing data about entire warehouses worth of parts came up more than once during the course of the 2019 audit.

Thomas Modly, the undersecretary of the Navy, said the Navy found something similar when its auditors began examining a facility in San Diego.

“When we went out and actually started counting inventory and understanding where our stuff was, they found a warehouse that no one knew existed, and it had $26 million worth of parts for the E-2 and the F-18,” he said. “It was not categorized. It did not sit on any inventory system that we had in the whole Department of the Navy. Once that was identified, we were able to requisition $19 million worth of parts to aircraft that were waiting for them and were down because we didn’t even know we had those parts. This is a serious problem for us that we really have to get after, because at the end of the day, it impacts our ability to perform the mission, and our costs.”

The DoD Inspector General reported similar issues in its summary of the 2019 audit findings: more than 100 Black Hawk helicopter blades that were listed as available for use but were actually damaged; fuel injectors stored in warehouses with no documentation to show which military service owned them; entire facilities that had been demolished years ago but are still listed as active on the military’s property books.

The IG reported 20 overall material weaknesses after the first audit, and then refined the list down to six that auditors thought were most concerning. Two of the six had to do with property — one encompassed spare parts and other inventory, while the other dealt with bigger-ticket items like real estate.

“We’ve gone out and said, ‘Give us a list of a certain asset and how many you have and where they’re located.’ And when we go, we either find that they have more than they thought, or the ones on their lists don’t exist,” said Carmen Malone, the deputy assistant inspector general for audit. “If you have something in your inventory records that actually can’t be used, you’re not going to order something, because you think you already have it. From an inventory standpoint, that is a big deal.”

Malone said one of the reasons the IG considers the property issue so serious is that it has a direct bearing on military readiness.

“It’s not just from a financial statement standpoint,” she said. “We are out talking to the everyday operating people and making sure that they understand that what they do impacts not just the financial statements. This information will be used as a central location for decision makers across the department from a readiness and logistics standpoint as well. If the information is accurate for financial statements, it’s going to be accurate for the decision makers, which ultimately affects the operations and readiness of the department.”

At last week’s hearing, Norquist declined to predict when the department will finally earn a clean opinion on its full financial statement, but he said he expected that either the Army or the Marine Corps would pass an audit of a small portion of their individual statements — namely, their working capital funds — within the “next couple years.”

But Modly said his department has major, systemic challenges it still needs to solve with its accounting systems before audit passage is a reasonable probability — at least on an ongoing, repeatable basis.

“We have nine current general ledger systems. They’re not connected, and they create all kinds of disparities in our ability to truly understand our financial information,” he said. “We have business systems that are even more complicated that require interfaces that cause breaks in data security. Because of all those problems, we’re doing a lot of estimating, a lot of hand-jamming of information that most modern industrial corporations never have to do. Most modern industrial corporations can push a button and generate a financial report. We are not even close to that, and we have to get better.”

Read more of the DoD Reporter’s Notebook


For flexibility, Navy may bypass DISA for some long-haul network needs

The Navy has some new theories about how its bases should connect to each other and to the internet. If they come to fruition, they could begin to displace the Defense Information Systems Agency’s longstanding role in providing the wide-area network backbone the military services depend on.

As part of a prototype project set to begin this spring, the Navy is testing whether it might make sense to bypass DISA as its main provider for long-haul telecommunications services and outsource them to one or more commercial providers. The project would both connect the Navy’s users to the commercial cloud, and connect its bases with one another.

The concept, which the Navy calls Network-as-a-Service, would fundamentally alter the logic of how data flows across Navy networks, and comes as the service aims to move 100 percent of its IT systems to public and private clouds.

As part of that vision, a large number of applications will be hosted by commercial cloud providers. So Navy leaders are wondering if the current model — where off-site users’ traffic is first funneled through a secure connection to a Navy facility, then back out through a government-operated internet access point — makes any sense.

“We have folks in the acquisition community who say, ‘Well, maybe we can do 100 percent of our job from the commercial cloud,’” Andrew Tash, the technical director for the Navy’s Program Executive Office for Enterprise Information Systems, told an audience at a Navy industry day in San Diego last week. “If that’s the case, then why shouldn’t we have the most efficient access to those services and not be forced to actually log into an on-premise network and then be routed over? We really want to take advantage of direct access to those services.”

For the prototype, the Navy wants vendors to help prove or disprove its current working theory: that cloud service providers (CSPs) and telecom companies can deliver more seamless, less expensive routes between its users and the commercial cloud, do a better job of interfacing with the public internet than DISA’s current Internet Access Points, and connect Navy bases with one another.

“We have a lot of decisions to make in the Department of Navy with respect to network architecture, and many of those decisions are based on assumptions, not on quantitative information about performance,” said Will Stephens, who leads business and technology strategy for PEO-EIS. “So the purpose is to set up alternative connection methodologies to allow our users to get to the cloud through the CSP’s own internet access point, and also to get connectivity from one base to another using the CSP’s services rather than our current way and services, which we know are a little bit difficult — they’re not dynamic to allocate and adjust.”

The Navy is soliciting the work through its Information Warfare Research Project, a $100 million Other Transaction Authority vehicle it established last year for rapid IT and cyber prototypes.

Officials expect to make an award for the Network-as-a-Service experiment by April 26; the Navy wants a working prototype up and running by July 26.

“It will be a connection to our production environment, and we’ll have two network paths from that point of presence: one across our current network path, and one across the new network path that we’re setting up as part of this Network as a Service architecture,” Stephens said. “We’ll also get non-binding cost estimations so that we can determine whether or not this is feasible from a cost perspective.”

Read more of the DoD Reporter’s Notebook


DoD’s long-term cloud strategy in stasis amid JEDI controversy

With the Defense Department’s JEDI Cloud contract at the center of bid protests, a new conflict of interest investigation and now a separate criminal probe, the most important elements of a cloud computing strategy DoD published only a month ago have been essentially frozen in amber.

The key features of the strategy were the concepts of “general purpose” and “fit-for-purpose” clouds. In it, the department said it wanted to move most of its applications and data to the former — JEDI — while also making decisions about which of the mission-specific clouds being built by defense components should be allowed to survive.

But in testimony to the House Armed Services Committee last week, Dana Deasy, DoD’s chief information officer, said all of that work is effectively on hold until the dust settles around JEDI.

“The longer we delay standing up a JEDI capability, the military services are going to need to go solve for mission sets, and they’re going to continue to stand up their own individual environments. I don’t see that as being beneficial over the long term to the department,” he said. “The fine line we’re walking right now is to not impede the need for mission success — where people are standing up [their own] clouds — and as soon as we can, provide clarity to the DoD on when the enterprise cloud will be available and then redirect those activities onto JEDI.”

The strategy the department released on Feb. 4 envisions a universe in which an overwhelming majority of the military’s systems and data are housed in the JEDI cloud, partly because officials believe that is the only reasonable approach to eliminating DoD’s existing IT stovepipes and making its vast data holdings available to artificial intelligence algorithms.

At the same time, the DoD CIO is supposed to comb through the roughly 300 cloud projects various DoD components have already begun, and decide which are candidates for the “fit-for-purpose” clouds that won’t fit within JEDI.

But Deasy made clear that neither of those things can happen until the JEDI matter is resolved.

He said he currently believes that up to 90 percent of the new applications the military develops going forward should be designed for cloud architectures, and should be able to operate within the “general purpose” cloud.

“But the big thing hanging out there right now is until we know what that architecture and that cloud’s going to look like, it’s very difficult to start estimation exercises,” he said.

Consolidating clouds

As for the strategy’s promise to begin determining which existing clouds will be allowed to continue operating, that work is also on hold.

“That is something we still have to do,” he said. “Right now, obviously, our focus is to make sure we know what the architecture is going to look like for our general purpose, which will help inform us on things that will stay fit-for-purpose, or move over. I would be surely guessing as to a certain percentage of a number of those 300 that will be migrated onto general versus fit-for-purpose until we understand the overall architecture.”

Deasy said his office believes it will have a clear enough picture of JEDI’s eventual architecture about 60 days before it comes online, and at that point, will be able to start making decisions about which applications can transition to the general purpose cloud.

But the “go-live” date is still highly uncertain.

DoD had initially planned to make a JEDI award by April of this year. But in a legal disclosure last month, Chandra Brooks, the project’s contracting officer, said she is conducting a new investigation into allegations of conflict of interest involving Amazon Web Services and a former DoD employee, Deap Ubhi.

In an affidavit to the Court of Federal Claims, Brooks said even after the investigation is complete, she will wait another 90 days before making an award.

The military services are proceeding on the assumption that something resembling DoD’s JEDI vision will eventually exist, but are not waiting around to begin at least some large-scale transitions to commercial cloud providers.

The Navy, for example, is already in the process of moving some of its largest business systems — including its main enterprise resource planning system and several personnel databases — to the cloud, using its own contracts.

It remains unclear whether those will eventually be deemed “fit-for-purpose” clouds that will survive the JEDI transition, or whether they will be subsumed into JEDI.

“Defining what those are will be key as we move forward,” said Ruth Youngs Lew, the Navy’s program executive officer for enterprise information systems. “We have a couple of Navy cloud contracts right now, but those are intermediate steps. Our plan is to fully transition to JEDI at some point in the future, when they get it awarded.”


Air Force looks to build ‘big idea pipeline’ to expand industrial base

The Air Force spends north of $40 billion on acquisition and R&D each year, a sum that, in theory, ought to be plenty to sustain a healthy and competitive industrial base.

But one of the service’s big problems, according to its acquisition chief, is that those dollars tend to be concentrated among only a relative handful of major systems. Meanwhile, the big investment decisions the Air Force makes to introduce new technologies are several years apart from one another.

“It takes decades between fighters, decades between tankers,” said Will Roper, the assistant secretary for acquisition, technology and logistics. “When the Air Force started off, it was years — in the small single digits. A fundamental flaw that we have in working with the industry base is that we don’t do enough big ideas, enough prototypes, enough diversifying, because the frequency of our awards is too slow.”

No ‘valley’ in a pipeline

Among the disadvantages in the Air Force’s habit of making large, infrequent spending decisions is that there aren’t many opportunities for new competitors to gain a foothold in the market, Roper said.

So, as a remedy, he said the service will create a “big idea pipeline” in which it will commit itself to funding a greater number of technologies and at a faster clip. And he suggested the Air Force’s funding commitments will go beyond small prototypes: They’ll need more institutional commitment behind them to carry new technologies across the “valley of death.”

“There’s no valley in a pipeline, right? A pipeline keeps flowing. It’s important that we adopt this mentality, because a pipeline is inherently competitive,” Roper told the Air Force Association’s annual winter symposium in Orlando. “We can’t predict the future. If I told you what 2030 would be like … the future is so uncertain that it doesn’t make sense to predict the adversary at that timeframe and then build the Air Force that beats them. That’s too risky and it’s too prescriptive. We need to be creating new concepts constantly that challenge what our opponents think about us, that impose cost, that make them react to us, that pull them off their game plan. And right now there’s not a concerted effort to do it.”

The Air Force is already in the midst of an expansive review of its existing science and technology policies and spending. Heather Wilson, the Air Force secretary, announced the study in September 2017, saying she wanted the service to find ways to expand its science and tech partnerships in ways that would be directly relevant to warfighting in the coming decades.

Wilson told last week’s conference the study is “very close” to being finished, and Roper suggested the service planned to use the results to inform its “big idea pipeline.”

As one example of how the new approach might feed into real-world systems, he offered the forthcoming Advanced Battle Management System.

ABMS is the Air Force’s notional replacement for J-STARS, the airborne surveillance system it uses to track enemy forces on the ground. Instead of replacing one aging surveillance airplane with a newer airframe, the service has decided to try to deliver the same capabilities via a networked system of sensors that it hopes will be more survivable in future combat scenarios.

“We are going to have to break it up and put part of it in space, part of it in air, potentially have an attritable layer, and have the networking and data sharing amongst all of it so that it works as one organism, even though it really is a separate diversified family of systems,” he said. “We’ve never done anything like that. So [it] has to be shepherded differently, because it will bring in next gen technology, but in a framework that will move forward into a program of record.”

Using savings from weapons sustainment

Feeding the pipeline will require some money beyond the $2 billion the Air Force already spends on science and technology. And Roper says he hopes to find much of it by squeezing savings out of the service’s existing weapons system sustainment costs, which currently make up about 70 percent of the total lifecycle spending for an average weapons platform.

Cutting those costs by as little as 5-10 percent would go a long way toward making room for additional “big ideas” that would not otherwise see the light of day in the Air Force’s current funding portfolio, he said.

And Roper said there’s good reason to believe a significant amount of money can be wrung from sustainment, because even though the Air Force’s logistics workforce is “defying gravity” with the quality of their work, the depots they work in are not always equipped with state-of-the-art technology and business processes.

“If you go across this enterprise, it’s not a lot of technology – not the things you would see in commercial industry. You’re not going to see a lot of 3D printers. You’re not going to see data analytics. And if we look at what that’s doing in commercial companies, it’s saving a lot of money and actually increasing efficiency. Additive manufacturing is a novel concept for us, but historically it’s not novel. Militaries in the past, if you went back and we asked Julius Caesar, he had to be able to shoot arrows and also make them, right? We’re getting back to the point where we can make things.”

But increasing the Air Force’s uptake rate for new technologies isn’t just a matter of making room for them in its budget.

Roper said the service realizes it also needs to make itself a much more attractive customer for smaller, non-traditional vendors, and that its acquisition workforce needs to become comfortable working with multiple types of companies — from huge defense primes to small startups.

As one way to address the latter group, the service is planning an event in New York later this week where it will sign $40 million in contracts with small firms that make dual-use products. At the “Pitch Day,” each company the Air Force selects out of a group of 60 will be given a contract that’s no longer than one page, and the service will pay the firms on the same day, using government purchase cards.

“Pitch day will be our first entree to have an open door to nontraditional companies with dual-use technologies,” Roper said. “But it needs to be more than that. Prototyping is helping our defense industry base, but we need to do more than that. We will shepherd through the first generation of big ideas, but we need to be discussing those at every AFA.”

Read more of the DoD Reporter’s Notebook


Court puts JEDI lawsuit on hold while DoD investigates conflicts of interest

A federal court has agreed to temporarily delay any further proceedings in a lawsuit challenging the Pentagon’s JEDI Cloud contract while the Defense Department conducts its own investigation into whether the procurement was compromised by conflicts of interest.

In a one-paragraph ruling on Tuesday, Court of Federal Claims judge Eric Bruggink ordered a stay in the case, filed by Oracle. The court had previously been scheduled to hear oral arguments on the company’s bid protest during an April 4 hearing.

But the government filed a motion on Tuesday asking that the case be put on hold. That motion was filed under seal, but according to Bruggink’s order, DoD requested it because it is “reconsidering whether possible personal conflicts of interest impacted the integrity of the JEDI Cloud procurement.”

Oracle has long contended that at least two Defense officials with ties to Amazon Web Services helped shape the procurement in a way that favored AWS from the outset, including by insisting that the contract, worth up to $10 billion, be awarded to only one vendor.

In particular, the company has pointed to Deap Ubhi, a former member of the Defense Digital Service who it claims had a “personal and substantial” role in the early days of JEDI procurement planning. Ubhi at that time was a former AWS employee, and has since returned to the company. Oracle alleges he was also in negotiations for AWS to purchase his startup company during his tenure at DDS.

As one indicator of Ubhi’s role, Oracle, in a court filing last week, cited an internal September 2017 message to him from Sharon Woods, DDS’s general counsel, saying that she was “nervous” about the single-award approach he had been advocating, and asking for a more detailed explanation for why it made sense.

Later, “Ubhi drafted the ‘Problem Statement’ for DoD leadership that ‘explains the problem we are solving with this initiative,’ including ‘why can’t we solve the problem with multiple clouds,’ and ‘why is only one cloud a truly necessary requirement,’” according to last week’s court filing. “Ubhi also contributed to the Business Case Analysis, a document that ‘serv[ed] as a foundation’ for JEDI.”

The company claims there was a similar conflict of interest on the part of Anthony DeMartino, who served as Shanahan’s chief of staff during the early stages of the JEDI planning effort.

DeMartino had previously done consulting work for Amazon, and because of that, department ethics officials had advised him not to participate in any matters involving AWS without prior approval from the Standards of Conduct Office. But Oracle, again, citing DoD records, claims he ignored those directions by directly involving himself in the JEDI procurement.

The conflict of interest claims are only one element of Oracle’s lawsuit.

In last week’s motion, which asked the court to issue a summary judgment in Oracle’s favor, the company said the Pentagon had settled on a single-vendor strategy for its JEDI Cloud contract from the very outset, but sought to hide that fact in order to avoid public controversy.

Citing internal records it obtained as part of the suit, it said DoD’s Cloud Executive Steering Group decided on the single-vendor approach at its very first meeting. That meeting was held just one day after Deputy Defense Secretary Patrick Shanahan signed a memo establishing the CESG in September 2017.

Lawyers for the tech firm argued that the early date is relevant, because it is one indication that the legal analyses DoD eventually prepared for Congress and contractors to justify its single-award decision were “one sided” and designed to support a conclusion its officials had already reached months earlier.

They also alleged that the government had withheld documents about those and other parts of the planning process from the Government Accountability Office during an earlier protest, in which GAO ruled in DoD’s favor.

“The [contracting officer’s] July 17, 2018 memorandum (prepared months after DoD chose the administrative convenience of a single award) prejudicially violates the law,” attorneys wrote. “The CO did not meaningfully consider the benefits of competition, arbitrarily inflated the cost of competition, and undermined Congressional policy.”

Although federal law tells agencies to give preference to multiple-award ID/IQ contracts whenever they can, Oracle claims DoD took several steps to restrict competition from the early stages of the procurement.

Among the decisions it’s challenging in the suit is the Pentagon’s insistence that cloud competitors have their services authorized by the government’s FedRAMP program in at least three physical data centers that are at least 150 miles apart.

The company claims those prerequisites have more to do with reducing the number of potential bidders than with actual performance on the JEDI program, since the contract will use its own cybersecurity plan, not FedRAMP. Another of the “gate criteria” Oracle criticizes as irrelevant is the requirement that the winning firm’s JEDI revenue not be larger than the commercial cloud business it had as of January 2018.

“The gate criteria served their improper purpose,” attorneys wrote. “When first announced, JEDI sparked intense interest in the contracting community: Hundreds of companies attended the industry day, and more than sixty companies submitted detailed RFI responses identifying extensive, sometimes unique capabilities. But following the issuance of the final RFP and its restrictive gate criteria, only four companies attended DoD’s in-person question and answer session and ultimately submitted proposals.”

In its earlier decision, GAO acknowledged the preference in federal law for multiple-award contracts, but found that DoD has wide discretion to take advantage of exceptions to the rule for national security reasons.

GAO did not rule on Oracle’s conflict of interest claims, because it said any conflicts wouldn’t be relevant until and unless AWS actually wins the contract.

But at the Court of Federal Claims, Oracle has maintained that the alleged conflicts had a direct bearing on how the Pentagon structured the contract in the first place.

 

Read more of the DoD Reporter’s Notebook


Navy plans 3 OTAs to modernize its networks

Later this year, the Navy will make billions of dollars in awards for the next phase of its Next Generation Enterprise Network contract. But the longer-term future of the Navy’s ashore networks might be determined by a series of smaller, upcoming other transaction authorities.

Next week, the Navy plans to issue three separate problem statements to pave the way for what it’s calling Modern Service Delivery: A cloud-centric concept that posits that its IT users ought to be able to access roughly the same services, no matter whether they’re physically located on a Navy installation or are remotely connected via a mobile device.

“Today we have capabilities available to us when we’re on the network, and then a different set when we’re mobile. We want to drive parity for access to services and systems and data, whether we are at work, at home or on the go,” said Andrew Tash, the technical director for the Navy’s Program Executive Office for Enterprise Information Systems (PEO-EIS).

Speaking at the Navy CIO’s annual San Diego conference, Tash said the three OTA projects PEO-EIS is launching represent its latest approach to a rapid technology development process it first began in 2015 called the “innovation cell.” It plans to manage the awards through the Information Warfare Research Project, a $100 million OTA vehicle the Navy’s Space and Naval Warfare Systems Command established last summer.

As a starting point for ubiquitous data access, the Navy thinks it needs a new approach to identity and access management. So the first problem statement will ask vendors for an “integrated suite” of identity management capabilities that can work across the service’s systems.

“Identity is the new boundary. It assumes breach and focuses our security in a different place: it focuses on the data, and it focuses on user behavior as opposed to the network boundary,” Tash said. “That’s really the main thrust of our strategy, getting away from tightly-coupled services that force people to be on a network to access them. It’s just users connecting to data and services. It may seem pretty simple, but it removes a lot of silos and bad IT behavior that we’ve grown up with over the past 15 or so years.”

The Navy is already beginning to move some of its largest business IT systems — including its main enterprise resource planning system and some of its personnel databases — into the cloud.

And since its key data assets are already beginning to live in a mixture of on-premises systems and commercial hosting environments, giving location-agnostic access to all the data a particular user might be entitled to isn’t just a matter of creating a secure tunnel to the Navy-Marine Corps Intranet.

So the second problem statement will aim to build out the concept of Network-as-a-Service — a structure that would virtualize the Navy’s networks in the same way cloud computing lets it virtualize its servers.

“If all of our business systems are moving to commercial cloud, then shouldn’t we have the most efficient connectivity to the commercial cloud, to the point where I can do 100 percent of my job from commercial cloud services? So now we’ve got to rethink what the [DoD Information Network] means in that context,” Tash said. “Yes, the data is part of the DoDIN, but how we consume that and how we inter-operate with the rest of the DoD community is up for debate. But we look at the business systems as being the primary opportunity right now to leverage things like network as a service.”

The third task for the IWRP consortium’s members will be to identify technologies the Navy could use to manage its users’ access to the systems it’s moving to commercial cloud environments.

That approach, called Cloud Access Security Broker (CASB), tries to monitor users’ interactions with cloud services and enforce an organization’s security policies even when users connect directly to the cloud provider from an “off-premises” location like a home network or mobile device.

Cloud future depends on JEDI

Many of the long-term details of the Navy’s cloud-dependent vision will depend on the final disposition of DoD’s forthcoming Joint Enterprise Defense Infrastructure (JEDI) contract.

The highly-contentious solicitation is still tied up in federal court, but in its new cloud computing strategy earlier this month, the Pentagon made clear that it expects the military services and Defense agencies to use JEDI as their primary solution for cloud services.

But Navy officials say they are not waiting for JEDI before beginning some of their major cloud transitions. Navy ERP is the first major application it’s moving to the cloud, but the service is also in the process of consolidating nine separate manpower and personnel databases into a single authoritative data source; it’s looking, for now, to put that information in Amazon Web Services’ GovCloud.

Notionally, the DoD strategy allows the military services to keep their data in non-JEDI clouds, called “fit-for-purpose” clouds, but only by special permission from the DoD CIO.

“Defining what those are will be key as we move forward,” said Ruth Youngs Lew, the Navy’s Program Executive Officer for Enterprise Information Systems. “We have a couple of Navy cloud contracts right now, but those are intermediate steps. Our plan is to fully transition to JEDI at some point in the future, when they get it awarded.”

Read more of the DoD Reporter’s Notebook


With new legal authority, Pentagon says it’s turned corner on enterprise IT

When Congress gives the Defense Department new reform-oriented legal authorities, there’s no guarantee that DoD will actually pick them up and run with them.

But when it comes to the powers that officially took effect for the DoD chief information officer at the beginning of this year, officials don’t intend to waste any time.

The new laws, meant to give the Pentagon more authority to enforce common IT standards throughout the military services, are coming into play already, officials told the Senate Armed Services Committee last week.

After nearly a decade in which DoD has used more carrots than sticks to push IT consolidation, they suggested that the department will take a much more directive approach over the coming year.

“You have a set of leaders that are very impatient, including myself, that are done admiring the problem and are moving on to tasking,” DoD CIO Dana Deasy said. “This includes being less tolerant with people being able to go off and use their own solutions. The authorities that you all gave me starting this year around being able to set architectural standards are quite significant, and we are now starting to use those new authorities.”

Defense officials offered few specifics about the types of changes they intend to direct, but much of the way forward will be laid out by a cross-functional team led by Brig. Gen. Dennis Crall, the principal deputy cyber adviser to the secretary of Defense.

Crall said the department spent most of the last year setting the stage for the reforms it plans to implement, including by publishing a new cyber strategy. This year, he said, will be all about starting to deliver meaningful results.

“So while it’s a good year for implementation, I would say it may not be a good year for some other things,” he said. “The first is stovepiped solutions. It’s a bad year for those who like to approach this in a way that we have endless niche capabilities and do business their own way: lack of standards, individual development, and difficulty in integrating. We’re putting an end to that practice, which has really robbed us of success. It’s also a bad year for those who don’t like measures of effectiveness or discussions on data-driven return on investments. We owe an accountability for how we’ve spent our money and also a level of accountability on what capabilities we’ve achieved in the expenditure of that money and effort.”

The sentiments Deasy and Crall expressed appeared to be in tune with the frustrations Congress laid out when it passed the 2018 National Defense Authorization Act, which contained the new authorities.

Specifically, the legislation gives the CIO the authority to set departmentwide IT and cyber standards, plus the mandate to conduct annual reviews of the military services’ budgets to ensure they’re setting aside enough funds to implement a common cyber approach.

“Countless efforts across the Department of Defense are plagued by poorly enforced standards and a CIO position whose policy and guidance are largely considered as advisory by the services,” the Senate Armed Services Committee wrote in a report accompanying the legislation. “As a result, each service continues to pursue disparate information technology and business systems efforts.”

That same year, lawmakers vented frustration that the cyber advisor position now held by Crall has historically been a weak player in the Pentagon’s overall power structure.

“The office of the [principal cyber adviser] has been chronically under-resourced since its establishment, and (members) are concerned about the impact of under-resourcing on the PCA’s ability to effectively execute its assigned roles and responsibilities,” they wrote in the final NDAA’s explanatory statement. “The conferees believe that the PCA should be robustly manned and resourced.”

But Crall suggested the tide had turned since that language was written. He said the cross-functional team Congress ordered to staff the office has proven effective over the past year.

“Congress got that right. The cross functional team works, and it has several advantages,” he said. “It’s only as good as it’s paid attention to, but the cross functional team that’s involved under the PCA is well-resourced in the sense that we’ve got the right people. The participating agencies that provide representation of the workforce sent us their best. We got good people. The second piece is we can approach problems in ways that don’t have some of the biases. We don’t have any stake in the legacy systems that we hold onto, it really is about the mission. So we normally come to the table with an advantage in solving some of those problems. It’s been instrumental in moving the strategy into implementation.”

DoD first started drawing up plans to collapse its legacy networks and IT services in 2010 under an initiative known as the Joint Information Environment, and began taking serious steps toward implementation — with an emphasis on shared cyber infrastructure, enterprise services and cloud computing — in 2012.

However, the Pentagon has struggled to herd the large number of cats involved in implementing the vision, and to define the project’s cost and scope, according to the Government Accountability Office. And in recent years, the “JIE” moniker has begun to fall out of favor: It was not uttered a single time during last week’s 90-minute hearing on cyber standards.

In its annual report last week, the office of the Director of Operational Test and Evaluation did offer a short description of JIE’s status, but said that DoD’s Joint Regional Security Stacks — one of its key foundations — is performing poorly, and “calls into question the current JIE cybersecurity approach.”

Read more of the DoD Reporter’s Notebook


New DoD strategy touts multiple clouds, but with bias toward JEDI

A long-awaited update to the Defense Department’s cloud strategy gives a nod to the “multi-cloud” approach many of its vendors have urged the Pentagon to adopt. But the same document makes clear that the department still intends to move the majority of its applications to a cloud service operated by a single vendor.

The new strategy, released on Monday, envisions a mix of hosting environments: A “general purpose” cloud that will be provided through the department’s controversial JEDI contract, several “fit for purpose” clouds operated by other vendors and some remaining DoD-operated data centers that will continue to house applications that still aren’t cloud-ready.

But the goal is for systems to migrate to the “general purpose” infrastructure- and platform-as-a-service cloud unless there’s a compelling reason not to.

“Only when mission needs cannot be supported by general purpose will fit for purpose alternatives be explored,” Defense officials wrote in the strategy, signed by acting Defense Secretary Patrick Shanahan. “In such a case, a mission owner will be required to submit for approval an exception brief to the office of the DoD CIO describing the capability and why the general purpose cloud service does not support their mission.”

The document also makes clear who will be in charge of implementing, overseeing and governing the department’s cloud plans: DoD’s chief information officer. Initially, the JEDI effort had been led by a steering group made up of multiple DoD organizations; the first instantiation did not include the CIO’s office as a voting member.

“At some future date, once the general purpose cloud environment is fully implemented and fit for purpose implementations have matured, it is possible that overall leadership could be transitioned to a different organization inside DoD,” officials wrote. “The DoD CIO will establish an enterprise cloud organization with appropriate leadership and the required governance forums to ensure that overall objectives and implementation plans as described in this strategy are enacted. The DoD CIO will leverage existing governance forums to the greatest extent possible.”

DoD CIO task list

One of the CIO office’s tasks will be to begin scrutinizing the cloud environments the military services and defense agencies have already created or contracted for on their own. The office will work with those DoD organizations to either come up with “thoughtful” migration strategies to move to JEDI, or potentially give the clouds they’re already using an official “fit for purpose” blessing.

But the strategy strongly suggests that many of those existing clouds will not survive the process, because the Pentagon believes a proliferation of cloud environments across the department is already hindering interoperability.

“A lack of guidance has led to departmental inefficiencies and has hindered the department in IT modernization efforts,” the officials said. “It has led to disparate efforts with siloed teams, disjointed implementations with limited capability, siloed data and inefficient acquisitions that cannot take advantage of economies of scale.”

But more formal and written guidance is coming, officials promised. It will be drawn up in cross-agency forums organized by the CIO. Those forums will also come up with detailed implementation plans for how the department will move to the cloud.

The strategy itself offered few details about how DoD will accomplish its cloud migrations. Rather, it reads largely as a statement of principles and an explanation for why the department believes it must finally adopt the “cloud first” stance that the entire government was supposed to embrace more than eight years ago.

But the document did lay out five “lines of effort” that DoD considers essential and that will form the basis for a still-forthcoming “cloud migration playbook.”

Those are:

  1. Technical Build
  2. Governance
  3. Automated provisioning and billing
  4. Migration capability
  5. Workforce development

In a statement, DoD CIO Dana Deasy said the strategy would pave the way for his office to consolidate the department’s IT services into a construct that’s more secure, scalable and reliable.

“This marks a milestone in our efforts to adopt the cloud and also in our larger efforts to modernize information technology across the DoD enterprise,” he said. “A modern digital infrastructure is critical to support the warfighter, defend against cyberattacks and enable the department to leverage emerging technologies like machine learning and artificial intelligence.”

Read more of the DoD Reporter’s Notebook

