Despite what may have been the best of intentions, the Navy, like the rest of the Defense Department, has struggled mightily over the last several years to migrate its systems to the cloud. Although most of its public-facing websites have made the move, all but about 20 of its internal applications are still operating in legacy Navy-owned data centers.
But officials say they’re taking several steps over the coming year that are meant to remove cloud transition bottlenecks.
One is to expand the number of cloud options available to Navy commands and system owners. As of now, the service still has just one broadly available commercial cloud contract: the limited-scope agreement for Amazon Web Services that’s allowed it to migrate all of those public websites.
But the Navy expects to make a much larger “enterprise” contract award by June 2018 with the explicit purpose of accelerating cloud adoption throughout the service, said Dan DelGrosso, the technical director for the Navy’s Program Executive Office for Enterprise Information Systems. A draft request for proposals should be released to industry before the end of this calendar year.
“We have about 1,200 applications and systems scattered all across CONUS and overseas. Our job over the next five years is to fix that,” he said at AFCEA Northern Virginia’s annual Navy IT Day. “Our goal, our position, is to move 100 percent of them to the commercial cloud. That means Amazon, that means Microsoft, that means Google, IBM and so on and so forth. We recognize the fact that that may not be the case for some of the applications and systems that are out there, but our default position is 100 percent of it will move to the commercial cloud.”
The Navy wants the new enterprise vehicle to offer a wide variety of options, including infrastructure-as-a-service, platform-as-a-service and software-as-a-service. In addition, its San Diego-based SPAWAR Systems Center Pacific awarded a cloud contract in September that the service intends to use for DevOps.
Officials have previously said they intend to treat 2018 as a “bridge year” during a five-year plan to move most systems (up to and including top secret ones) to commercially operated cloud environments by 2021. In 2018, the Navy has enough capacity in its existing arrangement with Amazon (and Red River, its current cloud systems integrator) to move 50 more applications to the cloud before it starts to leverage the larger contract it plans to sign next year.
The Navy is also using the bridge period to tightly define its cybersecurity requirements for future cloud migrations, including by drafting new, standard contract language that will have to be incorporated into each of its cloud agreements with industry. Rear Adm. Danelle Barrett, the Navy’s new chief information officer, is leading that project.
“We shouldn’t have a whole lot of cloud contracts, because it gets real complicated in terms of responding to cybersecurity incidents, and less is more, but the contract language will put the service providers on notice that it is still our data even though it’s in your environment,” DelGrosso said. “At the same time, there will be a very close partnership with these providers in terms of being able to see our logs, being able to hunt, being able to do incident response in partnership as though we were sitting right next to them. In fact, I would not be surprised if there comes a day when we are physically sitting side-by-side with our providers watching our data in the cloud.”
Long before things get to that point though, the Navy needs to develop more of its own internal technical expertise on cloud computing, something DelGrosso said is sorely lacking at the moment and that will also have to be incorporated into future cloud contracts.
“Let’s just talk reality here: we don’t have a very deep bench when it comes to cloud experts inside the Navy. We have engineers at our warfare centers, we have some others scattered around that kind of know cloud, but don’t really know cloud. There’s probably a handful of those in the Navy,” he said. “So we’re going to put training in our cloud contracts, because we need to. The intelligence community does that today, and we’re going to replicate that.”
The intention is to disperse those newly minted cloud experts broadly across the service so that the Navy has a multitude of entry points to help commands transition their systems to the commercial cloud.
Until this point, the Data Center and Application Optimization office within PEO-EIS has served as the Navy’s sole cloud broker. DCAO has been in charge of providing system owners with technical advice about how to transition their legacy applications to the cloud or other data center environments, and, at one point, planned to open its own “cloud store” to serve Navy customers with multiple service offerings from multiple commercial vendors.
But DCAO’s broader menu of cloud options never came to fruition, and DCAO itself will shut down in 2018 once it’s finished its primary mission of moving applications out of 118 data centers already targeted for closure.
After that, the Navy will move to a fee-for-service model in which it envisions a “franchised” network of cloud brokers working within various commands and functional areas to help system owners procure cloud services and migrate their systems. The Navy’s CIO office will oversee the decentralized model.
“We’re going to learn as we go, we’re going to work with DoD on how they’re doing business and try to align with that to the greatest extent possible, and then we’ll go from there,” DelGrosso said. “The goal between now and the end of 2018 is to build the landscape, build that highway for the app, system and functional area managers to succeed from 2019 and beyond, until all of our data, to the greatest extent possible, is in the cloud. There will be exceptions, but those exceptions will have to show due diligence as to why someone or something or system cannot go to the commercial cloud. We are going to be very, very particular about that.”