Back in January, the General Services Administration set out for comment a draft acquisition letter detailing how buying consumption-based cloud services could work through the schedules program.
In the letter, GSA says this approach would provide cost transparency without burdening contractors with additional transactional price reporting requirements.
It also would promote cost efficiency by reducing the need to lock into long-term contracts in markets where prices are expected to fall.
This consumption model approach also would provide agencies with greater flexibilities to take advantage of technology improvements and better support cyber security.
Here we are nearly 10 months later, and the status of that acquisition letter is unclear. Did GSA shelve it, or is it still working through the regulatory process?
Either way, the move to a consumption-based model for cloud services is happening with or without a new policy.
Greg O’Connell, the senior director of federal at Nutanix, said there are several reasons why agencies are realizing the value of this pay-by-the-drink approach.
First, he said, is a better understanding that 75% of all cloud workloads are predictable, meaning agencies know when there will be usage spikes and when there will be down time.
Second, he said the coronavirus pandemic has proven the value of having access to this type of approach, one of the big lessons learned over the last nine months.
O’Connell said that, for both of these reasons, the combination of the consumption model and a hybrid cloud sets agencies up for success.
“The federal enterprise represents a variety of these consumption models that have really emerged out of necessity,” O’Connell said on the IT Innovation Insider show, sponsored by Nutanix. “There are many instances where it’s unknown when there will be a burst requirement. But I think agencies also recognized there are services that they put or were looking to put into the cloud that were predictable but not really cost effective for the cloud.”
Rick Harmison, the federal cloud economist at Nutanix, said that cost is one of the biggest risks in moving to the cloud.
“In a case where it’s mission critical and it’s running at a fairly stable level, we’ve found it’s better to have it in a private data center or a private cloud versus out in the public cloud,” he said. “Security and cost optimization are factors that go into that burst versus non-bursting for workloads decision.”
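The burst-versus-steady trade-off Harmison describes can be sketched as a toy break-even model. All rates and hours below are illustrative assumptions, not vendor pricing: a steady workload racks up enough pay-per-use hours to exceed a fixed private-cloud cost, while a bursty one does not.

```python
# Hypothetical break-even sketch: steady vs. bursty workload costs.
# ONPREM_MONTHLY and CLOUD_RATE_PER_HOUR are illustrative assumptions.

ONPREM_MONTHLY = 5000.0      # fixed monthly cost of private data center / private cloud capacity
CLOUD_RATE_PER_HOUR = 9.0    # pay-as-you-go rate for an equivalent public cloud instance

def monthly_cloud_cost(hours_used: float) -> float:
    """Consumption model: pay only for the hours actually consumed."""
    return CLOUD_RATE_PER_HOUR * hours_used

def cheaper_venue(hours_used: float) -> str:
    """Pick the lower-cost venue for a given monthly utilization."""
    return "public cloud" if monthly_cloud_cost(hours_used) < ONPREM_MONTHLY else "on-premise"

# A stable, mission-critical workload running roughly 24/7 (720 hours/month)
print(cheaper_venue(720))   # 720 * 9 = 6480 > 5000, so on-premise wins
# A bursty workload that only runs about 200 hours/month
print(cheaper_venue(200))   # 200 * 9 = 1800 < 5000, so public cloud wins
```

The crossover point moves with the assumed rates, but the shape of the decision is the same one Harmison outlines: predictable, always-on workloads tend to favor fixed capacity, while spiky ones favor consumption pricing.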
A recent Nutanix and Market Connections survey found agencies remain committed to this hybrid approach. Of the 150 government decision makers surveyed across civilian agencies, the Defense Department and the intelligence community, 6 out of 10 said they are considering moving, or already have moved, application workloads back on-premise or to a private cloud from the public cloud.
“It’s more than economics. While there are cost overruns and unexpected expenses associated with public cloud, there are also specific issues around data privacy and sovereignty risks of public cloud that have come into play, as well as control over applications,” O’Connell said. “There are still a lot of legacy applications and services in government, and there is a concept called data gravity that has to do with having certain applications running on-premise and other servicing applications being nearby that aren’t really capable of being supported in the cloud today. It gets back to the burden of reengineering these legacy applications.”
Harmison said another factor in the decision of which type of cloud is best is the idea of perishability versus non-perishability.
“It’s kind of like if you have home exercise equipment. This is an asset you bought for your home versus a gym membership. If I don’t use that gym membership, I essentially lose the value of it. If I have a physical piece of equipment, a server or a data center, it’s still there. If I have to push out a project or am not able to stand up a workload, I still have those servers and equipment, and they’re still available to me in a month or whenever I need them. I think that’s what you are seeing in the cloud,” he said. “Agencies are aligning procurement, development and other financial aspects that the government and a lot of our private customers are still working through. This use-it-or-lose-it concept in the cloud is a big thing, so cost optimization comes into play to make sure you are using all the resources in the cloud you are paying for.”
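Harmison’s gym-membership analogy boils down to a simple utilization calculation: a committed subscription costs the same whether or not it is consumed, so the effective price of each hour actually used climbs as utilization drops. The figures below are illustrative assumptions, not vendor pricing.

```python
# Hypothetical "use it or lose it" sketch: committed (subscription/reserved)
# cloud capacity bills at a flat rate, like a gym membership, so unused hours
# inflate the effective cost of the hours that are consumed.
# COMMITTED_MONTHLY is an illustrative assumption.

COMMITTED_MONTHLY = 3600.0   # flat monthly charge for reserved capacity
HOURS_IN_MONTH = 720

def effective_hourly_rate(utilization: float) -> float:
    """Cost per hour actually consumed at a given utilization in (0, 1]."""
    hours_used = HOURS_IN_MONTH * utilization
    return COMMITTED_MONTHLY / hours_used

print(effective_hourly_rate(1.0))    # fully used: 3600 / 720 = 5.0 per hour
print(effective_hourly_rate(0.25))   # 25% used: 3600 / 180 = 20.0 per hour
```

This is why the cost-optimization work Harmison mentions focuses on utilization: at a quarter of the committed capacity, each consumed hour effectively costs four times as much.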
Harmison said the cloud is good for workloads and applications that will change and evolve in the short term as opposed to legacy systems that are more stable.
He said using on-premise data centers or private clouds helps control runaway costs or underutilization of compute or storage.
O’Connell said another important consideration is the need for flexibility and diversity during a time of budget uncertainty.
“We are seeing this more and more with the complexity of these contracts and these relationships with managed services, on-premise, private cloud, hybrid cloud and multi-cloud environments and the like. This concept of subscription-based services allowing an operational expense (OpEx) option gives agencies that much-needed flexibility to purchase what they need, when they need it, and provides a degree of stability at a time when budget instability is an annual part of government business due to the continuing resolutions we are seeing,” he said. “In addition, in this COVID environment, the complexities that you are layering on in terms of contracting availability and delivering services in these conditions all come into play.”
Harmison added the pandemic has shown the cloud also can be the big enabler everyone expected it to be nearly a decade ago. He said agencies can take on projects more quickly because of the agility and limited capital investment.
Basically, what it comes down to, Harmison and O’Connell say, is that moving to the cloud makes sense for the right workloads, and relying on the consumption model will help reduce the risks of cloud services.