The never-ending talk about cloud computing makes it seem like agencies have fully bought in and everything is going to the cloud.
But a recent event with several federal technology executives showed just how far cloud and open source have to go.
Stan Kaczmarczyk, the General Services Administration’s director of the cloud computing services project management office, said July 21 at the GovExec and Red Hat cloud conference in Washington that pickup on the infrastructure-as-a-service (IaaS) blanket purchase agreement has been decent, but nowhere close to the government’s $76 million estimate when GSA awarded it almost five years ago.
Kaczmarczyk said about $55 million in contracts have been awarded under the BPA, mostly over the last 3½ years, because it took time for the vendors to receive cybersecurity approvals.
He said GSA is hoping to extend the BPA, which ends in October, for another six months as the agency figures out its plans for the next generation of cloud contracts.
“We have a couple of things we are considering new contracts for,” Kaczmarczyk said.
GSA issued a request for information for a new cloud contract earlier this summer and received about 76 responses. Kaczmarczyk said GSA is reviewing the responses as part of its plans to develop a new multiple award contract.
And it’s not just IaaS. GSA’s email-as-a-service (EaaS) BPA also hasn’t generated much attention from agencies. Kaczmarczyk said the five-year BPA is halfway through its lifecycle with an estimated value of $2.5 billion, and agency use is nowhere near that ceiling.
GSA is working with the Defense Information Systems Agency on a memorandum of agreement to make the EaaS BPA a major force in the Defense Department.
“We think there will be a lot of traction for moving to the cloud in 2016 and 2017,” he said. “We have a budget process, the FAR and things don’t move as fast as some would have hoped.”
That’s true for the Federal Communications Commission as well, which said it plans to move a majority of its IT infrastructure to the cloud by 2017.
The hesitation around cloud isn’t measured only by the use of GSA’s BPAs; in many cases, agencies are still in the “crawl” stage.
Take the Transportation Department. Maria Roat, the agency’s chief technology officer, said there have been a lot of one-off contracts across DoT, and the department has deals with “quite a few” cloud service providers.
Roat said DoT is trying to standardize its move to the cloud, taking the stance that cloud is just an enabler and looking at it from a big-picture perspective.
“When I came on board, I asked the folks who were doing all the governance and managing the investments to print out what was going on for fiscal 2015, 2016 and 2017 around cloud. I said I want to know what the spending is for this year and the next two years,” she said. “It was an interesting report. There wasn’t a lot of money reported on it, and it was not entirely accurate. Fast forward probably about five months after I was on board, somebody calls and says, ‘So and so just awarded a contract, and by the way it’s to a cloud provider.’ Really? I pulled up the little sheet and guess what? Zero, zero and zero for this year and the next two years for cloud spending. It turns out that as contracts are issued, we have a lot of work to do around the investment and governance process, because the contract went through the processes, but contracting didn’t catch that it was a services contract and behind that service was a cloud. They didn’t catch it and they didn’t understand it.”
Roat said DoT is working closely with the acquisition community so they can recognize and share information as it relates to cloud contracting.
“I’m trying to look out over the next three years at where things are moving, because it’s not only data centers, but how things are changing and what the technologies are. As we move to common platforms around geospatial, some of the things we are doing around data and building out hybrid cloud solutions, I need some insight into those investment dollars,” she said. “I think the Federal IT Acquisition Reform Act is going to help with that visibility, but it will take some time.”
DoT’s challenges with contracting for cloud aren’t unusual.
Wolfe Tombe, the CTO for the Homeland Security Department’s Customs and Border Protection, said his agency experienced a major email outage early on because a blade server failed and there was not a clear enough definition of what cloud means.
“We started asking some really directed questions. How is it conceivable that a failure of a blade could bring down your cloud? How do you define cloud? How is it architected? How is it set up? What are your automation controls? What are your elasticity capabilities?” he said. “It was a critical lesson for us to learn. The vendor now is offering a real cloud service. We’ve learned to put standard definitions into all of our cloud contracts.”
He said CBP makes sure vendors acknowledge these requirements, and if they are missing any of them, they are not offering an actual cloud.
Tombe said email in the cloud was CBP’s first foray into the technology four or five years ago, and it has taken time to adjust to the new oversight and requirements.
At the same time, Tombe said he’s trying to get CBP to adopt an open source first approach to all IT.
He said from a mission and cost perspective, it just makes sense to use non-proprietary software to the greatest extent possible.
“This is still a work in progress, but we really should look to see if there is an open source solution to a problem first and if not, look at the proprietary solutions,” Tombe said. “I think in some ways, it’s far more secure. All CIOs are heads down trying to make sure their systems, networks and apps are secure. But guess what, when it comes to a proprietary technology, I can’t tell for sure that technology is secure or not. Why? I can’t look inside it. I have to go to the vendor and say, ‘Can you tell me if it’s secure, please?’ and of course the vendor says, ‘Of course, it’s secure. It’s absolutely airtight, and don’t forget to patch every Thursday.’ That makes me feel really warm and comfortable. With open, I can look at it and I can do an analysis of the code.”
He said there is a lot of value and benefit with open source, and it’s definitely something to look at as CBP moves deeper into the cloud.
Tombe said he’s still trying to get buy-in from CBP executives, but it’s obvious open source is starting to gain momentum in government.
The General Services Administration issued a policy in July 2014 for the agency to consider open source first. The Justice Department in December released two open source application programming interfaces (APIs). And the Office of Management and Budget is expected to issue a new policy on open source in government by the end of December as part of the administration’s commitment to open government.
This growing trend is something vendors and agencies alike should pay close attention to over the next year.
This post is part of Jason Miller’s Inside the Reporter’s Notebook feature.
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.