Who is watching over “cloud town” to preempt vendor lock-in, track and enforce service level agreements (SLAs), and ensure that when a new cloud provider says they are backing up your data locally in the United States, that they aren’t in fact sending it overseas to take advantage of cheaper hosting options and pocketing the difference?
After all, how would you ever really know?
For years the General Services Administration (GSA) has contemplated the viability of implementing a cloud service brokerage model. The primary goal is to provide agencies with affordable, yet secure, access to cloud services.
An ambitious undertaking, without question. But according to Market Research Media, “the U.S. Federal cloud computing market will surpass $10 billion by 2020, growing at CAGR 16.2 percent in the period 2015-2020,” and CIO.com reports that overall public-sector spending on the cloud is up 72 percent since 2012 and poised to reach $6.5 billion by 2017. So with more and more federal agencies making large investments to move away from traditional agency-specific private data centers to hybrid and public cloud offerings, doesn’t it make sense to create a governmentwide indefinite delivery, indefinite quantity (IDIQ) vehicle with closely coupled brokerage services to facilitate this cloud movement?
Yes, but like everything in life, it’s much more complicated than it appears.
In general terms, a broker is a person or entity that acts as an intermediary between two or more parties during negotiations. A cloud broker is a third-party individual or business that acts as an intermediary between the purchaser of a cloud computing service and the seller(s) of that service.
This broker model works quite well for services like airline travel. Take Orbitz, for example: it collects your “requirements,” such as travel dates and origination and destination airports, and returns airline and pricing results based on those simple criteria.
Orbitz really doesn’t get into the details about how much headroom, elbowroom or legroom you will have on different flight options, or what kind of drinks and food will be served on that flight; it just offers you a seat to and from a location on a certain date — relatively simple.
Now compare that to complex IT solutions. Even at the infrastructure-as-a-service (IaaS) layer, where you would expect brokering services to be somewhat straightforward, there exists a wide variety of virtual machine (VM) flavors, both within and across vendors, each with compatibility, performance and cost implications not altogether obvious to the typical business consumer. As you start moving up the tech stack to platform services (middleware, OS and database instances, development tools, etc.), the number of options becomes unwieldy, and often paralyzing, to consumers. And as we know all too well, increased IT complexity almost always translates to increased risk.
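To make the Orbitz analogy concrete, here is a minimal sketch of what a broker's "compare flights" step might look like for IaaS. The vendor names, flavor names and hourly rates below are invented for illustration, not real price sheets.

```python
# Hypothetical sketch: normalizing VM "flavors" across vendors so a broker
# can compare them on common dimensions. All vendors, flavors and prices
# here are made up for illustration.
from dataclasses import dataclass

@dataclass
class VMFlavor:
    vendor: str
    name: str
    vcpus: int
    ram_gb: int
    hourly_usd: float

    def cost_per_vcpu_hour(self) -> float:
        return self.hourly_usd / self.vcpus

# A broker-maintained catalog, normalized across vendors (illustrative data)
catalog = [
    VMFlavor("VendorA", "a.medium", 2, 4, 0.10),
    VMFlavor("VendorB", "std-2", 2, 8, 0.12),
    VMFlavor("VendorC", "c4.large", 4, 8, 0.22),
]

def candidates(min_vcpus, min_ram_gb):
    """Return matching flavors, cheapest first -- the broker's 'Orbitz view'."""
    matches = [f for f in catalog
               if f.vcpus >= min_vcpus and f.ram_gb >= min_ram_gb]
    return sorted(matches, key=lambda f: f.hourly_usd)

for f in candidates(2, 8):
    print(f"{f.vendor} {f.name}: ${f.hourly_usd}/hr")
```

Even this toy version shows why IaaS brokering is harder than airfare: the "seat" itself varies on several axes at once, and the cheapest match is not always the best fit.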
As a business user, I want this complexity abstracted away.
To make matters worse, many cloud service providers (CSPs) today do not do an adequate job of identifying and reporting on SLA violations. Leaving the burden of proof to the consumer not only creates additional work for service brokers but, more importantly, puts mission-critical services, applications and sensitive data at risk.
Question: So how do we set up a cloud services brokerage model that supports easy and affordable access to cloud services for federal agencies, while also keeping our precious data safe and preserving a high level of service integrity and quality, thereby ensuring customer value?
Answer: Establish a third-party entity to serve in an unbiased capacity to keep the peace, increase transparency and ensure accountability.
This solution would comprise a well-informed and cloud-savvy SWAT team, including:
Acquisition and procurement experts — FAR knowledge a must
Contract lawyers (ugh…I know)
XaaS service subject matter experts
Cloud engineers and architects
Quality assurance experts
Security and data assurance experts
What did I miss? I need your input here!
This unbiased, independent and vendor-agnostic team would be equipped with the tools necessary to perform continuous diagnostics and monitoring of performance measures against SLAs, audit vendor charges and bills against actual metered customer usage, and investigate security procedures to expose data security exploits, ultimately aggregating massive amounts of cloud vendor data that can be mined to provide insight, awareness and decision support for government consumers.
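One of those auditing tasks, reconciling a vendor's invoice against independently metered usage, can be sketched in a few lines. The contracted rate, tolerance and record shape below are assumptions for illustration, not any agency's actual terms.

```python
# Illustrative sketch of the bill-auditing check such a team might run:
# compare a vendor's invoiced charge against independently metered usage.
# The rate and tolerance are assumed values, not real contract terms.

METERED_RATE_USD_PER_HOUR = 0.10   # contracted unit rate (assumed)
TOLERANCE_USD = 0.01               # allowable rounding slack

def audit_line_item(billed_usd, metered_hours):
    """Flag a line item whose billed amount exceeds metered usage x rate."""
    expected = metered_hours * METERED_RATE_USD_PER_HOUR
    overcharge = billed_usd - expected
    return {
        "expected_usd": round(expected, 2),
        "overcharge_usd": round(overcharge, 2),
        "flagged": overcharge > TOLERANCE_USD,
    }

# Example: a vendor bills $130 for 1,200 metered hours at $0.10/hr
result = audit_line_item(billed_usd=130.00, metered_hours=1200)
print(result)  # flags a $10 overcharge
```

The hard part in practice is not this arithmetic but obtaining metering the consumer trusts, which is exactly why an independent third party has to own the meter.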
This solution could provide a true “checks and balances” system between the consumers and the service providers, empowering a cloud service brokerage organization with the necessary real time insight to effectively manage complex service delivery and procurement ecosystems.
You know how difficult it is to enforce SLAs against your current, single IT vendor? Now imagine how that complexity scales when you have a dozen vendors, all with distinct promises and varying ways of measuring and reporting (or NOT reporting) on service levels. And how do you aggregate all of that mess into a single consumable and actionable report to enforce vendor penalties and customer credits for failed performance levels?
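As a rough sketch of what that single actionable report could look like, consider rolling each vendor's uptime figure into one summary with credits computed automatically. The vendor names, the 99.9 percent target and the flat 10 percent credit schedule are all hypothetical.

```python
# A minimal sketch of rolling disparate vendor SLA reports into one
# actionable summary. Vendors, targets and the credit schedule are
# hypothetical, not drawn from any real contract.

SLA_TARGET = 99.9          # promised monthly uptime, percent (assumed)
CREDIT_PER_MISS_PCT = 10   # service credit, % of monthly bill, per breach

vendor_reports = {
    "VendorA": {"uptime_pct": 99.95, "monthly_bill_usd": 40_000},
    "VendorB": {"uptime_pct": 99.2,  "monthly_bill_usd": 25_000},
    "VendorC": {"uptime_pct": 98.7,  "monthly_bill_usd": 60_000},
}

def sla_summary(reports):
    """Flatten per-vendor reports into one list of breach/credit rows."""
    rows = []
    for vendor, r in sorted(reports.items()):
        breached = r["uptime_pct"] < SLA_TARGET
        credit = r["monthly_bill_usd"] * CREDIT_PER_MISS_PCT / 100 if breached else 0.0
        rows.append({"vendor": vendor, "breached": breached, "credit_usd": credit})
    return rows

for row in sla_summary(vendor_reports):
    status = "BREACH" if row["breached"] else "ok"
    print(f"{row['vendor']}: {status}, credit due ${row['credit_usd']:,.0f}")
```

Real SLAs are far messier (different targets, measurement windows and exclusions per vendor), which is precisely the normalization work the brokerage team would take off the consumer's plate.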
In this new abstract, disparate, cloud-based service era, with data stored in multiple regional locations, often in multi-tenant, shared-resource environments, several questions arise:
How do you establish trust that, as your data traverses back and forth between vendors, it’s not subject to exploitation and exfiltration?
How do you ensure that the right measures are in place to prevent data leakage or corruption, and that technical failures do not result in inadvertent deletion or destruction?
In the event of a security breach for which providers are liable for compensation, in a multi-vendor environment where services are chained together, who helps untangle the legal web of liability?
Like others who are passionate about this space, I’m eager to move the ball forward and help federal agencies adopt cloud services, a move that could ultimately result in substantial and enduring cost savings for taxpayers and efficiencies for CIO shops everywhere.
In this article I’ve put together some thoughts and ideas on how we can help our government achieve this difficult task, and I’d like to enlist your talents, feedback and creativity as we further refine and ultimately deploy this idea (or some version of it).
If we are serious about standing up a cloud service brokerage framework to facilitate the movement from legacy private data centers to public and hybrid cloud environments, for all the reasons listed in this article, we will need the help of an independent, unbiased and vendor-agnostic organization to provide consumers with continuous assurance that they are getting the services they are paying for, and that their data is in fact secure. We will need a team of experts to help effectively manage complex service delivery and procurement ecosystems across multiple vendors, contracts, customers and service models.
In all likelihood, we will need a cloud sheriff.
Giles Kesteloot is the director of the Blackstone Technology Group, an IT consulting firm with its federal practice headquartered in Arlington, Va., specializing in cloud, agile and security.