Four keys to managing public cloud cost
Since the days of Cloud Smart and Cloud First, government agencies have been encouraged to move data and applications to the cloud to save time and money. But has the public cloud lived up to this promise? The answer is complicated.
The public cloud certainly has tangible benefits. First, you pay only for what you use, which can yield significant cost savings and lower operating expenses. Public cloud services are also generally reliable and highly scalable. And because another organization maintains the infrastructure that holds your data, your own maintenance costs shrink.
But while the public cloud is good for certain types of workloads, it's not ideal for everything. It poses unique pitfalls that are easy to fall into if you're not careful.
Before moving all your applications to the public cloud, here are four things to consider.
“Lifting and shifting” workloads can be expensive
Hosting everything in the cloud can be dangerous. First, you give up a degree of control over your workloads and costs. And while you may save money in the long run, the short term can get expensive.
Many agencies have learned this lesson the hard way. One government organization, for example, moved most of its data to the cloud and ran its instances 24 hours a day, seven days a week, just as it had in its own data centers. Further analysis found that citizens were using the services for only a fraction of that time. By scaling back to 14 hours a day, five days a week, the organization cut its costs substantially. In the long run, it learned to leverage the cloud's consumption-based model instead of continuing operations as before.
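To see how the consumption model changes the math, here is a minimal sketch of that schedule change. The hourly rate is hypothetical, chosen only to make the arithmetic concrete, not any provider's actual price:

```python
# Back-of-the-envelope comparison of always-on vs. scheduled instance-hours.
# The $0.50/hour rate is purely illustrative.
HOURLY_RATE = 0.50  # USD per instance-hour (hypothetical)

always_on_hours = 24 * 7   # 168 hours/week
scheduled_hours = 14 * 5   # 70 hours/week (14 hours a day, 5 days a week)

savings_pct = 100 * (1 - scheduled_hours / always_on_hours)
print(f"Always-on:  ${always_on_hours * HOURLY_RATE:.2f}/week")
print(f"Scheduled:  ${scheduled_hours * HOURLY_RATE:.2f}/week")
print(f"Savings:    {savings_pct:.0f}%")  # roughly 58% fewer instance-hours
```

Whatever the actual rate, the ratio holds: the schedule change alone cuts instance-hours by more than half.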
Before you migrate your workloads, rethink how you manage them. Running workloads around the clock in your private data center is fine, but that's not what the public cloud is built for. Before you embark, ask: What will usage look like over the week, the month and the year? Can my workload scale up and down depending on operating parameters? Do I have the budget to cover those usage costs?
Then decide which workloads are worth moving. Not all of them will be.
Predictive performance can be highly unpredictable
Most application developers and DevOps engineers use a process called predictive performance to gauge how long workloads, such as build and test cycles, will take to run. With predictive performance, you should be able to tell roughly how long one of your workloads will take in the public cloud compared with your data center.
At least, that's the idea. The reality is that public cloud performance can be subject to wild swings and unpredictable variability. The same workload may take 15 minutes to run or two hours, depending on how many tenants are using the cloud service provider's (CSP) infrastructure at any given time.
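One way to quantify that variability before you commit is to time repeated runs and look at the spread rather than the average. A minimal sketch in Python, where `run_workload` is a stand-in for whatever build or test cycle you actually execute:

```python
import statistics
import time

def run_workload():
    """Stand-in for your actual build or test cycle."""
    time.sleep(0.1)  # replace with the real work

def measure_runs(n: int = 10) -> None:
    """Time n runs and report the spread, not just the average."""
    durations = []
    for _ in range(n):
        start = time.perf_counter()
        run_workload()
        durations.append(time.perf_counter() - start)
    durations.sort()
    print(f"median: {statistics.median(durations):.2f}s")
    print(f"worst:  {durations[-1]:.2f}s")
    print(f"spread: {durations[-1] / durations[0]:.1f}x best case")

measure_runs()
```

If the worst case is several times the median, predictive performance estimates built on the average will routinely be wrong.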
Many government organizations cannot tolerate this unpredictability. Applications that deliver vital information to warfighters in theater, for instance, or anything that requires real-time analytics, need precise and predictable timing.
If you need your workloads to be highly predictable, keep them in your own data center. You can prioritize them according to your needs and ensure they’re completed when you need them to be.
Egress can cost more than you bargained for
Public cloud providers charge nothing for data ingestion. But as the Eagles once sang, “You can check out any time you like, but you can never leave.”
Actually, your data can leave the public cloud, but it will likely cost you more than you bargained for, especially if you have a lot of data and are continually moving it out. An IDC report found that 99% of cloud storage users incurred planned or unplanned egress fees, averaging 6% of their organizations' cloud storage bills.
Agencies that deal in high-performance computing (HPC) need to be especially careful. If a public cloud provider charges 5 cents per gigabyte at the 150-to-500 TB tier, how much will it cost your agency if you're constantly moving petabytes of HPC data?
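Here is the back-of-the-envelope answer, assuming that flat 5-cents-per-gigabyte rate (real pricing is tiered and varies by provider, so treat this as a sanity check only):

```python
# Rough egress estimate at the flat, hypothetical 5-cents-per-GB rate above.
RATE_PER_GB = 0.05  # USD

def egress_cost(terabytes: float) -> float:
    return terabytes * 1_000 * RATE_PER_GB  # 1 TB = 1,000 GB (decimal)

print(f"500 TB:       ${egress_cost(500):,.0f}")    # $25,000
print(f"1 PB:         ${egress_cost(1_000):,.0f}")  # $50,000 per petabyte moved
print(f"3 PB a month: ${egress_cost(3_000):,.0f}")  # $150,000/month
```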
To mitigate egress costs, consider moving your data into the cloud, processing it there, taking only a small result set out, and deleting the cloud copy once you've extracted what you need. You'll need to build a good data architecture that prioritizes your data flows so you move only what you need, when you need it, but it will be well worth the savings.
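A quick comparison shows why this pattern pays off. Assume, purely for illustration, that in-cloud processing boils a petabyte of raw data down to 10 TB of results before anything leaves:

```python
RATE_PER_GB = 0.05   # same hypothetical rate as above
raw_gb = 1_000_000   # 1 PB of raw HPC data, in decimal GB
results_gb = 10_000  # 10 TB of extracted results (hypothetical ratio)

print(f"Egress raw data:     ${raw_gb * RATE_PER_GB:,.0f}")      # $50,000
print(f"Egress results only: ${results_gb * RATE_PER_GB:,.0f}")  # $500
```

The egress bill shrinks in proportion to how much you can reduce the data before it crosses the boundary.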
The public cloud can be reliable — up to a point
The public cloud's record on five nines of uptime has been spotty. In a 2022 Uptime Institute survey, 32% of respondents said the public cloud is “resilient enough to run only some” applications.
Public cloud providers can offer you high levels of resiliency, but, again, you must pay for it. That’s because increased resiliency generally requires running workloads across multiple public cloud data centers in multiple regions. And cloud providers charge extra to move data between those regions. This process is known as data transfer, and it results in additional egress costs.
Costs notwithstanding, you're likely to find higher degrees of reliability in your own data center. So if you're running critical workloads that require five nines of availability, you must architect a solution that spans multiple clouds and regions, and possibly keeps some workloads running on premises.
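The underlying availability math shows why a single region rarely gets you to five nines on its own. A rough sketch, assuming failure domains fail independently (an optimistic assumption, since outages are often correlated, so treat these figures as upper bounds):

```python
# Combined availability of independent failure domains:
# the system is down only when every domain is down at once.

def combined_availability(*availabilities: float) -> float:
    downtime = 1.0
    for a in availabilities:
        downtime *= (1 - a)
    return 1 - downtime

print(f"one region (99.9%):      {0.999:.6f}")
print(f"two regions:             {combined_availability(0.999, 0.999):.6f}")  # ~six nines in theory
print(f"cloud + on-prem (99.5%): {combined_availability(0.999, 0.995):.6f}")
```

In practice, shared dependencies pull the combined number back down, which is why architecture, not any single provider's SLA, determines whether you actually reach five nines.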
Above all, remember: The public cloud is not like your private data center, and it should not be treated as an extension of one. It's a different tool with its own cost and operating models. It can offer substantial benefits and save you money, if you use it correctly.
Darren Pulsipher is chief solutions architect for the public sector at Intel.