Let’s stop the end-of-year cloud rush

It happens near the end of every year: Agency leaders consider shifting federal IT resources to “the cloud” as budgets feel stretched or uncertain – and as rampant data growth keeps demanding more compute, network and storage resources. The prevailing opinion is that the public cloud is inherently less expensive than an on-premise solution, since someone else runs the server, network and power infrastructure.

Certainly, these leaders are not alone. Federal agency demand for cloud infrastructure is projected to grow from $15.9 billion in FY 2023 to $23.5 billion in FY 2027, according to Deltek. However, the report notes that challenges remain, including the complexity of multi-cloud environments, security, and automated IT service management.

A closer look shows that public clouds are not always less expensive than on-premise solutions and are not a panacea for controlling IT costs. This is especially the case with transactional systems, where the application is on-premise and the data is in the public cloud. High egress fees – the cost to read data back out of the cloud – can present an unwelcome surprise that threatens the cost containment the cloud was tasked to deliver. Other important considerations around security, sovereignty and performance expectations can also become surprises further down the road.

Knowing your needs: A seven-point checklist

Assessing IT infrastructure across a seven-point checklist can help teams determine if it makes sense to move workloads to the cloud.  

  1. Consider variable costs – Cloud providers generally price their solutions per gigabyte per month but may not emphasize the variable portion: egress fees. These costs scale up or down based on application and end-user requests for data in the cloud, and they play havoc with budgets if teams plan only for the fixed portion of the bill.

Moving to cloud versions of previously on-premise software systems, such as analytics or backup, often ends up being far more expensive, even after accounting for the infrastructure costs the subscription is meant to absorb. A recent analysis for a civilian agency compared the cost of a cloud-based backup solution to the same solution on-premise. The total cloud solution was 10 times the cost of the on-premise storage portion of the solution. Furthermore, the customer’s public cloud network connectivity would not meet the recovery time objective (RTO) requirement. The customer may have been able to increase the bandwidth to improve the RTO, but that would have added even more cost.

In contrast, an on-premise cloud’s costs remain constant: there are no egress fees, and the solution is connected to the agency’s existing internal network, delivering better performance than a network connection to an off-premise cloud can.
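To see how the variable portion can dominate, consider the back-of-the-envelope comparison below. It is a minimal sketch: every rate and volume is a hypothetical placeholder, not a quote from any provider.

```python
# Hypothetical monthly cost comparison. All rates and volumes below are
# illustrative placeholders, not actual provider pricing.

STORED_GB = 500_000          # 500 TB under management
CLOUD_STORAGE_PER_GB = 0.02  # fixed portion: $/GB-month stored
CLOUD_EGRESS_PER_GB = 0.09   # variable portion: $/GB read back out
ONPREM_MONTHLY = 12_000      # amortized hardware, power and support

def cloud_monthly(egress_gb: float) -> float:
    """Cloud bill = fixed storage charge + variable egress charge."""
    return STORED_GB * CLOUD_STORAGE_PER_GB + egress_gb * CLOUD_EGRESS_PER_GB

# Egress swings the bill: compare reading back 10% vs. 50% of the data.
for pct in (0.10, 0.50):
    egress = STORED_GB * pct
    print(f"egress {pct:.0%}: cloud ${cloud_monthly(egress):,.0f}/mo "
          f"vs. on-premise ${ONPREM_MONTHLY:,}/mo")
```

The specific numbers matter less than the shape of the curve: the fixed storage charge is predictable, while egress turns every restore, migration or analytics pass into a new line item.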

  2. Look at scalability – This has traditionally been seen as one of the key benefits of the cloud: easy scaling up or down to expand or shrink systems as needed. The reality is that it is unusual for a storage environment to shrink; most continue to grow as new types of data are added. One civilian agency, for example, saw significant growth in storage needs after implementing a program using drones at multiple locations, because the video footage from the drones is used for both security and analysis over time. New technologies like these typically require IT support, and that usually means more data to retain.
  3. Determine the best storage type – Studies have shown that 80% of the data on expensive block and file systems has not been touched in 90 days or more, and the longer the window, the higher the percentage. Because of this, moving inactive files from block and network-attached storage (NAS) to an object archive that can still provide seamless access for users can reduce the amount of expensive block or NAS storage needed, possibly delaying an expansion as well as reducing backup times (a sizing sketch follows this checklist).
  4. Consider security – The best security will always be behind the organization’s firewall, where operations staff have completed background checks or hold clearances at the appropriate levels. In the federal realm, it’s not a given that all public cloud servers meet or exceed the same standards as on-premise equipment, which may require Security Technical Implementation Guide (STIG) hardening, including SELinux with Federal Information Processing Standards (FIPS) 140-3 compliant libraries.

Other important aspects, including zero trust security and ransomware protection, need to be included in the analysis. One of the first attack vectors for ransomware is the backups, so that the data cannot be easily restored. One layer of protection is ensuring that backup and archive applications work with the storage target to lock written data so it cannot be deleted. And if a significant amount of data does need to be restored, can the organization absorb the egress fees of pulling it all back out of a public cloud?
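One concrete way to implement that locking is object storage that supports S3 Object Lock. The sketch below is a minimal illustration, assuming an S3-compatible endpoint and the boto3 library; the endpoint, bucket and file names are hypothetical, and the bucket must have been created with Object Lock enabled.

```python
# Minimal sketch: write a backup object with a compliance-mode retention
# date so it cannot be deleted or overwritten until that date passes.
# Endpoint, bucket and key are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3", endpoint_url="https://objectstore.agency.example")

retain_until = datetime.now(timezone.utc) + timedelta(days=90)

with open("backup-2024-06-01.tar", "rb") as body:
    s3.put_object(
        Bucket="backup-vault",            # bucket created with Object Lock enabled
        Key="backups/backup-2024-06-01.tar",
        Body=body,
        ObjectLockMode="COMPLIANCE",      # not even an administrator can shorten it
        ObjectLockRetainUntilDate=retain_until,
    )
```

Until the retention date passes, delete and overwrite requests against that object fail, which blunts the encrypt-the-backups-first ransomware pattern whether the target sits on-premise or in a public cloud.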

  5. Think about classifications – Some cloud providers have multiple clouds to support different classifications. IT teams must verify that data is being written to the cloud with the appropriate classification. If it isn’t, any data spillage – when higher classification data is incorrectly written to lower classification systems – must be addressed by the public cloud provider, and with today’s architectures that may not be an easy task.
    Teams must also verify that everyone supporting a public cloud holds clearances at the appropriate levels, especially if there is data spillage. While these issues can exist with an on-premise cloud, organizations have had decades to refine their methodologies and will have staff with the appropriate clearances.
  6. Verify data sovereignty – Do you know where your data is being stored in the cloud? For certain agencies, it’s vital that federal data stays on U.S.-only systems supported by U.S. citizens on U.S. soil, and depending on which cloud is selected, this may not be clear. On-premise systems can replicate or even distribute data to other agency locations, providing site-level data protection and redundancy that allows faster failover if required.
  7. Assess performance requirements – In general, systems writing to the cloud will deliver lower performance than on-premise systems, especially if the agency has only one or two gateways to the internet. In these cases, a user in Arizona, for example, may have to traverse the agency WAN to a gateway, possibly in D.C., then out over the internet to the public cloud. Specific performance levels need to be assessed well in advance of the decision to move to a public cloud.
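Returning to checkpoint 3, the sketch below shows one way to size the opportunity before any migration: walk a file share, flag files whose last access time is older than 90 days, and tally how much block or NAS capacity an object archive could reclaim. The path and threshold are hypothetical, and the approach assumes access times are actually recorded (many systems mount with noatime, which would skew the result).

```python
# Sketch: tally capacity not accessed in 90+ days as candidates for an
# object archive tier. The scanned path and threshold are illustrative.
import time
from pathlib import Path

THRESHOLD_DAYS = 90
cutoff = time.time() - THRESHOLD_DAYS * 86_400  # 86,400 seconds per day

candidate_bytes = 0
total_bytes = 0

for path in Path("/mnt/nas/projects").rglob("*"):
    if not path.is_file():
        continue
    stat = path.stat()
    total_bytes += stat.st_size
    if stat.st_atime < cutoff:  # last access is older than the cutoff
        candidate_bytes += stat.st_size

if total_bytes:
    print(f"{candidate_bytes / total_bytes:.0%} of scanned capacity "
          f"untouched for {THRESHOLD_DAYS}+ days")
```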

Adding it up

Certainly, public clouds can be run cost effectively, but they are not the universal cure that many organizations believe. Positioning for the best experience means thoroughly assessing application dynamics and factoring them into any evaluation of total impact, including cost. Working through the seven checkpoints above is one way to help ensure the organization makes the best decision possible.

With agency resources strapped and demands ever increasing, the right move – for budgets, users and successful project completion – may be the one that’s counterintuitive yet thoughtfully meets all needs.

Jon Irwin is an enterprise IT architect with over 20 years of experience helping federal agencies optimize multi-cloud storage systems, including in his present role as solutions engineer for cloud deployments at Scality, a file and object storage software provider.
