Data-led cloud migration for public sector health organizations
There are now approaches to migration that can considerably ease and speed the path to public sector IT modernization.
In the wake of COVID-19, it should be obvious to everyone that public sector health organizations have been functioning in extremis over the past couple of years. It’s also now evident that technological capability in general — and cloud power specifically — has an outsized impact on how effectively such critical institutions can meet evolving demands.
As noted by Deloitte last month, “The flexibility and scalability of cloud allowed governments to meet the urgent challenges of the pandemic, such as massive surges in demand for services or the sudden shift to remote work.” Estimates indicate that federal cloud spending in the health sector increased 46% between 2020 and 2021, and a FedRAMP survey last year found that about half of all U.S. state, local and federal governments report having either some or most systems and solutions in the cloud.
But that still leaves a vast swath of the public sector — and its health-related services, agencies, and institutions — laboring without the benefit of cloud power.
While a host of nontrivial factors have contributed to this deficit, not least intransigent operational models beholden to legacy on-premises infrastructure, there are now approaches to migration that can considerably ease and speed the path to public sector IT modernization, to powerful effect.
Doing what matters most first
When transitioning to any new infrastructure, traditional migrations have focused on applications, because that’s how most organizations see their workloads: “We log in to this system to do this, and we log in to another system to do that.”
In information technology, day-to-day work revolves around the applications people interact with; their tasks and deliverables center on that application space. So it is natural to think in terms of migrating applications, because those are what we work with directly. We tend to view data as something that “comes along with” applications.
Data-led migration centers on recognizing that it’s actually the other way around: data is what really matters most.
Applications are inherently transient; they evolve over time, as real-world needs evolve. The applications we use today may not be the applications we use tomorrow. They either get deprecated in favor of new applications, or they get modified over time with new features and functionality. The only thing that remains constant in that evolution is the need for the data. The real value is in the datasets.
Data-led migration is a “do what matters most first” approach to cloud adoption. The focus is on getting the data into the cloud first and foremost. It then becomes much easier to handle applications driven by those data, and to develop new applications around that inherently scalable cloud data repository quickly and efficiently.
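To make “data first” concrete, here is a minimal sketch of what day one can look like, assuming AWS object storage via boto3; the bucket and file paths are hypothetical placeholders:

```python
# Minimal sketch, assuming AWS: the "data first" step can begin as a
# bulk copy of an existing dataset into cloud object storage, while
# applications keep running on premises until each is repointed or
# replaced on its own schedule. All names here are hypothetical.
import boto3

s3 = boto3.client("s3")

s3.upload_file(
    Filename="/backups/health_warehouse.bak",  # hypothetical on-prem export
    Bucket="agency-health-data",               # hypothetical bucket
    Key="warehouse/health_warehouse.bak",
)
```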
Reasonable path to radical power
A lot of information about moving to the cloud touts radical digital transformation, which is great! But it also implies that data migration necessarily entails drastic changes to existing workflows and operations, a huge red flag in the often overstretched and under-resourced public sector.
It doesn’t have to be that way. The truth is that data-led cloud migrations can be — and often are — phased. Migration can be essentially transparent to the application layer at first, then iterated and optimized over time. Phase One might use the same existing toolset: You can simply move from a SQL Server implementation on premises to a SQL Server implementation in the cloud, for example. The only difference is that the data for that toolset now live in the cloud rather than in an on-prem database. The cloud supplies the opportunity for radical digital transformation, which can be incredibly creative and productive for the mission. But no one has to boil the ocean on day one.
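As an illustration of how transparent that first phase can be, consider a minimal sketch in which the application code and queries are untouched and only the connection string changes; every hostname and credential below is a hypothetical placeholder:

```python
# Minimal sketch of a Phase One lift-and-shift: same SQL Server engine,
# same driver, same queries; only the connection string changes. Every
# hostname and credential below is a hypothetical placeholder.
import pyodbc

# Before (on premises): SERVER=onprem-db.agency.local
# After (cloud-hosted):
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=clouddb.example.cloud,1433;"  # hypothetical cloud endpoint
    "DATABASE=health;UID=app;PWD=..."
)

# The application layer is untouched by the move.
cursor = conn.cursor()
cursor.execute("SELECT TOP 10 * FROM patients")
rows = cursor.fetchall()
conn.close()
```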
The only drastic change is the immediate widening of options for data constructs. Once the data are in the cloud, there are new and different ways to use them, leveraging previously unattainable resources: Is a relational database still the optimal format for your purposes? Might you migrate to a data lake, or break out operational data stores from analytics data stores? A host of new capabilities becomes immediately available.
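For example, breaking an analytics store out of the operational database can start as nothing more than landing periodic snapshots in a data lake. A minimal sketch, assuming AWS S3 and the pandas/pyarrow/s3fs stack, with hypothetical names throughout:

```python
# Minimal sketch, assuming AWS: snapshot an operational table into an
# S3-backed data lake as Parquet, so analytics queries stop competing
# with operational workloads. Requires pandas, pyarrow and s3fs; the
# connection string, bucket and table names are hypothetical.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=clouddb.example.cloud,1433;DATABASE=health;UID=app;PWD=..."
)

# Pull a snapshot from the operational store.
df = pd.read_sql("SELECT * FROM lab_results", conn)

# Land it in the data lake, where engines such as Athena, Redshift
# Spectrum or Spark can query it in place at analytics scale.
df.to_parquet(
    "s3://agency-health-lake/lab_results/snapshot.parquet",  # hypothetical
    index=False,
)
```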
Understanding cloud economics
Considering these options becomes practical because the economics of provisioning and deprovisioning compute are very different in the cloud. For example, a public health organization might have to purchase $1 million worth of servers to equip an existing on-prem data center for running machine learning experiments. But with data in the cloud, you can spin up a supercomputer in the sky in mere minutes, run an experiment for half an hour, then shut it all down, and it might cost a few hundred bucks.
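That spin-up-and-tear-down loop amounts to a few API calls. A minimal sketch, assuming AWS EC2 via boto3, with the image ID and instance type as hypothetical placeholders rather than recommendations:

```python
# Minimal sketch, assuming AWS EC2 via boto3: provision a large
# instance, run the experiment, then terminate it so billing stops.
# The image ID and instance type are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2")

# Spin up: minutes, not a procurement cycle.
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical ML-ready image
    InstanceType="p3.8xlarge",        # GPU instance billed by the hour
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]

# ... run the half-hour experiment against data already in the cloud ...

# Tear down: deprovisioning is what keeps the bill at a few hundred
# dollars instead of a seven-figure capital outlay.
ec2.terminate_instances(InstanceIds=[instance_id])
```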
By building in one or more public clouds, an organization can quickly spin up new environments and selectively utilize as-a-service options while avoiding the expenses of acquiring, storing, configuring and managing physical infrastructure in-house.
A prototypical example of this contrast can be drawn from the evolution of IT at NASA. Consider the Voyager 1 and 2 spacecraft, launched in the late 1970s. When they sent their amazing pictures back to Earth, NASA had to commission new supercomputers at a capital outlay of multiple millions to work with the data, and image analytics took months.
Fast forward to the Cassini and Mars missions, after the cloud had transformed NASA’s capabilities. As explained at the SpaceOps 2010 Conference, “Instead of procuring a machine, a project simply rents a machine and pays for it by the hour. In fact, at times, a mission can rent a machine for as low as 3 cents an hour. If an application requires a machine, one can be provisioned within 5 minutes instead of 5 weeks.”
By 2012, utilizing the cloud, NASA would “process nearly 200,000 Cassini images within a few hours [for] under $200 on AWS,” according to NASA JPL Senior Solution Architect Khawaja Shams, whereas previously inelastic internal resources expended “15 days on the same task.” As the Curiosity rover transmitted the first images of Mars back to Earth in August of that year, “all of the raw images that came in went straight to AWS. People everywhere could see them on their smart devices. NASA JPL was able to stream 150 TB of data in just a few hours.”
That was ten years ago; these days, cloud power is helping NASA fly little helicopters on the red planet.
What is gained
NASA’s cloud journey illustrates orders-of-magnitude reductions in public sector IT cost, combined with breathtaking advances in speed, scale and capability. By any traditional measure of return on investment (ROI), cloud adoption clearly demonstrates excellent stewardship of public funds. Data-led migration is an accessible means of capturing that value.
Transposed to the mission of public health in our time, the incentives for the transition to cloud are even more compelling. The ultimate ROI for the organization is agility. Public sector agility greatly affects the day-to-day affairs of human beings right here on Earth: The means to quickly analyze and stream pictures of Mars is also the means to enact a regional COVID response in days instead of months.
In a field like healthcare, that agility literally saves lives.
Gerry Miller is founder and CEO of Cloudticity.