Data-enabled missions require data resiliency. Here’s how to achieve it
Let’s face it: We’re living in an era of “modernization inflation,” in which agencies must keep up with the rate of innovation or risk falling behind. A bold modernization goal can pay dividends in helping agencies harness the latest technology and stay ahead of industry trends, but the more ambitious the goal, the greater the risk of disruption.
The modern tech environment has introduced new complexities, dependencies and risks that must be managed. Data owners and consumers must understand those dependencies and the risks they pose to the critical missions the data enables. As federal agencies continue to embark on ambitious, data-enabled modernization journeys, they must take full advantage of their data to enhance decision-making and advance missions, while continuing to protect critical assets to deliver durable mission solutions. The key to achieving this is data resiliency.
Why data-enabled missions require data resiliency
Data resiliency, or an agency’s ability to make its data available to those who need it, when and where they need it, is the key to enabling data-driven decisions, accelerating new mission solutions, and mitigating the risk of disruption. It is an agency’s capacity to bounce back quickly from a data outage, whether accidental or malicious, so that key stakeholders regain access to accurate, relevant data as soon as possible.
Amid emerging threats and surges in user needs, agencies must be able to leverage all of the data at their disposal to deliver on missions now and in the future, because when access to data is disrupted, the results can be dire. Data disruption can affect everything from citizens’ trust in government to veterans’ access to critical services, and even homeland security and emergency response times. In healthcare specifically, a breach could hinder the government’s ability to partner with the private sector by locking patient data behind ransomware, impeding the approval process for life-saving medicines, treatments and services.
The data we need to fully understand and solve critical global challenges exists, but it is often fragmented, unstructured or stuck in silos that make it challenging to utilize. This data must be liberated, shared and understood to enhance mission outcomes. For modern governments to fulfill their obligations to their citizens and partners, leverage data as a strategic asset, and accelerate mission objectives, data resiliency is essential.
How to achieve data resiliency
To achieve data resiliency, agencies must take a multi-layered approach that begins with identifying their data goals. From there, agencies should develop a data resiliency initiative that can be infused across missions, business and partner ecosystems.
An initial audit of data – including what data is stored, where it is stored, and which products and services depend on it – can help agencies determine where they stand on the road to achieving data resiliency. This initial assessment should also surface any obstacles to consolidating data and ensuring responsible access to it, as well as any data that should be collected but isn’t, and what measures are needed to collect it.
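As a rough illustration, the audit questions above can be captured in a simple inventory. This is a minimal sketch only; the dataset names, locations and dependencies below are hypothetical examples, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    name: str                   # what data is stored
    location: str               # where it is stored
    dependents: list = field(default_factory=list)  # products/services relying on it
    collected: bool = True      # False if the data should exist but isn't collected yet

# Hypothetical inventory assembled during the initial assessment
inventory = [
    DatasetRecord("benefits-claims", "on-prem SQL cluster",
                  ["claims-portal", "fraud-analytics"]),
    DatasetRecord("sensor-telemetry", "cloud object storage", ["forecast-model"]),
    DatasetRecord("partner-feedback", "not yet collected", [], collected=False),
]

# Surface the gaps and concentrations the assessment should bring to light
gaps = [d.name for d in inventory if not d.collected]
high_dependency = [d.name for d in inventory if len(d.dependents) >= 2]
print("Not yet collected:", gaps)
print("High-dependency datasets:", high_dependency)
```

Even a lightweight record like this makes dependency risk visible: datasets with many dependents are the ones whose disruption would ripple furthest across the mission.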
Then agencies can build a roadmap to achieve their goals. This roadmap should cover processes and best practices; automation; traditional backup, recovery and data snapshots; infrastructure and security teams; and preparation and procedures for unplanned disruptions.
If they haven’t already, agencies must establish cloud-based IT infrastructure to distribute, democratize and analyze data. Beyond making it easier to scale data capacity and processing, cloud-based operations enable organizations to leverage data to its fullest potential and keep it secure, while also making it more accessible to stakeholders within a controlled environment. Agencies should also adopt artificial intelligence and machine learning to put more of their data to work. These technologies enable agencies to analyze massive amounts of data and make connections that were previously unattainable.
How to manage threats to data resiliency
There are two primary threats to realizing data resiliency: organizational culture and the pace of technology change.
When it comes to culture, it’s important to remember that having a backup and recovery plan, or continuity of operations plan, is not enough. While these plans are important, they’re not useful if they’re outdated. For an agency to achieve data resiliency, it must have a multi-layered approach: a clear strategy, buy-in from leadership, and practices that are built into the way people work and continually rehearsed. This requires a digital culture and advanced engineering expertise, including platform engineering, site reliability engineering, cybersecurity and data management, to name a few.
It’s also worth mentioning that while shifting to a cloud-based infrastructure is imperative, it shouldn’t create a false sense of security without proper processes, governance, automation and active monitoring. Consider, for example, a compromised cloud SaaS vendor. In that circumstance, it is critical that the mission owner understands the service-level agreements and objectives, understands the technology’s tooling, and has multiple backup options available. Mission owners can, and often should, insulate their data by preserving it separately from the SaaS tooling to maintain resiliency and continuity.
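Insulating data from the SaaS tooling can be as simple as periodically writing a vendor-independent export, with a checksum so its integrity can be verified later. The sketch below is a minimal illustration under assumed conditions; the export records and storage paths are hypothetical, and a real implementation would pull data through the vendor’s actual export API.

```python
import hashlib
import json
import tempfile
from pathlib import Path

def preserve_export(records: list, backup_dir: Path) -> Path:
    """Write a vendor-independent JSON export plus an embedded integrity digest."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    # sort_keys makes the serialization deterministic, so the digest is stable
    payload = json.dumps(records, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    out = backup_dir / f"export-{digest[:12]}.json"
    out.write_bytes(payload)
    return out

# Example: records pulled from the (hypothetical) SaaS API, preserved locally
backup_dir = Path(tempfile.mkdtemp()) / "mission-backups"
saved = preserve_export([{"id": 1, "status": "active"}], backup_dir)
print(saved.exists())  # True
```

Because the export is plain JSON on storage the agency controls, it remains usable even if the vendor’s service, or the agency’s contract with it, goes away.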
What the future holds
The benefits of data resiliency extend beyond safe, reliable and accessible data. It also enables agencies to share insights across organizational lines, particularly between the Defense Department and civilian agencies, ensuring that data can travel across different levels of classification.
Often, data-enabling work in one sector reverberates across others. For example, climate data from sensors on Earth and in orbit is continuously distributed to state, federal and international agencies. Most of this data becomes trapped in silos; however, when it is deliberately shared, other agencies can use it for predictions and research. Taking it one step further, democratized access to data also means that state and federal transportation and infrastructure agencies can use it for resiliency planning, the agricultural sector can make better-informed decisions based on clearer meteorological predictions, and public health organizations can use it to anticipate the expanding impacts of heatwaves and animal-borne pathogens.
Beyond any doubt, the far-reaching benefits of data resiliency outweigh the potential costs and short-term challenges. Now is the time for agencies to put the right technologies, skills and processes in place to fully leverage their trove of data toward achieving the missions that matter most.
Craig Swanson is a vice president at Booz Allen Hamilton who focuses on digital transformation for civil government agencies.