Unlocking the power of data: Revolutionizing federal agencies for mission success
In a digital-first world, there’s no denying data is crucial to mission delivery. However, for federal agencies to drive the most impact, it’s time for mission and IT leads to strengthen their data sharing capabilities or risk sacrificing cross-agency interoperability and collaboration. To build a data-activated culture and improve access to insights, agencies must embrace a three-pronged mindset shift: standardizing practices for data collection and sharing; adopting a “data as a product” philosophy; and strategizing long term around funding and personnel. These efforts are the difference between an efficient, data-activated culture that meets mission objectives and one that misses crucial insights and falls behind in an ever-evolving digital landscape.
The data sharing maze
Without universal standards around data collection and formatting, merging and aggregating data from different sources is like fitting together puzzle pieces that don’t match. Critical data points may be missing or fragmented, creating gaps in data sets. This lack of interoperability not only hinders cross-agency collaboration, but also complicates data analysis and the development of insight-oriented solutions.
To improve efficiency, agencies must create uniform processes for how data is retrieved, formatted and analyzed. Done effectively, this lets agencies easily share and combine data sets across different entities, removing unnecessary friction and helping mission leaders identify salient points and trends with ease. When these core pieces are in place, agencies can expect improved mission delivery with greater agility.
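For illustration, here is a minimal sketch in Python of what a uniform intake process can look like. The record schema, field names and sample exports are hypothetical assumptions for demonstration, not any mandated federal standard.

# A minimal sketch of a uniform intake process, assuming one shared record
# schema that every contributing office maps its exports onto before merging.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class CaseRecord:
    case_id: str
    agency: str
    opened: date   # one agreed date format (ISO 8601) instead of per-office variants
    status: str    # drawn from a shared vocabulary, e.g. "open" or "closed"

def normalize(raw: dict, agency: str) -> CaseRecord:
    # Map one office's raw export onto the shared schema, rejecting
    # ambiguous dates at the boundary rather than downstream.
    return CaseRecord(
        case_id=str(raw["id"]).strip(),
        agency=agency,
        opened=date.fromisoformat(raw["opened"]),
        status=raw["status"].lower(),
    )

office_a = [{"id": 101, "opened": "2024-01-05", "status": "OPEN"}]
office_b = [{"id": "B-7", "opened": "2024-01-09", "status": "closed"}]

# Once normalized, records from different sources merge without rework.
combined = [normalize(r, "A") for r in office_a] + [normalize(r, "B") for r in office_b]
print(len(combined), "records in the shared format")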
To streamline data sharing, agencies can also adopt data mesh principles: a decentralized, cloud-based approach that allows for greater collaboration and analysis across domains. In the past, agencies could readily analyze their own data, but real-time sharing and collaboration with other entities required extra steps, and added friction. To most effectively analyze and activate data, however, agencies need to examine insights across sources to identify trends and patterns, a feat that requires real-time access to data and computing resources. By decentralizing ownership of data in the cloud and giving domain teams direct access, mission leaders can collaborate on data anytime, anywhere.
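As a rough sketch of that principle, the hypothetical registry below shows a domain team publishing a versioned data product it continues to own, which other domains can then discover. The names, fields and endpoint are illustrative assumptions, not a specific platform’s API.

# A loose sketch of the data mesh idea: each domain team publishes its data
# as a discoverable, versioned product rather than handing it to a central team.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataProduct:
    name: str
    owner_team: str       # the domain team accountable for quality
    schema_version: str   # a contract consumers can depend on
    endpoint: str         # where other domains read it, e.g. a cloud URI

REGISTRY: dict[str, DataProduct] = {}

def publish(product: DataProduct) -> None:
    # Registration is what makes the product discoverable mesh-wide.
    REGISTRY[product.name] = product

publish(DataProduct("benefits-claims", "benefits-domain", "1.2",
                    "s3://agency-mesh/benefits/claims/"))
print(REGISTRY["benefits-claims"].owner_team)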
Combining data sets across agencies not only makes analysis more efficient, but also ensures those agencies’ workforces allocate their time well rather than duplicating efforts. With a distributed data approach, data scientists can swiftly explore large data sets and experiment with machine learning algorithms, allowing agencies to consolidate data and apply AI to identify notable trends. Ultimately, a decentralized data model that broadens access to relevant data deepens knowledge sharing, strengthens cross-agency collaboration and speeds mission delivery.
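As a toy illustration of that payoff: once two agencies’ data sets share a format, pooling them and checking for a trend takes a few lines rather than a project. The figures below are invented, and statistics.linear_regression requires Python 3.10 or later.

# Pool two agencies' monthly counts and fit a simple trend line.
from statistics import linear_regression

agency_a = {1: 120, 2: 135, 3: 150, 4: 162}   # month -> cases handled
agency_b = {1: 80,  2: 95,  3: 112, 4: 118}

months = sorted(agency_a)
pooled = [agency_a[m] + agency_b[m] for m in months]

slope, intercept = linear_regression(months, pooled)
print(f"combined caseload is growing by roughly {slope:.0f} cases per month")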
Embracing the “product thinking” revolution
Deploying cloud-enabled technologies isn’t enough to guarantee agencies’ success. In fact, many data-driven solutions remain underutilized due to a lack of “product thinking.” Product thinking means treating data as a mission-critical product in its own right, rather than reserving that classification for technology solutions. Establishing a clearer connection between data products and the mission at hand helps ensure that proper attention to data quality and interoperability is applied and maintained over time and across changes in leadership.
Data leaders should also draw on industry best practices for delivering operational IT products. One valuable technique is domain-driven design, a well-established approach to building software around the problem domain it serves. Applied to data analytics, domain-driven design can help identify data products. This “data-as-a-product” perspective can inform the development of interoperability standards and data warehouse design, which in turn facilitates data sharing and analysis across different domains.
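A brief, hypothetical sketch of the idea: two bounded contexts each keep their own model of the same person, and the explicit translation at the boundary is where a shared, owned data product (with its own schema and steward) tends to emerge. The context names and fields are invented for illustration.

# Domain-driven design applied to data: each context keeps its own language,
# and the boundary mapping doubles as an interoperability contract.
from dataclasses import dataclass

@dataclass(frozen=True)
class Applicant:              # Enrollment context's own model
    applicant_id: str
    program: str

@dataclass(frozen=True)
class Payee:                  # Payments context's own model
    payee_id: str
    disbursement_code: str

def to_payee(a: Applicant) -> Payee:
    # The seam between contexts, a natural candidate for a data product.
    return Payee(payee_id=a.applicant_id, disbursement_code=a.program.upper())

print(to_payee(Applicant("A-1001", "snap")))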
By adopting these principles and leveraging modern data access mechanisms, agencies can more effectively meet mission objectives, such as improving data quality. From there, it’s easier to take a forward-looking approach, such as applying machine learning techniques or coordinating responses to rapidly evolving emergencies like those raised during the pandemic.
Investing resources for the long haul
Wanting to embrace a data-activated culture is one thing. Finding the resources to scale and maintain these practices is another. Often, federal agencies struggle to secure the funding, personnel and technological infrastructure needed to elevate their data assets, and even temporary shortages can be stumbling blocks to effective mission delivery. We saw this firsthand during the pandemic, when short-term funds were available to advance data capabilities, but many were one-time-only opportunities.
Progress is a marathon, not a sprint. Agencies need long-term grants and funding to harness data effectively at scale. When they acquire these resources, mission leaders should allocate funds not only to stand up new technologies, but to ensure data systems can be routinely updated and supported, or risk collecting another zombie app.
The same forward-looking approach should apply to personnel. It’s not enough to hire new staff. Mission leaders must also routinely upskill their workforce to keep systems maintained and to retain the sought-after talent crucial to sustaining the power behind the data.
Last year, the Centers for Disease Control and Prevention announced Strengthening U.S. Public Health Infrastructure, Workforce, and Data Systems, a new, flexible funding opportunity that will provide almost $4 billion over five years to health departments across the country to improve public health infrastructure. This long-term investment focuses on hiring skilled data personnel at the state and local levels, aligning with current and future public health needs. Similar initiatives should be encouraged across agencies to ensure a capable, empowered workforce that can effectively extract value from data, leading to improved decision-making and mission outcomes.
Unlocking the power of data is no easy feat, but it’s a task worth taking on. By establishing standardized data collection practices, leveraging the cloud for seamless data sharing, adopting data-as-a-product mindsets and investing in sustainable resources, agencies can unleash the true potential of their data assets. Embracing these initiatives lets federal agencies stay ahead of the digital curve and significantly enhance their ability to drive meaningful impact. It is time for leaders to revolutionize how federal agencies operate, optimizing data usage to reach mission objectives more efficiently in the age of data.
Joshua Phillips is chief data officer at ICF.