Intelligence Community business data still lacks critical management capabilities
As social media explodes and more people interact online, the universe of external datasets is expanding exponentially. Gone are the days of consulting an organization’s almanac; today a few clicks can put historical weather records for a precise location at your fingertips. And that’s just the tip of the iceberg, as trillions of data points are collected with every post, click or even scroll online. Businesses have used this data to increase sales and improve customer experience, but the intelligence community has struggled to connect the dots, cut through the noise and deliver sophisticated enterprise data analysis tools to its workforce.
This external intelligence data isn’t subject to common standards (yet), and finding the right tools to use it efficiently and effectively is an uphill battle: most agencies have made only a few isolated investments in using this data to better understand the needs of their customers. The benefits of using data to educate and inform decisions are enormous, but the conclusions are accessible only to the select few who can overcome the challenges of getting there.
The intelligence community (IC) has stepped up its data practices. Across 18 different intelligence agencies, chief data officers have been working to improve data lifecycle management. Analysts are embedded in every agency and at every level. Agencies are working to source more accurate data while maintaining strong security and consistent governance.
However, it’s not enough. An estimated 80-90% of the data these agencies use is unstructured, making the jobs of government analysts increasingly difficult. Agencies invest in isolated data management solutions and find themselves left with a foundational external data system that doesn’t track data consumption, frequency of usage or accessibility. Determining which data matters for shaping future security products, policies and regulations happens in silos. So how do federal agencies level up their third-party data integration game to reap the benefits? By making focused, incremental investments in interoperability, accessibility, technology and skilled labor.
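To make that gap concrete, here is a minimal sketch, assuming a simple SQLite store, of the kind of usage tracking the author says is missing. The table, dataset names and functions are hypothetical illustrations, not any agency’s actual system.

```python
# Minimal sketch (hypothetical schema): logging which external dataset is
# consumed, by whom and how often, using only Python's standard library.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("external_data_usage.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS usage_events (
        dataset_id  TEXT NOT NULL,   -- e.g. 'historical-weather' (hypothetical)
        agency      TEXT NOT NULL,   -- consuming agency
        accessed_at TEXT NOT NULL    -- ISO-8601 timestamp
    )
""")

def record_access(dataset_id: str, agency: str) -> None:
    """Log one consumption event for an external dataset."""
    conn.execute(
        "INSERT INTO usage_events VALUES (?, ?, ?)",
        (dataset_id, agency, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

def usage_frequency(dataset_id: str) -> list[tuple[str, int]]:
    """Return how often each agency has pulled a given dataset."""
    return conn.execute(
        "SELECT agency, COUNT(*) FROM usage_events "
        "WHERE dataset_id = ? GROUP BY agency ORDER BY COUNT(*) DESC",
        (dataset_id,),
    ).fetchall()
```

Even a log this basic would let a chief data officer see which external feeds are actually consumed and which are being paid for but ignored.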
Government agencies aren’t unique in their struggle with siloed teams and departments; many organizations struggle to navigate this common division caused by growth. Over the years, agencies have issued strategic roadmaps focused on how intelligence professionals and analysts integrate big data into their work. One example is 2016’s Data Center Optimization Initiative, which aimed to improve the efficiency of remaining data centers. Every year since, agencies have faced an uphill battle trying to consolidate resources when time and money are not always on their side.
Focusing on data interoperability between silos and providing a centralized data management system can quickly get every agency on the same page about what data exists, in what format, and what information each database contains. Additionally, providing access to the same data and monitoring its usage by agency, department, role and use case adds yet another layer of critical information the IC can analyze to drive product and policy.
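As one illustration of what a shared catalog entry might capture, the sketch below (all names, fields and the example dataset are hypothetical) records a dataset’s format and contents alongside usage broken down by agency, department, role and use case.

```python
# Minimal sketch (hypothetical fields): a shared catalog entry describing what
# a dataset is, what format it is in, and who is using it and why.
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class CatalogEntry:
    dataset_id: str          # stable identifier shared across agencies
    description: str         # what information the dataset contains
    fmt: str                 # 'parquet', 'csv', 'json', ...
    schema: dict[str, str]   # column name -> type
    # key: (agency, department, role, use_case) -> access count
    usage: Counter = field(default_factory=Counter)

    def record_use(self, agency: str, department: str, role: str, use_case: str) -> None:
        self.usage[(agency, department, role, use_case)] += 1

# One shared catalog rather than a separate copy per agency.
catalog: dict[str, CatalogEntry] = {}

entry = CatalogEntry(
    dataset_id="historical-weather",
    description="Hourly surface weather observations by location (illustrative)",
    fmt="parquet",
    schema={"station": "string", "lat": "float", "lon": "float", "ts": "timestamp"},
)
catalog[entry.dataset_id] = entry
entry.record_use("Agency A", "analysis", "analyst", "site assessment")
```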
Removing silos also reduces costs. Implementing an external data pipeline comes with a price tag, and paying that same cost over and over at each agency is a waste of funds that can be better spent on other imperative projects. Accessibility, technology and skilled labor can be addressed individually by each agency, but interoperability is the critical change to prevent redundancy.
External accessibility is also a new avenue the IC could explore. These agencies are in a unique position: they are consumers of data but also have an opportunity to become data suppliers. Making data collection transparent for individuals and providing access to it for businesses opens up a wealth of data points that can be used to make better-informed decisions.
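If the IC did choose to act as a data supplier, the outward-facing piece could start as something as simple as a read-only metadata API over datasets already cleared for release. The following Flask sketch is offered under that assumption; the endpoints and the example dataset are hypothetical, not an actual IC service.

```python
# Minimal sketch (hypothetical endpoints) of exposing cleared, releasable
# datasets to outside consumers through a read-only API.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Stand-in for a store of datasets already approved for external release.
RELEASABLE = {
    "historical-weather": {
        "description": "Hourly surface observations, 1950-present (illustrative)",
        "format": "csv",
        "license": "public-domain",
    },
}

@app.route("/datasets")
def list_datasets():
    """Let businesses and individuals discover what is available."""
    return jsonify(sorted(RELEASABLE.keys()))

@app.route("/datasets/<dataset_id>")
def describe_dataset(dataset_id: str):
    """Return metadata for one cleared dataset, or 404 if it isn't releasable."""
    if dataset_id not in RELEASABLE:
        abort(404)
    return jsonify(RELEASABLE[dataset_id])

if __name__ == "__main__":
    app.run(port=8080)
```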
Matt Conner, chief information security officer for the Office of the Director of National Intelligence, has said, “The cybersecurity apparatus is still geared around a traditional definition of systems—you know, full-stack, storage, compute, processing—all in one device. I think that we don’t talk enough about data: data integrity, data security.”
Not only do these agencies have to bring their tech stacks up to speed, but they also have to ensure everything is digitally accessible, especially if they are going to supply data to the masses.
Of course, none of these incremental changes is possible without the highly skilled experts required to do the work. Speed is a necessity when competing for this talent, yet agencies often move slowly through red tape to ensure regulations are followed to a tee.
To combat this, federal agencies should refocus on their existing employees by offering re-skilling incentives and training. This is especially necessary if an agency wants to build its data pipeline from scratch: labor costs are high at the outset, then taper off (but never disappear, and are often underestimated) over the remaining lifetime of the data platform.
Interoperability, accessibility, technology and skilled labor are four areas essential to improving the Intelligence Community’s data pipeline. By taking each initiative step by step and truly focusing on getting the details right, the IC can become a major player in third-party data consumption and supply.
Will Freiberg is CEO at Crux.