Bad data costs Americans trillions. Let’s fix it with a renewed data strategy
Nick Hart of the Data Foundation and Suzette Kent, a former federal CIO, explain how using federal data could cut waste, create jobs and help America own its future.
Over the past five years, the federal government lost $200 billion to $500 billion per year to fraud and improper payments — that’s up to $3,000 taken from every working American’s pocket annually. Since 2003, these preventable losses have totaled an astounding $2.7 trillion. But here’s the good news: We already have the data and technology to largely eliminate this waste in the years ahead. The operational structure and legal authority to put these tools to work protecting taxpayer dollars need to be refreshed and prioritized.
The challenge is straightforward: Government agencies often can’t effectively share and verify basic information before sending payments. For example, federal agencies may not be able to easily check if someone is deceased, verify income or detect duplicate payments across programs.
These challenges are compounded by outdated laws that actively prevent agencies from sharing critical data. For instance, until 2023, the Social Security Act prevented Treasury from accessing SSA’s Death Master File, a basic tool for preventing payments to deceased individuals. While a pilot program has already prevented $650 million in improper payments, this access could end without congressional action. Similar barriers exist in the Privacy Act, Computer Matching Act and Paperwork Reduction Act, all of which were written before modern data systems and security measures existed.
To overcome these systemic challenges and create a coordinated approach to data sharing across government, the Federal Data Strategy established during the first term of President Donald Trump provides a blueprint. Now, we need focused leadership to turn the promise of the Federal Data Strategy into reality to stop fraud and build a government that works smarter for the American people. The initial strategy developed in 2019 received extensive stakeholder feedback, including from nonpartisan agency staff, industry, academia and nonprofits. The collaborative process culminated in a 10-year roadmap with annual plans that provided clear, cohesive principles, practices and actions to federal agencies and data leaders — like the chief data officers, privacy officers and evaluation officials — about key emerging priorities.
A clear strategy combined with leadership focus creates a powerful framework for improving the use of high-quality data to serve the American people. Consider what is possible when we get this right: The Treasury Department’s “Do Not Pay” system already helps some agencies verify eligibility before sending payments. But its effectiveness is limited because agencies cannot easily share data across organizational boundaries and may face legal restrictions or other barriers. By breaking down data silos while maintaining appropriate privacy protections, we could prevent billions in improper payments before they happen, rather than spending more money trying to recoup costs for taxpayers once the payments are made.
Improving how the government uses data is not just about stopping waste before it happens — using data better also leads to the creation of opportunities. When government data is accessible and usable, American businesses innovate. Weather data powers a multi-billion dollar forecasting industry. GPS spawned countless navigation and logistics companies. Financial technology firms are using standardized data to create tools that help investors and protect markets. New data resources are now re-estimating flood risks for insurance providers.
States are crucial partners in achieving the goal of the Federal Data Strategy: recognizing data as a strategic asset. Take identity verification as an example. States manage driver’s licenses and vital records, which can help prevent identity theft in federal programs and confirm deaths so that payments are only made to living beneficiaries. But old rules and systems make it difficult to securely share this information, even when doing so is safe and common-sense. For many federal programs, states manage qualification for entitlements and the actual distribution of benefits. A 2024 survey of federal chief data officers conducted by the Data Foundation suggested that CDOs have limited interaction with state and local governments, with most engagement occurring only on an annual basis or not at all.
By creating better mechanisms for collaboration between federal and state CDOs — including shared working groups focused on key policy domains like healthcare and education, joint data governance frameworks, and mentorship programs pairing experienced state CDOs with federal counterparts or vice versa — we can build more resilient data-sharing systems that work for everyone. States like Virginia, Indiana and Arkansas have already demonstrated success in combining data across health services, education and workforce programs, creating models that can be scaled through intentional state-federal partnership.
Success requires meaningful engagement with the American people, particularly communities or businesses that have historically faced heightened privacy risks or previously experienced harm from data misuse. By actively including these perspectives in data governance decisions and being transparent about privacy protections, we can build public trust while ensuring data initiatives advance. The National Vital Statistics System demonstrates this approach through long-term collaboration among federal, state, local and tribal governments, alongside academic and nonprofit partners. This data-sharing initiative improves public health surveillance, supports research and enhances program integrity while maintaining robust privacy protections. When agencies proactively engage stakeholders about privacy safeguards and potential benefits, they are more likely to develop sustainable data-sharing solutions that work in the long term.
The workforce implications are equally compelling. Better data integration between education providers, employers and workforce agencies could help match people to opportunities more effectively. Some states are already pioneering these approaches, and they can be scaled across regions and the country. This includes efforts like the successful Midwest Regional Data Collaborative, which encourages states like Missouri, Iowa and Illinois to increase cross-state data sharing to improve outcome analytics.
Implementation is always the hard part, but it is worth the time, energy and focus.
First, agency leaders must prioritize data quality and sharing in their modernization efforts. In early 2025, the Trump administration can immediately focus on issuing long-overdue data governance guidance to federal agencies that enables progress on data governance, data quality, data inventories and other key features of a 2019 law called the OPEN Government Data Act.
Second, we must accelerate research on and adoption of privacy-preserving technologies that enable data sharing while protecting sensitive information. The National AI Research Resource provides a model for how to do this at scale and encourages innovation.
Third, the federal government must explore strategies for better empowering states to take a leadership role and be part of the shared governance structure for data at the federal level. This could include formal involvement of state CDOs in the federal CDO council and in the establishment of federal data standards, in addition to state-federal data governance working groups building on successful models from states that have effectively integrated data across sectors. Coordination and collaboration through the Federal Data Strategy and its annual action plans enables these efforts.
Fourth, Congress must modernize data-sharing laws to reflect current technology and security capabilities while maintaining appropriate privacy protections. An obvious place to start is the Paperwork Reduction Act, which imposes challenges and frustrations on the data management ecosystem and has clear areas for improvement. Without legislative updates to decades-old restrictions, even the best data strategy will face unnecessary barriers to implementation. Improving data sharing saves money for taxpayers because agencies may currently buy each other’s data from commercial providers — perhaps unnecessarily — when sharing is excessively burdensome, complicated or fragmented.
Finally, sustained resources for data infrastructure cannot be an afterthought. Quality data systems pay for themselves by improving services, reducing waste and minimizing fraud. These sustained resources must include the champions and leadership structures to ensure success.
These changes are not costly or excessively difficult — in fact, failing to pursue the Federal Data Strategy stands to increase long-term costs through ineffective use of data. American taxpayers are already paying the price for years of inaction through high improper payments and missed opportunities for innovation.
Building a government that works by using data to solve problems, prevent waste and create opportunity is common-sense. America cannot afford to stand by while global competitors are moving forward.
The foundations are in place. The technology exists. The return on investment is clear. It’s time to move from strategy to action and build the data infrastructure America needs to own its future.
Nick Hart is president and CEO of the Data Foundation and previously served as the policy and research director of the U.S. Commission on Evidence-Based Policymaking.
Suzette Kent is a former federal chief information officer and co-led the development of the Federal Data Strategy.
Copyright © 2024 Federal News Network. All rights reserved.