Harnessing the power of a data-driven government
Rich Beutel, founder of Cyrrus Analytics and a former House staff member, explains the steps for agencies and Congress to make data more actionable.
The digital revolution, the ubiquity of the internet and the rise of big data have given government an unprecedented capability to produce, collect, use and disseminate a vast array of information and data. These trends have ushered in a new era of data-powered government innovation and citizen services, built on the undeniable value of making government data widely available – to citizens, activists, companies, academics, and entrepreneurs. This is often referred to as the “open government” era, which thrives on government transparency, public accountability and citizen-centered services.
Consequently, the last 20 years have seen a transformation of public policies – legislative, regulatory, and administrative – grounded in the philosophy that access to and dissemination of government data is a public right and that any constraints on access hinder transparency and accountability. While there is broad recognition of the need to maximize access to government data, the types of government data are increasingly diverse and complex. For instance, there are many cases where the government collects or licenses private sector data, often combining this data with other data produced by the government. These datasets are often referred to as “hybrid data” or “privately curated data” – data licensed to or collected by the government that comprises both public and private sources. Access to and use of hybrid data is increasingly critical for government to transform data into actionable information.
Given the focus and challenges that come with this transformation, I offer some ideas that agencies and Congress should consider over the short and long term.
A more balanced open data policy should acknowledge and encourage government utilization of the most accurate and cost-effective data available, whether public, private, or a combination of the two. Technology mandates to release all datasets to the public domain fail to serve the broader public interest.
The Obama-era open government policies created a valuable framework for maximizing access to purely governmental data, but those policies need to be applied carefully and updated to reflect the importance of hybrid data and the growing trends in technology and government data use.
In its 2013 Open Data Policy, the Obama administration expressed a preference for data formats that are “nonproprietary, publicly available, and [with] no restrictions… placed upon their use.”
While a preference for openness is appropriate, effective policy should strive to avoid the unintended consequences of moving from preference to technology mandate when applied to hybrid data. Mandates that make public–private initiatives economically unfeasible serve nobody’s interests.
Policymakers should guard against statutory restrictions that limit new and innovative technological choices simply because they happen to be proprietary. Instead, a better aim for legislative and regulatory oversight would be to encourage the market and public entities to develop the most efficient and effective data solutions possible while maintaining appropriate openness.
So how do we achieve this balance?
The first concrete step toward answering this question should be a GAO study of executive agencies’ use of public, private, and hybrid datasets. This study should specifically evaluate current data quality, the tools needed to improve data quality (including curation), and potential improvements to the collection and reporting of government-award data.
Fortunately, Congress has already provided the foundation to help guide such an initiative. A GAO study on data quality should build on the work and recommendations of OMB Circular A-76, the Commission on Evidence-Based Policymaking, and the House Committee on Oversight and Government Reform’s report on the Foundations for Evidence-Based Policymaking Act of 2017 (H. Rept. 115-411) to ensure the proper roles of government- and private-sector innovations.
The Committee on Oversight and Government Reform’s report language on the OPEN Government Data Act is particularly instructive for what propositions a GAO study should test:
Open formats and open licenses are necessary components of a default of openness because they remove barriers to accessing and using the data. The presumptions expand upon, but do not alter existing openness requirements related to the treatment of any work of the United States Government under section 105 of title 17 or any other rights regimes.
The default is only that – a default. There are instances where it could be inappropriate for the government to impose open license requirements, such as for data that the government uses and maintains but does not own. For example, an agency might contract with a commercial data provider to obtain data that, if the agency attempted to collect on its own, the agency would need to spend significant time and resources verifying. This bill is not intended to prevent agencies from contracting with commercial data providers to obtain data under restricted terms, when such contract is in the public interest and is the most cost-effective way to meet the federal government’s needs.
Studying these propositions will provide the necessary tools to develop a balanced open data policy that achieves the widest possible ease of use and transparency for all forms of data.
The digital revolution is an opportunity to achieve maximum public transparency and to shine sunlight on previously opaque government programs and policies.
Legislation that translates the preference for openness into laws and regulations is a positive step, but policymakers should ensure these laws and regulations do not enact a technology mandate with unintended and harmful consequences. The value of data is dramatically enriched by the quality of the data being provided. A tech mandate that effectively shuns the use of curated, or hybrid, data will usher in an era of data that is less reliable, less useful, less desirable, and more expensive. We cannot deprive government of the information and tools it needs to best serve and protect the public. It is time for the government, nonprofit, and private sectors to come together and formulate a balanced approach to the special case of “hybrid data.”
Rich Beutel is founder of Cyrrus Analytics, a government relations firm, and director of the Procurement Roundtable. He is also on the executive committee of ACT-IAC, working to implement reforms in IT acquisitions and to accelerate the cloud-first mandate. Beutel served for over 10 years on Capitol Hill. Most recently, he was the lead acquisition and procurement policy counsel for Chairman Darrell Issa (R-Calif.) of the House Oversight and Government Reform Committee.