National Secure Data Service comes into focus through semiconductor bill

With its National Secure Data Service pilot, the National Science Foundation is pushing Evidence Act efforts forward. Ultimately, NSDS aims to create temporary data linkages between federal data sets to support evidence-building activities.

The National Science Foundation kicked off a pilot to test a long-awaited service expected to streamline the sharing of federal data both within and outside of government, now that a major computer chip spending bill is law.

In early August, President Joe Biden signed the Creating Helpful Incentives to Produce Semiconductors (CHIPS) and Science Act into law. The $280 billion package is meant to boost the domestic semiconductor industry and promote scientific research.

The law requires NSF to create a National Secure Data Service (NSDS) demonstration project that would set a standard for data linkages and data access across the federal government.

Two years into this demonstration project, NSF will report to Congress and determine whether the NSDS concept is working and whether it should be continued or expanded.

The law sets aside $9 million for each fiscal year from 2023 through 2027 to establish the demonstration project.

Building on the Evidence Act

The Commission on Evidence-Based Policymaking, which laid the groundwork for the 2019 Foundations for Evidence-Based Policymaking Act (Evidence Act), recommended creating NSDS more than five years ago.

Nearly half of the 22 recommendations in the commission’s 2017 final report to Congress and the president reference the data-sharing service as a means to facilitate data access for evidence-building and statistical activities.

Former U.S. Chief Statistician Nancy Potok, also a former commissioner of the Commission on Evidence-Based Policymaking, said NSDS would unlock the greater use of federal data envisioned under the Evidence Act.

“Right now, we have a very splintered federal data infrastructure. Every agency kind of does their own thing, and within each agency, the bureaus often are not coordinating data,” Potok said.

Creating temporary data-sharing capabilities

The commission, she said, envisioned NSDS as a place to pull together data from different sources across agencies, which does not routinely happen in the federal government right now.

“Linking these data sets is really critical. But you can’t just, willy-nilly, set up a giant data warehouse and throw all federal data in there. There’s privacy issues, there’s security issues, there’s ethical issues,” Potok said. “The National Secure Data Service is a place where you can selectively — with an ethical, secure privacy-protected framework — come up with the most important questions of the day that you would need to answer from a public policy standpoint to understand: Are programs working effectively? Are they really helping the public?”

NSDS will enable temporary data linkages between federal data sets to support evidence-building activities, rather than create a massive federal data warehouse or data lake.
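
For illustration only, here is a minimal sketch of what a temporary, privacy-protected linkage between two agency data sets might look like in code. The file names, columns and hashing approach are assumptions made for this example, not details of the NSDS design.

```python
# Illustrative sketch only: temporarily link two hypothetical agency extracts
# on a hashed identifier, compute an aggregate, then discard the linked data.
# All file names, column names and the hashing scheme are invented for this example.
import hashlib
import pandas as pd

def hash_id(raw_id: str, salt: str = "per-project-secret") -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((salt + raw_id).encode()).hexdigest()

# Hypothetical extracts from two different agencies
benefits = pd.read_csv("agency_a_benefits.csv")   # columns: person_id, benefit_amount
earnings = pd.read_csv("agency_b_earnings.csv")   # columns: person_id, annual_earnings

# Drop the raw identifier and keep only a hashed linkage key
for df in (benefits, earnings):
    df["link_key"] = df.pop("person_id").astype(str).map(hash_id)

# Temporary linkage: the joined records exist only for this analysis
linked = benefits.merge(earnings, on="link_key")

# Produce the aggregate evidence the project needs: average benefit by earnings band
bands = pd.cut(linked["annual_earnings"], bins=5)
summary = linked.groupby(bands, observed=True)["benefit_amount"].mean()
print(summary)

# ...then discard the linked microdata rather than retaining it in a warehouse
del linked, benefits, earnings
```

The point of the sketch is the workflow, not the tooling: direct identifiers are replaced before linkage, the joined records exist only for the duration of the analysis, and only the aggregate result is kept.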

“Right now, the federal government is very good at shoveling money out the door, and I think we saw this during the pandemic,” Potok pointed out. “Congress rightfully said, ‘But how do we know we’re achieving what we want to achieve? How do we know we’re actually changing anything? If we’re going to put all this money out there, for these programs, we need some evidence that it’s working.’ And if it’s not working, what does work?”

Corinna Turbes, policy director for the Data Foundation, said data collected by one agency or one level of government is often needed by another to analyze and understand policies that span subject areas.

“The current situation we’re in is not one that was intentional or planned, but it evolved over many years, due to the sort of gradual buildup of legal authorities that dictate federal data access, use and protection,” Turbes said during an August briefing to reporters about the CHIPS legislation. “This means that we can’t get the data we need to inform decisions that we want to make on a policy level, at least not without significant delays, without questions of quality as well as timeliness. It’s this sort of complex problem that the National Secure Data Service hopes to address.”

Using data access to understand government performance

NSDS would give federal researchers and program administrators, as well as outside researchers, secure access to data to better understand where programs are working and where they’re falling short of their goals.

It would also help unlock the value of federal data, as outlined in the Federal Data Strategy that Potok, while serving as chief U.S. statistician, co-led in coordination with then-Federal Chief Information Officer Suzette Kent.

“One of the things that was very important as part of that strategy was to really take advantage of the wealth of data that the federal government has been collecting, both through the federal statistical system and in some of the things that people are most familiar with, like the Census, like education surveys, like health surveys,” Potok said.

Administrative records, data collected by agencies from people participating in federal programs, are a less utilized source of data for evidence-based policymaking.

Potok, a former deputy director of the Census Bureau, said the bureau already makes effective use of the data it collects from the public and other agencies “to put together important information about people in the economy.”

Ensuring privacy as federal data sharing expands

The Advisory Committee on Data for Evidence Building (ACDEB), which advises the Office of Management and Budget on ways to facilitate data sharing while also keeping sensitive data private, also recommended the creation of NSDS in a report from October 2021.

Data Foundation President Nick Hart, an ACDEB member, said NSDS will help agencies learn about disparities that exist across their benefits and services so that they can take steps to ensure underserved communities can more easily access federal programs.

Hart said such a data sharing service isn’t meant to become a new statistical agency nor is it “intended to be a panacea” for all the data sharing challenges that persist across the federal government. But the creation of NSDS would demonstrate progress in implementing the Evidence Act, he said.

“We have a fantastic federal statistical system. We have a growing evaluation infrastructure. We have an incredible cohort of chief data officers, chief information officers, and that ecosystem is continuing to evolve. The National Secure Data Service is intended to supplement on top of that as an additional resource,” Hart said. “The intent here is a new capability that can be scaled over time to solve real-world problems, will build better evidence and even supplement the resources available for the existing statistical system and the evidence-building community. We know there are major gaps that exist today. The Evidence Act created a foundation, a very strong foundation that we are continuing to build upon. NSDS is going to be a major resource.”

Defining the organizational structure of a new federal data-sharing organization

The commission originally envisioned setting up the service within the Commerce Department. The CHIPS and Science Act, however, hands this work off to NSF, which already includes the National Center for Science and Engineering Statistics, one of 13 designated federal statistical agencies.

Potok co-authored a Data Foundation report in June 2022 outlining a blueprint for NSDS and how it should operate on a day-to-day basis once fully mature.

The report outlines several potential governance models for the service, weighing tradeoffs between setting NSDS up as a government-owned, government-operated entity and as a government-owned, contractor-operated organization, the model many national labs use.

Potok said having a government-owned, contractor-operated NSDS would help navigate some of the hiring and recruiting challenges that are persistent across federal agencies.

“The federal government is having a very difficult time recruiting people, particularly recruiting people who are doing work on the cutting edge of data science, computer science and technology. You want a facility like this to really have the most advanced privacy protections, to really understand data science and working with data, linking data sets,” Potok said. “We thought if we want it to be agile, we want something that’s going to be able to keep advancing. Putting it as a federal agency under the civil service, when we can see that agencies are really struggling in this area, is probably not the best idea. So if it can be contracted out, contractors can be more competitive on their salaries. They’re faster to hire. They can bring in academics. They can build the kinds of partnerships more rapidly than just a straight federal agency could do.”

