Agencies are on the right track when it comes to their open data initiatives, according to a former technology leader at the Homeland Security Department.
Dan Katz, vice president of technology solutions at INADEV and a former web initiatives official at DHS, says that 10 years ago the release of GPS and climate data enabled a wave of innovation that was inconceivable at the time.
“I really believe with the amount of data that’s coming out now, and especially the process around it and some of the governance around that data, we’re going to kind of have this second wave, this renaissance, of really innovative, really high-value solutions,” he says.
But in order to make that second push forward, agencies have to get on board, he says.
It was a year ago that President Barack Obama launched the Open Data Policy along with an executive order mandating that government data be as accessible and machine-readable as possible. At the same time, the administration created Project Open Data, a free online platform giving agency developers a place to share resources.
Currently, Katz says, there’s wide variance among agencies in what kind of data they release to the public and how comfortable they are releasing it. “That’s a very human apprehension,” he says. “With people that are still concerned about that, that concern is still going to be there.”
While agencies have to walk a fine line to keep data channels secure, Katz says the advantages are substantial.
Open and accessible data can be used not only by the general public, he says, but also internally among agencies, and even as a channel for receiving data from users.
As an example of a successful user-to-agency open data channel, Katz cites an app that attaches to an asthma inhaler. The app pulls in data from the National Oceanic and Atmospheric Administration and sends data out to the Centers for Medicare and Medicaid Services and the Health and Human Services Department. The app lets CMS and HHS learn in which regions people are having respiratory problems, and in turn gives users information about problem areas to avoid and environmental triggers to be aware of.
But user-generated data flowing to agencies isn’t free of problems. Katz notes that verifying the legitimacy of incoming data is both “a technology question and a governance question.” Everything coming into agencies, he says, needs to be appropriately monitored and reviewed before use.
Treating data as an asset is something the administration is pushing hard for, a year after the president’s action plan was issued. A May 9 memorandum outlined further tools available to agencies as they work toward making released data usable.
The memo sets a November deadline for agencies to put in place a stronger, reformed data management approach that meets the administration’s goals.
Katz underlines that releasing data isn’t the same as using open data to its fullest extent. It’s not just about publishing data sets, he says; it’s about understanding the evolving nature of open data. “This is our collective experience,” he says.
A year after the administration’s big push on the open data initiative, Katz says agencies still have a lot of questions. New technologies keep emerging, and agencies may be weighing the best time to jump into the proverbial data river, and the best way to share the data they have.
Katz says the way to move forward is to start now.
“Technology is going to continue to improve, but I don’t think that it’s a reason to wait to start, or improve these initiatives.”
To continue building understanding and implementation of open data, ACT-IAC is hosting a series of executive panels on how agencies can prepare to implement the DATA Act. The second panel, on July 9, focuses on recent DHS efforts to standardize accounting codes.