Thanks to advances in information technology, data about everything has become an abundant commodity. But it’s not always a highly available one. That’s got to change, because recent statute and policy obligate federal agencies to use data in decision making about programs, operations and budgets.
The key for agency decision makers wanting to create data-driven organizations and programs: closing the gap between data and actual insight.
We discussed this challenge in an interview with two experts from data integration and analytics company Qlik: Joe DosSantos, the company’s chief data officer, and Andrew Churchill, its vice president for Federal.
They said a key element in a successful data strategy is upgrading what they called the data literacy of the staff. Data officers will be more successful when program managers and analysts understand the data they have and how they can employ it analytically.
DosSantos said that data literacy starts upstream of the data itself, by forcing people to agree on definitions of program components. He cited one engagement in which a bank had 50 definitions of the term “average daily balance.”
“In order for us to really make good decisions, we need to start with the idea of understanding what our data means and what it’s telling us,” DosSantos said, “and to have a common understanding of the metrics that we use as an enterprise.”
On the technology side, Churchill said, savvy agencies are working to build a data operations process, an efficient means of continuously using data to get to desired results.
“And if that process incorporates calling out to a script that executes practically anywhere … I think that’s exactly what we’re looking at; the idea here is getting faster delivery of the result.”
DataOps, and more generally the efficient use of data as fuel for analytics and insight, requires a new architectural approach to the infrastructure, DosSantos and Churchill said. The notion of large data warehouses or data lakes has given way to indexed cloud storage of authoritative data sets. These are connected, when required, by what they called “data pipelines” that bring the necessary data to a given operation.
“A modern data pipeline starts off with the principle of us having trust and speed of access,” DosSantos said. It requires just-in-time streaming of data to where it’s needed, subject to the rules embedded in the data governance strategy.
We need to get a handle on where our data is, so I don’t need to have it physically stored someplace, necessarily; I just need to know where it is. And then if I know where it is, on demand I should be able to rapidly create pipelines that give me access to the data right now. Cloud-based architectures are really driving this kind of change.
Joe DosSantos
Chief Data Officer, Qlik
Modern Data Architectures
Dashboards are going to be the least important thing that even our analytics tier supports. Analytics will become something almost omnipresent across all of the different types of business systems and capabilities that we interact with.
Andrew Churchill
Vice President, Federal, Qlik