Agencies have understood the importance of data to every federal mission area for years. But the volume, veracity and variety of that data make it challenging for every mission area to gain the insights necessary to drive decisions.
It’s clear that agencies still need to break down data silos to make better data-driven decisions. This will lead to improved efficiency and better cost effectiveness for each decision and for each mission area.
Sudhir Hasbe, the chief product officer for Neo4j, said agencies must unify and mobilize complex data to improve mission-critical decision-making.
“The first step is break the silos and get the data. The second step is can you make sense of that data or relationships? And the third thing, I think, is the democratization of data and insights within the agency. Do you have access to the right insight at the right place for every individual who’s making the decision? I think there is a lot more work we can do in that regard,” Hasbe said on the Innovation in Government show, sponsored by Carahsoft. “This is where newer technologies can help a lot. There is the whole wave of generative artificial intelligence technologies coming in, which can give you a natural language interface to your agency’s data. And if you had all the data mapped, let’s say in a graph, leveraging generative AI technology to use natural language processing on that and ask questions and get answers, will just democratize access to those insights and will actually help everybody within all the agencies and within all the programs.”
Hasbe said a good example of this is the Department of Veterans Affairs, which is using graph technology to track assets, people and services, and to address its records processing backlog.
“In that kind of an environment, there may be more opportunities to get efficiency by bringing all of this data assets into systems like graphs and get better efficiency across the whole agency,” he said. “We see different use cases using technologies like graphs to map out the physical world into a software world and do more of these analyses. We are seeing various agencies becoming better at not just collecting data, but also understanding it.”
Graphing technology, as Hasbe described it, takes stored data and visualizes it to reveal the relationships between key entities and their associated data sets.
It can be people-focused, product-focused, or it can combine several different data points to show relationships across all of them.
“Another example is the supply chain. If you think about supply chain, it is billions of products and each product may be related to another product. And that relationship between products, in a simpler way, is what graph technology does,” he said. “Another great example is bill of material. If you look at Lockheed Martin, they manufacture for different missions and they have massive bill of material. If you need to build a rocket, imagine how many millions and millions of parts are required, so all the parts are linked to each other. What happens when a specific part is missing in the whole process and you need to procure it? What will be the delay for the whole program? You will basically be able to analyze that. Graphs are a mechanism of storing and analyzing information in form of entities and their relationships, and then dependencies across them.”
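The bill-of-materials analysis Hasbe describes can be sketched in plain Python: model parts as graph nodes and "required by" links as edges, then traverse the graph to find every assembly affected when one part is missing. The part names here are hypothetical, and a production system would store the graph in a database such as Neo4j rather than in-memory dicts.

```python
from collections import deque

# Hypothetical bill of materials: each part maps to the assemblies
# that directly depend on it ("required by" edges in the graph).
required_by = {
    "bolt": ["bracket"],
    "bracket": ["engine_mount"],
    "engine_mount": ["engine"],
    "engine": ["rocket"],
    "fuel_pump": ["engine"],
}

def affected_assemblies(missing_part):
    """Breadth-first traversal: every assembly that transitively
    depends on the missing part, and is therefore delayed."""
    affected, queue = set(), deque([missing_part])
    while queue:
        part = queue.popleft()
        for parent in required_by.get(part, []):
            if parent not in affected:
                affected.add(parent)
                queue.append(parent)
    return affected
```

Asking `affected_assemblies("bolt")` walks the dependency chain upward, showing that a missing bolt ultimately delays the bracket, the engine mount, the engine and the rocket itself; this is the "what will be the delay for the whole program" question answered as a graph traversal.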
The need to break down data silos is not a new challenge. While agencies have improved their data sharing, Hasbe said, with the increased use of connected and internet-of-things devices, as well as an ever-increasing volume of data from more traditional sources, it's easy for silos to stand in the way of decision making.
He said it’s a never-ending challenge for agencies, or any organization, to figure out what data is most important, and make it available to the right people at the right time to spur action.
“We may be combining the data into a single platform and we may be making it accessible and shareable with each other. But do you really understand the relationships between the different types of data that you have? Can you understand, for example, the cyber threat? Or, if you’re looking at cyber intelligence and cyber threat, there is data that’s coming from the web assets and you need to understand where threats are coming there. You need to understand how the networks are designed and how different users are using the platform,” he said. “It’s not just about how you can break the silo and put it into a single system, but can you build, for example, a graph on top of it so you understand the entities and their relationships? It’s not just having more data and breaking down the silos, but also understanding the relationships between these data assets is critical for all the agencies.”
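The cyber-intelligence case Hasbe raises can also be sketched as a graph: entities such as users, servers and web assets become nodes, and typed relationships become edges, so a threat against one entity can be traced to everything it touches. The entity names and relationship types below are invented for illustration; a real deployment would query a graph database rather than a Python list.

```python
# Hypothetical cyber-intelligence graph: typed edges link web assets,
# hosts and users, so a threat on one entity can be traced to the
# people and systems it can reach.
edges = [
    ("login_portal", "HOSTED_ON", "web_server_1"),
    ("web_server_1", "CONNECTS_TO", "db_server"),
    ("alice", "USES", "login_portal"),
    ("bob", "USES", "reporting_app"),
    ("reporting_app", "HOSTED_ON", "db_server"),
]

def neighbors(entity):
    """Entities one relationship away, in either direction."""
    out = {dst for src, _, dst in edges if src == entity}
    out |= {src for src, _, dst in edges if dst == entity}
    return out

def exposure(threatened):
    """Everything reachable from a threatened entity through any
    chain of relationships (depth-first traversal)."""
    seen, stack = set(), [threatened]
    while stack:
        node = stack.pop()
        for nxt in neighbors(node):
            if nxt not in seen and nxt != threatened:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

Here `exposure("login_portal")` follows the relationship chains outward, surfacing not just the server hosting the portal but the downstream database, the second application on it, and both users; understanding those relationships, not just holding the raw data, is the point Hasbe makes.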
Hasbe said breaking down data silos and applying graphing technology and generative AI tools together creates a powerful set of capabilities for agencies as they strive to make better and faster decisions.
But, he said, agencies must be sure the answers they are getting from generative AI are accurate and valuable.
“The most important thing I always suggest is to start small with a few use cases and implement it. Then, learn from it and figure out how you want to do it,” he said. “Once you have learned with few smaller use cases, scaling across the agency is possible and more doable. I would take one program, figure out what you learned from that and then expand it across for the whole agency.”
Listen to the full show:
Copyright © 2024 Federal News Network. All rights reserved. This website is not intended for users located within the European Economic Area.
Chief Product Officer, Neo4j
Sudhir Hasbe is Neo4j’s Chief Product Officer. Sudhir previously led product management for Google Cloud’s Data Analytics Platform which includes industry-leading products like BigQuery, Looker, Dataproc, Dataflow, Pub/Sub, Composer, Data Fusion, and Dataplex. Under his leadership, BigQuery grew to be one of the largest analytics platforms with tens of thousands of customers, 110TB of data being processed every second, hundreds of customers with petabyte-scale datasets, and powering more than 700 ISV offerings. Hasbe also led acquisitions of Looker, Dataform, Cask, and CompilerWorks to enhance Google Cloud’s Data Analytics offering.
Sudhir was also an executive sponsor of several of Google Cloud’s marquee enterprise customers and ecosystem partners. Prior to Google, Hasbe led software engineering at Zulily, transforming it into a state-of-the-art data-driven organization. Prior to Zulily, Sudhir spent seven years at Microsoft, where he led product management for Xbox entertainment services, Azure Data Marketplace, SQL Azure, and BizTalk Server. He is based in Seattle, WA.
Executive Editor, Federal News Network
Jason Miller has been executive editor of Federal News Network since 2008. Jason directs the news coverage on all federal issues. He has also produced several news series – among them on whistleblower retaliation at the SBA, the overall impact of President Obama’s first term, cross-agency priority goals, shared services and procurement reform.