Managing diverse data sets


This week on Federal Tech Talk, Dan Graves, chief technology officer at Delphix, joined host John Gilroy to explain why Delphix is one of the best kept secrets in Silicon Valley.

Dan Graves, CTO, Delphix

Over the past few years, storage has gotten cheap and data sources have proliferated. We have advanced dramatically in the way we collect and store information, yet we face serious challenges when it comes to managing that diverse data set.

For example, a federal agency may have to pull data from an old IBM mainframe, a proprietary system, or even a hybrid cloud. Once it creates an amalgamated data set, it is subject to a classic “push-pull.”

Leaders know they have a responsibility to make sure that data meets federal compliance requirements. At the same time, the agency’s data scientists will demand access to every aspect of that data, as soon as possible.

During the interview, Graves suggested that a concept called “masking” can assure federal leaders that no personal information is released while allowing analysts to draw meaning from massive amounts of information.
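The broad idea behind masking can be sketched in a few lines. This is a minimal illustration, not Delphix’s implementation: the field names, the salt, and the deterministic-hash approach are all assumptions chosen to show how identifying fields can be replaced with consistent tokens while analytic fields stay intact.

```python
import hashlib

def mask_value(value: str, salt: str = "agency-secret") -> str:
    """Deterministically mask a sensitive value. The same input always
    yields the same token, so joins and counts still work on masked data."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest[:12]

record = {"name": "Jane Doe", "ssn": "123-45-6789", "visits": 14}
masked = {field: (mask_value(val) if field in {"name", "ssn"} else val)
          for field, val in record.items()}
# "visits" remains usable for analysis; the identifying fields are tokens
```

Because the masking is deterministic, an analyst can still group or join on the masked columns without ever seeing the underlying personal information.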

Another concept Graves explained was referential integrity. When a federal agency draws from a wide range of sources, it can be assured that all of the references in the combined data set remain valid.
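A referential integrity check amounts to verifying that every reference in one table resolves to a row in another. The sketch below is illustrative only; the table and key names are invented, and real tooling would run such checks across many tables at once.

```python
def find_orphans(parents, children, key):
    """Return child rows whose key does not resolve to any parent row,
    i.e., rows that violate referential integrity."""
    parent_keys = {row[key] for row in parents}
    return [row for row in children if row[key] not in parent_keys]

# Hypothetical merged data pulled from two different source systems
agencies = [{"agency_id": 1}, {"agency_id": 2}]
grants = [{"grant": "A", "agency_id": 1},
          {"grant": "B", "agency_id": 3}]  # agency 3 exists nowhere

orphans = find_orphans(agencies, grants, "agency_id")
```

Here the second grant record references an agency that does not exist in the merged set, which is exactly the kind of dangling reference that breaks downstream analysis.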

Federal Tech Talk

TUESDAYS at 1:00 P.M.

Host John Gilroy of The Oakmont Group speaks the language of federal CISOs, CIOs and CTOs, and gets into the specifics for government IT systems integrators. Follow John on Twitter. Subscribe on Apple Podcasts or PodcastOne.