Cyber Chat Host Sean Kelley sits down with former White House Fellow Jermon Bafaty and NASA Chief Data Officer Ron Thompson to discuss the state of data, data u...
Bafaty left the President’s Commission on White House Fellowships to serve as a senior advisor within the Department of Energy, where he stood up a public-private partnership with industry, academia and nonprofits. He applied AI and related technologies to help first responders minimize the impact of, and improve the response to, wildfires and other natural disasters. Bafaty is now the CEO of Platinum Technologies, a digital and professional services firm focused on serving federal, state and local customers.
“There was a time when we could say that storage was cheap because disk was cheap, but with the explosion of data, storage isn’t cheap anymore,” Bafaty said. “So just being able to classify the data appropriately and being able to decide what level of security or protection could be applied is a challenge within itself. Now that there are so many cloud-first strategies within the federal government, migrating that data into these multi-tenant environments and ensuring that you’ve got the right level of security and protection based on government compliance — and sometimes international compliance — are also challenges. In government, we were looking at implementing data governance strategies that reflect more of a threat assessment, the classification of the data and creating, in a sense, its own enterprise architecture approach to ensure that we’re giving it the due consideration that it needed.”
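Bafaty’s point about classification driving protection can be pictured with a small sketch. The classification labels, control names and Dataset structure below are illustrative assumptions for the example, not a federal standard; the idea is simply that a dataset’s classification determines the minimum controls it still needs before it moves into a shared, multi-tenant environment.

```python
from dataclasses import dataclass, field

# Illustrative classification levels and the minimum controls each implies.
# These names are assumptions for the sketch, not an official control catalog.
CONTROLS_BY_CLASSIFICATION = {
    "public":    {"integrity-checks"},
    "internal":  {"integrity-checks", "encryption-at-rest"},
    "sensitive": {"integrity-checks", "encryption-at-rest",
                  "encryption-in-transit", "access-logging"},
    "regulated": {"integrity-checks", "encryption-at-rest",
                  "encryption-in-transit", "access-logging",
                  "tenant-isolation", "compliance-review"},
}

@dataclass
class Dataset:
    name: str
    classification: str                      # one of the keys above
    applied_controls: set = field(default_factory=set)

def controls_missing_for_migration(ds: Dataset) -> set:
    """Controls that still must be applied before this dataset can move
    into a shared (multi-tenant) cloud environment."""
    required = CONTROLS_BY_CLASSIFICATION.get(ds.classification, set())
    return required - ds.applied_controls

# Hypothetical example dataset
wildfire_imagery = Dataset("wildfire_imagery", "sensitive", {"encryption-at-rest"})
print(controls_missing_for_migration(wildfire_imagery))
# -> {'integrity-checks', 'encryption-in-transit', 'access-logging'} (set order may vary)
```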
Thompson is very familiar with the challenges of using and protecting data.
“What I’ve seen, in my experience, is a lot of data is ‘purpose built;’ it never really was designed to be sharable. It never really was designed to be taggable and made searchable,” he said. “That’s the beauty within NASA: taking datasets that are purpose built and making them accessible. We deal with about 12 terabytes a day that we collect to study the Earth, and that’s projected in the next two or three years to increase to about 24 terabytes a day with observation data, and that’s just the Earth. … We’re not sure what purpose it’s going to be used [for] outside of us collecting and making it available. … We are taking what’s called the FAIR principles, and that is making sure our data is findable, accessible, interoperable and reusable. These principles are the foundation of how we’re collecting our data in the future and going back in time and making sure we can access datasets. We have a really interesting story from the COVID pandemic: we are collaborating across federal agencies to link into authoritative data sources to make decisions on when to reopen our offices. When is it safe for our people to come back to a physical workplace? Or some of these roles may never actually go back into the workplace. But we want to make sure our leaders in the agency have these data, these authoritative data sources that are accurate, that are based on methods of understanding that everyone is using, and make sure it is available for them to make their decisions.”
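Thompson’s description of taking “purpose built” data and making it findable and reusable maps onto the FAIR principles he cites. As a rough illustration only, and not NASA’s actual tooling or schema, a minimal metadata record for a dataset might capture one attribute for each of the four principles:

```python
from dataclasses import dataclass

@dataclass
class FairRecord:
    """A minimal, hypothetical metadata record; field names are illustrative,
    not a NASA or federal schema."""
    identifier: str     # Findable: a persistent, searchable ID
    keywords: list      # Findable: tags that make the dataset discoverable
    access_url: str     # Accessible: where and how the data is retrieved
    data_format: str    # Interoperable: a standard, documented format
    license: str        # Reusable: terms describing how it can be reused
    provenance: str     # Reusable: how and why the data was collected

# A made-up Earth observation record, just to show the shape of the tagging.
earth_obs = FairRecord(
    identifier="doi:10.0000/example-earth-obs",          # placeholder DOI
    keywords=["earth observation", "climate"],
    access_url="https://data.example.gov/earth-obs",
    data_format="NetCDF",
    license="public-domain",
    provenance="Collected for Earth science; published for open reuse.",
)
```

Tagging every dataset along these lines is what lets data collected for one mission be found and reused later for a purpose no one anticipated.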
In every position Thompson has taken on, he has sharpened his understanding of the need for good data, and he continues to put a great deal of focus on the customer. His early experience in the Army and his time as a government executive prepared him to find sources of truth, identify where the data sources are, articulate standards that everyone can understand and gravitate to those essential sources.
“We’re making great progress across the Chief Data Officers Council, across the federal workplace, and working very closely together,” Thompson said. “We have multiple agencies in the same geographic area, and some share the same building space. So a lot of these data are collected for a specific purpose and an agency view. So we are really looking at how we can share that knowledge across multiple agencies.”
Bafaty said the concept of cybersecurity is evolving to include protecting data in motion.
“The fact that data may have been incredibly accessible at a particular point, and when you’re migrating that data into another infrastructure, whether it be cloud or for other use, there is an information assurance component that needs to be injected into that process. And then any relevant cyber controls based on what the intended new use is for that repurposed data,” Bafaty said. “A lot of [securing data] to me is around the models and the architectures that are going to be associated with what the new reason is this data is going to be used for and addressing that up front. There are many tools that are out there these days, and a lot of them are very good. But it’s really around the process and ensuring you’ve got the corporate or organizational buy-in for what it’s going to take to segment and protect data at various times. The last component is making sure that you’re not just too heavy-handed with the protection. Because the more monolithic you get, the more expensive that gets. From a taxpayer perspective, if we can avoid that, that’s always a good thing. From an industry perspective, you are trying to give your clients the best use for their dollar. So again, I default back to the actual design of the systems that are going to be needed to be put in place that make access to this data future-proof, and also agile in terms of what the customer or citizen’s needs are going to be.”
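Bafaty’s caution against being “too heavy-handed” can be read as a right-sizing question: protecting everything at the highest level is simpler, but it costs more than matching protection to what each dataset actually requires. The figures below are made up purely to make that trade-off concrete:

```python
# Made-up relative cost (per dataset) of each protection tier; purely illustrative.
TIER_COST = {"baseline": 1, "enhanced": 4, "maximum": 10}

# A hypothetical portfolio: how many datasets genuinely need each tier.
portfolio = {"baseline": 400, "enhanced": 80, "maximum": 20}

# Heavy-handed approach: every dataset gets the maximum tier.
monolithic = sum(portfolio.values()) * TIER_COST["maximum"]

# Right-sized approach: each dataset gets only the tier its classification requires.
right_sized = sum(count * TIER_COST[tier] for tier, count in portfolio.items())

print(monolithic, right_sized)   # 5000 vs. 920 in this toy example
```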
Part two will include more thoughts from Thompson and Bafaty on data security today and tomorrow.