The government needs to push adoption of secure by design principles for manufacturing, fielding and managing endpoints, SEI's Greg Touhill says.
As everything from baby monitors to fire detection systems and transportation infrastructure becomes connected to the internet, the endpoints that federal agencies, contractors and critical infrastructure organizations must secure continue to proliferate at an incredible rate. This diverse ecosystem of connected devices and systems is prompting conversations about new ways to secure them against bad actors.
“The federal government needs to do a much better job of putting its requirements together, from a security standpoint, in bulk,” said Greg Touhill, director of the CERT Division at the Software Engineering Institute and former federal chief information security officer, “so that we can help move the market in a way that is going to produce results that are fresh, fair, effective, efficient and secure, not only for the taxpayers, but it’s going to impact the market in a way that’s going to benefit consumers everywhere.”
The government needs to push the market in the direction of secure by design, Touhill said on Federal Insights Monthly — Improving Cybersecurity Through Autonomous Endpoint Management. That includes manufacturing, fielding and managing endpoints. Plus, it requires building in cybersecurity from the start and training personnel to verify the effectiveness of those cybersecurity measures.
And that includes more than software, Touhill said. It also includes the platforms and hardware the software runs on — and needs to take users into account as well.
“What’s the user going to do, what’s the look and feel? How easy is it to install, configure and operate?” he said. “We’ve got a lot of products that hit the market that require extensive training and education programs and certification programs just to take it out of the wrapper and properly install and configure it. We can do better as a community.”
The secure by design philosophy needs to extend to all aspects of the system — hardware, software and wetware, the human element — Touhill said.
Legal attestations certifying that a vendor used secure-by-design principles, DevSecOps methodologies and similar best practices would be a start. In the longer view, artificial intelligence is likely the direction that efforts are heading, Touhill said.
Touhill said research is ongoing into adversarial AI and machine learning, which will be required to sort through the massive amounts of data that these diverse new endpoints provide. In fact, he said Carnegie Mellon University, where the Software Engineering Institute is housed, is already piloting some basic capabilities along those lines.
“With the volume of data and the velocity in which it is traversing all these different networks, some of the automated capabilities that AI systems are giving us gives us greater insights, helps us do the valuation, helps us prioritize, but even more important helps us maintain the security of the different systems that we are working to make sure are maintained and properly configured and operated,” Touhill said. “And we do it now at increased levels of scale and accuracy.”
The goal is for these AI systems to help find the unexpected within all the added data being generated by these new endpoints, because anything unexpected is a potential indicator of a breach, he said. AI will require fewer personnel than current methods, which will lower the barrier to entry for small and medium-sized organizations, reduce training needs for those organizations’ employees and make them more effective, Touhill added.
That massive proliferation of data due to an increased number of endpoints isn’t just a challenge — it’s an opportunity. Touhill said it’s increasing the amount of human knowledge exponentially every day, while also breaking down barriers to communication, cultural exchange and more. But that doesn’t mean all data is equally valuable; different data has different value for different people and organizations.
“Being able to understand the type of data that you create, that you own, that you are the custodian of, that’s going to be one of the acmes of skill over the next decade — to figure out really what is that value of data,” Touhill said. “I suspect that by the end of this decade, as part of the generally accepted accounting principles, … most companies by the end of this decade are going to put the value of data in their asset sheet.”