Sponsored by Tanium

IoT-driven endpoint proliferation requires secure-by-design principles

The government needs to push the market in the direction of secure-by-design, including manufacturing, fielding and management of these new endpoints.

Federal Monthly Insights - Improving Cybersecurity Through Autonomous Endpoint Management - 10/08/2024

As everything from baby monitors to fire detection systems to transportation infrastructure becomes connected to the internet, the number of endpoints that federal agencies, contractors and critical infrastructure organizations need to secure is proliferating at an incredible rate. This new, diverse ecosystem of connected devices and systems is prompting conversations about new ways to secure them against bad actors.

“The federal government needs to do a much better job of putting its requirements together from a security standpoint in bulk so that we can help move the market in a way that is going to produce results that are fresh, fair, effective, efficient and secure, not only for the taxpayers, but it’s going to impact the market in a way that’s going to benefit consumers everywhere,” said Greg Touhill, director of the CERT Division at the Software Engineering Institute and former federal chief information security officer, on Improving Cybersecurity Through Autonomous Endpoint Management.

The government needs to push the market in the direction of secure-by-design, Touhill said. That includes the manufacturing, fielding and management of these endpoints. It means specific cybersecurity requirements need to be built in from the start, and trained personnel are needed to verify the effectiveness of those cybersecurity measures.

And that covers more than just software, Touhill said. It also includes the platforms and hardware the software will run on, and it needs to take the user into account as well.

“What’s the user going to do, what’s the look and feel? How easy is it to install, configure it and operate?” Touhill said. “We’ve got a lot of products to hit the market that require extensive training and education programs and certification programs just to take it out of the wrapper and properly install and configure it. We can do better as a community. And that secure-by-design philosophy extends to all aspects of the system: hardware, software and wetware.”

The way forward

Touhill said legal attestations certifying that a vendor used secure-by-design principles, DevSecOps methodologies and similar best practices would be a start. In the longer view, artificial intelligence is likely the direction that efforts are heading.

Touhill said research is ongoing into “adversarial AI [and] adversarial machine learning,” which will be required to sort through the massive amounts of data these diverse new endpoints provide. In fact, he said Carnegie Mellon University, where the Software Engineering Institute is housed, is currently piloting some basic capabilities along those lines.

“With the volume of data and the velocity at which it is traversing all these different networks, some of the automated capabilities that AI systems are giving us give us greater insights, help us do the valuation, help us prioritize, but even more important, help us maintain the security of the different systems that we are working to make sure are maintained and properly configured and operated,” Touhill said. “And we do it now at increased levels of scale and accuracy.”

The goal, Touhill said, is for these AI systems to help find the unexpected within all the added data being generated by these new endpoints, because anything unexpected is a potential indicator of breach. And AI will do it while requiring less manpower than current methods, lowering the barrier to entry for small and medium organizations, requiring less training for current employees and making them more effective.
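As a rough illustration of the kind of automated triage Touhill describes, the sketch below uses an unsupervised model to flag unexpected behavior in endpoint telemetry so human analysts only review the outliers. It is a minimal example under stated assumptions, not any specific agency or vendor tooling; the simulated feature set, contamination rate and choice of scikit-learn's IsolationForest are all illustrative.

```python
# Minimal sketch: unsupervised anomaly detection over endpoint telemetry.
# The features, data and model choice (IsolationForest) are illustrative
# assumptions, not any specific federal or vendor implementation.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated telemetry per endpoint: [connections/hour, MB sent out, failed logins]
normal = rng.normal(loc=[50, 20, 1], scale=[10, 5, 1], size=(1000, 3))

# A handful of endpoints behaving unexpectedly (e.g., heavy outbound traffic,
# repeated failed logins) mixed into the population.
unexpected = rng.normal(loc=[300, 500, 40], scale=[30, 50, 5], size=(5, 3))

telemetry = np.vstack([normal, unexpected])

# Fit an isolation forest; "contamination" is the assumed fraction of anomalies.
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(telemetry)  # -1 = anomalous, 1 = expected

flagged = np.where(labels == -1)[0]
print(f"Flagged {len(flagged)} endpoints for analyst review: {flagged}")
```

The design point this is meant to show is scale: the model scores every endpoint automatically, and people are only pulled in for the small set that looks unexpected.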

The upside of more data

That massive proliferation of data due to an increased number of endpoints isn’t just a challenge — it’s an opportunity. Touhill said it’s increasing the amount of human knowledge exponentially every day, breaking down barriers to communication, cultural exchange and more. But that doesn’t mean all data is equally valuable; different data has different value for different people and organizations.

“Being able to understand the type of data that you create, that you own, that you are the custodian of, that’s going to be one of the acmes of skill over the next decade to figure out really what is that value of data,” Touhill said. “I suspect that by the end of this decade, as part of the generally accepted accounting principles, … I think at the end of the fiscal year, most companies by the end of this decade are going to put the value of data in their asset sheet.”

