Insight by Red Hat

Managing the unique security considerations of edge computing

One of the best ways to minimize a data breach is to reduce the amount of data sent back over potentially unsecured networks, like wireless. That requires a rebalancing of the three main components of edge computing: storage, network and compute.

Edge computing can provide unique benefits to agencies, but as with any new technology, it also brings unique challenges. Edge refers to a spectrum of capabilities stretching from the data center out to managed endpoints, often in the form of internet of things sensors. The farther from the data center you go, the more centralized access and control diminish. Meanwhile, the access outsiders and potential bad actors can attain varies inversely: the weaker the central control, the greater the opportunity for intrusion.

One of the best ways to manage that risk is to reduce the amount of data sent back over potentially unsecured networks, like wireless. That requires a rebalancing of the three main components of edge computing: storage, network and compute. If you can increase the amount of storage and compute available at the edge, then less data needs to be sent back over the network. The result is both cost savings – network is currently the most expensive of the three components – and increased security.

For example, consider a drone collecting signals intelligence in a contested environment.

“The traditional way of managing this would be that you have a sensor on the drone and it’s collecting data, and then that data gets streamed over a network to some processing system,” said Christopher Yates, chief architect for Army at Red Hat. “But if the processing is done on the drone and then results are pushed out so your network traffic is reduced, that gives you a couple of different advantages. Now instead of transmitting all of the data that the drone is collecting, the data that it’s transmitting is only the decisions that it’s made or the results that it’s found.”
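As a rough illustration of that tradeoff, the sketch below contrasts streaming every raw frame with processing on the device and transmitting only detection results. The frame sizes and the detect() logic are hypothetical placeholders, not a real drone API; the point is simply the difference in bytes that cross the network.

```python
# Hypothetical illustration of the bandwidth tradeoff: stream every raw
# frame, or process on the device and transmit only detection results.
# Frame sizes and the detect() logic are made-up placeholders.

RAW_FRAME_BYTES = 1_000_000   # e.g., one raw sensor frame
RESULT_BYTES = 200            # e.g., a small detection record

def detect(frame: bytes):
    """Stand-in for onboard signal processing; returns a result or None."""
    return {"type": "emitter", "bearing_deg": 42} if frame else None

def stream_raw(frames) -> int:
    """Traditional approach: every raw frame crosses the network."""
    return sum(len(f) for f in frames)

def stream_results(frames) -> int:
    """Edge approach: only the results cross the network."""
    return sum(RESULT_BYTES for f in frames if detect(f) is not None)

frames = [b"\x00" * RAW_FRAME_BYTES for _ in range(20)]
print(f"raw stream:   {stream_raw(frames):>11,} bytes")     # 20,000,000
print(f"results only: {stream_results(frames):>11,} bytes")  # 4,000
```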

The key here, Yates said, is encryption. Encryption generally isn’t meant to be unbreakable; it’s meant to be complex enough that by the time an adversary breaks it, the data is no longer valuable. For example, around 2000, Yates said, financial transactions on the internet were considered much safer because, with the computing power available at the time, a transaction would take 1,000 years to decrypt – at which point the credit card would be long expired. But with the progression of computing capabilities, just 20 years later that same transaction would take at most a few hours to decrypt.
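A back-of-envelope version of that arithmetic, assuming attacker compute roughly doubles every year (a deliberately aggressive rate that folds in parallelism and algorithmic advances, and an assumption made only for this sketch), lands in the same ballpark:

```python
# Rough sketch of how compute growth erodes an encryption safety margin.
# The doubling period is an assumption, not a measured figure.

HOURS_PER_YEAR = 8_766          # average, including leap years
DOUBLING_PERIOD_YEARS = 1.0     # assumed rate of attacker compute growth

def time_to_break(initial_years: float, elapsed_years: float) -> float:
    """Hours to brute-force after `elapsed_years` of compute growth."""
    speedup = 2 ** (elapsed_years / DOUBLING_PERIOD_YEARS)
    return initial_years * HOURS_PER_YEAR / speedup

# Circa-2000 estimate: 1,000 years to break. Twenty years on:
print(f"{time_to_break(1_000, 20):.1f} hours")   # ~8.4 hours
```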

So the strategy is to expose as little data as possible to reduce the attack surface. The more data adversaries can intercept, the larger the body they have to work with in order to decrypt it. By sending only the results back over the network, you limit what your adversary has to use against you. As an added bonus, you also give away fewer clues about the capabilities of the sensor itself by denying your adversary the full body of data it collected.

But the network isn’t the only place to apply improved security to edge devices. Just as data coming back from an edge device often has to travel across unsecured networks, so too do the software updates traveling out to the device that change or improve the platform’s capabilities. And those updates can affect hundreds or even thousands of parts of the system, which requires additional controls to validate that the updates can be trusted.
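One common control is verifying a cryptographic signature over the entire update bundle before anything is applied. The sketch below checks an Ed25519 signature using the widely available Python `cryptography` package; the key provisioning and bundle format are illustrative assumptions, not Red Hat’s specific mechanism.

```python
# Illustrative update verification: reject the bundle unless the
# signature over its full contents checks out. The trusted public key
# is assumed to be provisioned out of band (e.g., baked in at build time).

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
from cryptography.exceptions import InvalidSignature

def verify_bundle(public_key_bytes: bytes, bundle: bytes, signature: bytes) -> bool:
    """Return True only if `signature` is valid for the whole bundle."""
    public_key = Ed25519PublicKey.from_public_bytes(public_key_bytes)
    try:
        public_key.verify(signature, bundle)
        return True
    except InvalidSignature:
        return False

# Hypothetical usage:
# if not verify_bundle(trusted_key, update_bytes, update_sig):
#     abort_update("untrusted bundle")
```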

One way to accomplish this is to bundle updates into what is essentially a single atomic unit to reduce complexity.

“So if you were to update every individual part of that platform, there’s potentially thousands of individual parts and packages that might be updated and the combinatorics about which ones work and which ones aren’t going to work and which ones are safe to update versus not safe to update are very large,” said Michael Epley, chief architect and security strategist for the public sector at Red Hat. “Our approach is that we’re going to bundle a bunch of those together. That essentially is a known good configuration; I know that state. So it limits your options, lowers the complexity, but it makes it more reliable and more secure because that package is going to either succeed all at once or fail all at once.”
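A minimal sketch of that all-or-nothing behavior is the classic stage-and-swap pattern: build the complete new state in a staging area, then make it live with one atomic operation, so the device is only ever running the old known-good configuration or the new one. The directory layout below is hypothetical; image-based systems such as OSTree apply the same idea at the operating-system level.

```python
# Stage-and-swap sketch: the update either becomes live in one atomic
# step or never becomes live at all. Paths are illustrative.

import os

def apply_update(staged_dir: str, current_link: str) -> None:
    """Atomically point `current_link` at the fully staged update."""
    tmp_link = current_link + ".tmp"
    os.symlink(staged_dir, tmp_link)
    os.replace(tmp_link, current_link)   # atomic rename on POSIX filesystems

def rollback(previous_dir: str, current_link: str) -> None:
    """Recovery is the same single step, pointed at the old tree."""
    apply_update(previous_dir, current_link)
```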

It also reduces the potential for human error. The traditional update process requires an administrator to go in and manually apply each update. That’s hundreds, if not thousands, of opportunities for a mistake – multiplied by however many systems that administrator is responsible for. Reducing the complexity also reduces configuration drift. The more consistency there is across all these systems, the less convoluted the attack surface, and the easier it is to patch.

“One of the other aspects about edge that makes it unique is we’re talking typically about fleets. Large numbers of ideally identical devices. Our goal is to minimize the discrepancies,” Epley said. “You need to be able to, from a holistic fleet perspective, understand what the posture is of every device and eliminate anything that is below that threshold for your security controls.”
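In practice, a fleet-level posture check can be as simple as comparing each device’s reported state against a known-good baseline and flagging anything below the threshold. The device records, image names and field names below are hypothetical:

```python
# Hypothetical fleet posture sweep: every device is compared against one
# known-good baseline, and stragglers are flagged for remediation.

GOOD_IMAGE = "edge-os-2024.03"
MIN_PATCH_LEVEL = 17

fleet = [
    {"id": "sensor-001", "image": "edge-os-2024.03", "patch_level": 18},
    {"id": "sensor-002", "image": "edge-os-2024.01", "patch_level": 12},
]

below_threshold = [
    d["id"] for d in fleet
    if d["image"] != GOOD_IMAGE or d["patch_level"] < MIN_PATCH_LEVEL
]
print("needs remediation:", below_threshold)   # ['sensor-002']
```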
