The State Department is piloting edge computing at its embassies, thanks to the flexibility of TIC 3.0.
The State Department is primed to start moving compute out to the edge. Because of the department's unique operating environment, with embassies all over the world, the Trusted Internet Connections (TIC) 2.0 guidance required outbound internet traffic from those embassies to backhaul to central data centers in the U.S., unless the department took the much costlier route of deploying servers directly onsite at each post.
But TIC 3.0 has given State more flexibility, and the department is embracing that.
“The retirement of TIC 2.0 is like a godsend in my eyes,” said Gerald Caron, director of Enterprise Network Management in the Bureau of Information Resource Management at the State Department, during a March 18 FedInsider webinar. “And having TIC 3.0 is great, because 2.0 was very prescriptive: you had to meet certain requirements, and they were very prescriptive in how you had to do the tech. 3.0 opened a great deal of flexibility while still meeting the spirit of what TIC is meant to do from a security perspective.”
Caron said the department has already piloted an edge computing solution to improve embassy internet performance. Rather than backhauling traffic as TIC 2.0 required, State put telemetry at the embassies and allowed traffic to flow more directly to its destinations. Most embassies saw significant performance improvements, Caron said, and the lessons learned will inform larger efforts within the overall architecture of State's network modernization plans.
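Caron didn't describe the pilot's instrumentation in detail, but a minimal sketch of the kind of probe such telemetry could run, comparing TCP handshake latency on a direct path against a backhaul gateway, might look like the following. The hostnames, the sample count and the choice of connect time as the signal are all assumptions for illustration, not State's actual method.

```python
import socket
import time

# Illustrative endpoints only: a direct-to-cloud destination and a
# hypothetical U.S. backhaul gateway. Real probe targets would come
# from the pilot's own traffic profile.
DESTINATIONS = [
    ("www.example.com", 443),       # direct path to the destination
    ("gateway.example.gov", 443),   # path via a central U.S. gateway
]

def tcp_connect_ms(host: str, port: int, timeout: float = 5.0) -> float:
    """Measure TCP handshake time in milliseconds, one coarse signal of
    what a backhauled path costs compared with a direct one."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.perf_counter() - start) * 1000.0

for host, port in DESTINATIONS:
    try:
        samples = sorted(tcp_connect_ms(host, port) for _ in range(5))
        print(f"{host}: median ~{samples[len(samples) // 2]:.1f} ms")
    except OSError as err:
        print(f"{host}: unreachable ({err})")
```

Run from an embassy network, repeated probes like these would show whether routing traffic directly actually beats the backhaul path, which is the performance question the pilot set out to answer.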
The new guidance allows State to adopt more of a zero trust approach to its data: understanding where data resides, how it's being accessed, and establishing a baseline for protecting it. But there are still limitations.
“One of the concerns that we have is, as we move to cloud, some of those services will get FedRAMP’ed, and there are rules around where you can or cannot process data,” Caron said. “So if I’m in a facility that’s not an embassy, or it is on foreign soil, and I’m processing data, then there are certain levels of data that you cannot process or manipulate. I could do it at an embassy, but that means I’ve got to put equipment at the embassy. For all intents and purposes, a lot of these clouds that get FedRAMP’ed have to be on domestic soil because of the processing that they do. So there are certain rules where you really have to dig in and understand where the flexibilities are and where they’re not.”
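Those residency rules amount to a placement check before any workload runs. As a rough illustration of the decision Caron is describing, consider the toy policy table below; the sensitivity levels and the locations allowed to process them are invented for this sketch and are not State Department or FedRAMP policy.

```python
from enum import Enum

class Location(Enum):
    DOMESTIC_CLOUD = "FedRAMP-authorized U.S. cloud region"
    EMBASSY = "U.S. embassy (U.S.-controlled facility abroad)"
    FOREIGN_SITE = "non-embassy facility on foreign soil"

# Hypothetical mapping of which sensitivity levels may be processed
# where. Real rules come from FedRAMP authorizations and the
# department's own data-handling guidance.
ALLOWED = {
    Location.DOMESTIC_CLOUD: {"public", "sensitive", "restricted"},
    Location.EMBASSY: {"public", "sensitive", "restricted"},
    Location.FOREIGN_SITE: {"public"},
}

def may_process(level: str, where: Location) -> bool:
    """Return True if data at this sensitivity level may be processed
    at the given location under the illustrative rules above."""
    return level in ALLOWED[where]

# A foreign edge node can serve public data, but restricted data has
# to be handled at an embassy or in a domestic FedRAMP'ed cloud.
assert may_process("restricted", Location.EMBASSY)
assert not may_process("restricted", Location.FOREIGN_SITE)
assert may_process("public", Location.FOREIGN_SITE)
```

The practical consequence is the tradeoff Caron names: processing restricted data at the edge means putting equipment inside an embassy, while anything on non-embassy foreign soil is limited to lower-sensitivity workloads.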
What it comes down to, he said, is figuring out where those flexibilities fit into the bigger picture of an agency's modernization efforts, not just within specific silos. That requires looking at use cases, business needs, and the overall architecture and goals of those efforts.
“I think edge computing will have the biggest impact through how it integrates: how it fits into that overall plan you want to accomplish in that architecture, and how it interacts with the other tools,” Caron said. “How do you monitor, how do you do security, and where does it contribute to all those different things? Where it’s going to have the biggest impact is in improving different areas within your environment and your architecture. So looking at the bigger picture is really important.”