Recreating airport security procedures

The Johns Hopkins Applied Physics Laboratory is working with TSA to perfect imaging systems passengers can simply walk right through, potentially eliminating lines, scanners and pat-downs.


Ever since the first magnetometers in the 1970s, airport authorities have sought better ways to see if people are hiding weapons or contraband. In recent years the old methods have given way to imaging. Now the Johns Hopkins Applied Physics Laboratory is working with the Transportation Security Administration to perfect imaging systems passengers can simply walk right through, potentially eliminating lines, scanners and pat-downs. With more on the technology, physics lab program manager Chris Thompson and lead engineer Shayne Gerber joined Federal Drive with Tom Temin.

Interview transcript:

Tom Temin: Physics lab program manager Chris Thompson. Mr. Thompson, good to have you on.

Chris Thompson: Glad to be here. Thank you.

Tom Temin: And lead engineer Shayne Gerber. Shayne, good to have you.

Shayne Gerber: Good morning. Thank you.

Tom Temin: Alright. So tell us what is the technology you’re actually working to try and perfect here?

Chris Thompson: Well, TSA as you know, they have passenger screening systems at airport checkpoints. They refer to them as advanced imaging technology or AIT systems. And so what we’ve developed is something that we’re calling the virtual AIT. It’s a modeling and simulation program. And we’ve designed it to try to provide an alternative to some of the physical testing and development that would occur with the AIT systems.

Tom Temin: So you’re working on the pre-deployment issues of testing the latest imaging systems, that’d be a good way to put it.

Chris Thompson: It definitely could help with some of the pre-deployment issues. But one of the unique things about the virtual AIT is we can also use it to create synthetic images. And so that can help with honing and maturing detection algorithms. As well as possibly, even down the road, the virtual AIT could be used as almost like a digital twin. For example, Tesla uses digital twins to monitor the lifecycle performance of its vehicles, and that allows them to get better service and reliability out of their cars. A virtual AIT could be used by TSA in a similar way to maintain better reliability for fielded airport scanners.

Tom Temin: So is this a lower cost way of having a twin rather than building two systems and running them side by side? When you say virtual, it sounds like a software defined type of approach to imaging systems?

Chris Thompson: Yes, that’s exactly right. The virtual AIT will be a faster, cheaper and safer alternative compared to live testing. When they test AIT systems, those tests are designed and executed, and they require maybe thousands of scans, as well as human subjects that may have objects attached to their body. That whole process can be time consuming and expensive. But with the virtual AIT, we’re able to represent a much wider variety of humans than what might be possible for live testing, as well as the different items that may be attached to the body. And then we can accurately model those conditions to explore so many more configurations, and generate large data sets, and do so in much less time.

Tom Temin: But it’s accurate to say that TSA’s ultimate goal here, I guess with any kind of imaging system, is speed, having people able to walk through and eliminating some of these bottlenecks that have characterized airports for so long.

Chris Thompson: Yes, that’s also correct. Everything that you said really helps to address one of their major challenges, which is improving the passenger experience, but without losing any security effectiveness. So with what we’ve created, we hope to be able to help them to look at all of those complexities that come with exploring new technologies, and getting them to address the many challenges that TSA has to deal with.

Tom Temin: And you’re doing it with a program or a software package called shooting and bouncing ray, probably not the best name to use around an airport, or SABRE. What is it? How does that work? That’s your simulator.

Shayne Gerber: So SABRE is what we call a computational electromagnetics tool. It was developed in house here at APL to run on graphics processing units, or GPUs. SABRE basically uses video game technology coupled with advanced physics calculations to calculate how an electromagnetic field, like from a radar, would interact with and bounce back from objects. We originally designed it to quickly and accurately calculate how far away objects, something like a plane, might look to a radar. It uses ray tracing technology, and that’s where the shooting and bouncing ray comes in, to calculate how that electromagnetic energy goes from the transmitter, interacts with the object or objects, and then bounces back to the receiver. And these airport scanners are essentially using hundreds of small radar transmitters. So we just modified SABRE to work up close to objects with those types of transmitters.
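SABRE itself is not public code, but the shooting-and-bouncing-ray idea Gerber describes can be sketched in a few lines: launch a ray from a transmitter, reflect it specularly off a surface, and track the round-trip path length, which sets the phase of the returned signal. Everything here, the geometry, the function names and the 30 GHz wavelength, is an illustrative assumption, not APL's implementation.

```python
import numpy as np

def reflect(direction, normal):
    """Specular (mirror) reflection of a unit ray direction off a surface normal."""
    return direction - 2.0 * np.dot(direction, normal) * normal

def one_bounce_return(tx, rx, hit_point, wavelength):
    """One-bounce ray contribution: transmitter -> surface -> receiver.
    Returns the round-trip path length and the carrier phase it implies."""
    path = np.linalg.norm(hit_point - tx) + np.linalg.norm(rx - hit_point)
    phase = 2.0 * np.pi * path / wavelength  # phase accumulated over the path
    return path, phase

# Toy geometry: co-located transmitter/receiver 1 m above a flat plate.
tx = np.array([0.0, 0.0, 1.0])
rx = np.array([0.0, 0.0, 1.0])
hit = np.array([0.0, 0.0, 0.0])
normal = np.array([0.0, 0.0, 1.0])

# The incoming ray (0, 0, -1) reflects straight back up toward the receiver.
d_in = (hit - tx) / np.linalg.norm(hit - tx)
print(reflect(d_in, normal))            # [0. 0. 1.]

path, phase = one_bounce_return(tx, rx, hit, wavelength=0.01)  # ~30 GHz
print(path)                             # round trip of 2.0 m
```

A production SBR code traces millions of such rays on the GPU and coherently sums their phased contributions at the receiver; this sketch shows only the per-ray bookkeeping.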

Tom Temin: It strikes me that this technology, this SABRE technology, could have a lot of applications for many, many federal and industrial applications where faster and more accurate imaging is something people are pursuing. Correct?

Shayne Gerber: That is correct. A lot of times when we’re dealing with radar, we’re dealing with radar simulations. And a crucial part of a radar simulation is what we call a radar cross section, or RF signature. The idea of SABRE is to make these calculations fast enough that they can run in the loop with those simulations. And yes, you’re right, it has almost unlimited applications.
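The radar cross section Gerber mentions has a standard far-field definition: sigma = 4*pi*R^2 * |Es|^2 / |Ei|^2, the ratio of scattered to incident power scaled so it is independent of range. As a rough, assumed illustration (the sphere example and all values are mine, not from the interview):

```python
import numpy as np

def radar_cross_section(e_scattered, e_incident, range_m):
    """Far-field RCS in square meters from field amplitudes at range R:
    sigma = 4*pi*R^2 * |Es|^2 / |Ei|^2."""
    return 4.0 * np.pi * range_m**2 * abs(e_scattered)**2 / abs(e_incident)**2

# Sanity check: a sphere much larger than the wavelength has RCS ~ pi*r^2.
# Back out the scattered-field ratio that implies for a 1 m sphere at 1 km,
# then confirm the definition recovers pi*r^2.
r, R = 1.0, 1000.0
ratio = np.sqrt(np.pi * r**2 / (4.0 * np.pi * R**2))
sigma = radar_cross_section(ratio, 1.0, R)
print(sigma)  # ~3.14159 m^2, i.e. pi * r^2
```

A tool like SABRE earns its keep by computing |Es| for complex shapes where no closed-form answer like the sphere's exists.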

Tom Temin: And what happens when you have perfected SABRE, you now have the ability to make digital twins of proposed imaging systems. How does that get out into the public so people can take advantage of it either federally or industrially?

Shayne Gerber: That’s a good question. APL, we developed this SABRE software with the intent of sharing it with our government partners. So APL would love to see it in the wild in places where it’s useful.

Tom Temin: Chris?

Chris Thompson: Yeah, that’s exactly right. Just to build on that, using SABRE and this virtual AIT concept with TSA, as well as, say, with other government partners, it wouldn’t necessarily go to the public. But what it does do is allow our sponsors a tremendous amount of insight into systems. You can imagine that in TSA’s case, they’re working with the commercial industry, and they’re working with many different vendors who have a wide array of configurations that TSA may want to explore. Well, we can model those different configurations, including the cylindrical AIT systems you see at airport checkpoints today, as well as other configurations, including walk-by and walk-through systems. And so that will help TSA both with requirements development upfront and, as you mentioned, down the road with digital twins, so that for deployed systems, they can have a much greater understanding of the things that could cause problems.

Tom Temin: And what are the timelines here? How close are you to a model that you can rely on and say, “This package is going to do the job for you”?

Chris Thompson: Well, to date we’ve got a cylindrical model completed. And we’ve demonstrated that for TSA. We’ve also done some initial work on modeling some of the other systems. And it’s really been a good partnership with industry because we’ve had to get some inputs from some of the AIT manufacturers as well. But I would say over the next couple years, we would expect to be able to go through a full verification and validation process for the virtual AIT. And then we’d also be able to expand the number of let’s call them digital humans, as well as digital objects that we’re modeling. And so we can really begin to look at lots of different configurations and scanner types.

Tom Temin: I guess you might want to be able to image virtual support animals too, that people seem to be dragging on planes a lot these days.

Chris Thompson: Well, you’re right. I mean, things can get really complex. And so again, that’s one of the values of being able to do this virtually.

Tom Temin: Alright, Shayne Gerber is lead engineer for the SABRE project and Chris Thompson is Program Manager, both at the Johns Hopkins Applied Physics Laboratory. Thanks so much for joining me.

Chris Thompson: Thank you for your time.

Shayne Gerber: Thank you very much.

Copyright © 2024 Federal News Network. All rights reserved.