Federal News Radio's Scott Maucione checks out some of the projects the Office of Naval Research is funding.
Right now, I’m staring at Christopher Young, a researcher at Lockheed Martin, but between us is a mountainside of Camp Lejeune.
I should mention I’m currently in Washington, but what’s projecting a painfully detailed map of Camp Lejeune in front of me are the bulky glasses both Young and I are wearing.
The virtual map I’m looking at is like a video game on a tabletop spreading out in the short distance between us. I can see the shadows of the buildings, the terrain, even the kind of siding on the barracks.
Young can see exactly what I’m seeing, and as we talk, he can place markers on the map that we can both see, or zoom in on certain areas.
What we are wearing are Microsoft HoloLens headsets, and their use for military applications is just one of the futuristic technologies on display at this year’s Naval Future Force Science and Technology Expo.
Young and I are using the HoloLens Sandbox application to literally look down on the camp.
“This is used for Marine Corps training. This map was built using an overflight of a small drone and taking that and using photogrammetry to create a 3D model of the terrain. We can use it to help facilitate training and even planning. By using another application called Interactive Tactical Decision Games, we can place unit symbols out on a 2D map and they will appear on this terrain, so now we can collaborate on approaches and that kind of stuff,” Young said.
While the technology is mostly used for training at this point, it has practical applications. Imagine a service member in an urban warfare zone trying to explain a situation to a commander back in Washington. She can point out exactly where she is taking enemy fire from, down to the individual window, and both she and the commander see the same thing. The technology has the potential to give commanders unheard-of real-time information and situational awareness.
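For readers curious how two headsets can show the same marker at the same moment, here is a minimal sketch of the underlying shared-state idea; the class and method names below are illustrative stand-ins, not the Sandbox application’s actual code.

```python
# Minimal sketch of the shared-annotation idea: both viewers render the same
# marker because it lives in shared state that every headset subscribes to.
# All names here are illustrative, not the Sandbox application's code.

class SharedMap:
    """A shared terrain map that broadcasts annotations to every viewer."""
    def __init__(self):
        self.viewers = []
        self.markers = []

    def subscribe(self, viewer):
        self.viewers.append(viewer)

    def place_marker(self, label, x, y):
        marker = {"label": label, "x": x, "y": y}
        self.markers.append(marker)
        for viewer in self.viewers:  # everyone sees the same symbol appear
            viewer.render(marker)

class Viewer:
    def __init__(self, name):
        self.name = name

    def render(self, marker):
        print(f"{self.name} sees '{marker['label']}' at ({marker['x']}, {marker['y']})")

# The field unit and the commander in Washington share one map state.
lejeune = SharedMap()
lejeune.subscribe(Viewer("field unit"))
lejeune.subscribe(Viewer("commander"))
lejeune.place_marker("taking fire, third-floor window", x=120, y=45)
```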
The technology that creates these maps is the work of University of Southern California researcher Ryan Spicer.
To make the Camp Lejeune map, the Marines flew a small off-the-shelf drone, which captured several thousand still images.
The map has an accuracy of three to five centimeters.
“The turnaround time from flight to pulling the data is in the ballpark of eight to 12 hours,” Spicer said.
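To give a feel for the photogrammetry step, here is a minimal two-view structure-from-motion sketch using OpenCV. The filenames and camera intrinsics are placeholder assumptions, and a production pipeline would chain thousands of images with bundle adjustment rather than a single pair.

```python
# Two-view structure-from-motion: the core idea behind turning overlapping
# drone photos into 3D terrain. Inputs below are placeholder assumptions.
import cv2
import numpy as np

# Two overlapping aerial stills and a guessed camera intrinsic matrix.
img1 = cv2.imread("frame_0001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_0002.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[3000.0, 0.0, 2000.0],
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])  # focal length/principal point are assumed

# 1. Detect and match features across the overlapping images.
orb = cv2.ORB_create(5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 2. Recover the relative camera motion from the matched points.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Triangulate the matches into sparse 3D points; thousands of images plus
#    bundle adjustment would refine this into a dense, cm-accurate model.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (pts4d[:3] / pts4d[3]).T
print(f"Triangulated {len(points3d)} sparse terrain points")
```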
One thing that will always get a crowd at a technology convention is a robot, and the expo has no shortage of those.
I found Aaron Nicely, a production and engineering coordinator at RE2 Robotics, maneuvering what looks like a stationary exercise machine.
Nicely is manipulating a movable metal frame, and a robot in front of him mimics his actions.
The robot isn’t something from a sci-fi movie; it looks more like a 30-inch-high tank with arms. At the base of the robot are a bag of mints and some building blocks.
Nicely has no problem picking up the blocks and stacking them or opening the bag and grabbing a single mint.
“This is a highly dexterous manipulation system. It’s a robotic arm with a torso and two arms with grippers,” Nicely said. “This was funded by [the Office of Naval Research] at Code 32 and we have ongoing development for many, many different aspects of this, to miniaturize it, to marinize it, to replicate it and to make various degrees of freedom [for the arm].”
This robot has seven degrees of freedom in each arm, a degree of freedom being one independent way a mechanism can move. The torso has two degrees of freedom.
“Currently, a lot of the fielded robots are two, three, four-degree arms. This is seven-degree each, which allows you [to] do things that exactly mimic how a human arm moves,” Nicely said.
Robots with less dexterity are limited in their movements and take longer to complete tasks.
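To make the degrees-of-freedom idea concrete, here is a short forward-kinematics sketch for a generic seven-joint arm; the joint axes and link lengths are assumptions for illustration, not RE2’s actual design.

```python
# Forward kinematics for a generic 7-joint arm, to show what "seven degrees
# of freedom" buys you. The joint axes and link lengths are assumptions.
import numpy as np

def rot(axis, theta):
    """3x3 rotation matrix about the local x, y or z axis."""
    c, s = np.cos(theta), np.sin(theta)
    if axis == "x":
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == "y":
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# A human-arm-like chain: 3-DOF shoulder, 1-DOF elbow, 3-DOF wrist.
AXES = ["z", "y", "x", "y", "z", "y", "x"]
LINKS = [0.0, 0.0, 0.30, 0.0, 0.25, 0.0, 0.10]  # meters from joint to joint

def gripper_position(joint_angles):
    """Chain all seven joint rotations to locate the gripper in space."""
    R, pos = np.eye(3), np.zeros(3)
    for axis, length, theta in zip(AXES, LINKS, joint_angles):
        R = R @ rot(axis, theta)
        pos = pos + R @ np.array([0.0, 0.0, length])
    return pos

# Seven joints leave a redundant "elbow" freedom: many different angle sets
# reach the same point in space, which is exactly what lower-DOF arms lack.
print(gripper_position(np.radians([10, 45, 0, -60, 0, 30, 0])))
```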
Nicely said the robot could be used for defusing bombs or working in tight areas.
A much larger robot lurks farther back in the expo hall. This robot is more humanoid in shape and actually has a face.
It blinks at me as I walk over. These are the kinds of robots we all picture in our heads, but they are still far from doing chores around the house or serving as our personal maids.
Greg Trafton, a cognitive scientist at the Naval Research Lab, is trying to get robots to that point, though.
Trafton’s bipedal research robot is capable of walking, rolling around on its knees and ankles, and picking things up.
But more importantly, it can sense the world. The research robot uses artificial intelligence on multiple fronts to learn about the world around it.
The robot recognizes when a ball is placed in its hands and then throws the ball to a designated area.
“The applications that we are working toward are firefighting on ships and Navy maintenance tasks,” Trafton said. “What we do is we model how people do those particular tasks and we want the robot collaborating with people.”
The robot uses computational perception AI, computational cognitive modeling and a reasoning application to do its work.
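As a rough illustration of how those three layers might fit together, here is a toy sense-model-act loop; every name in it is hypothetical and stands in for NRL’s far more sophisticated components.

```python
# Toy sense-model-act loop in the spirit of the three-part pipeline above.
# Every function here is a hypothetical stand-in, not NRL's actual software.
import random

def perceive():
    """Stand-in for the computational perception layer: report whether
    an object has appeared in the robot's hands this tick."""
    return random.random() < 0.2  # pretend a ball shows up on ~20% of ticks

def model_goal(ball_in_hands):
    """Stand-in for the cognitive model: infer what the human collaborator
    expects the robot to do, given what it perceives."""
    return "throw_to_target" if ball_in_hands else "wait"

def reason_and_act(goal):
    """Stand-in for the reasoning layer: turn the inferred goal into motion."""
    if goal == "throw_to_target":
        print("ball detected -> planning arm trajectory -> throwing to target")
    else:
        print("idle: watching for the next task")

for tick in range(10):  # a few iterations of the control loop
    reason_and_act(model_goal(perceive()))
```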
For now, the robot is still early in its development.
Scott Maucione is a defense reporter for Federal News Network and reports on human capital, workforce and the Defense Department at-large.