An Agency Approach to AI and Infrastructure
Nic Perez
Chief Technology Officer, Cloud Practice, ThunderCat Technology
Managing artificial intelligence and machine learning application projects is in large measure a matter of fine-tuning the locations where data and applications reside. It’s less a matter of data center versus cloud than of portability among data center, cloud and edge computing, consistent with optimal availability and cost control. Therefore, it’s important for agencies to spend some effort planning the infrastructures for systems hosting AI development and training data, as well as for deployable AI applications.
In the cloud era, this management requirement extends to the commercial clouds agencies employ as components of their hybrid computing environments. With contemporary approaches to storage tiers, application hosting decisions, and the locating and updating of the agency’s own data centers, IT staff can find efficiencies that enable AI development in a cost-effective way.
For some of the latest thinking, Federal News Network spoke to Nic Perez, the chief technology officer for cloud practice at ThunderCat Technology; and Al Ford, the artificial intelligence alliances manager at Dell Technologies.
Perez said that AI application development that used to require agency-operated high-performance computing environments, along with the associated software tooling, is finding its way into commercial clouds.
“One of the benefits of the cloud is that agencies can leverage the compute and the power that is available inside these cloud providers,” Perez said. “Move your data, and then absolutely maximize the compute power there.”
Different clouds offer differing toolsets, he added, giving agency practitioners flexibility in the degree of automation they want in staging and training AI applications. Perez said that over the last year or so, he’s seen a “land rush” of agencies moving text analytics, speech, and video data to the cloud, and performing AI on the data there.
In other instances, Ford said, it may make sense to train and deploy artificial intelligence applications neither in the cloud nor in a data center, but rather in an edge computing environment.
For example, “it could be that you’re part of the geological survey, and you’ve got a vehicle carrying a camera, and you need to access that vehicle. So the edge literally could be out, field-deployed,” Ford said. Trucks, aircraft, Humvees – all can serve as edge computing locations. He said hyper-converged and containerized workloads are easily movable among computing resources located where you gather data. In such cases, Ford said, the agency may find it advantageous to add software stacks to the cloud, from which they can communicate to the edge, where artificial intelligence is occurring.
Ford added that applications running large data sets on edge resources often benefit from local GPU accelerators. These enhance performance while helping the agency avoid some of the costs associated with moving large data volumes and workloads in and out of commercial clouds.
Agencies may find that with this approach, they only need to transfer across their networks the output of an application, the decision-making result of the AI. The data, application and compute stay local.
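As a rough illustration of that pattern, consider the hedged sketch below. It classifies imagery on an edge node, uses a local GPU accelerator when one is present, and transmits only a small JSON result rather than the raw frames. The model, labels and reporting endpoint are illustrative assumptions (PyTorch and torchvision are used purely for the example), not a description of any agency’s actual system.

```python
# Hypothetical sketch: run inference locally at the edge, send back only the decision.
# The model choice, image source and reporting endpoint are illustrative assumptions.
import json
import urllib.request

import torch
from torchvision import models
from PIL import Image

# Use the local GPU accelerator if the edge node has one; otherwise fall back to CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A stock pretrained model stands in for whatever analytic was trained in the cloud.
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).to(device).eval()
preprocess = weights.transforms()

def classify(image_path: str) -> dict:
    """Run inference on one locally captured image; the raw pixels never leave the node."""
    image = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        scores = model(image).softmax(dim=1)
    confidence, class_idx = scores.max(dim=1)
    return {"image": image_path, "class_id": int(class_idx), "confidence": float(confidence)}

def report(result: dict, endpoint: str = "https://example.agency.gov/ai/results") -> None:
    """Send only the few-hundred-byte result across the network, not the data."""
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(result).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)

if __name__ == "__main__":
    report(classify("frame_0001.jpg"))
```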
Still another option is having a vendor own and operate a replica of an agency’s data center, but in a secure, “caged” facility maintained by the vendor. Advantages include having geographically strategic compute power without the need for a data center-sized capital investment.
“You’re using the same equipment, the same technology, the same education and investment you’ve had for a number of years,” Perez said. “You’re just now moving into the next stage and being able to do it faster and quicker.” And on a predictable consumption-cost model.
Perez and Ford said it’s important to distinguish between the training period of AI and the deployment, in terms of the best location and architecture. Each may require a different computing set-up for maximum efficiency. Training is generally more efficient in the cloud, whereas deployment often requires a federated approach.
“Effectively, federated is, rather than bringing the data from the edge back to you, why not send those analytics in that virtualized container to the edge so that you’re not moving the data,” Ford said. “And then, once you have the results computed at the edge, you only send back the results. You’re lowering the bandwidth, decreasing the amount of data that has to traverse all of those networking hops.”
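To make the bandwidth point concrete, here is a back-of-the-envelope comparison under assumed numbers (an hour of 1080p video at roughly 5 Mbps versus one small JSON detection per second); the figures are illustrative, not drawn from the interview.

```python
# Illustrative arithmetic only: compare shipping raw video to shipping results.
# The bit rate, duration and result size are assumptions chosen for a round example.
raw_bits = 5_000_000 * 3600               # ~1 hour of 1080p video at ~5 Mbps
raw_gigabytes = raw_bits / 8 / 1e9        # ~2.25 GB if the raw data leaves the edge

results_bytes = 3600 * 500                # one ~500-byte JSON detection per second
results_megabytes = results_bytes / 1e6   # ~1.8 MB if only the decisions travel

print(f"raw video: {raw_gigabytes:.2f} GB, results only: {results_megabytes:.1f} MB")
print(f"reduction: ~{raw_bits / 8 / results_bytes:,.0f}x less data over the network")
```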
Nic Perez, Chief Technology Officer, Cloud Practice, ThunderCat Technology
Mr. Nic Perez is a senior software professional with over 25 years of industry experience in systems architecture. He is proficient in all aspects of managing, architecting and building complex end-to-end solutions. Nic has over 15 years of experience as a lead Solutions Architect, designing and implementing large-scale solutions for both commercial clients and federal agencies. As the Chief Technology Officer for the Cloud Practice, Nic assists clients with infrastructure transformation; infrastructure and architecture strategy design and development; business and technical case analysis for change; and technical evaluations, recommendations and alternatives analysis, primarily leveraging public cloud architectures.
Al Ford, Artificial Intelligence Alliances Manager, Dell Technologies
Al Ford is currently an alliance business manager for emerging technology at Dell EMC, Federal. He is responsible for collaborating with technology partners and Dell Technologies to bring artificial intelligence solutions to the federal marketplace.
Al supports activities across ACT-IAC, AFCEA and CES Government. He is an active participant in the K9 for Warriors organization. Al is based at Dell’s corporate headquarters in Austin, TX, and serves the federal community with an office in McLean, VA.
Tom Temin, Host, The Federal Drive, Federal News Network
Tom Temin has been the host of the Federal Drive since 2006 and has been reporting on technology markets for more than 30 years. Prior to joining Federal News Network, Tom was a long-serving editor-in-chief of Government Computer News and Washington Technology magazines. Tom also contributes a regular column on government information technology.