Everyone wants everything in real time (or almost). Tech leaders from CACI, Future Tech, MITRE and SAIC share the impact that has on lifecycle management.
This is the fourth article in our IT lifecycle management series, Delivering the tech that delivers for government. It’s Part 2 of a two-part roundtable with enterprise tech leaders from CACI, Future Tech Enterprise, MITRE and SAIC. Find Part 1 here.
ASAP. It’s an acronym thrown around with impunity by almost anyone who has a technology need. For the leaders of technology enterprise operations at government contractors, though, delivering capabilities right now is increasingly the desired norm.
Part of that is driven by the needs of the agencies that federal systems integrators work with. Federal agencies often look to their industry partners to provide technology they might not have themselves, said Cedric Sims, senior vice president of enterprise innovation and integration at MITRE.
“We have an obligation to our sponsors to be able to provide some compute and capacity that they don’t have access to now — at scale,” Sims said during the second half of a two-part roundtable discussion for our Federal News Network series, Delivering the tech that delivers for government.
Sims went on to point out that such needs have led MITRE, in some instances, to invest in technologies before government agencies make investments of their own. For example, he said, “we’re building out a fairly significant artificial intelligence sandbox.”
For the roundtable, we talked with Sims along with technology leaders from CACI, Future Tech Enterprise and SAIC about how they deliver technology capabilities at speed, both for users across their own organizations and for the government organizations and missions their companies support.
They shared the tactics and technology approaches they’re deploying now to meet these ASAP demands. The discussion homed in on five critical areas that impact delivering and supporting services for users in real or near-real time: cloud, data, security, AI and preparing for the future of lifecycle management. (You can find the first part of the discussion, “How federal contractors put users first — whether employee or fed — to deliver on government missions,” here.)
Managing the needs of users as well as preparing for Day 1 readiness on programs increasingly involves cloud — even cloud-like management in the data center.
“We’ve tried to go to a model that’s a little bit more hyperscaler, to where at least from an original equipment manufacturer standpoint, they can provision hardware, new hardware into that environment — creating that hyperscaler environment within the data center,” said Bernie Tomasky, vice president of enterprise IT at SAIC.
That said, he quickly added that ultimately “it’s all about trying to drive everybody into a hyperscaler cloud.”
It’s now more common for agencies to ask about shutting down data centers rather than standing them up for new programs — and how to maximize their existing capital investment while leaning into the cloud for scale, said Erik Nelson, senior vice president of the enterprise IT operating group at CACI.
On-premises and cloud needs often compete, he said, especially for agencies with missions that take them to remote locations where they must deliver technology capabilities for temporary operations. Think the military services or the Federal Emergency Management Agency, for instance.
“It’s being able to figure out how to configure what is available to be out there in that austere location. And you don’t have a lot of time to deploy it,” Nelson said. “So you have to figure out how to kind of MacGyver some things to make things work. What is important about that is to have all the smart people in a room and be able to say, ‘Hey, here’s how this is going to work here.’ Then, test it out, pilot and deploy pretty quickly.”
The ability to navigate easily between cloud and on-premises environments is critical, said Rick Schindel, leader of federal systems integrator programs at Future Tech Enterprise.
“The OEMs have done a good job in kind of reinventing their as-a-service model,” he said, adding that it led Future Tech to develop a Day 1 Readiness capability that blends the OEM elements with any mission-unique technology that may be required as FSIs work with agencies.
Although everyone agreed that agencies no longer resist migrating to the cloud, security still leads government organizations to keep some systems on premise. But the ability to support users interactively, provide services as needed, and proactively manage lifecycle and cybersecurity is pushing agencies to actively embrace cloud’s consumption-based and operational expenditure model, Tomasky said.
“Data security is paramount to what the customer is thinking when they typically are on-prem. But by and large, the obstacles they’re already facing by having on-prem solutions, they want to get past,” he said.
The data security focus ties directly to the government’s numerous cyber requirements, such as establishing zero trust architectures (with a fall 2024 deadline looming for agencies) and ensuring supply chain risk management through vendors providing third-party verification.
What is known about each industry provider’s infrastructure has become critical with the move to software as a service and multiuser platforms, Sims said. Plus, it’s common to integrate multiple vendors’ products and tools for federal projects or to host data and environments on the large cloud services providers’ platforms, he pointed out. Going forward, this makes visibility and transparency essential, Sims said.
It also requires reimagining how agencies and vendors manage data so that users can access exactly what they need when they need it to do their jobs, Nelson said. Yes, agencies and their industry partners are implementing least-privilege and role-based access models, but rethinking privacy and classification practices and extracting data selectively must occur in tandem, he advised. This is particularly challenging as the federal government houses vast stores of personally identifiable, sensitive and classified information.
“Being able to plan out, ‘Hey, this is the portion of that data that’s really important. Everything else around it, we can do something different,’” Nelson said. “It’s really a cultural change for all these government agencies because it’s easier to classify something than to say, ‘Well, only this is classified.’”
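The selective approach Nelson describes can be pictured as classifying at the field level rather than stamping an entire record at its highest sensitivity. Below is a minimal Python sketch of that idea; the field names, labels and clearance levels are hypothetical, not drawn from any agency’s actual scheme.

```python
from dataclasses import dataclass

# Hypothetical sensitivity labels, ordered from least to most restricted.
LEVELS = {"public": 0, "sensitive": 1, "classified": 2}

@dataclass
class Field:
    name: str
    value: str
    level: str  # one of LEVELS

def redact_record(fields: list[Field], clearance: str) -> dict:
    """Return only the fields the caller's clearance permits.

    Instead of classifying the whole record at the highest level,
    each field carries its own label, so most of the record stays
    shareable ("only this is classified").
    """
    allowed = LEVELS[clearance]
    return {f.name: f.value for f in fields if LEVELS[f.level] <= allowed}

record = [
    Field("site_name", "Depot 12", "public"),
    Field("contact_email", "ops@example.gov", "sensitive"),
    Field("mission_detail", "(withheld)", "classified"),
]

print(redact_record(record, "sensitive"))
# {'site_name': 'Depot 12', 'contact_email': 'ops@example.gov'}
```

The payoff, per Nelson’s point, is that a user with a “sensitive” clearance still gets most of the record instead of being locked out of it entirely.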
Artificial intelligence should help provide answers to those data needs, both by culling data and by managing end user devices to meet users’ needs, Sims added.
At MITRE, AI and machine learning are “being used for things that we could have never imagined,” he said. Its network engineers apply AI models to log data to identify functions on the network that may not be performing as anticipated.
“We have some very bespoke capabilities around some of our security capability logs that come in,” Sims explained. “It allows our staff, our talent, to really explore: What does it actually mean to approach these problems in a different way? And we’ve seen some really impactful outcomes because of it.”
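MITRE’s internal tooling isn’t public, but the general pattern Sims describes, flagging network functions whose log-derived metrics deviate from the norm, resembles standard anomaly detection. Here’s a minimal Python sketch using scikit-learn’s IsolationForest; the metrics and numbers are invented for illustration.

```python
# A minimal sketch of anomaly detection over summarized network log
# metrics, in the spirit of the approach Sims describes. Feature names
# and values are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Pretend each row summarizes one network function over a time window:
# [requests_per_sec, error_rate, p95_latency_ms]
normal = rng.normal(loc=[500, 0.01, 120], scale=[50, 0.005, 15], size=(500, 3))
degraded = np.array([[480, 0.09, 410]])  # one function behaving oddly
windows = np.vstack([normal, degraded])

model = IsolationForest(contamination=0.01, random_state=0).fit(windows)
flags = model.predict(windows)  # -1 marks outlier windows

print(f"{(flags == -1).sum()} window(s) flagged for review")
```

The value, as Sims suggests, is less about any single flag and more about freeing staff to investigate the handful of windows the model surfaces rather than scanning every log line.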
Schindel agreed that AI has potential to improve sustainment and maintenance operations of existing devices and platforms. “Our government has deployed all of these platforms across their operations. AI will help them sustain them for longer periods of time because you’re going to be able to do a ton more in terms of predictive and preventive maintenance,” he said.
Sims added that MITRE expects the development of new small language models trained specifically to do just that, to run on the edge so that IT and security needs can be “responsive, adaptive and predictive — without [devices] having to kind of dial back to a mothership.”
But for that evolution to take place, Schindel circled back to the point Nelson made about the need to focus on data management. Agencies will need to consider the security aspects of how their organizations use AI models, he said: protecting the resulting data, managing who has access to it and deciding who can take part in discussions around it. In other words, what are the appropriate guardrails for managing transparency and security simultaneously?
“It is an irony of AI” and definitely generative AI, Tomasky noted with a laugh. “We want all the data brought in, but we’re not going to put any of our data back out.”
He expects though that AI will let technology teams get away from looking at dashboards and screens before making changes. Throughout the IT lifecycle, “you’re going to see AI and automation play a bigger and bigger role,” Tomasky said. “It already has in the service desk environment. And you’ll see it across the network, and cyber, and everything else as well.”
Part of getting there depends on avoiding data overload by focusing on the value any given AI use and dataset delivers to the end user or the organization, and on demystifying AI, Nelson said. That’s how to take advantage of “lots of AI ops capabilities and service desk to really draw down the mundane tasks and make them much easier to do.”
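As a rough illustration of drawing down those mundane service desk tasks, consider a tiny triage sketch: routine tickets get routed to a queue with an automated first action, while everything else escalates to a human. The queues, keywords and actions here are hypothetical; a production AI ops pipeline would likely swap the keyword matching for a trained classifier or language model.

```python
# A minimal sketch of the service desk automation Nelson and Tomasky
# describe: auto-triage the routine tickets, escalate the rest.
# Queues, keywords and actions are hypothetical.
ROUTES = {
    "password": ("identity-team", "auto: send self-service reset link"),
    "vpn": ("network-team", "auto: run connectivity diagnostics"),
    "disk": ("endpoint-team", "auto: trigger cleanup job"),
}

def triage(ticket_text: str) -> tuple[str, str]:
    """Route a ticket and pick an automated first action, if any."""
    text = ticket_text.lower()
    for keyword, (queue, action) in ROUTES.items():
        if keyword in text:
            return queue, action
    return "service-desk", "manual review"  # escalate the non-routine rest

print(triage("User locked out, needs password reset"))
# ('identity-team', 'auto: send self-service reset link')
```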
Discover more stories about how federal systems integrators and government contractors manage their enterprise infrastructure environments in our series Delivering the tech that delivers for government, sponsored by Future Tech Enterprise.
Check out all podcast episodes of the Delivering the tech that delivers for government series.
Vanessa Roberts crafts content for custom programs at Federal News Network and WTOP. She’s been finding and telling B2B, government and technology stories in the nation’s capital since the era of the “sneakernet.” Vanessa has a master’s from the Columbia Graduate School of Journalism.