Central Command to release new data strategy

CENTCOM is charting its own data narrative with a new strategy while moving closer to Combined Joint All-Domain Command and Control.

The U.S. Central Command is about to sign out its data strategy, a significant milestone in the command's effort to chart its own data narrative.

While developing the strategy, the command relied heavily on the Chief Digital and Artificial Intelligence Office's data and AI strategy published last year. At the same time, it is taking its own approach to certain aspects of data governance.

“Our data strategy is going to be a living evolution in terms of how we build this out,” Michael Foster, the U.S. Central Command’s chief data officer, said at the Advantage DoD 2024 symposium Tuesday.

The command is focused on what the Pentagon calls "VAULTIS" (data that is visible, accessible, understandable, linked, trustworthy, interoperable and secure) as a practical framework to drive its governance processes.

Specifically, the command looks at data from different users' perspectives. For operators, for example, visibility and accessibility manifest through a traditional user experience; for developers, they mean well-documented application programming interfaces (APIs) that let them interact with the system.

“VAULTIS, as it applies to a persona, can look very different. We’re using that to proactively drive governance processes at CENTCOM,” Foster said.
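For a sense of what "visible and accessible" can look like for the developer persona Foster describes, the sketch below uses a self-documenting web API. The service name, endpoint and fields are hypothetical illustrations, not CENTCOM interfaces.

```python
# Minimal sketch of a self-documenting data API (all names hypothetical).
# FastAPI generates an OpenAPI spec from the type hints and field descriptions,
# so a developer can discover the dataset's schema without asking its owner.
from datetime import datetime, timezone

from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI(title="Track Data Service (illustrative)")

class TrackObservation(BaseModel):
    """One observation of a tracked object."""
    track_id: str = Field(description="Stable identifier for the tracked object")
    lat: float = Field(ge=-90, le=90, description="Latitude in decimal degrees")
    lon: float = Field(ge=-180, le=180, description="Longitude in decimal degrees")
    observed_at: datetime = Field(description="UTC timestamp of the observation")

@app.get("/tracks/{track_id}/latest", response_model=TrackObservation,
         summary="Most recent observation for a track")
def latest_observation(track_id: str) -> TrackObservation:
    # Placeholder lookup; a real service would query the underlying data store.
    return TrackObservation(track_id=track_id, lat=26.2, lon=50.6,
                            observed_at=datetime.now(timezone.utc))
```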

Foster’s office is particularly interested in several data quality dimensions, including accuracy, completeness, conformity, consistency, uniqueness, integrity and timeliness.
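Those dimensions only become actionable once they are measured. The sketch below shows one way a few of them (completeness, uniqueness and timeliness) can be computed over a tabular dataset; the column names, toy data and metrics are illustrative assumptions, not CENTCOM's actual checks.

```python
# Illustrative metrics for three of the dimensions named above: completeness,
# uniqueness and timeliness. Dataset and column names are hypothetical.
import pandas as pd

def quality_report(df: pd.DataFrame, key: str, timestamp_col: str) -> dict:
    now = pd.Timestamp.utcnow()
    ages = now - pd.to_datetime(df[timestamp_col], utc=True)
    return {
        # Completeness: share of non-null cells across the whole table.
        "completeness": float(df.notna().mean().mean()),
        # Uniqueness: no duplicate records for the declared key column.
        "uniqueness": float(1.0 - df[key].duplicated().mean()),
        # Timeliness: worst-case age of the data, in minutes.
        "max_age_minutes": float(ages.max() / pd.Timedelta(minutes=1)),
    }

# Toy dataset: one missing value and one duplicated key.
df = pd.DataFrame({
    "track_id": ["a1", "a2", "a2"],
    "lat": [26.2, None, 24.9],
    "observed_at": ["2024-02-20T10:00Z", "2024-02-20T10:05Z", "2024-02-20T10:05Z"],
})
print(quality_report(df, key="track_id", timestamp_col="observed_at"))
```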

“We’re very intrigued by those as a mechanism to get our arms around data quality,” Foster said. “Use of data is pervasive across CENTCOM right now. The notion that [Joint All-Domain Command and Control] is something in the future is just foreign to me. JADC2 is my every day, we are living JADC2.”

The command is currently moving toward Combined Joint All-Domain Command and Control (CJADC2). It is building out a zero trust environment with the ultimate goal of sharing it with a broader set of partners. While the environment is not yet operational, the command plans to add stakeholders and allies this year.

Foster’s particular area of concern as the chief data officer is the quality of data that is feeding JADC2.

“When we talk about what the curation of our data at the enterprise level looks like, that’s where I think we’re trying to build a creative solution there. And again, it’s building off those data dimensions that came out of the strategy,” Foster said.

The focus now is on building data-level unit tests across those dimensions for priority datasets.

“One of the advantages we have — you could argue we are at the beginnings of a data mesh. But our data mesh predominantly has two nodes right now. It is not a largely federated system,” Foster said.

“By having localized data stores, it makes it very easy for us to deploy a robust set of unit tests and to routinely monitor our data to understand things like how consistent it is, or to what extent we are having geographic anomalies when someone flipped [latitude and longitude]. That happens. You have a boat in the middle of a continent that’s not supposed to be there.”
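The kind of data-level unit test Foster describes can be as simple as asserting that coordinates fall inside a plausible envelope. The sketch below flags rows where latitude and longitude appear to have been transposed; the operating-area bounds, column names and sample rows are hypothetical.

```python
# Sketch of a data-level unit test for the "boat in the middle of a continent"
# class of error: latitude and longitude transposed at ingest. The operating
# box, columns and sample rows are illustrative, not CENTCOM's actual checks.
import pandas as pd

# Hypothetical maritime operating box (degrees).
BOX = {"lat_min": 10, "lat_max": 32, "lon_min": 32, "lon_max": 70}

def find_suspected_latlon_swaps(df: pd.DataFrame) -> pd.DataFrame:
    # Latitude can never exceed 90 degrees in magnitude.
    invalid_lat = df["lat"].abs() > 90
    # Rows whose reported position is outside the box but whose swapped
    # coordinates would fall inside it are likely transposed.
    in_box = (df["lat"].between(BOX["lat_min"], BOX["lat_max"])
              & df["lon"].between(BOX["lon_min"], BOX["lon_max"]))
    swapped_in_box = (df["lon"].between(BOX["lat_min"], BOX["lat_max"])
                      & df["lat"].between(BOX["lon_min"], BOX["lon_max"]))
    return df[invalid_lat | (~in_box & swapped_in_box)]

# Example: the second row has its coordinates transposed and gets flagged.
df = pd.DataFrame({"track_id": ["good", "flipped"],
                   "lat": [26.2, 50.6], "lon": [50.6, 26.2]})
print(find_suspected_latlon_swaps(df))
```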

One of the most challenging aspects of JADC2 is consolidating multiple data streams, each arriving at different times, into a single view. Multiple observations of the same object flow in separately, creating challenges in correlating and reconciling the data to understand where the object is in space and time.
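One common way to reason about that reconciliation is to merge every stream's observations into a single per-object timeline, ordered by when each observation was made, and treat the most recent report as the current estimate. The sketch below illustrates that pattern; the stream names, fields and values are invented for the example and are not CENTCOM data.

```python
# Toy sketch of reconciling observations of the same object arriving from
# multiple streams at different times: consolidate, order by observation time
# and keep the latest report per object. Stream names, fields and values are
# invented for the example.
import pandas as pd

radar = pd.DataFrame({
    "object_id": ["ship-7", "ship-7"],
    "observed_at": pd.to_datetime(["2024-02-20T10:00Z", "2024-02-20T10:12Z"]),
    "lat": [26.20, 26.25], "lon": [50.60, 50.66], "source": "radar",
})
ais = pd.DataFrame({
    "object_id": ["ship-7"],
    "observed_at": pd.to_datetime(["2024-02-20T10:08Z"]),
    "lat": [26.23], "lon": [50.63], "source": "ais",
})

# Order by when the observation was made (not when it arrived), then keep the
# most recent report for each object as the current position estimate.
combined = pd.concat([radar, ais]).sort_values("observed_at")
latest = combined.groupby("object_id").tail(1)
print(latest)  # ship-7 at (26.25, 50.66) from radar, as of 10:12Z
```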

This year, the command will focus on understanding those data quality dimensions and routinely monitoring quality against them.

Foster will also focus his efforts on data resiliency and smart infrastructure. As the command becomes more dependent on data, having a contingency plan for when data is unavailable becomes crucial to its operations.

“We need to have backup plans and options for pivoting to alternative data sources for rapidly restoring data. I think that’s motherhood and apple pie. The challenge of resiliency is making that happen in a way that’s responsive, so that if there is an outage, what is that turnaround time before content is restored,” Foster said.
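The pivot-to-an-alternative-source pattern Foster mentions can be sketched as a fetch with a declared fallback and a measured turnaround time. The source names and fetch functions below are hypothetical placeholders, not CENTCOM systems.

```python
# Minimal sketch of pivoting to an alternative data source during an outage
# and measuring the turnaround before content is restored. The source names
# and fetch functions are hypothetical placeholders.
import time

def fetch_primary():
    raise TimeoutError("primary feed unavailable")  # simulated outage

def fetch_backup():
    return {"source": "backup", "rows": 1200}       # stale-but-usable copy

def fetch_with_fallback(primary, backup, max_outage_seconds=30.0):
    started = time.monotonic()
    try:
        data = primary()
    except Exception:
        data = backup()                             # pivot to the alternate source
    turnaround = time.monotonic() - started
    if turnaround > max_outage_seconds:
        # In practice this would alert an on-call team via monitoring.
        print(f"restoration exceeded target: {turnaround:.1f}s")
    return data, turnaround

data, seconds = fetch_with_fallback(fetch_primary, fetch_backup)
print(data["source"], f"restored in {seconds:.3f}s")
```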

Foster said workflow automation is where the command has made the biggest gains in the last year.

“I would love to tell you that it was the amazing narrative of AI. I want AI to be relevant and central to all things good. The reality is that automation has been where we’ve seen the most gains; it’s been the most accessible for broader teams. But when I think about automation and where decision advantage comes from, I think workflow automation doesn’t get the proper highlights that it deserves. Because so much of the speed of our workforce is inherently anchored in unstructured data processes, manual transfers, phone calls across teams to make sure everyone has an understanding of what’s happening,” he said.
