Data has always been in some sense the purpose of information technology systems. Now, the need for data management is only growing. One reason is the advent of artificial intelligence, which both requires large amounts of data for training and generates new data products. Another is that large elements of organizations’ IT infrastructure – such as networks, network devices, processors and disks – increasingly exist as software: virtual instances that easily move among cloud computing environments or scale up and down on demand.
Still another reason: For integrated, enterprise approaches to digital services, organizations need to meld multiple sources of data, independent from their original applications, into well-managed sources available to multiple applications.
In the Defense Department, mission operations as well as the supporting functions like acquisition and contract management are becoming more joint and integrated. Data sharing at an enterprise level requires careful data management to ensure interoperability and basic availability of data to organizations and applications that need it.
The Defense Contract Management Agency (DCMA) exemplifies this trend. With 10,000 employees overseeing the contracts with the entire defense industrial base, its data strategy comes down to three words: collaboration, consolidation and standardization. That’s according to DCMA Chief Data Officer Craig Morgan.
“We started moving to enterprise solutions,” Morgan said, “not only to collaborate internally but also to talk to our customers and our sister agencies so that our data would be on one data standard.” Ultimately, Morgan said, “we’re getting our data out of the masses so that our customers can make better acquisition decisions.”
He added, “By moving in enterprise environments we get on standardized data lakes, and everybody can utilize our tools.”
Morgan was part of a Federal News Network panel convened to explore the issue of how best to realize the power of data using sound management and storage strategies.
For Cmdr. Tim Beach, department officer at the Naval Surface Warfare Center, Port Hueneme Division, the challenge is keeping data up to date and available even in remote or disconnected environments.
“When you talk about how we capture data strategically,” Beach said, “I think that the big Navy and DoD have approached it to say that data centricity is different than we’ve approached it in the past.” Whereas data in the past belonged to a specific application, now Navy operators think about it as a commodity in its own right, “as in, if your application dies, the data is still good,” Beach said. “Can you get to that data with a different application?”
He noted that the Joint All Domain Command and Control project, or JADC2, and the Joint Warfighting Cloud Capability, or JWCC, both represent efforts “to the point where we have managed services where people are able to self-serve what they need.” Beach added, “One of the tricky parts of managed services, and what industry, both commercial and DoD, is struggling with, is how do you create trusted and secure data feeds?”
The Navy, like the other armed services, also must have the ability to keep going in contested environments with interrupted reach-back to cloud services or data centers.
“From an infrastructure standpoint, we’re driving to try to align all of these different stakeholders and users of data because everybody is a user and creator of data,” Beach said.
To Chris Cargile, the senior manager of Dell Technologies Storage Platform Solutions, data management planning must take into account three trends.
Data management must also account for the fact that applications of record are no longer the only producers of data.
“What we see is a requirement to generate more data than we’ve ever seen before,” said James Rogers, director of data center modernization at Iron Bow Technologies. “Every single soldier, sailor, airman and Marine is out there, generating some sort of data. They need to be able to process that in real time locally in very austere environments.”
Rogers said locally generated data needs eventual repatriation to data centers or clouds, “to be able to look at the bigger picture and be able to utilize technologies like AI and machine learning to aggregate that data and make inferences.”
He said the same data also requires capture and storage locally “to provide that warfighting advantage.” Thus the emergence of hardware solutions that bring cloud architectures to local, sometimes disconnected, deployments. Wherever it is used, Rogers said, “we end up with a lot of the traditional questions. Are we backing it up? Are we encrypting it locally and in flight? Is it immutable to help protect against ransomware? Are we meeting our logging requirements?”
In short, DoD’s needs for remote and central computing plus its multi-cloud hybrid environments add up to what Rogers called “a pretty complex environment.”
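The questions Rogers lists lend themselves to a simple automated audit. The sketch below is illustrative only; the `StorageConfig` fields and check names are hypothetical assumptions for this example, not drawn from any specific DoD system or vendor product:

```python
from dataclasses import dataclass

@dataclass
class StorageConfig:
    """Hypothetical settings for one storage deployment, edge or central."""
    backup_enabled: bool        # "Are we backing it up?"
    encrypted_at_rest: bool     # "Are we encrypting it locally..."
    encrypted_in_flight: bool   # "...and in flight?"
    immutable_snapshots: bool   # "Is it immutable?" (ransomware protection)
    logging_enabled: bool       # "Are we meeting our logging requirements?"

def audit(cfg: StorageConfig) -> list[str]:
    """Return the list of unmet data-protection requirements."""
    checks = {
        "backup": cfg.backup_enabled,
        "encryption at rest": cfg.encrypted_at_rest,
        "encryption in flight": cfg.encrypted_in_flight,
        "immutability": cfg.immutable_snapshots,
        "logging": cfg.logging_enabled,
    }
    return [name for name, ok in checks.items() if not ok]

# A notional edge node that passes some checks but not others.
edge_node = StorageConfig(True, True, False, False, True)
print(audit(edge_node))  # → ['encryption in flight', 'immutability']
```

Running the same audit against every deployment, remote and central, is one way to keep Rogers’ “traditional questions” answered consistently across a hybrid environment.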
Beach said the Navy is looking at hardware tech stacks that it can update with new data and new functionality, without disrupting tactical operations or breaking security measures already in place – almost, he said, in the app store model.
“We have to be able to communicate a base amount of data back to maintain security updates, tactical updates with functionality, with actual value to warfighting capability,” Beach said. He added that the challenge is “how you leverage and integrate with tech, versus interfacing data with humans.”
At the DCMA, Morgan said, the agency is studying ways to modernize its basic COBOL Mechanization of Contract Administration Services, or MOCAS, application. The goal is integration of MOCAS and various “sidebar” applications into the hybrid data infrastructure “so those are available at every contract management office that we have, across the world at any time. So anybody can touch anybody else’s data across our agency at any time.”
Emerging data storage and management strategies, panelists agreed, will have to provide coherence so that users, data scientists and the IT staff will know what data they have and where it is, across an enterprise consisting of multiple clouds, data centers, edge computing centers and all of the entities generating data. Plus, Rogers noted, the IT staff still needs a tiered storage plan to optimize costs as a given data set ages.
“The good news is, we do have some really good weapons in that fight,” he said. Weapons include software-defined storage. “We can develop a single software product, we can put it on premises at multiple facilities,” Rogers said. “And although each one of them stores data, they function as a single cohesive platform, which means that our applications and our end users, no matter where they live, can talk to that storage in any location.”
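The “single cohesive platform” Rogers describes can be sketched in miniature: writes land wherever the client happens to be, while readers never need to know which site physically holds the data. The `Site` and `FederatedStore` classes below are hypothetical illustrations of that idea, not any vendor’s actual API:

```python
class Site:
    """One storage location: a data center, cloud region or edge node."""
    def __init__(self, name: str):
        self.name = name
        self.objects: dict[str, bytes] = {}

class FederatedStore:
    """Toy sketch of software-defined storage spanning several sites."""
    def __init__(self, sites: list[Site]):
        self.sites = sites

    def put(self, key: str, value: bytes, site_name: str) -> None:
        # The write lands at whichever site the client is nearest to.
        site = next(s for s in self.sites if s.name == site_name)
        site.objects[key] = value

    def get(self, key: str) -> bytes:
        # Readers query the namespace, not a location: any site can answer.
        for site in self.sites:
            if key in site.objects:
                return site.objects[key]
        raise KeyError(key)

store = FederatedStore([Site("edge-ship"), Site("dc-east"), Site("cloud-west")])
store.put("mission/telemetry.log", b"...", site_name="edge-ship")
data = store.get("mission/telemetry.log")  # served regardless of caller location
```

A production system would add replication, consistency guarantees and access control, but the design point is the one Rogers makes: applications and end users “can talk to that storage in any location.”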
This approach, Cargile said, requires a special emphasis on cybersecurity.
“With software defined, we can have tools that work in the cloud and on premises,” he said. “But you always want a fallback position. If you are to lose that connectivity or if data is corrupted or disrupted, are you able to recover quickly on the edge and get back to the key data centers? It’s important that organizations really map out and understand that cyber terrain and know that they can quickly recover.”