Insight by Palantir

The value of public and private sector partnerships

Success in the great power competition depends on innovation, speed and the ability to buy new and emerging capabilities. The Defense Department is leading a Global Posture Review to ensure the military footprint is aligned with U.S. foreign policy and national security priorities.

In 2015, the Obama administration signaled the reemergence of great power competition. By January 2018, the Trump administration’s National Defense Strategy formally reoriented U.S. national security and defense strategy toward an explicit primary focus on great power competition with China and Russia.

Today, the Defense Department is leading a Global Posture Review so that the military footprint is appropriately aligned with our foreign policy and national security priorities.

At the center of this effort, of course, is data. Not just data about China or Russia: data underlies every strategy and plan DoD is embarking on. Both the review and the great power competition depend on innovation, speed and the ability to buy new and emerging capabilities.

DoD must ensure it’s positioned for success in this great power competition.

Doug Philippone, the global defense lead at Palantir, said that to do so, the Pentagon must move faster to fully grasp the power of its data.

“The core problem is still fundamentally they’re talking about it too much. When you are talking about sensor to shooter, it’s about how quickly can you make those decisions? If you have a swarm of drones coming at you, how can you make the decision about what are you going to do about it? How long do you have to react? These are all really hard problems,” Philippone said during the discussion Technology and Great Power Competition: The Value of Public/Private Sector Partnerships sponsored by Palantir. “One thing that I’ve seen both across commercial industry and in the government is there are very few examples of a technology strategy that actually works. The core thing we’re really talking about is IT transformation, and most of the time those things fail. People talk about it. They’ll put out a white paper. They’ll get promoted based off the white paper or the procurement, and then they’re long gone before there’s any accountability or results whatsoever on those big problems. The biggest mistake folks make is waiting. You just need to get started. The more you talk about it, the more you want to make it perfect. You just don’t get there from here.”

And to begin taking advantage of the data, agencies must assess and catalog the information and then begin to create the integration map.

“You can’t wish away all that complexity. You have to just embrace it and get started and do something useful and focus on producing an outcome right now. And so that just means you have to start somewhere,” Philippone said. “You can’t connect all these things in the wild. It will take 10 years and it will never work. And doing a data lake never works. You just can’t throw all the data into a ‘lake’ because you’d lose all the lineage and the auditability of that stuff.”

Philippone said agencies need technology that lives between the operating systems to connect the disparate data and address applicable cybersecurity rules.
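To make that pattern concrete, the short Python sketch below shows one way records from disparate systems can be mapped into a common model while keeping lineage back to each source, which is what preserves auditability. It is illustrative only, not Palantir’s implementation, and every system name, field and mapping in it is hypothetical.

# Not Palantir's implementation; an illustrative pattern only. The source
# systems, field names and mappings below are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IntegratedRecord:
    entity_id: str                # common key shared across systems
    attributes: dict              # normalized fields in the common model
    lineage: list = field(default_factory=list)  # where each value came from

def integrate(source_name, raw, mapping, record):
    """Map one raw record into the common model and record its lineage."""
    for src_field, common_field in mapping.items():
        if src_field in raw:
            record.attributes[common_field] = raw[src_field]
            record.lineage.append({
                "field": common_field,
                "source_system": source_name,
                "source_field": src_field,
                "ingested_at": datetime.now(timezone.utc).isoformat(),
            })
    return record

# Two hypothetical systems describe the same unit with different schemas.
record = IntegratedRecord(entity_id="UNIT-1001", attributes={})
integrate("legacy_mainframe", {"UNIT_CD": "3BCT", "RDY": "C2"},
          {"UNIT_CD": "unit", "RDY": "readiness"}, record)
integrate("modern_hr_system", {"unitName": "3rd Brigade Combat Team"},
          {"unitName": "unit_name"}, record)
print(record.attributes)  # normalized view across both sources
print(record.lineage)     # audit trail back to each source system

The point of the sketch is simply that each integrated value keeps a pointer back to the system it came from, which is the lineage a raw data lake throws away.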

One of the ways agencies can break through these challenges is by partnering more closely with the private sector. Philippone said while the government has some unique needs, its core requirements are not unique.

“This is a really exciting time where the government can gain efficiencies in terms of time and cost efficiencies in terms of just the time to field an effective system, which, first of all, is probably the most important thing,” he said. “But then also getting it for cheaper is important because you’re taking advantage of either the investor base that’s sponsoring on the venture side and then also from the commercial side, these products actually work in the wild. We’re solving problems out there that the government can take advantage of where you’re able to start with the 80% solution.”

Philippone pointed to an example from the Army that highlights this approach.

He said the Army wanted to better understand its readiness from a human resources perspective.

“It was pretty intense in terms of the scale of the data. If you look at just the HR side of the Army, there’s something like 191 systems that had been built over the last four years or so, and they don’t necessarily talk to each other. They’re written in different languages. They have different data models, etc. And then how do you bring those together so that you just know the full story of a soldier from when they are recruited to the time they retire? That ends up being a hard technical problem,” Philippone said. “We were able to apply our technology and just configure it for a different use case. We started with a pilot, and the time until we had something useful was like 13 days.”
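A toy version of that idea, joining one soldier’s records out of several disconnected systems into a single, time-ordered history, might look like the Python sketch below. The systems, their schemas and the events are hypothetical and greatly simplified.

# Illustrative only: the three "systems", their schemas and the events are
# hypothetical and greatly simplified.
from datetime import date

recruiting_system = [{"person_key": "p-001", "event": "enlisted", "date": date(2008, 6, 1)}]
training_system = [{"member_id": "p-001", "course": "Basic Combat Training", "completed": date(2008, 9, 12)}]
assignment_system = [{"id": "p-001", "unit": "3BCT", "start": date(2009, 1, 5)}]

def soldier_timeline(key):
    """Pull one person's records out of each system and merge them by date."""
    events = []
    events += [(r["date"], "recruiting", r["event"]) for r in recruiting_system if r["person_key"] == key]
    events += [(r["completed"], "training", r["course"]) for r in training_system if r["member_id"] == key]
    events += [(r["start"], "assignments", r["unit"]) for r in assignment_system if r["id"] == key]
    return sorted(events)  # one chronological record assembled across systems

for when, system, detail in soldier_timeline("p-001"):
    print(when, system, detail)

The hard part at Army scale is, of course, that each of the 191 systems keys and formats its records differently; the sketch only shows the shape of the merged result.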

The Army isn’t alone in this type of challenge where it needs to pull data from disparate databases to better understand how to solve a problem.

Philippone said agencies have to know where people, equipment, money and other resources are when a crisis happens and not have to find data through spreadsheets or other disparate databases.

“The core problem is the same and it is that information is all over the place. We haven’t gotten rid of mainframes. The data models or schemas of those data are often different. The permissions for each system are different. The way you call and structure the data, the format of the data, how large the dataset is, how you move that around so that you can make a decision are all different. It’s a seminal problem,” he said. “Palantir has applied our core technology, the operating system that connects these things, in almost every industry around the world at this point. Our operating system can bring that universe of data together.”
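The permissions problem Philippone describes can be sketched the same way: a layer that connects source systems has to carry each system’s access rules through to query time rather than flattening them away. The systems, roles and records below are, again, hypothetical.

# Hypothetical systems, roles and records; the point is that access rules
# travel with each source rather than being flattened away.
SOURCES = {
    "logistics_db": {"required_role": "logistics", "records": [{"item": "fuel", "qty": 1200}]},
    "personnel_db": {"required_role": "hr", "records": [{"name": "Jane Doe", "unit": "3BCT"}]},
}

def federated_query(user_roles):
    """Return only the records the caller is permitted to see, per source-system rules."""
    visible = {}
    for name, source in SOURCES.items():
        if source["required_role"] in user_roles:
            visible[name] = source["records"]
    return visible

print(federated_query({"logistics"}))        # sees logistics data only
print(federated_query({"logistics", "hr"}))  # sees both sources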

