A ‘once in a lifetime opportunity’ for agencies to use data to improve healthcare results

During this exclusive webinar, moderator Jason Miller will discuss with agency and industry leaders how agencies can start collecting data and pursue process improvements in healthcare.


Date & Time: On Demand
Format: Webinar
Duration: 1 hour
Cost: No Fee

The COVID-19 pandemic reinforced the power and importance of using data to drive decisions.

But what agencies didn’t know over the previous two-plus years is just how crucial that information would become over the long term as they prepare for future health emergencies.

Agencies and the private sector saw the value of working together to solve immediate problems, whether distributing vaccines or personal protective gear across the country or sharing emergency room status data showing where patient spikes occurred.

Now that public and private sector organizations can take half a step back from the COVID emergency, they can start collecting and analyzing the lessons learned and looking at process improvements.

Data will drive much of that understanding, and agencies will need the underlying technology to process that data.

Jeff Shilling, the chief information officer at the National Cancer Institute at the National Institutes of Health, said the biggest change he’s seen over the last few years is the ability to collect and maintain data in a more centralized way that can be more easily shared.

While the public and private sector medical community still has a long way to go to make full data sharing a reality, Shilling said the White House’s scientific data sharing initiative is driving a lot of their focus.

“The federal government has formally created a program called the Cancer Moonshot. The goal is to be able to really identify and address specifically the barriers that were stopping us from moving forward as quickly as possible. And certainly, one big barrier is really the data integration,” Shilling said during the discussion “Advancing data analytics in health: Lessons learned from the pandemic.” “While we might have some technical barriers, like not using standard terms for everything, we also do have some policy and some visioning ones, and that really ties into your preparedness. What are we? What are we envisioning? How are we planning?”

The planning and vision that Shilling refers to becomes more difficult as the amount of data continues to increase.

Preparing for the future

That is the situation the Food and Drug Administration found itself in as the pandemic started.

Sara Brenner, the chief medical officer for In Vitro Diagnostics and the associate director for medical affairs in the Center for Devices and Radiological Health at the FDA, part of HHS, said the agency has an opportunity to prepare for a future of ever-increasing amounts of data it must manage and understand.

“The COVID pandemic has been one large example where the government has rapidly tried to build systems to address data, capture and analyze new data in real time. But we’re absolutely at that point where we think about lessons learned and how do we prepare for the future,” Brenner said. “I think at FDA, one of the ways that we’re really now digging into this is in looking at medical products across what we call their total product lifecycle. That is evaluating pre-market, medical device performance as well as post-market medical product or device performance. Understanding how those products perform, perhaps in ways that weren’t anticipated just through the pre-market review process. That is a data heavy process, and at the FDA, we care a lot about regulatory grade quality that informs our decision making with real implications. We are very interested in how to improve the quality of data as well as the completeness of data captured, while recognizing that in this world of so much data, you can’t just ask for it all and have it all if there’s no utility.”

The Centers for Disease Control and Prevention in the Department of Health and Human Services found the volume and velocity of data during the pandemic reinforced the need to have systems and an infrastructure that are agile and scalable to handle the latest challenge.

“Our learnings during the current pandemic was this incredible improvements and enhancements to our data ecosystem. But at the end of the day, the data is only as good as how well we can communicate out the findings about it,” said Matthew Ritchey, the chief of the Partnerships and Evaluation Branch and lead of the COVID-19 Data, Analytics and Visualization Task Force for the CDC. “That’s a big thing that I think we learned during the pandemic. When you’re in the throes of an early part of a pandemic, in particular, and you don’t have all the information you would want as being that research scientist sitting at your bench, you want to be able to cross your ‘Ts’ dot your ‘Is,’ when you’re in that middle ground of evidence for a response around this new condition, how do you effectively communicate out what you do know? I think that’s a big thing that was learned from this as well. It’s not just about the data. It’s not about the pipelines, not even just about the analytics. It’s about how do you package that information in the right way to the public, to other consumers, to make sure that we’re informing them as we know what we know, and knowing that that could evolve as well.”

Trust and transparency remain key

Dr. Bill Kassler, the chief medical officer for Palantir, said what Ritchey is describing is the work of ensuring citizens trust the data and the information coming from the government and other organizations.

Kassler said agencies need to have the right technology and analytical capabilities to help drive that trust and transparency message.

“You can’t collaborate unless everybody has access to the data and feels comfortable that there is a robust governance, that there is security, that there is role based and purpose based access control, that privacy and civil liberties issues will be addressed within that governance structure. One of the things that COVID has taught us is it illuminated a lot of preexisting flaws within the system that were there, but perhaps weren’t generally noticed,” he said. “We saw the health care system, and primary care in particular, was overwhelmed. We saw federal, state and local health departments overwhelmed by the COVID surge. We saw supply chains that have been, and still in some cases, stretched thin by COVID. Somehow our frayed data infrastructure was just not up to the task. It hampered our ability to respond.”

Kassler said agencies have a once-in-a-generation opportunity to modernize data, systems and analytical capabilities to get closer to having real-time situational awareness.

He said that through the use of predictive analytics, public and private sector organizations can understand where the next health emergency is heading and get ahead of it.

Learning objectives:

  • Data Strategy Around Public Health Data
  • Use Cases
  • The Elements of Infrastructure

Complimentary registration
Please register using the form on this page or call (202) 895-5023.

Speakers
Jeff Shilling
Chief Information Officer, National Cancer Institute, NIH
Matthew Ritchey
Branch Chief, Center for Surveillance, Epidemiology and Laboratory Services and Lead, COVID-19 Data, Analytics & Visualization Task Force, CDC
Sara Brenner
Chief Medical Officer, In Vitro Diagnostics and Associate Director for Medical Affairs, Center for Devices and Radiological Health, FDA
Dr. Bill Kassler
Chief Medical Officer, Palantir
Jason Miller
Executive Editor, Federal News Network