When minutes matter to intelligent decisions

John Kreger, a vice president for public sector programs at MITRE’s Center for Programs and Technology, details four steps to unlock the power of intelligence...
Minutes matter. Every pilot, doctor and first responder knows it. For time-critical problems, people need fast access to data from multiple and often disparate sources.
From the Defense Department to the Centers for Medicare and Medicaid Services, agencies spend billions each year to build, maintain and secure systems that manage citizen data. And they spend millions more to connect these systems.
Getting to that connection requires “interoperability.” Roughly defined, it means sharing data across many users and systems. With a more complete picture, people can analyze data for better decisions, faster. It’s not easy to achieve, but it’s more necessary than ever.
What if we could rethink the very concept of interoperability? Make it work smarter, not harder. Using a concept called “intelligent interoperability,” people could make potentially lifesaving decisions in minutes, rather than weeks or months. Even better? These approaches can be cheaper and more secure.
The myth of interoperability
It’s a myth that the only way to make data accessible is by migrating the information to a common database or creating dedicated machine-to-machine connections. It’s an even bigger myth that all you need is technology. That’s where intelligent interoperability comes in.
Intelligent interoperability requires us to determine what problems we’re solving by asking the right questions. Once we do, we know what information we want to share, and with whom.
Getting to the actual data sharing, however, starts with a foundation of laws, policy and regulations. Stakeholders must develop data-use agreements to allow trusted sharing of potentially sensitive data. Agencies may even want to establish incentives (as in the financial and travel industries).
Instead of expecting systems to exchange all data, intelligent interoperability identifies the right core set of data. That requires studying use cases and workflows, then modeling that data using open standards and shared terminology among all users.
There are two common ways to use the selected data: move it into a common database, or send queries to the data owners. The second option keeps the original data in its “home.”
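The second pattern can be sketched in a few lines. This is a minimal, hypothetical illustration (the agencies, records and field names are invented): each data owner keeps its records in place, and a coordinator asks each owner the same question, merging only the agreed-upon core fields.

```python
# Sketch of the "query the data owners" pattern: data stays in each owner's
# store, and only an agreed core set of fields ever moves. All names and
# records here are hypothetical illustrations, not real systems.

AGENCY_A = {"p-100": {"name": "Jordan", "status": "active", "ssn": "REDACTED"}}
AGENCY_B = {"p-100": {"name": "Jordan", "last_visit": "2024-03-01"}}

CORE_FIELDS = {"name", "status", "last_visit"}  # the agreed core data set

def query_owner(store, record_id):
    """Return only the core fields for one record; nothing leaves in bulk."""
    record = store.get(record_id, {})
    return {k: v for k, v in record.items() if k in CORE_FIELDS}

def federated_lookup(record_id, owners):
    """Fan the same query out to every owner and merge the partial answers."""
    merged = {}
    for store in owners:
        merged.update(query_owner(store, record_id))
    return merged

result = federated_lookup("p-100", [AGENCY_A, AGENCY_B])
# The merged view carries the core fields but not the sensitive "ssn" field.
```

The point of the sketch is the filter step: sensitive fields outside the agreed core set never leave their home system, which is what makes this approach easier to govern than wholesale migration.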
Of course, not every problem can, or should, be solved the same way.
For example, application programming interfaces (APIs) loosely connect different systems so they can exchange information flexibly, on demand, without the burden of sharing all of it, all the time. We believe that defining open, standard APIs across entire industries, such as healthcare and emergency response, is a key to lowering cost and improving outcomes.
In turn, APIs demand closer attention to privacy and cybersecurity. It’s all about building a systems approach around important problems—technology plus processes, guidelines and people.
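To make the privacy point concrete, here is a toy sketch (the token, paths and dictionaries are invented for illustration) of the check every on-demand API call implies: the requester proves who it is, and the data holder verifies that proof before releasing anything.

```python
# Toy illustration of why open APIs bring security work with them: every
# on-demand request carries proof of who is asking, and the data holder
# checks it before answering. Token and paths are made up for the example.

def build_request(path, token):
    """Attach a bearer token, as OAuth-style APIs commonly require."""
    return {"path": path, "headers": {"Authorization": f"Bearer {token}"}}

def handle_request(request, valid_tokens):
    """The data holder verifies the caller before releasing any data."""
    auth = request["headers"].get("Authorization", "")
    if auth.removeprefix("Bearer ") not in valid_tokens:
        return {"status": 401, "body": None}  # unauthorized: no data leaves
    return {"status": 200, "body": {"resource": request["path"]}}

ok = handle_request(build_request("/records/p-100", "secret-token"), {"secret-token"})
denied = handle_request(build_request("/records/p-100", "wrong"), {"secret-token"})
```

In a real deployment the token check would be delegated to an identity provider rather than a set lookup, but the shape of the exchange, credential in, verified data out, is the same.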
An intelligent way forward to identify the right data
Take healthcare. Connecting data across electronic health record systems enables patients and their providers to improve care by accessing a single, computable health record. It’s safer for patients to share a complete health record, rather than one fragmented across multiple providers. Plus, they can use their personal devices to be more proactive about their health. Given that the United States is moving toward alternative payment models that reward good health outcomes, the ability to measure quality of care is gaining momentum.
While the health industry has been slow to evolve toward modern architectures, it received a big push a few years ago with Fast Healthcare Interoperability Resources (FHIR), an open API standard for exchanging electronic health records. Choosing the “right” health data to share has the potential to fundamentally change the healthcare system. Imagine if information that’s currently stuck with one provider could instead flow readily from that one provider to the next. That would be intelligent interoperability in action—enabling information to unlock healthy outcomes. And it’s beginning to happen.
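A glimpse of what that exchange looks like at the API level: FHIR’s read interaction is a plain RESTful GET for a resource, returning JSON that any conforming system can parse. The server base URL below is a placeholder, and the response is a hand-written miniature of a FHIR Patient resource, not real data.

```python
# Sketch of a FHIR read interaction. The base URL is a placeholder and the
# "response" is a simplified, hand-written FHIR Patient resource.
import json

FHIR_BASE = "https://fhir.example.org/r4"  # placeholder FHIR server

def patient_url(patient_id):
    """FHIR's read interaction is just GET [base]/Patient/[id]."""
    return f"{FHIR_BASE}/Patient/{patient_id}"

# What a (much simplified) Patient resource looks like on the wire:
sample_response = json.dumps({
    "resourceType": "Patient",
    "id": "p-100",
    "name": [{"family": "Jordan", "given": ["Alex"]}],
})

resource = json.loads(sample_response)
display_name = f'{resource["name"][0]["given"][0]} {resource["name"][0]["family"]}'
```

Because every conforming system names and structures these resources the same way, a record can move from one provider’s system to another’s without bespoke translation, which is precisely the promise of an open standard API.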
Steps to intelligent interoperability
Intelligent interoperability isn’t limited to healthcare. It’s relevant for any area that would benefit from rapid, streamlined and selective data sharing.
Regardless of the domain, we believe an intelligent interoperability framework should include these key tenets:
Multidisciplinary approach involving all aspects of interoperability: legal, policy, business, culture, systems, architectures, cybersecurity and data privacy.
User-centric perspective that involves operational experimentation to identify the key information to be shared among stakeholders.
A scalable ecosystem of reusable components, tools, open source modeling resources and testing resources that together facilitate innovation and rapid discovery.
A secure ecosystem of reusable, easy-to-configure components, tools, open source modeling resources and testing resources that facilitate innovation and rapid discovery, regardless of location.
For this framework to have impact, we always start with convening the right stakeholders. The first two steps—knowing your outcomes and understanding your users’ needs—are vital before any actual technical or process changes get underway.
To unlock the full potential of our data, we can no longer rely on traditional methods. Together, we should rethink the very concept of interoperability. We can make time-critical, data-driven decisions in minutes, not weeks or months. The benefits to the economy—and to the convenience, health and safety of our people—would be incalculable.
John Kreger is vice president for public sector programs at MITRE’s Center for Programs and Technology.
Copyright © 2024 Federal News Network. All rights reserved. This website is not intended for users located within the European Economic Area.