June 20, 2024
Navigating the complex terrain of healthcare interoperability

While some progress has been made in exchanging data, there are challenges to solve to envision a seamless future in health data management.

This article is Part 4 in a series about the healthcare data explosion. View Part 5 here.

In a series of articles elsewhere, we have explored the challenges and opportunities presented by the extensive data explosion in the healthcare industry. You can find the initial publication, covering all five themes, here.

In our previous article in Health Data Management, we explored the concept of data silos and discussed the necessary steps to transform them into cohesive data islands, fostering improved communication. In this installment, we review the regulatory efforts that have begun driving this transformation while also examining the lingering challenges hindering interoperability today.

Interoperability is an enduring challenge

According to a survey conducted by the American Hospital Association, more than 70 percent of hospitals reported challenges in exchanging information across various vendor platforms. Additionally, 70 percent expressed difficulties in locating provider addresses, and 67 percent noted, “There are providers with whom we share patients, but data exchange is not common.” Although the survey dates to 2021, many of these challenges continue to plague the healthcare landscape today.

Since 2021, the Office of the National Coordinator (ONC) has been actively implementing various health IT provisions outlined in the 21st Century Cures Act. Concurrently, the expanding landscape of vendors and service providers in this domain points to the enduring nature of this challenge.

Why is it still so hard?

There are many reasons why interoperability remains a vexing problem for healthcare in this country.

For one, the U.S. lacks a standardized patient identifier, which is a major hurdle in achieving healthcare interoperability. While the debate continues about whether such an identifier should exist and the associated political, competitive, privacy and regulatory implications, healthcare organizations still spend considerable time ensuring that patient records match. Any mistakes in this process could expose a patient’s private health information.
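To illustrate why record matching consumes so much effort in the absence of a standardized identifier, here is a minimal Python sketch of a weighted match score over name and date of birth. This is not any organization's production logic; the weights and threshold are illustrative assumptions, and real matching systems use far richer demographics and probabilistic models.

```python
from difflib import SequenceMatcher

def normalize(s: str) -> str:
    """Lowercase and strip punctuation/whitespace for comparison."""
    return "".join(ch for ch in s.lower() if ch.isalnum())

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted similarity across name and date of birth.
    An exact DOB match is weighted heavily; names are fuzzy-matched
    so 'Smith, John Q.' and 'John Smith' still score well."""
    name_sim = SequenceMatcher(
        None,
        normalize(rec_a["name"]),
        normalize(rec_b["name"]),
    ).ratio()
    dob_match = 1.0 if rec_a["dob"] == rec_b["dob"] else 0.0
    return 0.4 * name_sim + 0.6 * dob_match

a = {"name": "Smith, John Q.", "dob": "1970-03-14"}
b = {"name": "John Smith", "dob": "1970-03-14"}
print(match_score(a, b) > 0.75)  # True: likely the same patient
```

Note the stakes the paragraph describes: a threshold set too low merges two different patients' records (a privacy exposure), while one set too high fragments a single patient's history.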

The second challenge is the growing complexity of healthcare delivery across various settings like hospitals, nursing facilities, pharmacies and new digital-care options such as retail health. This complexity makes it more difficult to connect patient data from different sources using different systems, like electronic medical records and health information networks, making it challenging to create a longitudinal patient record.

Finally, the most significant hurdle is the inconsistency in clinical practices and data formats across healthcare. For instance, recording ejection fraction, an indicator of heart failure, comes in various forms, from unstructured notes to standardized codes. This diversity in data capture methods makes it difficult for systems to integrate data seamlessly, affecting various data types, like family history, clinical notes and more.

Inherent risks with interoperability

The technological aspects of interoperability (integration engines, data formats, data standards and APIs) are just part of a broader set of challenges around security, privacy and the lack of incentives alignment.

For instance, two competing hospitals in a region could be wary of sharing data because of the risk of losing patients to the other system, or revealing competitive or sensitive information. Unless there are broader regulatory “sticks,” providers and payers often take the path of least resistance, which is to avoid the sharing of data.

During the past few years, significant strides have been made by the ONC in addressing the challenges highlighted in the survey, thanks to the introduction of new interoperability and information-blocking rules. These vital developments include:

  • The United States Core Data for Interoperability (USCDI), which provides a standardized framework encompassing essential health data classes and elements, accessible through APIs, such as the Fast Healthcare Interoperability Resources (FHIR) standard.
  • The implementation of fines and penalties for non-compliance with these standards via the 21st Century Cures Act.
  • The introduction of the Trusted Exchange Framework and Common Agreement (TEFCA), offering an infrastructure model and governance approach for secure sharing of basic clinical information across different networks.
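To make the USCDI-through-FHIR point concrete, here is a minimal Python sketch of a standard FHIR R4 Patient search request and of pulling patient names out of the returned Bundle. The server base URL is hypothetical (real endpoints vary by vendor), and the sample Bundle is a hand-built illustration rather than any real server's response.

```python
import json
from urllib.parse import urlencode

# Hypothetical FHIR R4 server base URL; real endpoints vary by vendor.
BASE = "https://fhir.example.org/r4"

def patient_search_url(family: str, birthdate: str) -> str:
    """Build a standard FHIR R4 Patient search request."""
    params = urlencode({"family": family, "birthdate": birthdate})
    return f"{BASE}/Patient?{params}"

# A minimal searchset Bundle, shaped the way a FHIR server returns it.
sample_bundle = json.loads("""{
  "resourceType": "Bundle",
  "type": "searchset",
  "entry": [{
    "resource": {
      "resourceType": "Patient",
      "id": "example",
      "name": [{"family": "Smith", "given": ["John"]}],
      "birthDate": "1970-03-14"
    }
  }]
}""")

def patient_names(bundle: dict) -> list[str]:
    """Extract display names from a Patient search Bundle."""
    names = []
    for entry in bundle.get("entry", []):
        res = entry["resource"]
        if res["resourceType"] == "Patient":
            n = res["name"][0]
            names.append(f'{" ".join(n["given"])} {n["family"]}')
    return names

print(patient_search_url("Smith", "1970-03-14"))
print(patient_names(sample_bundle))  # ['John Smith']
```

The point of the standard is exactly this uniformity: the same query shape and the same resource shape, regardless of which certified system sits behind the endpoint.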

However, these initiatives will only slowly create change. Software vendors will take time to develop product features, and implementation and adoption of those features will unfold gradually across many thousands of providers and hundreds of payers. Also, it is unclear how business and care models will change to meet new standards, which alone may not be sufficient to support operational adoption.

Reimagining, reassessing interoperability

The next decade in healthcare will see disruption in care delivery models. Seamless interoperability and, more broadly, robust healthcare data exchange methodologies will determine the success or failure of these programs. Here are some key areas in which to reconsider our approach to interoperability.

A plug-and-play model for healthcare interoperability could be a disruptive solution, even if it may appear unattainable at present. The Digital Imaging and Communications in Medicine (DICOM) standard serves as our closest approximation to plug-and-play interoperability. Its consistency and widespread adoption in radiology and associated workflows stem from its precise specification of data attributes, classes and transmission mechanisms.

Therefore, use-case-driven interoperability will get us closer to plug and play. These “pivotal” use cases must be prioritized and aligned to the most impactful cost reduction and healthcare improvement efforts on a national scale.

A few notable cases include clinical research, value-based care, hospital at home and behavioral health. Certain initiatives, like the Standards Version Advancement Process (SVAP), are already making significant strides in this direction but need to move faster to align with national healthcare objectives.

The evolution of innovative technologies, such as artificial intelligence and robotic process automation, holds immense potential to alleviate interoperability challenges, especially in areas like patient identification, natural language processing and AI-based image-to-text processing. For example, in the earlier example of ejection fraction recorded in a scanned, handwritten note, an AI-based image-to-text and NLP pipeline could be trained to extract the discrete ejection fraction value so it can be stored and exchanged with other systems.
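The final normalization step of such a pipeline can be sketched with a rule-based extractor. The sketch below is illustrative only: a production system would pair an OCR model with a trained NLP component, and the pattern here covers just a few common spellings of ejection fraction.

```python
import re

# Matches 'EF 55%', 'LVEF: 60 %', 'ejection fraction of 35-40%', etc.
# For a range, the low end is captured; the pattern is deliberately simple.
EF_PATTERN = re.compile(
    r"(?:(?:LV)?EF|ejection\s+fraction)"   # the concept, in common spellings
    r"\D{0,15}?"                           # short filler such as ' of ' or ': '
    r"(\d{1,2})\s*(?:-\s*\d{1,2}\s*)?%",   # value, optional range, percent sign
    re.IGNORECASE,
)

def extract_ef(note_text: str):
    """Return the (low-end) ejection fraction as an int, or None."""
    m = EF_PATTERN.search(note_text)
    return int(m.group(1)) if m else None

print(extract_ef("Echo today. Ejection fraction of 35-40%."))  # 35
print(extract_ef("Hx CHF. LVEF: 60 %."))                       # 60
print(extract_ef("No EF documented."))                         # None
```

Once the value is discrete, it can be mapped to a standardized code and exchanged like any other structured observation, which is the interoperability payoff the paragraph describes.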

Lastly, patients must be enabled to own their healthcare data. With patients as the center of healthcare interoperability, enabled with easy-to-use apps, patients can receive and store data from multiple care platforms and share their data with other providers on an as-needed basis.

Healthcare providers using modern EHRs and payers leveraging their own technology, often bolstered by big tech platforms, possess the capability to share data with their patients. The challenge lies in how they collaborate to offer a more comprehensive dataset.

With the foundational technology and regulations in place, it is crucial to expedite incentives, standard adoption and tool development to meet the industry’s anticipated and necessary changes in our healthcare system.

Sriram Devarakonda is chief technology officer for Cardamom Health.
