The Department of Veterans Affairs’ first medical center to launch its new Electronic Health Record (EHR) is running into data quality challenges so severe that the VA’s Office of Inspector General is concerned about whether the facility can maintain its hospital accreditation.
The IG’s office, in a report Wednesday, found the Mann-Grandstaff Medical Center in Spokane, Washington, still lacked critical quality and patient safety metrics a year after the EHR go-live.
The new EHR went live at Mann-Grandstaff VAMC in October 2020. The facility served over 35,000 patients in fiscal 2021.
The IG’s office found that the facility staff’s lack of access to critical metrics hurt the facility’s continuous readiness for re-accreditation, “which may compromise the facility’s future hospital accreditation status.”
“The OIG remains concerned that deficits in the new EHR metrics may negatively affect organizational performance, quality and patient safety, and access to care,” the report states.
The report states that losing accreditation would erode patients’ trust in the facility and make it harder to recruit quality employees.
House lawmakers have raised concerns about the new EHR’s reliability, and have called on the VA to pause future rollouts of the system until it conducts a comprehensive review of problems.
The Senate last week passed the Electronic Health Record Transparency Act, which requires the VA to submit periodic reports to Congress on the costs, performance metrics, and outcomes for EHR modernization. The bill now heads to President Joe Biden’s desk.
VA officials told the Spokesman-Review last month that the EHR experienced 42 “unplanned degradations” and eight “unplanned outages” between its launch in October 2020 and April 20, 2022.
The IG’s office released three reports in March that found the new EHR sometimes failed to indicate to providers that patients were flagged as being at high risk of suicide and gave VA providers an incomplete picture of patients’ health care data.
The EHR went live at a VA facility in Walla Walla, Washington, this March, and in Columbus, Ohio, in April.
Deputy VA Secretary Donald Remy agreed with the report’s recommendations but told the IG’s office the VA anticipated that it would take time to fully utilize its new EHR’s analytics capabilities.
Remy added that the data between the agency’s legacy VistA EHR and the new Cerner EHR “would not be directly comparable,” particularly for appointment scheduling.
“The transition is very difficult, but the Cerner system will benefit VA by providing better standardization, more real-time front-line analytics, a common system with DoD and other health systems and alignment with health care industry best practices,” Remy wrote.
The Cerner EHR is running at more than half of the Defense Department’s and Coast Guard’s health care facilities, as well as at 27,000 private provider facilities and more than 5,900 hospitals globally.
Remy said the transition underscored the challenge of implementing a new EHR in the largest integrated health care system in the U.S. The VA’s enterprise data encompasses over 2 trillion rows of data from 13 source systems.
However, the IG’s office remains concerned that EHR problems found in Spokane may create additional problems as the EHR goes live at larger, more complex VA facilities.
“The OIG is concerned that further deployment of the new EHR in VHA without addressing the gap in metrics available to the facility will affect the facility and future sites’ ability to utilize metrics effectively,” the report states.
‘Every week those workflows are changing’
The IG’s office found that one year after the EHR go-live, the Spokane facility had only six of the 17 metrics necessary for accreditation even partially available.
A VHA leader told the IG’s office that the facility was “absolutely not” ready for an upcoming accreditation survey through The Joint Commission, an independent, not-for-profit organization that certifies nearly 21,000 health care organizations and programs in the U.S.
“Every week those workflows are changing, meaning the way they do work, what they enter is changing every week. It’s hard to keep up,” the leader said.
The IG’s office raised concerns that missing clinical metrics, including patient safety and quality of care metrics, “may not allow for accurate and timely patient safety monitoring,” and could delay opportunities to improve service.
The report also found that a lack of publicly reported quality metrics made it harder for veterans to make informed choices about VA care and how the quality of VA care compares to private sector providers.
VA is mandated under the MISSION Act to measure, track, and publish quality and patient safety metrics at VA facilities. VA must ensure that data reported to the public are “clear, useful, and timely” so that patients are able to make “informed decisions regarding their health care.”
‘Many hours have been added to workload’
The IG’s office found that facility staff created workarounds to mitigate gaps in metrics following the EHR go-live.
“By having to audit every patient admitted during a time frame to see if they are applicable to my data needs, many hours have been added to workload,” one facility employee told the IG’s office.
Facility staff told the IG’s office that the workarounds created a “tremendous” increase in additional workload, at times requiring several hours or days just to prepare one metrics report.
“At times I have worked weekends and nights until 10 p.m., 12 midnight, or one time I didn’t even go to sleep to provide a needed report by the next morning for our leadership or for a suspense [due date],” an employee told the IG’s office.
Following the EHR go-live, one employee said responses to data requests required about eight people spending a combined 24 hours a week to complete.
“Now we have it down to about 6-8 a week,” the employee told auditors.
Staff also relied on workarounds to assess wait times for new and returning patients seeking care. A facility leader told the IG’s office that those workarounds produced approximate results, but were not “the exact metrics required by VA.”
The IG’s office notes VHA’s “history of deficient scheduling” as a major reason why the agency needs accurate metrics on patient wait times.
The IG’s office found that the inability of VHA staff in Spokane to track the availability of care and wait times “impedes the ability to prevent delays in care and could lead to patient harm at the facility.”
The report found the missing metrics made it impossible for veterans to compare wait times across VA facilities or to choose care at facilities with shorter average wait times.
A facility leader explained that the “inability to measure what we do accurately has a negative outcome to metrics, but not necessarily to care.”
A Veterans Integrated Services Network (VISN) leader told the IG’s office that any data or reports that required manual validation increased the risk of human error.
“Because we have been very diligent about both creating the reports gradually over time and having our chiefs provide their best estimates, I don’t believe this has been a direct patient safety issue. But it clearly is an efficiency issue and ultimately accurate data is needed to make [the] best decision,” the VISN leader said.
‘Everything is different’
VHA employees told the IG’s office that these workarounds stemmed from the new EHR producing a much larger volume of data, but labeled in ways unfamiliar to staff.
A VHA leader told the IG’s office that 10 days after the new EHR go-live, the agency received “roughly eight times the amount of data out of Cerner than we have ever gotten out of VistA,” the VA’s legacy EHR system.
The VHA leader said the new EHR produced data fields that were named differently than what employees were used to under the old system, and “we have never seen the data before.”
“Everything is different,” the leader explained.
A staff member told the IG’s office that facilities using the legacy EHR rely on a single dashboard, while the new EHR requires facility staff to pull data from five different reports, export the data to a software program, manipulate it and then review it for errors.
The report found that a lack of data definitions in the new EHR reports made it difficult for facility staff to explain to leadership what the metrics meant, and made it harder for staff to export and use data.
“The problem with exporting any data is that we do not have data definitions to determine where the data comes from or what it really means. By making assumptions, we could easily make a huge error,” a facility leader told the IG office.
Another facility leader told the IG’s office that data from the new EHR was frequently unusable unless it was exported and then manipulated with other software tools.
Exporting data created challenges of its own. The facility leader explained that once exported, the data had to be reviewed for strange outliers, such as duplications, test data and data in the wrong columns.
“This happens regularly, so it can never be trusted. If you forget even one filter or service from the manual clicking of each clinic, you must start over,” the facility leader said.