When a healthcare organization suffers a major breach of protected health information, it is usually reported to the US Department of Health and Human Services. Details of the breach are made public, as are the penalties handed down and corrective actions the provider is required to take.

That tends to be where public tracking of the incident ends, however. The Office for Civil Rights (OCR) monitors the organization for 3 years, but little is known about what happens after a resolution agreement is created. This gap prompted a study, which found that hospitals with a reported breach suffered reduced quality in 2 areas: higher 30-day mortality rates for heart attack patients and longer times for patients to receive emergency electrocardiograms (EKGs).

“We compared hospitals that had a breach and made [IT] changes to those that had not, and we discovered there was an association,” said study co-author Christoph U. Lehmann, MD, of the University of Texas Southwestern Medical Center in Dallas. “We cannot claim that this is causal, but we did suspect it.”

The results suggest a connection between the security changes OCR requires and adverse effects on clinical care, he said.



Slow it down

Dr Lehmann equates the increased security providers have to implement—at a high level—with logging into a bank account. If a person needs to use 2 ways to authenticate who they are to get into the bank’s online system—such as a password and an answer to a personal question—it is going to take longer than just using a password. Known as multi-factor authentication, this is often a measure implemented to increase IT security in healthcare organizations.
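Dr Lehmann's bank-login analogy can be sketched in a few lines. This is a minimal illustration of why a second factor adds a step, not any hospital's or bank's actual system; the credential values and function names are invented for the example:

```python
import hmac

# Hypothetical stored credentials for one user (illustrative values only).
CREDENTIALS = {
    "password": "correct horse battery staple",
    "security_answer": "maple street",
}

def single_factor_login(password: str) -> bool:
    """One factor: the password alone grants access."""
    return hmac.compare_digest(password, CREDENTIALS["password"])

def multi_factor_login(password: str, security_answer: str) -> bool:
    """Two factors: both checks must pass, adding a step (and time) to every login."""
    return (hmac.compare_digest(password, CREDENTIALS["password"])
            and hmac.compare_digest(security_answer.lower(),
                                    CREDENTIALS["security_answer"]))
```

Each extra factor is an extra prompt the user must answer before reaching the system, which is exactly the latency the study's authors suspected could accumulate in clinical workflows.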

“We wanted to understand if there might be unintended consequences to making changes in technology,” Dr Lehmann said. “Hospitals make changes, and they don’t necessarily measure or know what happens downstream.”

The researchers chose heart attack measures because EKGs are time sensitive. It is recommended that patients with ST-segment elevation myocardial infarction (STEMI) receive an EKG within 10 minutes of entering an emergency room; quicker diagnosis improves outcomes for these patients.

The study by Dr Lehmann and colleagues, which was published in Health Services Research (2019;54:971-980), included 3025 hospitals. They merged breach data with Medicare Compare data for the period 2012 to 2016. Hospitals that had breaches took up to 2.7 minutes longer to get patients to an EKG in the third year after the breach. Prior to breaches, all of the hospitals’ times to EKG were similar.

Mortality rates for heart attacks also increased gradually after a breach, rising by 0.23% in year 1, 0.36% in year 2, and 0.35% in year 3. Dr Lehmann thinks the increase over time may reflect different security measures being implemented periodically during remediation. Had researchers continued to track these hospitals, he said, the rates would likely have leveled out.

“We expected these results, but they were bigger than I had anticipated,” Dr Lehmann said. “I was surprised to see the EKG times went up by that large of a fraction. Going up 2.7 minutes from a starting point of 8.15 minutes means they went from within the recommended range to outside it. And in cardiology, time equals muscle.”
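A quick check of the arithmetic behind that quote, using only the figures reported in the article (variable names are illustrative):

```python
# Figures from the study as reported: baseline time-to-EKG and the year-3 delay.
BASELINE_MIN = 8.15      # average minutes to EKG before a breach
ADDED_DELAY_MIN = 2.7    # extra minutes observed in year 3 after a breach
GUIDELINE_MIN = 10.0     # recommended maximum for STEMI patients

post_breach_min = BASELINE_MIN + ADDED_DELAY_MIN   # 10.85 minutes
exceeds_guideline = post_breach_min > GUIDELINE_MIN  # True: now over the limit
```

The sum, 10.85 minutes, lands above the 10-minute STEMI guideline, which is why a delay of under 3 minutes still matters clinically.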

Factors affecting care

Jon Moore, senior vice president and chief risk officer at Clearwater Compliance, LLC, based in Nashville, Tennessee, said IT usually is not at the top of the list of factors that pose a risk to patient safety, but he thinks it should be somewhere on it.

“It’s important that folks understand that as technology gets embedded deeper into the business operations of healthcare providers, it can have a direct impact on their ability to deliver care,” he said.  

This sometimes happens when an organization suffers a breach after failing to put sufficient security measures in place. To comply with its resolution agreement, the organization can overcorrect and “lock things up too tight,” Moore said.

After a breach occurs, organizations typically change the way staff handle and access data. Devices shut down more quickly when not in use, stronger passwords are put in place, data are encrypted and multiple sign-ons may be required. On top of that, staff must be trained and become accustomed to the new systems.

Together, these changes can slow patients' movement through the system, particularly in hospitals where staff must log into numerous systems with passwords that must be changed regularly. What should happen, according to Moore, is a conversation between security staff and clinicians to think through and balance security and clinical risks.

Reducing the impact

Moore said organizations are working on ways to reduce the burden of access controls in healthcare. Much like newer phones have facial recognition or fingerprint IDs to ease signing in, some similar solutions exist in healthcare as well.

For instance, some hospitals are using smart cards with a proximity reader, said Gary Pritts, president of Cleveland-based Eagle Consulting Partners, Inc. As staff members move through a hospital, workstations detect when they are nearby and can sign them in. Organizations can also design a system where staff members enter a system password only once every 24 hours; for the rest of the day, they simply wave the smart card each time they move to a new workstation.
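The badge-tap scheme Pritts describes can be sketched as a simple session policy: one full password entry per 24 hours, with badge taps covering workstation changes in between. This is a conceptual sketch, not any vendor's product; the class and variable names are hypothetical:

```python
from datetime import datetime, timedelta

PASSWORD_VALID_FOR = timedelta(hours=24)  # full password required once per day

class BadgeSession:
    """Tracks when a staff member last entered a full password.

    Within the 24-hour window, a proximity-badge tap is enough to sign in
    at any workstation; after the window expires, the password is required again.
    """

    def __init__(self) -> None:
        self.last_password_at: datetime | None = None

    def needs_password(self, now: datetime) -> bool:
        return (self.last_password_at is None
                or now - self.last_password_at >= PASSWORD_VALID_FOR)

    def sign_in(self, now: datetime, entered_password: bool = False) -> str:
        if self.needs_password(now):
            if not entered_password:
                return "password required"
            self.last_password_at = now
            return "signed in (password)"
        return "signed in (badge tap)"
```

The design trade-off is the one Moore and Pritts describe: the daily password preserves a strong authentication event, while badge taps remove the repeated-login friction that slows clinicians between workstations.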

“It’s faster than fumbling with passwords [every time they go to a new computer],” Pritts said.

“Having solutions like these improves workflow and speeds things up for clinicians.”

Pritts said the study is very preliminary and warned against making too much of the results. He said other institutional factors, such as poor management, can manifest in both IT and clinical care at these institutions.

Pritts said he has heard anecdotally that increasing security slows processes down, particularly when done poorly. His takeaway from this research is that organizations should be attentive to clinicians' workflow whenever an IT change takes place.
