Season 1 – Episode 35 – Managing Healthcare Cybersecurity Risks and Incidents
Discusses various ethical and practical challenges organizations face in managing cybersecurity risks and responding to breach incidents.
Podcast Chapters
To easily navigate through our podcast, simply click on the ☰ icon on the player. This will take you straight to the chapter timestamps, allowing you to jump to specific segments and enjoy the parts you’re most interested in.
- Introduction and Guest Background (00:00:03) Host introduces the podcast, guests, and outlines the episode’s focus on healthcare cybersecurity and compliance.
- Lynn Sessions’ Experience and Role (00:01:05) Lynn describes her background, legal practice, and her team’s work in healthcare privacy and data security.
- Ransomware Attacks: To Pay or Not to Pay? (00:02:18) Discussion on guiding organizations through ransomware decisions, double extortion, and factors influencing ransom payments.
- Transparency After a Data Breach (00:05:28) Balancing transparency with affected parties, early notification, and managing reputational risk post-incident.
- Preventing Future Breaches (00:08:46) Advice on post-incident security, HIPAA’s framework, risk assessments, and the importance of data inventory and retention.
- Telehealth Devices and Inventory (00:12:37) Addressing outdated telehealth devices, device inventory, and HIPAA’s security risk analysis requirements.
- Vendor Monitoring and Third-Party Breaches (00:13:53) Obligations to monitor vendors, security questionnaires, and challenges in auditing business associates.
- Consultants and Revenue Cycle Vulnerabilities (00:16:39) Consultants as access points, revenue cycle credential risks, and contract obligations for business associates.
- Change Healthcare Breach and Vendor Risks (00:17:11) Trends in revenue cycle intrusions, the Change Healthcare incident, and the impact of vendor breaches on clients.
- Insider Threats and Employee Monitoring (00:21:55) Rise in insider incidents, employee privacy expectations, and challenges in detecting insider threats.
- Change Healthcare Breach Explained (00:26:52) Detailed account of the Change Healthcare breach, ransom payments, and notification processes.
- Contract Management Challenges (00:30:46) Difficulties in contract inventory, legacy agreements, and strategies for effective contract management.
- Compliance vs. Practical Implementation (00:34:55) Challenges for smaller organizations in aligning compliance with practical mitigation and the importance of risk-based approaches.
- Phishing and Employee Training (00:39:38) Phishing as a primary attack vector, simulation training, and evolving social engineering tactics targeting help desks.
- Revenue Cycle Roles as High-Risk Targets (00:43:37) Risks for revenue cycle staff, payment diversion schemes, and the stress of working in these roles.
- Resources and Further Guidance (00:45:21) Recommendations for blogs, industry organizations, and where to find more information on healthcare cybersecurity.
- Final Thoughts: Diligence in Healthcare Security (00:47:14) Closing advice on the need for ongoing diligence in healthcare security, privacy, and training.
- Outro and Additional Resources (00:47:48) Host wraps up, promotes CITI Program resources, and thanks the production team.
Episode Transcript
Daniel Smith: Welcome to On Tech Ethics with CITI Program. I am joined today by my colleague, Andra Popa, who oversees the development of healthcare compliance content at CITI Program. We are going to speak to Lynn Sessions, who is the Healthcare Privacy and Compliance team lead, and Co-Lead of the National Healthcare Industry team at BakerHostetler. Lynn currently focuses her practice on healthcare privacy and data security, breach response, regulatory defense, and HIPAA compliance. In our conversation, we will discuss various ethical and practical challenges organizations face in managing cybersecurity risks, and responding to incidents. Before we get started, I want to quickly note that this podcast is for educational purposes only. It’s not designed to provide legal advice or legal guidance. You should consult with your organization’s attorneys if you have questions or concerns about the relevant laws and regulations that may be discussed in this podcast. In addition, the views expressed in this podcast are solely those of our guests. And on that note, welcome to the podcast, Lynn.
Lynn Sessions: Thank you.
Daniel Smith: So just to get started, I gave you a very brief introduction, but can you tell us more about yourself and your work at BakerHostetler?
Lynn Sessions: Sure, I’m happy to do that. So I have now been at Baker for almost 15 years. I have been a healthcare lawyer for the 30 plus years that I’ve been in practice. I was an in-house attorney at Texas Children’s Hospital before coming here, and since being here, I focus my practice on guiding healthcare organizations through cybersecurity incidents, privacy incidents, and the regulatory investigations that follow that, whether it’s with the Office for Civil Rights, or any state attorney general investigations that may arise.
As part of that, we get kind of a real up-close view of what it is the regulators are looking for from our healthcare organizations, and through that, we’re able to also provide them with guidance around how to be proactively compliant with both HIPAA and the emerging state laws that are in place now and covering many of our healthcare institutions across the country. I lead a team of about 15 lawyers that’s exclusively focused in this space. I’m part of a larger group, the digital assets and data management practice group. We have over 100 lawyers and technologists that help organizations through a variety of different legal issues as they come up on the continuum of data, as we say.
Daniel Smith: So given your experience advising clients during ransomware attacks and data breaches, how do you guide organizations in balancing whether and when to pay ransom?
Lynn Sessions: So it’s a great question, and I oftentimes am working with the CEOs, and perhaps the board of directors chair, in helping to make these decisions for our healthcare organizations. It really is a classic lawyer answer that I’m going to say it depends, but really, what that comes down to is looking at what is the impact to the healthcare organization when they’re suffering a ransomware attack. Mostly what we’re seeing today is what I will call the double extortion. So not only are systems encrypted by a threat actor that gets into the network, but we also see that they take data, and we’ve seen the threat actors use that to their advantage to help pressure a healthcare organization into paying the ransom.
With many of our large health systems, many of which are non-profits, their 990s and other financial information are available on their websites as part of being a non-profit, and so many of the threat actors have done their homework to get a sense of what the financial wherewithal is for those organizations. Some of them are just well-known. We’ve got some of the best healthcare organizations throughout the world, and even though most of these threat actors are in Eastern Europe, or Russia, or in other locations, they know who our big players are. And if they don’t know who an organization is, they do their research, and do some reconnaissance on them.
Sometimes we find that they’re looking for their cyber liability insurance policies, and they use that as part of the negotiations. Other times it’s very clear that they know what their financial situation is, and use that. So we take all of that information, and help guide the healthcare organization on whether it’s right for them to pay or not to pay. I tend to start with the premise that, “I don’t want you paying these criminals.” But that’s not really my decision to make. That comes down to what’s in the best interest of that organization, how quickly they can recover from the encryption event, if they have to rebuild from scratch or whatever that looks like, versus paying ransom for a decryption key that also might be fraught with issues when they try to restore using the decryption key.
And then two, getting a sense of what is the data that’s involved. Even if they pay the ransom, if there is PHI or PII, personally identifiable information, that was taken by this threat actor, they still have to notify all the affected individuals. So they have to deal with all of the regulatory consequences, all of the litigation consequences that come with having to do all the notification. So that’s something we want to explain to the CEOs, who typically aren’t HIPAA experts, certainly not in the sense that their privacy officers are, or as we are in being able to provide that advice. So initially, we look at is there an OFAC sanctions issue that we have to worry about? Can we even pay the threat actor? And we go through this calculus with the CEO, and the CFO, and others that might be in their cabinet, and help them make that decision. But at the end of the day, it’s the client’s decision to make, and there’s not really a right or wrong answer to that, depending on what the threat actor demands, and what the impact to the organization is.
Andra Popa: When advising clients, how do you balance the need for transparency with the parties whose data was compromised?
Lynn Sessions: That is an excellent question. In the probably 20 years since I went in-house at Texas Children’s Hospital, there’s been a big push in healthcare to be transparent with our patients, particularly when some type of adverse event may have happened. And so we try to balance that with being able to provide them with accurate information. In the early days of a ransomware attack, so little is known. We don’t know if their data’s affected, we don’t know if patient care, frankly, is going to be affected in a real adverse way. One of the things I love about healthcare institutions, in particular healthcare providers, large health systems, is that they are incredibly resilient.
My mother was a nurse, nurses are very resilient, and able to look at ways in which they’re just going to care for patients regardless of the circumstances. And physicians too, but we really see it at the bedside with the nurses. And so I’m always amazed when they’re without any of their electronic systems, and they’re able to respond so quickly, and we want to get that good information out.
If an organization’s able to still care for patients even if they’re on downtime procedures, even if they’re without many of the things that they’re used to dealing with on a daily basis, if they can still care for those patients, then we want their community to know that. So we’re very vocal about that. I also err on the side of communicating in cadences, so that the community, your employees, the media, your third-party payers, whomever it may be, knows when they’re going to get information about the incident. And there are things that we can provide early on, and a lot of things that we don’t know.
And I think the key question that we get asked by the media in particular is, are you going to pay ransom? Which, frankly, is none of their business in my view. But we also get asked, “Is patient information affected? Was there a data breach?” is typically the question that gets asked. In the early days of an incident, it’s too early to tell, but I really do try to push for early notification as soon as we’re able to, because we know the healthcare organization’s dealing with a very public event and having to weather the media attention during the encryption part of it. If we can start getting out some of what I would call the bad news early on, then when it comes time for us to have to do breach notification, and patients start getting letters, and a media release has to be issued under HIPAA, and the regulators find out about it because we report it to them, we’ve really weathered the media issue already.
And what I’ve found is that most of the healthcare organizations that I get to represent are upstanding institutions within the community, so they enjoy a very trusted relationship with the community. And the long-term consequences… I mean, there’s the legal issues, there’s the regulatory issues, there’s the class action lawsuit that gets filed by the plaintiff’s attorneys, but for the most part, there’s not a significant impact to those organizations in the sense that people quit going to them. There may be a few that say, “I’m going to go to the hospital down the street.” But for the most part, we are happy with our healthcare providers, we’ll continue to go to the hospitals and the doctors that we work with, and we don’t see a significant long-term impact on the reputational risk around us.
Daniel Smith: In addition to the post-incident response, how do you advise organizations on preventing future breaches, or future ransomware attacks? What can they do to prevent them from happening in the first place, and also again in the future?
Lynn Sessions: Yeah. So interestingly, you’re probably the most secure as a healthcare organization following a big incident, because you had a forensic investigation firm that has not only looked at how this threat actor got into your systems, but at what other ways bad guys could get into your systems. And sometimes, we hear from CISOs, like, “Okay, this was probably one of the best things that could have happened to us, as painful as it was, because some of the things that I’ve been wanting to put in place I can now get.” Because there is an organizational awareness of how bad these things can be. But that’s not true all the time. I tell my CISOs all the time, “You have to be perfect 100% of the time, and the bad guy just has to be right once.” They just have to get lucky once to get into your systems.
But I will say this, I mean, HIPAA is a really good regulatory framework. It has been in place with the privacy rule for over 20 years, the security rule for now 20 years, and it has been somewhat pressure tested, and it allows for flexibility depending on the size of the organization, the sophistication of the organization, and the dollars, frankly, that they’re able to spend on security. HIPAA does allow for that flexibility. So a lot of times, it comes down to being aware of what your risks are, and knowing where your PHI is. So it comes down to some of the basics. It’s usually not something that is so sophisticated. There’s a lot of really good tools out there now that are helping our CISOs secure their environments, but it comes down to the basics: knowing where your PHI is, which HIPAA would call a PHI inventory, and then assessing where the risks and vulnerabilities are to that PHI.
Then you can put in appropriate safeguards, whether they’re administrative safeguards, technical safeguards, or physical safeguards that help protect that protected health information. And that’s really what it goes to. So it sounds very simple, and when I talk to the CISOs at my healthcare organizations, they’re like, “Yes, you make it sound so simple.” And recognizing that these are typically very complex environments, with a lot of information. Some of which has been around for 50 plus years, just the nature of healthcare organizations, and the length of time that they’ve been in business and taking care of patients, or doing research, or… Fill in the blank, all of those things.
So it’s a complex environment, but knowing where your PHI is, frankly, is step one. And then what we’ve seen a lot of healthcare organizations start to move towards, and they’ve been really asking us this in the last six months in particular, has been, how do we get rid of the data? We have so much PHI, frankly, that we don’t need, and how do we go about putting in a retention policy that we’re able to follow? And that’s a hard thing, because data is king. Healthcare organizations use data for a variety of different reasons, for quality purposes, for determining business impact and business plans around where they may go next.
And so without that data, and the ability to look at a variety of different data, they lose the ability to do that in some cases. However, I don’t know that we need to be keeping 50 years’ worth of data. The problem with it is, we don’t really know where it all is, because it’s kept in a lot of disparate places. I jokingly say, and it’s really not much of a joke, that a threat actor will go in, and they’ll find your junk drawer. It’ll be your junk drawer, full of all different kinds of PHI and PII, and that becomes very painful when you’re trying to respond to an incident. So if we can get our arms around where all of our data is, and manage that data in such a way, protect that data in such a way, it lessens the risk to the organization.
Andra Popa: There’s still a lot of outdated devices being used, particularly in telehealth. Would that also be cataloged as part of the inventory?
Lynn Sessions: Yeah, it should be. So we should have an idea of where all of our devices are that hold or transmit protected health information. That’s part of what HIPAA calls the security risk analysis, which the Office for Civil Rights treats as kind of the basis of the entire security plan. You can have the best policies and procedures in place, you can have the best training in place, but it all rests on an underlying security risk analysis, which I think most of us would call a risk assessment; it includes the inventory of where all your PHI is within your network, and where it is on your devices. And then making a determination of how it is that you’re going to secure the transmission of any PHI that may go across those devices, and protect those devices from a physical security standpoint.
So yes, it would be cell phones even. It would be your telehealth equipment, as you described, and I think post-COVID, or during COVID, we were so quick to say, “Gosh, we have to have telehealth visits.” There was a lot of leniency around that. Five years now out, there’s a little bit less leniency around that, but I think a lot of healthcare organizations are having to respond to that as a result.
Andra Popa: I was recently speaking with someone from the Office for Civil Rights, and different people have different opinions about the gray area of when exactly there’s a notice of breach, particularly in the area of third-party vendors. For example, if a vendor has a breach, but they might not notify you, do you feel that there’s an ongoing responsibility to always be monitoring your vendors?
Lynn Sessions: So when the final HIPAA rule came out in 2013, there was language in there saying that a covered entity has an obligation to monitor its vendors. We weren’t given any guidance as to what monitoring our vendors really meant, so we saw a number of our clients that said, “Lynn, what do we do here? What is it that we do?” We actually thought we were probably going to get some guidance around this, and now you spring forward 12 years later, and there are so many vendor breaches happening in healthcare that I think we would still love to get some guidance around it.
What we’ve seen organizations do, and what we recommend, is that there be some level of monitoring around your vendors. It can be something like, on the front end, making sure that you’ve got a security questionnaire, for example, asking your vendors, do they use multifactor authentication on their mobile devices? Do they have a HIPAA training program, which is required under most business associate agreements? And that’s just kind of an example of things. We’ve seen clients that have asked things like, “Do you have endpoint monitoring on your network?” And I have to review the business associate agreements for my law firm, so we see the types of things that are coming in from our great clients who are asking us to sign business associate agreements and security questionnaires.
We’ve really not seen them come in and do audits of their business associates, but they do reserve the right to audit. And that can be kind of tricky, because it’s an expense that the covered entity would have to take on. But the sheer volume of business associate agreements that particularly large health systems enter into creates a situation where it becomes very difficult for them to monitor all of them.
So what we do see happening, and again, what we would recommend, is that there be some type of security questionnaire that periodically gets sent out to the business associates for them to complete, and then they have to have a process in place if the business associate either doesn’t answer, or answers it in a way that makes it risky. And what we think OCR would say around that is, “You can always cease your relationship with that business associate. There’s always someone else out there that can do this, maybe not for the price, but there should be a more secure vendor out there that you can go to.” So it’s not a matter of just having them complete a security questionnaire, for example; you’ve got to make sure you’re following through on that.
Andra Popa: In my experience as a consultant, I would do audits and chart reviews, and I had a lot of access into Epic and other electronic medical record software. They would ask about my insurance, my location, run extensive background checks, but not really about other things. Consultants seemed to be an access point, in addition to revenue cycle roles. I’ve read that they’re impersonating the credentials of revenue cycle roles as they have control over Epic, and the way things are billed, and the addresses.
Lynn Sessions: So leading up to Change, probably within 18 months of the Change Healthcare matter that happened in February of 2024, we were contacted by a number of our clients who were having intrusions into their revenue cycle departments. It seemed like over a period of about 18 months, nearly every business email compromise that we were handling on the healthcare team was through rev cycle. It was shocking to me how many were revenue cycle. And so when Change Healthcare happened, we were not shocked; it was kind of like, “We saw something like this might be happening.” None of the ones that I’d handled were necessarily trying to get into Change. We saw attempts to get into the systems of some of Change’s competitors. We saw healthcare organizations’ revenue cycle people’s emails that were compromised, and attempts to use the credentials to get into some of the software for some of the third-party payers.
It’s almost like someone over in Eastern Europe or Russia was going, “Okay, this is our ticket in.” They kind of figured out how healthcare works, how healthcare funding works here in the U.S., which, gosh, most Americans have no idea how that happens. So it was a pretty sophisticated approach that we were seeing. So back to your comment about consultants and others, it is a weak link. We as consultants, and I would consider us to be consultants as well, but we as business associates of our healthcare organizations, we’re making representations through contracts that we will agree to do certain things, or that we have certain things in place.
Sometimes they will ask for proof of cyber insurance, for example, or proof of other liability insurance. But short of that, there’s not a lot of proof that we typically provide in the early contracting process. So the healthcare organization is left with, again, thousands of contracts that they have in play at any given time, and finding a way to ensure that those contracts are truthful. Now, in their mind, they can always sue if we breach a contract. Lots of times, they’re the 800-pound gorilla in the contracting process, so they’re the ones that really have the bargaining power, and frankly, all the money, while some of the smaller organizations are saying, “Gosh, it’s just a great opportunity to get to work with…” Fill in the blank, world-renowned organization. And we see really, really small entities who will agree to things that they have no right to be agreeing to.
That wasn’t the case with Change Healthcare, by the way. Obviously they’re a, what? Fortune 5, Fortune 10 company, being attached to UnitedHealth Group, but some of the consultants that our clients contract with really don’t have that financial wherewithal, yet they have access to a lot of PHI. So it’s putting a lot of things at risk, and really, even post-Change, and with all of the vendor breaches we’ve seen in the last five years, in particular in the healthcare space, we’ve seen a lot of our clients start to really tighten that up a little bit, requesting more proof of certain things, including higher limits of liability on cyber liability insurance, stronger indemnification language in their business associate agreements, and other things like that that they feel put them in a better position to protect themselves.
We saw one business associate about two years ago that had a very, very large incident, and they had to file for bankruptcy. They were a tiny organization servicing large health systems, and had a million-plus person incident. They had maybe $3 million worth of cyber insurance, and it was gone, it was completely gone.
Two bad things happen with that. They need to use up that insurance for whatever purpose they think is appropriate; sometimes that goes to breach notification, sometimes that goes to defending themselves in the class action litigation. But it, unfortunately, will pull their customers, their covered entity customers, into the litigation many times, because they have a deeper pocket. So we see this play out time and time again with smaller organizations that are business associates but have access to a ton of data. They also happen to be part of the billing process. So in some instances, every patient that went through that organization, not the business associate, but the covered entities that they do business with, every patient’s data was being sent over to this company.
Alexa McClellan: I hope you’re enjoying this episode of On Tech Ethics. If you’re interested in hearing conversations about the research industry, join me, Alexa McClellan, for CITI’s other podcast called On Research with CITI Program. You can subscribe wherever you listen to podcasts. Now, back to the episode.
Daniel Smith: So we’ve talked a lot about third-party vendors, and also consultants and things like that, but what about insider threats? How can institutions monitor employees to prevent data breaches without eroding trust, or violating their privacy?
Lynn Sessions: I can tell you that we have seen a significant increase in the last 18 months of insider issues. In the almost 15 years that I’ve done this, I have probably handled 10 to 12 actual insider issues, and eight of them have been in the last 18 months. So 18 months versus 15 years, what is that? I’m bad at math, 3/4 of them have been in the last 18 months. So I’m not sure why, other than the fact that we’ve got people that have access to electronic media, and they think it’s easy, they don’t get caught, this kind of thing.
But your question is troubling, because we essentially have employees of healthcare entities that have rightful access to a lot of information, or a lot of systems, and they’re exceeding that access to do untoward things, whether it’s for financial reasons or for other reasons that we’ve unfortunately seen occur, essentially exceeding that trust.
Now, I will say this, most employers have policies that would indicate that we as employees do not have any expectation of privacy if we are using company assets. So my laptop that I use for work is a work laptop. If I choose to put personal information on there, if I choose to do my taxes, if I choose to engage with personal conversations with my family, with significant other, whatever, then I should have no expectation of privacy as an employee of my firm. And that’s the truth, that is how most policies are written, is that the assets that are company assets are to be used for company purposes, and if I choose to use them outside, then I should have no expectation of privacy.
That sounds very harsh. Most of my organizations are a lot nicer than me, and they’re not really trolling their employees for inappropriate activities, and it’s usually quite a surprise when we find that there has been an insider issue. And oftentimes, it’s an insider that is a very high performing individual, and their supervisors are shocked that they’re even involved in this type of activity. That’s the usual situation that we see in the rare occasion that we’ve had insiders, although I did mention there’s an increase.
There is also a group that was identified by many of the forensic firms, called Famous Chollima, that is tied to the North Korean government. They are intentionally seeding themselves into organizations, going through an interview process with the IT department to work remotely, which remote workers have put us more at risk too, particularly in this space, and they are infiltrating into organizations, essentially doing just enough work to fly under the radar, but then potentially holding those organizations for ransom once they’ve kind of gotten infiltrated in.
We’ve also seen insiders hired through remote interviews where they’re pretending to be someone they’re not, and they’re job sharing with other individuals in other parts of the world, and it is very difficult for our organizations to catch this. Part of it is, a lot of healthcare organizations have faculty members that travel all over the world, so you can’t say, “Okay, well, we’re not going to let people log in from foreign countries.” So then you’re essentially looking to monitor certain countries that may or may not be logging in with an IP address. But that’s not perfect either, because bad guys can circumvent the IP address, and have it appear as if it’s coming from the United States.
So it is a really, really difficult thing to detect. Security controls that are out there look for anomalous behavior; they can alert an organization if there are a multitude of IP addresses, or if an employee is logging into the same device from multiple IP addresses. So like today, I’ll be here in Texas, tomorrow I’ll be in California, next week I’ll be in Boston, and that’s not unusual behavior for me. I mean, that’s not anomalous behavior. But if within a 24-hour period I’m logging in from all those locations, that is anomalous. And so they call it impossible travel; that can sometimes be detected through those logins. But it is a very, very difficult thing to root out, but I also think that employees need to understand they really don’t have any expectation of privacy with their company devices.
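The “impossible travel” check described above can be sketched in a few lines. This is a minimal illustration of the idea only, not any vendor’s actual detection logic; it assumes each login IP has already been geolocated to coordinates, and all names and the 900 km/h speed threshold are illustrative choices.

```python
from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

@dataclass
class Login:
    when: datetime
    lat: float   # latitude geolocated from the login IP
    lon: float   # longitude geolocated from the login IP

def km_between(a: Login, b: Login) -> float:
    """Great-circle (haversine) distance between two logins, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (a.lat, a.lon, b.lat, b.lon))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # 6371 km = mean Earth radius

def impossible_travel(logins: list[Login], max_kmh: float = 900.0) -> list[tuple[Login, Login]]:
    """Flag consecutive logins whose implied travel speed exceeds max_kmh
    (roughly airliner speed, so anything faster is physically implausible)."""
    flagged = []
    ordered = sorted(logins, key=lambda l: l.when)
    for prev, cur in zip(ordered, ordered[1:]):
        hours = (cur.when - prev.when).total_seconds() / 3600
        if hours <= 0:
            continue  # simultaneous logins handled by other controls
        if km_between(prev, cur) / hours > max_kmh:
            flagged.append((prev, cur))
    return flagged
```

A Texas login followed one hour later by a Boston login gets flagged; the same pair a week apart does not, which matches the travel pattern Lynn describes as normal for her.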
Andra Popa: I was wondering, for our listeners, if you could explain what happened at Change Healthcare, if you’re able to discuss it? I recall that they weren’t your client.
Lynn Sessions: No, Change is not my client. We did work with over 100 providers and health plans downstream that were impacted by Change. So my understanding of what happened with Change, and there was a lot of public information that Change themselves put out to their customers and to the general community, they also had to testify before Congress. Our understanding is that through an endpoint that did not have multifactor authentication, a threat actor was able to get into the Change Healthcare systems, took data, and encrypted their systems. Change quickly responded and chose to rebuild their systems from scratch, as opposed to paying for a decryption key and restoring using the decryption key. I’m not sure why they decided to do that, but that’s what they decided to do.
Reportedly, the data taken by this threat actor affected 190 million Americans, so protected health information for roughly two thirds of the United States. That just shows you the breadth of the services Change provided to organizations throughout the country. There was a ransom demand, and we understand that a ransom of $20 million was initially paid. The way this threat actor group worked, and this goes back to the first question you asked me about paying ransoms and how we advise our clients, is that the threat actor was reportedly paid $20 million, and I’m sure Change thought, “Okay, we can now go on with our obligations to our customers.” But here’s what happened: there’s kind of a middleman who is the recipient of the dollars, there’s a big boss who sits at the top and gets whatever take he or she gets paid, and then there are affiliates who sit under the middleman and do the actual work.
They’re going in and figuring out how to get into the systems, they’re taking the data, they’re holding the data, they’re doing all the technical work. What we understand happened is that the middleman decided to take the money and run, but the affiliate who actually figured out how to get in, and who held the 190 million Americans’ data, did not get paid, and normally they get paid a percentage as well. So that threat actor threatened to publish the data, re-extorted Change, and reportedly, Change paid another $10 million. Unfortunately, that was after the data had been posted, at least for a short period of time, so there were some covered entities that were on notice at that point that they were impacted.
Change then methodically went through the data, and is even still doing so to this day, and started notifying the people who were impacted. They notified the covered entities, which is their obligation as a business associate. The covered entity then makes the decision on notifying actual individuals after it has seen the data that was involved, and instructs Change to go ahead and do that. We are now close to 18 months out from the incident, and we are still waiting for final documentation from Change. I think they were pretty transparent, especially early on, in trying to calm their customers. There are things I could criticize Change about, but by and large, given the volume of data, the amount of attention that was on them, and the impact it had on healthcare organizations, I think they handled it well. They were communicative about the incident, the volume, and the complexities they had to work with. And while it was frustrating for us, helping our clients through it, we recognized that there were complexities here that were probably causing the delays.
Andra Popa: One of the most enduring points you made in your keynote at HCCA was the importance, and the difficulty, of maintaining current contracts. If a smaller entity wants to begin the review process for all of its contracts, where exactly should it begin?
Lynn Sessions: Whether you’re a small entity or a large entity, I think the hardest part about getting your arms around the contracts is just knowing all the contracts. It kind of goes back to inventorying your PHI; here, it’s inventorying your contracts. Many organizations have been around for a long time, so while they may put a great contracting program in place today, and going forward have a great way to keep up with their contracts, historically it’s another story. I work with clients who’ve been in existence for over 100 years, and I doubt they have contracts that go back quite that far, but it’s possible. Some of my health systems that are academic medical centers and partner with medical schools have relationships that go back maybe 100 years or more.
So just getting your arms around all of those contracts, knowing what’s in them, and knowing who the actual counterparty is can be really difficult. One of the things we saw happen with Change… One of the first things we asked clients when they came to us was, “Do you have the contracts with Change? Show us the contracts. That will tell us what their obligation is to you.” And they came back to us with contracts that didn’t have the word Change in them, didn’t have UHG, didn’t have anything; they named a predecessor organization.
So we go to the legal department and say, “Okay, did you get notified that your counterparty was bought out by Change? We don’t have any record of that.” It doesn’t mean that Change didn’t send notice; it just may not have gotten to the right person in the organization, because the business person who handled it may no longer be there, or they may have gotten it and thought, “What’s the big deal?” As long as Change was continuing to get paid, they weren’t concerned about it.
So we start thinking about where in your process you would be notified if something like that happened, if there was a change in the counterparty organization. It seems to me accounts payable should know about it, because they’re writing a check to a new entity, and oftentimes my clients ask for a W-9 and all of that. So odds are somebody knew about this, but getting it reported back to the legal department, or to supply chain, or whoever is keeping up with your contracts, can be really difficult.
So when we sit with clients, we say, “Let’s talk about a contract management process.” Because you can have great programs in place, but we’ve got to figure out what the process looks like. The first thing we do is look at what your active contracts are, and then we go to accounts payable and ask, “Who are you paying that we may not even have a contract in existence for?” Some of these contracts with Change predated HITECH, which came into effect in 2009 and 2010. Some of them even predated the Security Rule, which came into effect in 2005.
So we know that there were some historical contracts here that probably never really got modernized, and somehow, I’m sure, Change was getting paid somewhere along the way. It really highlighted this for me. Having worked at Texas Children’s Hospital before coming here, I saw it first-hand: the volume of contracts that you have, the volume of contracts that I had to review for indemnification language, insurance language, things like that.
So it’s really getting your arms around where all of those contracts are, through the various processes that large healthcare organizations go through, and then ensuring that they’re kept current. One way some of our clients do it is to periodically require that a new business associate agreement go out, which allows them to inventory where they’ve got PHI, or to tell the vendor to delete or destroy PHI in its possession. Others get a new program and do exactly what I’ve described: they just start getting a sense of where all their contracts are. But it’s not easy. Small organizations might be able to do it better than others, but we’re also talking about legal departments that are taxed with a myriad of things they have to address, and contracts are just one of the things that legal departments and supply chain have to keep up with.
Daniel Smith: Speaking of smaller organizations with limited resources, or just any organization in general, can you talk a bit more about the challenges that they face when aligning compliance with the practical implementation of some of these mitigation measures that you’ve been mentioning?
Lynn Sessions: Yeah, that’s a good question. I will say that I tend to approach things very practically. HIPAA does allow for some flexibility, as I mentioned earlier, depending on the size and sophistication of your organization. However, it doesn’t alleviate the need for you to put administrative, physical, and technical safeguards in place. There are two places where I see healthcare organizations have challenges from a compliance standpoint and a mitigation standpoint. The first is really around the people, because oftentimes we see smaller organizations that have a leader who wears multiple hats.
So you have a chief compliance officer, and that chief compliance officer handles all compliance across the organization, whether it’s Medicare and Medicaid compliance, billing compliance, HIPAA, and we could go on and on with the various compliance requirements that healthcare providers and other organizations have to adhere to. That’s the first challenge I see: you’ve got someone who is tasked with doing a lot of different things and having to keep up with a variety of regulations that are constantly changing, or at least constantly shifting focus.
I will say this: if you can partner with good consultants and good outside counsel who are able to practically help you address the compliance issues that come up, that is a key partnership to have. One of the things we pride ourselves on is that we really try to advise our clients from a risk perspective. There is, of course, the black letter of the law, but in working with regulators, we have a very good sense of what’s important to them and where they’re focusing their efforts. When we talk with our clients, we talk about it in terms of risk to the organization. There are certain things that are just illegal, certain things they have to do, but there are also areas where we can say, “Look, this is where you should focus your time, and this is where you should spend your dollars, from a practical standpoint.”
And there are things that, even though they’re not written in stone in HIPAA, we know the Office for Civil Rights and state attorneys general are going to approach in a way that is going to be very black and white to them, and we want to make sure our clients understand that. If they don’t do those things, we want to advise them of what the risk is going to be. There can be regulatory risk from a penalties perspective at the federal level, and we’ve seen a number of state AGs start to step into this space following data breaches; they’re less forgiving and less flexible, I’ve found, if the entity is domiciled in their state. Then we talk about very practical ways in which they can mitigate that risk, focused on where we see the regulators playing and what risks we would see to them as an organization.
The second area I see is on the technical safeguards side. There are things that a large health system has and can afford that a small organization may not be able to. What we’ve seen over the 15 years I’ve been doing this is that OCR considers certain things to be what I would call standard of care, and most of my clients understand that concept, because they’re healthcare providers. So we talk in terms of what is standard of care for a world-renowned large health system versus what would be standard of care for a one-person physician’s office, and then what the deal breakers are. There are certain things you have to have in place, certain things that would not be expected of you, and places where different types of safeguards can be put in place because of the size and sophistication of your organization.
And really, that is how we approach it: in terms of risk to you, and at the end of the day, the client gets to take the risk or not take the risk. If they need our help on the administrative safeguards, we provide those through policies, procedures, training, and things like that, and if we can introduce them to our partners in the security space to help them from a security standpoint, we try to do that, and we right-size it. Some clients can afford a large, multinational security forensics firm, and others will want a smaller, less expensive security firm to help them out. Fortunately, there are good partners in all of those spaces. But that’s really what it comes down to at the end of the day: working with people who understand your business, who can approach it from a practical standpoint, and who understand your risk tolerance.
Andra Popa: My last question is, how do you teach employees, particularly those in revenue cycle roles, to recognize phishing schemes?
Lynn Sessions: We’ve seen with the advent of AI that phishing emails are getting a lot more sophisticated. They’re more targeted, the English is better, and they look like real emails. What I recommend is that you not only do didactic education, teaching employees what to look out for from a privacy and security standpoint, but also engage one of the inexpensive programs out there that do simulated phishing. And I think revenue cycle employees probably need to be tested more often than most, just because we know that they’re a target.
So it is reinforcing that education around being wary of phishing emails. Our statistics at Baker show that phishing is still the number one way people are getting into systems; we as humans just fall for it. There’s a variety of reasons, and it’s not usually because we’re dumb; it’s usually because we’re busy and trying to get things out of our inboxes. What the phishing simulation does is test us, and in the moment, if I respond to something I shouldn’t, it points that out to me and does real-time training, so that I’m made aware of it and hopefully become more careful.
And if you’ve got an employee who is frequently failing the phishing simulation, then you need to do specific training for that individual, because that’s probably going to be your weak link. Having said all of that, I’m working with a number of organizations where the bad guys are now calling the help desk, because my clients have implemented multifactor authentication. They’re calling the help desk, impersonating the employee, and changing not only the email password, but also the device the multifactor authentication prompt goes to.
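The follow-up Lynn recommends, flagging employees who repeatedly fail phishing simulations so they can get targeted training, amounts to a simple tally over campaign results. A minimal sketch, assuming hypothetical data and a hypothetical failure threshold (real phishing-simulation platforms report this for you):

```python
from collections import Counter

FAILURE_THRESHOLD = 3  # hypothetical cutoff for targeted follow-up training

def flag_repeat_clickers(simulation_results, threshold=FAILURE_THRESHOLD):
    """Given (employee_id, clicked) results from simulated phishing
    campaigns, return the employees who failed at least `threshold` times."""
    failures = Counter(emp for emp, clicked in simulation_results if clicked)
    return sorted(emp for emp, count in failures.items() if count >= threshold)

# Illustrative results from three hypothetical campaigns
results = [
    ("alice", True), ("alice", True), ("alice", True),
    ("bob", False), ("bob", True),
    ("carol", False),
]
print(flag_repeat_clickers(results))  # ['alice']
```

The design point is simply that repeat failures, not any single lapse, identify the weak link who needs individual training.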
So as our healthcare organizations implement more and more security, the bad guys figure out a way around it. There might be a pause for 12 or 18 months, and then they find a way around it. And it is very concerning. We will go back and listen to these help desk recordings, and you’re kind of like, “Gosh, it’s amazing what they know about these individuals. Have they gotten information about them from social media, maybe from LinkedIn? Do they have a cache of data they found on the dark web about our employees? Where is this coming from?” Particularly for my organizations that have implemented a lot of security measures, that has been coming up very frequently.
And it feels like an insider, it feels like somebody who knows their colleagues, it feels like threat actors who have studied their potential victim. There, we’re finding that, for the most part, they’re diverting payroll to try to get a quick payday; the paycheck gets diverted to a fraudulent account. But they’re doing it through the social engineering that’s happening at the help desk. And a number of our clients have received notice from Epic that this is happening with Epic MyChart accounts. There are patients who either have never set up a MyChart account, or have set one up, and the threat actor is calling into the help desk to change it.
So there are new MyChart accounts being set up, and the threat actors are getting access to patient information. We don’t really know what the motive is. There’s some speculation that it may be tied to fraudulent [inaudible 00:43:16] medical equipment, fraudulent billing to Medicare and Medicaid, things like that. But there doesn’t really seem to be a rhyme or reason, based on what we’re seeing. It’s very troublesome, because obviously there’s a lot of information about people available electronically. It also shows me the level of sophistication the threat actors are willing to go to in order to get access to this information.
Andra Popa: Yes, it made me wonder whether revenue cycle roles should even post their information on LinkedIn. I read that with these revenue cycle credentials, the threat actors were changing the payment addresses, so that rather than going to the hospital, payments would go to the threat actor. It’s a very risky job to work in revenue cycle.
Lynn Sessions: It is a risky job. I think it’s already fairly stressful, because it moves very quickly, the folks in revenue cycle ultimately answer to the CFO, and it’s all about how the organization keeps its doors open. There are a lot of pieces to it: the documentation has to be put in the medical record by the providers on the front end, then the coders have to code it appropriately, and then the rev cycle people have to get it to the third-party payers based on the contractual arrangements they have with them. So it’s a very stressful job, and now they know that they’re essentially under attack.
And Andra, what you described is something we have definitely seen in rev cycle: diverting a third-party payer’s payment. However the threat actors were able to get into the third-party payer’s portal or system, they were able to divert the payment. Sometimes we’re talking several million dollars, and the hospital’s going, “Where’s my payment?” And the third-party payer, usually one of the larger insurance companies, will come back and say, “Hey, we paid you three weeks ago. What are you talking about?” And that, of course, kicks off the investigation. So it’s a real concern, and I do not envy the folks in the rev cycle process, particularly at the stage where they’re accessing third-party payer information, because they are definitely a target of these actors.
Daniel Smith: So we’ve covered a lot of ground today and talked about a lot of different issues. Do you have any recommendations for additional resources where listeners can learn more, or find additional guidance on recovering from an incident and anything else we’ve talked about today?
Lynn Sessions: There are a few plugs I’m going to put in. I would be remiss if I didn’t say that BakerHostetler has resources on our blog, which is bakerdatacounsel.com. We’ve got a 50-state survey of all the data breach laws, and we put out periodic blog articles on a variety of topics, whether they’re current events we want to address or things like you described: how organizations can recover from an incident, and what other types of recommendations we have. Those are pretty broad recommendations, but I think it’s a great place for organizations to start, healthcare or not.
I would also say that many of our industry organizations, like the Health Care Compliance Association and the American Health Law Association, have fantastic resources from Baker, as well as from my peers at other law firms and consultancies, that have always been helpful to me. When I was in-house, that was a place I would go frequently before I would call a lawyer, before I would phone a friend on the outside. Even now, we at Baker will go to those resources from time to time to see what some of our peers are saying. There are a lot of great health lawyers out there, a lot of great healthcare privacy lawyers, and fortunately, many of us are good friends and share a lot with each other. But I think those two organizations are great places for my clients and my colleagues to get information.
Daniel Smith: Wonderful. And I’ll be sure to include links to those resources that you mentioned in our show notes.
Lynn Sessions: Thank you.
Daniel Smith: So my final question for you today is, do you have any final thoughts that we’ve not already touched on?
Lynn Sessions: I’ll just close with a global statement: healthcare is under attack. I will tell you that there has not been a day in the almost 15 years I’ve been at Baker that I haven’t been busy. It’s a good problem for me to have, but it’s bad for my clients. So being diligent on the security front, being diligent on the training front, and being diligent on the privacy front are key; you just can’t let your guard down. I would advise that we all remain diligent and alert to the attacks that are happening to our healthcare organizations.
Daniel Smith: Thanks, Lynn. I think that is a wonderful place to leave our conversation for today. So thank you again, it was great having you on the podcast.
Lynn Sessions: All right. Thank you, Daniel. Thank you, Andra. It was a pleasure.
Daniel Smith: If you enjoyed today’s conversation, I encourage you to check out CITI Program’s other podcasts, courses and webinars. As technology evolves, so does the need for professionals who understand the ethical responsibilities of its development and use. CITI Program offers ethics-focused, self-paced courses on AI and other emerging technologies, cybersecurity, data management, and more. These courses will help you enhance your skills, deepen your expertise, and lead with integrity. If you’re not currently affiliated with a subscribing organization, you can sign up as an independent learner. Check out the link in this episode’s description to learn more. And I just want to give a last special thanks to our line producer, Evelyn Fornell, and production and distribution support provided by Raymond Longaray and Megan Stuart. And with that, I look forward to bringing you all more conversations on all things tech ethics.
How to Listen and Subscribe to the Podcast
You can find On Tech Ethics with CITI Program available from several of the most popular podcast services. Subscribe on your favorite platform to receive updates when episodes are newly released. You can also subscribe to this podcast by pasting “https://feeds.buzzsprout.com/2120643.rss” into your podcast app.
Recent Episodes
- Season 1 – Episode 34: The Essential Role of Bioethics in HBCU Medical Schools
- Season 1 – Episode 33: Integrating AI into Healthcare Delivery
- Season 1 – Episode 32: Modernizing Clinical Trials with ICH E6(R3)
- Season 1 – Episode 31: Fostering AI Literacy
Meet the Guest
Lynn Sessions, JD, BA – BakerHostetler
Lynn Sessions is the Healthcare Privacy and Compliance team lead, and co-lead of the national Healthcare Industry team, at BakerHostetler. Lynn focuses her practice on healthcare privacy and data security, breach response, regulatory defense, and Health Insurance Portability and Accountability Act (HIPAA) compliance.
Meet the Host
Daniel Smith, Director of Content and Education and Host of On Tech Ethics Podcast – CITI Program
As Director of Content and Education at CITI Program, Daniel focuses on developing educational content in areas such as the responsible use of technologies, humane care and use of animals, and environmental health and safety. He received a BA in journalism and technical communication from Colorado State University.
Meet the Guest Co-Host
Andra Popa, JD, LLM, Assistant Director, Healthcare Compliance – CITI Program
Andra M. Popa is the Assistant Director, Healthcare Compliance at CITI Program. She focuses on collaborating with learning professionals to develop healthcare compliance content. Previously, Andra was the owner of a consulting firm that worked with over 40 healthcare entities to create, assess, audit, and monitor compliance programs, as well as to create educational programs. A graduate of Boston College with degrees in English and economics, she also has JD and LLM (healthcare law) degrees from Loyola University Chicago School of Law. She has published over 100 articles, written book chapters, and conducted workshops in design and compliance.