Cracking The Code Ep. 21: The FTC’s Expanded Reach on Health Data Protection

On the latest episode of Cracking the Code, Chris Bowen, Founder and Chief Information Security Officer of ClearData, discusses the recent finalization of the Health Breach Notification Rule.

Chris delves into the Federal Trade Commission’s (FTC) recent finalization of changes to the Health Breach Notification Rule (HBNR). This marks a significant shift from fragmented, independent privacy and security measures to a more unified, collaborative defense strategy.

Don’t forget to schedule your Cloud Risk Checkup, powered by our cloud security posture management (CSPM) software, the CyberHealth™ Platform, and our team of highly trained experts.

Want to get in touch right away? Call (833) 992-5327.

Transcript

0:07
Hello everybody, welcome back for the next episode of Cracking the Code.

0:11
This is episode #21, and we are going to be covering the FTC's finalization of their changes to the Health Breach Notification Rule.

0:20
This is a little bit of a continuation or a loop back from Cracking the Code episode 13 back in the spring of 2023.

0:28
So if you haven’t watched that one, please do.

0:30
It lays good groundwork here.

0:32
It sets the stage for what we're talking about today with the actual finalization of this rule.

0:37
So, as always, here is Chris Bowen, our Chief Information Security Officer of ClearData.

0:44
How are you today?

0:46
I’m doing great, Natalie, thanks for asking.

0:49
I’m here to pick your brain about some of the stuff.

0:51
We’re going to circle back.

0:52
I'll ask you to lay a little bit of groundwork here, but thank you for sharing your thoughts.

0:57
Let’s dive in.

0:58
All right. First, can you remind the viewers and just provide a high-level overview of what the Health Breach Notification Rule is, and maybe how it differs from the protections afforded by HIPAA?

1:10
Yeah, two completely different laws, or rules, if you will; administrative law, it's in that category.

1:17
HIPAA was written in 1996.

1:20
Senator Bill Frist.

1:21
Bill Frist, if I recall, was one of the chief authors of HIPAA. And the HIPAA Security Rule, HIPAA Privacy Rule, and HIPAA Breach Notification Rule are all about making sure that certain data is protected.

1:36
The data has to be part of healthcare treatment, payment or operations.

1:42
That's what defines PHI: protected health information.

1:46
And many times people think that HIPAA was all about making sure your data’s private.

1:54
Well, it was actually all about making sure that your data is portable, and in terms of making your data available for treatment, payment, and operations, you have to have certain protections in place on that data.

2:08
Now fast forward a few years, a lot of years actually, and several years ago the FTC started to take notice of the fact that there was a mass amount of confusion: I've got health-related data, but it may not actually be PHI, because health-related data might come from your watch.

2:34
You know, it's measuring my activity, tracking my steps, whatever it is, maybe my heartbeat.

2:40
And many times people would mistakenly believe that that kind of data is protected under HIPAA or some kind of data protection law.

2:51
Well, guess what, folks?

2:52
It’s not, it hasn’t been.

2:55
And so health-related applications are, you know, spawning lots of data, and they could do whatever they want with it.

3:05
They could share it, they could sell it.

3:09
You know, if you think about the patchwork of privacy laws all over the country, if you had that data get exposed, then guess what would happen?

3:20
Not a damn thing, right? Right.

3:23
Not a thing.

3:24
And there were many attempts along the way from different parties to try to say, hey, let's figure out how to make it so that when someone's health-related data is breached, they can actually be notified of that and have some kind of recourse.

3:42
Well, you know, the FTC came along and said, hey, that’s a really good point.

3:48
Let’s start to crack down on egregious violations of people’s information if it’s shared inappropriately.

3:59
And so they started working on the health breach notification rule.

4:03
And what that really says is, if you have health-related data and it's not already covered or addressed by the HIPAA Security, Privacy, or Breach Notification Rules (mostly it's the Privacy Rule).

4:18
And if there is a breach, then you need to notify those that are victims of that breach of that health related data.

4:25
And there have been lots of examples of misuse.

4:28
I'll give you one.

4:29
We’re on a Zoom call right now.

4:31
Remember what happened during COVID?

4:33
Well, Zoom was part of the telemedicine approach.

4:36
And what did they do with 80 million records of people who were having Zoom conversations with their doctor?

4:43
They sold it to Facebook. Big lawsuit several years ago.

4:48
So we also have other examples. GoodRx did similar things.

4:56
We talked about GoodRx, and instead of saying, hey, you know what, we're sorry that we sent all this data somewhere the consumers didn't have any clue about, they basically said tough cookies.

5:10
They really didn’t care.

5:12
They really didn't care. And so Premom was another application that, you know, shared some data.

5:21
And so kudos to the FTC for coming up with this approach and basically saying, we're not going to take this anymore.

5:33
Absolutely yes.

5:34
And you make a good point there.

5:36
I was having a conversation recently where individuals do not know that this is happening to them.

5:41
And ClearData commissioned a poll last year and found that over 80% of consumers who have these, you know, these watches and platforms, and engage with these companies where they're sharing their health information, don't realize that it's not protected under HIPAA, or don't realize that, you know, the purpose of HIPAA was more portability than the actual protection of it.

6:04
So I think there's just a big misunderstanding of what HIPAA is and what it means to protect our data and what that data is.

6:12
So can you walk through some of these revised definitions?

6:18
Can you go over any of the changes from the FTC and exactly how they might be cracking down?

6:23
And although this is a step in the right direction, you know, do they need to go further?

6:29
Well, the rule is fairly new, so it will continue to evolve.

6:36
It's interesting to note that they're cracking down on these companies in an aggressive way now.

6:44
They’re probably gonna get more aggressive as they see more egregious violations of the rule.

6:50
And, of course, we know that, as you said, many Americans have no idea that their data is being, you know, sold to the highest bidder, if you will.

7:01
And, you know, the fact is, it's important that we maintain control of our own data.

7:08
It's almost impossible these days, but it's important that we consent to what the data is used for.

7:16
So I don’t know that I answered your question, but let’s come back to the individual and say try to be responsible for your data.

7:24
I know it’s difficult because your data’s everywhere at this point.

7:28
Well, that was going to be one of my questions as well, as consumers.

7:30
You know, we've discussed this before, and you've participated in webinars where you've discussed this as well.

7:36
But what, what’s our responsibility as consumers to, to protect ourselves, our privacy, our data?

7:43
And maybe is it an educational issue?

7:45
I mean, I don't think I'm going to stop using my Nike apps or my Peloton.

7:49
I don’t really see that happening.

7:50
So, you know, me as a consumer, is it just more of an education thing, or do I just hit consent on those terms and conditions and then it's gone from there?

7:59
Well, it's even more personal than that.

8:04
If you think about this Zoom call that we're having, that we're recording: if you don't consent to their terms, you don't get to use the platform.

8:14
We’re doing it all the time.

8:15
We’re doing it all the time.

8:17
And so what’s the recourse, to your point?

8:20
We don’t have a lot of recourse.

8:22
We can say, hey, I'm going to stop using my iPhone.

8:25
Yeah, I’m not going to stop using my iPhone.

8:28
Neither are you.

8:29
And do you really read the notice that says, hey, I’m going to read these 14 pages of legal terms and then I’m gonna say, I don’t think I like that.

8:40
So I'm not gonna use the phone I just paid $1,000 for. You know, so there's not a lot that we can do at this point.

8:49
I think it comes back to data minimization.

8:54
If you can stop sharing your data where it isn't appropriate, if you can, you know, set your Instagram to private instead of public.

9:07
If you can just do certain things to help protect yourself. Even set up, you know, a Google Voice number if you don't want to give your phone number to everybody.

9:15
Now, my phone number's been out there on the dark web for ages.

9:20
Thank you to the eight breaches that I've been part of.

9:24
But the other big challenge, Natalie, is the breaches related to tracking pixels.

9:33
Now let’s talk about those for a second.

9:36
These are some that have really been, you know, snaring certain health systems when they use things like Google tracking pixels to track what's happening with the data that they get.

9:50
And sometimes millions and millions of people get caught up in all that, and their data's now been shared without consent.

9:59
Now, in HIPAA world, that's a big deal. But in the normal world, where you're not subject to HIPAA?

10:07
Well, now the FTC comes into play, and they'll be doing the enforcement. Very good, very good.

10:15
You know, I do want to shift gears: you recently participated in a webinar about minimizing third-party risk in healthcare, especially given the major breaches in healthcare that have always been happening.

10:24
But the major breach that everybody’s discussing.

10:27
And so I wanted to shift the conversation a little bit to minimizing third party risk.

10:32
If you are a digital health company, whether it be a company that, you know, falls under HIPAA or the Health Breach Notification Rule.

10:41
But what can these digital health companies who are collecting our data do if they don't have the infrastructure to manage the security and the privacy internally?

10:50
What can they do when they’re looking at a third party vendor?

10:55
Well, so if I'm a vendor, let's say I'm ClearData, which is us.

11:02
And we're trying to hire someone, or buy some platform, to help us enable our communications.

11:10
For example, we will do a lot of vetting.

11:12
We will frustrate our marketing department by slowing things down just a bit and say, let's take a look at what's under the covers here and see how they're handling their data throughout the life cycle.

11:25
Data life cycle, very simple.

11:28
Data is created, it's distributed, it's used, it's maintained, it's archived.

11:35
Eventually it’s destroyed.

11:36
You hope.

11:38
And, and so how does the company address the data?

11:42
What are their policies and procedures along that data life cycle and how do they adhere to those policies and procedures?

11:50
Are they doing what they’re supposed to be doing in that scenario?
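The lifecycle stages and the per-stage policy questions Chris describes could be sketched like this. This is purely illustrative, not ClearData's actual tooling; the names and the policy-register structure are hypothetical.

```python
from enum import Enum

class DataLifecycle(Enum):
    """The lifecycle stages Chris lists, in order."""
    CREATED = 1
    DISTRIBUTED = 2
    USED = 3
    MAINTAINED = 4
    ARCHIVED = 5
    DESTROYED = 6

# A vendor assessment might ask for a documented policy per stage.
# None means no policy on file for that stage.
policies = {stage: None for stage in DataLifecycle}

def missing_policies(policies):
    """Return the lifecycle stages with no documented policy."""
    return [stage.name for stage, policy in policies.items() if policy is None]

# A vendor with no privacy program would come back with gaps at every stage.
print(missing_policies(policies))
```

Asking a vendor to fill in each stage, and then checking that they actually adhere to what they wrote down, is the shape of the vetting process described above.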

11:54
You can tell a lot about a company’s privacy practices if you just ask them what the data life cycle is.

11:59
And they say what?

12:00
And then you, you know, have to dig a little deeper and understand. In many cases, you'll find this hilarious.

12:08
We'll have companies that want to do business with us that have never had, or even imagined, a privacy policy.

12:18
Shocking.

12:19
So what we end up doing is we have to sit down with them and help educate them about privacy and what it’s about and why it’s important.

12:27
And we'll even help them create their own privacy policy in a way that works.

12:33
And then we’ll check on them on a periodic basis.

12:36
And that's being held to the high, rigid standards of security and privacy.

12:40
Absolutely.

12:41
Yeah.

12:41
Yeah.

12:41
Now in HIPAA world, it's mandated.

12:44
We have to slow down the contract terms or they don’t get to play ball.

12:49
We just turned down a huge company a couple weeks ago because they refused to share with us any of their audit reports showing that they're doing what they say they're doing.

13:00
We had to say no, I’m sorry.

13:01
If you, if you can’t show us that, then we can’t do business with you.

13:05
Wow.

13:06
But that’s what needs to happen.

13:09
Absolutely.

13:09
And because at the end of the day, we're handling patient data, and this is about patient safety.

13:14
And you have, you know, led ClearData, and founded ClearData with a patient-first mindset and everything like that.

13:19
So, you know, do these digital health companies have a responsibility to increase transparency and communication with the consumers?

13:28
I know we’re not technically patients, but you know, our health records, as you’ve stated many times, they don’t expire.

13:33
That is why there is such a high demand for this data.

13:36
So, you know, what can the companies do to maybe increase transparency with their consumers?

13:43
Maybe it's education. And then just kind of ending on what we as consumers, in general, can do to protect our data, whether it be passwords and X, Y, and Z, these small hacks.

13:53
Like you said, you've been breached eight times, and you are following all of the proper guidelines, of course. But, you know, just a little bit of education to leave our consumers with.

14:04
Sometimes you can’t do anything about it.

14:05
I was wrapped up in, in two breaches from our federal government, the US House of Representatives and the FBI.

14:13
So my data was sitting there; it was supposed to be protected.

14:16
And guess what?

14:17
It just, it wasn’t.

14:19
So sometimes you're just a victim and you're doing all the right things. To your point, in terms of transparency: share with the world what you're doing from a privacy perspective.

14:30
Apple does a great job at that.

14:32
I really do admire, you know, Apple's approach, which is: let's be transparent.

14:37
Let's tell you that your iPhone was logged into by someone somewhere else, and let's make sure someone knows that.

14:46
Let's make sure that your SMS... not your SMS, but your iMessage, is encrypted end to end, which is the case.

14:57
It’s not the case with everything for sure.

15:00
But if you say you're going to do something as a company, do it. Absolutely.

15:07
There is responsibility there for sure.

15:10
Yeah, we are not going to have enough time to talk about it today, but I would like to ping you for a discussion about a national privacy law.

15:16
There have been some updates and legislation there.

15:19
So I think that that will be our next episode.

15:22
But, you know, I do want to leave viewers with this: ClearData offers a Cloud Risk Checkup.

15:28
If you are a company working in the cloud and you'd like your infrastructure evaluated, please do. I think we'll have another QR code on the screen somewhere.

15:37
So please scan that.

15:38
It’s painless.

15:39
We promise.

15:40
We'll scan the infrastructure and give you a full report of your cloud security posture.

15:44
And we look forward to the next episode.

15:46
Chris, thank you as always for your thoughts today.

15:49
Any parting thoughts before we log off?

15:53
You know, I'll echo what you said on the Cloud Risk Checkup. One of the things that we're seeing, and this comes straight from the Change Healthcare breach, is IAM: identity and access management.

16:05
If you think that you're doing great in your cloud environment and you haven't done some hygiene on your IAM, go take a look.

16:13
Make sure you have MFA in place.

16:15
Make sure that you have rotated your keys and, you know, eliminated those user accounts, those other accounts that need to be pruned, if you will. That's what brought down Change Healthcare.

16:29
So the Cloud Risk Checkup will help cover some of that.
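As a rough illustration of the IAM hygiene Chris describes (MFA everywhere, rotated keys, pruned dormant accounts), here is a minimal sketch in Python. The user records and thresholds are hypothetical; in practice you would pull this data from your cloud provider, for example an AWS IAM credential report.

```python
from datetime import datetime, timedelta

# Hypothetical IAM user records; in practice these would come from your
# cloud provider's API (e.g. an IAM credential report).
users = [
    {"name": "alice", "mfa_enabled": True,
     "key_created": "2025-05-01", "last_login": "2025-06-01"},
    {"name": "svc-legacy", "mfa_enabled": False,
     "key_created": "2022-01-15", "last_login": "2023-03-10"},
]

MAX_KEY_AGE = timedelta(days=90)   # rotate access keys at least quarterly
MAX_IDLE = timedelta(days=180)     # prune accounts unused for six months

def iam_findings(users, today):
    """Flag users missing MFA, holding stale keys, or long dormant."""
    findings = []
    for user in users:
        if not user["mfa_enabled"]:
            findings.append((user["name"], "no MFA"))
        if today - datetime.fromisoformat(user["key_created"]) > MAX_KEY_AGE:
            findings.append((user["name"], "access key older than 90 days"))
        if today - datetime.fromisoformat(user["last_login"]) > MAX_IDLE:
            findings.append((user["name"], "dormant account, consider pruning"))
    return findings

# svc-legacy is flagged for all three issues; alice passes cleanly.
print(iam_findings(users, datetime(2025, 7, 1)))
```

Running a check like this on a schedule, rather than once, is what turns it into the kind of hygiene being recommended here.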

16:34
Thank you.

16:35
Yes.

16:35
And to that point, that reminded me: I'm going to link in the resources here a couple of recent events from ClearData, including Chris's webinar about third-party risk.

16:43
We'll link to episode #13 as well as that Cloud Risk Checkup.

16:47
So thank you all for watching and we’ll see you next time on episode 22.

16:52
Thanks Chris.

16:52
Thanks, Natalie.

16:53
Bye.

Don’t wait for cloud unknowns to become cloud nightmares.

Schedule Your Cloud Risk Checkup Today.

Request Checkup