February 17, 2023

Issue 15: The FTC might have changed the game in GoodRx settlement

Oh hey! Welcome to The Privacy Beat Newsletter!

Here’s the gist: Come here for insights on the hottest topics in privacy according to our peers’ tweets so you can walk into any happy hour or team meeting and sound like the absolute baller you are. No current topic gets by you. Did you post a hot take you want included? Tag it #PrivacyBeatNews and see if it makes it into the next edition!

It's me. Hi. I'm the problem it's me.

I’m running late today, so you’re probably getting this close to closing up shop. But here’s the thing: I really wanted to write about this GoodRx case for you, because privacy pros around here (in D.C.) are buzzing about it. It’s not a simple case, though, and I’ve been on my group thread of privacy nerds all morning trying to wrap my head around what happened.

In fact, I called my poor buddy on vacation in Grand Cayman to walk me through parts of it. It’s dangerous to become my friend. I have needs. But here's what I've got, as succinctly as I can put it while writing on deadline.

GoodRx got a good 'talking to' last week

The Justice Department (on behalf of the FTC) fined GoodRx $1.5 million. GoodRx runs a telemedicine platform that lets users search for better prices on prescription drugs. As The Verge reported, an FTC investigation found that GoodRx had been sharing data with “Facebook, Google, Criteo, Branch, and Twilio since at least 2017.”

The case is historic because it’s the first time the FTC has enforced the Health Breach Notification Rule, but it’s also historic because the FTC, as a result of this case, has straight-up banned GoodRx from selling health information – forever.

If you need a brush-up: The HBNR emerged in 2009, and it aims to keep health data protected no matter who’s holding it. Specifically, the rule requires vendors of personal health records to notify consumers of a breach. And if a vendor’s service provider suffers a breach, the rule requires the service provider to notify the vendor, and the vendor to notify its consumers.

It’s like: We don’t care who owns the data or who lost it, you must notify the consumers.

Normally, you’d see the rule invoked for an actual breach, but this case is about data sharing. What’s nuanced here is that the FTC said that because GoodRx customers didn’t know their data was being shared, the sharing itself counted as a breach. In essence: If it’s a surprise to customers that you’re revealing their data to a third party, then it’s a breach.

What’s also interesting about the case is that even though GoodRx wasn’t selling data to advertisers, it was sharing data with Facebook in order to retarget its own customers. GoodRx grouped customers, based on their health information, into “custom audiences” – “Viagra-takers,” for example.

GoodRx would upload those groups to Facebook, and Facebook would advertise to them on GoodRx’s behalf. GoodRx also provided user contact details to build the audiences. Sharing data in this way is a common practice, and Facebook wasn’t taking GoodRx’s data and selling it or retargeting with it elsewhere. But the combination of GoodRx’s data with Facebook’s data exposed GoodRx customers’ health information in a way that was unfair, the FTC alleged. The agency said the health-related data was used for advertising, and you need consent for that.

The fact that the FTC considered those custom audience names “health-related” personal information is also nuanced: The names revealed only the drugs and conditions involved, and the FTC said inferences about users’ health could be drawn from that.

The bottom line in this case, and what has privacy peeps OMG-ing, is that the FTC – using its unfairness tool – was able to force GoodRx to change its conduct by forbidding it from selling health data, period. And, as Justin Brookman pointed out, there could be significant implications for other apps sharing health data without consent – even if they believe they are within their legal bounds.

New resources for ya

Do you know how to PbD with security?

I don't know if you heard, but on Feb. 8, the ISO released a standard on privacy by design! What a time to be alive. The vibe I get is that many of us are still trying to figure out what privacy by design even means, especially how it applies to our own organizations. Check out this IAPP webinar I recorded earlier this week with my COO, Chris Handman, and Jason Cronk, who wrote the textbook on PbD for the IAPP. We're talking about "Five ways to build a bulletproof PbD program with your security partners." Because here's the thing: Not only can PbD be done, but you can leverage some of your security pals' shared needs and goals to tackle it together. This is a bit on how to do that in the wild.

Watch it here

The change is the constant: How to future-proof your privacy program

Chaos is just part of the lifestyle if you're a privacy pro in 2023. But it doesn't have to feel so reactive and anxiety-inducing, according to a couple of peeps who've put processes in place to avoid that. Here's a post based on a chat I had with my COO, Chris Handman, and Uber CPO Ruby Zefo, on how to organize your priorities, think about strategy, and implement some good old PbD.

Check out the post here

Hot take of the week