
August 25, 2023

Issue 24 — California bill aims to take a hot poker to data brokers

Here’s the gist: Come here for insights on the hottest topics in privacy according to our peers’ tweets so you can walk into any happy hour or team meeting and sound like the absolute baller you are. No current topic gets by you!


Welp, it's August, and things slow down a little bit. But there are a couple of trends I'm seeing in the space that I think are worth talking about this week. First, the widespread efforts by state lawmakers, federal regulators, and even journalists to get a handle on this whole data broker problem. It's a strange problem to confront, because data brokers operate in the shadows. It can be difficult to identify any one target and say, "You're the problem." Plus, data brokers work in partnership with so many businesses and organizations, and the data ecosystem is massive and complex. But it's a problem worth confronting, so we'll try.

Second, the children's privacy space continues to blow up.

Here are the details I want to bring to your attention in this edition of this here newsletter, which is written with love from me to you.

California aims to take a hot poker to data brokers

California lawmakers have introduced the “Delete Act,” or SB 362, which aims to extend the CCPA’s reach on opt-out requests. The act would allow Californians to click once to submit deletion or opt-out requests to all data brokers, as well as their associated service providers and contractors. As JD Supra reports, it would create a “do not sell” list for data brokers targeting the state’s residents.

Under the proposed act, a data broker is defined as a business that “knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship.”

If it passes, data brokers would have to comply with deletion requests by August 1, 2026, and they would be required to continuously delete consumers’ personal data at least once every 31 days. Really clean things out, you know?

Fines for failure to register as a data broker stand at $200 per day, with an additional $200 per day for each unfulfilled deletion request.

Obviously, the data broker industry doesn’t like this bill. POLITICO reports that the Interpublic Group, a giant conglomerate of ad networks, is “pulling out all the stops” to fight it.

The POLITICO scoop uncovered an Aug. 14 email from Interpublic Group CEO Sheila Colclasure to executives saying, in part, “We would like to mount an ‘opposition campaign’ using in-house digital advertising capabilities, targeting California.”

Their campaign method involves using the same personal data that Californians could ask to have deleted to launch targeted ads opposing the bill. Acxiom CEO Chad Engelgau has kindly pledged that his company would provide the data to target the ad campaign.

What a time to be alive! The very data consumers may want deleted will be used to combat the idea that a law would protect it at all.

Relatedly, this week, 404 Media reported that a network of hackers has gained access to individuals’ credit history data and is selling it in online messaging groups like Telegram. Because credit bureaus have exceedingly valuable information on people, years ago they decided to share or sell some of it to third parties, like debt collectors, insurance companies, and law enforcement. Now, hackers have tapped into what’s called “credit header data,” the information credit bureaus receive from credit card companies, which includes name, birth date, current and prior addresses, Social Security number, and phone number. The 404 Media journalist, Joseph Cox, was able to purchase that data and more in chatrooms aimed at facilitating “swatting” for a mere $15 in Bitcoin. $20 if you want the Social Security number, too.

In the meantime, the Consumer Financial Protection Bureau is studying the data broker industry as it works out its approach to new rules under the Fair Credit Reporting Act to address the data broker marketplace.

Allegations fly that YouTube may (still) not be complying

A couple of advocacy groups have asked the FTC to investigate whether Google and YouTube are being naughty by delivering personalized ads on YouTube channels made for kids. As CYBERSCOOP reports, Fairplay and the Center for Digital Democracy want the FTC to investigate whether that behavior violates COPPA, as well as Google’s 2019 settlement with the agency. In that case, as you’ll recall, Google and YouTube paid $170 million for collecting kids’ personal information without parental consent.

While there's no official evidence YouTube is violating the law, Fairplay said it did its own research following news reports last week that alleged YouTube was putting ads for adult products on videos designed for children — allegations that incited calls from lawmakers for the FTC to investigate.

The kids’ space is almost on pace with state privacy laws lately. Congress has been considering updating COPPA for a while now, and then there’s the Kids Online Safety Act. Both are on their way to the Senate floor, having been voted out of the Senate Commerce Committee.

Other developments in the space include:

  • Microsoft’s recent $20 million FTC settlement over COPPA violations.
  • Amazon’s recent $25 million FTC settlement over COPPA violations via Alexa.
  • New laws in Utah, Arkansas and Louisiana requiring parental consent for access to children’s accounts on certain platforms.
  • The FTC’s policy statement warning ed tech about forthcoming scrutiny.
  • California’s Age-Appropriate Design Code (in effect July 1, 2024).

In addition, and I mentioned this in an earlier newsletter but it’s still happening and interests me greatly, the Entertainment Software Rating Board – which is an authorized COPPA Safe Harbor – has asked the FTC to approve a new mechanism for obtaining parental consent under the law. Yoti and SuperAwesome have asked the agency to deem their “privacy-protective facial age estimation” software a verified solution for COPPA compliance. The technology analyzes a user’s facial geometry to determine their approximate age.

The Entertainment Software Rating Board wrote in an op-ed for the IAPP, “One advantage of privacy-protective age estimation for COPPA and emerging laws is that people like it. SuperAwesome reports, whenever facial age estimation is available as an option for parental consent outside the U.S., more than 70% of parents choose it over other methods.”

I hate to be a pessimist, but I get so nervous about giving a picture of my face to any company. I spent a lot of time reporting from Congress on the merits and drawbacks of using biometrics, and I was forever changed after reading Georgetown Law’s prolific work on the topic. You’ve only got one face, and once it’s out there in the data ecosystem, that’s forever. Beyond the inaccuracies identified in facial recognition technologies, if your face data is misused or leaked, you can’t order a new one like you would a lost or stolen debit card. So the idea that there’s zero risk, or negligible risk, in having people upload selfies to a vendor for any purpose, regardless of the promised safeguards, makes me very nervous.

But I have anxiety, and a lot of things make me very nervous. So maybe I’m wrong here.

Latest podcast episode for you

Upcoming event with me

Hot Tweet of the week

Burn! Hey, thanks for reading! If you liked it, please help me out by sharing it on your socials and subscribing! Or one of those things? Appreciate you. You're doing amazing. Keep it up. xoxo.
