TerraTrue knows security

September 1, 2022

California’s Age-Appropriate Design Code mandates privacy by design 


California passed what many call the “Kids’ Code” earlier this week. The bill will require websites, platforms, apps and others with content that may attract children to set the strongest privacy settings by default.

California State Assembly members Buffy Wicks, a Democrat, and Jordan Cunningham, a Republican, introduced the bill. It covers for-profit entities captured under CCPA and CPRA that attract users under the age of 18 to their products and services. The bill responds to widespread criticism over how social media companies impact children’s mental health and personal safety. Industry is losing its collective mind over the bill, officially called the Age-Appropriate Design Code, which would become effective July 1, 2024. It now needs only California Gov. Gavin Newsom’s signature to become law.

And it’s no joke, either. The penalties for violation are steep: the California Attorney General is authorized to fine companies $2,500 per affected child for negligent violations and $7,500 per affected child for intentional ones.
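To see how fast those per-child penalties scale, here’s a back-of-the-envelope sketch. Only the $2,500 and $7,500 rates come from the bill; the user count and function name are made up for illustration:

```python
# Per-child fine rates from the Age-Appropriate Design Code.
NEGLIGENT_FINE = 2_500
INTENTIONAL_FINE = 7_500

def max_penalty(children_affected: int, intentional: bool) -> int:
    """Hypothetical upper bound on the total fine for one violation."""
    rate = INTENTIONAL_FINE if intentional else NEGLIGENT_FINE
    return children_affected * rate

# Even a modest audience of 10,000 kids adds up quickly:
print(max_penalty(10_000, intentional=False))  # 25000000
print(max_penalty(10_000, intentional=True))   # 75000000
```

In other words, a negligent violation touching 10,000 children could, in principle, run $25 million.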

Specifically, the code bans companies from:

  • Selling kids’ personal information
  • Profiling kids
  • Designing features that are detrimental to kids’ well-being
  • Tracking kids’ locations, unless essential for the service
  • Using dark patterns to get kids to cough up their information

It’ll require companies to:

  • Provide privacy notices in ways kids can understand
  • Alert kids if they’re being tracked or monitored
  • Design sites based on kids’ estimated ages

It’s no coincidence that the California Kids’ Code looks a lot like the U.K.’s “children’s code,” which requires online services “likely” to attract children to abide by 15 different standards. Though California legislators had to push the bill forward, U.K. non-profit group 5Rights Foundation reportedly lobbied Wicks and Cunningham to get it done. The same group pushed the U.K. children’s code from ideation to passage. That code requires those covered to do age checks, switch off geolocation services, provide a “high level of privacy by default” and avoid nudging kids to provide more personal data through tactics such as dark patterns. 

Preventing harm via contact, conduct, and content … huh?

Industry’s panic isn’t necessarily an overreaction. In a recent chat with Future of Privacy Forum’s Chloe Altieri and Lauren Merk, and Linnette Attai, president of Playwell, the compliance headaches became evident. The bill aims to protect children from harm via “content, conduct, contact, algorithmic harm,” and harm that could result from targeted advertising systems.

“That’s a lot,” Attai said. “If it was easy to fix all those in one bill, we would have done it already.”

Some of the provisions on age verification and who the bill captures mirror COPPA, the children’s privacy law already on the books in the U.S. But it’s not entirely clear who should consider their site or service covered.

“If it looks, smells, feels like it’s for kids, then it’s for kids,” said Attai. “But the bill refers to a ‘significant amount’ of children or teens routinely accessing your site or your service, and that’s where this starts to get tricky. Vague language comes into play. We don’t know what ‘significant’ means. Significant meaning what? Are they going to put a number on that? What does it mean to ‘routinely’ access a service? Routinely meaning what, and to whom? What does it mean to be harmful or potentially harmful to a child when it comes to contact or conduct? Then you get into a very murky area.”

Age verification is gonna be tricky

Altieri said the bill’s requirement that companies do an “age-estimation” is part of industry’s panic.

“It seems like the bill is saying: If you do not estimate the age, and you’re unsure of the age, then you should treat all users as children and put in all of those privacy protections for users,” she said. That leaves companies asking, “Do we estimate the age? Does this cover us? If we do [estimate the age], how do we do that in a way that doesn’t collect more sensitive information?”

Or do you not even bother estimating, because that’s pretty hard, and instead implement the most privacy-protective features by default for everyone? You’re going to lose some data there, in cases where you didn’t necessarily need to.

DPIA requirements are heavy

The bill requires those covered to do DPIAs on any new features or services before they’re introduced to the public. For services not traditionally geared toward children, the obvious question is: Am I really supposed to do a DPIA for every new feature I launch if all I’m doing is writing blogs?

Attai said it’s one thing to ask the giant social media companies to do DPIAs ahead of launches, but, “the local bike shop that has kids coming to their site to look at bikes, they need to do DPIAs now?”

Now, if you’ve already done a DPIA for your GDPR requirements, the bill suggests that’s sufficient. What’s different is that you must do a DPIA for every new feature, system, or offering, keep a record of all of those DPIAs, and revisit them every two years.

If the California attorney general – who seemingly will enforce the law along with the California Privacy Protection Agency – asks to see your DPIAs, you must produce a list of them within three business days, and you then have five days to produce the DPIA itself.

Attai said DPIAs are typically done for honest risk assessment and mitigation purposes, as part of a legal compliance requirement. But those are internal documents, created outside of any consultation process with a regulator.

“So we’re being rather blunt,” Attai said of the internal process. “If you do a DPIA, you’re taking an honest look at what you’re doing with respect to the risks to the individual. Now, I’m saying I need to make sure that counsel has vetted this thoroughly, because it would be going to the enforcement agency whenever they ask for it. It’s a very different exercise. I need a lawyer because it has to be good enough to pass muster, and I don’t want to be putting a target on my back either. Every organization doesn’t have counsel, and every organization doesn’t have privacy counsel, that’s for sure. It’s such a challenge, a burden that hasn’t been discussed,” Attai said.

There’s no word yet on how regulators aim to implement the law. That will come with regulations from the attorney general and the CPPA. We’ll keep you posted. For now, the bill awaits Gov. Newsom.

Hey, if you want to know more, check out this 25-min, quick-hit chat between Attai, Altieri, Merk, and Angelique Carson.