Passed by the California Legislature on August 30, 2022, and signed into law on September 15, 2022, California’s Age-Appropriate Design Code Act (AADC) takes effect on July 1, 2024. The bill imposes new safety regulations on companies operating in California, with a focus on large social media and gaming platforms whose child user base is growing daily. To prepare for the law to take full effect, California tech companies must redesign their digital services, products, and features to protect California users under the age of 18 from online risks.
In this article, you’ll learn about the act’s key provisions, how platforms must comply, and the new dynamic between the California tech industry and lawmakers.
Bolstering online protections for children in California
Online platforms thrive by engaging users and creating a community within their digital spaces. Though this approach is beneficial and entertaining in many ways, it can pose a number of risks for children using platforms that were not specifically designed for their age group.
Background information on the act
The California Senate passed the Age-Appropriate Design Code Act (AB 2273) on August 30, 2022, by a vote of 33 to 0, after the State Assembly had already approved an earlier version of the bill. California Governor Gavin Newsom signed the bill on September 15, 2022. The law was modeled on the United Kingdom’s Age Appropriate Design Code (AADC); the UK government is now working to pass the Online Safety Bill (OSB), which places additional focus on child safety. California has been a US leader in children’s online safety, enacting protections over the past decade that other states have later replicated.
Importance of the bill passing in California
California has become the first state to require general-audience websites and applications to safeguard users under the age of 18; previously, only children under 13 were protected under the federal Children’s Online Privacy Protection Act of 1998.
The bill has sparked a nationwide conversation in the US about the potentially damaging effects that social media content can have on children. Mental health risks, such as poor body image and internet addiction, are among the concerns underlying the bill, though the majority of its regulations are aimed at protecting child users’ data.
Platforms that must comply
Every “business that provides an online service, product, or feature likely to be accessed by children,” along with “online products and services specifically directed at [children],” is required to meet the regulations in the California Age-Appropriate Design Code Act. [AB-2273, Sec. 1(a)(5)]
The scope of this bill applies to “businesses” operating in California, defined by the California Consumer Privacy Act (CCPA) as for-profit companies that meet one or more of the following criteria:
Possesses an annual gross revenue of more than $25 million
Derives 50% or more of its annual revenue from the sale of personal data
Receives, purchases, or sells for commercial purposes the personal information of 50,000 or more consumers, households, or devices
Determined age groups for children in California
This bill applies only to the protection of child users located in California; it defines children as “a consumer or consumers who are under the age of 18.”
Child age groups according to the bill include:
0 to 5 years of age or “preliterate and early literacy”
6 to 9 years of age or “core primary school years”
10 to 12 years of age or “transition years”
13 to 15 years of age or “early teens”
16 to 17 years of age or “approaching adulthood”
[AB-2273, Sec. 1(a)(5)]
Entities exempt from the bill
The bill exempts certain entities from its scope. The service types listed below are not classified as an “online service, product, or feature” under the bill’s definition and are therefore not required to comply.
Exemptions include:
Broadband internet services
Use/delivery of physical products
HIPAA-covered medical and clinical-trial information, and healthcare providers
Key online safety protections for children
The bill requires platforms and apps intended for general audiences to implement protections for children, particularly on services children are most likely to access. Platforms and apps will need to analyze their products and implement features that reduce the risks posed to minors, such as exposure to explicit content or design patterns that keep them online longer than is considered healthy.
“Children should be afforded protections not only by online products and services specifically directed at them, but by all online products and services they are likely to access.”
[AB-2273, Sec. 1(a)(5)]
Obligations and design standards for in-scope platforms
Data privacy protections
Default settings
Online services will need to apply the highest privacy settings to users under the age of 18 by default. These protective settings must be pre-selected and in place whenever a child uses an online service, product, or feature, including disabling any feature that would reveal the child’s location.
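As a rough, non-authoritative sketch of what enforcing such defaults could look like, the TypeScript below applies the most protective settings to any user under 18 at sign-up. Every name here (PrivacySettings, defaultSettingsFor, and the individual fields) is a hypothetical illustration, not language from the act.

```typescript
// Hypothetical sketch: apply the most protective defaults to minors.
// All names and fields are illustrative assumptions, not AADC requirements.

interface PrivacySettings {
  profileVisibility: "public" | "friends" | "private";
  preciseGeolocation: boolean;          // precise location sharing
  directMessagesFromStrangers: boolean; // unsolicited contact
  personalizedAds: boolean;             // profiling-based advertising
}

function defaultSettingsFor(ageYears: number): PrivacySettings {
  if (ageYears < 18) {
    // Highest privacy by default for minors; features that could
    // reveal the child's location are disabled.
    return {
      profileVisibility: "private",
      preciseGeolocation: false,
      directMessagesFromStrangers: false,
      personalizedAds: false,
    };
  }
  // Adults can start from the platform's standard defaults.
  return {
    profileVisibility: "public",
    preciseGeolocation: true,
    directMessagesFromStrangers: true,
    personalizedAds: true,
  };
}
```

Centralizing the defaults in one age-gated function makes it harder for an individual feature to accidentally skip the protective settings.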
Data assessment
Platforms must perform a Data Protection Impact Assessment (DPIA) before the law takes effect and before offering new products, services, and features online. DPIAs are systematic reviews performed, in this case, to identify and minimize the risks arising from a business’s data management wherever children are likely to access an online service, product, or feature that poses risks.
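To make the exercise concrete, the outcome of a DPIA for a single feature could be captured in a record like the hypothetical one below; the fields are assumptions loosely based on the risks the bill names, not a format the bill prescribes.

```typescript
// Hypothetical DPIA record for one online feature; the shape is an
// illustrative assumption, not a format prescribed by AB 2273.
interface DataProtectionImpactAssessment {
  feature: string;                   // e.g. "public comment threads"
  likelyAccessedByChildren: boolean; // triggers the act's obligations
  dataCollected: string[];           // categories of personal data involved
  risksToChildren: string[];         // e.g. "contact from unknown adults"
  mitigations: string[];             // planned steps to reduce each risk
  completedOn: Date;                 // must predate offering the feature
}
```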
No storing or selling child user data
In addition to adopting an age-appropriate design under this bill, platforms will be prohibited from collecting, retaining, or selling child users’ geolocation data and other personal information without clearly disclosing to the user when and where they are being tracked.
In practice, platforms must not profile child users by default, nor prompt minors on the platform to disclose personal information.
Data minimization
The AADC makes it illegal to collect, sell, share, or store consumer data that is not necessary to deliver a product or service. To keep children’s data safe, companies may use it only for the purpose for which it was collected, and the act permits collecting only the minimal amount of data needed to verify a user’s age.
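One common engineering approach to data minimization, sketched here with assumed field names rather than terms from the act, is to allow-list the fields a service genuinely needs and drop everything else before storage.

```typescript
// Hypothetical data-minimization filter: persist only the fields that are
// strictly necessary to deliver the service; everything else is dropped.
const REQUIRED_FIELDS = ["username", "birthYear"] as const;

function minimize(raw: Record<string, unknown>): Record<string, unknown> {
  const kept: Record<string, unknown> = {};
  for (const field of REQUIRED_FIELDS) {
    if (field in raw) kept[field] = raw[field];
  }
  return kept; // fields like precise location never reach storage
}
```

For example, minimize({ username: "kid42", birthYear: 2012, preciseLocation: "37.77,-122.42" }) would persist only the two allow-listed fields.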
Data subject rights
Children, or their parents where appropriate, must be provided with accessible tools that help them exercise their privacy rights and report concerns.
Additional obligations
Age assurance
The Act requires platforms either to estimate a user’s age with a level of certainty “reasonable” in proportion to the risks posed by their data management practices, or to extend the child-appropriate data protections and privacy settings to consumers of all ages.
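The act does not prescribe specific techniques, but proportionate age assurance is often pictured as a tiered choice; the sketch below assumes three hypothetical risk tiers and method names purely for illustration.

```typescript
// Hypothetical mapping from a service's risk level to an age-assurance
// method; the tiers and method names are illustrative assumptions.
type RiskLevel = "low" | "medium" | "high";
type AgeAssuranceMethod = "self-declaration" | "age-estimation" | "verified-id";

function ageAssuranceFor(risk: RiskLevel): AgeAssuranceMethod {
  switch (risk) {
    case "low":    return "self-declaration"; // minimal extra data collected
    case "medium": return "age-estimation";   // e.g. inference from signals
    case "high":   return "verified-id";      // strongest certainty, most data
  }
}
```

A tiered design also keeps the data collected for the age check itself minimal, which matters because the act permits collecting only the minimum data needed to verify age.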
Policy transparency and enforcement
The bill will also require platforms to maintain clear, accessible, and enforced terms of service, policies, and privacy information. Child users must be afforded the same privacy rights as adult users.
Clearly written rules
Privacy information should be written clearly enough that the children likely to use the service can understand it, and all published terms of service, guidelines, and policies set by a company must be strictly enforced on its platforms.
Documentation
Per their Data Protection Impact Assessment, platforms must document every materially harmful risk that the company’s data management practices pose to children.
Children’s Data Protection Working Group to regulate the AADC
The State of California will establish the Children’s Data Protection Working Group, a body charged with developing and communicating the standards and privacy guidelines that platforms must adhere to.
Established alongside the bill, the Working Group will deliver a report on best practices for implementing it by January 2024 and will be in charge of providing compliance guidance by April 2024.
The Working Group will also be responsible for:
Minimizing children’s online risks
Identifying and assessing which features and services children are likely to access
Pinpointing the “age assurance methods” best suited to reducing children’s online risks
Recommending clear language for children’s privacy policies and related notices
Repercussions of non-compliance
Once the law is in effect, the Attorney General will have the authority to file a civil lawsuit against any company that violates it. Non-compliant companies can face fines of up to $2,500 per affected child for each negligent violation, and up to $7,500 per affected child for each intentional violation.
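To put those figures in perspective: a single negligent violation affecting 10,000 California children could carry fines of up to $2,500 × 10,000 = $25 million, rising to as much as $75 million if the same violation were found to be intentional.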
Summary of requirements for platforms
Conduct a Data Protection Impact Assessment (DPIA)
Inform child users, through helpful policies and notices, when they are being monitored and what their rights are
Establish the highest data privacy settings for children by default
Take appropriate security measures to ensure data privacy
Simplify reporting systems so children can easily report concerns
Write policies and notices that children can easily understand
Keep data collection and storage to the bare minimum required