California’s New Child Privacy Law
A child, by reason of his physical and mental immaturity, needs special safeguards and care, including appropriate legal protection.[1] California’s existing data privacy laws, namely the California Consumer Privacy Act of 2018 (CCPA) as amended by the California Privacy Rights Act of 2020, require businesses to safeguard consumer information and privacy and to educate consumers about their rights, including the right to access the specific information a business has collected about them.
With a wider scope, the Parent’s Accountability and Child Protection Act of 2018 requires any person or business in California that seeks to sell certain products or services to take reasonable measures to ensure that the purchaser is of legal age at the time of purchase or delivery.
California has now developed a novel approach to protecting children’s online safety. The California Age-Appropriate Design Code Act (CAADCA) was signed into law on September 15, 2022. The bipartisan measure is set to take effect on July 1, 2024.
What is the scope of applicability of the new code?
The legislation aims to further the purposes and intent of the California Privacy Rights Act of 2020 (CPRA), and is therefore understood to apply only to businesses subject to the CPRA. Under the CPRA, a covered business is defined as a “for-profit entity doing business in California that collects personal information of California residents and meets specific threshold criteria.”[2] The legislation does not cover broadband internet access services, telecommunications services, or the delivery or use of a physical product.
What does this new law bring to the table for California?
Unlike other California privacy laws, the CAADCA defines a “child” as a consumer under 18 years of age. The new legislation declares that children should be protected not only from online products and services specifically directed at them, but from all online products, services and features that are “likely to be accessed by children”, meaning those that it is reasonable to expect children to access. An online product, service or feature is “likely to be accessed by children” if it:
i. Is “directed to children”, as defined by the Children’s Online Privacy Protection Act (COPPA);
ii. Is determined to be routinely accessed by a significant number of children, as evidenced by reliable and competent reports on audience composition;
iii. Contains advertisements marketed to children;
iv. Is substantially the same as, or similar to, another online product, service or feature routinely accessed by children;
v. Uses design elements that are known to be of interest to children, including but not limited to games, cartoons, music, and celebrities who appeal to children; or
vi. Has an audience that the business determines, based on its internal research, is composed of a significant share of children.
What requirements must such businesses comply with under this Act?
Businesses providing online services, products and features as discussed above are required to consider the best interests of children when designing, developing and providing them to young consumers. Where a conflict arises between commercial interests and the interests of children, companies must prioritize the latter.
The Code requires a business providing an online product, service or feature likely to be accessed by children to take the following actions:
1. Data Protection Impact Assessment (DPIA): Defined by law to mean “a systematic survey to assess and mitigate risks that arise from the data management practices of the business to children who are reasonably likely to access the online service, product or feature.” Under the new Code, businesses whose online products, services or features are likely to be accessed by children must complete a DPIA before offering any new product, service or feature to the public. Specifically, the assessment must identify the following:
o The purpose of the online product, service, or feature;
o How it will use children’s personal information; and
o The risks to children arising from the business’s data management practices.
The business should document the risks identified during the DPIA that are detrimental to children and create a plan to mitigate or eliminate those risks before the product, service or feature is made available to children.
2. Age Estimation: Under the CAADCA, a business is required to estimate the age of child users with a level of certainty that is reasonable and appropriate to the risks arising from the business’s data management practices.
3. Provide Privacy by Default: The term “default” is defined to mean “a preselected option adopted by the business for the online service, product, or feature.” Unless the business can present compelling evidence that a different option is in the best interests of children, it must configure all default privacy settings to a setting that offers a high level of privacy (see the illustrative sketch after this list).
4. Provide a clear privacy policy: All privacy information, terms of service, policies and community standards must be provided in clear, concise language that children of the age group most likely to access the product, service or feature can understand.
5. Signal any tracking or monitoring: If the online product, service, or feature allows a child’s parent, legal guardian or any other consumer to monitor the child’s online activity or track their location, it must provide an obvious signal to the child that they are being monitored or tracked.
6. Provide for the exercise of privacy rights: Businesses must provide prominent, accessible and responsive tools to help children, or their parents or legal guardians, exercise their privacy rights and report concerns.
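For engineering teams, the “privacy by default” obligation in item 3 is the most directly translatable into configuration. The following is a minimal sketch in Python, assuming a hypothetical settings model; the field names and values are illustrative examples of what “high level of privacy” defaults might look like, not settings prescribed by the Act.

```python
from dataclasses import dataclass

# Illustrative only: the CAADCA does not prescribe a schema. Field names and
# default values here are hypothetical examples of high-privacy defaults.
@dataclass
class ChildAccountSettings:
    profile_visibility: str = "private"      # not discoverable by strangers
    direct_messages: str = "contacts_only"   # no messages from unknown adults
    precise_geolocation: bool = False        # off unless strictly necessary
    personalized_ads: bool = False           # no profiling-based advertising
    autoplay: bool = False                   # avoid engagement-maximizing nudges

def relax_setting(settings: ChildAccountSettings, name: str, value, justification: str):
    """Change a default only with a documented best-interests justification,
    mirroring the Act's requirement for compelling evidence."""
    if not justification.strip():
        raise ValueError("A documented best-interests justification is required.")
    setattr(settings, name, value)
    return settings
```

The point of the sketch is the direction of the default: every preselected option starts at the most protective value, and any relaxation must be justified and recorded rather than assumed.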
What restrictions are imposed under this Act?
CAADCA restricts any business that provides an online service, product, or feature likely to be accessed by children from taking any of the following actions:
1. Using the personal information of any child: a) in a way that the business knows is harmful to the child’s physical or mental well-being; b) for any reason other than the one for which it was collected (if the end user is a child); c) to estimate age or age range for any other purpose; or d) retaining that information for longer than necessary for the purposes listed above.
2. Profiling a child, which is defined to mean “any form of automated processing of personal information that uses such information to evaluate certain aspects relating to a natural person, including analyzing or predicting aspects concerning a natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.” Profiling is permitted only if two criteria are met: first, the business can demonstrate that it has appropriate safeguards in place to protect children; and second, either the profiling is necessary to provide the online product, service or feature, or the business can demonstrate that the profiling is in the best interests of children (see the sketch following this list).
3. Collecting, selling, sharing or retaining unnecessary personal information unless the business proves a convincing reason for doing so.
4. Collecting, selling, or sharing geolocation information by default, unless it is strictly necessary to provide the online product, service or feature.
5. Collecting geolocation information without providing an obvious signal to the child that it is being collected.
6. Using “dark patterns” to lead or encourage children to provide personal information beyond what is reasonably necessary, or to forgo privacy protections.
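The profiling restriction in item 2 is essentially a two-part conditional. The sketch below shows that logic, assuming hypothetical boolean inputs that a compliance review might record; it is an illustration of the structure described above, not a statutory test.

```python
# Illustrative sketch of the profiling conditions described above; the flags
# are hypothetical inputs a compliance review might record, not statutory terms.
def profiling_permitted(has_appropriate_safeguards: bool,
                        necessary_to_provide_service: bool,
                        documented_best_interest: bool) -> bool:
    # Profiling a child is allowed only if appropriate safeguards exist AND it
    # is either necessary to provide the requested product/service/feature OR
    # there is a documented case that it is in the child's best interests.
    return has_appropriate_safeguards and (
        necessary_to_provide_service or documented_best_interest
    )
```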
What will be the repercussions for violating this Code?
Any business that violates the Code is subject to an injunction and liable for a civil penalty of up to $2,500 per affected child for each negligent violation, and up to $7,500 per affected child for each intentional violation. Penalties may be assessed and recovered only in a civil action brought in the name of the people of the State of California by the Attorney General; the Act does not provide a private right of action.
Any fees, penalties and expenses recovered in an action brought under the Code are deposited in the Consumer Privacy Fund, which is used to offset costs incurred by the Attorney General in connection with the Act.
However, where a business is in substantial compliance with the requirements of the Code, the Attorney General must provide written notice to the business before initiating an action. The business will not be liable for a civil penalty if it cures the violation and provides written notice of the cure to the Attorney General within 90 days of receiving the notice.
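Because the penalties scale per affected child, potential exposure grows quickly with audience size. The following back-of-the-envelope sketch applies the statutory ceilings to an assumed number of affected children; actual liability would be determined in an action brought by the Attorney General.

```python
# Hypothetical arithmetic only: the Act's per-child penalty ceilings applied
# to an assumed number of affected children.
NEGLIGENT_PER_CHILD = 2_500
INTENTIONAL_PER_CHILD = 7_500

def max_exposure(affected_children: int, intentional: bool) -> int:
    """Upper bound on civil penalties for a single violation type."""
    rate = INTENTIONAL_PER_CHILD if intentional else NEGLIGENT_PER_CHILD
    return affected_children * rate

# Example: a negligent violation affecting 10,000 children.
# max_exposure(10_000, intentional=False) -> 25_000_000, i.e. $25 million.
```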
Why did the Act create the California Children’s Data Protection Working Group?
The Act created the “California Children’s Data Protection Working Group” to deliver a report to the Legislature regarding the best practices for implementation of the Act. Its main objectives are as follows:
· Identifying online services, products or features likely to be accessed by children.
· Evaluating and prioritizing the best interests of children and how those interests can be furthered by the design, development, and implementation of an online service, product or feature.
· Ensuring that the age assurance methods used by businesses are proportionate to the risks that arise from the businesses’ data management practices.
· Assessing and mitigating risks to children that arise from the use of an online service, product or feature.
· Publishing privacy information, policies and standards in language suited to children.
What is the way forward for Businesses?
Businesses subject to the CCPA may begin by assessing whether their online services, products or features are likely to be accessed by children. They can then proceed to comply with the other requirements, such as the Data Protection Impact Assessment, privacy by default, and tracking and monitoring signals. The same data protection regime may not be appropriate for children of all ages. To help businesses design online products, services and features, the Act suggests taking into account the unique needs of different age ranges, including the following developmental stages: 0 to 5 years of age, or “preliterate and early literacy”; 6 to 9 years of age, or “core primary school years”; 10 to 12 years of age, or “transition years”; 13 to 15 years of age, or “early teens”; and 16 to 17 years of age, or “approaching adulthood”.
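As a simple illustration, these suggested age ranges translate naturally into a lookup that a product team might use to tailor protections by developmental stage. The function name and labels below are illustrative only; the Act does not mandate any particular implementation.

```python
# Age ranges suggested by the Act for tailoring protections; labels mirror the
# developmental stages listed above. Implementation details are hypothetical.
AGE_BANDS = [
    (0, 5, "preliterate and early literacy"),
    (6, 9, "core primary school years"),
    (10, 12, "transition years"),
    (13, 15, "early teens"),
    (16, 17, "approaching adulthood"),
]

def developmental_band(age: int) -> str:
    """Return the developmental-stage label for a given age."""
    for low, high, label in AGE_BANDS:
        if low <= age <= high:
            return label
    return "adult"  # 18 and over falls outside the Act's definition of a child
```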
Final Thoughts on the new law
A comparable regulation recently put into effect in the UK, the Age Appropriate Design Code (also known as the Children’s Code), served as the CAADCA’s blueprint. The UK legislation is credited with prompting changes on major platforms such as Facebook, YouTube, Instagram and TikTok. For instance, YouTube, TikTok and Instagram made teenagers’ uploads private by default, and YouTube disabled by default the “Autoplay” feature that nudged them to keep watching. Google activated SafeSearch for all users under the age of 18. Further, TikTok, Instagram and Snapchat disabled direct messages between children and unknown adults.
Many parents and authorities who want to safeguard young consumers have applauded these developments. Tech industry groups, however, vehemently opposed the California measure. The Code’s ambiguity raises serious concerns about how businesses will comply with its requirements once it takes effect. The legislators chose not to define certain terms precisely, leaving tech companies to pause and mull over phrases such as “material” harm to well-being and the broad sweep of “likely to be accessed by children”.
Another criticism concerns the law’s requirement that online businesses estimate the age of child users with a “reasonable level of certainty”. It is feared that such a requirement might compel companies to gather more personal data. Instagram’s invasive approach is a case in point: to change a user’s age from under 18 to over 18, Instagram now requires a government-issued ID, a “video selfie” for facial analysis, or vouching by three adult followers. This intrusive policy was implemented in June 2022.
California is now a torchbearer for children’s privacy, setting the standard for making the digital space a secure environment for kids. Tech businesses can expect some clarification by January 2024, when the working group is due to issue additional implementation recommendations.
References:
California Legislative Information, The California Age-Appropriate Design Code Act 2021-22, AB-2273, https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=202120220AB2273&showamends=false (last visited December 2022)
The National Law Review, Impact on Companies of California’s Children’s Privacy Law, https://www.natlawreview.com/article/impact-companies-california-s-children-s-privacy-law-effective-2024 (last visited December 2022)
Office of the Governor of California, Governor Newsom Signs First-in-Nation Bill Protecting Children’s Online Data and Privacy, https://www.gov.ca.gov/2022/09/15/governor-newsom-signs-first-in-nation-bill-protecting-childrens-online-data-and-privacy/ (last visited December 2022)
Lewis Rice, California Passes Another Privacy Law, This Time for Children Under 18, https://www.lewisrice.com/publications/california-passes-another-privacy-law-this-time-for-children-under-18/ (last visited December 2022)
Fast Company, Why California’s New Child Privacy Law Could Yield New Protections Nationwide, https://www.fastcompany.com/90809105/why-californias-new-child-privacy-law-could-yield-new-protections-nationwide (last visited December 2022)
Stateline, The Pew Charitable Trusts, California’s New Child Privacy Law Could Become National Standard, https://www.pewtrusts.org/en/research-and-analysis/blogs/stateline/2022/11/07/californias-new-child-privacy-law-could-become-national-standard (last visited December 2022)
[1] Convention on the Rights of the Child, 1989
[2] Alexander Misakian, Young, California Enacts the California Age-Appropriate Design Code Act, JDSUPRA, (2022), available at https://www.jdsupra.com/legalnews/california-enacts-the-california-age-5700283/