A new bill on children’s online privacy is moving quickly through the California legislature and could lead to a revision of national safety standards for websites that children are likely to access.
The California Age-Appropriate Design Code would require social media platforms to turn off geolocation tracking for children, stop “prompting” techniques that trick children into giving up information, reduce exposure to harmful content and limit the potential for risky relationships with adults.
Co-authored by Assembly members Buffy Wicks, a Democrat from Oakland, and Jordan Cunningham, a Republican from Templeton, the bill was modeled after a recently adopted UK law. At a time when teens spend an average of 8.5 hours online each day, the bill would force social media companies to apply the strictest default safety settings available to users under 18.
“When you don’t have government regulation that makes this a priority, it becomes an afterthought,” Wicks told CBS News. “What regulation can and must do, and will do, is elevate this conversation to a much higher level within companies.”
More than a dozen bills in Congress
The pressure to pass online child safety legislation in California comes as attempts at the federal level fail to gain momentum. At least 15 bills, several with bipartisan support, are currently circulating in Congress, with goals such as modernizing internet safety standards, making it easier for people to sue large technology companies and setting up a data privacy agency. The American Innovation and Choice Online Act, which focuses on antitrust enforcement in the industry, is the only one that has advanced out of committee.
Despite numerous congressional hearings featuring executives from Meta, Twitter, Snapchat, TikTok and YouTube, as well as explosive whistleblower testimony, progress on these bills has been slowed by other legislative priorities and now seems unlikely as the midterm elections approach.
Security over profit
The California bill says that in the event of a conflict, social media platforms and all websites that are “likely” to be accessible by children should prioritize children’s best interests over their own “commercial interests.”
The phrase is reminiscent of Facebook’s controversial Senate hearings, when lawmakers accused the company of putting profits above safety, an accusation the company denied.
“Their own data is the strongest argument for why this type of legislation, these types of safeguards, are important,” Wicks said.
Meta, which paused a project last year after a backlash from advocates and lawmakers, told CBS News that the company wants to create age-appropriate features, enable teens to take control of their online privacy and experience, and involve parents in the process. A recently launched Family Center gives parents more access to supervisory tools.
Teen profiles on Instagram are private by default. In addition to reminders about Instagram’s Take a Break feature, Meta said it will soon start directing teens to different topics if they dwell on one for long periods of time.
Teen activists speak out
For Emily Kim, the bill is a welcome change. Kim downloaded Instagram as soon as she got her first phone at 13, “so I could fit in,” she said.
“As I scrolled down, looking at the profiles of my peers, I found myself staring at my own image, reading countless captions calling me fat and ugly,” Kim said at an Assembly hearing last month. Her “online torture” continued after an autoimmune disease led to significant hair loss, she told lawmakers.
“My classmates posted pictures of me in countless trends that I couldn’t take part in,” Kim said, adding that she felt “terrible” even though she didn’t post pictures of herself.
Kim, now 18, works with LOG OFF, a teen-led digital wellness advocacy group, to inform her peers “about the harms of social media and how to use it safely.” She spoke in favor of the California bill, saying legislation was needed “to protect young people from growing mental and physical dangers.”
The Wicks-Cunningham bill passed unanimously out of the Assembly’s privacy and consumer protection committee in April. It may reach the Assembly floor this month.
Wicks said the new UK children’s code is working, and if California can successfully follow the same model, “it could have quite significant consequences.”
According to the 5Rights Foundation, a London-based nonprofit that advocated for the UK law and supports the California measure, “a wide range of services have made hundreds of changes to their privacy settings” to comply with the UK code.
In August, Google made SafeSearch the default option and turned off Location History for users under 18 worldwide. YouTube turned off autoplay and turned on default bedtime reminders for those under 18. TikTok also announced improved safety features, including disabling direct messages between children and adults and disabling push notifications after 10 p.m. by default for underage users.
“There is a history of companies taking the strictest state law on a particular topic and simply making it the default across the country,” Eric Null, director of the Privacy and Data Project at the Center for Democracy and Technology, told CBS News.
Null explained that COPPA focuses on “parents taking action to allow a child to use the website or to allow a company to collect data,” while the California bill “focuses much more on what companies are and are not allowed to do.”
Although there is a “good amount” of positive progress in the California bill, Null warned that it could have unintended consequences.
“One of the biggest privacy impacts this type of bill will have is that, in essence, every website will have to determine the age of, and collect age information about, every user, so that these websites can distinguish which people must be treated differently,” Null said. “That requires a lot of data collection for every single user on almost every website,” he added.
Although Meta and Google have not weighed in on the measure, some industry trade groups are raising concerns.
TechNet and the California Chamber of Commerce, two groups opposed to the bill, said it overreaches by covering all sites “likely to be accessible to children,” not just those aimed at children.
The groups also say the bill’s “new age verification standards” will force companies to gather more information about consumers, such as “birthdays, addresses and government IDs.”
The Electronic Frontier Foundation (EFF) told Wicks that it could not support the bill unless it was amended to cover only users under the age of 13, in line with federal law. The EFF also said that many of the bill’s terms were “unclear” and that its implementation mechanisms remained vague.
Wicks said that “we are working on the implementation component at the moment and we are trying to figure out the best way to do that.” She added that the legislation was not intended as an attack on Big Tech, and said she hoped social media executives would come to support it.
“They’re parents, too,” Wicks said.