California lawmakers have passed legislation aimed at protecting the privacy and welfare of minors on social media and shielding them from predators and exploitative commercialization on internet platforms.
If the bills become law, tech companies like Facebook, Snapchat, Twitter, YouTube and Google would have to publicly disclose their content screening policies, a requirement aimed at combating the spread of hate, racism, extremist violence and conspiracy theories online.
The state Assembly passed two bills, AB 2273 and AB 587, on Tuesday, a day after they sailed through the state Senate with strong bipartisan support. The measures will now go to Gov. Gavin Newsom for consideration.
“Our children are being bombarded with information online, and they don’t yet have the capacity to understand all of the information that’s coming at them,” Assemblymember Buffy Wicks (D-Oakland) said ahead of Tuesday’s vote. “We want to make sure that when these products are developed, they are inherently safe and safe by default for our children.”
Wicks is the primary sponsor of AB 2273, the California Age-Appropriate Design Code Act, which prohibits technology companies from using children’s personal information in ways that are harmful to their physical or mental health. Web platforms that children are likely to use would need to adopt privacy measures such as setting privacy settings to high by default, describing privacy policies in language children can understand, and prohibiting children’s personal information from being used for anything other than the purpose for which it was originally collected.
“As a parent, you don’t stand a chance under the status quo. You don’t stand a chance. There’s something going on in the background. There are things that affect your children’s minds, the development of their brains, that you cannot control. Most parents are not software engineers,” said Assembly Member Jordan Cunningham (R-Paso Robles). “As a former prosecutor, I can tell you there are predators out there and they are using these tools to try and get at children. It’s not right, and it’s time for tech companies to get involved.”
A coalition of technology groups, including the Entertainment Software Assn., opposed the legislation. In a statement to lawmakers, they said applying the law to websites “that a child is likely to access” is too broad and would affect far more websites and platforms than necessary.
The News/Media Alliance, an industry advocacy group of which the Los Angeles Times is a member and whose board of directors includes California Times President Chris Argentieri, has been pushing for changes to the bill amid concerns that it would make publishing news online more expensive.
Dr. Jenny Radesky, a developmental behavioral pediatrician and an assistant professor at the University of Michigan School of Medicine, told lawmakers in March that most web platforms have been developed by adults who have not been trained in how children experience the digital world. Designers, she said, often focus on monetization or engagement tactics — luring users in by offering “rewards” for viewing ads, or finding ways to make a site difficult to navigate away from — rather than taking into account the unintended negative consequences for children.
“We find that adult design norms are simply copied and sloppily pasted into children’s digital products,” she said.
TikTok, Pinterest, Twitter, Twitch, LinkedIn, and Discord have not responded to requests for comment on whether they support the Design Code Act, how it would affect them, and whether there are any changes they would like to see made to it. Google, which owns YouTube, and Snap, the owner of Snapchat, also didn’t respond. Reddit, Tumblr and Yelp all declined to comment.
A spokesperson for Meta — the parent company of Facebook, Instagram and WhatsApp — pointed to the company’s “Best Interests of the Child Framework” as a guide to how it builds “age-appropriate experiences” for young users. The spokesperson also cited several platform features that protect young users, including a system under which teens’ accounts default to private and another under which advertisers can use only age, gender and location to target teens with ads.
“We test verification tools on Instagram […] This allows us to provide age-appropriate experiences for people on our platform,” the Meta spokesperson said in an email to The Times. “We also use AI to understand if someone is a teenager or an adult.”
Mark Weinstein, founder of the alternative social media platform MeWe — a small Facebook competitor that has courted users who feel censored by the larger platform — said the Design Code Act “is an important step forward in protecting our children’s privacy and critical thinking skills.”
“Current mainstream social media companies are brainwashing and addicting our kids,” he wrote via email. “The act is thoughtful and necessary given the blind eye of social media companies whose amoral interest lies solely in revenue and sticky eyeballs.”
The bill has also garnered support from one of the loudest voices in the growing chorus of social media criticism: Frances Haugen, the Facebook product manager turned whistleblower who last fall leaked a trove of internal company documents to Congress, the U.S. Securities and Exchange Commission and the Wall Street Journal.
The material in Haugen’s “Facebook files” included internal discussions among Meta employees about the company’s contribution to various social ills, including mental health issues among teenage Instagram users. (The company says the documents have been misrepresented.)
Haugen’s leaks sparked and fueled a renewed wave of public criticism of Facebook. She has since used her platform to advocate for a handful of political efforts to tighten regulation of internet companies, including the Design Code Act. In April, she sat on a panel in Sacramento to discuss online child safety with state legislators.
Although the documents she leaked covered a whole range of problem areas, including online misinformation and political extremism, Haugen said she’s not surprised that it’s the impact on children that draws most lawmakers’ attention.
“The solutions to many of the problems described in my disclosures are actually quite complicated,” she told the Times in May. But “when it comes to kids, it’s really easy.”
Following Haugen’s leaks, Meta paused development of Instagram Kids, a teen-focused app that would have been ad-free and prioritized age-appropriate design. The company, which initially pitched the project as a way to capture kids who would otherwise join Instagram by lying about their age, announced in September that it would step back and discuss the proposed product with parents and other stakeholders before continuing.
Key aspects of the legislation passed on Tuesday were modeled on data protection and privacy restrictions already adopted in Europe. In the U.K., for example, Wicks said, Google has made safe search the default browsing mode for anyone under 18, YouTube has disabled autoplay for underage users, and TikTok and Instagram have disabled direct messaging between children and adults.
Under Wicks’ bill, California’s attorney general could take civil action against companies that don’t comply with the law, including seeking fines of up to $7,500 per child for each violation.
State lawmakers also approved AB 587, which would require social media companies to publicly post their terms of service — the policies that specify which conduct and activities are permitted, prohibited and monitored — and to report that information to the attorney general.
Assemblymember Jesse Gabriel (D-Encino), sponsor of the bill, said it aims to curb the spread of extremism, racism and conspiracy theories via social media.
“Consider the recent mass shootings we’ve had in this country,” he said. “One common thread: the shooters had been radicalized, often by a toxic concoction of white supremacy and extremist ideology.”
Gabriel also lashed out at the country’s major web platforms Tuesday, most of which are based in California, saying they “fought us at every turn.”
Given California’s influence on policy nationwide, both Gabriel and Wicks suggested that other states — and Congress — could use the child-safety and transparency requirements in the legislation as a template for passing their own laws. If the bills become law, Facebook, Google and other web platforms could also simply apply the restrictions and protections nationwide.
“Would you have different rules for children in California than in Nevada? No, you would just create a standard that you would adhere to everywhere,” Wicks said.
Source: “California lawmakers pass new social media protections for minors,” Los Angeles Times — https://www.latimes.com/politics/story/2022-08-30/california-lawmakers-pass-new-social-media-protections-for-minors