As lawmakers struggle to act, California schools turn to the courts to combat social media
A California teenager committed suicide in her bedroom after watching disturbing videos of a simulated hanging. Another teen in the Bay Area refused to eat and was hospitalized due to an eating disorder. An 8-year-old girl in Temple, Texas, died of self-strangulation.
The reason parents gave for these unspeakable tragedies: social media use.
Silicon Valley may be a place where success sprouts from the seeds of innovation, but a growing number of California schools believe the same forces behind the economic engine of Big Tech have fomented a youth mental health crisis and behavioral problems unlike anything they’ve ever seen.
California educators have assessed the damage and concluded that they can no longer wait for lawmakers in Sacramento and Washington, D.C., to take bold action against social media giants. They have instead turned to the courts in a desperate attempt to hold these companies accountable.
The San Mateo County School Board, representing 23 school districts just a few miles from Big Tech’s global hub, was the first in California to file a federal lawsuit against YouTube, TikTok and Snap, alleging these platforms – and the algorithms designed to keep kids hooked – have caused unprecedented levels of anxiety, depression, bullying, eating disorders and suicidal ideation.
The crisis is so severe that schools don’t have enough counselors or administrators to manage it all, and they lack the resources to reach kids early enough to explain appropriate use and the most harmful effects of social media.
“You can’t really move through the world without hearing from parents about the high level of anxiety and stress that kids are exhibiting,” said San Mateo County Schools Superintendent Nancy Magee. “This is a social issue, not a single family issue.”
At least 60 school districts across the nation have filed lawsuits against social media companies, following the lead of Seattle Public Schools, which in January became the first in the country to do so. Anne Marie Murphy, an attorney representing the San Mateo school board, said she expects hundreds more school districts to join.
The lawsuit is essentially a public nuisance case, Murphy said, but it’s also bolstered by a racketeering claim, alleging that social media companies intentionally misled the public and covered up knowledge of the harm their products cause for financial gain. The recent $462 million settlement with e-cigarette maker Juul is centered on anti-corruption law, she added. Years ago, the Justice Department also used the RICO Act in its successful case against the tobacco industry.
The disappointing fact remains that California lawmakers are trailing other states when it comes to tougher restrictions. Ed Howard, chief counsel for the Children’s Advocacy Institute at the University of San Diego, pointed to other states – some under Republican leadership – that have done more to shield kids from the dangers of social media.
Utah Gov. Spencer Cox recently signed a bill that allows parents or legal guardians to set time limits on social media use and institutes a strict age verification process. The Arkansas House of Representatives passed similar legislation that would require social media companies to verify user ages and confirm minors have a guardian’s consent to open an account. Last month, Montana lawmakers sent a bill to the governor that would ban TikTok on all personal devices.
“They are acting and acting ambitiously,” Howard said. “California could lead again, but right now, California’s parents and children have far fewer rights than parents and children in other states.”
Powerful lobbies and dizzying amounts of wealth make tech companies notoriously difficult to rein in. The result is grindingly slow legislative progress, especially in California, where there’s much hand-wringing about the outsize influence tech lobbyists have on legislators, particularly those whose constituents include tech leaders with deep pockets.
Last year, then-Assemblymember Jordan Cunningham, a Central Coast Republican, co-authored a first-of-its-kind bipartisan bill with Oakland Democratic Assemblymember Buffy Wicks that would have made social media companies liable for the damage caused by their addictive platforms. But that bill was quietly killed in the Senate Appropriations Committee.
Wicks and others are trying again this year.
She introduced Assembly Bill 1394 with Stockton Republican Assemblymember Heath Flora to address child sexual abuse and child sex trafficking online. The legislation would impose civil penalties on social media companies for “each act of commercial sexual exploitation facilitated, aided, or abetted by their platform,” according to the Children’s Advocacy Institute.
The bill still includes a contentious private right of action, giving parents the right to sue, but Wicks remains optimistic that it will receive enough support to pass. She’s already had bipartisan success with a bill signed last year that requires social media platforms to consider a child’s best interests and to default to privacy and safety settings that protect the mental and physical health and well-being of children.
The California Age-Appropriate Design Code Act was not about content but rather the product design choices companies make, Wicks said. Over the course of last year, she met with tech industry officials, including from Meta.
“They said, ‘We want the internet to be safer for kids, too,’” Wicks recalled. “They do, but are they putting in the engineering time and capacity to make sure this is in fact happening? Is this issue one, two, three or down lower on their list? If you have government regulation, it forces issues to be one, two and three.”
Another pending bill, Senate Bill 287, authored by state Sen. Nancy Skinner, would hold the platforms accountable for promoting the illegal sale of fentanyl and the sale of unlawful firearms to California’s youth. Social media companies would also be liable for targeting content to youth that could result in eating disorders, suicide, or harm to themselves or others.
There are “cracks in the wall” as the public becomes more knowledgeable about the dangers of social media on kids’ mental health, said Cunningham. This added pressure will build as more states pick up the mantle.
“It’s all moving in the right direction,” he said. “I think that the era where tech lobbyists can say I want you to vote or not vote for a bill that will hurt our company, that era is over with.”
Until then, the school lawsuits alone won’t be enough to break the social media companies’ grip on California’s youth. It will take the combined efforts of schools, teachers, parents and state policymakers, Howard said, to force them to reject their core business model: making money by keeping kids engaged as long as possible. With this strategy, one that helped win the fight against Big Tobacco, social media companies will find that it’s no longer profitable to do business as usual, either because the conduct is unlawful or because it would tarnish their brand.
Representatives for TikTok, YouTube and Snap declined to comment on the school lawsuits. However, the companies said they are committed to keeping children safe and establishing parental controls.
A spokesperson for Google, which owns YouTube, said “protecting kids across our platforms has always been core to our work. In collaborating with child development specialists, we have built age-appropriate experiences for kids and families on YouTube and provide parents with robust controls. The allegations in these complaints are simply not true.”
Snap said that “nothing is more important to us than the well-being of our community. At Snapchat, we curate content from known creators and publishers and use human moderation to review user generated content before it can reach a large audience, which greatly reduces the spread and discovery of harmful content.”
TikTok replied that it has age-restricted features – limiting direct messages and livestreams – and private accounts by default for younger teens, as well as restricted nighttime notifications, screen-time management tools and parental controls.
It’s not lost on Magee, the county superintendent, that the people who work at these companies – responsible for bringing vast amounts of money into the local and state economy – are neighbors and parents of students who attend San Mateo County schools.
“I understand that these tech companies are business partners and members of our community,” she said. “It’s not personal. It’s about the bigger picture and the impact on society, and not being afraid to explore ways to build in additional protections for kids.”
Forcing social media companies to see that big picture will be easier said than done. Change is unlikely to come quickly, and politicians – often more beholden to special interests than the public – will struggle to deliver.
It took decades to force Big Tobacco, a recalcitrant industry, to admit the harm it caused, and the social media behemoths are no different. They won’t relent without a protracted fight.
Absent significant legislative victories, the school districts’ mounting lawsuits send a strong signal that social media companies must stop hurting our kids. But schools can’t do it alone, and they shouldn’t have to.
Julie Lynem, CalMatters Contributor
Julie Lynem is a journalism lecturer at Cal Poly San Luis Obispo and co-founder of R.A.C.E. Matters SLO County and RaiseUp SLO. Lynem is a veteran journalist who has been a reporter, columnist or editor...