New York City has joined a growing chorus of states, cities and school districts suing social media companies, accusing Meta Platforms and others of exploiting children and teenagers and fueling a mental health crisis among teens.
The most populous city in the United States filed a lawsuit Wednesday in California state court in Los Angeles against Meta and its Facebook and Instagram platforms; TikTok Inc. and its parent company ByteDance Ltd.; Google LLC and its YouTube platform; and Snapchat owner Snap Inc.
Social media companies face growing legal risks over claims they use algorithms to get teenagers and young adults addicted to their platforms. Meta was sued in October by attorneys general from more than 30 states over similar claims. A month later, a judge in Oakland, California, ruled that Meta, Google, TikTok and Snap must face hundreds of lawsuits accusing them of hooking young people on their platforms.
Hundreds of school districts are also suing to force the companies to change their practices and pay for the costs of addressing social media addiction. New York City said it spends more than $100 million each year on mental health programs and services for youth. The city’s health commissioner last month called unchecked access to social media a “public health hazard.”
“Our city is built on innovation and technology, but many social media platforms end up endangering our children’s mental health, promoting addiction, and encouraging dangerous behavior,” New York Mayor Eric Adams said in a statement.
“Maximizing engagement”
Like many other complaints filed across the country, the New York City lawsuit alleges that the companies borrowed behavioral and neurobiological tactics used by the casino and tobacco industries to “maximize youth engagement and generate advertising revenue.” The city claims the companies deliberately designed such features and targeted children and adolescents, who are “particularly vulnerable to the addictive nature of those features.”
Google spokesman Jose Castañeda disputed the city’s claims.
“Providing young people with a safer and healthier experience online has always been at the core of our work,” he said in an email. “We have worked with youth, mental health and parenting experts to build services and policies that provide age-appropriate experiences for young people and robust controls for parents.”
Meta said it wants teens to have a “safe and age-appropriate online experience” and listed more than 30 tools and features to support them and their parents.
“We’ve spent a decade tackling these issues and hiring people who have dedicated their careers to keeping young people safe and supported online,” a spokesperson said in a statement.
A TikTok spokesperson said the company has “industry-leading safety measures to support the health of teens, including age restrictions, parental controls, and an automatic 60-minute time limit for users under 18.” The spokesperson added: “We regularly partner with experts to understand emerging best practices and continue to work to keep our community safe by addressing industry-wide challenges.”
A Snap spokesperson said in a statement that the platform was “intentionally designed to be different from traditional social media, with a focus on helping people communicate with their closest friends on Snapchat.” Rather than a feed of content that encourages passive scrolling, Snapchat opens directly to the camera and has no traditional public likes or comments. “While we always have more work to do, we feel good about the role Snapchat plays in helping close friends feel connected, happy, and prepared as they face the many challenges of adolescence,” the spokesperson said.
The case is City of New York v. Meta Platforms, Inc., 24STCV03643, Superior Court of Los Angeles County, California.
Copyright 2024 Bloomberg.