Meta sued by 33 US states over 'addictive' features harming young users
Facebook and Instagram owner Meta has been sued by US states for knowingly designing features that make young users addicted to its social platforms.
Attorneys from 33 US states filed a lawsuit against the social media giant on Tuesday, claiming that it repeatedly misled the public about the dangers of its social platforms and knowingly induced young children into addictive and compulsive social media use.
They claim that the company aims to ensure that young people spend as much time as possible on social media despite knowing that teenage brains are susceptible to becoming addicted and seeking approval from features such as likes and comments from other users.
This, they allege, violates consumer protection laws by unfairly ensnaring children on its platforms and actively contributing to the global youth mental health crisis.
“Meta has harnessed powerful and unprecedented technologies to entice, engage and ultimately ensnare youth and teens,” said the complaint, filed by 33 states including California and Illinois.
“Its motive is profit, and in seeking to maximise its financial gains, Meta has repeatedly misled the public about the substantial dangers of its social media platforms. It has concealed the ways in which these platforms exploit and manipulate its most vulnerable consumers: teenagers and children.”
'Profits over health'
Alongside claims that Meta had made its platforms deliberately addictive, the state attorneys accused the social media firm of routinely collecting data on children under 12 without their consent, which is a violation of US privacy laws.
It’s not the first time the company has been questioned over its handling of people’s data. In May, it received a record €1.2 billion fine for mishandling users’ data when transferring it between the EU and the United States.
It’s also not the first time regulators have tried to hold social media firms accountable for harming young people. Last year, a coroner in the UK ruled that Instagram had contributed to the death of a teenager who took her own life after seeing thousands of images of self-harm on the platform.
But laws to protect the safety of children online in the US have stalled in Congress as tech companies lobby against them.
“We’ve been warning about Meta’s manipulation and harming of young people from its start and sadly it has taken years to hold it and other companies like Google accountable,” said Jeffrey Chester, executive director of consumer advocacy at the Center for Digital Democracy.
“Hopefully justice will be served but this is why it’s so crucial to have regulations.”
Phil Weiser, Colorado’s attorney general, said that Meta and other social media platforms are actively putting profits over their users’ health, much as tobacco and vaping companies have done in the past.
“Just like Big Tobacco and vaping companies have done in years past, Meta chose to maximise its profits at the expense of public health, specifically harming the health of the youngest among us,” he said in a statement.
Safeguarding children online
The state attorneys’ lawsuit suggests a variety of solutions to keep young people safe on social media platforms like Facebook or Instagram, including substantial federal penalties and fines.
In another, still-ongoing suit, lawyers representing more than 100 families filed a master complaint accusing social media firms including Meta, Snapchat, Google and TikTok’s parent company, ByteDance, of harming young people with their products.
In a joint statement, the attorneys from that case applauded the move from US attorneys general, stating that it brings the US one step closer to concrete regulations that protect children online.
“This significant step underscores the undeniable urgency of addressing the impact of addictive and harmful social media platforms, a matter of paramount concern nationwide, as it continues to contribute to a pervasive mental health crisis among American youth,” they said.
Meta has made some moves toward protecting teens and children on Facebook and Instagram in the past year. In November 2022, for instance, the company rolled out privacy changes for all users under the age of 16, including limits on contact from “suspicious” adult accounts.
According to the company’s blog post on the new settings, a “suspicious” account is one belonging to an adult that may have recently been blocked or reported by a young person.
“As an extra layer of protection, we're also testing removing the message button on teens' Instagram accounts when they're viewed by suspicious adults altogether,” the blog post reads.
Meta also said it developed tools to encourage teens to report accounts that make them uncomfortable and notify Instagram users when they have been on the app for too long, encouraging them to take a break.
It denied claims that it has put the health of its young users at risk.
“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said in a statement.