European Union regulators on Thursday opened an investigation into American tech giant Meta over the potentially addictive effects Instagram and Facebook have on children, an action with far-reaching implications because it cuts to the core of how the company's products are designed.
Meta products could “exploit the weaknesses and inexperience of minors” to create behavioral addictions that threaten their mental well-being, the European Commission, the executive branch of the 27-member bloc, said in a statement. EU regulators could eventually fine Meta up to 6% of its global revenue, which last year amounted to $135 billion, as well as impose other product changes.
The investigations are part of a growing effort by governments around the world to rein in services like Instagram and TikTok to protect minors. For years, Meta has been criticized because its products and recommendation algorithms are optimized to appeal to children. In October, three dozen US states sued Meta, accusing it of using “psychologically manipulative product features” to lure children, in violation of consumer protection laws.
EU regulators said they had been in contact with US counterparts about the investigations announced on Thursday. Regulators said Meta may be violating the Digital Services Act, a law passed in 2022 that requires large online services to more aggressively police their platforms for illicit content and adopt policies to mitigate risks to children. People under the age of 13 are not supposed to be able to register an account, but EU investigators said they would closely examine the company's age verification tools as part of their inquiry.
“We will now examine in depth the potential addictive and 'rabbit hole' effects of the platforms, the effectiveness of their age verification tools and the level of privacy afforded to minors in the operation of the recommendation systems,” Thierry Breton, the EU commissioner for the internal market, who is overseeing the investigations, said in a statement. “We will spare no effort to protect our children.”
On Thursday, Meta said its social media services are safe for young people, highlighting features that allow parents and children to set time limits on how much they use Instagram or Facebook. Teen accounts are also placed by default in more restrictive content and recommendation settings. Advertisers are prohibited from showing targeted ads to underage users based on their activity on Meta apps.
“We want young people to have safe, age-appropriate experiences online, and we have spent a decade developing more than 50 tools and policies designed to protect them,” Meta said in a statement. “This is a challenge facing the entire sector and we look forward to sharing the details of our work with the European Commission.”
EU officials did not provide a timeline for the investigation. But the opening of a formal inquiry on Thursday gives regulators broad authority to gather evidence from Meta, including issuing legal requests for information, interviewing company executives and conducting inspections of company offices. The investigations into Instagram and Facebook will be conducted separately.
EU regulators have targeted a number of companies since the Digital Services Act came into force. Last month, TikTok suspended a version of its app in the European Union after authorities raised concerns about an “addictive” feature that allows users to earn rewards such as gift cards for watching videos, liking content and following certain creators.
Meta is facing another investigation related to political advertising, while X, the social media site owned by Elon Musk, is facing an investigation related to content moderation, risk management and advertising transparency.