Meta is in hot water again over its child protection methods (or lack thereof). The European Commission has started official proceedings to determine whether the owner of Facebook and Instagram breached the Digital Services Act (DSA) by contributing to children’s addiction to social media and failing to ensure they have high levels of safety and privacy.

The Commission’s investigation will specifically examine whether Meta properly assesses and acts against the risks posed by its platform interfaces. It is concerned that their designs could “exploit the weaknesses and inexperience of minors and cause addictive behavior, and/or reinforce so-called ‘rabbit hole’ effect.” Such an assessment, the Commission says, is necessary “to counter potential risks for the exercise of the fundamental right to the physical and mental well-being of children as well as to the respect of their rights.”

The proceedings will also examine whether Meta takes the necessary steps to prevent minors from accessing inappropriate content, whether its age verification tools are effective, and whether minors are given strong privacy protections by default.

The DSA sets standards for very large online platforms and search engines (those with 45 million or more monthly users in the EU) such as Meta. These companies’ obligations include transparency about advertising and content moderation decisions, sharing their data with the Commission, and addressing the risks their systems pose in areas such as gender-based violence, mental health, and the protection of minors.

Meta responded to the official proceedings by touting features like parental control settings, quiet mode, and automatic content restrictions for teens. “We want young people to have safe, age-appropriate experiences online, and we’ve spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission,” a Meta spokesperson told Engadget.

However, Meta has consistently failed to prioritize the safety of young people. Previous troubling incidents include Instagram’s algorithm suggesting content involving child sexual exploitation, and claims that the company designed its platforms to be addictive to young people while serving them psychologically harmful content, such as posts promoting eating disorders and body dysmorphia.

Meta is also known as a hub of misinformation for people of all ages. The Commission had already opened formal proceedings against the company on April 30 over concerns about deceptive advertising, access to data for researchers, and the lack of “an effective third-party real-time civic discourse and election-monitoring tool” ahead of June’s European Parliament elections, among other concerns. Earlier this year, Meta announced that CrowdTangle, a tool that has publicly shown how fake news and conspiracy theories circulate on Facebook and Instagram, would be fully shut down in August.

https://www.engadget.com/eu-investigating-meta-over-addiction-and-safety-concerns-for-minors-120709921.html