Teen Instagram accounts that were supposed to protect minors online and prevent them from seeing violent content instead allowed users to view videos of fights and animal abuse, according to a new analysis of the social network's mobile app.

The findings raise questions about the rigor of Instagram owner Meta's protections for teens. The tech company launched teen Instagram accounts for users 13 to 17 years old in September 2024, a year after 41 states and D.C. sued Meta, alleging that it purposefully designed Instagram and Facebook to make children addicted to social media.

Researchers from the Tech Transparency Project, the research arm of the nonpartisan nonprofit Campaign for Accountability, set up an Instagram teen account in June on a newly activated iPhone, with a profile listing the user's age as 15. They searched one word throughout the experiment — fight.

At first, they found that Instagram limits fight videos or shows content warnings on the "For you," "Accounts" and "Reels" tabs. However, in the same search, when looking in the "Tags" tab, it recommended "#fight" along with a content warning. When the hypothetical 15-year-old clicked on "#fight," the report said, Instagram generated "an array of violent and gory videos."

One piece of content displayed two dogs "tearing at each other's heads and throats in a dimly lit space while people kick them to encourage them to fight," the report said. Other results showed fights apparently filmed by students at school or taking place on public streets, the researchers said.

Instagram also let the young user return to the "Accounts" tab, where the content restriction had vanished. More than two dozen fighting accounts could be viewed.
Meta disputed the findings and contested the notion that the researchers had revealed a "loophole." A Meta spokesperson said the teen accounts are "broadly working as intended, though search results may vary based on the specific queries," adding that teen accounts saw "some violating content and we removed it."

The Tech Transparency Project reran its analysis after The Washington Post contacted Meta and found that the content warnings that had appeared in its first search on the "For you," "Accounts" and "Reels" tabs were gone.

Yunyu Xiao, an assistant professor at Weill Cornell Medicine, said exposure to violent content affects teenagers more than adults. The part of the brain that processes emotions — the amygdala — and stress systems are activated more intensely in adolescents than in adults when seeing violent media, Xiao added. Instagram and similar platforms have made strides but can still improve the experience for young people by working with researchers, she said.

"Teens are still exposed to content that can undermine well-being, including violence, disordered eating and addictive use loops," Xiao said. "Even with parental controls, algorithm-driven feeds can surface harmful material, and teens often find ways around restrictions."

The past few years have seen an uptick in concerns over social media contributing to worsening mental health among young people. Four years ago, Facebook product manager Frances Haugen released thousands of internal documents to the government and news media, and testified to the Senate, alleging that the company prioritized profit over young people's well-being. Other whistleblowers followed, including Arturo Béjar and Sarah Wynn-Williams, who also claimed Meta pursued young users without proper care for their well-being.

Meta has said it works constantly to keep young people safe online.