
Instagram for kids reignites debate among experts

Monitoring Desk

Facebook is working on a new version of its popular Instagram app targeted at children under 13.

Public health experts are urging Facebook to abandon its plans for a new version of Instagram targeted at kids under 13. Such a plan, these groups said in a letter sent on Thursday, would “put young users at great risk,” arguing that Facebook isn’t ready to introduce and oversee an app that could have such a powerful influence over young children.

The new app, which Facebook says will not include ads, is being designed for children under the minimum age for Instagram, which is 13. Facebook also says it’s trying to find new methods, including using artificial intelligence, to confirm that users on the main Instagram platform aren’t under 13. That age restriction is a product of a 1998 law called the Children’s Online Privacy Protection Act (COPPA), which establishes more stringent requirements and potential financial liabilities for online platforms that collect personal information about users under 13 without their parents’ consent. Child safety experts worry that social media poses additional threats to young children, too.

“Instagram’s focus on photo sharing and appearance makes the platform particularly unsuitable for children who are in the midst of crucial stages of developing their sense of self,” the organizations, which include the Campaign for a Commercial-Free Childhood and ParentsTogether Action, told Facebook CEO Mark Zuckerberg in the letter. “Children and teens (especially young girls) have learned to associate overly sexualized, highly edited photos of themselves with more attention on the platform and popularity among their peers.”

The public health experts and child advocacy groups who signed the letter also argue that social media built for kids could violate young people’s privacy and create an increased risk of depression, among a wide variety of other potential harms.

“During the pandemic, I have heard countless stories from parents of elementary-aged children about high-drama and problematic interactions happening over social media that kids weren’t developmentally ready for,” said Jenny Radesky, a pediatrics professor at the University of Michigan’s medical school, in a statement on Thursday. “An Instagram for kids is the last thing they need.”

The letter comes as lawmakers increasingly scrutinize the efforts of tech giants to build kid-focused apps and tools. Members of Congress have expressed concern that these apps have become addictive, are harmful to young people’s mental health and self-esteem, and endanger children’s privacy. At the same time, tech companies are grappling with the reality that kids under 13, who are technically not allowed on their platforms, manage to gain access anyway.

The debate over kids on social media was reignited following a BuzzFeed News report in March that Facebook was in the early stages of building an under-13 Instagram app.

Facebook has defended its Instagram-for-kids plan, arguing that it’s an effort to keep younger people off its main service. The company also told Recode that the new version of Instagram is being designed in consultation with child development and mental health experts as well as privacy advocates, a process that the company expects will take several months.

“We’ve just started exploring a version of Instagram for younger teens,” said Facebook spokesperson Stephanie Otway. “The reality is that kids are online. They want to connect with their family and friends, have fun, and learn, and we want to help them do that in a way that is safe and age-appropriate.”

Otway added that Facebook did not have more specifics to share regarding how it will approach content moderation for its kids-focused platform. One concern critics find particularly pressing is the prospect of adults interacting with children on Instagram. Last month, Instagram added new features to restrict direct messages between teens and adults they do not follow, and said it’s looking into how to make it more difficult for adults with “potentially suspicious behavior” to interact with young people.

Previous attempts by tech and media companies to reach a large number of young children online have run into problems, and the Federal Trade Commission has been involved in several cases related to tech platforms and children’s privacy. In 2017, Facebook launched a kids’ version of its Messenger app. Two years later, Facebook fixed a technical flaw in its system that had made it possible for kids to enter group chats with strangers their parents hadn’t approved. The company now says there are more than 7 million monthly active accounts on the Messenger Kids service.

YouTube has also run into problems with its app for young people, YouTube Kids, which it launched in 2015. The company has had to crack down on inappropriate videos being shown to young people. Earlier this month, the House Subcommittee on Economic and Consumer Policy told YouTube CEO Susan Wojcicki it was investigating YouTube Kids, hammering the service for low-quality content, a high degree of product placement, and insufficient content moderation. Earlier this week, Viacom, Disney, and 10 ad-tech firms settled a lawsuit accusing them of embedding tracking software in children-focused apps without parental consent.

So, while we don’t know when a kids’ version of Instagram will launch, it’s clear that lawmakers and child safety experts are not happy with tech platforms targeting children. And when the app does launch, past problems with kids-focused platforms, as well as with Instagram itself, suggest that the new app could run into trouble of its own.

Courtesy: Vox