
Facebook Delays Instagram App for Children Under 13

A Facebook spokesperson confirmed on Monday that the company has paused development of Instagram Kids, a version of the service tailored to children under 13. The decision comes amid growing scrutiny of the platform’s effect on the mental health of teenagers.

The pause came ahead of a congressional hearing this week on internal Facebook research, first reported by The Wall Street Journal, showing that the company knew Instagram was harming the mental health of some adolescent girls. The disclosures have set off a public relations crisis for the Silicon Valley firm and prompted a fresh wave of calls for regulation.

Facebook said it still planned to develop an Instagram product for children that would provide a more “age appropriate experience,” but that it had decided to postpone the plans in the wake of widespread condemnation.

The decision to stop the development of the app is an unusual change of course for Facebook. Over the last several years, the social network has emerged as arguably the most closely watched company on the planet, dealing with privacy concerns, hate speech, disinformation, and charges of anticompetitive business tactics, among other things. Regulators, legislators, journalists, and civil society organizations from all over the globe have expressed concern about the company’s impact on society.

In announcing Instagram Kids, Facebook argued that young people were already using the photo-sharing app despite its age restrictions, and that it would be better to offer a version designed for them. The company said the app was intended for children between the ages of 10 and 12, would require parental permission to join, would not display advertisements, and would feature more age-appropriate content. Parents would be able to monitor which accounts their children followed. YouTube, which is owned by Google, already offers a children’s version of its app.

Policymakers, regulators, child-safety groups, and consumer-rights organizations have argued that the app would draw children onto the platform at an even younger age rather than protect them from problems associated with the service, such as grooming by predators, bullying, and body shaming.

A children’s version of Instagram is unlikely to address deeper systemic concerns, according to Al Mik of 5Rights Foundation, a London-based organization focused on children’s digital rights. A study the group released in July found that children as young as 13 were targeted with harmful content within 24 hours of creating an account, including material related to eating disorders, extreme diets, sexualized imagery, body shaming, self-harm, and suicide.

Jonathan James
Senior Executive Journalist, The National Era