Lawmakers have been hammering Facebook for weeks, claiming that the social network is harmful to its youngest users. On Tuesday, however, they showed that their worries about data privacy, harmful posts, and transparency extend beyond Facebook and Twitter to other big online businesses.
In a hearing that lasted more than three hours, a bipartisan group of senators expressed concern that the software used by YouTube, Snap, and TikTok was steering young people toward inappropriate posts, that consumer data was being mishandled, and that the companies were not doing enough to identify dangerous content on their platforms. Several legislators said that their staffs had been able to find harmful material, such as posts connected to self-harm and pornography, inside the companies' products, sometimes while signed in as a teenage user.
During the hearing’s opening remarks, Senator Richard Blumenthal, a Democrat from Connecticut, accused the firms of enticing young people to use their products more and more often.
"Everything you do is geared toward attracting new users, particularly children, and keeping them on your apps for longer," said Mr. Blumenthal, who chairs the Senate Commerce Committee subcommittee that convened the hearing.
The tough questioning reflected the rising pressure on the nation's top social media companies to protect children who use their products from material that exposes them to violence or danger, or that erodes their sense of self-worth. Weeks earlier, the same panel heard testimony from Frances Haugen, a former Facebook product manager who leaked hundreds of pages of internal documents showing that the company knew its products made some young people feel worse about themselves.
Legislators have increasingly proposed measures aimed at better protecting children online. A group of House members has introduced legislation that would expose social media firms to legal action if their algorithms amplify content linked to serious harm. Mr. Blumenthal also suggested on Tuesday that American regulators could adopt a children's design code similar to the one that recently took effect in the United Kingdom, which sets new standards for how companies handle children's data.
Any new restrictions would have to be approved by a Congress that is currently gridlocked. Still, proposals to safeguard children are less likely to be derailed by the partisan splits that have stalled earlier attempts to regulate the digital giants.
"It's one of the few areas where Congress can really do something and there is bipartisan agreement," said Nu Wexler, a former communications director for technology companies and legislators in Washington. "In certain respects, child safety is the path of least resistance for politicians."
To field the senators' questions, the companies sent executives with political experience. TikTok was represented by Michael Beckerman, its head of public policy for the Americas, who formerly led a major lobbying organisation for internet companies in Washington. YouTube sent Leslie Miller, its vice president for government relations and public policy, who formerly worked as a Democratic political aide. Snap, the parent company of Snapchat, sent Jennifer Stout, its vice president for global public policy and a former deputy chief of staff for John Kerry.
The companies moved quickly to distance themselves from one another, arguing that they were already taking significant steps to protect the children using their products.
Ms. Stout described Snap as "an antidote to social media" and highlighted the distinctions between Snapchat and Instagram. She said her company's app was designed to connect people who already knew one another in real life, rather than serving them a steady stream of material from strangers. She also said it was focused on privacy, with photos and messages deleted by default.
She also emphasised that Snapchat moderates public material more rigorously than other social media companies, which she described as an important distinction. "Human moderators assess material from publishers before promoting it in Discover," Ms. Stout said, referring to the public section of Snapchat that features news and entertainment. She added that content on Spotlight, Snap's creator programme that promotes user videos, is screened by artificial intelligence before being distributed and then reviewed by human moderators before it can be viewed by more than 25 people.
Mr. Beckerman suggested that legislators examine the processes used to determine whether users are old enough to use a product, and said legislation should include language requiring age verification "across all applications."
Lawmakers also grilled Mr. Beckerman on whether TikTok's Chinese ownership could result in the disclosure of user data to the Chinese government. Critics have long said that TikTok would be required to hand over the data of Americans if the Chinese government ordered its parent company to do so.