The CEOs of Meta, TikTok, X, Snapchat, and Discord are currently testifying before the Senate Judiciary Committee, addressing concerns about child exploitation, addiction, and exposure to harmful content on their platforms.
Mark Zuckerberg, Meta's CEO, is appearing before Congress for the second time since the 2018 Cambridge Analytica privacy scandal, while TikTok CEO Shou Zi Chew, X CEO Linda Yaccarino, Snap CEO Evan Spiegel, and Discord CEO Jason Citron testified for the first time.
Advocates, families, and lawmakers have expressed growing concern about the impact of social media on young lives, amid a Harvard study that found social media companies generated $11 billion in U.S. ad revenue from minors.
In December, over 200 organisations asked Senate Majority Leader Chuck Schumer to schedule a vote on the Kids Online Safety Act (KOSA), which aims to hold social media platforms accountable for recommending harmful content that affects the mental health of minors.
Senators Marsha Blackburn and Richard Blumenthal, the co-authors of KOSA, stressed the need for lawmakers to ensure tech companies prioritise user safety over their own profits. Other bipartisan bills, such as the Stop CSAM Act, were also under consideration, focusing on removing child sexual abuse material and holding platforms accountable.
While there are many supporters of regulating these platforms, some experts caution against potential over-censorship, recommending that the focus shift toward equipping platforms with tools for safer internet navigation rather than relying solely on legal action.
Tech giant Meta takes centre stage
Meta is at the centre of concern, as it currently faces lawsuits from numerous US states alleging that it deliberately designed features on Instagram and Facebook that addict children and that it failed to protect them from online predators.
Meta has said it has improved child safety features, such as hiding inappropriate content and limiting minors' messaging. Steps taken include hiding content related to sensitive topics on Instagram and Facebook for teenagers, restricting minors from receiving messages from unknown contacts, and adding nudges to discourage late-night browsing.
However, critics argue that Meta's actions still fall short of addressing child safety concerns. Arturo Béjar, a former Meta engineering director, criticised the company's response to scandals, accusing it of selectively presenting statistics and emphasising less impactful features. He questioned why certain safety features, like 'quiet mode', aren't defaults for all kids on Instagram. Béjar also highlighted the absence of a mechanism for teens to report unwanted advances, raising concerns about the overall effectiveness of Meta's safety measures.
X's efforts
X, formerly Twitter, stated that its CEO Linda Yaccarino recently met with senators in Washington to discuss the company's approach to addressing child sexual exploitation (CSE) and various other topics such as privacy, artificial intelligence, content moderation, and misinformation.
The company, emphasising its transformation as an entirely new entity, highlights its strengthened policies and enforcement against CSE. X’s stance is that it is actively taking measures against users distributing such content and promptly addressing the networks of users engaged with it.
Discord’s Zero Tolerance Policy
Discord has emphasised that it maintains a zero-tolerance policy for child sexual abuse and uses a mix of proactive and reactive tools to moderate the platform.
A Discord spokesperson gave a statement emphasising, “Over 15% of our workforce is dedicated to trust and safety full time. We prioritize issues that present the highest real-world harm to our users and the platform, including child sexual abuse material.”
The spokesperson added that Discord has a team “specifically dedicated to preventing minor safety issues on our platform and taking action whenever we become aware of this content, including removing it, banning users, shutting down servers, and engaging with the proper authorities.”
TikTok’s Stance
Around 63% of U.S. teens use TikTok. CEO Chew defended TikTok in a House Energy and Commerce Committee hearing last March, stating that it doesn’t harm kids, as lawmakers addressed how Congress can safeguard American data privacy and protect children from online harms.
To enhance safety for minors, TikTok implemented restrictions on livestreams, limiting them to accounts registered to users who are 18 or older. Chew said in his opening remarks, “We spend a lot of time adopting measures to protect teenagers,” and “Many of those measures are firsts for the social media industry.”
Snapchat’s support for KOSA
Snap came out in support of the Kids Online Safety Act (KOSA) before the hearing, emphasising that many of KOSA's provisions align with Snap's existing safeguards.
A Snap spokesperson said that Snap sets teens' accounts to the strictest privacy settings by default, provides additional privacy and safety protections for teens, offers in-app parental and reporting tools, and limits the collection and storage of personal information.
YouTube flies under the radar
Surprisingly, Google's YouTube, used by 93% of U.S. teens, was left out of the companies called to the Senate on Wednesday, in contrast to its status as the most widely used platform among kids.
"The thing about YouTube is that it kind of flies under the radar," said Larissa May, the founder and executive director of the nonprofit #HalfTheStory, which helps teens develop healthy relationships with technology.