Opinion: Tech companies must be held accountable for abuse of kids

Meta CEO Mark Zuckerberg, during a recent congressional hearing on internet child safety, apologized to families of children harmed by social media platforms.

But apologies aren’t enough. Social media companies have had ample opportunity to halt online child abuse and sexual exploitation. For more than a decade, they’ve refused to take decisive action. It’s time for Congress to force their hand — by holding them legally liable for hosting images and videos of child abuse.

Social media and artificial intelligence have created a dangerous world for our youth. Since smartphones first became commonplace, the number of kids sexually exploited or harmed online has hit shocking new levels year after year.

In 2013, the CyberTipline operated by the National Center for Missing and Exploited Children received 1,380 reports per day of suspected child sexual exploitation. Today, the number is 100,000 per day. Over 99% of those reports involve online child sexual abuse material.

We’ve seen a staggering rise in “sextortion” — when an adult poses as a child or teen to solicit explicit photos and then blackmails the victim. In 2023, the CyberTipline received over 186,000 reports of online enticement — more than a four-fold increase from 2021.

AI has opened up frightening new avenues for creation and distribution of child sexual abuse material (CSAM). New software can borrow pictures already online, creating new material from old CSAM and re-victimizing exploited children. According to Stanford researchers, one popular database used to train AI contained more than 1,000 images of CSAM.


Self-policing hasn’t worked. Internal Meta documents showed that Zuckerberg rejected specific child safety proposals, including the hiring of 45 new staff members dedicated to children’s well-being.

Elon Musk gutted Twitter/X’s council of advisors focused on child sexual exploitation and online safety and harassment. YouTube and TikTok are under investigation in the European Union for their failure to protect minors.

This follows a familiar pattern of tech company failure to adopt even basic child safety rules. Existing U.S. law prohibits companies from collecting personal information from anyone under the age of 13 without parental consent.

Social media platforms nominally comply with this law, but they make little effort to verify whether a 16-year-old user is actually a teenager — or a 50-year-old predator masquerading as one.

In the early days of Myspace and Facebook, we failed to put protections in place. We can’t turn back the clock. But we can create a strong federal approach today to ensure that more kids aren’t victimized tomorrow.

That starts with reform of Section 230 of the Communications Decency Act, a rule tech companies use to shield themselves from legal responsibility for child exploitation on their platforms.



Any reform to Section 230 must remove the blanket immunity from liability that tech companies enjoy. If social media platforms are held accountable for harmful content, they will police it. When a 2018 carve-out to Section 230 made it illegal to facilitate prostitution online, Craigslist quickly removed its “personals” section — a popular site for sex workers to solicit clients. Congress could devise carve-outs for child exploitation material.

The Kids Online Safety Act is a bipartisan bill languishing in Congress that would impose a duty of care on tech companies to prevent and mitigate harm to minors who use their platforms. It would require platforms to publish annual, independent audits examining risks to children.

We don’t need another apology. Now is the time for lawmakers to enact meaningful safeguards to protect our kids.

Teresa Huizar is CEO of National Children’s Alliance, America’s largest network of care centers for child abuse victims.

