A new bipartisan Senate bill is taking aim at the liability protections enjoyed by internet platforms like Facebook and YouTube.
The Platform Accountability and Consumer Transparency (PACT) Act, introduced by Sens. Brian Schatz (D-HI) and John Thune (R-SD) on Wednesday, would require online platforms like Facebook and Google to reveal their content moderation practices through a range of mandatory disclosures. The bill would also create a new avenue for holding these companies responsible for hosting illegal content by making changes to Section 230 of the Communications Decency Act.
The bill combines measures that encourage platforms to remove bad content with measures that keep those moderation systems in check, in hopes of drawing support from both sides of the ongoing debate over platform regulation.
If approved, the bill would force large tech platforms to explain how they moderate content in a way that is easily accessible to users and to release quarterly reports including disaggregated statistics on what content has been removed, demonetized, or had its reach algorithmically limited. Platforms would also be required to roll out a formal complaint system that processes user reports and explains moderation decisions within 14 days. Users would then be allowed to appeal those decisions within a company’s internal reporting systems, something that already exists on platforms like Facebook.
Other bills aimed at Section 230 would allow users to report certain content moderation decisions to the government, primarily the Federal Trade Commission. The PACT Act is in direct opposition to those proposals and allows for moderation reports to remain internal. Over the last year, both Republicans and Democrats have taken aim at 230 as a means of addressing misinformation or what conservatives believe to be bias or “shadowbanning” of right-leaning posts. President Donald Trump signed an executive order last month that would pare back 230 protections. Democratic presidential candidate Joe Biden has called for 230 to be repealed entirely.
“Our approach is a scalpel, rather than a jackhammer,” Schatz said in a call with reporters Wednesday.
In addition to formalizing moderation practices, the bill amends Section 230 to require large platforms to remove court-ordered illegal content within 24 hours. It also opens them up to civil lawsuits from federal regulators, which are currently limited by Section 230. State attorneys general would also be empowered to enforce federal civil law against platforms. The likely result would be a flood of new lawsuits against large tech companies, with unpredictable consequences.
“There is a bipartisan consensus that Section 230, which governs certain internet use, is ripe for reform,” Thune said in a statement Wednesday. “The internet has thrived because of the light touch approach by which it’s been governed in its relatively short history. By using that same approach when it comes to Section 230 reform, we can ensure platform users are protected, while also holding companies accountable.”