Democrats urged Big Tech to step up online censorship or face government regulation during a March 25 congressional hearing with the chief executives of Facebook, Google, and Twitter.
The lawmakers portrayed the platforms as rife with "disinformation and extremism" that the platforms are unwilling to purge.
"Our nation is drowning in disinformation driven by social media," said Rep. Mike Doyle (D-Pa.), chair of the House Subcommittee on Communications and Technology, who hosted the hearing.
"The way I see it, there are two faces to each of your platforms," he said in his opening statement. "Facebook has a family and friends neighborhood, but it is right next to the one where there is a white nationalist rally every day.
"YouTube is a place where people share quirky videos, but down the street, anti-vaxxers, COVID deniers, Qanon supporters, and flat-earthers are sharing videos.
"Twitter allows you to bring friends and celebrities into your home, but also Holocaust deniers and terrorists, and worse."
Bound by the Constitution, Doyle is unable to ban white nationalists or anybody else from organizing rallies, just as he can't prevent Americans from discussing their opposition to vaccines, questioning the existence of COVID-19 (the disease caused by the CCP, or Chinese Communist Party, virus), supporting the anonymous "Q" persona, or believing that the earth is flat.
Doyle said, according to research, "misinformation related to the election" and "COVID disinformation" content was seen billions of times in past months. He acknowledged that the platforms have already taken steps to suppress the content, but called for more.
"You can take this content down, you can reduce the vision, you can fix this, but you choose not to," he said.
The companies should now brace for regulation, said Rep. Frank Pallone (D-N.J.), chair of the House Committee on Energy and Commerce, in his written opening statement.
"It is now painfully clear that neither the market, nor public pressure will force these social media companies to take the aggressive action they need to take to eliminate disinformation and extremism from their platforms," he said.
"And, therefore, it is time for Congress and this committee to legislate and realign these companies' incentives to effectively deal with disinformation and extremism."
It isn't clear what he would qualify as disinformation and extremism. His office didn't immediately respond to requests for further details.
Rep. Jan Schakowsky (D-Ill.), chair of the House Subcommittee on Consumer Protection and Commerce, held a similar opinion.
"The regulation we seek should not attempt to limit constitutionally protected free speech, but it must hold platforms accountable when they are used to incite violence and hatred or, as in the case of the COVID pandemic, spread misinformation that costs thousands of lives," she said in a written statement.
While inciting violence can be illegal, inciting hatred and spreading misinformation are generally constitutionally protected speech. However, opinions vary on what constitutes hate speech and misinformation.
In recent years, Facebook has relied on paid fact-checkers, but there's evidence that the fact-checkers themselves need to be fact-checked and that their operations are politically slanted.
The platforms already prohibit "hate speech," which is a subjective standard impossible to enforce fairly, according to Nadine Strossen, a law professor and former president of the American Civil Liberties Union.
People on the political left are much more likely to call a variety of statements "hateful," while those on the right tend to call the same statements "offensive, but not hateful," a 2017 Cato survey found.