After ten hours of testimony across two days, Facebook CEO Mark Zuckerberg left Capitol Hill Republicans unsatisfied this week with his denials of allegations that the social media platform censors conservative content, but legal experts say there may be little Congress can do about it.
“I’m not a fan of more regulation for social media but that doesn’t mean we shouldn’t ask Facebook whether they are a First Amendment speaker with the right to throttle down conservative content if they choose and elevate liberal content or whether they are a neutral public forum where everybody’s ideas have to be treated equally,” Rep. Matt Gaetz, R-Fla., said in an interview Wednesday.
Rep. Darin LaHood, R-Ill., also sees regulation as a last resort, but he believes social media companies should be treating all users fairly and legitimate questions have been raised about whether Facebook is.
“There’s real concern by myself and others about the fact that Facebook and other companies are regulating what people see, particularly politically,” he said. “I think we need more transparency, more openness.”
During two hearings before Senate and House committees on Tuesday and Wednesday, Zuckerberg was repeatedly pressed to explain why Trump-supporting sisters Diamond and Silk were recently told by Facebook their content was “unsafe.”
“What is unsafe about two black women supporting President Donald J. Trump?” asked Rep. Billy Long, R-Mo., at one point. Zuckerberg replied that nothing is unsafe about that.
“In that specific case, our team made an enforcement error,” Zuckerberg told Rep. Joe Barton, R-Texas, in another exchange. “And we have already gotten in touch with them to reverse it.”
Diamond and Silk denied Wednesday that Facebook had contacted them about correcting the error, but the company maintained in a public post on the sisters’ page that a staffer had emailed them.
Rep. Steve Scalise, R-La., questioned Zuckerberg about a study that claimed conservative content was shown in news feeds much less than liberal content after the company changed its algorithm late last year, but the Facebook founder denied that the process was biased.
“There is absolutely no directive in any of the changes that we make to have a bias in anything that we do. To the contrary, our goal is to be a platform for all ideas,” he said, one of many times he used that phrase.
The longest exchange on the subject came Tuesday with Sen. Ted Cruz, R-Texas, who cited, among other things, a 2016 Gizmodo article that claimed Facebook suppressed conservative content in its trending topics section. Zuckerberg recognized that Silicon Valley is “an extremely left-leaning place,” but he said the company does not screen its employees for political views or intentionally make biased decisions.
Cruz asked three times if Facebook is a “neutral public forum” or is engaged in political speech.
“Our goal is certainly not to engage in political speech,” Zuckerberg answered.
Still, lawmakers cited other anecdotal instances of what they saw as inappropriate censorship of conservative views:
- A video posted by a political candidate in Michigan was deemed too threatening
Zuckerberg readily acknowledged that mistakes are sometimes made, but some experts are unconvinced that this amounts to systematic suppression of political speech.
“The assertion that Facebook has ‘censored’ content on the basis of its politically conservative content is also unsupported by the facts,” said Mary Anne Franks, a professor at the University of Miami School of Law and author of “The Cult of the Constitution: Guns, Speech, and the Internet.” “There is in fact far more evidence to suggest that Facebook’s algorithms and policies allowed ‘conservative’ content to spread much faster and more widely than other types of content.”
The idea of left-wing Silicon Valley silencing the right resonates with conservatives, but according to Elizabeth Cohen, an assistant professor of communication studies at West Virginia University, Facebook’s business model and philosophy give the company no incentive to turn away users or decrease their time on site.
“It’s kind of a dog whistle,” Cohen said. “Facebook doesn’t want to do that. Facebook wants to have as many people as possible having as positive an experience as they can.”
In a Fox News op-ed Wednesday, Sen. Cruz argued that Facebook’s protection from legal liability for content users post under Section 230 of the Communications Decency Act is dependent on it remaining a “neutral public forum” rather than a “publisher or speaker.”
“If Facebook is busy censoring legal, protected speech for political reasons, the company should be held accountable for the posts it lets through,” he wrote. “And it should not enjoy any special congressional immunity from liability for its actions.”
Discriminating against certain political views, according to Cruz, could lead to government oversight and regulation of social media or the revocation of the platforms’ protection from liability.
Section 230 states that no provider or user of an interactive computer service shall be held civilly liable for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”
Three cyberlaw experts said Thursday that Cruz’s understanding of Section 230 is incorrect.
“It’s 100 percent wrong and it’s an embarrassment that he’s taken this position after the number of times people have tried to convince him otherwise, including me,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law, who testified at a Senate hearing last September.
David Greene, civil liberties director at the Electronic Frontier Foundation, said he has heard others make the same argument Cruz does, but they misstate the statute.
“That’s exactly completely wrong,” he said. “It’s the exact opposite…. Section 230 was passed for the specific purpose to encourage the companies to moderate content. At that point, they were trying to get them to filter out sexual content.”
According to Franks, the purpose of the law was to enable online intermediaries to decide which content to restrict or allow without fear of being sued or imprisoned.
“Ted Cruz’s interpretation of Section 230 is not correct and is perilously close to being entirely backwards,” she said. “Section 230 does not require online intermediaries to be ‘neutral platforms.’ There is nothing in the text of the statute to support this reading.”
Cruz’s office did not respond to a request for comment on Thursday afternoon.
Experts are skeptical about threats by lawmakers to regulate Facebook’s content or roll back the protections provided by Section 230. Any rule that requires social media platforms to permit or restrict certain political content would violate the First Amendment.
“Under the First Amendment and Section 230, Facebook has the unrestricted discretion to decide whether to publish user content or not,” Goldman said.
Greene pointed to a 1974 Supreme Court case, Miami Herald Publishing Co. v. Tornillo, that overturned a Florida law requiring newspapers to give equal space to the opponents of candidates they endorse. The court determined that this exacted a penalty on publishers for exercising their free speech.
“You can’t force someone…to include something they don’t want to include,” Greene said, acknowledging that there are exceptions to that principle for broadcast television and radio where equal time rules sometimes apply.
While forcing Facebook to post or not post certain content would run afoul of the Constitution, Congress does have the authority to alter the provisions of Section 230. Anti-sex-trafficking legislation signed by President Trump Wednesday removed liability protections for some sexual content, but Goldman warned against doing the same for political speech.
“It would be one of the worst possible ideas Congress could entertain,” he said.
The law was written years before Facebook and Twitter were developed, but it enabled the creation of forms of communication like public message boards and consumer reviews that could not easily exist in the real world. The ramifications of declaring that online platforms are legally liable for content their users post would be far-reaching.
“If Congress wanted to change Section 230, it needs to recognize there are certain classes of content that would go away, they could not exist,” he said.
According to Goldman, social media sites would be fundamentally changed, turning them into traditional publishers that carefully vet all content before allowing it to be posted.
“You take away Section 230, if Twitter [still] exists, Twitter exists only by pre-screening of all tweets. That would require those tweets to be embargoed for some period of time,” he said.
Experts agree Facebook has no legal obligation to serve as a platform for all voices. According to Ray Klump, chair of computer and mathematical sciences at Lewis University and a former software developer, creating a completely free and fair platform while only filtering out indisputably dangerous speech would also be technologically difficult.
“It is certainly possible to characterize individual posts as potentially negative or offensive but to do it on the scale of the number of posts that are made per second…that’s the technical challenge,” Klump said.
Engineers can identify trigger words or subject matter that would block a post or flag it for more scrutiny or filter out content based on the reliability of its source, but there will always be an element of subjectivity to that. At some point, someone has to determine what is and is not offensive or reliable.
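The screening approach described above can be sketched in a few lines. This is a minimal illustration only: the trigger words, the source-reliability score, and the 0.3 threshold are invented for this example and do not reflect any platform’s actual rules.

```python
# Hypothetical sketch of automated post screening: trigger words plus a
# source-reliability score decide whether to block, flag for human review,
# or allow a post. All values here are illustrative assumptions.

TRIGGER_WORDS = {"attack", "bomb", "kill"}  # hypothetical blocklist
RELIABILITY_THRESHOLD = 0.3                 # hypothetical cutoff (0.0-1.0)

def screen_post(text: str, source_reliability: float) -> str:
    """Return 'block', 'flag', or 'allow' for a post."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    hits = words & TRIGGER_WORDS
    unreliable = source_reliability < RELIABILITY_THRESHOLD
    if hits and unreliable:
        return "block"   # two risk signals: remove automatically
    if hits or unreliable:
        return "flag"    # one signal: route to human review
    return "allow"

print(screen_post("We will attack at dawn", 0.1))   # block
print(screen_post("Great town hall tonight", 0.9))  # allow
```

The subjectivity the article points to is visible even here: someone had to choose which words go in `TRIGGER_WORDS` and where to set the threshold, and those choices only get harder at the scale of posts per second.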
The prospect of putting too much power in the hands of Mark Zuckerberg’s 15,000 to 20,000 security and content review staffers in Menlo Park makes some uncomfortable.
“We don’t want to live in a society where Facebook determines the truth,” Cohen said.
“Zuckerberg has now repeatedly said that Facebook will increasingly move toward AI automatically taking down and even *blocking* posts. Because that’s what our policymakers want: in one breath they complain of Facebook’s unchecked power and in another demand they use it even more,” Kevin Bankston (@KevinBankston) tweeted on April 10, 2018.
Many of the headaches currently plaguing Facebook’s leadership are the result of the site’s past attempts to police its content and protect users from what its staff and algorithms identify as violent or unsafe posts. Once they start blocking some material, there are inevitably questions about why that content was restricted and demands to restrict even more.
“I think the aspiration to be an open platform is a good one,” Greene said. “When these companies do moderate content, they do a really bad job of it.”