The New York Times | Nov 18, 2020 06:44:32 IST
Senators took the chief executives of Facebook and Twitter to task on Tuesday for how the services handled misinformation around the election, showing bipartisan support for changing a law that protects the companies from lawsuits.
In a Senate Judiciary Committee hearing that lasted more than four hours, the lawmakers forced Mark Zuckerberg of Facebook and Jack Dorsey of Twitter to defend their companies’ efforts to limit the spread of false information about voting and the election results. Republicans accused the companies of censoring conservative voices while Democrats complained about a continued surge of hate and misinformation online.
It was the second time in three weeks that Zuckerberg and Dorsey testified before Congress. But in contrast to the earlier hearing, lawmakers on Tuesday drilled deeply into the companies’ practices for moderating content and outlined a legislative agenda that could restrain the platforms.
“I fully expect that Congress is going to act in the next Congress that we’re going to produce an outcome,” Senator Thom Tillis, R-North Carolina, said.
Among the highlights from the hearing:
Lawmakers drill down on how Facebook and Twitter moderate content
Much of the discussion at the hearing focused on the minutiae of how Facebook and Twitter carry out the process of moderating the billions of pieces of content regularly posted to their networks.
Both Democrats and Republicans zeroed in on the issue, according to a tally by The New York Times. Out of 127 total questions, more than half — or 67 — were about content moderation. Democrats asked 12 questions aimed at how Facebook and Twitter could increase their moderation efforts around topics like hate speech, while Republicans asked 37 questions about why some points of view were censored online and how content moderation could be decreased in some areas, according to the tally. (The remainder of the questions about content moderation did not indicate a clear desire for more or less moderation.)
Democrats call for more regulation of the tech industry
Democrats showed no signs of letting up on criticisms of Facebook and Twitter at the hearing despite greater efforts by the companies to act on misinformation in the recent election.
Instead, several Democratic lawmakers blamed Zuckerberg and Dorsey for a surge of hate speech and election disinformation after the election. They pointed to comments on Facebook from Steve Bannon, a former senior adviser to President Donald Trump, who called for the beheading of Dr Anthony Fauci, and to posts in Facebook groups that spread false conspiracy theories about voter fraud.
“I think you can and must do better,” said Senator Patrick Leahy, D-Vermont.
Several of the Democrats called for a slew of legislation directed at the tech sector. Senator Richard Blumenthal of Connecticut, for example, called for tougher data privacy laws, changes to the law that gives the companies legal protection for content posted by users, known as Section 230 of the Communications Decency Act, and greater antitrust action.
“You have built terrifying tools of persuasion and manipulation — with power far exceeding the robber barons of the last Gilded Age,” Blumenthal said. “You have made a huge amount of money by strip mining data about our private lives and promoting hate speech and voter suppression.”
Several Democratic members pointed to calls for violence and protests on the companies’ platforms after the election. Some pro-Trump groups organized on Facebook to stop the counting of votes in some states, for instance, before the groups were removed.
Zuckerberg promised to be vigilant.
“I’m very worried about this, especially any misinformation that could incite violence in such a volatile period like this,” Zuckerberg said.
Republicans home in on bias complaints
The committee’s Republican members attacked the power that social media companies have to moderate content on their platforms, accusing them of making politically slanted calls while hiding behind a decades-old liability shield.
“I don’t want the government to take over the job of telling America what tweets are legitimate and what are not,” said the panel’s chairman, Senator Lindsey Graham, R-South Carolina. “But when you have companies that have the power of government, have far more power than traditional media outlets, something has to give.”
Trump and his allies have spent years attacking the Silicon Valley platforms for what they say is bias against conservatives, pointing to the liberal politics of the companies’ employees and instances of moderation that affected Republicans or conservative media. Their evidence for these claims has always been anecdotal, and many right-wing personalities have built big followings online.
Republicans spent much of their time focusing on individual decisions made by the companies. Graham took exception to the way Twitter and Facebook had initially limited the reach of a New York Post article about Hunter Biden, the son of President-elect Joe Biden. The article prompted the committee to demand that the chief executives of the two companies testify.
Zuckerberg and Dorsey said that while their companies had sometimes made mistakes, their policies were fair and supported the best interests of their users.
How Twitter and Facebook plan to handle Trump’s accounts when he leaves office
World leaders generally have wider latitude on Twitter and Facebook because their comments and posts are regarded as political speech in the realm of public interest. But what will happen to Trump’s accounts on the social media platforms when he leaves office?
Dorsey said the company would no longer make policy exceptions for Trump after he leaves office in January. During Trump’s time as a world leader, Twitter allowed him to post content that violated its rules, though it began adding labels to some of the tweets starting in May to indicate that the posts were disputed or glorified violence.
“If an account suddenly is not a world leader anymore, that particular policy goes away,” Dorsey said.
In contrast, Zuckerberg said at the hearing that Facebook would not change the way it moderates Trump when he leaves office. Since Election Day, Facebook has labelled a few of Trump’s posts and has pointed users to accurate information about the results of the election, but it has generally taken a hands-off approach.
Tech’s legal shield draws substantive scrutiny
The law that has legally shielded online platforms from liability for what their users post has long been mentioned by lawmakers as a potential target for reform.
Yet until now, that debate had produced little in the way of concrete discussion. Not on Tuesday.
Lawmakers approached Section 230 differently out of the gate. They began with a bipartisan call to change the “golden goose” legal shield, with a substantive focus on legislation that will probably take centre stage in the next Congress.
Graham opened the hearing by taking direct aim at the legal shield.
“We have to find a way when Twitter and Facebook make a decision about what’s reliable and what’s not, what to keep up and what to keep down, that there is transparency in the system,” Graham said. “Section 230 has to be changed because we can’t get there from here without change.”
Democrats have agreed that the law needs reform, but they have taken the opposite position on why. Democrats have said Section 230 has caused disinformation and hate to flourish on the social media sites.
“Change is going to come. No question. And I plan to bring aggressive reform to 230,” Blumenthal said in his opening remarks. Blumenthal was a leading proponent of the first reform to Section 230, in 2018, which made the platforms liable for knowingly hosting content related to sex trafficking.
The hearing, by the numbers
Republicans asked 72 questions of the chief executives, 53 of which concerned how they moderate content on their social media platforms. Republican senators were particularly focused on how Twitter and Facebook could employ less moderation, with 37 questions about censoring conservative voices and the ideological makeup of their workforces.
Democrats asked 14 questions about content moderation, but most of those focused on whether more moderation could help prevent the spread of hate speech and violence.
Zuckerberg fielded the majority of the inquiries with 71, and Dorsey was asked 56 questions.
Cecilia Kang, David McCabe, Mike Isaac and Kate Conger c.2020 The New York Times Company