The tech giants will face tough questioning from Congress regarding their role in spreading misinformation and online extremism.
This marks the first time the chief executives of Facebook, Google, and Twitter will appear before lawmakers since the Capitol riot and the start of Covid-19 vaccine distribution. Members of the House Energy and Commerce Committee are expected to question Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey about “their platforms’ efforts to stem baseless election fraud claims and vaccine skepticism.”
A memo circulated within the committee indicated that the executives will also be questioned about the “opaque algorithms” that promote the spread of misinformation on their platforms. All three companies have been under fire since the 2020 election, during which former president Donald Trump and other powerful political figures were able to spread a constant stream of misinformation about the election and American democracy.
Twitter and Facebook had previously announced steps to combat the issue, such as flagging posts containing misinformation. The January 6th Capitol riot, however, showed that a warning label alone is not enough to blunt the harm these falsehoods cause as they spread online, especially when amplified by the president himself.
The hearing marks the CEOs’ first time back before Congress since Trump was banned or suspended from their respective platforms following the Capitol riots.
“The Capitol attack was a horrific assault on our values and our democracy, and Facebook is committed to assisting law enforcement in bringing the insurrectionists to justice. We do more to address misinformation than any other company,” Zuckerberg claimed in his initial testimony.
Legislation now under serious consideration in both the House and Senate would rein in just how much power these platforms actually hold. Some of the bills target the companies’ economic dominance and anti-competitive practices, while others focus on their approach to data privacy.
The proposals could impose new requirements on tech platforms or expose them to greater legal liability; either way, the industry is expected to change in the coming years. This week’s hearing will likely also be the executives’ last chance to personally make their case before Congress acts. The central issue is Section 230 of the Communications Decency Act of 1996, which shields websites and their owners from legal liability for content posted by their users.
Both Republicans and Democrats want the law updated: Section 230 is credited with enabling the growth of the open internet, but it was written in 1996, long before today’s platforms existed.
Zuckerberg said Facebook “favors a form of conditional liability where online platforms could be sued over user content if the companies fail to adhere to certain best practices established by an independent third party.” The other two CEOs did not address Section 230 specifically, instead offering ideas for content moderation in general.
Pichai called for clearer content policies, while Dorsey pushed for a more “user-led” approach to moderation, along with better settings and tools that let users customize their online experience. Zuckerberg and Dorsey appeared before the Senate in November to discuss the same issues, and Zuckerberg and Pichai testified in the House last summer over antitrust concerns.
According to media coverage from the hearings, “Facebook on Monday said it removed 1.3 billion fake accounts last fall and that it now has more than 35,000 people working on content moderation. Twitter said this month it would begin applying warning labels to misinformation about the coronavirus vaccine, and it said repeat violations of its Covid-19 policies could lead to permanent bans. YouTube said this month it has removed tens of thousands of videos containing Covid vaccine misinformation, and in January, following the Capitol riots, it announced it would restrict channels that share false claims doubting the 2020 election’s outcome.”
Committee members remain unconvinced that these changes will make a real difference, and they are continuing to grill the tech giants over their role in spreading misinformation and extremism.
Eric Mastrota is a Contributing Editor at The National Digest based in New York. A graduate of SUNY New Paltz, he reports on world news, culture, and lifestyle. You can reach him at email@example.com.