Opinion | Media flood the zone with Facebook's failures



If you’ve decided to take a break this weekend from the endless stream of Facebook news, you’ve got a lot of catching up to do.

Since Friday afternoon, several leading news outlets have rushed to press with stories drawn from the trove of internal Facebook documents provided by whistleblower Frances Haugen. These stories offer a shocking glimpse into Facebook's cover-up of its role in the destabilizing spread of hatred and disinformation, particularly in the aftermath of the 2020 US election.


On Monday, a dozen more reports were added to the pile, including major reports from Bloomberg and USA Today about Facebook's inability to curb the amplification of hate speech or prevent conspiracy theorists from gaming its algorithms to spread their toxic messages across the company's networks.

What is happening here?

Since early October, a consortium of 18 news outlets, including the Associated Press, The Atlantic, Bloomberg, CNN, NBC, The New York Times, and The Washington Post, has sifted through tens of thousands of Haugen's documents, with the intent of publishing their reports in a coordinated manner.

Profits over people

Part of that push was the weekend's crop of stories. Almost all of them reveal the reluctance of Facebook executives to solve the problem at the heart of the company's dangerous but profitable business: a revenue model that puts engagement and growth before the health and well-being of a multiracial democracy.

The Associated Press on Friday reported on a rebellion that "broke out" in Facebook offices on January 6, as employees grew frustrated with the company's reluctance to deal with the rise of pro-Trump political extremism on its platforms after the 2020 election.

On CNN, Donie O'Sullivan, Tara Subramaniam and Clare Duffy reported that Facebook was "fundamentally unprepared" to curtail the Stop the Steal movement, which arose out of the false belief that the election was rigged and that Trump was the "real president."

Spokespersons and organizers of this antidemocratic campaign used Facebook's platforms to rally people to events that led to the deadly attack on the United States Capitol. Worse yet, the company provided the basic coordinating infrastructure that mobilized people and incited them to violence. In a damning news clip linked to the CNN report, O'Sullivan asked participants in the insurrection how they heard about or helped organize the attack. Their answer: via Facebook groups and event pages.

NBC, The New York Times, and NPR reported on the efforts of a Facebook researcher who created a fictional user, Carol Smith of North Carolina, to test the platform's engagement algorithms. The researcher gave Smith a few defining details, including that she was a Trump supporter and followed the conservative Fox News and Sinclair Broadcast Group accounts.

"Within a week, Smith's feed was full of groups and pages that had broken Facebook's own rules, including those against hate speech and disinformation," reports NBC's Brandy Zadrozny. These included several recommendations that Smith join groups dedicated to spreading QAnon and other far-right conspiracy theories, including some promoting an impending race war.

Too big (and profitable) to fix

On Monday, Bloomberg reported that Facebook executives have long known that the company’s hate speech problem is far bigger and more entrenched than they had revealed.

Last March, Facebook founder Mark Zuckerberg assured Congress that "over 98% of the hate speech we take down is flagged by [artificial intelligence] and not by a person." But Facebook employees warned that the numbers were misleading: neither the company's human reviewers nor its automated systems were that good at flagging the most hateful content.

"In practice, the company removed 5% or less of hate speech, the documents suggest," Bloomberg reported.

Zuckerberg and many of his spokespersons continue to say that the violence that has resulted from the spread of hatred and disinformation is the fault of those who physically injured others during the insurrection and at other times. But this new series of reports places significant responsibility on Facebook executives.

"Undoubtedly, [Facebook] makes the hatred worse," Haugen told members of the British Parliament on Monday. "I think there is a view within the company that safety is a cost center, not a growth center, which I think is very short-term thinking, because Facebook's own research has shown that when people have worse experiences with integrity on the site, they are less likely to retain [them]."

Facebook has shown no capacity for such foresight. Yes, its constant hunt for short-term growth could ultimately sabotage the social media giant's long-term survival. But can we really afford to wait for Facebook to fix itself?

As news outlets continue to publish articles exposing Facebook’s failures, more lawmakers and regulators are calling for an investigation into a business model that profits from the spread of the most extreme hatred and disinformation.

Fixing the model is the right approach. But it’s a fix Facebook will never do on its own.

