December 15, 2021

Mark Zuckerberg Knows Exactly How Bad Facebook Is

Facebook CEO Mark Zuckerberg looks down as a break is called during his testimony before a joint hearing of the Commerce and Judiciary Committees in April 2018.

By Jeet Heer
The Nation

On March 25, Republican Congresswoman Cathy McMorris Rodgers grilled Facebook CEO Mark Zuckerberg on whether social media platforms were doing harm to children. Zuckerberg’s first response was to sidestep the issue of children altogether and mutter vaguely about “people” instead: “Congresswoman, the research that I have seen on this suggests that if people are using computers and social—” McMorris Rodgers cut off this evasion and asked for a simple yes-or-no answer. Zuckerberg replied, “I don’t think that the research is conclusive on that. But I can summarize what I have learned, if that is helpful.”

In his gloss on the scholarship, Zuckerberg highlighted the happy news that “overall, the research that we have seen is that using social apps to connect with other people can have positive mental health benefits and well-being benefits by helping people feel more connected and less lonely.”

The words came across as weaselly and disingenuous at the time, but they sound even worse now. Thanks to tens of thousands of pages of internal documents provided by Frances Haugen, a former Facebook product manager turned whistleblower, we know that Zuckerberg was willfully lying about this and many other issues concerning his company. Haugen took these internal reports to The Wall Street Journal, which has published them in a lengthy series titled “The Facebook Files.”

Facebook has long been the object of a great deal of external criticism. Haugen’s massive cache not only validates this criticism but also makes much of it seem excessively generous. Some of the company’s research focused on Instagram, the image-sharing site it owns. As the Journal highlighted: “Facebook has been conducting studies into how its photo-sharing app affects its millions of young users. Repeatedly, [its] researchers found that Instagram is harmful for a sizable percentage of them, most notably teenage girls.” According to one internal Facebook slide, “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.”

Summing up “The Facebook Files,” the Journal notes: “Facebook’s own research lays out in detail how its rules favor elites; its platforms have negative effects on teen mental health; its algorithm fosters discord; and that drug cartels and human traffickers use its services openly.” The newspaper adds, “The documents show that Facebook has often made minimal or ineffectual efforts to address the issues and plays them down in public.”

The documents show a repeated pattern: in an attempt to assuage public anger, Facebook would make periodic gestures at reform, mainly by conducting internal research into the impact it was having on its users. The researchers would come back with extremely negative findings and recommendations for wholesale reform. Zuckerberg and other top executives would then reject those recommendations because they would dampen the company’s growth.

There’s no denying that Zuckerberg’s single-minded focus on growth has paid off. Facebook has gone from an idea he and some classmates developed as Harvard undergraduates into a company valued at $1 trillion, with an estimated 3.5 billion users across Facebook and its affiliated platforms (Instagram, Messenger, and WhatsApp). Yet the bigger Facebook gets, the more it needs to keep growing. Speaking on 60 Minutes, Haugen said, “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook.”

One of Facebook’s shrewder public relations moves was to ban Donald Trump on January 7, the day after a mob he incited attacked the Capitol. Bernie Sanders has been one of the few left-of-center American politicians to criticize the move, on the grounds that such power could be wielded less scrupulously by tech companies in the future.

“The Facebook Files” vindicates Sanders’s critique. Long before it banned Trump, Facebook gave him a special exemption. Along with other elite politicians and pundits, Trump was on a “whitelist” of figures who had immunity from normal enforcement rules. In May 2020, in response to the protests after George Floyd’s murder, Trump tweeted and also posted to Facebook an ominous warning that “when the looting starts, the shooting starts.” An automated system rated these incendiary words as 90 out of 100 in terms of violating the platform’s rules. If an ordinary person had posted them, a single user report would have been enough to get the post removed. Instead, it was flagged for management review, and Zuckerberg himself intervened to keep the post up.

The lesson of “The Facebook Files” is that the company cannot be trusted to regulate itself. There’s no need for further arguments about whether Facebook has a deleterious effect: The company’s own research says it does. Facebook is thus in the same position as tobacco companies that knew smoking causes cancer, or oil companies long aware that fossil fuel consumption is driving climate change.

Government reorganization of Facebook is the only way forward. The urgency became all the clearer after the outage of October 4, when the many platforms owned by the company went offline for hours. Facebook itself might be just a place to post pictures for most people, but for tens of millions, particularly in poorer countries, WhatsApp and Messenger are as essential as phones. Facebook is a public utility run by an irresponsible oligarch.

The political question now is what form that reorganization should take. Should the various internal remedies proposed by Facebook researchers be mandated? Should Facebook, as Elizabeth Warren advocates, be broken up into smaller companies? Or would the best approach be more radical: socializing Facebook and running it as a public utility, freed from the imperatives of economic growth so that it works only to foster communication?

These divergent solutions need to be hashed out politically. The one thing they have in common is that they start from the premise that neither Zuckerberg nor any other CEO can be allowed to dictate the future of the company. Facebook has become a public problem that needs public solutions.
