August 4, 2020

Facebook Fuels Its Users’ Ignorance With Lies


[Photo caption: Facebook CEO Mark Zuckerberg looks down as a break is called during his testimony before a joint hearing of the Commerce and Judiciary Committees in April 2018.]


By Eric Alterman | thenation.com


You’d have had to be some kind of evil genius to imagine something as terrible for the world as Facebook. With an estimated 2.6 billion users and $70 billion in annual profits, it is the most effective purveyor in history of right-wing hate, lies, and incitement against vulnerable people and the planet.

Is Facebook’s malevolence driven by a thirst for profit or politics? As with Fox News, alas, that’s a false choice, as the two reinforce each other. Facebook makes its money—as newspapers used to—by selling eyeballs to advertisers. But before local news started collapsing, thanks partly to the advertiser exodus to Facebook and Google, newspapers used this model to fulfill their responsibilities to educate readers and hold those in power to account. Facebook does the opposite: It narrows its users’ interests and fuels their ignorance with lies and misinformation.

Every so often, Mark Zuckerberg will issue a statement that implies he is sorry and that Facebook will try to do better. Of course, it never does. According to a study reported by the watchdog website Popular Information, during the first 10 months of 2019, “politically relevant disinformation was found to have reached over 158 million estimated views, enough to reach every reported registered voter in the US at least once.” That pace was accelerating, and guess what: “Most negative misinformation (62%) was about Democrats or liberals.” The incitement of violence remains on Facebook and on the company’s other apps as well. Just recently, BuzzFeed News reported that an ad on Instagram, which is owned by Facebook, showed clips from action movies of cops being killed and invited people to “join the militia, fight the state,” to a soundtrack of “We ain’t scared of no police / We got guns too.”

This is no accident. Yaël Eisenstat, Facebook’s former head of global elections integrity, explained in The Washington Post that the company “profits partly by amplifying lies and selling dangerous targeting tools that allow political operatives to engage in a new level of information warfare. Its business model exploits our data to let advertisers custom-target people, show us each a different version of the truth and manipulate us with hyper-customized ads.”

Ask yourself: Why does Facebook refuse to apply its gentle fact-checking apparatus to political advertisements?

Why does it include the racist, sexist, anti-Semitic Breitbart as one of its “trusted” news sources?

Why does it continue to allow Holocaust deniers onto its site, and why does Zuckerberg choose to define their poison as mere opinion?

Why did Facebook create a “newsworthiness” category in 2016 when dealing with President Donald Trump’s lies, racism, and hate speech?

Why did Zuckerberg tell employees that a possible Elizabeth Warren presidency represented an “existential” threat to the company? And what will that mean if Joe Biden picks her as his running mate?

Why in May 2019 did Facebook refuse to take down an obviously doctored video that falsely portrayed Nancy Pelosi as acting like a drunk?

“And why, of all things,” asked Bill McKibben in The New Yorker, “did the company recently decide to exempt a climate-denial post from its fact-checking process?”

Here’s one reason offered by Tim Wu, a professor at Columbia Law School: “Facebook can, by tinkering with its rules for political ads, give itself a special, unregulated power over elections. Just that possibility gives Facebook political leverage and politicians reasons to want leverage over Facebook.” David Thiel, a former Facebook security engineer quoted in the Post, said, “The value of being in favor with people in power outweighs almost every other concern for Facebook.”

Deploying their traditional working-the-refs playbook, Trump and the Republicans have turned truth on its head by casting themselves as victims of the site’s biases. “Facebook was always anti-Trump,” the president has whined, and congressional Republicans and the Department of Justice have threatened legal action to continue this campaign of Orwellian doublespeak.

Facebook’s desire to kowtow to Republicans has been evident at least since 2011, when it hired GOP operative Joel Kaplan as its vice president for global public policy, along with Katie Harbath, a former aide to Rudy Giuliani, and Kevin Martin, a former Republican-appointed FCC chairman, to support Kaplan’s efforts. Kaplan declined to intervene in Facebook’s decision to invite politicians to lie in their paid advertisements. And he has stood in the way of efforts designed to police misinformation because, according to anonymous sources quoted in the Post, he correctly perceived that it would “disproportionately affect conservatives.” Zuckerberg also attended a secret dinner with Trump, Jared Kushner, and the right-wing entrepreneur and early Facebook investor Peter Thiel.

In the wake of the Black Lives Matter protests and thanks to efforts by the NAACP, Color of Change, and the Anti-Defamation League, more than 970 companies, including Unilever, Coca-Cola, Pfizer, and Starbucks, have paused their advertising on Facebook.

The social sanction is helpful. It may inspire employees to try to change policy from within, and as Trump sounds more malignant by the day, it also puts pressure on those at the top to protect their reputations from the poison of his presidency.

Still, the company’s top 100 advertisers provide only 6 percent of its income, while small businesses account for more than 70 percent. And those small businesses do not have nearly as many alternatives. Most people I know, myself included, do not want to quit Facebook, especially during a socially isolating pandemic.

So here’s my idea: Let’s just boycott the ads. Don’t click on them. That way, even the small advertisers will have to find new outlets unless Facebook changes its policies. Spread the word… via Facebook.
