By Kevin Roose
After an Election Day largely free of viral social media misinformation, and with little trace of the kind of Russian troll stampede that hit its platform in 2016, executives at Facebook may be tempted to take a victory lap.
That would be a mistake.
It’s true that Facebook and other social media companies have made strides toward cleaning up their services in the last two years. The relative calm we saw on social media on Tuesday is evidence that, at least for one day, in one country, the forces of chaos on these platforms can be contained.
But more than anything, this year’s midterm election cycle has exposed just how fragile Facebook remains.
Want a disaster-free Election Day in the social media age? You can have one, but it turns out that it takes constant vigilance from law enforcement agencies, academic researchers and digital security experts for months on end.
It takes an ad hoc “war room” at Facebook headquarters with dozens of staff members working round-the-clock shifts. It takes hordes of journalists and fact checkers willing to police the platform for false news stories and hoaxes so that they can be contained before spreading to millions. And even if you avoid major problems from bad actors domestically, you might still need to disclose, as Facebook did late Tuesday night, that you kicked off yet another group of what appeared to be Kremlin-linked trolls.
I’ve experienced Facebook’s fragility firsthand. Every day for the past several months, as I’ve covered the midterms through the lens of social media, I’ve started my day by looking for viral misinformation on the platform. (I’ve paid attention to Twitter, YouTube and other social networks, too, but Facebook is the 800-pound gorilla of internet garbage, so it got most of my focus.)
Most days, digging up large-scale misinformation on Facebook was as easy as finding baby photos or birthday greetings. There were doctored photos used to stoke fear about the caravan of Latin American migrants headed toward the United States border. There were easily disprovable lies about the women who accused Justice Brett M. Kavanaugh of sexual assault, cooked up by partisans with bad-faith agendas. Every time major political events dominated the news cycle, Facebook was overrun by hoaxers and conspiracy theorists, who used the platform to sow discord, spin falsehoods and stir up tribal anger.
Facebook was generally responsive to these problems after they were publicly called out. But the platform’s scale means that even people who work there are often in the dark. Some days, while calling the company for comment on a new viral hoax I had found, I felt like a college R.A. telling the dean of students about shocking misbehavior inside a dorm he’d never visited. (“The freshmen are drinking what?”)
Other days, combing through Facebook falsehoods felt like watching a nation poison itself in slow motion. A recent study by the Oxford Internet Institute, a department at the University of Oxford, found that 25 percent of all election-related content shared on Facebook and Twitter during the midterm election season could be classified as “junk news.” Other studies have hinted at progress in stemming the tide of misinformation, but the process is far from complete.
A Facebook spokesman, Tom Reynolds, said that the company had improved since 2016, but there was “still more work to do.”
“Over the last two years, we’ve worked hard to prevent misuse of Facebook during elections,” Mr. Reynolds said. “Our teams worked round the clock during the midterms to reduce the spread of misinformation, thwart efforts to discourage people from voting and deal with issues of hate on our services.”
Even with all Facebook has done, the scale of misinformation still often feels overwhelming. Last month, a viral post falsely claimed that Cesar Sayoc, the suspect in the attempted bombing of prominent liberals and news organizations, was a secret Democrat participating in a “false flag” conspiracy. The post racked up nearly 80,000 shares, more than any post by The New York Times, The Washington Post or Fox News during the entire month of October.
When the news on Facebook was not blatantly false, it was often divisive and hyperpartisan — exactly the kind of thing Mark Zuckerberg, the company’s chief executive, has said he wants to combat by using Facebook to “bring the world closer together.” Nearly every day, the stories that got the most engagement across the network came from highly partisan sources — mostly conservative outlets like Fox News, Breitbart and The Daily Caller, with a handful of liberal pages like Occupy Democrats and The Other 98% thrown in — that skewed heavily toward outrage and resentment.
Even the anti-abuse systems the company put in place after the 2016 election have not worked smoothly. One of the steps Facebook took to prevent Russian-style influence campaigns was to force political advertisers to verify their identities. But the company left a loophole that allowed authorized advertisers to fill the “paid for by” disclaimer on their ads with any text they wanted, essentially allowing them to disguise themselves to the public.
Facebook has framed its struggle as an “arms race” between itself and the bad actors trying to exploit its services. But that mischaracterizes the nature of the problem. This is not two sovereign countries locked in battle, or an intelligence agency trying to stop a nefarious foreign plot. This is a rich and successful corporation that built a giant machine to convert attention into advertising revenue, made billions of dollars by letting that machine run with limited oversight, and is now frantically trying to clean up the mess that has resulted.
As the votes were being tallied on Tuesday, I talked to experts who have paid close attention to Facebook’s troubles over the past several years. Most agreed that Election Day itself had been a success, but that the company still had plenty to worry about.
“I give them better marks for being on the case,” said Michael Posner, a professor of ethics and finance at New York University’s Stern School of Business. “But it’s yet to be seen how effective it’s going to be. There’s an awful lot of disinformation still out there.”
“On the surface, for Facebook in particular, it’s better because some of the worst content is getting taken down,” said Jonathan Albright, the research director at the Tow Center for Digital Journalism at Columbia University. Mr. Albright, who has found networks of Russian trolls operating on Facebook in the past, has written in recent days that some of the company’s features — in particular, Facebook groups that are used to spread misinformation — are still prone to exploitation.
“For blatantly false news, they’re not even close to getting ahead of it,” Mr. Albright said. “They’re barely keeping up.”
Jennifer Grygiel, an assistant professor at Syracuse University who studies social media, said that Facebook’s pattern of relying on outside researchers and journalists to dig up misinformation and abuse was worrying.
“It’s a bad sign that the war rooms, especially Facebook’s war room, didn’t have this information first,” Professor Grygiel said.
In some ways, Facebook has it easy in the United States. Its executives and engineers are primarily English-speaking Americans, as are many of the content moderators doing the work of policing the platform. The country also has a strong independent press, law enforcement agencies and other stable institutions that are capable of filling in some gaps. And Facebook is highly incentivized to behave well in the United States and Europe, where its most important regulators (and the bulk of its advertisers) are.
It is hard to imagine Facebook extending the same kind of effort to prevent misinformation and interference in Madagascar, Armenia or El Salvador, all of which have upcoming elections. And if you think Facebook will spin up a 24/7 “war room” to help stop meddling in Nigeria’s February elections, I have a bridge in Lagos to sell you.
It’s worth asking, over the long term, why a single American company is in the position of protecting free and fair elections all over the world. But that is the case today, and we now know that Facebook’s action or inaction can spell the difference between elections going smoothly and democracies straining under a siege of misinformation and propaganda.
To Facebook’s credit, it has become more responsive in recent months, including cracking down on domestic disinformation networks, banning particularly bad actors such as Alex Jones of Infowars, and hiring more people to deal with emerging threats.
But Facebook would not have done this on its own. It took sustained pressure from lawmakers, regulators, researchers, journalists, employees, investors and users to force the company to pay more attention to misinformation and threats of election interference. Facebook has shown, time and again, that it behaves responsibly only when placed under a well-lit microscope. So as our collective attention fades from the midterms, it seems certain that outsiders will need to continue to hold the company accountable, and push it to do more to safeguard its users — in every country, during every election season — from a flood of lies and manipulation.