Can Circuit Breakers Stop Viral Rumors On Facebook, Twitter?

Buildings are engulfed in flames as a wildfire ravages Talent, Ore., on Sept. 8, 2020. Unfounded rumors that left-wing activists were behind the fires went viral on social media, thanks to amplification by conspiracy theorists and the platforms’ own design.
Kevin Jantzer/AP

The unfounded rumors Hill heard were lighting up social media: They said the fires had been intentionally set by left-wing arsonists — specifically, Black Lives Matter or antifa, a loosely defined antifascist movement.
None of those claims were true. Law enforcement in Oregon and even the FBI said there was no evidence that any activist or political groups were involved in the fires.
And yet, local 911 dispatchers were soon overrun with calls. The rumors caused so much disruption, police departments took to Facebook and Twitter to beg people to stop spreading them.
Reports that extremists are setting wildfires in Oregon are untrue. Help us stop the spread of misinformation by only sharing information from trusted, official sources. pic.twitter.com/ENc4c3kjep
— FBI Portland (@FBIPortland) September 11, 2020
Tim Fox, a captain with the Oregon State Police, said he received a hundred emails about the false claims, creating more work and diverting resources from actual emergencies.
"The fires, the evacuations, the security of the places, you know, we still have our regular calls for service. Life hasn't stopped," he said. "All these rumors and things that are going around are tough because we have to find people to respond to them, to investigate them, to check them out."
Social media’s design helps hoaxes go viral
Part of the reason these claims spread so widely on Facebook, in particular, is that the world’s biggest social network rewards engagement. Posts that get lots of shares, comments and likes get shown to more people, quickly amplifying their reach.
Facebook is well aware of its power to make stories go viral. As the fire rumors proliferated, the company put warnings on some posts its fact checkers had found false and reduced their distribution.
But that wasn't enough to quell the rumors. On Sept. 12, a day after the FBI put out its statement declaring the reports untrue and at least three days after the claims began spreading on social media, Facebook began removing posts connecting left-wing activists to the fires.
We are removing false claims that the wildfires in Oregon were started by certain groups. This is based on confirmation from law enforcement that these rumors are forcing local fire and police agencies to divert resources from fighting the fires and protecting the public. (1/2)
— Andy Stone (@andymstone) September 12, 2020
By then, however, hundreds of thousands of people had seen the false posts.
The claims were being amplified by social media accounts known to spread false information, including followers of QAnon, a baseless conspiracy theory, as well as by the Russian state news outlet RT.
"What that said to us is that they're waiting too long," said Karen Kornbluh, director of the Digital Innovation and Democracy Initiative at the German Marshall Fund of the United States, a think tank.
Kornbluh’s research team found the debunked claims still spreading in large private Facebook groups, some with hundreds of thousands of members, several days after Facebook said it was removing such posts.
Facebook said it was using artificial intelligence and human review to find and take down the hoax everywhere it appeared, including in private groups. The company said some of the posts in the private groups the German Marshall Fund identified had already been deleted by users, and the company removed the rest.
"We share the goal of limiting misinformation which is why we have taken aggressive steps to combat it — from creating a global network of over 70 fact-checking partners and promoting accurate information to removing content when it breaks our rules," said Andy Stone, a Facebook spokesman. "There's no silver bullet to addressing this challenge, which is why we continue to consult with outside experts, grow our fact-checking program, and improve our internal technical capabilities."
Still, what happened in Oregon shows that once this kind of hoax starts spreading, it is extremely difficult to stamp out.
"When you think of the psychology of misinformation, you can think of something like molding clay," said Dolores Albarracín, a psychology professor at the University of Illinois. "When you have soft clay, you can print anything you want on to it ... Once it dries out, though, then that's it. Your print or shape is set."
So while fact checks and removing posts can help, the real challenge is stopping harmful hoaxes from going viral in the first place.
Stock markets have "circuit breakers." Should social media?
Now, some experts are promoting a new way for social media platforms to hit pause on their powerful amplification engines.
It is modeled on the stock market’s method of halting trading when stocks are too volatile.
"If the S&P drops really suddenly, we've had these thresholds in place for a lot of years now. The market will stop and that will automatically trigger review," said Erin Simpson, associate director of technology policy at the left-leaning Center for American Progress.
Those automatic triggers are called circuit breakers. Simpson says social media also needs circuit breakers to stop the viral spread that platforms are designed to encourage.
When a controversial topic is gaining steam, Facebook or Twitter could limit its reach while reviewing the disputed information.
"A system like this could maybe make it harder for stuff to go viral, instead of the status quo, which is a set of Facebook business products [that] make it easier to go viral," Simpson said.
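As a rough illustration of the mechanism Simpson describes, a viral circuit breaker might watch how fast a post is being shared and pause further algorithmic amplification once the rate crosses a threshold, pending human review. The sketch below is purely hypothetical: the class name, thresholds, and review flow are illustrative assumptions, not any platform's actual system.

```python
import time
from collections import deque


class ViralCircuitBreaker:
    """Hypothetical sketch of a viral circuit breaker.

    Tracks a post's share velocity (shares within a rolling time
    window). If velocity exceeds a threshold, amplification is
    paused and the post is queued for review -- loosely analogous
    to a stock-market trading halt. All parameters are illustrative.
    """

    def __init__(self, max_shares_per_window=500, window_seconds=3600):
        self.max_shares = max_shares_per_window
        self.window = window_seconds
        self.share_times = deque()  # timestamps of recent shares
        self.paused = False         # is amplification halted?

    def record_share(self, now=None):
        """Register one share; return True if the breaker has tripped."""
        now = time.time() if now is None else now
        self.share_times.append(now)
        # Drop shares that fell out of the rolling window.
        while self.share_times and now - self.share_times[0] > self.window:
            self.share_times.popleft()
        # Trip the breaker if velocity exceeds the threshold.
        if len(self.share_times) > self.max_shares:
            self.paused = True
        return self.paused

    def resume_after_review(self):
        """A human reviewer clears the post; amplification resumes."""
        self.paused = False
```

The key design choice, mirrored from market circuit breakers, is that tripping is automatic and fast, while resuming requires a deliberate (human) decision.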
Simpson and a colleague at CAP described such a viral circuit breaker in a paper last month, building off an idea first proposed by Ellen Goodman, a law professor at Rutgers University.
"Pausing waves of virality could stem disinformation, deepfakes, bot-generated speech, and other categories of information especially likely to manipulate listeners," Goodman wrote.
Now, the idea is even gaining traction inside Facebook. The company told The Verge it is testing this kind of speed bump for viral posts.
Editor’s note: Facebook is among NPR’s financial supporters.
Oregon Public Broadcasting’s Monica Samayoa contributed to this report.