As the U.S. braces for election-related unrest next month, Facebook executives are implementing emergency measures reserved for "at-risk" countries in a company-wide effort to bring down the online temperature.
The Wall Street Journal reported Sunday that the social media giant plans to limit the spread of viral content and lower the benchmark for suppressing potentially inflammatory posts using internal tools previously deployed in Sri Lanka and Myanmar.
The tools, now a key component of Facebook's strategy to prepare for the contentious U.S. election, would only be activated in "dire circumstances" and instances of violence, people familiar with the matter told the Journal.
The measures would lower the threshold previously established for content to be deemed dangerous on the platform, and would slow the dissemination of specific posts as they begin to gain traction, the Journal explains. An internal adjustment would also be applied to news feeds to control the content available to users.
"Deployed together, the tools could alter what tens of millions of Americans see when they log onto the platform, diminishing their exposure to sensationalism, incitements to violence and misinformation, said the people familiar with the measures," the Journal writes. "But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy, some of the people said."
Facebook spokesman Andy Stone told the Journal that the company has "spent years building for safer, more secure elections," and that it has applied "lessons from previous elections, hired experts, and built new teams with experience across different areas to prepare for various scenarios."
The move comes days after Facebook censored a story from The New York Post detailing allegedly corrupt business deals by Joe Biden's son Hunter Biden, which prompted harsh backlash from President Trump and Republicans who have long criticized the platform's role in regulating content.
At the time, Facebook CEO Mark Zuckerberg said that the company would ease its restrictive content rules following the conclusion of November's election, but that it had implemented policy changes for the time being to address uncertainty and the spread of disinformation, according to BuzzFeed News.
"Once we’re past these events and we’ve resolved them peacefully, I wouldn’t expect that we continue to adopt a lot more policies that are restricting of a lot more content," Zuckerberg said.