Facebook says it will begin removing misinformation that leads to violence

Hours after CEO Mark Zuckerberg stirred controversy by defending the rights of Holocaust deniers to post on Facebook, the company said it had begun removing misinformation that contributes to violence. “There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down,” the company said in a statement. “We will begin implementing the policy during the coming months.”

Under the new policy, Facebook will begin reviewing posts that are inaccurate or misleading, and are created or shared with the intent of causing violence or physical harm. The posts will be reviewed in partnership with local organizations including threat intelligence agencies, which Facebook says are in the best position to evaluate threats. Posts covered by the policy include manipulated imagery as well as text.

Partners are asked to verify that the posts in question are false and could contribute to imminent violence or harm, Facebook said. The company said it would rely on reports from partners, the press, and its own public policy employees. When Facebook has verified a report, it will remove the post, along with any duplicate posts that have been created.

While the new policy was announced today, Facebook put it into place last month, the company said. Posts falsely stating that Muslims in Sri Lanka were poisoning food given to Buddhists were removed after an investigation. Sri Lanka temporarily shut down Facebook earlier this year after hate speech spread on the company’s apps resulted in mob violence.

In many countries, Facebook is contending with a darkening reputation as a vehicle for disinformation and false news.

In March Sri Lanka sought to block access to the social network, as well as two other platforms that Facebook owns, WhatsApp and Instagram, in an attempt to stem mob violence directed at its Muslim minority. Citing inflammatory posts on Facebook and WhatsApp, the Sri Lankan government ordered internet providers and mobile phone carriers on Wednesday to temporarily block the services along with Viber, another messaging app. Sri Lanka’s government has also imposed a nationwide state of emergency after violence broke out Sunday in one of the island’s central cities, where dozens of Muslim businesses, houses and at least one mosque were attacked. At least one person was killed.

Sri Lanka is the latest country to grapple with hate speech being magnified on Facebook, especially in parts of the world that have only recently come online. As use of the social media platform has ballooned in recent years, so have cases of extremist fringe groups using Facebook’s reach to magnify their messages.

In Myanmar, where Facebook is so dominant that it is often confused for the internet itself, the social network has been blamed for allowing hate speech to spread, widening longstanding ethnic divisions and stoking violence against the Rohingya ethnic group. In the Philippines, Facebook has been used to spread pro-government propaganda.

Internet watchdog groups have long warned that Facebook was being used to distribute hate speech about ethnic minorities in Sri Lanka. Freedom House, a Washington-based nonprofit that advocates for free speech and democracy, said in a recent report that “hate speech against minorities continues to foment on various social media platforms, particularly Facebook.” The report said online campaigns targeting Muslims and other minority groups in Sri Lanka had been ongoing since 2013 and had recently increased.

Sri Lankan officials blamed both the country’s Buddhist majority and Muslims for spreading false information on the social networks. Violence continued, according to news reports. Although Facebook was blocked, WhatsApp was functioning sporadically.

On Twitter, two of the country’s legendary cricket players, Kumar Sangakkara and Mahela Jayawardena, tried to defuse tensions.

“I strongly condemn the recent acts of violence & everyone involved must be brought to justice regardless of race/ religion or ethnicity,” Mr. Jayawardena tweeted on Wednesday.

Sri Lanka is still recovering from a long civil war waged by Tamil separatists. The conflict ended in 2009 after the military crushed the rebels, and hard feelings remain.

Facebook said that it has clear rules against hate speech and incitement to violence. “We are responding to the situation in Sri Lanka and are in contact with the government and nongovernmental organizations to support efforts to identify and remove such content,” the company said in a statement.

A government official confirmed that Facebook was cooperating with the Sri Lankan government. He said that at a meeting in Colombo, the capital, the government raised more than 100 items with the company.

“Once we come to an agreement on these, and once the situation is under control, Facebook will be live again,” he said. It was not immediately clear what the items entailed.

Sri Lanka is hardly the only country to resort to extreme measures like a social media shutdown.

Last year, India blocked 22 social networking services, including Facebook, Twitter, WhatsApp and YouTube, for one month in the disputed territory of Jammu and Kashmir in a bid to curb street protests there. Mobile internet service is also frequently blocked in Kashmir, which borders Pakistan and has gone through spasms of violence for decades.

Turkey has also repeatedly shut down Twitter and YouTube for allowing content opposed by the government of Recep Tayyip Erdogan, the country’s president.