Oversight Board reverses Facebook removal of post touting hydroxychloroquine in COVID-19 treatment


The board found the social media giant's misinformation and imminent harm rule is too vague and recommended the platform consolidate and clarify its standards on health misinformation in one place.

Facebook's independent Oversight Board has reversed the social media platform's decision to remove an October 2020 post pertaining to the drug hydroxychloroquine in the treatment of COVID-19.

"In October 2020, a user posted a video and accompanying text in French in a public Facebook group related to COVID-19," the board explained on its website.

The post alleged a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products), which refused to authorize hydroxychloroquine combined with azithromycin for use against COVID-19, but authorized and promoted remdesivir. The user criticized the lack of a health strategy in France and stated that Raoult’s cure is being used elsewhere to save lives. The user’s post also questioned what society had to lose by allowing doctors to prescribe in an emergency a “harmless drug” when the first symptoms of COVID-19 appear.

While the user's post pushed back against a government policy, it did not urge people to obtain or take medicine without a prescription, the board noted.

Facebook also failed to show why it did not opt for a less severe remedy than removing the post from the platform, the panel found.

"Given that Facebook has a range of tools to deal with misinformation, such as providing users with additional context, the company failed to demonstrate why it did not choose a less intrusive option than removing the content," the board explained.

The board also determined that the social media giant's misinformation and imminent harm rule is too vague and recommended that the platform consolidate and clarify its standards on health misinformation in one place.

The Board also found Facebook's misinformation and imminent harm rule, which this post is said to have violated, to be inappropriately vague and inconsistent with international human rights standards.

A patchwork of policies found on different parts of Facebook's website makes it difficult for users to understand what content is prohibited. Changes to Facebook's COVID-19 policies announced in the company's Newsroom have not always been reflected in its Community Standards, and some of these changes even appear to contradict them.

