YouTube will remove content that promotes "cancer treatments proven to be harmful or ineffective" or that "discourages viewers from seeking professional medical treatment," the video platform announced today. The move comes as YouTube attempts to streamline its medical moderation guidelines based on what it has learned while combating misinformation on topics such as COVID-19, vaccines, and reproductive health.
Going forward, Google’s video platform says it will enforce its medical misinformation policies when there is a high risk to public health, when guidelines from health authorities are publicly available, and when a subject is prone to misinformation. YouTube hopes this policy framework will be flexible enough to cover a wide range of medical topics, while striking a balance between minimizing harm and allowing debate.
In its blog post, YouTube says it will take action against both treatments that are actively harmful and those that are unproven and suggested in place of established alternatives. A video could not, for example, encourage viewers to take vitamin C supplements as an alternative to radiotherapy.
While major tech platforms were united in early 2020, their exact approaches to COVID-19 misinformation have diverged since then. Twitter, for instance, stopped enforcing its COVID-19 misinformation policy at the end of 2022, following its acquisition by Elon Musk. Meta has also recently relaxed its moderation approach, reconsidering its COVID-19 misinformation rules in countries (like the United States) where the disease is no longer considered a national emergency.