News

YouTube pulls down 30,000 videos containing misinformation about COVID vaccine

EdexLive Desk

In a bid to curb false claims about COVID-19 vaccines, video-streaming platform YouTube has removed more than 30,000 videos containing vaccine misinformation over the past six months.

According to a report by Axios, the video-streaming platform has taken down more than 800,000 videos containing COVID-19 misinformation since February 2020. The videos are first flagged by either the company's AI systems or human reviewers, then receive another level of review.

Videos that violate the vaccine policy, according to YouTube's rules, are those that contradict expert consensus on the vaccines from health authorities or the World Health Organization (WHO), the report said.

Other platforms, including Facebook and Twitter, have also rolled out policies to reduce the spread and reach of such content.

Recently, the micro-blogging platform Twitter introduced a strike system for misleading tweets about COVID-19 vaccination, under which five or more strikes result in permanent suspension of the account.

Since introducing the COVID-19 guidance, Twitter said it has removed more than 8,400 tweets and challenged 11.5 million accounts worldwide.

While one strike causes no account-level action, two strikes lead to a 12-hour account lock; three strikes trigger another 12-hour lock; four strikes result in a 7-day account lock; and five or more strikes mean permanent suspension of the account.

Labels are first applied by Twitter team members when they determine that the content violates the platform's policy.
