An undercover test revealed that YouTube blocked all obviously false political ads, while TikTok approved 90%

With the November 8, 2022 US midterm elections just around the corner and political advertising on social media under scrutiny, an experiment that submitted ads containing blatant election misinformation to three major social networks found that TikTok accepted most of the ads verbatim. The differences between the platforms' review systems were stark: YouTube, by contrast, rejected every ad and shut down the channel that submitted them.

Cybersecurity for Democracy

TikTok and Facebook fail to detect election disinformation in the US, while YouTube succeeds | Global Witness

TikTok, Facebook failed to remove ads spreading election misinformation: report

On October 21, 2022, the international NGO Global Witness and New York University's Cybersecurity for Democracy published the results of an experiment in which they submitted political ads containing election misinformation to TikTok, Facebook, and YouTube, the social media platforms where much of the debate around the election takes place.

Ten political ads were used in the experiment: five containing falsehoods about elections and five containing content that would cause votes to be invalidated, each prepared in English and Spanish for a total of 20. Examples of the false ads included ones encouraging people to vote twice, ones discouraging people from voting at all, and ones claiming that proof of vaccination is required to vote. The ads were targeted at five US electoral battlegrounds: Arizona, Colorado, Georgia, North Carolina, and Pennsylvania.

For fairness, the same ads were submitted to all three platforms. The research team also deleted any ads that were accepted before they could actually run.

In this experiment, TikTok produced the worst results of the three platforms: despite a policy prohibiting all political advertising, it failed to block 18 of the 20 ads. Moreover, the account used to submit these ads remained active until the research team contacted TikTok.

Facebook, meanwhile, blocked 13 of the 20 ads but accepted the remaining 7. One of the three dummy accounts used to submit the ads was closed, while the other two were left untouched.

YouTube produced the best results: half of the false ads were rejected within a day, and the rest were removed within a few days. The YouTube channels used to submit them were also closed, although the Google Ads accounts associated with those channels remained active.

Laura Edelson, who leads the Cybersecurity for Democracy team, commented: 'YouTube's performance in our experiment demonstrates that detecting disinformation harmful to elections is not impossible. Every platform we examined should have earned an A on this test. Facebook and TikTok failed to do so, and we are asking them to do better.'

While YouTube performed well in this US experiment conducted in early October, it approved every ad in a similar experiment conducted in Brazil in August. For this reason, the research team is calling on YouTube to apply its process for detecting false political ads worldwide.

in Web Service, Posted by log1l_ks