TikTok, Facebook OK’d Ads With Misinformation About Voting: Report

Social media platforms Facebook and TikTok failed to enforce their own policies when presented with ads containing “blatant” misinformation about the 2022 midterm elections, a new report found.

The report, which stems from an investigation by watchdog Global Witness and New York University’s Cybersecurity for Democracy (C4D) team, describes researchers’ efforts to post 20 ads with misinformation to Facebook, TikTok and YouTube.

The ads were in both English and Spanish and targeted multiple battleground states in the midterms, such as Arizona, Colorado and Georgia.

The ads, which the groups said they deleted as soon as the platforms notified them of approval, reportedly featured several inaccurate claims, such as assertions that voting days would be extended and that votes cast in primaries would count in the midterms.

TikTok OK’d those ads, the report said, though it rejected one ad, approved by Facebook, falsely claiming voters would need to be vaccinated against COVID-19.

TikTok – owned by Chinese company ByteDance – fared the worst in the researchers’ investigation, the report said, as the platform approved 90% of ads with disinformation.

The platform’s reported failure in the research comes three years after TikTok banned political ads from the app.

A TikTok spokesperson, in a statement to the groups, claimed the platform prohibits and removes election misinformation along with paid political advertising from the app.

“We value feedback from [non-governmental organizations], academics, and other experts which helps us continually strengthen our processes and policies,” the spokesperson said.

Meta’s Facebook platform approved a “significant” number of the ads, 30% in English and 20% in Spanish during one test and 20% in English along with 50% in Spanish during another, the report said.

A Meta spokesperson told the groups that their report was based on a very small sample size and doesn’t represent the political ads the company reviews daily and around the world.

They wrote that the platform’s ad review process goes through several layers of analysis and detection, as well.

“We invest significant resources to protect elections, from our industry-leading transparency efforts to our enforcement of strict protocols on ads about social issues, elections, or politics – and we will continue to do so,” they said.

Global Witness noted other investigations that show all election misinformation ads it tested in Brazil and all hate speech ads it tested in Kenya, Myanmar and Ethiopia sailed past Facebook’s ad approval process.

Google-owned YouTube, on the other hand, found and rejected each ad the researchers submitted to the platform while also suspending a channel used to post ads, according to the report.

Google, in a statement to the Associated Press, wrote that the company has “developed extensive measures to tackle misinformation” on its platforms, including false claims about elections and voting.

“In 2021, we blocked or removed more than 3.4 billion ads for violating our policies, including 38 million for violating our misrepresentation policy,” Google wrote in a statement.

“We know how important it is to protect our users from this type of abuse – particularly ahead of major elections like those in the United States and Brazil – and we continue to invest in and improve our enforcement systems to better detect and remove this content.”

Damon McCoy, co-director of C4D, said that disinformation has had a major impact on elections and said YouTube’s performance in the research shows that detecting such ads isn’t impossible.

“But all the platforms we studied should have gotten an ‘A’ on this assignment,” McCoy said.

Jon Lloyd, senior advisor at Global Witness, said social media companies claim to recognize the problem of disinformation but added that the research shows they aren’t doing enough to curb it.

“Coming up with the tech and then washing their hands of the impact is just not responsible behaviour from these massive companies that are raking in the dollars,” Lloyd said.

“It is high time they got their houses in order and started properly resourcing the detection and prevention of disinformation, before it’s too late. Our democracy rests on their willingness to act.”
