
This article was published on November 24, 2020

Targeting Trump fans, QAnon ad slips through Facebook’s filters


Image by: The Markup

Despite Facebook’s efforts to tamp down the fringe QAnon conspiracy movement, its backers are still finding ways to push its message on the platform, even paying Facebook to do so.

Just days after announcing last month that it would ban QAnon theories from the site, Facebook allowed an advertisement featuring a cartoon Pepe the Frog, a character associated with the alt-right political movement, with a message to “Connect with Cue.” The ad, when clicked, directed Facebook users to a Facebook page called “Cue” featuring popular QAnon videos.

The Markup found the Cue ad in data submitted by participants in the Ad Observer project at New York University.


When The Markup first found the ad on Nov. 5, the page had about 8,400 likes. Eight days later, it had more than 10,800. When The Markup flagged the ad to Facebook, the company removed the page and stopped the ad.

“We don’t allow ads with praise, support, or representation for QAnon, so we have rejected this ad,” Rob Leathern, a Facebook ad executive, told The Markup in an emailed statement.


It’s unclear why Facebook’s protective systems failed to detect the Cue page or prevent it from running ads.

Facebook’s QAnon policy is enforced “both through automated review and teams of reviewers,” Leathern said. “Enforcement will never be perfect but our systems improve with time,” he continued.

While it represents just one ad and one page on Facebook’s massive platform, the Cue post, experts say, points to a larger problem: Facebook consistently lags behind or fails to effectively police ads on its platform, even when warning signs are obvious.

“Facebook has repeatedly failed at enforcing their policies,” Kayla Gogarty, a researcher at Media Matters, which has found problems in Facebook’s enforcement of its QAnon rules, told The Markup.

The company has long struggled with how to handle incendiary content, a problem made more acute by an election year filled with conspiracies, vitriol, and misinformation. Earlier this month, Facebook received unwanted attention after former presidential adviser Steve Bannon made violent comments about FBI director Christopher Wray and infectious disease expert Anthony Fauci in a video posted to social platforms, including Facebook, and drew further scrutiny when it took down the video but declined to ban Bannon from the platform.

Facebook’s efforts to tackle QAnon

Facebook’s dealings with QAnon have also been heavily scrutinized. The movement is a vast web of conspiracy theories that grew out of anonymous posts on online message boards from “Q” claiming to have top-secret information about President Donald Trump’s plan to arrest members of the “deep state” as secret pedophiles. The theory absorbed other conspiracy theories, from UFOs to “PizzaGate,” and grew in part as Facebook and YouTube began recommending groups and videos to non-adherents. By this year, Q symbols and slogans like “Where we go one, we go all” began appearing at offline rallies.

In August, Facebook restricted QAnon groups, saying that they were part of a movement that has “demonstrated significant risks to public safety” and pledging to stop recommending QAnon groups and to stop selling ads supporting QAnon.

Almost immediately, the administrators of QAnon pages began changing the letter “Q” to “Cue” in an attempt to evade detection, The New York Times reported in September.

Then, on Oct. 6, Facebook banned QAnon-related content outright; since then, it says, it has removed thousands of pages and groups.

But on Oct. 17, weeks after the Times reported on the Cue renaming tactic, The Markup found the Cue page up and running and drawing users in through ads, apparently undetected.

The Cue page itself features a variety of QAnon videos; their opening clips show a flaming letter Q and make references to a “Storm” and “Operation Q is Real.”

Travis View, a QAnon researcher and cohost of the “QAnon Anonymous” podcast, called the videos on the page “some of the most effective videos for radicalizing possible new QAnon recruits.”

The purpose of the page is “clearly to draw people into the radicalizing propaganda,” said View. “It’s designed to encourage new recruits and form a community around the existing followers.”

The Markup reached out to the Cue page with a Facebook message and received a call from a man who said his first name was “Ashley” and identified himself as a co-founder of Cuetoob (which appears to be under construction). He told The Markup that Cuetoob is a video sharing site and that the Cue Facebook page is a way to “promote videos about the Q movement.”

He believes in the Q movement—he said the term “QAnon” is “the media’s version of it”—saying, “Who knows what’s going on in the real world? It’s hard to say, it might be all true but it might be totally baloney. But I think there’s something true to it.”

He denied any link between what he calls the “Q movement” and what Facebook has characterized as “violence and real world harm,” saying, “it’s not about violence or hate … I don’t promote any kind of hate whatsoever.”

As for how he evades Facebook: “We managed to skim by and it’s pure luck,” Ashley said. “We don’t post anything that’s completely obvious. Videos are a little bit harder [for Facebook to catch], since they’re a little longer, so you have to watch them.”

He added that “when people start posting the actual Q on [the page], we try to delete those items” to avoid detection by Facebook.

He described Facebook’s efforts to rein in QAnon content as counterproductive. “I think they’ve just made it bigger,” he said.

Facebook does disclose who pays for political ads on its platform—in addition to other transparency measures—but because the Cue ad was apparently not considered political by Facebook’s systems, the platform did not report its customer’s identity.

The Cue ad was targeted at people interested in Donald Trump, Eric Trump, the Heritage Foundation, or Rush Limbaugh, among others, according to data collected by the Ad Observer project at NYU.

Facebook has consistently refused to disclose the targeting choices advertisers make, but data submitted by participants in the Ad Observer project does include targeting information. Facebook has sought to shut down the project’s collection of data about Facebook ads.

The Cue ad is not the only ad targeted at Trump supporters in the past few weeks that Facebook has failed to classify properly; the NYU Ad Observer project data also includes ads from the NRA and from a New Mexico rally against coronavirus-related restrictions targeted at people “interested in Donald Trump.”

Ads about politics are supposed to be subject to additional scrutiny. On Nov. 4, Facebook also began a post-election moratorium on all ads about politics or social issues, designed to prevent the spread of election-related misinformation.

The failure to identify political ads irks Sen. Mark Warner (D-VA), who in an emailed statement called the Cue ad found by The Markup “highly divisive, misinformation-laden political advertising.”

Warner sponsored a bill, the Honest Ads Act, which would require the disclosure of targeting information in political advertising, “precisely because,” he said, “we need greater transparency on efforts such as this.”



This article was originally published on The Markup by Jeremy B. Merrill and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.
