For Political Cartoonists, the Irony Was That Facebook Didn’t Recognize Irony

SAN FRANCISCO – Matt Bors has been a left-leaning cartoonist on the internet since 2013. His website, The Nib, features his cartoons and those of other contributors who regularly skewer right-wing movements and conservatives with political commentary steeped in irony.

One of his cartoons in December took aim at the Proud Boys, a far-right extremist group. With his tongue planted firmly in his cheek, Mr. Bors titled it “Boys Will Be Boys” and depicted a recruitment session in which new Proud Boys were trained to be “stabby guys” and to scream at “teenagers” while playing video games.

Days later, Facebook sent Mr. Bors a message saying that it had removed “Boys Will Be Boys” from his Facebook page for “advocating violence” and that he was on probation for violating its content policies.

It wasn’t the first time Facebook had admonished him. Last year, the company briefly took down another Nib cartoon – an ironic critique of former President Donald J. Trump’s pandemic response that encouraged wearing masks in public – citing “misinformation” about how the coronavirus spreads. And in 2019, Instagram, which Facebook owns, removed one of his sardonic anti-violence cartoons for promoting violence, according to the photo-sharing app.

What Mr. Bors ran into was the result of two opposing forces unfolding at Facebook. In recent years, the company has become more proactive about restricting certain kinds of political speech, cracking down on posts from fringe extremists and on calls to violence. In January, Facebook barred Mr. Trump from posting on its site after he incited a crowd to storm the U.S. Capitol.

At the same time, misinformation researchers said, Facebook has struggled to identify the slipperiest and subtlest of political content: satire. While satire and irony are common in everyday speech, the company’s artificial intelligence systems – and even its human moderators – can have difficulty distinguishing them, because such discourse relies on nuance, implication, exaggeration and parody to make a point.

That means Facebook has sometimes misunderstood the intent of political cartoons, leading to takedowns. The company has acknowledged that some of the cartoons it expunged – including Mr. Bors’s – were removed by mistake and later reinstated.

“If social media companies are going to take on the responsibility of finally regulating incitement, conspiracies and hate speech, then they are going to have to develop some literacy around satire,” Mr. Bors, 37, said in an interview.

Emerson T. Brooking, a researcher at the Atlantic Council who studies digital platforms, said Facebook “doesn’t have a good answer for satire because there isn’t a good answer.” Satire shows the limits of a content moderation policy and may mean that a social media company needs more hands-on, human judgment to identify that kind of speech, he added.

Many of the political cartoonists whose commentary was penalized by Facebook were left-leaning, a sign that the social network has at times muffled liberal voices. Conservatives have previously accused Facebook and other internet platforms of suppressing right-wing views.

In a statement, Facebook did not address whether it has trouble recognizing satire. Instead, the company said it made room for satirical content – but only up to a point. Posts about hate groups and extremist content, it said, are allowed only if they clearly condemn or neutrally discuss that material, because the risk of real-world harm is otherwise too great.

Facebook’s efforts to moderate content across its core social network, Instagram, Messenger and WhatsApp are well documented. After Russian operatives manipulated the platform by spreading inflammatory posts before the 2016 presidential election, the company recruited thousands of third-party moderators to prevent a recurrence. It also developed sophisticated algorithms to sift through content.

Facebook also created a process so that only verified buyers could purchase political ads, and it instituted policies against hate speech to restrict posts containing anti-Semitic or white supremacist content.

Last year, Facebook said it had rejected more than 2.2 million not-yet-verified political ad submissions aimed at U.S. users. It also cracked down on the conspiracy group QAnon and the Proud Boys, removed vaccine misinformation, and displayed warnings on more than 150 million pieces of content viewed in the United States that third-party fact-checkers had debunked.

But satire kept turning up as a blind spot. In 2019 and 2020, Facebook frequently dealt with far-right misinformation sites that used claims of “satire” to protect their presence on the platform, Mr. Brooking said. The Babylon Bee, a right-leaning site, for example, has often trafficked in misinformation under the guise of satire.

“I suspect that at some point Facebook will get tired of this dance and adopt a more aggressive stance,” Mr. Brooking said.

Political cartoons that run in non-English-speaking countries and contain sociopolitical humor and irony specific to certain regions have also been difficult for Facebook to handle, misinformation researchers said.

That has caused problems for many political cartoonists. One of them is Ed Hall in north Florida, whose independent work appears regularly in North American and European newspapers.

When Prime Minister Benjamin Netanyahu of Israel said in 2019 that he would bar two congresswomen – critics of Israel’s treatment of Palestinians – from visiting the country, Mr. Hall drew a cartoon showing a sign atop barbed wire that read, in German, “Jews are not welcome here.” He added a line of text addressed to Mr. Netanyahu: “Hey Bibi, did you forget something?”

Mr. Hall said his intent was to draw an analogy between Mr. Netanyahu’s treatment of the U.S. officials and Nazi Germany. Facebook took the cartoon down shortly after it was posted, saying it violated its standards on hate speech.

“If algorithms are making these decisions based solely on words that pop up in a feed, then that is not a recipe for fair or measured decisions about freedom of speech,” Mr. Hall said.

Adam Zyglis, a nationally syndicated political cartoonist for The Buffalo News, has also been caught in Facebook’s cross hairs.

After the storming of the Capitol in January, Mr. Zyglis drew a cartoon of Mr. Trump’s face on a sow’s body, with a number of Mr. Trump’s “followers” depicted as piglets wearing MAGA hats and carrying Confederate flags. The cartoon was a condemnation of how Mr. Trump fed his followers violent speech and hateful messaging, Mr. Zyglis said.

Facebook removed the cartoon for promoting violence. Mr. Zyglis suspected that was because one of the flags in the cartoon included the phrase “Hang Mike Pence,” which Mr. Trump’s supporters had chanted about the vice president during the riot. Another pig in the image carried a noose, an item that was also present at the event.

“Those of us speaking truth to power are getting caught in the net intended to capture hate speech,” Mr. Zyglis said.

For Mr. Bors, who lives in Portland, Ore., the problem with Facebook is existential. While his main source of income is paid memberships to The Nib and book sales on his personal website, he gets most of his traffic and new readers through Facebook and Instagram.

The takedowns, which have resulted in “strikes” against his Facebook page, could change that. If he accumulates more strikes, his page could be deleted, something Mr. Bors said would cut off 60 percent of his readership.

“Removing someone from social media can end their career these days, so you need to have a process that distinguishes inciting violence from satirizing the very groups that do the inciting,” he said.

Mr. Bors said he had also heard from the Proud Boys. A group of them recently organized on the messaging app Telegram to mass-report his critical cartoons to Facebook for violating the site’s community standards, he said.

“You just wake up and find that you’re in danger of being shut down because your comic angered white nationalists,” he said.

Facebook sometimes recognized and corrected its errors after he appealed, Mr. Bors said. But the back-and-forth and the possibility of being kicked off the site were frustrating and left him questioning his work, he said.

“Sometimes I think about whether a joke is worth it, or whether it’s going to get us banned,” he said. “The problem with that is, where is the line for that kind of thinking? How is it going to affect my work in the long run?”

Cade Metz contributed reporting.