UMass Amherst expert examines the work models of political troll farms before they potentially target Western democracies

AMHERST, Mass. – A new report published by the NATO Strategic Communications Centre of Excellence examines the four models of “fake news” factories at work in the political battlegrounds of the Philippines, seeking to understand this social media phenomenon as the trolls stand ready to export their services to a more global clientele, with the potential to disrupt and influence Western democratic elections.

The main aim of the report, according to co-author Jonathan Corpus Ong, associate professor of global digital media in the department of communication at the University of Massachusetts Amherst, is to shed light on the variety of work arrangements behind digital political trolling that have continued to “hide in plain sight” – industries and political players that have been complicit in fake news production as normalized and financially lucrative work.

“The typical troll in the Philippines is not the sad nerdy guy living in his parents’ basement, but the savvy entrepreneur hyping their digital skills and seeking both political and corporate clients,” Ong says. “This kind of troll does not need to hide in the dark web or a black market; this troll is employed in the politician's in-house team, or a PR firm or a digital marketing company.”

Ong and co-author Jason Cabañes, associate professor at De La Salle University Manila, identified four organizational models of disinformation production. They personally observed three of them in their research – the in-house staff model, the advertising and public relations model and the clickbait model – while the fourth, the state-sponsored model, they examine through published reportage.

“The emergence of four disinformation work models across the spectrum of the politics-profit mix powerfully signals that fake news production is becoming ever more entrenched into the very fiber of contemporary politics,” Ong and Cabañes write.

“Confident veterans of political campaigning, chiefs of staff usually lead by example in the in-house staff model,” they write. “The chiefs expect their staff to take on this additional troll work, regardless of their official designation and without extra pay.”

In the advertising and public relations model, politicians and/or their private donors outsource trolling jobs to disinformation consultants for hire. In the shadows of their legitimate corporate and celebrity campaigns, these consultants assemble teams of political disinformation producers who collaborate on campaigns on a per-project basis.

“The most politically agnostic and commercially driven model of digital disinformation production is the clickbait model,” they write. “In the Philippines, the clickbait model is best exemplified by the case of Twinmark Media Enterprises, whose 220 Facebook pages, 73 Facebook accounts, and 29 Instagram accounts were shut down in a high-profile platform takedown right before the 2019 election season. Twinmark Enterprises’ revenue from Facebook and Google’s advertising technologies could have earned the company as much as approximately EUR 7M (PHP 400M) in four years of operations. In other words, clickbait websites are so profitable from ad tech alone that political pundits and influencers are happy to cash in from sharing their emotionally appealing but factually misleading stories.”

Finally, the authors examine the state-sponsored model and its techniques of formal intimidation and digital bullying that “lead to silencing, self-censorship and chilling effects among dissenters and the public at large.”

The authors found that reportage in the Philippines describes the state-sponsored propaganda model as assuming “intentionality from the president himself to intimidate and harass his critics. This involves him deploying the fake news label in tirades against mainstream media. His outbursts are usually a response to their unsavory reports of the government and his policies, most notably the war on drugs. His message is taken forward by his so-called ‘keyboard army,’ consisting of hyper-partisan political pundits, social media influencers and fans.”

The authors conclude by suggesting a process-oriented response to these digital disinformation campaigners. Their recommendations include increasing communication about the threat the trolls pose; pushing for legal reforms, including campaign transparency; enabling fact-checking collaboration among social media platforms, the media and academics; increasing transparency in the operations of social media platforms; and enacting industry standards and mechanisms in the digital workplace that reward professional and ethical practice.

“Whether driven by political or commercial imperatives, the political chiefs of staff, advertising and PR consultants, and technopreneurs have come to normalize, professionalize and rationalize disinformation work,” Ong and Cabañes write in the report. “This has enabled them to downplay the political and moral consequences of what they do. This, in turn, has made it easy for them to carry on with fashioning themselves as nothing less than pioneering explorers shaping the frontierlands of digital politics. This could very well be feeding their desire to take the next step and go global.”

The production of the report was managed by Sebastian Bay, senior expert in the Technical and Scientific Development Branch at the NATO StratCom Centre of Excellence in Riga, Latvia, who presented it at a Dec. 6 launch event in Riga featuring several studies of digital disinformation. The complete study, “Four Work Models of Political Trolling in the Philippines,” is available at https://stratcomcoe.org/four-work-models-political-trolling-philippines.