BBC

How not to amplify bad ideas

Amplification is the idea that journalists bring undue attention to ideas that are fringe and extreme, but very harmful. So how can we avoid it?


Michael Wendling is the Editor of BBC Trending and the author of Alt-Right: From 4chan to the White House. Throughout the Covid-19 pandemic he has been leading a team of BBC journalists debunking disinformation and conspiracy theories.

"It's interesting," the editor says, "but is there a risk of… amplification?"

I've heard this sentiment dozens of times while working in the murky marshes of conspiracy theories, extremism and the internet's fringes. And over the past year such questions have become increasingly relevant during the "infodemic" that's run in parallel to the pandemic.

Amplification is the idea that we are bringing attention ("the oxygen of publicity") to ideas that are fringe, extreme, sometimes compelling, but very harmful. And the stark answer to that standard editor's question is this: not only is there a risk of amplification, there's a nailed-on, guaranteed certainty that in covering a conspiracy theory, we're amplifying it.

There's just no way to ensure that a debunk of a falsehood reaches only the people who've already seen the falsehood. And no matter how much clarity and how many provable facts we pack into a story, a small proportion of our substantial audience will always take away the lie we're stripping down rather than the truth we're building up.

So by that logic, we should never cover disinformation or conspiracy theories, right? Not so fast.

To begin with, let's not consider amplification in isolation. To do so would lead us down a strange and contorted path. Governments, paramilitaries, criminal organisations and terror groups use lies and disinformation - should we not report on them? You can see the madness that lies that way.

Instead, let's set amplification against the various merits of reporting on all sorts of unsavoury stuff. These include exposing disinformation networks, showing the damage they do, and educating people in how to spot bad information themselves.

If weighing up all these factors seems daunting - well, it is. Fortunately we have a model. At BBC Trending, a World Service programme and multimedia team established to report on and investigate the politics and culture of social media, we've been covering social media forces and the fringes of internet culture for years. And the principles we apply to story selection can also help us deal more generally with conspiracy theories and disinformation - which, not coincidentally, come mostly from the internet.

Long ago at Trending, we developed a three-part test to figure out whether a story is worth covering. Is it popular - in other words, is it actually trending? Is it worthy of our attention as journalists - can we separate out the meaningful stuff? And can we add something to the sum of knowledge, rather than just repeating stories or rehashing online arguments?

The tools we use give us an easy way to answer the first question. We look at whether a piece of disinformation is being widely shared and repeated, not just on one social network but across platforms. When audience members - as they frequently do - start to ask us about a particular story or viral post, alarm bells start to ring.

When it comes to online falsehoods, our second question takes a slightly different form: "Does the danger posed make it worth our time to tackle this story?"

Suppose (to give an entirely made-up example) a rumour were circulating that drawing red Xs on your hands would protect you from rickets.

Yes, this falsehood might give people who believe it a false sense of security, and yes, I suppose the ink might contain hidden toxins - but in general something like that would sit at the very low end of the harm spectrum.

By contrast, consider a conspiracy theory that became popular last spring - that coronavirus "does not exist" and that the symptoms blamed on it were somehow caused by 5G mobile phone masts. Not only did people who bought into this ignore vital safety measures, a number of them attacked phone masts and telecommunications workers, resulting in dozens of arson attacks and assaults. Real-world effects make a difference.

The third question is at least equally important, and gets to the heart of how we mitigate amplification. We must do more than simply repeat a claim. At the very least, we investigate and provide facts - new information and original research. We try to show our work - we know this builds trust.

And there are other, more ambitious goals, depending on the nature of the story. Can we describe the networks that spread bad information and draw some conclusions about their motivations? Can we describe the real-life effects of disinformation, and help our audience separate fact from fiction for themselves?

One last point on language. Every so often we get irritated audience members (or, more usually, anonymous accounts on Twitter) asking who the hell we think we are, classifying particular ideas as "disinformation" or "conspiracy theories".

Many hold an odd, hyper-relativistic worldview - the idea that all opinions, no matter how extreme or divorced from fact, are worthy of equal time and should simply be presented without comment or judgement so that people can "make up their own minds".

They don't suggest how people are actually supposed to do this - but more to the point, they're usually just lashing out at us for pointing out what's wrong with falsehoods they fervently believe in.

More relevant are concerns closer to home, that labelling something as a conspiracy theory or disinformation might somehow compromise our impartiality.

But the worlds that we journalists reporting on the underbelly of the internet describe are very different from those inhabited by many of our trusted colleagues. When we cover disinformation, we're not talking about political debate or scientific uncertainty. These are not conventional flows of information, and the people behind them are not mainstream political actors who obey reason and democratic norms.

And while these phenomena raise a whole host of legitimate and enormous political questions - for starters, how should social media companies be regulated? - our work in this area is something else altogether.

Ultimately, it's about separating truth from fiction. Isn't that what journalism is about?