
Tackling disinformation - how can we combat the lies that go viral?

Tackling Disinformation - a screenshot from the session. Image: WEF

Robin Pomeroy
Podcast Editor, World Economic Forum
  • Disinformation is creating a polluted information environment that individuals and news organizations are still learning to navigate.
  • Social media platforms have a central but complex role in addressing the problem.
  • Restoring trust in traditional media outlets and institutions is also key.
  • Subscribe to the Agenda Dialogues podcast here.

Disinformation is not new. Examples of disinformation and so-called fake news campaigns are plentiful. But with increasing fears about the cost of living – exacerbated by the pandemic and the energy crisis – it is now more critical than ever to tackle disinformation head-on.

This podcast contains the audio from an Agenda Dialogue discussion at the Sustainable Development Impact Meetings 2022 on how the public, regulators and social media companies can collaborate to increase online safety.

Participants:

Adrian Monck, Managing Director, World Economic Forum Geneva (moderator)

Melissa Fleming, Under-Secretary-General for Global Communications, United Nations

Rachel Smolkin, Senior Vice-President, Global News, CNN Digital Worldwide

Claire Wardle, Professor, Brown University School of Public Health

You can read a transcript of the discussion below.

Transcript

This transcript has been generated using speech recognition software and may contain errors. Please check its accuracy against the audio.

Adrian Monck, World Economic Forum: Hello and welcome from Manhattan. I'm Adrian Monck at the World Economic Forum, and this session is Tackling Disinformation, part of our Sustainable Development Impact Meetings 2022. Thanks for joining us.

I'm delighted to say that for this panel, we have a fantastic group of experts with me here in the studio. We have Melissa Fleming, Under-Secretary-General for Communications at the United Nations, formerly in charge of all the comms for UNHCR and a very distinguished communicator and podcaster. I'm also joined down the line by Rachel Smolkin, who is Senior Vice President at CNN, in charge of the network's digital output. And one of the leading experts in disinformation, Professor at Brown University Claire Wardle. And Claire also advises many organisations on what they should do to combat disinformation.

Navigating the polluted information environment

But to start our session, Melissa, I wanted to turn to you. When you took over running communications for the United Nations in 2019, I imagine you had on your agenda the Sustainable Development Goals and a lot of the big global issues that the UN tackles. I'm guessing you probably didn't have fighting global disinformation on that list. How did that become part of your mission?

Melissa Fleming, UN Under-Secretary-General for Global Communications: Yeah, you're right. It was certainly kind of bubbling in the background as a phenomenon, but it really exploded with COVID-19.

But I think it exploded also in our own awareness of the phenomenon and the problem. Because, you know, once the social media platforms had become so dominant, there was already a proliferation of mis- and disinformation that was making what we were trying to achieve - a better world: a more inclusive, more peaceful and harmonious world - more difficult.

With COVID-19, we realized very quickly we were in a communication crisis unlike any that we had ever been in before.

Melissa Fleming, UN Under-Secretary-General for Global Communications

But with COVID-19, we realized very quickly we were in a communication crisis unlike any that we had ever been in before, because this was a novel pandemic and we were asking the public around the world to do things that they were very uncomfortable doing. And there was also so little known. We know that communicating science, from just looking back at communicating vaccines or communicating all kinds of science, is hard because, you know, it's not black and white. It's nuanced. And in this case, the virus was changing.

We're used to, as public health institutions or the UN or WHO, putting out kind of press releases or dry documents in the form of PDFs. And meanwhile, very emotive content was going out, people expressing their fears. And, you know, people who are very active in the anti-vaccine scene, and others, were seizing on the opportunity of people being so afraid, injecting disinformation and misleading information, fuelled also by some leaders and governments.


So it was this, it was a cacophony of information. WHO called it an infodemic, which meant it was, you know, if you were a user and you were trying to search, you were just confused because there was so much information. Some of it good. Some of it 'meh', and some of it really, really bad.

So, yes, we started to study the phenomenon, and I know Claire Wardle was very much seized with this as well and helped to advise us. So maybe she can comment and then we can speak a little bit about what we did and how we kind of, you know, changed course and beefed up our operations to try to address it.

Adrian Monck: Yeah, I'd love to hear a bit more about that. But Claire, I mean, you've been studying disinformation for probably longer than almost anyone in the academic and communications world. Give us a sense of how big a problem we're facing. This is something that a lot of people in the media talk about; something that you hear about on shows dedicated to examining the media world. But is it something that really everyday citizens ought to be concerned about, or is it something that's small enough to be contained, and we should put it in perspective?

Claire Wardle: It is something that people should be concerned about. And I often say we can't put everything back into a box, that fundamentally our information environment will always be polluted in different ways, and that actually as citizens, we have to learn how to navigate polluted information environments. And like pollution, some of it is by people who are trying to make money and they don't care. They're polluting the environment. Some people are doing it for political gain.

So disinformation is people deliberately creating false information to cause harm. The number of people doing that is relatively small, but they're pretty good at it, because they learn, they adapt and they evolve.

The bigger problem is the fact that many citizens, all of us as humans, are susceptible. So as Melissa said, one of the reasons that the pandemic was such a difficult time was because we were all fearful. All of our lives were turned upside down. And so when as humans, we are frightened, we are scared, we are unsure, our critical functions don't work as effectively. And so what that means is that we're all sharing information with each other, often trying to be helpful, but not understanding that actually it's false or misleading and it's going to lead to harm.

So this panel is called Tackling Disinformation, but really it's all forms of polluted information that we should be aware of, and they call for different steps. To stop Russian actors who are deliberately trying to destabilize the country, we need to take certain actions; to prevent Uncle Bob from sharing misleading information at the Thanksgiving table, it's a different set of responses. So it's all of these different things that we need to learn and understand in order for us to try and mitigate the harms of polluted information.

Adrian Monck: Thanks for that. And I want to bring Rachel in here. Rachel, you have a huge portfolio at CNN, but, you know, you're in a position where CNN is both an organization that's trying to make sense of the world and establish the facts, and part of a political war over who owns the narrative. And you're also now facing these folks that Claire just alluded to, who are in the professional business of fabricating disinformation in order to make your life even harder.

How do you and your colleagues go about looking at that disinformation environment? Is this something that you've become used to, or is it something that you're still learning to kind of navigate and learning to kind of make your way through?

It becomes very complicated when there is not an agreed upon set of facts and a single narrative. There are the facts. And in the media it's our job to continue to point those out, no matter what the politicisation around it is.

Rachel Smolkin, SVP Global News, CNN Digital Worldwide

Rachel Smolkin, SVP Global News, CNN Digital Worldwide: Claire did such a good job with the framing of this as polluted information. I find that a really helpful way to think about it, because as we navigate this environment, the information being put out that is deliberately wrong is all mixed together with information that is wrong, misleading, dangerous, no matter what the original intent of the information is.

So we are navigating this in different ways and in different spaces. We've been very much in the space around the 2020 US presidential election - the claims of a stolen election, the false claims, the whole 'stop the steal' movement - where we've been really pressing to give audiences the facts, because the strength of our democracy is in our institutions and the public trust in those institutions. It becomes very complicated when there is not an agreed-upon set of facts and a single narrative. There are the facts. And in the media it's our job to continue to point those out, no matter what the politicisation around it is.

The other challenge is, we're no longer in one confined space. This is not just happening in the political space. It has been a huge issue during the pandemic. We've seen it a lot around the vaccines and false claims and misinformation about the safety of COVID vaccines that can scare people off. There are real health ramifications for this and it can be quite dangerous to people who are taking in that polluted information, whether it is disinformation or misinformation.

We're seeing it in the abortion space now - also a very potentially dangerous space for women, who are being told things on the internet that are just not factually correct, not scientifically sound. We've seen it in the Russia-Ukraine war, with Russia spreading disinformation and then governments such as China picking that up. So we are really seeing it in so many different spaces, and we need to think about how best to get the facts out to audiences, how to hold actors accountable and how to call out what's wrong without further spreading it ourselves. So that is always a balance that we have to be mindful of.

Adrian Monck: I want to come back to you, Melissa, because you talked at the beginning about how you'd started to tackle this. What was the toolkit that you came up with to try and start to, you know, detoxify, or de-pollute, this kind of information sphere?

Melissa Fleming: First, travel to where the disinformation travels. Claire has spoken about data gaps. We need to find where people are searching and get there first - not with a, you know, kind of boring 50-page document, but with content that is produced in an engaging form, that travels well digitally, that works on social media, and that is in local languages.

We deployed our country offices all over the world to take the basic messaging that really didn't change that much on health guidance and on the efficacy and the safety of the vaccines, and produce content in such a way that it is locally relevant and that it travels in digital spaces, but it is also in languages that people understand and in contexts that make sense.

So we really took a lot of guidance from our local teams on what was trending there. The central messaging from Geneva or from New York isn't going to work for everyone.

Another really key strategy that we had was to deploy influencers - influencers who have huge followings and were really keen to help carry messages that were going to serve their communities. And they were much more trusted than the United Nations telling them something from New York City headquarters.

And finally, we had another trusted messenger project, called Team Halo, where we trained scientists and some doctors around the world on TikTok. And we had TikTok working with us. And these scientists, who had virtually no following to start with, got verified ticks. They started bringing people in their community into their labs, into their offices, answering their questions, engaging with them. It really took off, and many of them became kind of national media go-to advisers.

It was a layered deployment of ideas and tactics.

But finally, and Claire also mentioned this, people need to be inoculated themselves. Social media took off so quickly that I think people of all ages are very ill-equipped - especially in times of crisis, when they're feeling very engaged with what's out there, searching, wanting to help and wanting to share - to actually spot mis- and disinformation and to avoid being part of the problem.

Finally, and none of us has mentioned this yet, we really think the platforms bear a huge accountability and responsibility, much more than they're exercising. They did step up. They did provide, you know, ad credits. They did take down quite a bit. But the phenomenon was still exploding on their platforms, and still is. And I'm just talking about COVID. I mean, Rachel talked also about conflict. There's the Ukraine war. We're seeing the phenomenon of hate speech that is making wars worse, that is actually fuelling conflicts. And these are all phenomena that always existed. It's just that they now have a distribution capability that is so much more powerful than the means available before the digital age.

Adrian Monck: And that's really interesting because, for people who don't understand how we got into this situation, perhaps we need to just take a step back.

When social media and internet communications started, it wasn't the situation that you had when the printing press rolled out, you know, when the founding fathers in the United States were battling with people publishing all kinds of scurrilous rags, and libel laws and other kinds of free speech rules came into being.

They were basically given a dispensation to say that anything that appears on your platform - that's the word that came to be used: not 'publication' but 'platform' - is not content that you're responsible for. And that has been a fundamental factor in the growth of these platforms. Because that's not true, is it, Rachel, for CNN? You know, if you put out a news report, you can't turn around to the world and say: well, sorry, don't sue us - it's nothing to do with the editorial process here at CNN; it just happens, unluckily, to be someone's opinion.

So Claire, can you just take us back a little bit? And maybe, Rachel, also just give us an idea of some of the ways that you actually professionally manage the information and the checks that you guys do. Because I think it's quite important for people to understand both of those sides of it. There is a real process here and there's also some real structure to why we are where we are. Claire.

Social media platforms: publishers or pipelines?

What we're seeing is this technology has also allowed all sorts of magic to happen. And that's what we have to balance — the horrible side of the internet with the joyous side of the internet.

Claire Wardle, Professor, Brown University

Claire Wardle: I would just say that we all got one of these [a smartphone] not that long ago. I think the iPhone was 2007, and none of us got a crash course. We didn't do a driving license in using a phone, and we got all excited about it. But fundamentally, I now have the same power as Rachel. I can create something right now, and if it's amazing, it might have a bigger reach than CNN - it would have to be very good. But nobody was taught how to have that level of responsibility. We didn't talk about it as publishing. We talked about it as posting, as sharing a status update. We didn't say: you have the ability to share information, and if it's false, it can really cause harm.

What you're getting at, Adrian, is the fundamental change with the internet: this idea that the platforms were just kind of communications pipelines and there was no responsibility. It is hard to wrap our heads around why Mark Zuckerberg doesn't take responsibility - and he should do. But this is hard, because the absence of gatekeeping on the internet has also allowed all sorts of voices to be heard and to flourish, and movements to develop that in an age of gatekeeping we didn't necessarily hear from.

So what we're going through right now - and you're right when you talk about the printing press - is a revolution of the same scale. And we are, as a set of societies, trying to get through this period of adjustment: what does it mean when everybody has a mouthpiece? What are the norms that mean we can do this in a way that doesn't cause harm, etcetera, etcetera?

And there are calls to change what's known as Section 230, so that platforms would have to take responsibility for that content. But I do worry: if we have a kneejerk reaction to that, what kind of speech then gets chilled, or what kind of speech don't we hear? How do we moderate speech at that kind of scale?

So I'm not trying to say the platforms don't need to do more. They absolutely do. But what we're seeing is this technology has also allowed all sorts of magic to happen. And that's what we have to balance — the horrible side of the internet with the joyous side of the internet, which, to be fair, is human nature. And we're trying to get through this period now.

And that's why, you know, how many of these conversations do we keep having? Because it's hard and it's complex and it's nuanced. And I think that's what we're trying to balance: the human element of speech and communication with the technical capabilities of the phones, the computers, the internet. That's why it feels so hard right now, because there's no easy pathway through it and we're figuring it out as we go.

Adrian Monck: And Rachel, Claire said there, you know, everyone's got a phone, you can just post. I mean, as a journalist at CNN, you can't just hop onto Twitter and put an opinion out, or share something you're not really sure is correct. You actually have processes in place, editorial processes, that go into making sure that what you put on air and what you share digitally is checked, is researched and is stuff that you will stand by. Because unlike the platforms, you're under an obligation, aren't you, to make sure that you've gone through a thorough process of verifying what you broadcast and what you put out.

Rachel Smolkin: Yes, absolutely. There are many levels of vetting, starting with checking the information itself. It starts with the reporters, but there are many layers within the organisation to vet that, to double-check. Yes, we heard this from one person, but have we gone to this other source, this other person, who might give us a more detailed understanding or might tell us that the information we have is wrong? Have we checked it? Have we thought about it from this other angle, from this other perspective? We have layers of editors. We have leaders who are experts in their areas, who think through the information, who connect it to other pieces we've done. We have people in the organisation who look specifically at our standards and whether the reporting is meeting our standards.

So there are many layers of people who look at these things within CNN, really working to get it right for the exact reason you're saying, which is once we put it out, we know we have a powerful platform and we have a responsibility to serve our audiences with the most accurate information we can give them. When there is an error, I think that part of the process is also very important, to tell audiences exactly what we got wrong and fix that transparently. So we try to make sure we're very rarely in that space, but if we find ourselves in it, it's an important piece of accountability as well, to be always as clear with our audiences as we can.

And Claire nailed it by saying there are many sides to the internet. We started this discussion by talking about how people were so anxious during COVID and looking for an outlet to share that, to let that out. And that's important, too, as a way to bring those voices out and bring them together. It's just incumbent on all of us who have platforms to share information to make sure that when there is information that goes to people's health, to their safety, to their understanding of and belief in the fundamental institutions of our government, that information is accurate and correct.

Free speech and the media's checks and balances

Adrian Monck: Thanks, Rachel. Claire and Melissa, I just want to come back to both of you on this, because you talked, Claire, about the joy of the internet and the fact that we do hear so many voices. But we've learnt quite a lot, haven't we, over the decades and hundreds of years of the history of information and journalism, which is that you do need to have checks. You can't just shout fire in a crowded room. You do need to make sure that you've gone through processes when you're putting out information that could be detrimental to people's health, their well-being, their reputation, all of these things. And yet those lessons seem to have been put to one side entirely in the current situation, and we're all groping a little bit in the dark.

Does any of that need to be revisited now? Because we keep having these conversations as if something called the history of journalism - the history of newspapers, television and radio - doesn't exist, as if the internet is so different and so new. And yet what you're describing seems to be very old: a problem of people sharing made-up stuff, or, even worse, people making up stuff deliberately to undermine other people and having that shared.

Claire Wardle: I'd say, Adrian, you know, we're both British, we both have lived in a country with terrible tabloid newspapers. I mean, there are very good news outlets and there are news outlets that are pushing disinformation. There are very good politicians and there are politicians that are really pushing disinformation. So, yes, platforms are absolutely part of the problem, but we can't ignore the full information ecosystem.

Rachel just did a great job of explaining all of the checks and balances. Many people have no sense of all of those checks and balances in the newsroom. They have this idea that Rachel has a thought and she just puts it out. She just happens to work at CNN. And she's a mouthpiece of the liberal elite. No.

Unfortunately, we're seeing trust in institutions decline because we haven't done a very good job of explaining exactly what Rachel said, which is what Rachel puts out on CNN is fundamentally different to what my best friend from high school decides to post based on their own experiences and their own reading of a scientific journal article. But they haven't done a research methods class and they're drawing the wrong conclusions from it.

We're all in this space. We all have the power to publish. And so a lot of this goes back to teaching - teaching people to understand how to navigate this world and to think critically about all forms of information, whether it comes from Facebook or whether it comes from a news provider. Is that news provider doing the kind of checks that you would expect, so that you can trust that you're reading or consuming credible information? So that's the problem. It's so many elements. I talk about pollution: it's all sorts of pollution coming from all sorts of directions. And so we have to be a bit careful that we're not like, "oh, news is the answer", when unfortunately, globally, there are all sorts of examples where news is part of the problem.

Melissa Fleming: But I would like to say there is a crisis in public interest media. There is here, if you look at the demise of local news and local newspapers. And in many countries around the world, in developing countries, there really is almost an extinction threat to the kind of media that would provide that kind of check and balance.

And so then Facebook becomes the internet, and affiliated, unaccountable portals or kind of fake news organisations spring up to fill that gap and fill that space. So I agree with Claire that we have a polluted information ecosystem that has many parts and many players. But I do think that the demise of public interest media and the rise of digital alternatives has been dangerous.

We've seen it in the most egregious forms. For example, in Myanmar - I think it's the case that's cited the most - where, you know, everybody got their cell phones, as Claire brought up, and then everybody got Facebook loaded onto their cell phones. And that was their way of entering this incredible new world, without any education on how to navigate it. And then a government made a decision to dehumanise a whole sector of the population, the Rohingya, in such a way that it gave license to kill, license to drive out 700,000 people - with almost no moderation on the part of Facebook, and no real realisation that this was going on on the platform.

So it is an ecosystem. I agree with Claire, it is very complicated. I think it needs to be looked at country by country and addressed in so many different ways: through education, through the bolstering of the kind of media that is going to provide factual, good reporting so that, you know, people have news sources they can trust.

And then, on the part of the platforms, to be more generous with their moderation capacity in countries that are very fragile. I just recently visited Bosnia and Herzegovina, and there is a proliferation of denial of the Srebrenica genocide and glorification of war criminals. People there were saying this has reached a point where they fear spiralling back into war. And this is driven by this speech, which is traveling online largely uncontrolled. So anyway, I could go on and on about all of the phenomena we're seeing.

And finally, you know, our peacekeepers around the world were recently surveyed and almost half of them found that mis- and disinformation is a real problem for them in keeping the peace. Now, it's true what Claire said. You know, in some places this could also be traveling on radio, for example. So it's not just on social media platforms, but I do think that our information ecosystem is a real problem if we want a more stable, peaceful, harmonious and united world.

Adrian Monck: That's really interesting. And I think a couple of things that I'd love to get the views of all three of you.

One is the professionalisation of disinformation, which we've seen through the 2010s: from Russian denials of involvement in the downing of an airliner carrying hundreds of passengers from Holland to Asia; to the use, or claimed non-use, of chemical weapons in Syria; and right on until COVID, where we saw what looked like state-sponsored actors engaged in it, and even the creation of front television stations like Russia Today channeling and broadcasting conspiracies and other kinds of information and acting as a kind of amplifier for them.

I want to talk about that. But also, you know, the other side of it is the fact that these platforms are based and have their origins in the US. They come from a very specific place, which is a place of an unrestricted battlefield of speech. And that kind of unrestricted battlefield doesn't exist everywhere, and we know why, for quite good reasons. In Germany, for example, there are rules and regulations about what you can say in relation to the Third Reich, the Holocaust, those kinds of things. In the UK, after the huge issues that the country went through in the seventies and eighties, there are defences against racist speech and hate speech.

And so you have got restrictions on what people can say and how they say it. And I wonder what your sense is, the three of you: (a) are there lessons to be learnt from some of those measures, or is the US example the purest one, which needs to be replicated everywhere? And (b) if you're dealing with professional disinformation, how do you counter that?

So two things I want to look at. But maybe start with the US background because Claire, you come from both sides. You grew up in Britain but you work in America. What's your view on the unmitigated right to say whatever you please, wherever you please?

Claire Wardle: I have to say, there are many things about the First Amendment that are very, very special. But I do feel that it stops nuanced conversations about speech. And there's this idea of, well, it's a marketplace of ideas; more speech is good speech. But the truth is the algorithms are not unbiased. So it's not that every piece of speech is equally weighted. There are certain types of speech - which tend to be more emotive and tend to come from certain people - that get more space.

So my frustration is I wish we could talk more about harm when it comes to speech. So people say, "Well, misinformation, it's really legal speech. You know, we know terrorist content, child sexual abuse imagery. We know what to do about that, that's illegal speech. But lots of these examples, Claire, well that's legal speech." And I keep saying: Well, it might be legal, but if it's leading to harm, can't we actually have a conversation about that?

And the examples you use, Adrian, are very strong examples of when speech led to very serious harm. And my worry is that we don't think about this problem in a longitudinal way. There was a wonderful New York Times documentary where they actually went back and found footage of KGB spies in the 1980s. And one of them says: "It is like drops of water on a rock. One drop of water doesn't cause any harm, but continuous drops of water will splinter the rock into thousands of pieces, and that's what we're trying to do with the US." Now, they said it in the 1980s, and you could argue that 40 years later they're really starting to see that happen.

So my fear, when it comes to your point, Adrian, is that people say: "Oh, the First Amendment, what kind of harm is this causing?" Well, this kind of low-level, conspiratorial, hateful, misogynistic content that doesn't break platform guidelines - over time, where is it leading us? So I just wish we could have a more nuanced conversation about speech, because I worry about this idea that more speech is good speech - that's not really the case. And if you talk to people of colour or women, their experiences on the internet look very different to, probably, your experience, Adrian. And so this idea that all speech is equal is not true. And I wish we could just have that conversation properly and talk about the long-term impacts of different types of speech.

Adrian Monck: Rachel, CNN is a global news provider, and you operate in many different markets with many different types of regulation. What's your perspective on that issue about the primacy, if you like, of the First Amendment in terms of the global speech environment?

Rachel Smolkin: You know, we do operate globally. And so that means for different countries there are different guidelines or rules. We see that in particular around things like elections where they're handled very differently from place to place.

I think again, to Claire's point, the discussion to me here is less about whether somebody can say something, whether that's an ugly, offensive comment - I mean, yes, in the US in most cases they can - and more about how it is handled. And that brings us back to the platforms, from discussions about algorithms to what the platforms are allowing. Yes, somebody can stand up and make a comment. That doesn't mean the comment needs to be shared on a platform, and the platform will set those guidelines.

There was just a study yesterday that we wrote about on CNN. It was a NewsGuard study of TikTok that found that in searches for basic information about news stories, nearly 20% of the videos returned in search results contained misinformation. And that was on everything from the 2020 US presidential election to the Russia-Ukraine war to misinformation about abortion.

So there is an issue where a platform is acting as a provider of information and a young audience is coming to it looking for information, perhaps not equipped - not media literate enough - to sort through what is and isn't correct. And this is what they're finding.

So to me, the discussion really has to stay in that space: not so much what can and cannot be said, but how are we handling the information and what are we putting out? What is getting promoted, what is rising to the top? So that when users, and particularly young users, are searching, what are they coming across?

Adrian Monck: I think that's a really important point. And Melissa, to some extent, what you've just heard from Rachel there is that users interact with platforms at a speed that is so much faster than even the platforms themselves can manage or understand.

It seems that at the beginning of these platforms there was a lot of hope about global communities, about everyone having a chance to share lovely pictures of their families and friends, and none of the discussion about the darker side of what could happen. And TikTok is a new platform famous for dancing videos and showdowns of people singing and that kind of stuff. But it's also very susceptible to exactly what you've been talking about in the disinformation and misinformation space. So how do we balance seeing these platforms suddenly emerge from nowhere and then seeing users - some of them with bad faith and bad intentions, sometimes state-sponsored - jumping in with this kind of bad content? How do we deal with that, and how do you deal with that at the UN?

Melissa Fleming: I agree with everything Claire and Rachel said about the phenomenon. And also - I can't remember the exact statistic, but it's an astonishing number of young people who get their news from TikTok and no other place. So with that knowledge, the responsibility that TikTok has is huge. And if there is that much mis- and disinformation traveling on the platform, obviously they need to do more to address it, but also to educate.

But I do think we as news organisations, we as institutions also have a continued responsibility to inform the world about the state of our world, to guide the public.

For example, at the UN I was astonished to learn from my social media colleagues that we fall under a category called civic institutions, which means we're down-ranked. So our starting point is down here, whereas Joe Conspiracy-Theorist can start up here.

And so Facebook tries to address this by giving us ad credits, so that we can come back and be in the same place as the 'whoever wants to say anything' person. But it is an algorithmic shift that was deliberately made to favour individuals over institutions. And the institutions who are there to serve the public good are at a disadvantage.

We also, though, have to get better at communicating in these spaces. And I think the humans who are running our governments, our public health institutions also need to be more human in their communications, because that's what functions well on social media channels.

So it is educating, and it's hopefully elevating the content. We partnered with Google, for example: if you Google climate change, you will, at the top of your search, get all kinds of UN resources. We started this partnership when we were shocked to see that when we Googled climate change, we were getting incredibly distorted information right at the top.

So we're becoming much more proactive. You know, we own the science and we think that the world should know it and the platforms themselves also do. But again, it's a huge, huge challenge that I think all sectors of society need to be very active in.

Rebuilding trust and rethinking the media

Adrian Monck: It's been a really interesting discussion. I want to bring it to a close by asking each of you: some people watching this are going to be saying: "Well, hang on a second, you guys, you're experts, you're institutional, you're mainstream media. You're the people that I'm doing my research to get around, because I don't trust you to deliver on what you say you deliver. You've got an agenda. You're part of these institutions that I didn't vote for. I didn't choose. I didn't pick. I didn't have a say in the editorial policy at CNN. I didn't get to pick the faculty at Brown or vote for the UN leadership or the World Economic Forum."

How do you get to people like that and say: "Look, you know, you possibly might want to check a little harder on what you're looking at, or you might want to think again"?

Or have we lost some people from this debate - some people who are kind of, that's it, they're gone? Well, Claire, you've been doing this probably for longer than anybody. What's your sense check?

Claire Wardle: Yeah. I don't think we can just say to people, "trust us" anymore. Because if you look at the disinformation ecosystem, it's actually really participatory. People feel part of something. They feel like they have agency, they feel heard.

The ecosystem that we all live in is still pretty top-down. Like, you know, CNN might tell me: tune in at 11, or read this link. Or, as you say, Melissa might publish a PDF and assume that I'm just going to read it and trust it. We on our side need to ask: how can we listen more effectively? How can our newsrooms and faculties be more representative of the many, many people who feel like they're not seen and not represented?

But, again, we're not going to come out of this quickly. So we just have to start on a process and say, how do we build back that trust? We are going to have to make people feel like they're part of something. At the moment they don't, but the other side makes them feel part of something, and that's why they're succeeding.

Adrian Monck: And Rachel, what's your take on that? I mean, do you think we need to rebuild trust? Do we need to rebuild communities, the people that we engage with? How is CNN looking at this?

Rachel Smolkin: We need to do both. Trust is earned. It's a huge responsibility and a privilege to do what we do every day. And we're very mindful of that: of trying to get it right, of trying to serve our audiences, thinking about what information they need and are looking for and making sure we provide that in different forms, of bringing in different voices and thinking about different communities and how we reach different people in different places.

News is consumed very differently than it used to be. So are we thinking about reaching people in new ways and in new places? Are there communities or voices that we are overlooking that we could do a better job of incorporating into our coverage? These are discussions we have every single day in the newsroom, and we need to keep having them. They're crucial.

Adrian Monck: And Melissa, you've probably been on the frontlines of this in the last three years in ways you never expected. How have you looked at this issue? Do you think we've lost some people to the conspiracy sphere, to the disinformation dispensers, or are there things we can do to bring people back?

There is a hunger to be a part of something that is not conspiratorial, that is not hateful, that is not divisive, but that is working towards making the world a better place.

Melissa Fleming, Under-Secretary-General for Communications, United Nations

Melissa Fleming: I think there are certain people who've totally got lost down rabbit holes, and hopefully they're going to find their way out at some point. But I do think there are all kinds of people in the middle.

And there is evidence that people are feeling really overwhelmed. They're feeling so much gloom and doom from the news environment as well, even the responsible news environment.

And so I think we also have a potential here, and we're seeing it on our social media channels - the UN has millions of followers - when we put out a lot of messages that are positive, that are hopeful, that give people agency. They give people the chance to engage, to take climate action, to sign on to initiatives. And they're taking part.

So I think there is a hunger to be a part of something that is not conspiratorial, that is not hateful, that is not divisive, but that is working towards making the world a better place. There is a lot of positivity to be had. And I think we just need to pull people together to provide that kind of incentive and context, in combination with all of the other tools that we discussed here today.

Adrian Monck: Thanks so much. Well, I hope those of you watching haven't come away with too much 'zoom and gloom' from our discussion. A big thank you from my side to Claire Wardle, to Rachel and to Melissa, and to all of you for joining this session.

Check out all our podcasts on wef.ch/podcasts.

Catch up on all the action from the World Economic Forum's Sustainable Development Impact Meetings 2022 at https://www.weforum.org/events/sustainable-development-impact-meetings-2022 and across social media, using the hashtag #SDIM22.


