
At the World Economic Forum's 2025 Annual Meeting, during the panel discussion titled "Truth vs. Misinformation in Elections," guests engaged in an in-depth debate on online disinformation.
Image source: World Economic Forum/Faruk Pinjo
Jesus Serrano
Head of Reputation and Crisis Management, Global Communications Group, World Economic Forum
During two panel discussions at the World Economic Forum's 2025 Annual Meeting, attendees explored the crucial balance that must be struck when addressing online misinformation.
How can we curb misinformation while still safeguarding freedom of speech? This is a highly complex question, and reaching consensus on it is extremely challenging.
Here are five key priorities that we believe should guide the fight against misinformation.
Conspiracy theories, misinformation, and fake news have long since moved beyond the fringes of the internet. They have seeped into mainstream public discourse, influencing elections, destabilizing markets, and even challenging the very boundaries of free speech. As the World Economic Forum's Global Risks Report 2025 reveals, misinformation has been ranked the most severe short-term global risk for the second year in a row, eroding trust, deepening societal divisions, and undermining governance and global collaboration. Amid rapid technological advances, however, regulatory responses have clearly lagged behind, allowing this risk to keep escalating.
Two key sessions at the World Economic Forum's 2025 Annual Meeting, "Truth vs. Misinformation in Elections" and "Should Content Be Moderated?", delved into the spread of misinformation, the dilemma of content moderation, and how to strike a delicate balance among democratic integrity, human rights protection, and technological innovation. Although attendees unanimously agreed that the current trajectory is unsustainable, finding common ground in the complex landscape of digital platforms remains difficult: platforms must curb harmful content without undermining freedom of expression or deepening societal divisions.
Misinformation: A Threat to Democracy
In 2024, half of the world's population participated in elections, yet false narratives and misleading information shaped the global political landscape and further fueled geopolitical fragmentation.
Take Moldova as an example. According to the country's Prime Minister, Dorin Recean, foreign disinformation networks interfered with the election using an AI-generated deepfake video that featured children dressed in military uniforms standing in front of the EU flag, deliberately misleading the public into believing that joining the EU meant war was imminent. "They tried to link the EU to the war, saying, 'Look at Ukraine's fate: if you want to join the EU, Moldova will follow in its footsteps,'" Recean warned, revealing that Moldova spends 2.5% of its GDP countering misinformation attacks.
A structured approach to online content governance
Image source: World Economic Forum
Governance and Restraint: Offline Rules, Online Reality
As governments around the world begin to recognize the impact of misinformation, the European Commission's Digital Services Act is emerging as the benchmark for discussions on accountability. "What is illegal offline is equally illegal online," said French Minister for European Affairs Jean-Noël Barrot. "Algorithmic transparency is beyond question," he argued. "In France, racist or homophobic speech is illegal and can be prosecuted in court, so platforms should follow the same rules online. On misinformation, this legislation mandates that platforms take concrete steps to mitigate the systemic risks posed by false information. The issue is growing increasingly critical, especially given how younger generations now consume and engage with information."
However, Sasha Havlicek of the Institute for Strategic Dialogue warned against oversimplifying the issue. Malicious actors are exploiting the openness of digital platforms, using botnets and fake accounts to manipulate public discourse, tactics that are virtually invisible to ordinary users. "Algorithmic infrastructure amplifies whatever is most likely to grab your attention," she said. "The business model of the attention economy disproportionately magnifies the most sensational and extreme content, rather than more moderate, balanced voices."
Content moderation for billions of users is no easy task: it remains technically complex and politically controversial. According to Michael McGrath, EU Commissioner for Democracy, Justice, and the Rule of Law, the leading platforms moderated 1.6 billion pieces of content in 2024 alone. "There's no question that freedom of speech must be protected; it's non-negotiable. However, certain measures need to be taken to strike the right balance," he added.
Large-scale content moderation poses numerous risks. Tirana Hassan, former Executive Director of Human Rights Watch, emphasized that moderation policies must be grounded in human rights principles to prevent their misuse. "The purpose of content moderation should be protection, not silencing. Overregulation could inadvertently pave the way for authoritarian control," she stated. "In content moderation, whether through technical tools or human review, proper investment can both meet legal requirements and remain fully compatible with freedom of speech."
She also emphasized that special attention should be given to protecting vulnerable groups online; in some cases, these individuals are deliberately targeted and exposed, putting them at risk of physical harm or even forced displacement.
UN High Commissioner for Human Rights Volker Türk warned: "Many of these products ultimately end up being used in crisis or conflict situations, and we must take greater responsibility to ensure they are not misused to dehumanize people or incite further violence."
Artificial Intelligence: Amplifier or Antidote?
Meanwhile, tech companies face a paradox. On one hand, generative AI has exacerbated the spread of false and misleading content; on the other, it could also serve as an effective tool for identifying and filtering such material. A key part of the problem stems from platforms' structural design and algorithmic logic: these systems often amplify the social consequences of content dissemination in unexpected ways.
Sasha Havlicek argued that algorithms built on an "engagement first" principle tend to push controversial content rather than truthful or useful information. "Driven by the attention economy, online platforms are not an environment of free speech but a highly curated content presentation space: tech giants decide what you see based on the data they collect and their own profit objectives."
Even more concerning, the problem extends beyond the platforms themselves. According to the 2025 Edelman Trust Barometer, 40% of people globally said they approve of "radical action," including attacking others online or deliberately spreading misinformation, whenever such tactics can help advance the social change they support. Among 18-to-34-year-olds, the proportion is as high as 53%. This trend reveals a deepening crisis of trust: more and more individuals are resorting to methods once considered unethical.
Five Priority Areas for Tackling Misinformation
These two sessions at the World Economic Forum's 2025 Annual Meeting not only analyzed the impact of misinformation and the challenges of content moderation on social media, but also outlined directions for future action. Here are five top priorities for tackling the misinformation challenge:
Use transparency as a tool: Platforms should open up their content-promotion mechanisms to external oversight. Sasha Havlicek noted: "Authoritarian regimes combat misinformation with censorship; we must respond with complete transparency. This means independent research institutions should be granted access to data to assess the impact of platform systems on public discourse."
Media literacy education: Education and critical thinking are the most effective "vaccines" against misinformation. Wall Street Journal CEO Almar Latour emphasized: "Helping young people learn to distinguish between accurate and misleading information is crucial, but this education must remain neutral; the core is teaching people how to think independently."
Multi-stakeholder governance: Governments, tech companies, and civil society should jointly establish rules that safeguard both freedom of speech and public safety. Moderation should serve to protect, not suppress. Without transparency, we risk replacing misinformation with another form of information distortion, in which a very small number of people decide what the public gets to see. Türk added: "We need open public spaces where discussions can be grounded in facts, evidence, and science, spaces that help us identify the policy solutions the world urgently requires."
Ethical technology design: Platforms should put user safety first, as nearly 50% of children may accidentally come across harmful content. "For users under 16, we have built a 'positive corner' of the internet. This is a project we take extremely seriously, aimed at ensuring online safety for the most vulnerable groups," said Wanji Walcott, Head of Legal and Business Affairs at Pinterest. Iain Drennan of the WeProtect Global Alliance also emphasized the urgency of protecting minors, calling for "safety-by-design standards and regulatory mechanisms."
A balanced content moderation mechanism: Effective moderation requires a precise and measured approach, one that both safeguards freedom of speech and prevents harm. "Freedom of speech is not absolute. To protect the most vulnerable groups, certain restrictions must be imposed. Platforms have a responsibility to ensure that their business models, policies, and technologies comply with international human rights standards," Tirana Hassan pointed out.
The truth doesn’t self-censor.
In combating misinformation, journalists and civil society organizations have a critical role to play: they can counter false information through investigative reporting, fact-checking, and public education. Almar Latour particularly emphasized that maintaining editorial independence and transparency is crucial for rebuilding public trust in the media. "Some argue that misinformation played no significant role in the past year, particularly in elections. Such a claim is utterly absurd, nothing but a blatant attempt to mislead," he added.
Open debate is the lifeblood of democracy. Rebuilding trust hinges not on reaching consensus, but on fostering an environment where diverse perspectives can coexist peacefully. The struggle for truth is not about "winning"; it is about sustaining a space for rational dialogue, free of hostility.
The above content solely represents the author's personal views. This article is translated from the World Economic Forum's Agenda blog; the Chinese version is for reference only. Feel free to share it in your WeChat Moments; to republish, please leave a comment at the end of the post or on our official account.
Editor: Wan Ruxin
The World Economic Forum is an independent and neutral platform dedicated to bringing together diverse perspectives to discuss critical global, regional, and industry-specific issues.
Follow "World Economic Forum" on Weibo, WeChat Video Channels, Douyin, and Xiaohongshu.