Democracy needs information integrity

Democracy without trust: what happens when the public sphere collapses?
While the institutional framework of democracy, such as elections and parliaments, remains formally intact, a deeper erosion of democracy is occurring across the globe. The core foundations of a democratic society, based on trust, transparency, and a shared understanding of facts, are visibly weakening in the age of information technology. Societies are becoming polarised, confidence in institutions is diminishing, and public discourse has lost any resemblance to constructive dialogue. What is more, the most significant threats to democratic integrity often arise not solely from foreign adversaries or extremist factions, but also from governments, politicians, and institutions that obscure, distort, or weaponize information for political purposes.
Whether in high-level panels or in more informal settings, such as hikes or lunches, this common thread ran through the entire debate in the Democracy track at EFA 2025. Concerns about the spread of disinformation, eroding trust in institutions, and the fragile state of democratic discourse, coupled with the disruptive role of big technology platforms in the information ecosystem, surfaced repeatedly and dominated the debate, always pointing to the same underlying concern: what happens to democracy when truth itself becomes contested ground? What happens if the public sphere no longer exists? What is the role of big technology companies in this phenomenon? What political, social, legal, or economic remedies should be tested to prevent a disruption of the workings of our democracies?
EFA’s discussions revealed a persistent and widespread awareness of a disconnect between citizens and those in power, who are expected to act in the public interest and listen to citizens’ concerns. A key foundation of democratic systems is freedom of expression, which encompasses media pluralism, media diversity, and information integrity. Every citizen has the right to access diverse and accurate information, as this is essential for making informed decisions that reflect their interests, especially regarding politics and elections, though the principle applies equally to all other areas of our lives. Furthermore, the media should represent society, especially in matters of diversity and inclusion.
Disinformation is not a new phenomenon, but recent years have witnessed a rapid surge in its volume, pace of dissemination, and reach, largely fueled by advances in digital technologies and distribution on social media platforms. This increase has happened in parallel with the surge of populism. The populist leader claims to interpret the single voice of the people and underlines the contrast between the people and the élite. Within this narrative, science and the media are viewed with suspicion and mistrust: they come with elaborate and sometimes difficult-to-accept findings, while the populist leader speaks to people’s gut with emotionally charged and simple language. This resonates with a quote by Hannah Arendt (in Crises of the Republic – Lying in Politics): “Lies are often much more plausible, more appealing to reason than reality, since the liar has the great advantage of knowing beforehand what the audience wishes or expects to hear. He has prepared his story for public consumption, with a careful eye to making it credible, whereas reality has the disconcerting habit of confronting us with the unexpected, for which we were not prepared”. The issue of disinformation has garnered considerable attention, ranking fourth in the 2025 World Economic Forum Global Risk Report, just behind societal polarization, and third in the 2024 United Nations Global Risk Report. The implications of disinformation are profound: it threatens the very fabric of democratic societies and undermines public trust. Without trust, democracy itself is at risk.
Online platforms, including social media and search engines, give citizens greater access to information than was possible before their emergence. They also allow various sources of information, including user-generated content, to be part of this broader offering. As is widely known, online intermediaries introduced content moderation algorithms to manage the enormous amount of content published online every day. However, as was stressed in many discussions in Alpbach: i) algorithms are not neutral; ii) online platforms and search engines are driven by profit, not by the public interest; iii) algorithms reward what makes content viral, resulting in increased engagement which is monetised, profiting, among other things, from negative emotions such as the fear, anger and frustration of users. In parallel, advertising – and advertising revenue – shifted from legacy to online media, dramatically disrupting traditional business models, especially for the press, with direct consequences for the sustainability of free media and journalism, and, ultimately, for the quality of the democratic debate. Moreover, with access to information becoming free online and competing in the attention economy, the quality and accuracy of traditional media became less attractive.
At the European Forum Alpbach 2025, intense discussions were devoted to information integrity from two main angles: i) understanding how disinformation works; ii) imagining, debating and working on solutions that curb the spread of disinformation while respecting fundamental rights, including freedom of expression and in particular freedom of expression online. These solutions mostly included regulatory measures, building on what the European Union has already deployed, but also relied on proposals for new structural measures to enhance the resilience of the European technological industry.
A system of influence: actors and incentives behind the information integrity erosion
Different actors, dynamics and incentives, which often profit from each other, are at the core of the erosion of information integrity in the EU.
The more money is invested in disinformation campaigns and infrastructures, the more present, constant and diffuse the phenomenon becomes. A clear example, though not the only one, is Russian state-backed disinformation, which has intensified in recent years following Russia’s full-scale invasion of Ukraine. Indeed, as was highlighted during the Forum, the more vocal the EU is on Ukraine, the more Russian disinformation escalates in response, fuelling polarisation in the EU member states. In this context, information functions as an extension of conventional conflict, with coordinated influence operations deployed to shape public perceptions, manipulate narratives, and erode societal resilience across Europe. The ultimate aims are to create doubts and divisions within the EU, to exhaust people with a vast array of disinformation narratives and to trigger emotions such as fear, panic, anger and uncertainty. Campaigns are also becoming increasingly targeted, going hyperlocal to leverage visceral reactions. The techniques, tactics and procedures behind disinformation strategies, especially state-backed ones, are varied, sophisticated and interlinked. Bots, trolls, proxies, influencers and disinformation websites opened for the sole purpose of being picked up by generative AI training models to alter their outputs all work towards a clear agenda and purpose.
Disinformation is, however, also domestic and is often shared for purely economic purposes. Moreover, it can also affect the private sector: it can either directly target private companies or hit them as a consequence of the content of its narratives. As a result, companies have to build dedicated crisis teams that include online data analysts, experts mapping flows and narratives online and testing the algorithms, strategic communications experts, and others. In addition, phenomena like astroturfing, the deceptive practice of hiding the sponsors of an orchestrated message, give the impression of grassroots positions on a range of topics, including those affecting society globally, such as climate change, migration, and diversity and inclusion.
All such malign activities leverage and profit from the design and functioning of online platforms, search engines and GenAI tools.
Content triggering negative emotions such as fear and anger is rewarded by algorithms: it goes viral more easily and is therefore attractive to advertisers and generates profit. Consequently, hate speech and disinformation nudge those emotions and travel wider and faster online.
Moreover, an element widely reported in most of the conversations in Alpbach, online platforms create addiction by stimulating the dopamine response, thereby ensuring that users spend more time online and access that emotionally charged and outrageous content. Content is also clustered so that material similar to our views is given preference by algorithms, placing us in echo chambers. Emotions in those echo chambers get amplified, the fear of being excluded from the community is ever-present, and online behaviour becomes extremely visceral. Indeed, we act differently online compared to offline because we experience different biological mechanisms. Online behaviours can then have very concrete consequences offline, such as hate crimes, discrimination and violent extremism.
These trends are exacerbated by the rapid development of Generative AI tools, which are becoming increasingly common, affordable, and easy to use. By enabling the automated production of text, images, audio, and video at scale, such systems further accelerate the volume and speed of online content creation. This lowers traditional barriers to producing persuasive or seemingly authentic information, making it even easier to shape narratives, influence public debate, or spread misleading material. As a result, the information ecosystem becomes more complex, and distinguishing credible sources from fabricated content grows progressively more challenging. General-purpose GenAI tools, ChatGPT and the like, are trained on vast amounts of data, much of it found for free online. As such, their output often comes with biases and hallucinations. This is exploited by malign actors, who place disinformation and propaganda online so that it is picked up by the training models of GenAI tools and alters the output those tools provide to users.
Supporting information integrity and respecting fundamental rights in the online sphere
Following the identification and explanation of the problem, the debates also focused on solutions. As is often repeated, there is no silver bullet that solves the issue; rather, different solutions need to be combined. As was noted during the debates, the combination of these different activities must go hand in hand with the enforcement of the regulatory framework.
Online platforms and search engines are central to the debate because their infrastructures can be easily manipulated for harmful purposes. This issue is often compounded by the reluctance of the platforms themselves to change their design and functionality. The reason for this resistance is that the current system is highly profitable for them.
Regulations like the EU Digital Services Act (DSA) or soft-law measures such as the UNESCO Guidelines for the Governance of Digital Platforms represent valuable efforts to mitigate these risks. Their rationale and approach focus on holding online platforms accountable and on implementing risk assessment and mitigation measures, particularly for very large operators. Therefore, when addressing content that is not inherently illegal, such as disinformation, the regulatory intervention targets the risks associated with the design and operation of online platforms and search engines, rather than the content itself. Focusing on risks is also seen as a methodology that should keep the regulatory framework fit for purpose even in a constantly evolving digital and technological environment. As a relatively new tool, the DSA will need to be assessed in a few years’ time. Discussants argued that there should be more homogeneity between the risk assessment methodologies of the platforms and those of the auditors, and that enforcement of this regulation is key, also to show the strength of the EU on the global chessboard (particularly towards the US). Access to data (both public and private) was also highlighted as a fundamental asset for better understanding the phenomenon and the risks stemming from the design and functioning of large online platforms and search engines. When disinformation campaigns are attributed to states, institutions like the EEAS, NATO, secret services, ministries of defense and interior and various agencies need to collaborate, as the issue goes beyond tech regulation.
In a thought-provoking discussion in Alpbach, most participants opposed an interpretation of freedom of expression in absolute terms, as suggested by the First Amendment to the US Constitution, under which the law has very limited scope to restrict speech, even if it is offensive or controversial. The discussion addressed whether the increasing power of private online platforms and the sheer amount of disinformation distributed online can justify regulation that demands more transparency and accountability from online platforms, including content moderation under the umbrella of Article 10 of the European Convention on Human Rights, in the name of protecting other individual rights as well as social cohesion and civic and democratic discourse. The debate spanned the contemporary paradox, from whether free speech is the right to say and claim whatever we want, to whether freedom of speech needs the protection of regulatory guardrails. This is a very heated discussion that has recently turned into a geopolitical issue, a divisive topic in US-EU relations. It was also observed that free speech is not free reach: while freedom of expression is a fundamental right of citizens, the same does not apply to algorithms, bots, trolls and GenAI tools.
When talking about disinformation, there is a natural tendency to focus on content, as it is the first thing that is seen and debunked. However, it is important to then move the focus to the tactics, techniques and procedures (TTPs), as well as to the actors behind them, and to link these to the manipulation of online platforms and search engines. This makes it possible to picture the entire ecosystem and the dynamics behind the diffusion of content, and to elaborate tailored and accurate counter-strategies.
The effects of platform regulation are often difficult to measure, and regulation alone may not be very effective. During the discussions, the idea of transforming and regulating online platforms and search engines, the entities that control our information infrastructure, into public utilities was explored. Preserving information integrity is essential for democracy, making it a public interest issue; currently, however, these online platforms primarily serve private interests. Transitioning them to public utility standards, similar to those applied to energy, water, waste, and telephone services, could enhance accountability. While this conversation requires further exploration, there was a general consensus in Alpbach that it merits being advanced.
On the side of users and consumption, media and information literacy (MIL) is also seen as a key asset, as long as it is a lifelong learning process. Education is a competence of the Member States; however, EU guidelines and harmonisation efforts are seen as important. Beyond such common principles, in order to be impactful, MIL initiatives must go local: they should be run in local languages and take into account the urban versus rural dimension, as well as the traditions, history, media diet and socio-economic situation of the different countries and target audiences.
Independent fact-checking is also seen as an important ingredient in fighting disinformation. In addition to providing context to content appearing online, it can help for pre-bunking and inoculation strategies as well as to build archives of debunked content. Open-Source Intelligence (OSINT) work complements fact-checking and helps in getting an understanding of TTPs and in attribution.
Addiction to online services was a frequently discussed topic in Alpbach. It is essential to address this issue, as it intersects with various sectors, including health. Addiction influences the brain’s dopamine response and affects critical skills and contextual understanding. Additionally, it has a significant social impact, often leading to isolation. It is important to acknowledge that for some people, leaving social media can be a privilege, particularly for those with strong offline connections and opportunities. Therefore, regulating addiction should be coordinated alongside other social policies. In 2023, the European Parliament called for regulations addressing addiction.
 
Structural conditions to boost the European information ecosystem. A call for digital sovereignty
The Forum also discussed structural solutions aimed at reshaping the information ecosystem and digital markets from a geopolitical perspective. A key point of discussion was the growing need to reduce dependence on U.S.-based platforms, and participants highlighted the importance of investing in European digital infrastructures. This shift is deemed essential, especially as US tech giants strengthen their connections with the executive branch, which poses challenges not only to the global information system and democratic governance but also to financial stability, e.g. through cryptocurrencies. Among the proposals discussed was the creation of a sovereign tech fund, intended to safeguard European digital autonomy and enhance the continent’s strategic resilience. Participants at the Forum also emphasized that quality journalism is a cornerstone of any healthy democracy. However, it cannot thrive in the current information ecosystem without support and adaptation. Comprehensive training programs are essential to help journalists navigate evolving technologies, combat disinformation, and engage audiences in increasingly fragmented media landscapes.
Additionally, sustained and strategic investment in newsrooms, both local and transnational, by private capital and public funding (with all necessary considerations for the public funds distribution process) is crucial to ensure that independent journalism can continue to serve the public interest, hold power accountable, and provide accurate, nuanced reporting in an age of information overload.
Equally important is the urgent need to rethink journalism’s underlying business model. Traditional revenue streams, especially advertising, have been severely disrupted by digital platforms. This disruption prompts a reevaluation of how journalism is funded, distributed, and consumed. Developing sustainable models that align with the realities of the digital information economy is not just an economic necessity; it is also a democratic imperative.
 
Conclusions. Information integrity as a pillar of democratic resilience: challenges and proposals
The discussions in the Democracy Track at the European Forum Alpbach 2025 clearly emphasised that information integrity is not just an ancillary issue; it is a fundamental pillar of democratic resilience. However, this pillar is currently facing significant and ongoing challenges. While democratic structures may still exist on paper, their foundations (trust, transparency, and shared factual standards) are increasingly being undermined by a volatile, profit-driven, and easily manipulated information ecosystem and by a technological infrastructure that is mostly in the hands of non-EU companies.
There is general agreement that online platforms and search engines, in particular those with a wide reach, are at the core of the problem: they exploit the information ecosystem for profit and are themselves exploited as mechanisms for disseminating disinformation by different actors, increasingly including politicians. They attract advertising revenue previously directed to traditional media and quality journalism; they favour content featuring negative emotions by deceitfully profiting from known psychological mechanisms; they create addiction and fuel the attention economy, where sensationalism and inaccurate content prevail over reflection and critical thinking; they are not sufficiently transparent in the way they implement their terms and conditions; and their structure is too easy for malign actors to exploit, including in the context of hybrid wars. Therefore, there was a strong call to fully engage the private sector in finding solutions, to properly enforce the DSA so as to make platforms more accountable, and to advance the conversation on categorising these actors as public utilities.
Pre-bunking, debunking, media literacy and similar measures are seen as essential tools in the overall resilience-building strategy. These mitigation measures are key for preparedness but place a burden on citizens. Hence, they must go hand in hand with solving the issue at a structural level by enforcing the regulatory framework.
Due to their crucial role in providing access to information, facilitating democratic participation, and supporting public debate, major online platforms should be considered for classification as public utilities. This classification would not mean government control over content, but it would introduce obligations similar to those imposed on essential services. Such a framework would decrease current dependence on privately owned, profit-driven infrastructures and help ensure that the digital public sphere operates with safeguards that protect democratic integrity.
The Forum participants stressed a shared understanding that journalism must evolve to meet the demands of a rapidly changing world. High-quality reporting remains essential to the health of democracy, but it cannot endure without sustained investment, proper training, and a reimagined financial foundation. The collapse of traditional business models has made it clear: a new framework, designed for the realities of the digital age, must be developed. Public funding, ethical private investment, and innovative revenue strategies all emerged as critical components of this transformation.
Last, but not least, the Forum underscored that digital sovereignty is no longer a theoretical ambition but a strategic imperative. Reducing dependence on U.S.-based platforms and reinforcing Europe’s own digital infrastructures were presented not merely as technical choices, but as essential steps toward safeguarding democratic institutions, economic stability, and informational integrity.
 
Disclaimer: the image rights belong to © EFA/Philipp Huber.

The post Democracy needs information integrity appeared first on Centre for Media Pluralism and Media Freedom.