We get it; we really do. Lawmakers across the world are rightly focused on regulating powerful, for-profit platforms to mitigate the harms ascribed to social media and other threats online. When developing such legislation, however, some draft laws can inadvertently place public interest projects like Wikipedia at risk. At the Wikimedia Foundation, the nonprofit organization that hosts Wikipedia and other Wikimedia platforms, we have found that when a proposed law harms Wikipedia, it likely also harms other community-led websites, open resources, and digital infrastructure.
That is why we have created the Wikipedia Test: a public policy tool and a call to action to help ensure regulators consider how new laws can negatively affect online communities and platforms that provide services and information in the public interest.
The Wikipedia Test rests on a straightforward central premise:
Before passing regulations, legislators should ask themselves whether their proposed laws would make it easier or harder for people to read, contribute to, and/or trust a project like Wikipedia.
When we say “Wikipedia” in the context of the test, we mean it as a model for the best parts of the internet. Wikipedia can act as a stand-in for other online spaces that are open, privacy-respecting, and enable people around the world to share knowledge that can advance education, development, and civic participation.
This includes projects like: Project Gutenberg, which makes educational and cultural resources freely available; FixMyStreet and its public reporting forums, which let citizens direct their representatives to their concerns; Global Voices and its citizen journalism platforms, which amplify stories left untold by larger news media; and a multitude of data-sharing and code repositories and other digital public goods that help researchers advance our understanding of, and action on, public health, climate change, and the Sustainable Development Goals.
In a nutshell, the Wikipedia Test is a reminder: When regulation fails to account for the various kinds of platforms and services that exist online, the result can be laws that unintentionally harm the very spaces that offer an alternative to the commercial web.
The Wikipedia Test is more than just a safeguard: it is a way to promote a positive vision for the internet. We envision a digital ecosystem where “everyone has easy access to a multilingual digital commons supported by a thriving public domain and freely licensed content.” To get there, policymakers must support online spaces where diverse communities can build and govern knowledge-sharing platforms in their own languages and cultural contexts. The Wikipedia Test helps identify whether a proposal aligns with this future — or undermines it.
As you will see below, the tool itself is a short, easy-to-use rubric designed to help lawmakers, regulators, and public interest advocates ask the right questions. These are the kinds of considerations that define whether a law or regulation protects the knowledge and information that belongs to everyone online — i.e., the digital commons — or, worryingly, threatens it.
Like everything in the Wikimedia ecosystem, the Wikipedia Test is free to access and share. Policy advocates both inside and outside the Wikimedia movement can use the Wikipedia Test to spark better conversations with lawmakers. Regulators can use it to spot potential red flags early in the drafting process. And best of all, it is not a pass-fail assessment: it is an invitation to think more critically, to ask better questions, and to reach out to others who are also concerned about making sure that the internet is the best it can be.
When in doubt, contact the Global Advocacy team at the Wikimedia Foundation. We are here to help assess the impacts of proposed rules and laws, and to work together toward ensuring better outcomes for everyone.
Last but not least, we would love your feedback. Are you a policymaker looking for a clearer path through complex digital questions? Are you an advocate who wants to integrate the Wikipedia Test into your own work? Let us know at globaladvocacy@wikimedia.org.
By working together, we can ensure that the internet remains a space where knowledge can be built and shared by everyone, in every language, everywhere in the world.
The Wikipedia Test
[Free expression] Could the policy increase the legal risks or costs of hosting community-led public interest projects like Wikipedia?
Community-led moderation on platforms like Wikipedia, Reddit, or OpenStreetMap relies on intermediary liability protections, which shield websites and users from legal responsibility for user-generated content (UGC). The best-known example is Section 230 of the US Communications Decency Act (CDA). Weakening these protections could force centralized moderation, undermining crowdsourced models. Proposals to change Section 230 are frequent — we even published a three-part blog series on the issue. The Electronic Frontier Foundation also explains why this matters for Wikipedia.
[Access to information] Could the policy make it harder to access or share information, including works that are freely licensed, protected by copyright, or in the public domain?
A good example of policy supporting access to freely licensed information is the 2021 UNESCO Recommendation on Open Science. It urges governments to reform copyright and data laws to enable open access, public domain use, and collaboration in order to enhance scientific research. This is a strong foundation for legal frameworks that support cocreation and citizen science. The International Federation of Library Associations and Institutions (IFLA) praised it for reinforcing libraries’ roles in equitable access. Implemented well, it ensures public funding results in public knowledge — not paywalled content.
[Privacy and safety] Could the policy potentially threaten user privacy by requiring the collection of sensitive, identifiable information like ages, real names, or contact information of Wikipedia’s volunteer editors and readers?
The UK Online Safety Act (OSA) and Australia’s Basic Online Safety Expectations (BOSE) are examples of laws that threaten user privacy by requiring websites to collect ages or real names. The collection, processing, and retention of such sensitive data increase the risk of a range of privacy harms, including identity theft, surveillance, and harassment. Journalists have reported on how this could undermine Wikipedia’s commitment to anonymity and privacy, potentially making both readers and volunteer editors less willing to access or contribute to Wikipedia.
[Free expression] Could the policy lead to potential surveillance and cause a chilling effect that discourages people from reading or editing Wikipedia?
Electronic mass surveillance, like that conducted by the United States National Security Agency’s (NSA) “Upstream” program, has been legally contested in a number of countries. It is one of many forms of surveillance that can chill freedom of expression by making people afraid to access or contribute to discussions of certain topics — even on an encyclopedia such as Wikipedia.
[Privacy and safety] Could the policy make it riskier for people to access, edit, and share content on Wikipedia by enabling governments to collect identifying information about volunteer editors or readers, leading to intimidation or retaliation?
The United Nations Convention Against Cybercrime is an international treaty that, if widely adopted, could be used by repressive governments to reach across national borders in order to prosecute political enemies, dissidents, and others who challenge authoritarian rule — including Wikipedia editors. Freedom House also explains that the treaty would make it easier for law enforcement agencies to obtain private companies’ electronic records and data, undermining the human rights of people outside of those agencies’ jurisdictions.
[Free expression] Could the policy limit the ability of volunteer editors to govern Wikipedia’s content and guidelines?
A 2021 bill in Chile could have severely threatened community-led models of platform governance. The bill’s one-size-fits-all approach would have imposed content moderation obligations designed for commercial platforms, including preemptive content takedown. This would have undermined the autonomy of volunteer editors in shaping content and guidelines. The CELE noted how such regulations risked threatening rights, chilling participation, and eroding the collaborative nature of websites — such as Wikipedia.
[Access to information] Could the policy restrict the free flow of information across borders, potentially limiting access to Wikipedia and its content?
From 2017 to 2020, the government of Turkey blocked Wikipedia in the country, denying more than 80 million people access to an essential source of information that the rest of the world could still read and contribute to. Freedom of expression and access to reliable information empower people to make better decisions, stay more connected, and build sustainable livelihoods. This violation of human rights was ultimately condemned by the Turkish Constitutional Court, which ruled that access to the encyclopedia had to be restored. Since then, Turkish Wikipedia, which is viewed more than 150 million times a month, has grown by more than 474,000 articles.
Remember: We value your feedback, so please reach out to the Global Advocacy team with any questions, thoughts, and suggestions that you might have about the Wikipedia Test.
Together we can promote and protect the best parts of the internet!
Stay informed on digital policy, Wikipedia, and the future of the internet: Subscribe to our quarterly Global Advocacy newsletter.