Wikipedia:Wikipedia Signpost/2025-05-14/In the media
Wikimedia Foundation sues over UK government decision that might require identity verification of editors worldwide
The BBC [1], The Guardian [2], and The Verge [3] cover an announcement by the Wikimedia Foundation that it is bringing a legal challenge against new regulations under the United Kingdom's Online Safety Act 2023, which, according to the Foundation, "could place Wikipedia as a 'Category 1 service' — a platform posing the highest possible level of risk to the public."
As reported by The Guardian,
The foundation said it was not challenging the act as a whole, nor the existence of the requirements themselves, but the rules that decide how a category 1 platform is designated.
Those rules were set in secondary legislation by the technology secretary, Peter Kyle. The foundation is challenging Kyle’s decision to proceed with that statutory instrument, via a judicial review, where a judge reviews the legality of a decision made by a public body, at the high court of England and Wales.
In a separate Medium post, Wikimedia Foundation lead counsel Phil Bradley-Schmieg explained the concerns about the possible classification of Wikipedia under category 1 in more detail:
There are many OSA Category 1 duties. Each one could impact Wikipedia in different ways, ranging from extraordinary operational burdens to serious human rights risks. [... T]he law’s impact would extend far beyond the UK.
The Category 1 “user verification and filtering” duties are a good example. [...] Sophisticated volunteer communities, working in over 300 languages, collectively govern almost every aspect of day to day life on Wikipedia. Their ability to set and enforce policies, and to review, improve or remove what other volunteers post, is central to Wikipedia’s success, notably in resisting vandalism, abuse, and misinformation. [...]
However, if Wikipedia is designated as Category 1, the Wikimedia Foundation will need to verify the identity of Wikipedia users. That rule does not itself force every user to undergo verification — but under a linked rule (s.15(10)(a)), the Foundation would also need to allow other (potentially malicious) users to block all unverified users from fixing or removing any content they post. This could mean significant amounts of vandalism, disinformation or abuse going unchecked on Wikipedia, unless volunteers of all ages, all over the world, undergo identity verification.
Although the UK government felt this Category 1 duty (which is just one of many) would usefully support police powers “to tackle criminal anonymous abuse” on social media, Wikipedia is not like social media. Wikipedia relies on empowered volunteer users working together to decide what appears on the website. This new duty would be exceptionally burdensome (especially for users with no easy access to digital ID). Worse still, it could expose users to data breaches, stalking, vexatious lawsuits or even imprisonment by authoritarian regimes. Privacy is central to how we keep users safe and empowered. Designed for social media, this is just one of several Category 1 duties that could seriously harm Wikipedia.
Bradley-Schmieg also detailed how some longstanding Wikipedia features might contribute to its being classified as such a high-risk social media website, due to what he called "especially broad and vague" criteria in the categorization rules challenged by the WMF:
To avoid any risk of loopholes, and due to limited research, the Categorisation Regulations were left especially broad and vague. They have no real connection to actual safety concerns. They were designed around three flawed concepts:
- Definition of content recommender systems: Having any “algorithm” on the site that “affects” what content someone might “encounter”, is seemingly enough to qualify popular websites for Category 1. As written, this could even cover tools that are used to combat harmful content. We, and many other stakeholders, have failed to convince UK rulemakers to clarify that features that help keep services free of bad content — like the New Pages Feed used by Wikipedia article reviewers—should not trigger Category 1 status. Other rarely-used features, like Wikipedia’s Translation Recommendations, are also at risk.
- Content forwarding or sharing functionality: If a popular app or website also has content “forwarding or sharing” features, its chances of ending up in Category 1 are dramatically increased. The Regulations fail to define what they mean by “forwarding or sharing functionality”: features on Wikipedia (like the one allowing users to choose Wikipedia’s daily “Featured Picture”) could be caught.
[...]
As a result, there is now a significant risk that Wikipedia will be included in Category 1, either this year or from 2026 onwards.
The Verge highlighted that
Wikimedia says it has requested to expedite its legal challenge, and that UK communications regulator Ofcom is already demanding the information required to make a preliminary category 1 assessment for Wikipedia.
The BBC noted that
It's thought this is the first judicial review to be brought against the new online safety laws - albeit a narrow part of them - but experts say it may not be the last.
"The Online Safety Act is vast in scope and incredibly complex," Ben Packer, a partner at law firm Linklaters, told the BBC.
The law would inevitably have impacts on UK citizens' freedom of expression and other human rights, so as more of it comes into force "we can expect that more challenges may be forthcoming", he told the BBC.
However, Packer seemed skeptical of the Foundation's chances of prevailing in court:
"Typically, it is difficult to succeed in a judicial review challenging regulations," he told BBC News.
"Here, Wikimedia will be challenging regulations set by the Secretary of State on the advice of Ofcom, after they had conducted research and consultation on where those thresholds should be set," he pointed out.
See also previous Signpost coverage:
- "Legal status of Wikimedia projects "unclear" under potential European legislation" (Special report, 2023-02-04)
- "Foundation and British chapter launch last-ditch attempt to get Wikipedia exempted from the UK's sweeping 'Online Safety Bill'" (News and notes, 2023-07-03)
– H
The Indian Supreme Court overturns takedown order
"India Supreme Court reverses content takedown order against Wikipedia operator", according to Reuters, on an earlier High Court ruling that forced the Wikimedia Foundation to take down the Wikipedia article about the court case about ANI's accusation of defamation in the article Asian News International. The article Asian News International vs. Wikimedia Foundation has been reinstated by office action. Indian Express reports that
A bench of Justices A S Oka and Ujjal Bhuyan said, "It is not the duty of the court to tell the media to delete this and take that down… Both the judiciary and the media are the foundational pillars of democracy, which is a basic feature of the Constitution". "For a liberal democracy to thrive, both should supplement each other."
The original defamation case will still be decided by the High Court unless that decision is also overruled by the Supreme Court. – B, Sb
See also this issue's "News and notes"
First a left, then three rights, then come to a full and complete stop
The headlines of two stories appearing on the same day, from different political perspectives, are telling: DC Prosecutor Ed Martin Goes After Wikipedia For Exercising First Amendment Rights (Above the Law) followed by Wikipedia Nonprofit Status Under Scrutiny From US Justice Department Amid Claims of Systemic Anti-Israel Bias (The Algemeiner) [emphasis added by The Signpost].
The next day brought Lawmakers press Wikipedia to clarify and enforce editorial oversight to prevent anti-Israel bias (Jewish Insider) and Members of Congress call on Wikipedia to curb its antisemitism (Arutz Sheva).
In the meantime, the Interim US Attorney for the District of Columbia, Ed Martin, lost his position, and President Trump named Jeanine Pirro as his replacement for the permanent position (according to The New York Times and two more Above the Law articles).
– Sb
$100 million and change
The American MacArthur Foundation announced the five finalists in its 100&Change grant competition, which will award a $100 million grant to a single project. This video shows Jimmy Wales and Denny Vrandečić presenting the Wikimedia Foundation's candidate. Vrandečić's long-term effort has already helped to produce Wikifunctions and Abstract Wikipedia, as he has explained on Diff. All five finalist presentations are shown in this video.
In brief
- WMF AI strategy "puts humans first": Wikipedia says it will use AI, but not to replace human volunteers, reports TechCrunch, summarizing an April 30 WMF blog post that was also syndicated by Yahoo! Finance
- "Wikipedia and the Politics of Knowledge": A podcast from The Tel Aviv Review includes journalist Omer Benjakob who states "Wikipedia is the defining source of knowledge in the digital age."
- Podcast explains: host Sean Rameswaram interviews Wikipedia beat journalist Stephen Harrison on the Vox podcast Today, Explained (26 minutes).
- Seeking CEO: Axios broke the news that the Wikimedia Foundation is seeking a new CEO. In a second article, Axios interviews Maryana Iskander, who is expected to remain CEO until January.
- Jay-Z on Wikipedia: "Jay-Z Says Rape Accuser's Attorney Edited Wikipedia Pages to Harm His Reputation" in Billboard. See Wikipedia:Wikipedia Signpost/2025-05-14/Disinformation report for more.
- Order in the Irish court! The Irish Times reported (archive) on the second follow-up to a controversial 2022 academic study of how publishing Wikipedia articles about Irish court cases affects how often those cases are cited in later judgments. See earlier Signpost coverage of the original paper.
- I fought for knowledge freedom and all I got was this lousy designer t-shirt: from Women's Wear Daily.
- A good overview: ABC News (Australia) published a long-form article by Rhiannon Stevens titled Can Wikipedia survive the rise of AI and Trump?. Though the U.S. president doesn't actually appear in the article, it's a fine, in-depth look at Wikipedia's place in the world that includes quotes from Wiki-journalist Richard Cooke, Associate Professor Heather Ford, and Wikipedia's own Premeditated Chaos.
Discuss this story
I think it's fairly obvious that forwarding and sharing is likely to imply the sort of mechanism used when one shares or forwards a social media post. That doesn't necessarily resolve the issue, because one would need to know what the courts would decide about specific mechanisms (are they "of that sort"). However I hope it moves things a step forward. All the best: Rich Farmbrough 22:44, 14 May 2025 (UTC).
censorship authority telecom regulator - for years, but their words have fallen on deaf ears. Hence, presumably, this Hail Mary lawsuit.) See also this part of the WMF's Medium post: