Wikipedia at 25: can its original ideals survive in the age of AI?


Around the turn of the century, the internet underwent a transformation dubbed “web 2.0”. The world wide web of the 1990s had largely been read-only: static pages, hand-built homepages, portal sites with content from a few publishers.

Then came the dotcom crash of 2000 to 2001, when many heavily financed, lightly useful internet businesses collapsed. In the aftermath, surviving companies and new entrants leaned into a different logic that the author-publisher Tim O’Reilly later described as “harnessing collective intelligence”: platforms rather than pages, participation rather than passive consumption.

And on January 15 2001, a website was born that seemed to encapsulate this new era. The first entry on its homepage read simply: “This is the new WikiPedia!”

Screenshot of the Wikipedia homepage in 2001.
Wikimedia Commons

Wikipedia wasn’t originally conceived as a not-for-profit website. In its early phase, it was hosted and supported through co-founder Jimmy Wales’s for-profit search company, Bomis. But two years on, the Wikimedia Foundation was created as a dedicated non-profit to steward Wikipedia and its sibling projects.

Wikipedia embodied the web 2.0 dream of a non-hierarchical, user-led internet built on participation and sharing. One foundational idea – volunteer human editors reviewing and authenticating content incrementally after publication – was highlighted in a 2007 Los Angeles Times report about Wales himself trying to write an entry for a butcher shop in Gugulethu, South Africa.

His additions were reverted or blocked by other editors who disagreed about the significance of a shop they had never heard of. The entry finally appeared with a clause that neatly encapsulated the platform’s self-governance model: “A Wikipedia article on the shop was created by the encyclopedia’s co-founder Jimmy Wales, which led to a debate on the crowdsourced project’s inclusion criteria.”

As a historical sociologist of artificial intelligence and the internet, I find Wikipedia revealing not because it is flawless, but because it shows its workings (and flaws). Behind almost every entry sits a largely uncredited layer of human judgement: editors weighing sources, disputing framing, clarifying ambiguous claims and enforcing standards such as verifiability and neutrality.

Often, the most instructive way to read Wikipedia is to read its revision history. Scholarship has even used this edit history as a method – for example, when studying scientific discrepancies in the development of Crispr gene-editing technology, or the unfolding history of the 2011 Egyptian revolution.
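
For readers who want to inspect this edit history themselves, here is a minimal sketch (my own illustration, not drawn from the studies mentioned above) of how a page’s revisions, with their timestamps, editors and edit summaries, can be pulled through Wikipedia’s public MediaWiki API. The function name and the example article title are placeholders.

```python
# Minimal sketch: fetch recent revisions of a Wikipedia article via the
# public MediaWiki API (https://en.wikipedia.org/w/api.php).
import requests

API_URL = "https://en.wikipedia.org/w/api.php"


def recent_revisions(title, limit=10):
    """Return the most recent revisions (timestamp, editor, edit summary) of a page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }
    # Wikimedia asks API clients to identify themselves with a User-Agent header.
    headers = {"User-Agent": "revision-history-sketch/0.1 (example)"}
    data = requests.get(API_URL, params=params, headers=headers, timeout=10).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])


if __name__ == "__main__":
    for rev in recent_revisions("CRISPR"):
        print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```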

Co-founder Jimmy Wales explains how Wikipedia was created, July 2005. Video: TED.

The scale of human labour that goes into Wikipedia is easy to take for granted, given its disarming simplicity of presentation. Statista estimates 4.4 billion people accessed the site in 2024 – over half the world’s population and two-thirds of internet users. More than 125 million people have edited at least one entry.

Wikipedia carries no advertising and does not trade in users’ data – central to its claim of editorial independence. But users regularly see fundraising banners and appeals, and the Wikimedia Foundation has built paid services to manage high-volume reuse of its content – particularly by bots scraping it for AI training. The foundation’s total assets now stand at more than US$310 million (£230 million).

‘Wokepedia’ v Grokipedia

At 25, Wikipedia can still look like a rare triumph for the original web 2.0 ideals – at least in contrast to most of today’s major open platforms, which have turned participation into surveillance advertising.

Some universities, including my own, have used the website’s anniversary to soothe fears about student use of generative AI. We panicked about students relying on Wikipedia, then adapted and carried on. The same argument now suggests we should not over-worry about students relying on generative AI to do their work.

This comparison is sharpened by the rapid growth of Grokipedia, Elon Musk’s AI-powered rival to Wikipedia (or “Wokepedia”, as Musk dismissively calls the original). While Grokipedia uses AI to generate most of its entries, some are near-identical to Wikipedia’s (all of which are available for republication under Creative Commons licensing).

Grokipedia entries cannot be directly edited, but registered users can suggest corrections for the AI to consider. Despite only launching on October 27 2025, this AI encyclopedia already has more than 5.6 million entries, compared with the English-language Wikipedia’s total of over 7.1 million.

So, if Grokipedia overtakes its much older rival (in scale at least, which now seems plausible), should we see this as the end of the web 2.0 dream, or simply another moment of adaptation?

Credibility tested

AI and the human-created internet have always been intertwined. Voluntary sharing is exploited for AI training with contested consent and thin attribution. Models trained on human writing generate new text that pollutes the web as “AI slop”.

Wikipedia has already collided with this. Editors report AI-written additions and plausible-looking citations that fail on checking. They have responded with measures such as WikiProject AI Cleanup, which offers guidance on detecting generic AI phrasing and other hallmarks of unreliable, machine-generated content.

But Wales does not want a full ban on AI within Wikipedia’s domain. Rather, he has expressed hope for human-machine synergy, highlighting AI’s potential to bring more non-native English contributors to the site. Wikipedia also acknowledges it has a serious gender imbalance, both in terms of entries and editors.

A video made by Wikipedia to mark its 25th anniversary.

Wikipedia’s own credibility has regularly been tested over its 25-year history. High-profile examples include the John Seigenthaler Sr biography hoax, when an unregistered editor falsely linked the journalist to the Kennedy assassinations, and the Essjay controversy, in which a prominent editor was found to have fabricated their academic credentials.

There have also been recurring controversies over paid editing and state-linked conflicts of interest, including the 2012 Wiki-PR case, when volunteers traced suspicious patterns of promotional editing to a single PR firm and hundreds of accounts were banned.

These vulnerabilities have seen claims of political bias gain traction. Musk has repeatedly framed Wikipedia and mainstream outlets as ideologically slanted, and promoted Grokipedia as a “massive improvement” that needed to “purge out the propaganda”.

As Wikipedia reaches its 25th anniversary, perhaps we are witnessing a new “tragedy of the commons”, where volunteered knowledge becomes raw material for systems that themselves may produce unreliable material at scale. Ursula K. Le Guin’s novel The Dispossessed (1974) dramatises the dilemma Wikipedia faces: an anarchist commons survives only through constant maintenance, while facing the pull of a wealthier capitalist neighbour.

According to the critical theorist McKenzie Wark: “It is not knowledge which is power, but secrecy.” AI often runs on closed, proprietary models that scrape whatever is available. Wikipedia’s counter-model is public curation with legible histories and accountability.

But if Google’s AI summaries and rankings start privileging Grokipedia, habits could change fast. This would repeat the “Californian ideology” that journalist-author Wendy M. Grossman warned about in the year Wikipedia launched – namely, internet openness becoming fuel for Silicon Valley market power.

Wikipedia and generative AI both alter knowledge circulation. One is a human publishing system with rules and revision histories. The other is a text production system that mimics knowledge without reliably grounding it. The choice, for the moment at least, is all of ours.


Vassilis Galanos has received funding from the University of Edinburgh and the University of Stirling. He is affiliated with the Hype Studies group, the AI Ethics & Society network, and We and AI.


