Launched in 2001, Wikipedia turns 25 this year. It is now among the most visited websites globally and is often described as a digital public utility. In an exclusive, wide-ranging interview on the sidelines of the Jaipur Literature Festival, Wales spoke at length about political criticism, declining trust, neutrality in an age of noise, and the growing temptation to outsource knowledge to artificial intelligence.
Wikipedia has increasingly been drawn into public and political disputes, but Wales said this is not a new development.
“For the whole life of Wikipedia, there is always some noise and controversy and all this,” he said. “The main way we handle it is just stick to our work.”
That work, according to Wales, is focused on process rather than reaction. “The Wikipedia community is very passionate about quality, about neutrality, about getting it right,” he said. “We spend our time mainly debating about sources.” Wales said criticism is inevitable. “It’s impossible to please everyone,” he said. “So for us, we just try and stay calm and carry on.”
Truth in a ‘post-truth’ world
Asked whether we live in a “post-truth” world, Wales disagreed. “I don’t actually think we live in a post-truth world,” he said. “That’s a common phrase. But I think truth is still incredibly important.”
He added that public reactions to misinformation suggest people still care deeply about honesty. “I think the public is very passionate about not being lied to and still don’t like hypocrisy and lying,” Wales said.
On how Wikipedia understands neutrality, he said the idea is generally well understood. “Most people understand the idea,” he said. “If there are competing narratives, you have to describe them. Don’t get involved in them.”
Wales acknowledged that surveys show a decline in trust across institutions. At the same time, he cautioned against constant doubt. “You can’t be endlessly sceptical,” he said. “There are facts, and you have to get on with life.”
He said it is reasonable to pause judgment. “Usually you should be cautious and really think and reserve judgment and say, ‘Oh, I need to understand more before I make a full decision,’” he said.
The Seven Rules of Trust
Asked which of the seven rules from his book, The Seven Rules of Trust (2025), is hardest to apply, Wales pointed to transparency. “I think one of them is the rule on transparency,” he said. “And in fact we write it in such a way to acknowledge the issue, be transparent, especially when you have something to hide.”
He said this is particularly difficult for organisations.
He described situations where transparency requires admitting failure. “It’s quite hard to just say, actually, you know what, we screwed up,” he said. “This isn’t good enough and we want to do better.”
Acknowledging mistakes publicly, he said, is uncomfortable. “It’s embarrassing and it’s difficult,” Wales said. “But it’s worth it for building trust.”
Wales said he finds artificial intelligence interesting but remains cautious. “I’m fascinated by it,” he said. “I’m fascinated with how bad it is, but also how good it is.” He said the Wikipedia community is considering limited uses of AI to support human work. “In the future the community will find AI useful in some small ways to help us with the work,” he said.
He gave an example related to sourcing. AI, he said, could help narrow down relevant sources so editors do not have to read large amounts of irrelevant material. “Then she [the editor] could spend more of her time doing what only a human can do,” Wales said, “which is that evaluation and judgment.”
However, he drew a clear boundary around content creation. “We don’t anticipate using AI to write Wikipedia entries,” he said. “It’s not good enough and it’s too dangerous.”
Wikipedia versus Grokipedia
Wales was openly critical of encyclopaedias built entirely using AI. “AI hallucinates,” he said. “It gets a lot of things wrong.” Referring to Elon Musk’s AI project, Grokipedia, Wales said some entries were copied directly from Wikipedia, while others were deeply flawed. “Other entries have serious errors, political bias,” he said.
He cited a specific example involving his own family. “There’s an entry about my wife,” Wales said, “which is kind of interesting because it just goes on and on and on.”
He said the entry began by noting she was a private person and then veered into speculation. “It starts talking about that she worked for the Labour Party and they were always very secretive,” he said. “It’s like, what is all this about?”
“It’s complete nonsense,” Wales said. “It’s just like the ramblings of someone on drugs or something.” “That’s what you would expect from AI at this stage,” he added. “It isn’t really an encyclopaedia article. It’s just random text.”
Wales said one of the biggest differences between Wikipedia and AI systems is transparency.
“With Wikipedia, it’s so transparent,” he said. “If there’s a bias, you can go and talk to the people and say, ‘Why did you write it this way?’”
“You see all the history and so on,” he said. “With AI, you don’t know. It’s all mysterious.”