Artificial intelligence has become an everyday tool for millions of people. Today many use it to write emails, summarize documents or translate texts in a matter of seconds. However, this speed has a less visible side: generative systems can also make mistakes, invent data or alter sources without the user immediately noticing. When these errors appear in one of the largest encyclopedias in the world, the situation changes completely. That is precisely what has happened on Wikipedia with a series of translations carried out with the help of AI.
The opening episode. It all started within the Wikipedia community itself. Some editors began reviewing recent translations and noticed something strange: certain texts included phrases that did not appear in the cited sources, or references that did not seem to match what the article stated. According to 404 Media, these translations were part of a project promoted by an organization that sought to expand the presence of Wikipedia content in different languages, using language models to speed up the process.
When translation invents. As editors examined these translations in more detail, the problems became more evident. One of the cases cited by 404 Media is a draft article about the French noble family La Bourdonnaye. The translated text included a reference to a book and a specific page to explain the family's origin. However, when editor Ilyas Lebleu, known on Wikipedia as Chaotic Enby, checked that source, the cited page turned out to be incorrect. Lebleu added that a quick review of several other translations also turned up swapped references, unsourced phrases, and cases in which paragraphs were added based on material unrelated to the subject of the article.
Published or still in draft? The case also raised a relevant question: whether these errors had made it into already published articles or were caught during the review process. At least one of the problematic examples was identified in a draft translation, which allowed editors to correct it before publication. Based on the available information, however, it is not possible to say how many flawed translations were published and how many remained under review.


Who is behind these translations. This is where the Open Knowledge Association (OKA) comes in, a non-profit organization that claims to work on improving Wikipedia and other open platforms. As the organization itself explains on its website, its model consists of offering monthly stipends to collaborators and translators who work full-time expanding the encyclopedia’s content, “taking advantage of AI (large language models) to automate most of the work.” According to 404 Media, editors who investigated the project concluded that it relied on contractors.
The editors’ response. As more problematic examples surfaced, the Wikipedia community decided to intervene. Editors reviewed how the translation project operated and ended up imposing new restrictions on its participants. OKA-linked translators who accumulate four strikes for unverifiable content within a six-month period may be blocked without further notice if a new case appears. In addition, content added by a translator who ends up blocked may be removed preemptively, unless another editor in good standing takes responsibility for reviewing it.
OKA explains. The organization at the center of the debate also offered its version of events. Jonathan Zimmermann, founder and president of the Open Knowledge Association, told 404 Media that the project’s translators work on an hourly basis and that there is no fixed quota of articles per week. He also admitted that “errors happen,” while maintaining that the system includes human verification and review of sources. Following the discussion on Wikipedia, he added, the organization is introducing a second pass with another AI model to detect possible errors before publication, and is considering adding peer-review mechanisms if necessary.
Images | Oberon Copeland @veryinformed.com | Luke Chesser
