A project to translate Wikipedia articles with AI ended up sneaking in errors and references that didn't add up

Artificial intelligence has become an everyday tool for millions of people. Many now use it to write emails, summarize documents or translate texts in a matter of seconds. This speed, however, has a less visible side: generative systems can also make mistakes, invent data or alter sources without the user immediately noticing. When those errors appear in one of the largest encyclopedias in the world, the situation changes completely. That is precisely what has happened on Wikipedia with a series of translations produced with the help of AI.

The opening episode

It all started within the Wikipedia community itself. Some editors began reviewing recent translations and noticed something strange: certain texts included phrases that did not appear in the cited sources, or references that did not seem to fit what the article stated. According to 404 Media, these translations were part of a project promoted by an organization seeking to expand the presence of Wikipedia content in different languages, using language models to speed up the process.

When the translation invents

As editors examined these translations more closely, the problems became more evident. One of the cases cited by 404 Media is a draft article about the French noble family La Bourdonnaye. The translated text included a reference to a book and a specific page to explain the family's origin. However, when editor Ilyas Lebleu, known on Wikipedia as Chaotic Enby, checked that source, he discovered that the cited page was incorrect. Lebleu added that a quick review of several translations also turned up swapped references, unsourced phrases, and paragraphs added on the basis of material unrelated to the subject.

Published or still in draft?

The case also raised a relevant question: whether these errors had appeared in already published articles or were detected during the review process.
At least one of the problematic examples was identified in a draft translation, allowing editors to revise it before it went live. From the material available here, however, it is not possible to say how many flawed translations were published and how many remained under review.

Who is behind these translations

This is where the Open Knowledge Association (OKA) comes in, a non-profit organization that says it works to improve Wikipedia and other open platforms. As the organization explains on its website, its model consists of offering monthly stipends to collaborators and translators who work full-time expanding the encyclopedia's content, "taking advantage of AI (large language models) to automate most of the work." According to 404 Media, the editors who investigated the project concluded that it relied on contractors.

The editors' response

As more problematic examples surfaced, the Wikipedia community decided to intervene. Editors reviewed how the translation project operated and ended up imposing new restrictions on its participants. OKA-linked translators who accumulate four strikes for unverifiable content within a six-month period may be blocked without further notice if a new case appears. In addition, content added by a translator who ends up blocked may be removed preventively, unless another reputable editor takes responsibility for reviewing it.

OKA explains

The organization at the center of the debate also offered its version of events. Jonathan Zimmermann, founder and president of the Open Knowledge Association, told the outlet that the project's translators work on an hourly basis and that there is no fixed goal of articles per week. He admitted that "errors happen," but argued that the system includes human verification and review of sources.
Following the discussion on Wikipedia, he added, the organization is introducing a second review pass with another AI model to catch possible errors before publishing, and is studying the possibility of adding peer-review mechanisms if necessary.

How to add citations and bibliographic references to a paper or report with artificial intelligence

Let's explain how to add citations and bibliographic references to a paper or report with artificial intelligence. Creating citations with AI is quite easy, and you can do it with ChatGPT as well as with Copilot, DeepSeek, Gemini or any other chatbot.

To create the citations, you only need the link or reference you have used as a source for your work. You then ask the AI to use it, and it will generate the text in different citation formats. Bear in mind that dedicated tools for citing your bibliography are still easier to use, although AI is gradually getting there.

Create citations for university papers with AI

To create citations and bibliographic references in a paper or report, you only need a very simple prompt for the AI. Basically, you ask it to create a citation and add the link or reference used. The prompt we used is as follows:

I want you to create a citation for a university paper from the link/book (reference)

When you write this prompt, you must specify what kind of citation it is: a web link or a book. Then, where we have put (reference) in the prompt, paste the link to the web page or the title and author of the book. That alone is enough. When you do, the AI will create several citations in several citation styles. In fact, it will give you both the reference to put in the bibliography and the way you should add the citation within your text.

You can also ask for specific citation styles. You can specify the style in the main prompt, or ask for it afterwards. You don't even need to repeat the full prompt: after the answer, write something like "and in Harvard style?" Within the same chat, the AI will know you mean citations and will generate one in the format you ask for. You can also add several links to create several citations at once.
For that, tell it something like "I want you to create a citation for a university paper from each of the following links/books", and ChatGPT or your chosen AI will ask you to go ahead, so you can respond with the list of links.
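To make the different citation styles concrete, here is a minimal sketch of the kind of formatting the chatbot performs when you hand it a book reference. The `Book` dataclass and `format_citation` helper are hypothetical names invented for this illustration, not part of any library, and the style rules are simplified to the common book case.

```python
# Minimal illustration of formatting one book reference in several
# citation styles (APA, MLA, Harvard), as an AI chatbot would when asked.
from dataclasses import dataclass

@dataclass
class Book:
    author_last: str
    author_first: str
    title: str
    year: int
    publisher: str

def format_citation(book: Book, style: str = "APA") -> str:
    """Return a bibliography entry for a book in the requested style."""
    if style == "APA":
        # APA: Last, F. (Year). Title. Publisher.
        return (f"{book.author_last}, {book.author_first[0]}. "
                f"({book.year}). {book.title}. {book.publisher}.")
    if style == "MLA":
        # MLA: Last, First. Title. Publisher, Year.
        return (f"{book.author_last}, {book.author_first}. {book.title}. "
                f"{book.publisher}, {book.year}.")
    if style == "Harvard":
        # Harvard: Last, F. (Year) Title. Publisher.
        return (f"{book.author_last}, {book.author_first[0]}. ({book.year}) "
                f"{book.title}. {book.publisher}.")
    raise ValueError(f"unsupported style: {style}")

book = Book("Orwell", "George", "Nineteen Eighty-Four", 1949, "Secker & Warburg")
print(format_citation(book, "APA"))
# Orwell, G. (1949). Nineteen Eighty-Four. Secker & Warburg.
```

Asking the chatbot "and in Harvard style?" is equivalent to calling the helper again with a different `style` argument: the reference data stays the same, only the formatting rules change.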
