Google’s search engine is no longer just a search engine. Since Google began integrating the AI Overviews feature, it is also an AI chatbot that answers users’ questions without their clicking on a single link. The problem is that the underlying technology, large language models, works probabilistically, so they tend to invent answers when they are not sure how to respond.
A credible lie. This time, an invented response from AI Overviews could end up putting Google in the dock. The plaintiff is Wolf River Electric, a Minnesota solar energy company. And the origin of the case is, forgive the redundancy, a lawsuit that never existed.
According to the energy company’s lawyers, searching Google for “lawsuits against Wolf River Electric” prompted its AI to respond with defamatory claims. They cite a case in which AI Overviews replied that Wolf River Electric had been sued by the Minnesota attorney general for “deceptive sales practices”, such as lying to customers about how much they would save and tricking homeowners into signing contracts with hidden fees.
The AI presented the case with total confidence, naming four of the company’s executives: Justin Nielsen, Vladimir Marchenko, Luka Bozek and Jonathan Latcham, and even displaying a photo of Nielsen alongside the false accusations. To support its claims, the AI cited four links: three news articles and a statement from the attorney general. However, none of those links mentioned any lawsuit against Wolf River Electric.
It is not the first time. This type of error is known as a “hallucination”, and it is very common in language models because of the way they weave their responses by predicting each following word, sometimes dragging an initial error along until it becomes a credible lie with all kinds of invented ramifications, as in the game of telephone.
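The mechanism can be sketched with a toy model. The sketch below is purely illustrative: the vocabulary, transition probabilities, and helper names are all invented, and real LLMs use neural networks over enormous vocabularies rather than a lookup table. What it does show is the autoregressive loop the paragraph describes: each word is sampled conditioned on the previous one, so a single early misstep steers every word that follows.

```python
import random

# Toy next-word model. Every word and probability here is invented
# for illustration; the point is only the sampling loop below.
MODEL = {
    "the":     [("company", 0.6), ("lawsuit", 0.4)],
    "company": [("won", 0.7), ("was", 0.3)],
    "lawsuit": [("was", 1.0)],
    "was":     [("sued", 0.9), ("fined", 0.1)],
}

def generate(start, rng):
    """Sample one word at a time; each choice conditions the next.

    If the model happens to pick "was" instead of "won", the
    continuation drifts toward "sued" -- an early probabilistic
    choice cascades into a confident but different claim.
    """
    words = [start]
    while words[-1] in MODEL:
        candidates, weights = zip(*MODEL[words[-1]])
        words.append(rng.choices(candidates, weights=weights)[0])
    return " ".join(words)

print(generate("the", random.Random(0)))
```

Because sampling is stochastic, different seeds yield different sentences from the same model, which is also why the same query can produce different AI answers on different occasions.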
When Google began integrating AI Overviews into its search engine, it had to withdraw the feature from some searches, especially recipes and nutrition, because it recommended adding glue to pizza or eating one stone a day to stay healthy.
One answer per question. Wolf River Electric says that, because of what they read in AI Overviews, several clients canceled contracts worth up to $150,000. The problem is that AI Overviews responses are personalized: they are generated on the fly, so they can vary from one query to the next.
That doesn’t worry Wolf River Electric’s lawyers, because they know it can happen again. “This lawsuit is not just about defending our company’s reputation; it’s about defending fairness, truth and accountability in the era of artificial intelligence,” says Nicholas Kasprowicz, the company’s legal counsel.
David against Goliath. The case was filed in March in a state court and has just been moved to a US federal court. It may end up setting a precedent on whether a technology company must answer for the disinformation its AI generates. The answer to that question could mark a turning point for AI companies, which have long tried to avoid responsibility for the output of their language models.
Google, in its defense, described the incident as a harmless mishap. “The vast majority of our AI Overviews are accurate and helpful, but as with any new technology, errors can occur,” a company spokesperson said. Google says it acted quickly to fix the problem as soon as it became aware of it, in line with its recent efforts to let users correct the AI’s mistakes.
Image | Google
In Xataka | Google’s AI advises using glue on pizza cheese. The source is an 11-year-old Reddit comment
It was originally published in Xataka by Matías S. Zavia.