The best applications for running local artificial intelligence on your mobile or PC, with no connection needed and greater privacy

Here are the best programs for installing a local AI on your mobile or your computer. This way, you can run artificial intelligence directly on the device itself, without requiring an Internet connection and keeping your private data on the phone or PC. These programs let you install open source LLM language models as alternatives to ChatGPT and company, and even choose how distilled you want them to be. Some of the desktop applications will also let you connect to commercial cloud models such as ChatGPT, Gemini or Claude, although that sacrifices privacy.

In the end, the goal is to have your own private ChatGPT, free of the restrictions of online models, so you can ask anything you want. Local models will be less powerful and less capable, since your mobile or PC does not have enough space for complete models like the commercial ones, but they are good tools for quick questions and tasks.

Apps for running local AI on your mobile

Let’s start with a small compilation of the best applications for running artificial intelligence models on your mobile. Remember that models can take up a lot of space, so keep an eye on your device’s storage.

PocketPal AI

This is the go-to mobile app if you want to install a local AI on your device. It is open source and free, and is available for both Android and iOS. It stands out above all for its ease of use and for the many models available. PocketPal AI integrates directly with Hugging Face, the world’s leading repository for AI models. Thanks to that integration, you can download any model directly without leaving the application and without complicating things.

MNN Chat

This is an Android-exclusive application, and it stands out as one of the fastest for this operating system.
But its biggest differentiator is its complete multimodal support: you can write your prompts combining text, images or audio. You will find all kinds of models, from image generators to text-based assistants. It has an integrated model catalog, so downloading and installing them is very easy. It is a free and open source application.

Private LLM

If you are looking for a premium app for iOS, this is one of the best alternatives. It does have a price: about $5 as a one-time payment. It includes more than 60 curated models and uses advanced quantization to improve performance. It also integrates with Siri and Apple Shortcuts, and in addition to your iPhone it will also work on your Mac or iPad. It offers customizable interactions and supports Family Sharing, so you can share the purchase with your family.

Google AI Edge Gallery

Google AI Edge Gallery is a Google application for Android that lets you install different artificial intelligence models on your mobile. The app covers many uses, from image analysis to audio transcription or AI chats, all locally. It is an open source project; on the downside, it must be said that it is a tool still in development, which means there may be quite a few bugs yet to be fixed.

Locally AI

Another application for the Apple ecosystem, optimized for Apple Silicon processors. It stands out for its beautiful, careful interface adapted to Apple’s design language, and for supporting the main open source models. What this app aims for is the experience of an app like ChatGPT, but with free, offline models, all local. It has local voice, language and vision models, a customizable prompt system, and integration with both Siri and Apple Shortcuts.

AnythingLLM

An Android-exclusive application, although the project aspires to have an iOS version. It supports several small, fast and powerful models.
They do not aim to offer a giant catalog; instead they have hand-picked the models they consider best and most optimized for mobile. The app also offers a default agent mode, so you can use its models to search websites, read pages, interact with other applications or use your location. And if you’re looking for power, you can sacrifice privacy by connecting to one of the cloud models it offers.

SmolChat

We finish with another app for downloading and running popular AI models on Android, locally and offline. All this with an interface adapted to Android and many customization settings. You can also pin your favorite chats to the home screen with shortcuts.

Apps for running local AI on your computer

We now continue with another small collection of applications for downloading artificial intelligence language models directly onto your computer, whether it runs Windows, GNU/Linux or Mac.

Ollama

Ollama is possibly one of the most popular applications for downloading AI models to your computer. Best of all, it is multi-platform: you can install it on Windows, macOS and GNU/Linux. It is also open source and completely free, and has a clean, minimalist interface. That interface is a chat, just like any commercial model, and you have a history of conversations. There is also support for dragging in files, such as PDFs or images, and a search engine for AI models so you can find the one you want in several versions.

Jan

With over five million downloads, this is an amazing tool. You can add open source models or connect proprietary ones like ChatGPT, Claude and company. That versatility makes it an excellent all-in-one, available for Windows, macOS and GNU/Linux. But there’s more: Jan also offers connectors, so you can work with your AI in Gmail, Amazon, Google, YouTube, Google Drive and more. They are also working on a memory system, with all data stored locally.
LM Studio

An open source application with a unified graphical interface, where you can search for and download AI models within the program itself …
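Several of the desktop tools above, Ollama among them, can expose an OpenAI-style chat API on localhost. As a rough illustration of how you would talk to such a server from your own scripts, here is a minimal Python sketch using only the standard library; the endpoint address and model name are assumptions to adjust to your own installation.

```python
# Minimal sketch of querying a local model server from Python, using only
# the standard library. The endpoint URL and model name are assumptions:
# adjust them to whatever your local tool actually exposes.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"  # assumed Ollama-style address

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat payload, the format most local servers accept."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete answer instead of a token stream
    }

def ask_local_model(model: str, prompt: str) -> str:
    """POST the prompt to the local server and return the reply text."""
    data = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        LOCAL_ENDPOINT, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (requires a server running locally):
#   print(ask_local_model("llama3.2", "Explain RAM in one sentence."))
```

The appeal of the local setup is visible in the URL itself: the request never leaves localhost, so your prompts stay on your machine.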

The best search engines in Europe for those who want more ethics and privacy

Here are the best European alternatives to Google and Bing. If you are looking for greater privacy and protection when searching the Internet, choosing a trusted European provider ensures that it is subject to strict European privacy regulations. In total, we bring you six alternatives offering greater privacy and security. Some are environmental or non-profit organizations, others are paid, and some are even open source.

Ecosia

Possibly the most popular European search engine, Ecosia is known for its climate mission: its advertising revenue is used to finance reforestation projects. Ecosia promises to respect your privacy as much as the planet, and only collects the data necessary to offer a quality product. Nothing else. The search engine has an AI search function with smaller, faster models that use less energy while offering accurate answers, all powered by the renewable energy that feeds both its search engine and its AI. Its search results come from Bing or Google, depending on your location, device type or preferences.

Startpage

Startpage is possibly one of the best alternatives to Google, although it is not completely European. It was founded in 2006 in the Netherlands, where its headquarters remain, although it is a global company. Still, with a European headquarters it promises that its users are protected by European privacy laws, including the GDPR. The search engine claims to be the most private in the world, including a free anonymous viewing mode. The search results come from the Google engine, but they pass through Startpage’s own data protection filter, which removes users’ IP addresses and blocks price trackers and third-party access to ads.

Qwant

A French search engine that stands out for its commitment to privacy and to neither storing nor selling any kind of data about you. Its results index is generated with Bing, although it also adds its own index to the algorithm to improve it. There is nothing from Google.
Qwant also has a search engine called Qwant Junior, which adapts its results for the little ones in the house. And so you don’t miss anything if you migrate from Google, it has a partner called Shadow Drive, which offers private cloud storage hosted in Europe.

Good

Good is a non-profit search engine created in Germany. All of its proceeds are donated to charities and other non-profit organizations with B Corp certification. Additionally, it is a private and anonymous search engine, with no histories, fingerprinting or tracking. For its search results it uses neither Bing nor Google, but rather the Brave search engine’s index. It is CO2 neutral and has no advertising at all. In fact, it is maintained through a voluntary subscription of 2 euros per month, or 19 per year, for those who want to support it.

Another German search engine combines search results from other providers. It is open source, so everyone can know how it works; its servers run on renewable energy, and it is a non-profit organization. The most positive part is that it is committed to privacy and has no ads. The negative part is that it is paid: each search costs one token, and you can buy packs starting at 500 tokens for 5 euros.

Swisscows

A Swiss search engine focused on privacy and with a family-friendly character. For the latter, it has filters that try to avoid explicit content. For its results it uses Bing’s search index combined with one of its own. It has two modes: one free and one paid, which for $3.80 per month offers total anonymity, zero advertising and exclusive search settings. So total anonymity is not free, as it is in other alternatives. It also offers additional services such as mail, instant messaging, cloud storage and a VPN.

In Xataka Basics | 61 European alternatives to Google, X, Gmail, Chrome, Maps, DropBox, Google Drive, WhatsApp and other popular services

The Samsung Galaxy S26 Ultra has shown us a wonderful future. One full of screens with privacy technology

Many revolutions arrive by surprise, without us realizing it. Like a supporting actor nobody seemed to pay attention to who turns out to be the real star of the movie: that is how the privacy screen of the Samsung Galaxy S26 Ultra arrived, an innovation nobody expected, and one that made the AI or the cameras of that phone barely matter. All those things add up, but they are the evolution we were all waiting for. The privacy screen is something else: an everyday revolution, and one so obvious that you can only wonder how it is possible that we are in 2026 and nobody had invented something like it before.

Samsung, as our colleague Ana Boria rightly says (please, don’t miss the Short), has suddenly destroyed the entire industry of privacy-protecting tempered glass. For years it was possible to add a “privacy protector” in the form of protective glass to our mobile phone or laptop. With it you could prevent any busybody from taking a look at your device over your shoulder, but Samsung has made these protectors unnecessary, because it has shown how this technology can be part of the device’s screen itself.

The idea is not entirely new, of course. HP applied something similar in some of its laptops a whopping 10 years ago. It called the technology Sure View and developed it in collaboration with 3M. It effectively allowed the viewing angles of the EliteBook 1040 and 840 to be drastically reduced, but the proposal did not seem to catch on.

Image: Samsung.

Samsung, however, has gone a step further, because this privacy screen can not only be activated and deactivated whenever we want: it can even be toggled per application. If you want privacy mode to turn on every time you open your banking app, you just have to select that option in the settings.
The customization of this feature is also extraordinary: Samsung lets you set it to activate automatically, for example, when you receive notifications, or have the screen go into “anti-snooping” mode just as you are entering a PIN for an application. With the function activated, the screen only looks right to whoever views it head-on.

This is one of those ideas that show that not everything is invented in the world of technology, and that a practical, everyday improvement as “silly” as this one can be much more important and impactful than AI features that end up being fireworks. In fact, Samsung has surprised us here with an innovation that should make Apple blush: the Cupertino company never stops boasting that it is the champion of privacy, and although it has certainly stood out in this area traditionally, here Samsung has left it biting the dust. Apple, and everyone else.

Privacy screens have already become one of the clear technological innovations of 2026. Now we just hope that all manufacturers follow suit and end up implementing similar systems on their phones. That may take some time, of course, but today it seems inevitable to think that what Samsung has done is open the door to a wonderful future in which we will be much safer from prying eyes. Good for Samsung.

Image | Xataka with Freepik

Surfshark is a top tool to protect your privacy on the Internet

With news of massive hacks almost every week, it is logical to want an extra layer of security while browsing the Internet. Of all the ways to get it, the most useful and easiest to use is a VPN. If you are actively looking for one, one of the best VPNs out there is available at a discount: Surfshark, now for 1.99 euros per month.

Surfshark Starter Subscription – monthly. The price may vary. We earn commission from these links.

A secure, fast VPN that you can use on all the devices you want

As we always tell you, a VPN is one of those tools that stays out of the way and is always good to have installed on our devices. Yes, there are free options on the Internet that, for occasional use, are not bad. The problem is that most of them are not as safe as they claim to be, which is why it is advisable to invest a little and get a paid one that is safe, as is the case with Surfshark.

How can a VPN help us be safer on the Internet? When we activate it, our traffic passes through an encrypted, secure tunnel, so that no one can see what we are doing. This is ideal in any scenario, but especially recommended if we are going to use a WiFi network that is not ours, such as a hotel’s. A VPN also lets us keep our IP address hidden. By using it, we prevent it from being registered anywhere or intercepted by anyone, which matters because with our IP someone could even impersonate our identity.

Let’s now talk about Surfshark’s offer. The company offers three different plans, with Starter the cheapest of them. Right now it is available for 1.99 euros per month with the two-year plan. In total we will pay 53.73 euros and get three extra months, so we will enjoy the tool for 27 months in total. And note that you can use it on as many devices as you want without paying more.
If we don’t mind investing a little more and are looking for a more complete tool, we can go straight to the Surfshark One plan, reduced right now to 2.49 euros per month. In addition to the VPN, it comes with additional tools, such as an antivirus and a tool that warns you if your data is leaked on the Internet.

Surfshark One Subscription – monthly. The price may vary. We earn commission from these links.

The improved version of the previous plan, called Surfshark One+, is also on sale: it goes for 4.19 euros per month. It includes essentially everything from the previous plan, along with a very interesting tool called Incogni, which lets us request the deletion of our data from companies’ databases, something very useful.

Some of the links in this article are affiliate links and may provide a benefit to Xataka. Offers may vary in case of non-availability. Image | Nathan Fertig on Unsplash, Surfshark

In Xataka | Why it is dangerous to connect to public Wi-Fi and what you should do to protect yourself

In Xataka | Free VPN and security: what’s the problem, why you should be careful

Apple made privacy its flag. One of its features has resulted in a fine of 98 million euros in Europe

Privacy has been one of Apple’s great arguments to explain why its ecosystem works differently. It is not just a technical matter, but a narrative built over years. Precisely for this reason it is surprising that a tool presented as an advance for the user is at the center of a fine of almost one hundred million euros.

The Italian Competition Authority has fined Apple 98.6 million euros for abuse of dominant position, considering that its implementation of App Tracking Transparency restricts competition. The focus is not on the idea of protecting data, but on how those rules were applied to developers who distribute their apps on iOS. That is where the underlying clash lies.

The origin of the feature. App Tracking Transparency did not arise in this regulatory context, but several years earlier, as part of a broader change in Apple’s privacy strategy. The feature was introduced in April 2021 with the release of iOS 14.5 and was presented as a direct way to return control over advertising tracking to the user. From then on, each app had to ask for explicit permission before tracking the user’s activity across other apps and websites. It was a turn that reordered the mobile ecosystem from within.

The logic behind App Tracking Transparency rests on a specific definition of what Apple considers tracking. It is not just about displaying ads, but about linking data collected in an app with information obtained from third-party services for targeted advertising or measurement. If the user chooses not to be tracked, the developer loses access to the IDFA and, according to the system’s rules, may not use other personal identifiers for the same purpose either. It is a technical cut that simplifies the user’s decision, but has direct consequences for how many applications are monetized.

A position of strength in the iOS ecosystem. For the Italian authority, the key is not the subsequent opening of the system, but the situation that existed when ATT began to be applied.
During that period, Apple concentrated control over the distribution of iOS apps and over the rules governing advertising tracking at the system level. From that dominant position, the regulator concludes, the company was able to set conditions with a competitive impact, beyond the stated objective of protecting user privacy.

The App Tracking Transparency notice.

The core of the reproach: “double consent.” The heart of the penalty is how ATT was applied to third-party developers. According to the Italian authority, Apple’s screen required a first permission request which, by itself, did not meet all the requirements of European data protection regulations. This forced developers to request a second, additional consent for the same advertising purpose. That extra step, the regulator maintains, reduced the probability of acceptance and limited the collection and use of data needed for personalized advertising.

The economic impact is one of the pillars of the case. By increasing the friction of obtaining consent, ATT limited the collection and linking of data used to measure and personalize ads. For the Italian authority, this harmed developers whose business is based on selling advertising space, and it also affected advertisers and intermediation platforms. In the case summary, the regulator adds that this design could generate benefits for Apple, both through higher commissions associated with App Store services and through the growth of its own advertising business.

Was there another way to do it? One of the keys to the resolution is that the problem is not the goal, but the path. The Italian authority claims that Apple could have achieved the same level of privacy protection without requiring duplicate consent requests.

Disagreement and notice of appeal. Apple has expressed its disagreement with the Italian authority’s resolution and considers that it does not adequately value the privacy protections ATT provides.
In a statement cited by Reuters, the company insists that the system was created to give users clear control over ad tracking and that its rules apply equally to all developers. The company has also confirmed that it will appeal the fine and that it will maintain its commitment to protecting user privacy.

The fine is the result of a long and complex investigation. According to the case summary, the Italian authority opened the file in May 2023 and expanded its scope in October 2024, in coordination with the European Commission, other competition regulators and the national data protection authority. This joint approach underlines that the analysis of ATT was not limited to a single country or a single dimension; rather, it was approached as an intersection of competition, privacy and the functioning of the digital market.

Beyond the announced appeal, the resolution imposes immediate effects. The authority orders Apple to cease the conduct in question immediately and to refrain from repeating similar practices in the future. In addition, Apple has 90 days to inform the AGCM how it will comply with those demands. It is not clear, for now, whether this timeline also depends on the appeal process, but the case makes clear that the debate is no longer just theoretical.

Images | Georgiy Lyamin | Screenshot

In Xataka | We believed that Microsoft had already put Copilot everywhere. LG shows us that we were very wrong

Europe sacrifices privacy to not be left behind in AI

Europe, which until now seemed one of the few champions of privacy, threatens to stop being one. The European Commission is preparing a “digital omnibus”, a package of measures that will theoretically be announced at the end of the month and that proposes notable changes to current privacy regulations.

Why it matters. Draft documents obtained by Politico are worrying. European Commission officials say these measures are intended to simplify many of the laws that regulate the technology sector. The executive, that outlet reports, insists it is only trimming certain excessive rules through “targeted” amendments, but the drafts show disturbing changes.

A weak GDPR for a strong AI. The changes that will affect, for example, the General Data Protection Regulation (GDPR) are being made with a singular objective: to benefit the developers of AI models.

The pillars crumble. For Jan Philipp Albrecht, former member of the European Parliament and one of the architects of the GDPR, this spells the end of the data protection and privacy that were pillars of the EU strategy. “The Commission should be fully aware that this is drastically undermining European standards.”

Is Europe done with privacy, then? What is certain is that European economic power is losing relevance, and that seems to have motivated these changes. Former Italian Prime Minister Mario Draghi cited the General Data Protection Regulation as an obstacle to European innovation in artificial intelligence in his landmark report on competitiveness last year.

But. The question, of course, is whether Europe really needs to sacrifice the privacy of its citizens in order not to be left behind technologically. When the EU released the first regulation on AI, the AI Act, it puffed out its chest precisely for applying an (overly) cautious approach.
That provoked criticism which, over the months, has led to some backtracking on those rigid goals.

Europe, technological pariah. The AI Act, the DMA and the GDPR have certainly caused the deployment of AI models and features on the old continent to be blocked or delayed. It happened with Apple Intelligence and with Copilot, for example. While Europe restricted the arrival of AI to users, in the US and China the deployment has been dazzling, total and without restrictions, for better and for worse. In fact, in the United States the laissez faire philosophy is extraordinary, and companies even advocate forgetting copyright laws.

Dangerous exceptions. These documents aim to create exceptions for AI companies that would allow them to process disturbing special categories of data, such as religious or political beliefs, race or health data, to train and operate their AI models. The definition of such data types, which enjoy additional protections under privacy regulations, is expected to be revised.

Anonymized data. Another objective seems to be redefining what constitutes personal data. Thus, pseudonymized data, with personal details obscured to prevent identification of a person, may no longer be subject to GDPR protections. This change would come after the precedent of a recent EU Court ruling along those lines.

Cookie banners. Finally, the draft wants to reform the annoying European rules on cookie banners by including a provision in the GDPR that would give website and app owners more legal grounds to justify tracking users beyond simply obtaining their consent. Once again, bad news for the privacy of European users.

Total uncertainty. The European Commission is expected to reveal its plans on November 19. Changes to the current drafts may still be proposed in the meantime.
Once this package of measures is presented, both EU member countries and legislators will have to approve it, something that is not certain either, given the great divisions that exist among them on privacy. We are seeing it, for example, with the controversial Chat Control, and these measures go in the same direction.

In Xataka | Europe dominates open source AI but loses the race: the paradox that 150 billion euros will try to solve
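To make the “anonymized data” point above concrete, here is a minimal Python sketch of what pseudonymization typically looks like (the field names and salt are invented for illustration): a direct identifier is replaced by a salted hash, so the record stops naming the person but stays linkable for whoever holds the salt, which is precisely why such data has so far been treated as personal data under the GDPR.

```python
# Illustrative sketch of pseudonymization (all names and values invented).
# A direct identifier is replaced with a salted hash: the record no longer
# names the person, but anyone holding the salt can still link that
# person's records together, so the data is pseudonymous, not anonymous.
import hashlib

SALT = b"secret-held-by-the-data-controller"

def pseudonymize(identifier: str) -> str:
    """Deterministically map an identifier to an opaque pseudonym."""
    return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()[:16]

record = {"email": "ana@example.com", "searches": ["clinic near me"]}
stored = {"user": pseudonymize(record["email"]), "searches": record["searches"]}

# The same input always yields the same pseudonym, so records stay linkable:
assert pseudonymize("ana@example.com") == stored["user"]
assert pseudonymize("ana@example.com") != pseudonymize("bob@example.com")
```

Whether data like `stored` falls inside or outside the GDPR’s protections is exactly what the draft would change.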

Privacy has been dying since ChatGPT arrived. Now our obsession is for AI to know us as well as possible

For years we learned to distrust. Not to share too much, to be suspicious of each click, each form, each extra permission our phone or some app asked for. To frown. Privacy was the last bastion of digital dignity, the ground we had to defend. But something has changed, and it has done so without resistance.

Since ChatGPT and company arrived, and especially since projects and expanded memory, we have crossed an invisible line. We no longer just agree to hand over our data; we offer it proactively. What’s more, we get frustrated when the AI doesn’t remember enough, or when it can’t quickly process a report or an analysis. Or when it doesn’t anticipate what we want.

The paradox is brutal. We’ve gone from being outraged that Instagram showed us an overly personal and painfully targeted ad (shirts that hide love handles, infertility treatments) to being impatient if ChatGPT doesn’t remember something it could be using on our behalf. From “I don’t want to be tracked” to “why the hell don’t you know me better by now?”

The difference comes from the perception of immediate usefulness: social platforms monetized our data by selling access to third parties to target ads, while AI uses it to give us more useful answers. Or so we think. The trick is the illusion of reciprocity: when you provide information to a social network, you receive in return content you did not ask for and advertisements you do not want, no matter how accurate they may be. When you hand it over to an AI, you get personalized responses, assistance tailored to you, solutions that seem designed exclusively for your case. In the second case, the transaction feels fair. Symmetrical. Even generous on the machine’s part.

But the architecture of power has not changed. It has only become more seductive. Now they don’t watch us, they understand us. They don’t track us, they remember us. Language matters, because it changes how we perceive what we are giving up.
We have gone from being spied on to being cared for. And that makes a psychological difference, even though the end result is the same: handing over the entire map of who we are to entities we do not control.

Privacy is not dead. It is giving up out of exhaustion. Because defending something that makes our lives more difficult, that deprives us of comfort and efficiency, is unsustainable when the alternative promises to know us so well that it frees us from explaining ourselves over and over again.

In Xataka | OpenAI is making the tech industry tie its destiny to its own. For the sake of the global economy, it had better work

Featured image | Xataka

Your car is a trap for privacy

In 2022, Kenn Dahl’s car insurance went up 21%. He had not had any incident with his Chevrolet Bolt, so he asked his insurance agent, who gave him a piece of advice: look at your LexisNexis report. This New York-based company is a giant data broker with a division responsible for collecting information about drivers and then supplying it to insurance companies. So Mr. Dahl decided to request his report from the company, which was obliged to hand it over under the Fair Credit Reporting Act.

Your car is a snitch

When he received it, Mr. Dahl was amazed. The 258-page report devoted more than 130 pages to every moment in which he or his wife had driven the car in the previous six months. It included details of 640 journeys with their start and end times, the distance driven, and even accelerations and braking. The only thing it did not reveal was the specific places he had driven from and to.

Kia Connect is a service that informs the driver of his “driver score” in order to (theoretically) offer customized car insurance. The system never stops collecting data on your driving. As explained in The New York Times, more and more manufacturers use all kinds of sensors and systems that collect information about drivers, and they do so without the drivers’ express knowledge and, of course, without their consent. Modern cars can even have systems that “profile” the driving of whoever is behind the wheel, something that allows manufacturers to collect that data… and sell it.

More and more users have noticed this kind of massive data collection in their cars. In General Motors vehicles, the OnStar Smart Driver system, which users can deactivate, is used, as several drivers commented years ago on Reddit and in a forum dedicated to the Chevy Bolt.
Other manufacturers use this type of system and activate it by default, such as Kia’s Kia Connect, aimed at obtaining a “score” that helps your car insurer adjust to your way of driving and reward the most reliable drivers according to the data collected. In Peugeot support forums there is even talk of a “private mode” of driving which, when activated, “prevents the sending of data and/or the position of the vehicle.” But as that information also points out, activating it means losing access to functions such as connected navigation, remote control or the Mirror Screen function.

According to a 2023 study by the Mozilla Foundation, 88% of the brands analyzed inferred additional data from the information they collected. And among that inferred data, something disturbing: profiles could include personal beliefs and even sexual activity. Not only that: 19 of the companies analyzed in the study (76%) sold that personal data to other companies.

The good thing about Teslas is that they have cameras. The bad thing, too

The suspicions raised by this kind of data collection can go even further, especially if we remember what happened with Tesla. Between 2019 and 2022, groups of Tesla employees privately shared videos and images taken with the cameras of customers’ cars. In some of those videos, Tesla customers had been captured in compromising situations. For example, a former employee of the company was able to see footage of a completely naked man approaching one of those cars. Others showed accidents, such as a Tesla that ran over a child on a bike. That video, said one of the former employees in the Reuters report, spread through those internal networks “like wildfire.” Teslas are just one more example of this massive data collection.
According to The Guardian, the car's sensors and cameras obtain location data (although Tesla does not store it unless there is an accident), driving habits and style (speeds, braking, acceleration), and other data: for example, diagnostic information, car usage, and data related to infotainment systems such as navigation history or the voice commands used. It is possible to disable the function that transfers part of that information to Tesla's servers, but doing so can also mean losing some of the vehicle's functions.

The European Data Protection Board published its guidelines on the processing of personal data in this environment in 2021, and according to them manufacturers must minimize data collection and prioritize processing it locally. They are also urged to provide control tools that allow users to exercise their rights of access, rectification, and erasure. The requirements are there, but for now compliance seems erratic, to say the least.

The Electronic Frontier Foundation (EFF) recently offered tips on how to check what data your car holds and how to prevent it from being shared, but of course the situation could worsen, especially considering that the renewal of the vehicle fleet means that more and more users have cars with all these options... and this voracity when it comes to collecting data.

Image | Jonas Leupe

Telegram has just pulled off a surprising plot twist. One in which privacy loses

Telegram announced these days that it will predictably integrate Grok into its application this summer. We will be able to use an AI chatbot inside the messaging app, which is precisely the same goal that drove Meta to integrate Meta AI into WhatsApp recently. The strategic alliance is striking, but also worrying.

A juicy deal. After the announcement by Pavel Durov, Telegram's CEO, Elon Musk clarified that the agreement had not yet been signed. Durov replied that only some formalities remained. If it is finally confirmed, xAI will pay Telegram 300 million dollars in cash and shares. Telegram will also take 50% of the revenue from Grok subscriptions sold via the messaging app. Everything looks fantastic for Telegram, but what does Musk get out of it?

Data is a treasure. This agreement gives Grok immediate access to a massive user base, part of which may become paying subscribers of xAI's AI service. But above all, it will allow Elon Musk's company to gain access to a massive collection of data that can be used, for example, to train its AI models. The same goes for Meta and the inclusion of Meta AI in WhatsApp: the conversations we have with the chatbot are not encrypted, and can serve, among other things, to train the company's models.

Musk already did something similar with X. Just two months ago Elon Musk made a strategic move and merged X with xAI. The operation was logical, especially after Grok's initial integration into X, but it reinforced an already forceful reality: that the data of X users could be used to train xAI's AI models. Considering that Musk had been looking for a plan B for data scarcity, that operation and the agreement with Telegram demonstrate this voracious hunger for data.

Durov's promises. In a tweet after the announcement, Durov expressed his commitment to privacy.
"User privacy is essential," he said, adding that "to be clear, xAI will only access the data that Telegram users explicitly share with Grok through direct interactions. It's expected: you can't send messages to anyone (not even a chatbot) without sharing what you write." That last sentence showed once again that by default Telegram does not encrypt messages. If users are concerned about their privacy, they should be aware of this circumstance. As a security expert explained in a 2016 study, "Telegram is not secure."

Telegram and privacy. For years Telegram grew precisely because of its theoretical focus on the privacy of its users. It was one of the first apps to announce the use of end-to-end (E2E) encryption. However, even that option came with fine print. Not many users realized that E2E encryption actually only applied in secret chats, in order, according to those responsible, not to raise suspicions.

Handing data over to governments. Things worsened in September 2024. That was when Telegram's CEO, Pavel Durov, announced that the company would provide users' data to governments that requested it, although in reality the terms of service had already opened that door long before. The changes in these policies came after the controversial detention Durov had suffered in France a few days earlier. The measure was especially surprising, considering that Telegram had become famous for resisting those same interferences from the Russian government. In January 2025 it was revealed that Telegram had handed over data from thousands of users to United States authorities. We have just learned that since then it has continued handing over many more.

Changing headquarters.
An analysis of Telegram's security and privacy carried out by ESET revealed deficiencies in these areas and in the protection of anonymity, but it also pointed out another important detail: Telegram has changed jurisdiction several times, and its headquarters has traveled from Berlin to London, then to Singapore and finally to Dubai. The objective was to maintain its independence, but this leaves an ambiguous legal framework on data management.

Bad news for privacy. All this reinforces the suspicions about the privacy guarantees Telegram offers its users. xAI is an American company, and the data handovers to that government already revealed by Telegram only reinforce a worrying trend.

Image | Steve Jurvetson

In Xataka | Telegram is much more than a messaging app. It has become the new Deep Web
