The most useful AI will be the one that watches us the most
Google just launched Personal Intelligence: connect Gemini to your Gmail, Photos, YouTube and search history "with just one touch." The promise is that you get personalized responses without having to explain your context every time. The promotional example is striking: you're at a tire shop, you need to give your license plate number, and Gemini infers it from the photos you've taken of your car. It's not the best example, because how many people don't know their own license plate? But the idea comes across.

Anthropic launched Cowork a few days ago and it has already gone viral: you give it access to the files on your computer and Claude can view, edit, create or delete them. Organize them, modify them. OpenAI bought Sky a few months ago, a macOS app that "sees your screen and acts on your apps."

Those are three products converging on a similar architecture: full access to your information in exchange for maximum utility.

The reversal is complete. So far this century, the mantra has been "more privacy = better product." It didn't always work in practice, but on paper everyone seemed to agree. Apple has made privacy a selling point, Meta has taken a beating for not protecting it, and companies like Proton, Internxt, Mega and pCloud were born with precisely that concern in their DNA.

Now the equation is reversed: we perceive greater utility the greater the intrusion. And it doesn't bother users. On the contrary, they ask for it, handing over more information because they know the answers will be better. The competition is in the AI models, but also in the levels of consented access: Google doesn't need its models to always be better than OpenAI's GPTs; it needs you to connect more apps. Anthropic doesn't need to beat Gemini on benchmarks; it needs you to give it access to the files on your computer.
And OpenAI didn't buy Sky for its technology, but for how refined and practical it had made intrusion as the core of its product.

The difference is more psychological than technical. Nobody says "give us full access to your entire digital life"; they say "personalize your experience," "connect apps with a tap." Or "Claude may take potentially destructive actions," a protective adverb shielding the fact that its AI can delete your files. Three years ago, a product with these permissions would have been presented as dystopian. Today it's the goal.

What's happening is simple: reorganizing your downloads by hand is a pain. So is searching for one email among thousands. So is walking down to the parking lot to look at the chassis number. Giving up privacy to avoid those small frictions seems reasonable. And it is. Nobody has deceived us.

But the timing also matters. Why? Because the change is not only technical, it's cultural. First we had to normalize the "copilot," then add "the secretary who sees everything." Each permission prepared us for the next: "Analyze this document" → "Access my Drive" → "Connect everything with one touch."

And it works because it pays off, a lot. An AI that only knows what you tell it is, objectively, less useful. Josh Woodward, VP of Gemini, gave a very good explanation: when he went to change his tires, Gemini suggested specific models based on the trips it had detected in Photos. Climates, types of terrain... No AI does that without total access.

The uncomfortable question is what happens when the most useful tool is the most invasive, we know it, and we prefer it anyway. When immediate convenience meets abstract privacy, the former always wins. These tools warn us of their risks, but most of us are deciding that we don't care. Or that it's worth it.

This already happened with Google Maps, YouTube, Spotify and Instagram. The difference is that back then the product was the map, the music, the social network.
Now the product is an assistant that genuinely needs to know everything in order to genuinely work. And it's going to work. In a couple of years, AI with full access will be so superior that it will seem absurd to have ever hesitated over granting permissions, just as it now seems absurd to use a phone with geolocation turned off for privacy's sake. When we ask ourselves how we normalized this, the answer will be very simple: we asked for it. The alternative was having to go look for the information ourselves.

In Xataka | We couldn't tell you whether the image at the top of this post is real or AI-generated: we are in the era of permanent doubt

Featured image | Anthropic