Perplexity has already chosen its path over what it believes is fundamental

For years, advertising has been the silent fuel of much of the internet. It has financed search engines, websites, apps and services that we use daily without paying for them directly, something we have come to accept almost as the norm online. But the rise of chatbots and AI-powered search products forces us to rethink that balance, because here the product is no longer just content, but answers the public is expected to trust. In that context, any suspicion about who influences what you read stops being a technical detail and becomes a central issue.

A clash between trust and monetization. In recent months, the large AI companies have begun to take clear positions: some are betting on introducing advertising into their free products, while others reject it outright because of the impact it could have on how users perceive them. The result is a map where a divide is starting to form, just as the sector needs to prove it can become a sustainable business. What is at stake is not only how you make money with AI, but what kind of relationship you build with the people who use it.

Trust as a product. One of the clearest moves in this divide comes from Perplexity. The company even tested ads in 2024, showing sponsored content beneath the chatbot's answers, but it began withdrawing them at the end of last year and now says it has no plans to continue down that path. "The user must believe that this is the best possible response to continue using the product and be willing to pay for it," an executive told the Financial Times. The internal conclusion is blunt: if the ads sow doubt, the value of the product itself is compromised, although the company leaves the door open to revisiting that path in the future.


A war of ads against ads. Anthropic is on the same side as Perplexity: it not only defends keeping its chatbot ad-free, but has turned that decision into a public message. The company launched a campaign ahead of the Super Bowl with a direct slogan, "The ads are coming to AI. But not to Claude," which points at the direction the sector is taking without explicitly naming competing products. The campaign shows scenes that caricature commercial recommendations slipping into personal conversations, underscoring the discomfort such a scenario could generate.

The reaction from OpenAI. Sam Altman called Anthropic's campaign "clearly dishonest" and argued that the ad model his company is exploring would follow different principles, with advertising "separated and clearly labeled" and no influence on the system's responses. The executive framed the decision as a question of access, arguing that expanding free use of AI requires new sources of income, and he contrasted it with the idea that Anthropic offers an expensive product for those who can pay for it, while OpenAI seeks massive reach through free access.

Not everyone wants (or can afford) to do without advertising. Training and sustaining these systems burns through cash, and the pressure to find sustainable income has grown along with usage. In this context, some major players are exploring advertising as a way to finance free access, with formulas in which sponsored content appears separated from the answers, including tests on products like ChatGPT and AI-powered formats in Google Search. It is worth noting that Google has not introduced ads into its Gemini chatbot.

Images | Perplexity + Nano Banana

In Xataka | Claude Sonnet 4.6 promises to do the paperwork for you. Now he has the challenge of his life: dealing with the Spanish administration’s websites
