ChatGPT’s mobile app generates 30 times more money than Claude, Copilot and Grok combined. It’s still not enough

If there is one chatbot that stands out above the rest in popularity, it is undoubtedly ChatGPT. Its mobile app launched in May 2023 and has topped the download charts of the main stores ever since, becoming the most downloaded app in the world a few months ago. OpenAI has now reached another milestone with its app: since launch it has generated $2 billion. To put that in context, this is roughly 30 times more than what Claude, Grok and Copilot have generated combined. However, not everything is as rosy as it sounds.

Undisputed leader. According to Appfigures, in 2025 alone the ChatGPT app has generated $1.35 billion, a growth of 673% over the same period of 2024. ChatGPT is generating $193 million per month, while the next on the list is Grok with $3.6 million per month. If we look at average revenue per download, ChatGPT leads with $2.91, followed by Claude with $2.55, Grok with $0.75 and finally Copilot with just $0.28. It is clear: OpenAI is winning the battle of the mobile apps.

Still not enough. $2 billion is a lot of money, and that is from the mobile app alone. Adding up all its services, in July alone $1 billion came in, and an estimated $12 billion will come in this year. However, the company is still light years from being profitable, and the reality is that far less comes in than goes out. An internal study estimated that losses between 2023 and 2028 would amount to $44 billion. According to its own forecasts, OpenAI will not be profitable until 2029, when it expects to bring in $100 billion annually, almost ten times more than it invoices now.

The Big Tech are on the right track. The big technology companies have invested truly staggering amounts in AI, and only recently have they begun to see a first green shoot in their results.
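As a quick sanity check of the figures above, the sketch below derives what the article's numbers imply. All inputs come from the article (Appfigures estimates); the variable names and derived values are ours, for illustration only.

```python
# Back-of-envelope check of the revenue figures cited above.
# Inputs are the article's figures; outputs are illustrative derivations.

chatgpt_lifetime_usd = 2_000_000_000   # app revenue since May 2023 launch
chatgpt_monthly_usd = 193_000_000      # current monthly run rate
grok_monthly_usd = 3_600_000           # closest competitor's monthly figure

# If ChatGPT's lifetime app revenue is ~30x Claude + Grok + Copilot
# combined, the combined rival figure implied by the article is roughly:
implied_combined_usd = chatgpt_lifetime_usd / 30

# Monthly gap between ChatGPT and its closest rival:
monthly_ratio = chatgpt_monthly_usd / grok_monthly_usd

print(f"Implied combined rivals' revenue: ${implied_combined_usd / 1e6:.0f}M")
print(f"ChatGPT vs Grok monthly ratio: {monthly_ratio:.0f}x")
```

In other words, the "30 times" claim implies the three rivals together have earned on the order of $67 million from their apps, and ChatGPT's monthly gap over Grok alone is around 54x.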
After several years burning through huge amounts of money, Google, Amazon and Microsoft are finally seeing their revenue begin to cover those tremendous investments. However, it is still not thanks to AI products directly, but to cloud services. Even so, the reality is that nobody is striking gold with AI yet.

Mission: monetize the AI. If there is something giving the AI business headaches, it is how to monetize its chatbots. "Pro" subscriptions have become the main lever for revenue. Some, like Claude Max, cost a small fortune, and OpenAI followed suit with o3-pro. Subscriptions are getting more expensive, but they are still not enough to match the level of spending. These companies have no Azure or web-services business to pull the chestnuts out of the fire, as is happening with the Big Tech.

The way out seems clear: advertising. At the end of last year there were rumors that OpenAI could start putting advertising in ChatGPT. So far nothing has materialized, but the rumors have not stopped, and looking at the numbers it may be a solution to the profitability problem. OpenAI is not alone in flirting with this idea: Perplexity has also been testing it, and Elon Musk recently confirmed that there will be advertising in Grok.

Tread carefully. Implementing advertising in a chatbot is delicate, since it could end up losing users' trust. For example, if we turn to a chatbot while shopping for a car, we might doubt whether its recommendations are driven by an advertising campaign. The integration should be transparent to avoid confusing situations. What seems clear is that, given the serious profitability problem, advertising stands as a more than attractive option for AI companies.

In Xataka | Big Tech have poured billions into AI. They are earning money, but not thanks to the AI

Sam Altman states that ChatGPT’s water and energy consumption is tiny. The problem is that he gives no evidence for it

An email of 100 words generated by GPT-4 consumes 519 milliliters of water. That was the conclusion reached by researchers at the University of California a few months ago after analyzing this OpenAI model. Sam Altman, CEO of the company, has just offered his own estimate of the water and energy consumption of each ChatGPT query. And it is very different.

1,000 times less than what was said. According to Altman, an average ChatGPT query consumes far less than previous studies had indicated. His figures are striking, and he offers analogies to help understand them: "As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity." People are usually curious about how much energy a ChatGPT query consumes; according to Altman, the average query uses about 0.32 ml of water. A previous study by Epoch AI corroborates the figures Altman has now put forward. Source: Epoch AI.

And the evidence? The figures cited by the OpenAI CEO have a problem: they have no visible support. He offers them without citing sources or explaining where they come from, which makes them hard to take at face value. A Meta executive, asked a year and a half ago how much AI inference consumes, answered that "only two nuclear reactors would be needed to cover it."

But previous studies coincide with Altman. Although he mentions no evidence, in February Epoch AI researchers published a study trying to estimate precisely the energy consumption of ChatGPT. They concluded that on average a ChatGPT query consumes about 0.3 Wh, ten times less than the figure in a previous report by researcher Alex de Vries. Since then, of course, many things have happened.

Too pessimistic. As the Epoch AI study noted, the difference comes from the fact that today's models are much more efficient than in 2023, when De Vries conducted his study.
So is the hardware on which these models run, and that earlier estimate also used an "especially pessimistic" approach. Epoch AI's own estimate was deliberately pessimistic as well, and its researchers pointed out that "most of the requests (to ChatGPT) are much cheaper (energetically)."

More studies. Another independent study, published by Andy Masley in January 2025, reached a similar conclusion and claimed that "using ChatGPT is not bad for the environment." It was based on EPRI data from May 2024 that also estimated a high consumption of 2.9 Wh per ChatGPT query. Estimated water consumption in data centers, from a Sunbird study, was also very modest compared to other online activities. (Chart: water consumption in data centers for various online activities. Source: Andy Masley.)

A fifteenth of a teaspoon. The water consumption figure was another striking element of Sam Altman's estimate. According to him, a ChatGPT query consumes barely 0.32 ml of water, "roughly one fifteenth of a teaspoon." The figure suggests that the water needed to cool the data centers processing these requests is much less than was thought only a year ago.

And what about training? These estimates focus on the inference side of AI, that is, our use of ChatGPT, which receives a query and processes it by inferring (generating) a text result. Although Altman does not clarify it, he does not seem to include the energy and water cost of training AI models, which is very high and keeps thousands of GPUs working at full power for months, with the consequent water use in data centers to cool all those components dissipating large amounts of heat. As researcher Ethan Mollick pointed out, GPT-4 probably used more than 50 GWh to be trained, enough to power 5,500 homes for a year.

We continue without definitive data. Altman's claims are, as always, striking, but the lack of clear evidence makes these figures hard to believe.
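To see just how far apart the competing water estimates are, the following sketch compares them. The 519 ml and 0.32 ml figures come from the article; the teaspoon volume (4.93 ml, the US standard) is our assumption for checking the "fifteenth of a teaspoon" analogy.

```python
# Comparison of the two water-consumption estimates discussed above.
# The two per-query figures are from the article; the teaspoon volume
# is an assumed US-standard value used only to check Altman's analogy.

uc_study_ml = 519    # University of California estimate (100-word GPT-4 email)
altman_ml = 0.32     # Sam Altman's figure per average ChatGPT query
teaspoon_ml = 4.93   # one US teaspoon (assumption)

ratio = uc_study_ml / altman_ml              # gap between the two estimates
teaspoon_fraction = teaspoon_ml / altman_ml  # queries per teaspoon of water

print(f"UC estimate is ~{ratio:.0f}x Altman's figure")
print(f"One teaspoon covers ~{teaspoon_fraction:.0f} queries")
```

The gap is actually closer to 1,600x than the headline's round "1,000 times", and 4.93 / 0.32 ≈ 15, so the "one fifteenth of a teaspoon" analogy is internally consistent with the 0.32 ml figure.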
Other recent studies are more useful for reflecting this ever-lower energy and water cost of using AI, but there are no accepted standards, nor any consensus, on the true energy and water impact of using ChatGPT or other AI models.

Image | Lukáš Lehotský | Village Global

In Xataka | Electricity prices have gone negative again: it is a sign that the system needs a redesign

How to improve ChatGPT’s privacy by preventing what you write from being used to train artificial intelligence

Let’s explain how to improve ChatGPT’s privacy by deactivating the option that allows OpenAI to use everything you write or create to continue training its artificial intelligence models. The option is activated by default in your profile, but it is easy to switch off.

When you use ChatGPT, unless you change anything you are giving the company permission to collect your interactions. The questions you have asked the AI and the answers it generated for you will then be used to continue training and improving its models. If you don't want this information used because it is private, here is how to deactivate it.

Disable data sharing with ChatGPT. The first thing to do is open ChatGPT's settings. On mobile, tap the side options button and then tap your username. In the web version, click your profile image and choose Settings in the window that opens.

On mobile, once inside the settings, tap Data controls, which appears in the Account section at the top. Then deactivate the option Improve the model for everyone, which appears first. With this, your content will no longer be used to train OpenAI's models.

In the desktop version, within the settings click the Data controls section. Once inside, click Model improvement, where you can disable the option Improve the model for everyone, which appears first.

A study has estimated ChatGPT’s energy cost. According to its conclusions, it is not as apocalyptic as it seemed

ChatGPT at 3 Wh. In October 2023, a study by Alex de Vries pointed out that a ChatGPT query had an estimated energy cost of 3 Wh. His estimate came from comments from Google, whose managers had indicated that ChatGPT's consumption was "probably" 10 times that of a search. Google itself had revealed that in 2009 each search consumed 0.3 Wh, hence the final figure. De Vries, by the way, had previously published another study warning of the worrying energy consumption of bitcoin mining. De Vries also assumed that average queries were about 4,000 input tokens and 2,000 output tokens, which would correspond to quite long questions and answers, when it is normal for them to be shorter.

Efficiency gains across the board. Since then many things have changed, both for Google, whose results and infrastructure are very different from those of 15 years ago, and for ChatGPT, which has also become considerably more efficient. It is very likely that Google queries are now more efficient, and surely so are those of ChatGPT and other chatbots.

A new estimate. Epoch AI is a non-profit organization responsible, among other things, for creating the FrontierMath benchmark. This test tries to assess the mathematical capacity of AI models and has become one of the most interesting metrics because of its difficulty. Its researchers have now published a study in which they estimate precisely the energy consumption of a ChatGPT query.

ChatGPT consumes ten times less than was thought. According to its conclusions, ChatGPT queries based on the LLM GPT-4o consume about 0.3 watt-hours, ten times less than previously assumed. That 0.3 Wh "calculation is in fact relatively pessimistic, and it is possible that many or most requests are even cheaper."

How they did the calculation. Epoch based its calculations on known data.
Thus, they point out that, according to OpenAI, a token equals approximately 0.75 words and that generating a token costs approximately 2 FLOPs per parameter. Taking into account the computing capacity of NVIDIA H100 GPUs (989 TFLOPS in TF32, 67 TFLOPS in FP32 operations) and their consumption (1,500 W, although they consider the real average draw to be about 70% of that figure), the result is the aforementioned 0.3 Wh.

Everything has improved. As noted, Epoch AI emphasizes that the difference between this estimate and the previous one comes from factors that multiply together: more efficient models and better hardware. In addition, the earlier estimate used "an excessively pessimistic count of the necessary tokens."

How much is 0.3 Wh? It is less electricity than an LED bulb or a laptop consumes "in a few minutes." According to the United States Energy Information Administration, an average home there consumes 10,500 kWh per year, that is, about 28,800 Wh a day. Even intensive use of ChatGPT does not seem to influence that consumption significantly.

Reasoning consumes more. Although they take GPT-4o as a reference, they make it clear that using reasoning models such as o1 or o3-mini requires more energy, but for the moment these are less popular.

And training the models, too. The researchers have also highlighted the energy cost of training models such as GPT-4o, which according to their estimates required between 20 and 25 MW of power over a period of three months, equivalent to the average draw of some 20,000 average US homes.

The overall costs are worrying. Although the data in this study reveal that using ChatGPT does not consume as much energy as previously estimated, the problem may lie elsewhere. The overall energy costs of AI are colossal and set to grow much higher in the short term, given Big Tech's fever for investing tens of billions of dollars in building data centers. And all those data centers will have enormous needs at the energy level.
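The Epoch-style estimate described above can be reproduced with a short back-of-envelope calculation. Note the caveats: the H100 throughput and power figures and the "2 FLOPs per parameter per token" rule come from the article, but the active-parameter count, the answer length and the GPU utilization below are our own illustrative assumptions, not Epoch's published inputs.

```python
# Sketch of an Epoch AI-style per-query energy estimate for GPT-4o.
# ASSUMPTIONS (ours, for illustration): 100B active parameters,
# 500 output tokens per answer, 10% GPU utilization during inference.

active_params = 100e9       # assumed active parameters for GPT-4o
output_tokens = 500         # assumed length of a typical answer
flops_per_param_token = 2   # FLOPs per parameter to generate one token

total_flops = flops_per_param_token * active_params * output_tokens

h100_peak_flops = 989e12    # H100 TF32 peak throughput (per the article)
utilization = 0.10          # assumed real-world inference utilization
h100_power_w = 1500 * 0.70  # 1,500 W nameplate at ~70% average draw

seconds = total_flops / (h100_peak_flops * utilization)
energy_wh = seconds * h100_power_w / 3600

print(f"~{energy_wh:.2f} Wh per query")

# Against an average US home at 10,500 kWh/year:
daily_home_wh = 10_500_000 / 365
queries_equivalent = daily_home_wh / energy_wh
print(f"One day of home electricity covers ~{queries_equivalent:,.0f} queries")
```

Under these assumptions the sketch lands near Epoch's 0.3 Wh figure, and a single day of an average US home's electricity would cover on the order of a hundred thousand queries, which is why the study concludes that personal ChatGPT use barely moves household-scale consumption.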
Watch out: let's not forget the cooling. Image | Xataka with Freepik Pikaso. In Xataka | The amazing history of ARM, the architecture that triumphs on mobile and that was born more than 30 years ago at Acorn Computer
