ChatGPT costs $700,000 a day, DeepSeek just $87,000. One key is Huawei and playing its cards differently

2025 began with a tsunami in the artificial intelligence segment. After months and months of talk about the various models from companies such as Google, Microsoft, Apple, Meta and, of course, OpenAI, a Chinese company pulled DeepSeek out of its sleeve, an AI that shook the industry's foundations.

Beyond its capabilities or how well it worked, what really stirred the waters were the economics and the hardware. Almost from the start, the question was how China had produced an AI like DeepSeek given the hardware limitations it faces due to the trade war with the United States and the impossibility of buying Nvidia's most powerful GPUs (although not without controversy).

To pull it off, the company claimed it had to rely on ingenuity: an infrastructure of Nvidia H800 chips and a training run of more than 2.788 million GPU hours at a ridiculously low cost of 5.6 million dollars. And it does seem modest, considering OpenAI invested about 100 million dollars to train GPT-4. Another open question is what it costs to keep it running.
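As a rough sanity check, those two figures line up with the roughly $2 per GPU hour H800 rental price that DeepSeek itself cites later in this piece. A minimal sketch of the arithmetic, using the article's numbers and our own rounding:

```python
# Back-of-envelope check: ~2.788 million H800 GPU hours at roughly $2/hour
# (the rental figure DeepSeek cites) should land near the reported
# $5.6 million training cost. Figures are the article's; rounding is ours.

gpu_hours = 2_788_000        # reported H800 GPU hours for training
hourly_rate_usd = 2.0        # approximate H800 rental price per GPU hour

estimated_training_cost = gpu_hours * hourly_rate_usd
print(f"Estimated training cost: ${estimated_training_cost:,.0f}")
# -> Estimated training cost: $5,576,000  (close to the reported $5.6M)
```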

As Reuters notes, if ChatGPT costs about $700,000 a day to run, DeepSeek drops to $87,000. And there are a few things here worth keeping in mind.

DeepSeek is 10 times cheaper to maintain than ChatGPT, according to DeepSeek

Last Saturday, as Reuters reported, DeepSeek revealed some data on the costs and revenue of its V3 and R1 models. The first is a traditional, more conversational chatbot, ideal for writing and content creation. R1, however, is a reasoning model: it excels at solving problems, applies logic and can show its reasoning step by step, drawing on continuous learning.

To compare with better-known models, DeepSeek V3 would be the equivalent of GPT-4, and R1 something similar to OpenAI's o1. In the report, Reuters highlights that DeepSeek's theoretical cost-profit ratio reaches up to 545% per day.

Of course, the company itself warns that real revenue is significantly lower, but another gem DeepSeek has dropped is the cost of maintenance. Keeping ChatGPT running costs OpenAI about $700,000 a day (at least it did two years ago). The reason is that its infrastructure of Microsoft Azure servers carries a considerable energy cost, there are salaries to pay and, obviously, all the hardware power needed to process the queries it receives every second.

Chart: cost and theoretical income of DeepSeek R1. In yellow, the costs; in blue, theoretical income.

DeepSeek, meanwhile, costs "only" $87,072 a day, a trifling figure by comparison. A few days ago it stated that renting the H800 costs less than two dollars per hour and that its estimated theoretical revenue is just over $560,000 a day, which would add up to more than 200 million dollars in a year. In the chart above, DeepSeek shows the cost of keeping R1 running and the theoretical revenue from the tokens it generates, whose price depends on the time of day, being cheaper at night. They also clarify that DeepSeek V3 is "significantly cheaper."
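Those numbers also roughly reproduce the 545% figure and the annual projection. A minimal sketch, assuming a theoretical daily revenue of about $562,000 (consistent with the article's "just over $560,000"; the exact percentage depends on the precise figure):

```python
# Rough check of the figures cited above, using the article's rounded numbers.
# The exact 545% depends on the precise revenue figure, so treat this as an
# order-of-magnitude check rather than DeepSeek's own accounting.

daily_cost_usd = 87_072          # reported daily cost of serving R1
daily_revenue_usd = 562_000      # assumed theoretical daily revenue ("just over $560,000")

margin = (daily_revenue_usd - daily_cost_usd) / daily_cost_usd
annual_revenue = daily_revenue_usd * 365

print(f"Theoretical cost-profit margin: {margin:.0%}")    # ~545%
print(f"Theoretical annual revenue: ${annual_revenue:,}")  # ~$205 million
```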

This opens up further questions. One is how it can be so cheap, because training an AI, of course, is not. Setting aside OpenAI's accusation of theft, if DeepSeek has not deflated its numbers, it puts on the table a scenario in which so much GPU power is not needed to train an artificial intelligence.

The key here is 'reinforcement learning', the approach DeepSeek found to do more with much less, but it is also worth noting that, although Nvidia chips are used to train the R1 model, for inference it is using Huawei's Ascend 910B.

Huawei's chips are cheaper and, supposedly, more efficient, and this decision by DeepSeek is almost more relevant than what the system costs to maintain. The reason is that it can teach the rest of the AI companies that, perhaps, it is not worth using latest-generation GPUs for everything, but only for training, which happens a handful of times before the AI is deployed, and then to use other more efficient and cheaper GPUs for inference. Inference is what comes afterwards, in what we could call 'real use'.
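To make the trade-off concrete, here is a toy cost model of the split the article describes: a one-off training bill on top-tier GPUs plus a recurring inference bill that runs every day the service is live. All rates and volumes are hypothetical placeholders, not DeepSeek's or Huawei's real prices; the point is only that the recurring inference side dominates over time, so cheaper serving hardware moves the total far more than the training hardware does.

```python
# Illustrative sketch only: one-off training cost plus recurring inference
# cost. Every rate and volume below is a hypothetical placeholder.

def total_cost(training_gpu_hours: float, training_rate: float,
               daily_inference_gpu_hours: float, inference_rate: float,
               days_in_service: int) -> float:
    """One-off training cost plus inference cost accrued over the service life."""
    training = training_gpu_hours * training_rate
    inference = daily_inference_gpu_hours * inference_rate * days_in_service
    return training + inference

premium_rate = 2.0   # $/GPU-hour, placeholder for a top-tier accelerator
cheaper_rate = 1.0   # $/GPU-hour, placeholder for a cheaper accelerator

# Hypothetical comparison over one year of serving:
all_premium = total_cost(2_788_000, premium_rate, 40_000, premium_rate, 365)
mixed = total_cost(2_788_000, premium_rate, 40_000, cheaper_rate, 365)

print(f"Premium GPUs for everything:      ${all_premium:,.0f}")
print(f"Cheaper accelerators for serving: ${mixed:,.0f}")
```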

Training would be the equivalent of absorbing the technical manuals over a five-year degree, and inference would be applying that knowledge and reasoning from the base you already have, without having to learn it all again. In the end, the controversy over DeepSeek's five million dollars will be around for a while, especially when compared with OpenAI's numbers, but it is clear that DeepSeek is approaching things differently and can be a good mirror for the companies that come after it.

And, with a China very focused on developing both AI and hardware for AI, it can be the perfect 'spearhead' model.

Images | GitHub (DeepSeek), Xataka

In Xataka | DeepSeek has created another billion-dollar fortune: Liang Wenfeng has become famous, but his wealth remains a mystery
