Chinese startups have relied on NVIDIA chips to train their models for years. That is already changing.

The name of the Chinese startup Zhipu AI (Z.ai) may not ring a bell, but its AI model GLM might: its latest version, GLM-4.7, already competes with Claude Sonnet 4.5 and GPT-5.1. The real surprise from this “Chinese AI tiger” is the launch of GLM-Image, and not so much for what it does as for how the company managed to do it.

What has happened. GLM-Image is a multimodal generative AI model focused on image generation. The idea, of course, is to compete with options like Nano Banana, from Google. That is interesting, but even more striking is the fact that the model was not trained on conventional chips.

Trained with Chinese chips. According to Z.ai, this is the first model developed in China to be fully trained on “local” chips. Specifically, it was trained on Huawei’s Ascend chips, using Huawei Atlas 800T A2 servers and a framework called MindSpore. NVIDIA’s AI chips, which are the usual choice for AI model developers at Chinese startups, were not used at all.

Turning point? This milestone demonstrates that it is genuinely feasible to train high-performance generative AI models on a platform developed entirely in China. This is no minor feat: it validates that innovation in this area can continue despite the restrictions imposed by the US. In fact, Zhipu AI, which was added to the US blacklist last year, has intensified its collaboration with other local manufacturers, such as the promising firm Cambricon, which has risen from the ashes thanks to tariffs.

Threat to NVIDIA. The news comes at a unique moment, because NVIDIA has not stopped pressuring the US government to once again allow it to sell its advanced AI chips to Chinese companies. It has obtained that permission, which won’t come free, but now it may be China that isn’t interested: Beijing has said nothing at all. If chips from companies like Huawei prove a valid alternative for training quality AI models, that could change many things in this field.

Zhipu is soaring. The Chinese startup has also just gone public, and since its debut its shares have shot up more than 80%. Investors no longer see the company merely as a rival to Google or OpenAI, but as a banner: one that shows it is possible to compete without depending on the US and its companies.

Huawei, big beneficiary. If the trend continues, Huawei could become the Chinese NVIDIA, and the company is preparing to ramp up production of its AI chips. It is not alone: Cambricon plans to triple its output by 2026, which makes it clear that China’s industrial machinery is moving quickly to neutralize the impact of the US vetoes.

Challenges… Despite everything, Zhipu has already warned that the price war in the AI sector will go international. If Chinese companies end up controlling the entire chain (or rather, their own chain), they could offer AI services at much lower cost than their Western competitors, who must pay NVIDIA’s margins and Big Tech’s cloud infrastructure fees.

…and unknowns. This technological achievement raises other questions. One of the most important is how powerful and capable Huawei’s chips are compared to NVIDIA’s in these workloads: is training much slower? Is it more expensive in time and resources? The efficiency of the MindSpore framework compared to PyTorch or TensorFlow is another key piece of these developments.

In Xataka | Faced with the US strategy, China has a plan to revive its technology industry: that AI belongs to everyone
