
Chinese model is orders of magnitude cheaper than Western analogs

The Chinese model Kimi K2-Thinking, with a trillion parameters, cost 4.5 million dollars in its final training stage. According to CNBC, that is orders of magnitude cheaper than Western analogs. The publication cites an anonymous source and acknowledges that the figure couldn't be independently confirmed; Moonshot AI itself hasn't officially disclosed the cost.

The model is built on a Mixture of Experts architecture: a trillion parameters in total, but only a small fraction of them is active for any given token. It is designed for complex reasoning tasks and tool use. Such systems usually demand astronomical compute budgets, so a few million dollars looks almost ridiculous.
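To make the "only a small fraction is active" point concrete, here is a minimal sketch of top-k expert routing. This is purely illustrative: the dimensions, the router, and the expert layers are toy assumptions, not Moonshot's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions -- real MoE models like K2 are vastly larger.
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" is modeled here as a single weight matrix.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))  # gating network

def moe_forward(x):
    """Route one token vector to its top-k experts only."""
    logits = x @ router                   # score every expert
    top = np.argsort(logits)[-top_k:]     # indices of the k best-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the chosen experts only
    # Only top_k of the n_experts matrices are ever multiplied; the rest
    # stay idle, which is why active compute per token is a small fraction
    # of the total parameter count.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)  # (8,)
```

With these toy numbers, each token touches 2 of 4 experts, i.e. half the expert parameters; production MoE models push that ratio far lower, which is the core of the cost argument.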

History repeats itself. At the end of 2024, China's DeepSeek trained its base model V3 for 5.5 million dollars, and the reasoning overlay R1 for 294 thousand. Kimi K2-Thinking is likewise built on the base K2 model, so the proportion is similar. Note, however, that these estimates exclude experiments, testing, office rent, and salaries.

In most benchmarks, K2-Thinking holds its own against leading Western models such as GPT-5 Pro and Grok 4. Now for the competitors' prices. The final training stage of GPT-4 cost 63 million dollars, by SemiAnalysis's estimate; the AI Index report puts it even higher, at 78 million. And the full training of Grok 4 was estimated by Epoch AI analysts at an astronomical 490 million dollars.

Either this is some magic of optimization, or someone is withholding something, or someone is greatly overpaying for compute.

Author: AIvengo
I have been working with machine learning and artificial intelligence for 5 years, and this field never ceases to amaze, inspire, and interest me.
