
DeepSeek open sources super-fast GPU kernels

Chinese company DeepSeek, which recently made a breakthrough in the field of artificial intelligence, has begun an unprecedented week of open-source releases, launching the first of five promised tools – FlashMLA. The project consists of the optimized GPU kernels that the company uses in its own production systems.

FlashMLA implements multi-head latent attention (MLA), a method that significantly reduces memory consumption in transformers by compressing the key and value matrices into a compact latent representation. Although MLA has already proven its effectiveness in DeepSeek's own models, optimized open implementations of it were practically nonexistent until now.
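To make the idea concrete, here is a minimal PyTorch sketch of latent KV compression. It illustrates the general principle rather than DeepSeek's exact architecture, and every layer size below is an illustrative placeholder: instead of caching full per-head key and value tensors, the layer caches one small shared latent and reconstructs keys and values from it on the fly.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentKVAttention(nn.Module):
    """Toy illustration of the MLA idea: cache a small shared latent
    instead of full per-head K/V tensors (all sizes are illustrative)."""

    def __init__(self, d_model=4096, n_heads=32, d_latent=512):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model, bias=False)
        # Down-projection: this compact latent is what gets stored in the cache.
        self.kv_down = nn.Linear(d_model, d_latent, bias=False)
        # Up-projections reconstruct per-head keys and values from the latent.
        self.k_up = nn.Linear(d_latent, d_model, bias=False)
        self.v_up = nn.Linear(d_latent, d_model, bias=False)
        self.out_proj = nn.Linear(d_model, d_model, bias=False)

    def forward(self, x, latent_cache=None):
        b, t, _ = x.shape
        q = self.q_proj(x).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        # Compress the new tokens and append them to the cached latent.
        latent = self.kv_down(x)                                  # (b, t, d_latent)
        if latent_cache is not None:
            latent = torch.cat([latent_cache, latent], dim=1)
        k = self.k_up(latent).view(b, -1, self.n_heads, self.d_head).transpose(1, 2)
        v = self.v_up(latent).view(b, -1, self.n_heads, self.d_head).transpose(1, 2)
        # Causal masking is omitted here for brevity.
        out = F.scaled_dot_product_attention(q, k, v)
        out = out.transpose(1, 2).reshape(b, t, -1)
        return self.out_proj(out), latent  # the latent is the new, smaller KV cache
```

The memory saving comes from caching d_latent numbers per token instead of 2 * d_model; in this toy configuration that is a 16x reduction.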

The key technical characteristics of FlashMLA are impressive:
– Support for the bfloat16 format, providing a good balance between computation speed and accuracy
– Paged KV cache with a block size of 64
– Up to 3000 GB/s of memory bandwidth in memory-bound configurations
– Up to 580 TFLOPS in compute-bound configurations on the H800 SXM5 GPU with CUDA 12.6

The kernels are compatible with NVIDIA's entire Hopper line of graphics processors, including the H100 and H800. FlashMLA is particularly effective when processing variable-length sequences, which makes it well suited to modern natural language processing workloads.
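For those who want to experiment, below is a rough sketch of a single decoding step with a paged KV cache. The functions get_mla_metadata and flash_mla_with_kvcache come from the project's README, but the exact argument order, shapes, and dtypes shown here are assumptions that may differ between versions, so treat the snippet as a starting point and check the repository; running it also requires a Hopper-class GPU and the compiled flash_mla package.

```python
# Assumed usage of the FlashMLA Python bindings; verify names and shapes
# against github.com/deepseek-ai/FlashMLA before relying on this.
import torch
from flash_mla import get_mla_metadata, flash_mla_with_kvcache

b, s_q = 16, 1                        # batch size, query tokens per step (decoding)
h_q, h_kv = 128, 1                    # query heads; MLA keeps a single latent KV "head"
d, dv = 576, 512                      # cached head dim (latent + rope), value head dim
block_size, blocks_per_seq = 64, 32   # paged cache with block size 64

device, dtype = "cuda", torch.bfloat16
cache_seqlens = torch.full((b,), 1024, dtype=torch.int32, device=device)
block_table = torch.arange(b * blocks_per_seq, dtype=torch.int32,
                           device=device).view(b, blocks_per_seq)
q = torch.randn(b, s_q, h_q, d, dtype=dtype, device=device)
kv_cache = torch.randn(b * blocks_per_seq, block_size, h_kv, d,
                       dtype=dtype, device=device)

# Scheduling metadata is computed once per step and reused across layers.
tile_scheduler_metadata, num_splits = get_mla_metadata(
    cache_seqlens, s_q * h_q // h_kv, h_kv)

out, lse = flash_mla_with_kvcache(
    q, kv_cache, block_table, cache_seqlens, dv,
    tile_scheduler_metadata, num_splits, causal=True,
)
```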

DeepSeek plans to keep publishing its internal developments: from February 24 to 28, the company promises to open-source four more repositories from its internal ecosystem. The move could significantly influence the entire AI industry by giving developers access to advanced optimizations previously available only inside the company.

The project code is already available on GitHub (github.com/deepseek-ai/FlashMLA), so developers around the world can start integrating these optimizations into their own projects and potentially achieve significant performance gains in their AI systems.

Author: AIvengo
I have been working with machine learning and artificial intelligence for five years, and this field never ceases to amaze, inspire and interest me.

Latest News

Chinese spherical robot RT-G weighing 150 kg reaches speeds of up to 35 km/h

China has produced a unique engineering marvel: the spherical robot Rotunbot RT-G, which could fundamentally change perceptions of future police technology.

22% of British children aged 8-12 use AI without knowing what it is

22% of British schoolchildren aged 8 to 12 are already actively using artificial intelligence tools, even though most of them have never even heard the term "generative artificial intelligence". This is according to a study by the Alan Turing Institute and the Lego Foundation.

First Google Veo 3 advertisement shown to millions during NBA finals

Millions of NBA Finals viewers witnessed a completely new stage in creative evolution: a fully algorithm-generated advertisement for the betting platform Kalshi, created with Google Veo 3.

Chinese platform QiMeng creates processors at Intel 486 and Arm level

Chinese scientists have developed a new AI platform capable of independently designing processors at the level of human experts. Researchers from the State Laboratory for Processor Development and the Intelligent Software Research Center have presented an open-source project called QiMeng.

Meta AI turns private AI chats into public posts without users' knowledge

The Meta AI app has turned out to be a real catastrophe for user privacy, turning private conversations with artificial intelligence into public content. Imagine a modern horror movie: your entire query history becomes publicly accessible, and you don't even suspect it.