
Hugging Face speeds up Hub uploads and downloads by up to 3x

The Xet team at Hugging Face has introduced a new approach to optimizing uploads and downloads on the Hub, making file transfers 2-3 times faster. The technology builds on an improved Content-Defined Chunking (CDC) method that rethinks how data is stored and transmitted.
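To make the idea concrete, here is a minimal sketch of the general CDC technique: a rolling hash slides over the byte stream, and a chunk boundary is declared wherever the hash's low bits hit a fixed pattern, so boundaries depend on content rather than fixed offsets. The hash parameters and window size below are illustrative assumptions, not the xet-core implementation.

```python
# Minimal content-defined chunking sketch: boundaries are chosen by a
# rolling hash over the data, not by fixed offsets. All constants here
# are illustrative assumptions, not xet-core's actual parameters.
TARGET = 64 * 1024              # target average chunk size (64 KB)
MASK = TARGET - 1               # boundary when the hash's low bits are zero
WINDOW = 48                     # rolling-hash window, in bytes
BASE = 257                      # polynomial hash base (hypothetical)
MOD = 1 << 61                   # modulus keeping the hash bounded
POW_W = pow(BASE, WINDOW, MOD)  # weight of the byte leaving the window

def chunk_boundaries(data: bytes):
    """Yield (start, end) offsets of content-defined chunks."""
    h, start = 0, 0
    for i, b in enumerate(data):
        h = (h * BASE + b) % MOD
        if i >= WINDOW:
            # Slide the window: remove the byte that just fell out of it.
            h = (h - data[i - WINDOW] * POW_W) % MOD
        # Boundary: low bits of the hash are all zero (~1 in 64K positions).
        if i + 1 - start >= WINDOW and (h & MASK) == 0:
            yield start, i + 1
            start = i + 1
    if start < len(data):
        yield start, len(data)  # final partial chunk
```

Because boundaries are content-derived, identical regions of two files produce identical chunks even when data is inserted or shifted, which is what makes chunk-level deduplication possible.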

The scale of the problem is striking: the Hub stores nearly 45 petabytes of data across 2 million repositories of models, datasets, and Spaces. With a naive approach that splits files into 64 KB chunks, uploading a 200 GB repository would mean creating roughly 3 million storage entries; at platform scale, that adds up to some 690 billion chunks.
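Those figures follow from simple division, assuming decimal gigabytes and petabytes and 64 KB (65,536-byte) chunks:

```python
CHUNK = 64 * 1024        # 64 KB chunk size
repo = 200 * 10**9       # a 200 GB repository
hub = 45 * 10**15        # ~45 PB stored across the Hub
print(repo // CHUNK)     # 3_051_757 -> ~3 million chunks per repository
print(hub // CHUNK)      # 686_645_507_812 -> ~690 billion chunks overall
```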

The Hugging Face team identified the serious problems that arise from naively chasing maximum deduplication by shrinking chunk sizes. Millions of separate requests on every upload and download put critical load on the network infrastructure, and the databases and storage systems suffer as well, driving up metadata management costs in services like DynamoDB and S3.

To solve these problems, the team developed and open-sourced xet-core and hf_xet, tools written in Rust and integrated with huggingface_hub. The new approach focuses not only on data deduplication but also on optimizing network transfer, storage, and the overall developer experience.
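For users of huggingface_hub the change is transparent: with the hf_xet package installed, the standard upload and download calls route through the Xet backend for Xet-enabled repositories. A minimal sketch (the repository and file names below are placeholders):

```python
# pip install "huggingface_hub[hf_xet]"
from huggingface_hub import HfApi, hf_hub_download

api = HfApi()

# Upload: chunk-level deduplication happens behind this ordinary call
# when the target repository is Xet-enabled ("user/model" is a placeholder).
api.upload_file(
    path_or_fileobj="model.safetensors",
    path_in_repo="model.safetensors",
    repo_id="user/model",
)

# Download: fetched through the Xet backend when available, with the
# familiar caching behavior and return value (a local file path).
local_path = hf_hub_download(repo_id="user/model", filename="model.safetensors")
```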

The team’s main goal is to ensure fast experimentation and effective collaboration for teams working on models and datasets.

Author: AIvengo
For 5 years I have been working with machine learning and artificial intelligence, and this field never ceases to amaze, inspire, and interest me.