
Zuckerberg poached 3 top OpenAI researchers for $100 million

The epic battle for talent continues, and Zuckerberg is poaching elite researchers from OpenAI! Mark made a real strategic move and lured three outstanding specialists: Lucas Beyer, Alexander Kolesnikov and Xiaohua Zhai.

They all worked in OpenAI's Zurich office and were considered key figures in the company's research projects. According to reliable sources, Meta's offer included a compensation package of around $100 million, a sum that proved more convincing than OpenAI's philosophy and culture. The irony is that just a few days ago I told you how Sam Altman publicly declared his employees' loyalty, claiming, and I quote, that "the best of ours haven't left yet" despite Zuckerberg's attempts to hire them.

Even earlier, the OpenAI chief boasted about the "unique culture" at his company, which supposedly retains talent even in the face of multi-million-dollar offers. Reality turned out to be different: financial incentives ultimately outweighed the atmosphere Altman created. $100 million is $100 million.

Author: AIvengo
For 5 years I have been working with machine learning and artificial intelligence, and this field never ceases to amaze, inspire and interest me.

Latest News

MIT graduate student reduced painting restoration from 230 to 3.5 hours

MIT graduate student Alex Kachkine developed a clever method for painting restoration using artificial intelligence, cutting the work time from many months to several hours. As a demonstration, he restored a work by an unknown 15th-century Dutch master that had suffered seriously from the passage of time.

AI prosthetic from Canada analyzes objects and decides how to grasp them

Artificial intelligence gives prosthetics independence! Scientists from Memorial University of Newfoundland created a revolutionary arm prosthetic that literally "thinks" for itself. Unlike traditional models that require reading muscle signals through sensors, the new device is completely autonomous.

DeepSeek packed LLM engine into 1200 lines of Python code

The DeepSeek team presented nano-vLLM, a lightweight and compact engine for running large language models that could change perceptions about code efficiency. Amazingly, all the functionality fits into just 1200 lines of Python code! This is true technological minimalism in the world of artificial intelligence. Traditional engines of this kind, for all their power, often suffer from overloaded codebases, which makes modifying them a real trial for developers. Nano-vLLM solves this problem by offering a simple but powerful tool without unnecessary complexity. The code is open.

Tesla robotaxi failure: 11 traffic violations in first days from 20 cars

The dream of robotaxis faces harsh reality! Tesla launched public tests of autonomous taxis in Austin, but the results were far from the promised technological miracle. In the first days of testing, at least 11 serious traffic violations were recorded, and that with only 20 vehicles available to a limited circle of bloggers. Philip Koopman, professor at Carnegie Mellon University and expert on autonomous technologies, doesn't hide his surprise: "This is terribly fast for so many videos with unstable driving to appear."
