
ByteDance released an open model with a 512K token context

ByteDance has released an open AI model with an incredible context of 512,000 tokens. The model is called Seed-OSS-36B.

While the world debates TikTok and the White House, ByteDance quietly rolls out technology that can process an amount of information equivalent to an entire bookshelf in a single session! There are three model versions: one pretrained with synthetic data, one without it, and an instruction-tuned version, each tailored to its own tasks.
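Curious what picking a variant looks like in practice? Here is a minimal sketch of loading one of them with Hugging Face transformers. The repository IDs are my assumptions based on the naming in the announcement, so check the actual model cards before running anything.

```python
# Minimal sketch: choosing one of the three Seed-OSS-36B variants.
# The repo IDs below are assumed for illustration; verify them on Hugging Face.
from transformers import AutoModelForCausalLM, AutoTokenizer

VARIANTS = {
    "base_with_synthetic": "ByteDance-Seed/Seed-OSS-36B-Base",          # pretrained with synthetic data (assumed ID)
    "base_without_synthetic": "ByteDance-Seed/Seed-OSS-36B-Base-woSyn", # pretrained without synthetic data (assumed ID)
    "instruct": "ByteDance-Seed/Seed-OSS-36B-Instruct",                 # instruction-tuned version (assumed ID)
}

model_id = VARIANTS["instruct"]
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")
```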

The architecture impresses with its elegance: 36 billion parameters spread across 64 layers and a vocabulary of 155,000 tokens. But the real magic is the thinking budget mechanism. You literally set how many tokens the model may spend thinking before it answers. Want an instant response? Set it to 0. Need deep analysis? Increase the budget.
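To make the thinking budget idea concrete, here is a tiny sketch that continues from the loading example above. Whether the released chat template actually exposes a thinking_budget argument, and what it is called, are assumptions on my part; treat this as an illustration of the knob, not the official API.

```python
# Sketch of the thinking-budget knob, continuing from the loading example above.
# Assumption: the chat template accepts a `thinking_budget` argument
# (0 = answer immediately, larger = more reasoning tokens). Check the model card.
import torch

messages = [{"role": "user", "content": "How many prime numbers are there below 100?"}]

input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
    thinking_budget=512,  # hypothetical cap on tokens spent "thinking"
).to(model.device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=1024)

print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Set the budget to 0 and you should get a reflexive answer; crank it up and the model reasons longer before committing to a response.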

The test results are impressive: 91.7% on AIME for mathematics, 67.4% on LiveCodeBench for programming, and 94.6% on RULER for long-context work. Every one of these scores is a record among open models!

The key question is how it performs on real tasks, not benchmarks. But so far ByteDance is unexpectedly demonstrating world-class competence in LLMs. This is interesting.

Author: AIvengo
I have been working with machine learning and artificial intelligence for 5 years, and this field never ceases to amaze, inspire, and interest me.

Latest News

How xAI competes with OpenAI in developer tools

xAI is launching Grok Code Fast 1, a compact agentic model for coding. Pricing: $0.20 per million input tokens, $1.50 per million output tokens, and just $0.02 per million when reading from the cache!
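For scale, here is a quick back-of-the-envelope sketch of what one coding-agent turn might cost at those prices. The token counts are invented for illustration.

```python
# Rough cost estimate for one Grok Code Fast 1 request at the listed prices.
# Token counts in the example are assumptions, not measurements.
PRICE_INPUT = 0.20 / 1_000_000    # dollars per fresh input token
PRICE_OUTPUT = 1.50 / 1_000_000   # dollars per output token
PRICE_CACHED = 0.02 / 1_000_000   # dollars per cached input token

def request_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Estimated cost in dollars for a single request."""
    fresh = input_tokens - cached_tokens
    return fresh * PRICE_INPUT + cached_tokens * PRICE_CACHED + output_tokens * PRICE_OUTPUT

# Example: 12,000 input tokens (8,000 served from cache) and 1,500 output tokens.
print(f"${request_cost(12_000, 1_500, cached_tokens=8_000):.4f}")  # prints $0.0032
```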

Battle for browsers: Anthropic vs Perplexity and OpenAI

It seems Anthropic wants to rethink what a browser can be. Claude for Chrome is entering closed testing for 1,000 Max-plan subscribers.

Chinese autonomous tractor with no steering wheel or cabin works the fields

The Chinese company Shiyan Guoke Honghu Technology has introduced the fully autonomous Honghu T70 tractor, which moves across fields on its own and performs the full range of agricultural tasks without any human involvement.

Nvidia introduced Jetson AGX Thor: 2560 cores for robots

Nvidia has presented a development for physical AI: Jetson AGX Thor. This isn't just a chip; it is literally a brain for future robots. Imagine 2,560 Blackwell cores and 128 GB of RAM in one compact device!

GPT-5 optimizes costs

The Register breaks down OpenAI's strategy, and according to them, GPT-5 turned out to be not a revolution in capabilities but a brilliant exercise in cost optimization.