
Daniil Sedov — Blog

follow: @[email protected]

Posts

Things got too easy with AI

I gave Codex its own Mac Mini

TON Vanity: 286,000x faster vanity addresses

AI in 2026

What LLM to use today?

There is nothing out-of-distribution

There is no singularity

Writing with AI

My impression of GPT-5

The complexity threshold of AI

Billions of Tokens Later: Scaling LLM Fuzzing in Practice

Multitasking in 2025

Documentation-Driven Compiler Fuzzing with Large Language Models

Measuring and Analyzing Entropy in Large Language Models