Technical Domains

We are applying the AI-native Systems vision to three initial technical domains. Each represents a complex software system under constant pressure to evolve — and each stands to benefit from continuous, AI-driven improvement.

llm-d — Inference Platform

A Kubernetes-native framework for distributed LLM inference, where AI-native approaches drive scheduling, caching, and performance optimization.

AI-Generated Kernels

Autonomous generation and continuous optimization of compute kernels for GPUs and accelerators, driven by real workload observations.

Storage Systems

Applying spec-driven development and AI-native continuous improvement to storage infrastructure — enabling highly reliable, self-optimizing, and workload-aware storage systems.
