Technical Domains
We are applying the AI-native Systems vision to three initial technical domains. Each represents a complex software system under constant pressure to evolve — and each stands to benefit from continuous, AI-driven improvement.
llm-d — Inference Platform
A Kubernetes-native distributed LLM inference framework where AI-native approaches drive scheduling, caching, and performance optimization.
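To make the scheduling and caching angle concrete, here is a minimal sketch of cache-aware request routing across inference replicas. This is an illustration of the general idea, not llm-d's actual scheduler: the `Replica` type, the scoring formula, and the `cache_weight` parameter are all hypothetical, chosen only to show how a router might trade off KV-cache reuse against replica load.

```python
from dataclasses import dataclass, field

@dataclass
class Replica:
    name: str
    load: int = 0                              # in-flight requests (hypothetical metric)
    cached_prefixes: set = field(default_factory=set)

def score(replica: Replica, prompt_prefix: str, cache_weight: float = 2.0) -> float:
    # Reward replicas likely to already hold this prompt's KV-cache prefix,
    # penalize replicas with more in-flight work. Weights are illustrative.
    hit = 1.0 if prompt_prefix in replica.cached_prefixes else 0.0
    return cache_weight * hit - replica.load

def route(replicas: list[Replica], prompt_prefix: str) -> Replica:
    # Pick the best-scoring replica, then record the new work and cached prefix.
    best = max(replicas, key=lambda r: score(r, prompt_prefix))
    best.load += 1
    best.cached_prefixes.add(prompt_prefix)
    return best

replicas = [
    Replica("pod-a", load=1, cached_prefixes={"chat-v1"}),
    Replica("pod-b", load=0),
]
# pod-a scores 2.0 - 1 = 1.0 (cache hit), pod-b scores 0.0, so pod-a wins.
print(route(replicas, "chat-v1").name)  # → pod-a
```

An AI-native system would go further than this fixed heuristic, e.g. by learning the weights from observed latency rather than hard-coding them.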
AI-Generated Kernels
Autonomous generation and continuous optimization of compute kernels for GPUs and accelerators, driven by real workload observations.
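The core loop behind workload-driven kernel optimization can be sketched in a few lines: generate candidate kernel variants, benchmark each against a representative workload, and keep the fastest. The sketch below is a stand-in, not a real GPU kernel; a pure-Python blocked matrix multiply plays the role of a generated kernel parameterized by tile size, and the workload size and candidate tiles are assumptions for illustration.

```python
import time

def matmul_blocked(a, b, n, tile):
    # Naive blocked matmul over flat row-major lists; stands in for a
    # generated kernel variant parameterized by tile size.
    c = [0.0] * (n * n)
    for ii in range(0, n, tile):
        for kk in range(0, n, tile):
            for jj in range(0, n, tile):
                for i in range(ii, min(ii + tile, n)):
                    for k in range(kk, min(kk + tile, n)):
                        aik = a[i * n + k]
                        for j in range(jj, min(jj + tile, n)):
                            c[i * n + j] += aik * b[k * n + j]
    return c

def autotune(n, candidates, reps=2):
    # Measure each candidate tile size on a representative workload and
    # return the fastest; a real system would observe production traffic.
    a = [1.0] * (n * n)
    b = [1.0] * (n * n)
    def bench(tile):
        t0 = time.perf_counter()
        for _ in range(reps):
            matmul_blocked(a, b, n, tile)
        return time.perf_counter() - t0
    return min(candidates, key=bench)

best = autotune(48, [8, 16, 32])
print("best tile:", best)
```

In a continuous-improvement setting this loop never terminates: new workload observations re-trigger tuning, and regressions roll back to the previous best variant.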
Storage Systems
Applying spec-driven development and AI-native continuous improvement to storage infrastructure — enabling highly reliable, self-optimizing, and workload-aware storage systems.