Read: "From using AI to building AI systems," a defining note on what I'm exploring.

AI-Native Infra & Agentic Runtime

Through writing, publishing, and engineering practice, I am systematically building engineering methodologies for AI infrastructure and agentic runtimes.

What Is AI-Native Infrastructure?

Infrastructure redesigned for AI systems — not retrofitted from cloud-native stacks.

The AI-native stack, from top to bottom:
  • Agent / AI Applications
  • Agentic Runtime & Context
  • Inference · Training · Governance
  • GPU & Accelerated Infrastructure
  • Non-determinism: AI workloads are non-deterministic by nature.
  • Agent-first: Agents, not services, are the primary execution unit.
  • First-class resources: GPU, context, and tokens become first-class resources (a minimal sketch follows this list).
  • Governance over deployment: Scheduling and governance matter more than deployment.
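As one concrete anchor for the "first-class resources" point, Kubernetes already treats GPUs as schedulable extended resources exposed by device plugins. The snippet below is a minimal sketch using the official Kubernetes Python client; the pod name, image, and GPU count are illustrative assumptions, not a recommended configuration.

```python
# Minimal sketch: declaring a GPU as a first-class, schedulable resource
# with the official Kubernetes Python client (pip install kubernetes).
from kubernetes import client

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="inference-worker"),  # illustrative name
    spec=client.V1PodSpec(
        containers=[
            client.V1Container(
                name="worker",
                image="example.com/inference:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    # GPUs are exposed by device plugins as extended resources
                    # and are requested the same way as CPU and memory.
                    limits={"nvidia.com/gpu": "1"},
                ),
            )
        ],
        restart_policy="Never",
    ),
)

# Submitting the pod would use the standard API call, e.g.:
# client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```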

Core Technology Domains

These are the threads I keep returning to: practical abstractions, system boundaries, and what it takes to ship reliably.

AI Infrastructure

I explore inference, RAG, and agent collaboration through a runtime-first lens, aiming for abstractions that hold up in production.

Cloud Native

I study how Kubernetes evolves under AI workloads: scheduling constraints, elasticity, observability, and multi-tenant governance.

Open Source

I contribute to and learn from the AI infrastructure open-source ecosystem, with an engineer’s bias toward verifiable designs.

About Jimmy Song

I focus on AI infrastructure and agent runtimes, especially the engineering questions that show up in production: inference, RAG, platform engineering, and governance. I currently serve as Open Source Ecosystem VP at Dynamia.ai. I'm also a CNCF Ambassador and AI Infrastructure Architect, and I started ArkSphere, an engineering collective focused on AI infrastructure.
