The App Router
for AI agents.
A TypeScript-first framework for building and deploying graph-based AI systems with the ergonomics of Next.js. File-system routing, type-safe tools, zero boilerplate.
The Problem
LangGraph gave us the runtime. Every team builds the framework around it by hand.
LangGraph is powerful and unopinionated — that's the design. So every team adopting it — including ours — ends up inventing project structure, type wiring, dev tooling, and deployment scripts from scratch. We've watched this happen at every company building agents on LangChain.
You've written the same StateGraph boilerplate five times.
State channels, nodes, edges, the same pattern in every project. Every repo invents its own structure.
Your tool's Zod schema drifted from its function signature.
You found out at runtime. Schemas live in one file, the actual function in another, and the types between them quietly disagree.
You added console.log to find out what state your agent is in.
There's no dev server that shows the graph mid-run. No hot reload when you change a tool. No structured way to test scenarios.
Your deployment is a hand-rolled Docker image.
You wrote the server, the routing, the protocol adapter. Every team building production agents on LangGraph rebuilds this from scratch.
The same agent, two ways
A minimal agent that greets a tenant, written first in raw LangGraph, then in Dawn.
import { StateGraph, START, END } from "@langchain/langgraph"
import { z } from "zod"

const GreetSchema = z.object({ tenant: z.string() })
type State = { tenant: string; greeting?: string }

async function greet(i: z.infer<typeof GreetSchema>) {
  return { greeting: `Hello, ${i.tenant}!` }
}

const graph = new StateGraph<State>({
  channels: {
    tenant: { value: (_, y) => y, default: () => "" },
    greeting: { value: (_, y) => y, default: () => "" },
  },
})
  .addNode("greet", async (state) => {
    const r = await greet({ tenant: state.tenant })
    return { greeting: r.greeting }
  })
  .addEdge(START, "greet")
  .addEdge("greet", END)

const app = graph.compile()
const result = await app.invoke({ tenant: "acme" })
// + write your own dev loop, types, server, and deploy.

// src/app/(public)/hello/[tenant]/state.ts
export interface HelloState {
  tenant: string
  greeting?: string
}

// src/app/(public)/hello/[tenant]/tools/greet.ts
export default async (i: { readonly tenant: string }) =>
  ({ greeting: `Hello, ${i.tenant}!` })

// src/app/(public)/hello/[tenant]/index.ts
import type { RuntimeContext } from "@dawn/sdk"
import type { RouteTools } from "dawn:routes"
import type { HelloState } from "./state.js"

export async function workflow(
  state: HelloState,
  ctx: RuntimeContext<RouteTools<"/hello/[tenant]">>,
) {
  const { greeting } = await ctx.tools.greet({
    tenant: state.tenant,
  })
  return { ...state, greeting }
}
// $ dawn run "/hello/acme" · dawn dev · dawn test

Dawn writes the StateGraph wiring, generates the tool types from your function signatures, runs the dev server, and speaks the LangGraph Platform protocol. You write the agent logic. The framework gives you back the time.
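The `RouteTools<"/hello/[tenant]">` type in the Dawn version comes from the auto-generated `dawn.generated.d.ts`. A hand-written sketch of what such a declaration could look like, inferred from the `greet` tool's signature (the module shape is an assumption, not Dawn's actual typegen output):

```typescript
// Illustrative only: a guess at the generated ambient module.
// Dawn's real generated declarations may differ in shape and naming.
declare module "dawn:routes" {
  interface Routes {
    "/hello/[tenant]": {
      tools: {
        greet: (i: { readonly tenant: string }) => Promise<{ greeting: string }>
      }
    }
  }
  export type RouteTools<R extends keyof Routes> = Routes[R]["tools"]
}
```

With a declaration like this in scope, `ctx.tools.greet` autocompletes and type-checks against the tool's real signature.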
The Pattern
You already know this story.
Every runtime gets a framework. React got Next.js. Svelte got SvelteKit. Vue got Nuxt. LangGraph just got Dawn.
Same conventions you already know. Purpose-built for AI agents.
The Solution
Dawn gives your agents the structure they deserve.
Convention
Routes, tools, state, config. Everything in the right place. If you know App Router, you know Dawn.
Type Safety
Tool signatures extracted at build time. Full autocomplete. No manual type wiring.
Tooling
Dev server with hot reload. CLI for running, testing, and validating. Vite-powered.
See It
A Dawn app, typed end to end.
Project Structure
src/app/(public)/hello/[tenant]/index.ts
tools/greet.ts
dawn.generated.d.ts (auto-generated)
Terminal
Type-safe tools, inferred automatically. No manual type wiring. No Zod boilerplate.
The Deploy Story
Build locally. Deploy to LangSmith.
Dawn owns your local development lifecycle. When you're ready to ship, your routes speak the LangGraph Platform protocol natively — deploy as LangSmith assistants with the infrastructure you already trust.
Develop
Validate
Deploy
Dawn's dev server speaks the LangGraph Platform protocol natively — /runs/wait, /runs/stream, assistant_id routing. What runs locally deploys without translation.
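A minimal client sketch against those endpoints. The base URL, port, and assistant naming are illustrative assumptions, not documented Dawn defaults:

```typescript
// Sketch of a blocking run over the LangGraph Platform protocol.
// Builds the JSON request body the /runs endpoints expect.
const buildRunRequest = (assistantId: string, input: unknown) => ({
  method: "POST" as const,
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ assistant_id: assistantId, input }),
})

// POST /runs/wait blocks until the run finishes and returns the final state;
// /runs/stream would return incremental events instead.
async function runAndWait(baseUrl: string, assistantId: string, input: unknown) {
  const res = await fetch(`${baseUrl}/runs/wait`, buildRunRequest(assistantId, input))
  return res.json()
}
```

Usage against a hypothetical local dev server: `await runAndWait("http://localhost:2024", "hello", { tenant: "acme" })`.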
Everything you need.
And nothing you don't.
File-system Routing
Routes map to directories. Route groups, dynamic segments, catch-all params. Same conventions as Next.js App Router.
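The directory-to-route convention can be sketched as a small mapping function. This is not Dawn's implementation, just an illustration of the App Router rules named above (route groups dropped from the URL, dynamic and catch-all segments kept):

```typescript
// Sketch: map a route file path to its URL, App Router style.
function fileToRoute(filePath: string): string {
  return (
    "/" +
    filePath
      .replace(/^src\/app\//, "")   // strip the app root
      .replace(/\/index\.ts$/, "")  // index.ts marks the route entry
      .split("/")
      .filter((seg) => !/^\(.*\)$/.test(seg)) // route groups "(public)" don't affect the URL
      .join("/")
  )
}

fileToRoute("src/app/(public)/hello/[tenant]/index.ts") // → "/hello/[tenant]"
fileToRoute("src/app/docs/[...slug]/index.ts")          // → "/docs/[...slug]"
```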
Type-safe Tools
Tool types inferred from source via the TypeScript compiler API. Full autocomplete. Zero manual wiring.
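To make the inference concrete: given a tool written as a plain async function, the generated type is just that function's own signature. The tool below and the sketched type are illustrations, not actual typegen output:

```typescript
// A hypothetical tool — in a Dawn app this would be the default export
// of tools/summarize.ts.
const summarize = async (i: { readonly text: string; readonly maxWords?: number }) => ({
  summary: i.text.split(/\s+/).slice(0, i.maxWords ?? 20).join(" "),
})

// What typegen could infer from that signature (hand-written sketch;
// in reality this would land in dawn.generated.d.ts):
type SummarizeTool = typeof summarize
// => (i: { readonly text: string; readonly maxWords?: number }) => Promise<{ summary: string }>
```

No Zod schema to keep in sync: the function signature is the schema.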
Vite Dev Server
Hot reload on tool and route changes. Parent-child process architecture for clean restarts.
Scenario Testing
Co-located test scenarios with expected outputs. Run against in-process, CLI, or dev server.
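Dawn's scenario file format isn't shown on this page, so the shape below is a hypothetical illustration of the co-located-scenario idea, with field names that are assumptions:

```typescript
// Hypothetical scenario for the /hello/[tenant] route — field names are
// illustrative, not Dawn's actual scenario API. Would live next to the
// route, e.g. in a scenarios/ directory.
const scenario = {
  name: "greets acme",
  input: { tenant: "acme" },
  expected: { tenant: "acme", greeting: "Hello, acme!" },
}
// export default scenario  // picked up by `dawn test`
```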
Pluggable Backends
LangGraph graphs, LangGraph workflows, LangChain LCEL chains. One framework, multiple execution modes.
Dawn CLI
check, routes, typegen, run, test, dev. Everything from one command. No config sprawl.
Up and running in 30 seconds.
Scaffold
npx create-dawn-app my-agent

Write a route
Export a workflow, graph, or chain from your route's index.ts. Add tools in a tools/ directory.
Run it
dawn run '/hello/acme'

Test & iterate
dawn dev

Hot reload. Change tools, see results instantly.
Ecosystem
Built for the LangChain ecosystem.
Dawn is a meta-framework for LangGraph and LangChain. Use the tools and models you already know.
@dawn/langgraph
Backend adapter for LangGraph graphs and workflows. Native execution.
@dawn/langchain
Adapter for LCEL chains. Convert Dawn tools to LangChain tools automatically.
@dawn/sdk
Backend-neutral contract. RuntimeContext, tools, route config. Bring any adapter.
Ready to build?
Give your AI agents the structure they deserve.