A single HTTP endpoint that streams responses from multiple AI SDKs
| Endpoint | Purpose |
|---|---|
| `POST /agent/run` | Stream agent responses (SSE) |
| `GET /health` | Service & SDK availability check |
| `POST /project/init` | Initialize project (async job) |
| `GET /project/status/{job_id}` | Poll job status |
| `POST /project/upload` | Upload context files |
| `POST /project/init-sdk` | Configure SDK for project |
| `POST /project/init-mcp` | Initialize MCP servers |
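The table implies a typical flow: initialize a project, poll the async job, then stream an agent run over SSE. Below is a minimal client sketch in TypeScript (Node 18+ fetch), assuming the service listens on localhost:3000 and that the request/response field names shown in the comments (prompt, sdk, path, job_id, status) match the actual contract; treat them as illustrative only.

```typescript
// Minimal client sketch (TypeScript, Node 18+). The base URL and the
// request/response field names (prompt, sdk, path, job_id, status) are
// assumptions for illustration only.
const BASE_URL = process.env.AGENT_SERVICE_URL ?? "http://localhost:3000";

// Kick off project initialization, then poll the async job until it settles.
async function initProject(projectPath: string): Promise<void> {
  const initRes = await fetch(`${BASE_URL}/project/init`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ path: projectPath }), // assumed request shape
  });
  const { job_id } = (await initRes.json()) as { job_id: string };

  while (true) {
    const statusRes = await fetch(`${BASE_URL}/project/status/${job_id}`);
    const { status } = (await statusRes.json()) as { status: string };
    if (status === "completed") return;
    if (status === "failed") throw new Error(`Job ${job_id} failed`);
    await new Promise((resolve) => setTimeout(resolve, 1000)); // poll every second
  }
}

// Stream an agent run over SSE and print each event's data payload.
async function runAgent(prompt: string): Promise<void> {
  const res = await fetch(`${BASE_URL}/agent/run`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Accept: "text/event-stream" },
    body: JSON.stringify({ prompt, sdk: "claude-code" }), // assumed fields
  });
  if (!res.ok || !res.body) throw new Error(`Agent run failed: ${res.status}`);

  const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
  let buffer = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value ?? "";
    // SSE events are separated by a blank line; payload lines start with "data: ".
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? "";
    for (const event of events) {
      for (const line of event.split("\n")) {
        if (line.startsWith("data: ")) console.log(line.slice(6));
      }
    }
  }
}

initProject("./my-project")
  .then(() => runAgent("Add a /health endpoint to the Express app"))
  .catch(console.error);
```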
- Adding AI coding features to your product
- Building client tools with multiple AI providers
- Shipping AI products fast
- Standardizing on a single AI interface
No subscriptions. No hidden fees. Pay once, use forever.
Yes. Full source code with an MIT-style license for commercial use. Modify, extend, and build upon it however you need.
The factory pattern makes adding new providers straightforward. Pro tier includes updates for new SDK integrations for one year.
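As a rough illustration of that factory pattern, the sketch below maps an SDK identifier to a provider implementation. The interface, class names, registry, and placeholder `run` bodies are assumptions for illustration, not the project's actual code.

```typescript
// Hedged sketch of a provider factory; names and shapes are illustrative.
interface AgentProvider {
  name: string;
  run(prompt: string): AsyncIterable<string>; // yields streamed output chunks
}

class ClaudeCodeProvider implements AgentProvider {
  name = "claude-code";
  async *run(prompt: string): AsyncGenerator<string> {
    yield `claude-code handling: ${prompt}`; // placeholder for the real SDK/CLI call
  }
}

class CodexProvider implements AgentProvider {
  name = "codex";
  async *run(prompt: string): AsyncGenerator<string> {
    yield `codex handling: ${prompt}`; // placeholder for the real SDK/CLI call
  }
}

// Adding a provider means implementing AgentProvider and registering it here.
const registry: Record<string, () => AgentProvider> = {
  "claude-code": () => new ClaudeCodeProvider(),
  codex: () => new CodexProvider(),
};

export function createProvider(sdk: string): AgentProvider {
  const factory = registry[sdk];
  if (!factory) throw new Error(`Unknown SDK: ${sdk}`);
  return factory();
}
```

A handler for `POST /agent/run` could then resolve the `sdk` field from the request body via `createProvider` and pipe the async iterator into the SSE response.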
Yes. The service wraps the Claude Code CLI and the Codex CLI. Installation is one npm command each: `npm install -g @anthropic-ai/claude-code` and `npm install -g @openai/codex`.
Yes. Stateless design, proper error handling, async patterns, and clean architecture throughout. Built for horizontal scaling from day one.
Get the code today and ship your AI features this week.