# Framework Guides
Keito integrates with popular AI agent frameworks, primarily through the Python SDK. Each integration automatically tracks time and LLM token costs.
## Supported Frameworks
| Framework | Integration | Documentation |
|---|---|---|
| LangChain | Callback handler | LangChain Integration |
| CrewAI | Task callback | CrewAI Integration |
| Custom Python | Context manager / decorator | Python Agent Integration |
| Custom TypeScript | Wrapper function | Node Agent Integration |
| CLI-based agents | Auto-detection | CLI Agent Mode |
## How Framework Integrations Work
All framework integrations follow the same underlying pattern:
- Hook into the framework’s lifecycle — callbacks, decorators, or context managers that fire when agent work starts and stops.
- Create a time entry with `source: "agent"` when work begins.
- Capture token usage from the framework’s built-in tracking.
- Stop the timer and log an expense when work completes.
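The lifecycle above can be sketched as a context manager. This is an illustrative sketch, not the actual SDK: the client class and method names (`start_entry`, `stop_entry`, `log_expense`) are stand-ins for whatever your integration calls.

```python
import time
from contextlib import contextmanager

class InMemoryClient:
    """Stand-in for a real Keito client; records calls in memory."""
    def __init__(self):
        self.entries = []
        self.expenses = []

    def start_entry(self, description):
        entry = {"description": description, "source": "agent",
                 "is_running": True, "started_at": time.time()}
        self.entries.append(entry)
        return entry

    def stop_entry(self, entry):
        entry["is_running"] = False
        entry["hours"] = (time.time() - entry["started_at"]) / 3600

    def log_expense(self, tokens):
        self.expenses.append({"tokens": tokens})

@contextmanager
def track_agent_work(client, description):
    """Start a running time entry, yield a dict the caller fills with
    token counts, then stop the timer and log the expense — even if
    the agent raises."""
    entry = client.start_entry(description)
    usage = {"tokens": 0}
    try:
        yield usage
    finally:
        client.stop_entry(entry)
        client.log_expense(usage["tokens"])

client = InMemoryClient()
with track_agent_work(client, "summarize docs") as usage:
    usage["tokens"] = 1234  # would come from the framework's token tracking
```

The `try`/`finally` is the important part: the timer stops and the expense is logged even when the agent errors mid-run, which is what the framework hooks (callbacks, decorators) guarantee for you.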
## Choosing a Framework Integration
- LangChain — best for chain and agent pipelines. The callback handler hooks into LangChain’s callback system and tracks each invocation automatically.
- CrewAI — best for multi-agent crews. The task callback tracks each task separately, attributing work to the specific agent that executed it.
- Custom — best for bespoke agent implementations. Use the context manager (Python) or wrapper helper (Node) for full control.
## Building a Custom Integration
If your framework isn’t listed, use the REST API directly. The pattern is always the same:
- `POST /v1/time-entries` with `is_running: true` when the agent starts.
- `PATCH /v1/time-entries/:id` with `is_running: false` and final `hours` when done.
- `POST /v1/expenses` with token counts.
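The three-call sequence can be expressed as plain request descriptions, decoupled from any HTTP client. The endpoint paths match the list above; the base URL, the expense field names, and the idea that the entry ID comes back from the first call are assumptions for the sketch.

```python
import json

BASE_URL = "https://api.keito.example"  # placeholder; use your instance's URL

def build_request(method, path, payload):
    """Describe one API call; send it with any HTTP client you like."""
    return {"method": method, "url": BASE_URL + path,
            "body": json.dumps(payload)}

def agent_run_requests(entry_id, hours, tokens):
    """The full sequence for one agent run. In practice `entry_id`
    comes from the response to the first POST."""
    return [
        build_request("POST", "/v1/time-entries",
                      {"source": "agent", "is_running": True}),
        build_request("PATCH", f"/v1/time-entries/{entry_id}",
                      {"is_running": False, "hours": hours}),
        build_request("POST", "/v1/expenses",
                      {"time_entry_id": entry_id, "tokens": tokens}),
    ]
```

Keeping request construction pure like this makes the integration easy to unit-test without hitting the API.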
See REST API Reference for full endpoint documentation.