Framework Guides

Keito integrates with popular AI agent frameworks through the Python SDK. Each integration automatically tracks time and LLM token costs.

Supported Frameworks

Framework           Integration                    Documentation
LangChain           Callback handler               LangChain Integration
CrewAI              Task callback                  CrewAI Integration
Custom Python       Context manager / decorator    Python Agent Integration
Custom TypeScript   Wrapper function               Node Agent Integration
CLI-based agents    Auto-detection                 CLI Agent Mode

How Framework Integrations Work

All framework integrations follow the same underlying pattern:

  1. Hook into the framework’s lifecycle — callbacks, decorators, or context managers that fire when agent work starts and stops.
  2. Create a time entry with source: "agent" when work begins.
  3. Capture token usage from the framework’s built-in tracking.
  4. Stop the timer and log an expense when work completes.
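The four steps above can be sketched as a Python context manager. The client class and its method names here are illustrative stand-ins, not the real Keito SDK; an in-memory stub is used so the example is self-contained.

```python
import time
from contextlib import contextmanager

class InMemoryClient:
    """Stand-in for a Keito client (hypothetical API, for illustration only)."""
    def __init__(self):
        self.time_entries = []
        self.expenses = []

    def create_time_entry(self, source):
        entry = {"id": len(self.time_entries), "source": source,
                 "is_running": True, "hours": 0.0}
        self.time_entries.append(entry)
        return entry

    def stop_time_entry(self, entry, hours):
        entry.update(is_running=False, hours=hours)

    def log_expense(self, entry_id, prompt_tokens, completion_tokens):
        self.expenses.append({"time_entry_id": entry_id,
                              "prompt_tokens": prompt_tokens,
                              "completion_tokens": completion_tokens})

@contextmanager
def track_agent_work(client, usage):
    # Step 2: create a time entry with source "agent" when work begins.
    entry = client.create_time_entry(source="agent")
    start = time.monotonic()
    try:
        # Step 1: the framework hook runs the agent work inside this block.
        yield entry
    finally:
        # Step 4: stop the timer and log an expense when work completes.
        hours = (time.monotonic() - start) / 3600
        client.stop_time_entry(entry, hours)
        client.log_expense(entry["id"],
                           usage["prompt_tokens"],
                           usage["completion_tokens"])

client = InMemoryClient()
# Step 3: token usage would come from the framework's built-in tracking.
usage = {"prompt_tokens": 120, "completion_tokens": 45}
with track_agent_work(client, usage):
    pass  # agent work happens here
```

The `finally` block guarantees the entry is closed and the expense logged even if the agent raises, which is why the integrations prefer context managers and callbacks over manual start/stop calls.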

Choosing a Framework Integration

  • LangChain — best for chain and agent pipelines. The callback handler hooks into LangChain’s callback system and tracks each invocation automatically.
  • CrewAI — best for multi-agent crews. The task callback tracks each task separately, attributing work to the specific agent that executed it.
  • Custom — best for bespoke agent implementations. Use the context manager (Python) or wrapper helper (Node) for full control.
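Whichever integration you pick, the shape is the same: a handler with start and end hooks that the framework invokes around each unit of work. This sketch shows that shape generically; the class and method names are illustrative, not the actual LangChain or CrewAI callback signatures.

```python
import time

class AgentRunHandler:
    """Generic lifecycle handler in the shape the framework
    integrations use (names are illustrative, not the real SDK)."""
    def __init__(self):
        self.runs = []       # completed runs with hours and token usage
        self._starts = {}    # run_id -> start timestamp

    def on_run_start(self, run_id):
        # Fired by the framework when an invocation or task begins.
        self._starts[run_id] = time.monotonic()

    def on_run_end(self, run_id, token_usage):
        # Fired when it finishes; record elapsed time and tokens.
        elapsed = time.monotonic() - self._starts.pop(run_id)
        self.runs.append({"run_id": run_id,
                          "hours": elapsed / 3600,
                          "tokens": token_usage})

handler = AgentRunHandler()
handler.on_run_start("run-1")
handler.on_run_end("run-1", {"prompt_tokens": 80, "completion_tokens": 20})
```

In the CrewAI integration the `run_id` role is played by the task, which is how work gets attributed to the specific agent that executed it.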

Building a Custom Integration

If your framework isn’t listed, use the REST API directly. The pattern is always the same:

  1. POST /v1/time-entries with is_running: true when the agent starts.
  2. PATCH /v1/time-entries/:id with is_running: false and final hours when done.
  3. POST /v1/expenses with token counts.
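The three calls above can be sketched with the standard library alone. The base URL, the `project_id` field, and the token-count field names are assumptions for illustration; only the paths, `is_running`, `hours`, and the use of token counts come from the steps above.

```python
import json
import urllib.request

BASE = "https://api.keito.example/v1"  # assumed base URL, not the real one

def _json_request(method, path, body):
    """Build a JSON request without sending it."""
    return urllib.request.Request(
        f"{BASE}{path}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method=method,
    )

def start_entry_request(project_id):
    # Step 1: POST /v1/time-entries with is_running: true when the agent starts.
    return _json_request("POST", "/time-entries",
                         {"project_id": project_id,  # assumed field
                          "source": "agent",
                          "is_running": True})

def stop_entry_request(entry_id, hours):
    # Step 2: PATCH /v1/time-entries/:id with is_running: false and final hours.
    return _json_request("PATCH", f"/time-entries/{entry_id}",
                         {"is_running": False, "hours": hours})

def expense_request(entry_id, prompt_tokens, completion_tokens):
    # Step 3: POST /v1/expenses with token counts (field names assumed).
    return _json_request("POST", "/expenses",
                         {"time_entry_id": entry_id,
                          "prompt_tokens": prompt_tokens,
                          "completion_tokens": completion_tokens})

start = start_entry_request("proj_123")
stop = stop_entry_request("te_1", 0.5)
```

Send each request with `urllib.request.urlopen` (plus your auth header) or swap in your preferred HTTP client; the payload shapes stay the same.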

See REST API Reference for full endpoint documentation.