Connect Kubiya to your existing LangGraph, LangChain, CrewAI, and other AI frameworks
Kubiya seamlessly integrates with popular AI frameworks, allowing you to enhance your existing agent systems with production-grade orchestration, execution, and scaling capabilities. Instead of replacing your current setup, Kubiya acts as an execution layer that brings deterministic workflows and enterprise features to your AI agents.
🚀 Production-Ready Execution
Transform your experimental AI agents into production-ready systems with containerized execution, error handling, and automatic scaling.
🔒 Enterprise Security & Compliance
Add enterprise-grade security, audit trails, and compliance features to your existing AI workflows without changing your core logic.
🌊 Streaming & Real-time Updates
Enable real-time streaming of agent execution to frontend applications using standard protocols like SSE and Vercel AI SDK.
⚖️ Deterministic & Reliable
Unlike free-roaming agent frameworks, Kubiya provides rails and boundaries that ensure predictable execution paths.
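The streaming card above mentions SSE. As a framework-agnostic sketch, the snippet below encodes a stream of agent events in the standard Server-Sent Events wire format (`event:`/`data:` lines terminated by a blank line), which browsers and the Vercel AI SDK can consume. The event names and payload fields are illustrative assumptions, not a fixed Kubiya schema:

```python
# Illustrative sketch: encoding agent events as Server-Sent Events (SSE).
# Event types ("token", "done") and payload fields are assumptions for
# illustration; only the SSE framing itself is standard.

import json
from typing import Iterable, Iterator

def to_sse(events: Iterable[dict]) -> Iterator[str]:
    """Encode a stream of agent events as SSE frames."""
    for event in events:
        yield f"event: {event.get('type', 'message')}\n"
        yield f"data: {json.dumps(event)}\n\n"

# Tokens produced by any agent framework become SSE frames:
frames = "".join(to_sse([
    {"type": "token", "text": "Hello"},
    {"type": "done"},
]))
print(frames)
```

In a real integration these frames would be written to an HTTP response with `Content-Type: text/event-stream`.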
Multi-agent workflows with state management
Chain-based agent pipelines
Multi-agent collaboration
Your own AI framework
There are three main patterns for integrating existing AI frameworks with Kubiya:
Wrap your existing agent code within Kubiya workflows for execution orchestration:
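As a minimal sketch of this pattern (not Kubiya's actual SDK), the wrapper below shows the shape of workflow wrapping: your existing agent entry point is left untouched, while a decorator adds retries and a structured, serializable result around it. `workflow_step` and `run_research_agent` are hypothetical names for illustration:

```python
# Hypothetical sketch of workflow wrapping: an orchestration layer adds
# retries and structured output around an unmodified agent callable.

import functools
import json
import time

def workflow_step(retries=2, delay=0.0):
    """Wrap an agent callable with retry logic and a JSON-serializable result."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            last_err = None
            for attempt in range(retries + 1):
                try:
                    result = fn(*args, **kwargs)
                    return {"status": "ok", "attempt": attempt, "output": result}
                except Exception as e:  # capture and retry
                    last_err = e
                    time.sleep(delay)
            return {"status": "error", "attempt": retries, "error": str(last_err)}
        return wrapper
    return decorator

@workflow_step(retries=2)
def run_research_agent(query: str) -> str:
    # In a real integration this would call your existing agent, e.g.
    # `graph.invoke({"question": query})` for a LangGraph app.
    return f"findings for: {query}"

print(json.dumps(run_research_agent("kubernetes scaling")))
```

The key point is that the framework code inside the wrapped function does not change; only its execution context does.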
Deploy your framework as a custom agent server provider:
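A custom provider essentially exposes your framework's agent behind an HTTP endpoint the orchestrator can call. The sketch below uses only the Python standard library; the endpoint path and payload shape are illustrative assumptions, not Kubiya's actual provider contract:

```python
# Hypothetical sketch of a custom agent-server provider: the framework's
# agent sits behind a small HTTP endpoint. Path and payload are illustrative.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_agent(payload: dict) -> dict:
    # Replace with your framework call, e.g. crew.kickoff(inputs=payload).
    return {"answer": f"processed: {payload.get('prompt', '')}"}

class AgentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/agent/invoke":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(run_agent(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port: int = 8080) -> None:
    """Start the agent server (blocking)."""
    HTTPServer(("0.0.0.0", port), AgentHandler).serve_forever()
```

In production you would typically use your existing web stack rather than `http.server`; the point is the contract: one stateless invoke endpoint that wraps the framework call.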
Combine Kubiya’s orchestration with your framework’s agent logic:
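A minimal sketch of the hybrid pattern: a deterministic orchestrator owns the step order and shared state, while each step delegates to your framework's agents. The step names and functions below are illustrative stand-ins, not a real Kubiya API:

```python
# Hypothetical hybrid sketch: deterministic pipeline around framework agents.

from typing import Callable, Dict, List, Tuple

Step = Tuple[str, Callable[[dict], dict]]

def run_pipeline(steps: List[Step], state: dict) -> dict:
    """Run steps in a fixed order, merging each step's output into state."""
    for name, fn in steps:
        state = {**state, **fn(state), "last_step": name}
    return state

def plan(state: dict) -> dict:
    # Could invoke a LangGraph node here.
    return {"plan": f"plan for {state['goal']}"}

def execute(state: dict) -> dict:
    # Could hand off to a CrewAI crew here.
    return {"result": f"executed {state['plan']}"}

final = run_pipeline([("plan", plan), ("execute", execute)], {"goal": "report"})
print(final["result"])
```

The orchestrator guarantees the execution path is fixed and auditable; only the behavior inside each step is delegated to the agent framework.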
Kubernetes-Native Scaling
Your existing AI agents automatically gain:

Built-in Observability
Gain visibility into your AI agents' execution:
Enterprise Security Layer
Add security without code changes:
Start with Workflow Wrapping
Begin by wrapping your existing agents in Kubiya workflows without changing core logic
Add Orchestration Features
Gradually add streaming, monitoring, and error handling capabilities
Optimize for Production
Refactor critical paths to use native Kubiya features for better performance
Full Integration
Move to custom providers or hybrid architectures for maximum benefit
Deploy Side by Side
Run your existing system alongside Kubiya-integrated versions
A/B Testing
Compare performance and reliability between implementations
Gradual Traffic Shift
Slowly move traffic from legacy to Kubiya-integrated agents
Sunset Legacy
Retire old implementation once migration is complete
🔍 Research & Analysis Agents
Scenario: You have LangGraph agents doing research and analysis
Integration: Wrap in Kubiya workflows to add streaming results, automatic retries, and result caching
Benefit: Research agents become production-ready with real-time progress updates
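The result caching mentioned above can be sketched with the standard library alone: identical queries reuse the cached answer instead of re-running the agent. `research` is a hypothetical stand-in for your wrapped LangGraph agent; the call counter exists only to make the caching visible:

```python
# Illustrative result caching for a research agent: repeated identical
# queries hit the cache instead of re-running the agent.

from functools import lru_cache

calls = {"count": 0}  # only here to show how often the agent actually runs

@lru_cache(maxsize=128)
def research(query: str) -> str:
    calls["count"] += 1
    return f"summary of {query}"

research("llm evaluation")
research("llm evaluation")  # served from cache, agent does not run again
print(calls["count"])
```

A production cache would usually be external (keyed on query plus model version) so results survive restarts, but the interface is the same.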
👥 Multi-Agent Teams
Scenario: CrewAI teams handling complex business processes
Integration: Use Kubiya’s orchestration for team coordination and result aggregation
Benefit: Better coordination, fault tolerance, and scalability across agent teams
🔗 Chain-Based Processing
Scenario: LangChain pipelines for document processing or RAG
Integration: Add Kubiya’s execution layer for parallel processing and error recovery
Benefit: Faster processing through parallelization and better error handling
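The parallelization described above can be sketched with `concurrent.futures`: independent documents are processed concurrently, and failures are captured per item instead of aborting the whole batch. `process_doc` is a hypothetical stand-in for your LangChain chain call:

```python
# Hedged sketch: parallel, fault-isolated processing of a document batch.

from concurrent.futures import ThreadPoolExecutor

def process_doc(doc: str) -> str:
    # Replace with your chain call, e.g. chain.invoke({"text": doc}).
    if not doc:
        raise ValueError("empty document")
    return doc.upper()

def process_batch(docs, max_workers=4):
    def safe(doc):
        try:
            return {"doc": doc, "ok": True, "output": process_doc(doc)}
        except Exception as e:
            return {"doc": doc, "ok": False, "error": str(e)}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(safe, docs))

results = process_batch(["alpha", "", "beta"])
print(results)
```

Threads suit I/O-bound chains (LLM API calls); for CPU-bound steps a process pool or container-level fan-out would be the analogous choice.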
🤝 Human-in-the-Loop
Scenario: Agents requiring human approval or intervention
Integration: Use Kubiya’s workflow pausing and resumption capabilities
Benefit: Seamless human-AI collaboration with proper state management
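The pause-and-resume behavior can be sketched as a checkpoint handed back at the approval gate and passed in again after the human decision. The checkpoint dict and function names below are illustrative assumptions, not Kubiya's actual API:

```python
# Illustrative pause/resume sketch for human-in-the-loop workflows.

def run_until_approval(request: str) -> dict:
    """Run the agent up to the approval gate, then return a checkpoint."""
    draft = f"draft answer for: {request}"  # your agent produces this
    return {"status": "awaiting_approval", "request": request, "draft": draft}

def resume(checkpoint: dict, approved: bool) -> dict:
    """Resume from a checkpoint after a human decision."""
    if checkpoint["status"] != "awaiting_approval":
        raise ValueError("nothing to resume")
    if not approved:
        return {"status": "rejected", "request": checkpoint["request"]}
    return {"status": "completed", "output": checkpoint["draft"]}

cp = run_until_approval("refund customer order")
print(resume(cp, approved=True)["status"])
```

In a real system the checkpoint would be persisted (so the resume can happen hours later, from another process) rather than held in memory.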
Choose your framework and follow the specific integration guide:
Connect stateful multi-agent graphs
Enhance chain-based workflows
Scale multi-agent teams