# MCP Integration
Connect Kubiya to any AI assistant (Claude, ChatGPT, Cursor) using the Model Context Protocol (MCP). Write simple workflows and let AI execute them on behalf of authenticated users.
## Identity-Aware Workflow Execution
When users interact with Kubiya through the Composer UI or other interfaces, the MCP server ensures workflows execute with their identity. This provides:
- **User Attribution** - Every workflow run is tied to the user who initiated it
- **Permission Enforcement** - Users can only execute workflows they have access to
- **Audit Compliance** - Complete traceability of who did what and when
- **Multi-tenant Isolation** - Users' workflows and data remain separate
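Conceptually, every execution carries the identity of the user who started it. A minimal sketch of such an attribution record (the field names here are illustrative, not Kubiya's actual schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ExecutionRecord:
    """Hypothetical audit entry tying a workflow run to a user."""
    workflow: str
    user_id: str
    org_id: str  # multi-tenant isolation key
    started_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def audit_line(self) -> str:
        # "Who did what and when", in one line
        return f"{self.started_at.isoformat()} {self.user_id}@{self.org_id} ran {self.workflow}"

record = ExecutionRecord("deploy-app", "alice", "acme")
print(record.audit_line())
```

An audit log built from records like this gives you attribution and traceability per run; the `org_id` field is what keeps tenants' data separate.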
## Quick Start (6 Lines of Code!)
```python
import asyncio
from mcp_use import MCPClient
from mcp_use.adapters.langchain_adapter import LangChainAdapter

async def main():
    # Configure and connect to Kubiya
    client = MCPClient.from_dict({
        "mcpServers": {
            "kubiya": {
                "command": "python3",
                "args": ["-m", "kubiya_workflow_sdk.mcp.server"]
            }
        }
    })

    # That's it! Now use the tools
    adapter = LangChainAdapter()
    tools = await adapter.create_tools(client)

asyncio.run(main())
```
## What Can You Do?
The Kubiya MCP server exposes these powerful tools to any AI:
| Tool | What it does | Example |
|------|--------------|---------|
| `define_workflow` | Create workflows from simple Python code | Create deployment pipelines |
| `execute_workflow` | Run workflows with parameters | Deploy to production |
| `list_workflows` | See all your workflows | Check available automations |
| `validate_workflow` | Check workflow syntax | Validate before running |
| `export_workflow` | Export as YAML/JSON | Share with team |
| `get_execution` | Track workflow status | Monitor deployments |
## Creating Workflows - It's Simple!
No decorators, no complexity. Just simple, clean Python:
### Hello World
```python
from kubiya_workflow_sdk.dsl import Workflow

wf = Workflow("hello-world")
wf.description("My first workflow!")
wf.step("greet", "echo 'Hello, Kubiya!'")
wf.step("date", "date")
```
### Real-World Example: Deploy Pipeline
```python
from kubiya_workflow_sdk.dsl import Workflow

wf = Workflow("deploy-app")
wf.description("Deploy application to production")

# Add parameters
wf.params(app="myapp", env="staging", version="latest")

# Build and test
wf.step("build", "docker build -t {{app}}:{{version}} .")
wf.step("test", "docker run {{app}}:{{version}} npm test")

# Deploy based on environment
wf.step("deploy-staging",
        "kubectl apply -f k8s/staging/",
        condition="{{env}} == 'staging'")
wf.step("deploy-prod",
        "kubectl apply -f k8s/production/",
        condition="{{env}} == 'production'")

# Verify deployment
wf.step("verify", "curl https://{{app}}-{{env}}.example.com/health")
```
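The `{{param}}` placeholders are resolved against the workflow's parameters at run time. A rough sketch of that substitution, assuming simple string templating (not Kubiya's actual template engine):

```python
import re

def render(command: str, params: dict) -> str:
    """Replace {{name}} placeholders with parameter values."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(params[m.group(1)]), command)

params = {"app": "myapp", "env": "staging", "version": "latest"}
print(render("docker build -t {{app}}:{{version}} .", params))
# docker build -t myapp:latest .
```

Because substitution happens at execution time, the same workflow definition can be run against any parameter set (for example, `env="production"` instead of the `staging` default).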
## Connect Your AI Assistant
### Option 1: Use with Any LangChain LLM
```python
import asyncio
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    # Setup
    client = MCPClient.from_dict({
        "mcpServers": {
            "kubiya": {
                "command": "python3",
                "args": ["-m", "kubiya_workflow_sdk.mcp.server"],
                "env": {"KUBIYA_API_KEY": "your-key"}  # For execution
            }
        }
    })

    # Create AI agent
    llm = ChatOpenAI(model="gpt-4")
    agent = MCPAgent(llm=llm, client=client)

    # Let AI create workflows!
    result = await agent.run(
        "Create a workflow that backs up my database every night"
    )

asyncio.run(main())
```
### Option 2: Call the Tools Directly

```python
from mcp_use.adapters.langchain_adapter import LangChainAdapter

# Get tools (inside the same async context, with `client` configured as in Option 1)
adapter = LangChainAdapter()
tools = await adapter.create_tools(client)

# Find the tool you need
define_tool = next(t for t in tools if t.name == "define_workflow")

# Create workflow directly
result = await define_tool.ainvoke({
    "name": "backup-db",
    "code": '''
from kubiya_workflow_sdk.dsl import Workflow

wf = Workflow("backup-db")
wf.step("dump", "pg_dump mydb > backup.sql")
wf.step("compress", "gzip backup.sql")
wf.step("upload", "aws s3 cp backup.sql.gz s3://backups/")
'''
})
```
## Real Examples That Work
### 1. Data Processing Pipeline
```python
from kubiya_workflow_sdk.dsl import Workflow

wf = Workflow("process-data")
wf.params(input="/data/raw", output="/data/processed")

# Process files in parallel
wf.parallel_steps(
    "process-files",
    items=["file1.csv", "file2.csv", "file3.csv"],
    command="python process.py {{input}}/{{item}} {{output}}/{{item}}"
)
```
### 2. Infrastructure Automation
```python
from kubiya_workflow_sdk.dsl import Workflow

wf = Workflow("setup-infra")
wf.params(provider="aws", region="us-east-1")
wf.step("terraform-init", "terraform init")
wf.step("terraform-plan", "terraform plan -out=tfplan")
wf.step("terraform-apply", "terraform apply tfplan")
```
### 3. CI/CD Pipeline
```python
from kubiya_workflow_sdk.dsl import Workflow

wf = Workflow("ci-pipeline")
wf.step("checkout", "git checkout main")
wf.step("install", "npm install")
wf.step("lint", "npm run lint")
wf.step("test", "npm test")
wf.step("build", "npm run build")
wf.step("deploy", "npm run deploy:prod", condition="{{branch}} == 'main'")
```
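Step conditions like `{{branch}} == 'main'` gate a step on a parameter value. A minimal evaluator for that equality form (hypothetical; the real engine may support richer expressions):

```python
import re

def condition_met(condition: str, params: dict) -> bool:
    """Evaluate a "{{name}} == 'value'" condition against workflow params."""
    m = re.fullmatch(r"\{\{(\w+)\}\}\s*==\s*'([^']*)'", condition.strip())
    if not m:
        raise ValueError(f"unsupported condition: {condition}")
    return str(params.get(m.group(1))) == m.group(2)

print(condition_met("{{branch}} == 'main'", {"branch": "main"}))      # True
print(condition_met("{{env}} == 'production'", {"env": "staging"}))   # False
```

A step whose condition evaluates to `False` is simply skipped, which is how the deploy pipeline above targets staging or production from a single definition.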
## Installation
```bash
# Install the SDK and mcp-use
pip install kubiya-workflow-sdk mcp-use

# For AI agents, also install your preferred LLM integration
pip install langchain-openai     # For OpenAI
pip install langchain-anthropic  # For Claude
```
## Running the MCP Server
The MCP server starts automatically when you connect, but you can also run it standalone:
```bash
# Run the server
python -m kubiya_workflow_sdk.mcp.server
```
## Authentication Options
The MCP server supports two authentication approaches:
### 1. Simple API Key (Getting Started)
Perfect for development and simple integrations:
```python
# Pass API key as parameter
result = await execute_tool.ainvoke({
    "name": "my-workflow",
    "params": {"env": "production"},
    "api_key": "your-kubiya-api-key"
})
```

Or use an environment variable:

```bash
export KUBIYA_API_KEY="your-api-key"
```
### 2. OAuth/OIDC Authentication (Production)
For enterprise deployments with proper authentication:
```bash
# Run server with OAuth/OIDC
python -m kubiya_workflow_sdk.mcp.server_auth \
  --auth-server https://your-auth-server.com \
  --auth-type oidc
```
Learn more in the Authentication Guide.
The server exposes 8 powerful tools:

| Tool | Purpose | Requires API Key |
|------|---------|------------------|
| `define_workflow` | Create workflows from code | No |
| `create_workflow_from_json` | Create from JSON/YAML | No |
| `execute_workflow` | Run workflows | Yes (via parameter) |
| `get_execution` | Check execution status | No |
| `list_workflows` | List all workflows | No |
| `validate_workflow` | Validate workflow syntax | No |
| `export_workflow` | Export to YAML/JSON | No |
| `validate_workflow_code` | Validate Python code | No |
## FAQ
**Q: Do I need decorators?**
A: No! Just use the simple DSL: `wf = Workflow("name")` and add steps.

**Q: Can I use this with Claude/ChatGPT/Cursor?**
A: Yes! Any tool that supports MCP or LangChain can connect.

**Q: Do I need an API key?**
A: Only for executing workflows. Creating and validating workflows work without a key.

**Q: Is it production-ready?**
A: Yes! The MCP server is battle-tested and ready for production use.