The ADK (Agent Development Kit) provider enables AI-powered workflow generation using advanced language models. It’s the first official provider for the Kubiya Workflow SDK.

What Is the ADK Provider?

The ADK provider integrates Google’s Agent Development Kit with the Kubiya Workflow SDK to enable:

  • Natural Language to Workflow: Describe what you want in plain English
  • Intelligent Code Generation: AI writes the workflow code for you
  • Automatic Validation: Ensures generated workflows are syntactically correct
  • Smart Refinement: Automatically fixes errors through iterative improvement
  • Production Ready: Generated workflows are optimized for production use

The provider operates in two modes:

  • Plan Mode: Generate workflows without executing them
  • Act Mode: Generate and immediately execute workflows
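
Both modes go through the same compose() call; only the mode argument differs. A minimal sketch (full setup is covered under Prerequisites below):

from kubiya_workflow_sdk.providers import get_provider

adk = get_provider("adk")

# Plan: generate only, returning the workflow for review
result = await adk.compose(task="Check disk space", mode="plan")

# Act: generate and execute, streaming events as they happen
async for event in adk.compose(task="Check disk space", mode="act", stream=True):
    print(event)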

Prerequisites

1. Install SDK with ADK

pip install "kubiya-workflow-sdk[adk]"
2. Get API Keys

You'll need:

  • Kubiya API Key: From app.kubiya.ai
  • Together AI Key: From together.ai (the default model provider)
  • Google AI Key (optional): For Google models
3. Set Environment Variables

export KUBIYA_API_KEY="your-kubiya-key"
export TOGETHER_API_KEY="your-together-key"

# Optional for Google models
export GOOGLE_API_KEY="your-google-key"
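
Assuming the provider picks these keys up from the environment (as the export step implies), a quick sanity check before generating anything; this helper is purely illustrative, not part of the SDK:

import os

# Fail fast if a required key is missing
for key in ("KUBIYA_API_KEY", "TOGETHER_API_KEY"):
    if not os.environ.get(key):
        raise RuntimeError(f"{key} is not set")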

Quick Example

Generate a Simple Workflow

from kubiya_workflow_sdk.providers import get_provider

# Initialize ADK provider
adk = get_provider("adk")

# Generate a workflow from natural language
result = await adk.compose(
    task="Create a workflow that checks disk space and sends an alert if usage is above 80%",
    mode="plan"  # Just generate, don't execute
)

# View the generated workflow
workflow = result["workflow"]
print(f"Generated workflow: {workflow.name}")
print(f"Steps: {len(workflow.steps)}")

# Print the workflow as YAML
print(workflow.to_yaml())

Generate and Execute

# Generate AND execute in one go
async for event in adk.compose(
    task="List all Python files in /tmp and count lines of code",
    mode="act",  # Generate and execute
    stream=True  # Stream execution events
):
    if event.get("type") == "text":
        # Generation progress
        print(f"AI: {event.get('content')}")
    elif event.get("type") == "execution":
        # Execution events
        print(f"Execution: {event.get('data')}")

How It Works

1. Context Loading

The ADK provider first loads context about your Kubiya environment:

  • Available runners and their capabilities
  • Installed integrations (Slack, AWS, etc.)
  • Accessible secrets and resources
2. Workflow Generation

Using advanced AI models (DeepSeek V3 by default), it generates:

  • Complete workflow structure
  • Appropriate steps with dependencies
  • Error handling and retries
  • Integration with platform features
3. Validation & Refinement

The generated code is:

  • Compiled to check syntax
  • Validated against SDK requirements
  • Refined if errors are found
  • Optimized for production use
4. Execution (Act Mode)

If requested, the workflow is:

  • Submitted to Kubiya platform
  • Executed with real-time streaming
  • Monitored for completion

Advanced Examples

Complex Multi-Step Workflow

task = """
Create a comprehensive backup and disaster recovery workflow that:
1. Backs up all PostgreSQL databases
2. Compresses backups with timestamps
3. Uploads to S3 with encryption
4. Validates backup integrity
5. Removes backups older than 30 days
6. Sends detailed report to Slack with:
   - Backup sizes
   - Duration
   - Success/failure status
7. Triggers PagerDuty alert if backup fails
"""

result = await adk.compose(task=task, mode="plan")

# The AI will generate a complete workflow with:
# - Proper error handling
# - Retry logic for network operations  
# - Parallel execution where appropriate
# - Integration with S3, Slack, and PagerDuty

Infrastructure Automation

task = """
Create a Kubernetes deployment workflow that:
1. Validates the deployment manifest
2. Checks if the namespace exists, creates if not
3. Applies the deployment with rolling update strategy
4. Waits for all pods to be ready (timeout 5 minutes)
5. Runs smoke tests against the new deployment
6. Rolls back to the previous version automatically if the smoke tests fail
7. Updates the deployment status in the GitHub PR
"""

# Generate and execute with streaming
async for event in adk.compose(task=task, mode="act", stream=True):
    # Handle events as they come
    handle_event(event)
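
The handle_event helper above is not part of the SDK. A minimal sketch, assuming the event shapes from the streaming example earlier (a "type" of "text" or "execution"):

def handle_event(event: dict) -> None:
    """Route streamed compose() events to simple console output."""
    kind = event.get("type")
    if kind == "text":
        # Generation progress from the model
        print(f"AI: {event.get('content')}")
    elif kind == "execution":
        # Events from the running workflow
        print(f"Execution: {event.get('data')}")
    else:
        # Unknown event types: print and keep streaming
        print(f"Event: {event}")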

Incident Response

task = """
Create an incident response workflow that:
1. Fetches alert details from PagerDuty
2. Checks system health metrics for the affected service
3. Attempts automatic remediation based on alert type:
   - For high CPU: restart the service
   - For high memory: clear caches and restart
   - For disk space: clean up logs and temp files
4. Verifies the issue is resolved
5. Creates a JIRA ticket with incident details
6. Posts incident summary to #incidents Slack channel
"""

# With custom context
context = {
    "services": ["api", "web", "worker"],
    "monitoring_tool": "datadog",
    "ticketing_system": "jira"
}

result = await adk.compose(
    task=task,
    context=context,
    mode="plan"
)

Streaming Formats

The ADK provider supports multiple streaming formats. For example, compose() can emit events compatible with Vercel's AI SDK format:

async for event in adk.compose(
    task="Your task",
    mode="act",
    stream=True,
    stream_format="vercel"
):
    print(event)  # events formatted for the Vercel AI SDK protocol
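
The Vercel format is intended for streaming into frontend UIs. A sketch of bridging compose() to an HTTP endpoint, assuming FastAPI (an external dependency, not part of the SDK) and assuming each event arrives as a string or a JSON-serializable dict:

import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

from kubiya_workflow_sdk.providers import get_provider

app = FastAPI()
adk = get_provider("adk")

@app.get("/compose")
async def compose_stream(task: str):
    async def events():
        async for event in adk.compose(
            task=task, mode="act", stream=True, stream_format="vercel"
        ):
            # Assumption: events are strings or JSON-serializable dicts
            yield event if isinstance(event, str) else json.dumps(event)
    return StreamingResponse(events(), media_type="text/event-stream")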

Best Practices

  • Be Specific: The more specific your task description, the better the generated workflow.
  • Include Examples: Provide example commands or expected outputs when relevant.
  • Specify Requirements: Mention specific tools, integrations, or constraints.
  • Iterate: Use plan mode first, review the result, then execute (see the sketch below).
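
A sketch of that plan-review-execute loop, using only the compose() calls shown earlier (note that act mode generates again; executing a previously reviewed plan object directly is not covered in this guide):

task = "Back up all PostgreSQL databases and upload to S3"

# 1. Plan: generate without executing, then review the YAML
result = await adk.compose(task=task, mode="plan")
print(result["workflow"].to_yaml())

# 2. Act: once the plan looks right, generate and execute
async for event in adk.compose(task=task, mode="act", stream=True):
    print(event)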
