Teradata/teradata_agent_cookbook

Teradata Agent Recipes

A collection of easy-to-run AI agent examples built on Teradata. Each recipe is a self-contained Python application that exposes a standard HTTP endpoint, allowing a single React UI to interact with any of them.

All 7 recipes can be discovered, set up, and launched from a centralized launcher at the project root—no need to cd into individual directories.


Environment Setup

Before running agents, you need to configure your credentials and endpoints.

Step 1: Create your .env file

```bash
cp .env.example .env
```

Step 2: Edit .env with your credentials

Open .env and populate the required variables:

| Variable | Required For | Description | Example |
|---|---|---|---|
| TD_HOST | All recipes | Teradata host (ClearScape or on-premises) | myinstance-xxx.env.clearscape.teradata.com |
| TD_USER | All recipes | Teradata username | demo_user |
| TD_PASSWORD | All recipes | Teradata password | YourPassword123 |
| LLM_API_KEY | All recipes | API key for your LLM provider | sk-... (OpenAI) or sk-ant-... (Anthropic) |
| LLM_PROVIDER | Optional | LLM provider: openai, anthropic, bedrock, or litellm (default: openai) | openai |
| LITELLM_URL | litellm provider only | Chat-completions endpoint of the LiteLLM proxy server | http://localhost:4000/v1/chat/completions |
| TD_MCP_URL | enterprise-templates/governed | URL of the Teradata MCP server | https://mcp-server.example.com:8443 |
| TD_UES_URI | enterprise-templates/vector-search, open-analytics | Teradata Unified Execution Server URI | https://ues.example.com |
| TD_PAT | enterprise-templates/vector-search, open-analytics | Personal Access Token for UES | your-pat-token |
| TD_PEM_PATH | enterprise-templates/vector-search, open-analytics | Path to PEM certificate file | /path/to/cert.pem |

Security Note: Never commit the .env file to version control. It's listed in .gitignore to prevent accidental credential leaks.


Recipe Families

| Family | Tier | Recipes | Ports | What you need |
|---|---|---|---|---|
| basic-agents | 1 | data-analyst, data-analyst-multiturn, data-analyst-dbt, data-analyst-mcp | 8001-8004 | Teradata host, username, password, LLM API key |
| enterprise-templates | 2+ | governed, vector-search, open-analytics | 8005-8007 | Tier 1 + MCP server (governed) / UES URI (vector, analytics) |

Quick Start (Recommended)

Prerequisites: Complete the Environment Setup section above by creating and populating your .env file.

```bash
# From project root: setup all recipes + UI
python launcher.py all --setup

# Or use the shell wrapper (auto-adds UI)
./start-agents.sh all         # Mac/Linux
.\start-agents.ps1 all        # Windows
```

Status: Currently, the data-analyst recipe is fully documented and available. The other recipes are coming soon with full documentation.

Then see LAUNCHER.md for all launcher options.


Recipes at a Glance

Learning & Foundations (basic-agents)

| Recipe | Port | Complexity | Status | What it teaches |
|---|---|---|---|---|
| data-analyst | 8001 | beginner | ✅ Available | Build your first NL-to-SQL agent |
| data-analyst-multiturn | 8002 | beginner | 🔜 Coming soon | Add conversation memory |
| data-analyst-dbt | 8003 | intermediate | 🔜 Coming soon | Integrate with dbt metadata |
| data-analyst-mcp | 8004 | intermediate | 🔜 Coming soon | Use MCP for dynamic tool discovery |

Enterprise Patterns (enterprise-templates)

| Template | Port | Complexity | Status | What it shows |
|---|---|---|---|---|
| governed | 8005 | intermediate | 🔜 Coming soon | Data governance & row-level security |
| vector-search | 8006 | advanced | 🔜 Coming soon | Semantic search at scale |
| open-analytics | 8007 | advanced | 🔜 Coming soon | Server-side computation via OAF |

Agent Behavior

Each recipe demonstrates different AI agent patterns. See the behavior descriptions in individual recipe documentation:

  • data-analyst — Single-turn NL-to-SQL with dynamic schema introspection. Works with teddy_retailers dataset.
  • data-analyst-multiturn — Coming soon
  • data-analyst-dbt — Coming soon
  • data-analyst-mcp — Coming soon
  • governed — Coming soon
  • vector-search — Coming soon
  • open-analytics — Coming soon
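The data-analyst pattern described above (introspect the schema, then prompt the LLM for SQL) can be sketched roughly as follows; the function name, prompt wording, and schema shape are illustrative assumptions, not the recipe's actual code:

```python
# Hypothetical sketch of NL-to-SQL prompt assembly from introspected schema.
# In practice the schema rows might come from a query against DBC.ColumnsV;
# here the schema is passed in as a plain dict for clarity.

def build_sql_prompt(question, schema):
    """schema maps table name -> list of column names."""
    lines = [f"TABLE {table} ({', '.join(cols)})" for table, cols in schema.items()]
    return (
        "You are a Teradata SQL assistant. Given this schema:\n"
        + "\n".join(lines)
        + f"\n\nWrite one SQL query answering: {question}"
    )
```

A real agent would send this prompt to the configured LLM provider and then execute the returned SQL against Teradata.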

Project Structure

```
teradata-agent-recipes/
├── launcher.py              # Central launcher (auto-discovers all recipes)
├── start-agents.sh / .ps1   # Shell wrappers (convenience, auto-adds UI)
├── LAUNCHER.md              # Complete launcher documentation
├── .env.example             # Environment template
├── .gitignore               # Git ignore rules
│
├── shared/                  # Shared utilities for all recipes
│   ├── config.py            # Environment & configuration loading
│   ├── connection.py        # Teradata connection factory
│   ├── server_factory.py    # FastAPI app creation (metadata, routing, LLM setup)
│   ├── recipe_loader.py     # recipe.yaml parser & validator
│   └── llm_providers.py     # Unified LLM provider adapters
│
├── recipes/
│   ├── basic_agents/        # Learning & foundational recipes
│   │   ├── data_analyst/
│   │   │   ├── recipe.yaml          # Metadata (name, port, family, tier, providers)
│   │   │   ├── server.py            # FastAPI app entry point
│   │   │   ├── agent.py             # Agent logic & tool definitions
│   │   │   ├── pyproject.toml       # Python dependencies (uv format)
│   │   │   ├── uv.lock              # Dependency lock file (reproducible installs)
│   │   │   ├── README.md            # Recipe documentation
│   │   │   └── setup.sql            # Sample data SQL
│   │   ├── data_analyst_multiturn/  # Multi-turn variant with conversation history
│   │   ├── data_analyst_dbt/        # DBT catalog integration variant
│   │   └── data_analyst_mcp/        # MCP server integration variant
│   │
│   └── enterprise_templates/        # Production-ready patterns
│       ├── governed/                # Data governance & row-level security
│       ├── vector-search/           # Semantic search at scale
│       └── open-analytics/          # Server-side Python computation
│
├── ui/                      # React application (Vite + Tailwind)
│   ├── src/
│   │   ├── components/
│   │   ├── hooks/
│   │   └── modes/
│   ├── package.json
│   ├── vite.config.js
│   └── tailwind.config.js
│
└── index.yaml               # Machine-readable recipe catalog
```

Architecture Highlights

1. Centralized Launcher

  • Auto-discovers all recipes by scanning for recipe.yaml + server.py pairs
  • Supports interactive menu, CLI args, and setup operations
  • Manages process lifecycle: startup, monitoring, graceful shutdown
  • See LAUNCHER.md for full documentation
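The pairing rule above can be sketched in a few lines, assuming the launcher simply walks the tree for recipe.yaml files with a sibling server.py (the real launcher.py also parses metadata and manages processes):

```python
# Minimal sketch of recipe auto-discovery: keep only directories that
# contain both recipe.yaml and server.py, as described above.
from pathlib import Path

def discover_recipes(root):
    root = Path(root)
    return sorted(
        p.parent
        for p in root.rglob("recipe.yaml")
        if (p.parent / "server.py").exists()  # the pairing rule
    )
```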

2. Recipe Metadata as Single Source of Truth

Each recipe's recipe.yaml defines:

  • name: Display name
  • family: basic-agents or enterprise-templates
  • tier: Dependency level (1 for basic, 2+ for enterprise)
  • port: Default port number
  • display_mode: UI mode (chat, visual, etc.)
  • llm_providers: Supported providers (openai, anthropic, bedrock)

Example:

```yaml
name: Basic SQL agent
family: basic-agents
tier: 1
port: 8001
display_mode: chat
llm_providers: [openai, anthropic, bedrock]
```
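As a hedged sketch, a loader like recipe_loader.py might validate the parsed metadata along these lines (the field names come from the list above; the error messages and port bounds are assumptions):

```python
# Illustrative validation of a parsed recipe.yaml dict. Not the actual
# recipe_loader.py logic; just the kind of checks the fields above imply.
REQUIRED = {"name", "family", "tier", "port"}
FAMILIES = {"basic-agents", "enterprise-templates"}

def validate_recipe(meta):
    """Return a list of problems; an empty list means the metadata is usable."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED - meta.keys())]
    if "family" in meta and meta["family"] not in FAMILIES:
        errors.append(f"unknown family: {meta['family']}")
    if not isinstance(meta.get("tier", 1), int):
        errors.append("tier must be an integer")
    port = meta.get("port", 8001)
    if not isinstance(port, int) or not 1024 <= port <= 65535:
        errors.append("port must be an integer in 1024-65535")
    return errors
```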

3. Shared Server Factory Pattern

server_factory.py creates FastAPI apps uniformly:

  • Loads metadata from recipe.yaml
  • Routes agent logic from recipe's agent.py
  • Injects configuration (databases, LLM providers, etc.)
  • Exposes /info (metadata) and /invoke (agent execution) endpoints
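The pattern can be illustrated with a framework-free stand-in: a factory that binds recipe metadata and an agent callable to /info and /invoke handlers. The real server_factory.py builds a FastAPI app; this plain-dict version only shows the shape:

```python
# Simplified stand-in for the server-factory pattern: one factory produces
# a uniform "app" (here just a dict of handlers) from recipe metadata plus
# the recipe's agent function.

def create_app(meta, agent):
    return {
        "/info": lambda: meta,                              # metadata from recipe.yaml
        "/invoke": lambda query: {"answer": agent(query)},  # run the recipe's agent
    }
```

Every recipe gets the same two endpoints, which is what lets a single UI talk to any of them.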

4. Unified LLM Provider Support

All recipes share the same provider adapters (shared/llm_providers.py):

  • OpenAI: gpt-4o
  • Anthropic: claude-opus-4-5
  • AWS Bedrock: anthropic.claude-opus-4-5-v1:0
  • LiteLLM: HTTP proxy for any provider (see LITELLM_URL below)

How to use:

  1. Set LLM_PROVIDER in .env to your choice (default: openai)
  2. If using LiteLLM, also set LITELLM_URL to your proxy endpoint
  3. All recipes use the same provider interface—no code changes needed
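The provider switch can be sketched as a small factory keyed on LLM_PROVIDER; the adapter classes here are empty stubs (the real shared/llm_providers.py wires in the actual SDK calls):

```python
# Illustrative provider selection keyed on the LLM_PROVIDER env var.
# Adapter classes are stubs standing in for real SDK-backed adapters.
import os

class OpenAIAdapter: name = "openai"
class AnthropicAdapter: name = "anthropic"
class BedrockAdapter: name = "bedrock"
class LiteLLMAdapter: name = "litellm"

_ADAPTERS = {a.name: a for a in (OpenAIAdapter, AnthropicAdapter, BedrockAdapter, LiteLLMAdapter)}

def get_provider():
    choice = os.environ.get("LLM_PROVIDER", "openai")  # default matches the docs
    try:
        return _ADAPTERS[choice]
    except KeyError:
        raise ValueError(f"unsupported LLM_PROVIDER: {choice}") from None
```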

Switch providers without code changes:

```bash
export LLM_PROVIDER=anthropic
python launcher.py data-analyst
```

LiteLLM Setup: LiteLLM is an open-source proxy that unifies access to 100+ LLM providers. To use it:

```bash
# Install: pip install litellm
# Start proxy pointing to your chosen provider:
litellm --model gpt-4o  # routes to OpenAI
litellm --model claude-opus-4-5  # routes to Anthropic
litellm --model bedrock/anthropic.claude-opus-4-5-v1:0  # routes to AWS Bedrock

# Then in .env:
LLM_PROVIDER=litellm
LITELLM_URL=http://localhost:4000/v1/chat/completions
LLM_API_KEY=your-provider-api-key
```

5. Per-Recipe Configuration

Each recipe owns its configuration:

  • basic-agents: SQL agents with schema introspection or MCP tool discovery
  • enterprise-templates: Governance (RLS), semantic search, or advanced analytics

There is no global TD_DATABASES environment variable; each recipe declares the databases it needs.

6. Reproducible Installs with uv

  • Per-recipe pyproject.toml (Python dependencies)
  • Locked via per-recipe uv.lock (reproducible)
  • Fast parallel installs via uv sync
  • No virtual environment conflicts

Using the Launcher

All launcher options work through both launcher.py and shell wrappers (with auto-UI inclusion).

Setup (Install Dependencies)

```bash
# Setup all recipes
python launcher.py setup

# Setup specific recipes
python launcher.py setup data-analyst data-analyst-multiturn

# Setup across families
python launcher.py setup data-analyst governed open-analytics

# Setup UI only
python launcher.py setup ui
```

Launch (Run Services)

```bash
# Interactive menu
python launcher.py

# Named recipes (shell wrapper auto-adds UI)
./start-agents.sh data-analyst data-analyst-multiturn
.\start-agents.ps1 data-analyst data-analyst-multiturn

# All recipes + UI
python launcher.py all

# Specific recipes: recipes only (shell wrapper adds UI)
python launcher.py data-analyst data-analyst-mcp

# UI only
python launcher.py ui
```

Combined Setup + Launch

```bash
# One command: setup all, then launch
python launcher.py all --setup

# Setup then launch specific recipes
python launcher.py data-analyst --setup
```

See LAUNCHER.md for complete documentation and advanced usage.


Environment Configuration

Create a .env file at project root (use .env.example as template):

```bash
# Teradata connection (required for all recipes)
TD_HOST=your-teradata-host
TD_USER=your-username
TD_PASSWORD=your-password

# LLM provider (default: openai)
# Options: openai, anthropic, bedrock, litellm
LLM_PROVIDER=openai

# LLM API keys (choose based on LLM_PROVIDER)
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
AWS_ACCESS_KEY_ID=your-aws-key
AWS_SECRET_ACCESS_KEY=your-aws-secret

# MCP server (required for data-analyst-mcp in basic-agents & governed in enterprise-templates)
TERADATA_MCP_SERVER_URL=http://localhost:5000

# Advanced features (required for vector-search and open-analytics in enterprise-templates)
UES_URI=https://your-ues-host
UES_PAT=your-personal-access-token
UES_PEM_FILE=/path/to/pem/file
```

Each recipe sources this .env automatically via shared/config.py.
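A minimal sketch of what that sourcing might look like, assuming shared/config.py does a simple KEY=VALUE parse where already-set environment variables win (the actual loader may differ, e.g. by using python-dotenv):

```python
# Illustrative .env loading: KEY=VALUE lines, comments and blanks skipped,
# and existing environment variables take precedence over file values.
import os

def load_dotenv(text):
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())  # env wins over file
```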


Port Conventions

| Port | Recipe | Family |
|---|---|---|
| 8001 | data-analyst | basic-agents |
| 8002 | data-analyst-multiturn | basic-agents |
| 8003 | data-analyst-dbt | basic-agents |
| 8004 | data-analyst-mcp | basic-agents |
| 8005 | governed | enterprise-templates |
| 8006 | vector-search | enterprise-templates |
| 8007 | open-analytics | enterprise-templates |
| 5173 | ui | N/A |

Typical Workflows

Workflow 1: Try All Recipes

```bash
python launcher.py all --setup
# Visit http://localhost:5173
# Select recipes from dropdown
```

Workflow 2: Develop One Recipe

```bash
python launcher.py setup data-analyst
# Edit recipes/basic_agents/data_analyst/agent.py
python launcher.py data-analyst
```

Workflow 3: Test LLM Provider Switch

```bash
export LLM_PROVIDER=anthropic
python launcher.py data-analyst
curl -X POST http://localhost:8001/invoke \
  -H "Content-Type: application/json" \
  -d '{"query": "your question", "llm_provider": "anthropic"}'
```

Workflow 4: Setup Specific Family

```bash
# Setup only basic agents
python launcher.py setup data-analyst data-analyst-dbt data-analyst-multiturn

# Launch them
./start-agents.sh data-analyst data-analyst-dbt data-analyst-multiturn
```

Next Steps

  1. Set up environment: Copy .env.example to .env, add credentials
  2. Setup recipes: python launcher.py all --setup
  3. Launch: python launcher.py all or ./start-agents.sh all
  4. Access UI: http://localhost:5173
  5. Deploy: Each recipe is a standalone FastAPI app—deploy to any server

See LAUNCHER.md for complete launcher documentation.
