# Operational Lakebase Demo Script

## Pre-Demo Checklist

> **CK-yourname workspace** — Confirm everything below is ready. Setup may take up to 30 minutes.

- [ ] **Model Serving Endpoint** — `caspersdev_support_agent` (named "Support Agent")
- [ ] **Jobs & Pipelines** — `Support Request Agent Stream`
- [ ] **Catalog Tables**
- `support_agent_reports` *(assign a compute resource)*
- `support_agent_reports_sync` *(auto-synced with Lakebase)*
- [ ] **Lakebase Instance** — Autoscaling instance is automatically created
- [ ] **Data Verification** — `support_agent_reports_sync` in Lakebase matches the Lakehouse data

---

## Phase 1: Data Ingestion & AI Processing

1. **Access the Workspace**
Start in the Databricks workspace for Casper's Kitchen (the ghost kitchen company).

2. **Review the Catalog**
Open Unity Catalog to show where all incoming food order events are streamed.

3. **Demonstrate the Support Agent**
- Navigate to the Model Serving endpoint named **"Support Agent."**
- Explain that this agent processes incoming support requests to suggest refund amounts and draft customer responses.

4. **Inspect the Processing Notebook**
- Go to **Jobs & Pipelines** and locate the **"Support Request Agent Stream"** task.
- Show the notebook that uses an OpenAI client to query raw support requests and write processed responses to the `support_agent_reports` table.
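
   The notebook's core loop can be sketched as follows. This is a minimal, hypothetical sketch, not the actual demo code: the endpoint name, payload shape, and JSON schema of the reply are assumptions.

   ```python
   # Hedged sketch: query the Support Agent serving endpoint for one raw
   # support request and parse its structured JSON reply. Uses only the
   # stdlib; the endpoint name and response schema are assumptions.
   import json
   import urllib.request

   def build_payload(request_text: str) -> dict:
       """Chat-style payload for the serving endpoint (assumed shape)."""
       return {
           "messages": [
               {"role": "system",
                "content": ("Suggest a refund amount and draft a customer "
                            "response. Reply as JSON.")},
               {"role": "user", "content": request_text},
           ]
       }

   def query_agent(host: str, token: str, request_text: str) -> dict:
       """POST the request to the model serving endpoint, return parsed JSON."""
       url = f"{host}/serving-endpoints/caspersdev_support_agent/invocations"
       req = urllib.request.Request(
           url,
           data=json.dumps(build_payload(request_text)).encode(),
           headers={"Authorization": f"Bearer {token}",
                    "Content-Type": "application/json"},
       )
       with urllib.request.urlopen(req) as resp:
           return json.loads(resp.read())
   ```

   In the real pipeline, the parsed replies are then written to `support_agent_reports`, which is what step 5 below inspects.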

5. **Verify Structured Data**
Return to Unity Catalog and inspect the `support_agent_reports` table's sample data to show the structured JSON responses generated by the agent.

---

## Phase 2: Operationalizing Data with Lakebase

1. **Explain the Need for Lakebase**
   The Lakehouse is built for analytics; low-latency operational use cases, such as powering a customer-facing application, require an OLTP database like Lakebase.

2. **Create a Sync Table**
Demonstrate synchronizing data from Unity Catalog to Lakebase by selecting a Postgres database instance and a specific branch. *(One is already created for you.)*

3. **Navigate the Lakebase Dashboard**
- Use the app switcher to open the Lakebase dashboard.
   - Confirm that `support_agent_reports_sync` in Lakebase matches the Lakehouse data.
- *(You may need to manually assign a compute resource.)*
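
   The verification step above can be scripted as a simple row-count comparison. This is a hedged sketch: Lakebase is Postgres-compatible, but the `psycopg2` dependency, connection parameters, and table name are assumptions.

   ```python
   # Hedged sketch of the data-verification step: compare row counts
   # between the Lakehouse source and the Lakebase synced table.
   def counts_match(lakehouse_count: int, lakebase_count: int) -> bool:
       """The sync is healthy when both sides report the same row count."""
       return lakehouse_count == lakebase_count

   def lakebase_row_count(conn_params: dict) -> int:
       """Row count of the synced table in Lakebase (Postgres wire protocol)."""
       import psycopg2  # assumed driver; any Postgres client works
       with psycopg2.connect(**conn_params) as conn:
           with conn.cursor() as cur:
               cur.execute("SELECT count(*) FROM support_agent_reports_sync")
               return cur.fetchone()[0]
   ```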

4. **Demonstrate Instant Branching**
- Show the UI's ability to instantly create a **"test branch"** of the production data.
- Delete data in the test branch to prove it does **not** affect the production environment.
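
   The isolation claim can also be shown programmatically: point a connection at the branch endpoint, delete there, and re-query production. A hedged sketch, assuming each Lakebase branch exposes its own Postgres host; host and table names are illustrative.

   ```python
   # Hedged sketch of the branch-isolation check. Only the branch
   # connection is mutated; production is read back untouched.
   def branch_dsn(base_dsn: dict, branch_host: str) -> dict:
       """Re-point an existing connection config at a branch endpoint."""
       return {**base_dsn, "host": branch_host}

   def delete_all_on_branch(dsn: dict) -> None:
       """Destructive on the branch only; production data is unaffected."""
       import psycopg2  # assumed driver; Lakebase speaks Postgres
       with psycopg2.connect(**dsn) as conn:
           with conn.cursor() as cur:
               cur.execute("DELETE FROM support_agent_reports_sync")
           conn.commit()
   ```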

---

## Phase 3: The Support Console Application (Databricks Apps)

1. **Launch the Support Console**
Go to **Compute → Apps** and open the **"Support Console."** Highlight the fast page loading enabled by querying Lakebase instead of the Lakehouse directly.
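
   The speedup comes from issuing indexed point queries against Lakebase rather than scanning the Lakehouse. A hedged sketch of what the console's read path might look like; the table, columns, and status values are assumptions.

   ```python
   # Hedged sketch of the Support Console read path: a cheap, indexed
   # Postgres query against the synced table instead of a Lakehouse scan.
   PENDING_CASES_SQL = (
       "SELECT case_id, refund_amount, draft_response "
       "FROM support_agent_reports_sync "
       "WHERE status = 'pending' ORDER BY created_at LIMIT %s"
   )

   def fetch_pending_cases(conn, limit: int = 20) -> list:
       """Run the query over any open Postgres connection."""
       with conn.cursor() as cur:
           cur.execute(PENDING_CASES_SQL, (limit,))
           return cur.fetchall()

   def as_case_dict(row: tuple) -> dict:
       """Shape a result row for the console UI."""
       case_id, refund, draft = row
       return {"case_id": case_id,
               "refund_amount": refund,
               "draft_response": draft}
   ```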

2. **Process a Support Case**
Open a pending case from a user. Review the agent-suggested refund, credits, and draft response.

3. **Interact with the AI**
- Demonstrate **"regenerating"** a report by providing the agent with new context (e.g., *"customer is angry on the phone"*).
- Show the agent updating its suggestions based on persistent data and new context.
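
   Regeneration amounts to feeding the stored report back to the agent along with the operator's new context. A hedged sketch of how that prompt might be assembled; the message shape and field names are assumptions.

   ```python
   # Hedged sketch of the "regenerate" interaction: the prior report
   # (persisted in Lakebase) plus the new context goes back to the agent.
   import json

   def regeneration_messages(prior_report: dict, new_context: str) -> list:
       """Assemble the chat messages for a report regeneration request."""
       return [
           {"role": "system",
            "content": ("Revise the support report below given the new "
                        "context. Reply as JSON.")},
           {"role": "user",
            "content": json.dumps({"prior_report": prior_report,
                                   "new_context": new_context})},
       ]
   ```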

---

## Phase 4: Advanced Agentic Development *(Optional)*

1. **Trigger Code Generation**
Open a code editor (e.g., Cursor) with the Casper's Kitchen codebase.

2. **Issue a Natural Language Prompt**
Provide a prompt asking for a new feature, such as a **"Flag as Abusive"** button. Explicitly instruct the AI agent to create its own Lakebase development branch and a new "dev app" environment to avoid touching production.

3. **Verify the Dev Environment**
- Return to the Lakebase dashboard to show the new branch created by the agent.
- Go to **Compute → Apps** to find the newly deployed **"Support Dev"** environment.

4. **Test the New Feature**
Open the dev app and demonstrate the **"Handle Abuse"** card to show it works as intended — fully isolated from production.