
Devonz

AI-powered full-stack development agent: describe what you want, watch it build.


Features ━━ Installation ━━ Configuration

Docker ━━ Scripts ━━ Keeping Up to Date

Project Structure ━━ Contributing ━━ Acknowledgments

Devonz Screenshot

Quick Start

Docker (recommended):

docker compose up -d
# Open http://localhost:5173

From source:

git clone https://github.com/zebbern/Devonz.git && cd Devonz
pnpm install
pnpm run dev
# Open http://localhost:5173

First load can take up to 2 minutes while dependencies compile.


Why Devonz?

  • 22 AI providers — OpenAI, Anthropic, Google, Groq, Z.ai, Ollama, and more. Swap models mid-conversation.
  • Full dev environment in the browser — editor, terminal, live preview, all powered by WebContainers.
  • One-click deploy — push to GitHub, GitLab, Netlify, or Vercel directly from the UI.
  • MCP tools — extend the agent with Model Context Protocol servers for specialized workflows.
  • Auto-fix — terminal error detection catches failures and patches them automatically.
  • Image context — attach screenshots or design files to prompts for visual understanding.
  • Design Palette — pick custom color themes that get injected into AI-generated apps.
  • 3D support — generate React Three Fiber apps with automatic version pinning and peer dependency resolution.
  • Template gallery — start from popular frameworks and boilerplates, then customize with AI.
  • State-of-the-art AI SDK — built on the Vercel AI SDK for streaming, tool calling, and a uniform interface across providers (see the sketch after this list).
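
Model swapping and Extended Thinking both ride on the Vercel AI SDK. As a minimal sketch of what a provider swap with reasoning enabled looks like at the SDK level (not Devonz's actual internals; the model IDs and the thinking budget are placeholders):

import { streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { openai } from '@ai-sdk/openai';

// Illustrative only: swap the model object and the rest of the call stays the same.
async function generate(prompt: string, useClaude: boolean) {
  const model = useClaude
    ? anthropic('claude-3-7-sonnet-20250219') // placeholder model ID
    : openai('gpt-4o');                       // placeholder model ID

  const result = streamText({
    model,
    prompt,
    // Extended Thinking (Anthropic) is enabled through provider options; the budget is illustrative.
    providerOptions: {
      anthropic: { thinking: { type: 'enabled', budgetTokens: 4096 } },
    },
  });

  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}

In the app the same swap happens from the model picker in the chat UI, mid-conversation.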

Features

AI & Code Generation

  • Natural language to full-stack apps (Node.js-based)
  • 22 LLM providers with hot-swappable model selection
  • MCP (Model Context Protocol) tool integration
  • Automatic error detection and auto-fix from terminal output
  • Attach images to prompts for visual context
  • Extended Thinking — AI reasoning visualization for Anthropic Claude and Google Gemini
  • MCP Schema Sanitization — Automatic schema compatibility for Google Gemini (strips unsupported constructs; a sketch follows this list)
  • MCP Auto-Approve — Per-server auto-approve toggle for trusted MCP servers
  • Formatted Tool Results — MCP tool results render as formatted markdown instead of raw JSON
  • Design Palette — UI for picking custom color themes injected into generated code
  • 3D App Generation — React Three Fiber support with automatic version pinning for React 18/19
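
The Gemini sanitization step above amounts to a recursive walk that drops JSON Schema keywords Gemini's function-calling API rejects. A hypothetical sketch, with an illustrative keyword list (the exact set Devonz strips is not documented here):

// Hypothetical: drop JSON Schema keywords that Gemini function calling rejects.
const UNSUPPORTED_KEYS = ['$schema', 'additionalProperties', 'exclusiveMinimum', 'exclusiveMaximum']; // illustrative list

function sanitizeSchema(schema: unknown): unknown {
  if (Array.isArray(schema)) {
    return schema.map(sanitizeSchema);
  }
  if (schema && typeof schema === 'object') {
    const out: Record<string, unknown> = {};
    for (const [key, value] of Object.entries(schema as Record<string, unknown>)) {
      if (UNSUPPORTED_KEYS.includes(key)) continue; // strip the unsupported construct
      out[key] = sanitizeSchema(value);
    }
    return out;
  }
  return schema;
}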

Development Environment

  • In-browser code editor (CodeMirror) with syntax highlighting
  • Integrated terminal (xterm.js) with full shell access
  • Real-time application preview
  • Diff view for AI-generated changes
  • File locking to prevent conflicts during generation

Security & Reliability

  • Security Hardened — withSecurity() wrapper on all 35+ API routes, with input validation, rate limiting options, and URL allowlisting
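
The withSecurity() signature isn't shown in this README, so here is a purely hypothetical sketch of what a route-level guard of this kind typically looks like in a Remix app (the option names and the adapter import are assumptions, not Devonz's actual API):

import type { ActionFunctionArgs } from '@remix-run/node'; // adapter import is an assumption

// Hypothetical option shape; the real wrapper may differ.
interface SecurityOptions {
  allowedUrls?: string[];                        // URL allowlisting
  validate?: (body: unknown) => boolean;         // input validation
  rateLimit?: { windowMs: number; max: number }; // rate limiting
}

function withSecurity(
  handler: (args: ActionFunctionArgs) => Promise<Response>,
  opts: SecurityOptions = {},
) {
  return async (args: ActionFunctionArgs): Promise<Response> => {
    const body = await args.request.clone().json().catch(() => null);
    if (opts.validate && !opts.validate(body)) {
      return new Response('Invalid input', { status: 400 });
    }
    // Rate limiting and URL allowlist checks would run here before the handler.
    return handler(args);
  };
}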

Deployment & Integrations

  • GitHub / GitLab push and repo management
  • Netlify and Vercel one-click deploy
  • Supabase database integration
  • Git version control built-in
  • Template gallery for popular frameworks

Installation

From Source (Node.js)

Prerequisites: Node.js 18.18+ and pnpm (latest)

git clone https://github.com/zebbern/Devonz.git
cd Devonz
pnpm install
pnpm run dev

Open http://localhost:5173.

Docker

No Node.js required. Just Docker.

# Option 1: Docker Compose (pulls from GHCR)
docker compose up -d

# Option 2: Build locally
docker build -t devonz .
docker run --rm -p 5173:5173 --env-file .env.local devonz

Open http://localhost:5173.


Configuration

Copy the example env file and add your API keys:

cp .env.example .env.local

Or use the interactive setup wizard, which can initialize .env.local, sync .env for Docker, and prompt for keys with hidden input:

pnpm run setup

Example .env.local:

# AI Provider API Keys (add any you use)
OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_GENERATIVE_AI_API_KEY=your_google_key
ZAI_API_KEY=your_zai_key

# Local Provider URLs (optional)
OLLAMA_BASE_URL=http://127.0.0.1:11434

# Deployment Tokens (optional)
GITHUB_ACCESS_TOKEN=your_github_token
NETLIFY_AUTH_TOKEN=your_netlify_token

You can also configure providers in-app: Settings (gear icon) → Providers tab.

See .env.example for all available variables.
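
Behind the scenes the keys feed the Vercel AI SDK's provider factories. A minimal sketch of the idea (not Devonz's exact wiring; the env variable names match the example above, the model ID is a placeholder):

import { createOpenAI } from '@ai-sdk/openai';
import { createAnthropic } from '@ai-sdk/anthropic';

// Each configured key yields a provider instance the in-app model picker can offer.
const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });
const anthropic = createAnthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

const model = anthropic('claude-3-7-sonnet-20250219'); // placeholder model ID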


Docker

pnpm docker:build          # Build image locally
pnpm docker:run            # Run standalone container
pnpm docker:up             # Start via Docker Compose
pnpm docker:down           # Stop services
pnpm docker:dev            # Dev mode with hot reload
pnpm docker:update         # Pull latest image + restart

Auto-update with Watchtower (polls GHCR every 5 minutes):

docker compose --profile auto-update up -d

Scripts

# Development
pnpm run dev               # Start dev server
pnpm run build             # Production build
pnpm run start             # Run production build
pnpm run preview           # Build + preview locally

# Quality
pnpm test                  # Run test suite
pnpm test:watch            # Tests in watch mode
pnpm run typecheck         # TypeScript type check
pnpm run lint              # ESLint check
pnpm run lint:fix          # Auto-fix lint issues

# Utilities
pnpm run clean             # Clean build artifacts
pnpm run update            # Pull latest + install + rebuild

Keeping Up to Date

Git clone users:

pnpm run update                    # pulls, installs deps, rebuilds
pnpm run update -- --skip-build    # pull + install only

Docker users:

pnpm docker:update                 # pulls latest image, restarts

Hands-free (Watchtower):

docker compose --profile auto-update up -d

Project Structure

Devonz/
├── app/
│   ├── components/         # React components
│   │   ├── @settings/      # Settings panel (14 tabs)
│   │   ├── chat/           # Chat interface
│   │   ├── deploy/         # Deployment integrations
│   │   ├── editor/         # Code editor
│   │   ├── git/            # Git integration
│   │   ├── header/         # App header
│   │   ├── sidebar/        # Sidebar navigation
│   │   ├── ui/             # Shared UI components
│   │   └── workbench/      # Development workbench
│   ├── lib/
│   │   ├── hooks/          # React hooks
│   │   ├── modules/        # Feature modules (LLM providers)
│   │   ├── services/       # API services
│   │   ├── stores/         # State management (nanostores)
│   │   └── utils/          # Utility functions
│   ├── routes/             # Remix routes (39 API + page routes)
│   ├── styles/             # Global styles
│   └── types/              # TypeScript type definitions
├── docs/                   # Extended documentation
├── scripts/                # Build & update scripts
└── supabase/               # Supabase configuration

Tech Stack

Remix + Vite + TypeScript · UnoCSS · Radix UI · Framer Motion · Vercel AI SDK · CodeMirror · xterm.js · WebContainers


Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/my-feature
  3. Commit changes: git commit -m 'feat: add my feature'
  4. Push: git push origin feature/my-feature
  5. Open a Pull Request

See CONTRIBUTING.md for detailed guidelines.


Acknowledgments


Build anything with AI. Just describe what you want.

GitHub · Issues · Documentation