
React + Hono Monorepo Template

An opinionated full-stack monorepo template with a React frontend and a Hono API server, wired together and ready to extend

Stack

Frontend (apps/web)

  • React + Vite
  • TanStack Router - file-based routing with typed search params
  • TanStack Query - server state, with a global invalidation observer
  • Zustand - ephemeral client state
  • shadcn/ui - shared UI components
  • i18n with en / id locales
  • Vitest + Testing Library + MSW - component and page tests

Backend (apps/server)

  • Hono - lightweight HTTP framework
  • Drizzle ORM + PostgreSQL - type-safe database access
  • Zod - request/response validation
  • OpenAPI docs via @hono/zod-openapi + Scalar UI
  • Vitest - handler tests with centralized in-memory adapters, no database required

Monorepo

  • pnpm workspaces
  • packages/api-contract - shares the server's AppType with the web app
  • packages/config - shared TypeScript configs

Project structure

monorepo-template/
├── apps/
│   ├── server/          # Hono API server
│   │   └── src/
│   │       ├── db/           # Drizzle schema and migrations
│   │       ├── routes/       # Route definitions and handlers
│   │       ├── repositories/ # Repository interfaces and Drizzle adapters
│   │       └── tests/
│   │           └── in-memory/  # In-memory adapters per repo + createInMemoryRepos()
│   └── web/             # React frontend
│       └── src/
│           ├── features/   # Feature-scoped components and API hooks
│           ├── pages/      # Page-level components
│           ├── routes/     # TanStack Router file-based routes
│           ├── components/ # Shared UI components
│           ├── lib/        # API client, query client, i18n, env
│           ├── locales/    # en / id translation files
│           └── tests/      # Test utilities, MSW handlers
└── packages/
    ├── api-contract/    # Re-exports AppType from server — zero runtime, browser-safe
    └── config/          # Shared TypeScript configs

Prerequisites

  • Node.js and pnpm
  • Docker (for the Docker-based workflow) or a local PostgreSQL instance

Getting started

With Docker (recommended)

Copy the root env file and fill in the required values:

cp .env.example .env

The root .env is read only by Docker Compose; it configures the Postgres service and the server's allowed CORS origins:

POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=app
CORS_ORIGINS=http://localhost:3001

Start everything:

pnpm dev
# or: docker compose up

This starts Postgres, runs migrations, and serves both apps (by default the API on http://localhost:3000 and the web app on http://localhost:3001).

Without Docker (local)

  1. Start a PostgreSQL instance manually

  2. Copy and configure the server env file:

cp apps/server/.env.example apps/server/.env

  3. Install dependencies and run:

pnpm install
pnpm dev:local

Environment variables

apps/server/.env

| Variable             | Default               | Description                          |
| -------------------- | --------------------- | ------------------------------------ |
| NODE_ENV             | production            | One of development, production, test |
| PORT                 | 3000                  | Server port                          |
| HOST                 | localhost             | Server hostname                      |
| DATABASE_USER        | -                     | Required. Postgres user              |
| DATABASE_PASSWORD    | -                     | Required. Postgres password          |
| DATABASE_DB          | postgres              | Database name                        |
| DATABASE_HOST        | localhost             | Database host                        |
| DATABASE_PORT        | 5432                  | Database port                        |
| CORS_ORIGINS         | http://localhost:3001 | Allowed origins, comma-separated     |
| RATE_LIMIT_WINDOW_MS | 60000                 | Rate limit window in ms              |
| RATE_LIMIT_MAX       | 50                    | Max requests per window              |
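CORS_ORIGINS accepts a comma-separated list. A minimal sketch of how such a value might be split into an origin array (the parseOrigins helper is hypothetical, not part of the template):

```typescript
// Hypothetical helper: split a comma-separated CORS_ORIGINS value into
// a clean array of origins, trimming whitespace and dropping empties.
function parseOrigins(raw: string): string[] {
  return raw
    .split(",")
    .map((origin) => origin.trim())
    .filter((origin) => origin.length > 0);
}

// parseOrigins("http://localhost:3001, https://app.example.com")
// → ["http://localhost:3001", "https://app.example.com"]
```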

apps/web/.env

| Variable     | Description                                  |
| ------------ | -------------------------------------------- |
| VITE_API_URL | Backend base URL, e.g. http://localhost:3000 |

Development commands

# Start everything via Docker
pnpm dev

# Start frontend and backend directly (no Docker)
pnpm dev:local

# Start individually
pnpm dev:web
pnpm dev:server

Database

# Push schema changes to the database (no migration file)
pnpm db:push

# Generate and apply a migration
pnpm db:migrate

# Open Drizzle Studio
pnpm db:studio

Adding a new schema

When you replace or remove the example tasks schema with your own, generate and apply a migration before starting the server or running tests:

# 1. Edit or create your schema in apps/server/src/db/schemas/
# 2. Generate the migration file
pnpm db:migrate

# 3. Apply to your test database as well
pnpm db:migrate:test

Drizzle compares your schema against the current database state and generates the SQL diff. If you skip this step, the server will fail on startup (or tests will fail) because the table doesn't exist yet.
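For reference, a schema file in apps/server/src/db/schemas/ might look like the following sketch (table and column names here are illustrative, not the template's actual tasks schema):

```typescript
import { pgTable, serial, text, boolean, timestamp } from "drizzle-orm/pg-core";

// Illustrative table definition — adjust the name and columns to your domain.
export const notes = pgTable("notes", {
  id: serial("id").primaryKey(),
  title: text("title").notNull(),
  done: boolean("done").notNull().default(false),
  createdAt: timestamp("created_at").notNull().defaultNow(),
});
```

After saving the file, pnpm db:migrate picks it up and emits the corresponding CREATE TABLE migration.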

Testing

# Run all tests
pnpm test

# Run per app
pnpm test:web
pnpm test:server

Frontend tests use Vitest + Testing Library + MSW. MSW intercepts fetch at the network layer, so tests exercise the full component → hook → API client chain without a running server.

Page-level tests use renderPage() from tests/test-utils, which spins up a TanStack Router instance with a fresh, isolated QueryClient per test. This lets tests exercise URL search params and navigation without any global state cleanup. Component-level tests use the simpler render() helper which provides only a QueryClientProvider.

Backend tests use in-memory adapters injected via Hono context — no database required to run the handler suite. Each adapter is a Map-backed implementation of the repository interface, centralized in src/tests/in-memory/. One file per repository, assembled into a single createInMemoryRepos() factory in index.ts. Test files import only that one function — buildClient() injects it into the app via createTestApp. The global setup still applies migrations once before the suite starts (pnpm db:migrate:test), but individual tests are fully isolated in memory. Set up apps/server/.env.test with a test database before running.
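A minimal sketch of that pattern (the interface and field names are illustrative; the template's actual repository interfaces live in src/repositories/):

```typescript
// Illustrative repository interface and its Map-backed in-memory adapter.
interface Task {
  id: number;
  title: string;
}

interface TaskRepository {
  create(title: string): Promise<Task>;
  findById(id: number): Promise<Task | undefined>;
}

function createInMemoryTaskRepository(): TaskRepository {
  const store = new Map<number, Task>();
  let nextId = 1;
  return {
    async create(title) {
      const task = { id: nextId++, title };
      store.set(task.id, task);
      return task;
    },
    async findById(id) {
      return store.get(id);
    },
  };
}

// One adapter per repository, assembled into a single factory so that
// test files only ever import this one function.
function createInMemoryRepos() {
  return { tasks: createInMemoryTaskRepository() };
}
```

Because the factory returns fresh Maps each call, a beforeEach that rebuilds the repos gives every test an isolated data store.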

Test isolation: each it block gets a fresh adapter instance (backend via beforeEach) or a fresh router + query client (frontend via renderPage()). Tests are independent — run them in any order, skip any one, and the others still pass.

Code quality

# Lint and format all files
pnpm lint

# Type-check all packages
pnpm typecheck

Linting and formatting run automatically on staged files via Husky pre-commit hooks. Commit messages are enforced to follow Conventional Commits.

# Interactive commit prompt
pnpm commit

Deployment

1. Name your images

Update docker-compose.prod.yml with your registry and image names:

server:
  image: your-username/my-app-server:latest

web:
  image: your-username/my-app-web:latest

2. Build the images

Run both build commands from the monorepo root. The web image requires VITE_API_URL at build time, because Vite bakes it into the static bundle.

docker build \
  -f apps/server/Dockerfile.prod \
  -t your-username/my-app-server:latest \
  .

docker build \
  -f apps/web/Dockerfile.prod \
  --build-arg VITE_API_URL=https://api.yourdomain.com \
  -t your-username/my-app-web:latest \
  .

3. Test locally before pushing

You can run the full production stack on your machine without pushing to any registry. Docker Compose uses locally built images if the tag already exists:

# Create a local .env with production-like values
cp .env.example .env

docker compose -f docker-compose.prod.yml up -d

Tear down when done:

docker compose -f docker-compose.prod.yml down -v

4. Push and deploy

Push the images to your registry:

docker push your-username/my-app-server:latest
docker push your-username/my-app-web:latest

On your production server, create an .env file and bring the stack up:

# Copy docker-compose.prod.yml and .env.example to the server, then:
cp .env.example .env
# Edit .env with real credentials

docker compose -f docker-compose.prod.yml up -d

The server container runs database migrations automatically on startup, before accepting traffic.

Adding shadcn/ui components

pnpm --filter web shadcn:add <component>
# e.g.: pnpm --filter web shadcn:add dialog

Adding a new feature

The frontend follows a feature-slice pattern. Each feature lives under src/features/<name>/:

features/tasks/
├── api/
│   ├── query-keys.ts   # Hierarchical query key factory
│   ├── get-tasks.ts    # useGetTasks — list query with optional filter via select
│   ├── get-task.ts     # useGetTask — single item query
│   ├── create-task.ts  # useCreateTask — mutation with meta.invalidates
│   ├── update-task.ts  # useUpdateTask — mutation with meta.invalidates
│   └── delete-task.ts  # useDeleteTask — mutation with meta.invalidates
└── components/         # Feature-specific UI components

Page-level components go in src/pages/ and are referenced from src/routes/.
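The query-keys.ts factory can be sketched like this (the exact key shapes in the template may differ):

```typescript
// Hierarchical query key factory: broader keys are prefixes of narrower
// ones, so invalidating taskKeys.all also matches every list and detail
// key derived from it.
const taskKeys = {
  all: ["tasks"] as const,
  lists: () => [...taskKeys.all, "list"] as const,
  list: (filter: string) => [...taskKeys.lists(), { filter }] as const,
  details: () => [...taskKeys.all, "detail"] as const,
  detail: (id: number) => [...taskKeys.details(), id] as const,
};
```

Queries use the narrow keys (taskKeys.detail(id)); mutations invalidate the broad one (taskKeys.all).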

Client state (Zustand): use src/stores/ for ephemeral UI state that doesn't belong in the URL (e.g. notification queues, modal state, selected item IDs). Each store is created with createSelectorHooks so every field gets a free useField() hook, and state updates use Immer for safe mutations. See stores/notifications.ts for the pattern.

Query invalidation: mutations declare meta: { invalidates: [featureKeys.all] }. The global MutationCache observer in lib/react-query.ts calls invalidateQueries automatically on success — no manual queryClient.invalidateQueries calls in mutation files.
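A framework-free sketch of that idea follows — the real implementation hangs off TanStack Query's MutationCache onSuccess callback, but the mechanism is just an observer reading meta.invalidates:

```typescript
type QueryKey = readonly unknown[];

interface MutationMeta {
  invalidates?: QueryKey[];
}

// Simplified stand-in for the MutationCache observer: records which keys
// were invalidated so the behavior is observable without react-query.
function createInvalidationObserver() {
  const invalidated: QueryKey[] = [];
  return {
    invalidated,
    // Called once per successful mutation, mirroring onSuccess reading
    // the mutation's meta.invalidates declaration.
    onMutationSuccess(meta: MutationMeta | undefined) {
      for (const key of meta?.invalidates ?? []) {
        invalidated.push(key);
      }
    },
  };
}
```

Mutations that declare no meta simply invalidate nothing; there is no per-mutation wiring.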

URL state: filter and other navigational state live in the URL as typed search params (validateSearch in the route file). Read them with Route.useSearch(), update them with useNavigate. This makes filter state bookmarkable, shareable, and restored on back navigation without any store reset in tests.
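A validateSearch function along these lines turns raw search params into typed state (a minimal hand-rolled sketch; the template may validate with Zod instead):

```typescript
// Raw search params arrive as unknown values; validateSearch narrows
// them into a typed, defaulted shape the rest of the route can trust.
interface TaskSearch {
  filter: "all" | "open" | "done";
}

function validateSearch(search: Record<string, unknown>): TaskSearch {
  const filter = search.filter;
  return {
    filter: filter === "open" || filter === "done" ? filter : "all",
  };
}
```

Unknown or missing values fall back to the default, so a hand-edited URL can never put the page into an invalid filter state.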

Typed API client: lib/api-client.ts exports apiClient = hc<AppType>(...). Use InferResponseType and InferRequestType from hono/client to derive request and response types directly from the server route definitions — no separate type files to maintain.
