This is a chatbot web app that uses the Ollama API to chat with a locally deployed LLM.
- Chat features
- Chat with a locally deployed, Ollama-backed LLM.
- Continuous conversation.
- Choose model from local tags.
- Chat history (database & localStorage).
- Text file uploading and model fine-tuning.
- App features
- Dark mode.
- Responsive design.
- User authentication.
- User settings.
- Multi-user support.
- Deployment
- Run on Node server.
- Dockerize the app.
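For the Docker deployment, a minimal Dockerfile might look like the following. This is a sketch assuming a standard Next.js build; the Node version, port, and build steps are illustrative and may differ from the project's actual Dockerfile.

```dockerfile
# Sketch only — assumes a standard Next.js app served by `npm run start`.
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
EXPOSE 3000
CMD ["npm", "run", "start"]
```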
Environment variables are stored in `.env.$(NODE_ENV)` files. By default, the app runs in development mode.
Files ending with `.local` are git-ignored and should be used to store sensitive information such as API keys and contact details.
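As an example, a git-ignored local env file could look like this. The variable names below are illustrative assumptions, not the project's actual keys:

```bash
# .env.development.local — git-ignored; example names only
OLLAMA_HOST=http://localhost:11434
DATABASE_URL=<your-database-connection-string>
```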
Run the development server: `npm run dev`.
The app is designed to run on a Node server:
- Build: `npm run build`
- Start: `npm run start`
- Front-end
- Next.js is the main framework.
- Material-UI simplifies the UI design.
- Axios provides a convenient API for HTTP requests.
- react-markdown for parsing and rendering markdown content.
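The continuous-conversation feature boils down to sending the full message history with every request to Ollama's `/api/chat` endpoint. The sketch below shows one way to maintain that history; the helper name and the model `"llama3"` are illustrative, not taken from this codebase.

```javascript
// Sketch: keep a running message list so each request to Ollama's
// /api/chat endpoint carries the whole conversation.
function createConversation(model) {
  const messages = [];
  return {
    // Append the user's turn and build the body for POST /api/chat.
    buildRequest(userText) {
      messages.push({ role: "user", content: userText });
      return { model, messages, stream: false };
    },
    // Record the assistant's reply so the next request includes it.
    addReply(content) {
      messages.push({ role: "assistant", content });
    },
    history: messages,
  };
}

// Usage (assumes Ollama is running locally on its default port):
// const chat = createConversation("llama3");
// const res = await fetch("http://localhost:11434/api/chat", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(chat.buildRequest("Hello!")),
// });
// const data = await res.json();
// chat.addReply(data.message.content);
```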
- Back-end
Common styles are applied using Tailwind CSS.
Color Palette

| Neutral shade | Hex |
| --- | --- |
| 50 | #fafafa |
| 100 | #f5f5f5 |
| 200 | #e5e5e5 |
| 300 | #d4d4d4 |
| 400 | #a3a3a3 |
| 500 | #737373 |
| 600 | #525252 |
| 700 | #404040 |
| 800 | #262626 |
| 900 | #171717 |
| 950 | #0a0a0a |
Color consistency between Material-UI components and Tailwind CSS is maintained via the MuiThemeWrapper at components/muiThemeWrapper.js.
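A wrapper like that typically maps the Tailwind palette into the options object passed to MUI's `createTheme`. The sketch below shows the idea for the neutral palette above; the function name and exact shade choices are assumptions, not the wrapper's actual implementation.

```javascript
// The Tailwind "neutral" palette from the table above.
const neutral = {
  50: "#fafafa", 100: "#f5f5f5", 200: "#e5e5e5", 300: "#d4d4d4",
  400: "#a3a3a3", 500: "#737373", 600: "#525252", 700: "#404040",
  800: "#262626", 900: "#171717", 950: "#0a0a0a",
};

// Sketch: build the options for createTheme() so MUI components match
// the Tailwind classes; dark mode flips the background/text shades.
function buildThemeOptions(darkMode) {
  return {
    palette: {
      mode: darkMode ? "dark" : "light",
      background: { default: darkMode ? neutral[900] : neutral[50] },
      text: { primary: darkMode ? neutral[100] : neutral[900] },
    },
  };
}
```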
Icons
This project uses Material Icons to provide a consistent icon set.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

