OpenLLM Monitor is a plug-and-play, real-time observability dashboard for monitoring and debugging LLM API calls across OpenAI, Ollama, OpenRouter, and more. It tracks tokens, latency, cost, and retries, and lets you replay prompts. Fully open-source and self-hostable.
**Real-time LLM Observability Dashboard**
Monitor, analyze, and optimize your LLM usage across multiple providers in real-time
**Created and Developed by Prajeesh Chavan**
Full-Stack Developer & AI Enthusiast
View Full Credits & Project Journey
Quick Start • Features • Installation • Documentation • Contributing

Real-time monitoring of all your LLM requests with comprehensive analytics

Detailed logging of all LLM API calls with filtering and search

Test and compare prompts across different providers and models
| **Zero-Code Integration** | **Real-Time Analytics** | **Multi-Provider Support** |
|---|---|---|
OpenLLM Monitor has received major UI/UX improvements and feature enhancements!
**What's New:**
**See the Complete Enhancement Guide:** Enhanced Features Documentation
| Provider | Status | Models Supported |
|---|---|---|
| **OpenAI** | ✅ | GPT-3.5, GPT-4, GPT-4o, DALL-E |
| **Ollama** | ✅ | Llama2, Mistral, CodeLlama, Custom |
| **OpenRouter** | ✅ | 100+ models via unified API |
| **Mistral AI** | ✅ | Mistral-7B, Mistral-8x7B, Mixtral |
| **Anthropic** | 🚧 | Claude 3, Claude 2 |
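Cost tracking across providers comes down to per-model pricing tables applied to token counts. As a rough illustration (the rates below are placeholders for this sketch, not the monitor's actual pricing data), a cost calculator might look like:

```javascript
// Illustrative only: per-1K-token rates here are placeholders, not real pricing.
const RATES = {
  "gpt-3.5-turbo": { prompt: 0.0005, completion: 0.0015 },
  "gpt-4": { prompt: 0.03, completion: 0.06 },
};

function estimateCost(model, promptTokens, completionTokens) {
  const rate = RATES[model];
  if (!rate) return 0; // unknown or local models (e.g. Ollama) are treated as free
  return (
    (promptTokens / 1000) * rate.prompt +
    (completionTokens / 1000) * rate.completion
  );
}

console.log(estimateCost("gpt-4", 1000, 500)); // 0.03 + 0.03 = 0.06
```

A real implementation would load current provider price lists rather than hardcoding them.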
**Want to showcase your system immediately?** Generate comprehensive seed data:
```bash
# Windows PowerShell (Recommended)
cd "scripts"
.\generate-seed-data.ps1

# Or use Node.js directly
cd scripts
npm install
node seed-data.js
```
**What you get:**
**Complete Seed Data Guide** | **Advanced Configuration**
Get up and running in less than 2 minutes:
```bash
# Clone the repository
git clone https://github.com/prajeesh-chavan/openllm-monitor.git
cd openllm-monitor

# Start with Docker (includes everything)
docker-compose up -d

# Or use our setup script
./docker-setup.sh    # Linux/Mac
.\docker-setup.ps1   # Windows PowerShell
```
**Access your dashboard:** http://localhost:3000
```bash
git clone https://github.com/prajeesh-chavan/openllm-monitor.git
cd openllm-monitor

# Backend setup
cd backend
npm install
cp ../.env.example .env

# Frontend setup
cd ../frontend
npm install
```
Edit `backend/.env`:

```env
MONGODB_URI=mongodb://localhost:27017/openllm-monitor
PORT=3001
OPENAI_API_KEY=your-openai-key-here
OLLAMA_BASE_URL=http://localhost:11434
```
```bash
# Terminal 1: MongoDB
mongod

# Terminal 2: Backend
cd backend && npm run dev

# Terminal 3: Frontend
cd frontend && npm run dev
```
**Open:** http://localhost:5173
| Component | Minimum | Recommended |
|---|---|---|
| **Node.js** | 18.x | 20.x LTS |
| **Memory** | 4GB RAM | 8GB RAM |
| **Storage** | 10GB | 20GB SSD |
| **MongoDB** | 4.4+ | 6.0+ |
| **Docker** | **Manual Install** | **Cloud Deploy** |
|---|---|---|
| Everything included | Customizable | Scalable |
**Windows:**

```powershell
# PowerShell (Recommended)
.\docker-setup.ps1

# Command Prompt
docker-setup.bat
```

**Linux/macOS:**

```bash
# Make executable and run
chmod +x docker-setup.sh
./docker-setup.sh
```
**Validation:**

```bash
# Check that everything is configured correctly
.\docker\docker-validate.ps1   # Windows
./docker/docker-validate.sh    # Linux/Mac
```
```mermaid
graph TB
    A[Client Applications] --> B[OpenLLM Monitor Proxy]
    B --> C{LLM Provider}
    C --> D[OpenAI]
    C --> E[Ollama]
    C --> F[OpenRouter]
    C --> G[Mistral AI]
    B --> H[Backend API]
    H --> I[MongoDB]
    H --> J[WebSocket Server]
    J --> K[React Dashboard]
    H --> L[Analytics Engine]
    H --> M[Cost Calculator]
    H --> N[Token Counter]
```
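Alongside latency, tokens, and cost, the monitor records retries per request. A minimal sketch of how a retry count could be captured with exponential backoff (`withRetries` and its defaults are illustrative, not the actual middleware API):

```javascript
// Retry an async call with exponential backoff, returning the result
// plus how many retries were needed (useful for logging).
async function withRetries(fn, { maxRetries = 3, baseDelayMs = 100 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      const result = await fn();
      return { result, retries: attempt };
    } catch (err) {
      lastError = err;
      if (attempt < maxRetries) {
        // Exponential backoff: 100ms, 200ms, 400ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

A logging layer can then store `retries` next to latency and token counts for each call.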
```
openllm-monitor/
├── backend/              # Node.js + Express + MongoDB
│   ├── controllers/      # API request handlers
│   ├── models/           # Database schemas & models
│   ├── routes/           # API route definitions
│   ├── middlewares/      # Custom middleware (LLM logger)
│   ├── services/         # LLM provider integrations
│   ├── utils/            # Helper functions & utilities
│   └── config/           # Configuration management
│
├── frontend/             # React + Vite + Tailwind
│   ├── src/components/   # Reusable UI components
│   ├── src/pages/        # Page-level components
│   ├── src/services/     # API communication layer
│   ├── src/hooks/        # Custom React hooks
│   ├── src/store/        # State management (Zustand)
│   └── public/           # Static assets
│
├── docker/               # Docker configuration
├── docs/                 # Documentation & guides
├── scripts/              # Setup & utility scripts
└── README.md             # You are here!
```
**Tech Stack:** Backend (Node.js + Express) • Frontend (React + Vite + Tailwind) • Database (MongoDB) • DevOps (Docker)
```bash
# Start the proxy server
npm run proxy
```

```javascript
// Your existing code works unchanged!
// All OpenAI calls are automatically logged
const response = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
});
```
```javascript
// Add to your existing application
const { LLMLogger } = require("openllm-monitor");

const logger = new LLMLogger({
  apiUrl: "http://localhost:3001",
});

// Wrap your LLM calls
const response = await logger.track(async () => {
  return await openai.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Explain quantum computing" }],
  });
});
```
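Under the hood, a wrapper like `track` mainly needs to time the call and capture success or failure. A simplified sketch of the idea (not the actual SDK internals):

```javascript
// Minimal sketch: time an async LLM call and record its outcome.
async function track(fn) {
  const start = Date.now();
  try {
    const result = await fn();
    return { result, latencyMs: Date.now() - start, status: "success" };
  } catch (err) {
    return {
      result: null,
      latencyMs: Date.now() - start,
      status: "error",
      error: String(err),
    };
  }
}
```

The real logger would also forward the captured record to the backend so it shows up on the dashboard.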
```javascript
// Get comprehensive analytics
const res = await fetch("/api/analytics", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    dateRange: "last-7-days",
    providers: ["openai", "ollama"],
    groupBy: "model",
  }),
});
const analytics = await res.json();

console.log(analytics);
// {
//   totalRequests: 1247,
//   totalCost: 23.45,
//   averageLatency: 850,
//   topModels: [...]
// }
```
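An aggregation like `groupBy: "model"` boils down to grouping log records and summing their metrics. A simplified in-memory version (the field names here are assumed to mirror the log schema):

```javascript
// Simplified sketch of a groupBy-model aggregation over log records.
function aggregateByModel(logs) {
  const groups = {};
  for (const log of logs) {
    if (!groups[log.model]) {
      groups[log.model] = { requests: 0, cost: 0, totalLatency: 0 };
    }
    const g = groups[log.model];
    g.requests += 1;
    g.cost += log.cost;
    g.totalLatency += log.latencyMs;
  }
  // Derive per-model averages for the dashboard.
  const summary = {};
  for (const [model, g] of Object.entries(groups)) {
    summary[model] = {
      requests: g.requests,
      cost: g.cost,
      avgLatency: g.totalLatency / g.requests,
    };
  }
  return summary;
}
```

In production, the same shape would typically be produced by a MongoDB aggregation pipeline rather than in application code.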
```javascript
// Compare the same prompt across providers
const comparison = await fetch("/api/replay/compare", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    prompt: "Write a haiku about coding",
    configurations: [
      { provider: "openai", model: "gpt-3.5-turbo" },
      { provider: "ollama", model: "llama2:7b" },
      { provider: "openrouter", model: "anthropic/claude-2" },
    ],
  }),
});
```
Create `backend/.env` from the template:

```bash
cp .env.example .env
```
**Essential Configuration:**
```env
# Database
MONGODB_URI=mongodb://localhost:27017/openllm-monitor

# Server
PORT=3001
NODE_ENV=development
FRONTEND_URL=http://localhost:5173

# LLM Provider API Keys
OPENAI_API_KEY=sk-your-openai-key-here
OPENROUTER_API_KEY=sk-your-openrouter-key-here
MISTRAL_API_KEY=your-mistral-key-here

# Ollama (Local)
OLLAMA_BASE_URL=http://localhost:11434

# Security
JWT_SECRET=your-super-secret-jwt-key
RATE_LIMIT_MAX_REQUESTS=100
```
Configure the providers you use in `backend/.env`:

- **OpenAI**: `OPENAI_API_KEY=sk-...`
- **Ollama**: run `ollama serve` and `ollama pull llama2`, then set `OLLAMA_BASE_URL=http://localhost:11434`
- **OpenRouter**: `OPENROUTER_API_KEY=sk-or-...`
- **Mistral AI**: `MISTRAL_API_KEY=...`

```bash
# Start MongoDB with Docker
docker-compose up -d mongodb

# Access the MongoDB Admin UI
open http://localhost:8081   # admin/admin
```
**MongoDB Atlas:** set `MONGODB_URI=mongodb+srv://...` in `.env`

**Windows:**

```powershell
# Download and install MongoDB Community Server
# https://www.mongodb.com/try/download/community

# Or with Chocolatey
choco install mongodb

# Start the MongoDB service
net start MongoDB
```
**macOS:**

```bash
# Install with Homebrew
brew tap mongodb/brew
brew install mongodb-community

# Start MongoDB
brew services start mongodb/brew/mongodb-community
```
**Ubuntu:**

```bash
# Install MongoDB
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | sudo apt-key add -
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/6.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-6.0.list
sudo apt-get update && sudo apt-get install -y mongodb-org

# Start MongoDB
sudo systemctl start mongod
sudo systemctl enable mongod
```
```bash
# Use our setup scripts
./scripts/setup-mongodb.sh    # Linux/Mac
.\scripts\setup-mongodb.ps1   # Windows PowerShell
.\scripts\setup-mongodb.bat   # Windows CMD
```
```bash
# Production build and deploy
docker-compose -f docker/docker-compose.prod.yml up -d

# With a custom environment
docker-compose -f docker/docker-compose.prod.yml --env-file .env.production up -d
```
```bash
# Build and push to ECR
aws ecr get-login-password --region us-west-2 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-west-2.amazonaws.com
docker build -t openllm-monitor .
docker tag openllm-monitor:latest 123456789012.dkr.ecr.us-west-2.amazonaws.com/openllm-monitor:latest
docker push 123456789012.dkr.ecr.us-west-2.amazonaws.com/openllm-monitor:latest

# Deploy with ECS or EKS
```
```bash
# Use DigitalOcean App Platform
doctl apps create --spec .do/app.yaml

# Or deploy to a Droplet
docker-compose -f docker/docker-compose.prod.yml up -d
```
```bash
# Frontend only (with a separate backend)
cd frontend
npm run build

# Deploy the frontend to Vercel
vercel --prod

# Deploy the backend separately to Railway/Render
```
**Production Environment Variables:**

```env
NODE_ENV=production
MONGODB_URI=mongodb+srv://production-cluster/openllm-monitor
JWT_SECRET=super-secure-production-secret
CORS_ORIGIN=https://your-domain.com
RATE_LIMIT_MAX_REQUESTS=1000
LOG_LEVEL=info
```
**Security Checklist:**
| Endpoint | Method | Description |
|---|---|---|
| `/api/health` | GET | Service health check |
| `/api/info` | GET | API version & information |
| `/api/status` | GET | System status & metrics |
| Endpoint | Method | Description |
|---|---|---|
| `/api/logs` | GET | Retrieve logs with filtering |
| `/api/logs/:id` | GET | Get specific log details |
| `/api/logs/stats` | GET | Dashboard statistics |
| `/api/logs/export` | POST | Export logs (CSV/JSON) |
| `/api/analytics` | POST | Advanced analytics queries |
| Endpoint | Method | Description |
|---|---|---|
| `/api/replay` | POST | Replay a prompt |
| `/api/replay/compare` | POST | Compare across providers |
| `/api/replay/estimate` | POST | Get cost estimates |
| `/api/replay/models` | GET | Available models list |
| Endpoint | Method | Description |
|---|---|---|
| `/api/providers` | GET | List provider configs |
| `/api/providers/:id` | PUT | Update provider settings |
| `/api/providers/:id/test` | POST | Test provider connection |
```javascript
// Real-time events
socket.on("new-log", (log) => {
  console.log("New request:", log);
});

socket.on("stats-update", (stats) => {
  console.log("Updated stats:", stats);
});

socket.on("error-alert", (error) => {
  console.log("Error detected:", error);
});
```
```bash
# Backend tests
cd backend
npm test                # Run all tests
npm run test:watch      # Watch mode
npm run test:coverage   # With coverage
```

```bash
# Frontend tests
cd frontend
npm test                # Run all tests
npm run test:ui         # UI test runner
npm run test:coverage   # With coverage
```
| Component | Coverage | Status |
|---|---|---|
| Backend API | 85% | ✅ Good |
| Frontend Components | 78% | ✅ Good |
| Integration Tests | 92% | ✅ Excellent |
| E2E Tests | 65% | ⚠️ Needs Work |
```bash
# 1. Fork & clone
git clone https://github.com/prajeesh-chavan/openllm-monitor.git
cd openllm-monitor

# 2. Create a feature branch
git checkout -b feature/amazing-feature

# 3. Start development
npm run dev:all   # Start all services

# 4. Make changes & test
npm test
npm run lint

# 5. Commit & push
git commit -m "feat: add amazing feature"
git push origin feature/amazing-feature

# 6. Create a pull request
```
| Component | Status | Progress |
|---|---|---|
| ✅ **Backend API** | Complete | 100% |
| ✅ **Database Models** | Complete | 100% |
| ✅ **Provider Services** | Complete | 95% |
| ✅ **WebSocket Server** | Complete | 100% |
| ✅ **Frontend Dashboard** | Complete | 90% |
| ✅ **Analytics Engine** | Complete | 85% |
| 🚧 **Mobile App** | In Progress | 30% |
| 📋 **API v2** | Planned | 0% |
We love contributions! Here's how you can help make OpenLLM Monitor even better:

1. Fork the project
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'feat: Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request

We especially welcome contributions in these areas: **Bug Reports**, **Feature Requests**, **Documentation**, and **Testing**.
We use Conventional Commits:
```
feat: add new dashboard widget
fix: resolve login issue
docs: update API documentation
test: add unit tests for analytics
refactor: optimize database queries
chore: update dependencies
```
Thanks to all the amazing people who have contributed to this project!
Support channels: **Documentation** • **Discussions** • **Issues** • **Email**
**Common Issues:**
```bash
# Check if MongoDB is running
docker ps | grep mongo

# Restart MongoDB
docker-compose restart mongodb

# Check logs
docker-compose logs mongodb
```
```bash
# Find what's using the port
netstat -tulpn | grep :3001

# Kill the process
kill -9 <PID>

# Or change the port in .env
PORT=3002
```
```bash
# Check Ollama status
ollama ps

# Restart Ollama
ollama serve

# Check logs
tail -f ~/.ollama/logs/server.log
```
This project is licensed under the MIT License - see the LICENSE file for details.
**Copyright © 2024-2025 Prajeesh Chavan**
MIT License
Copyright (c) 2024-2025 Prajeesh Chavan
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
**About Me:**
I'm a passionate full-stack developer with expertise in modern web technologies and AI/ML systems. I built OpenLLM Monitor to solve the real-world challenge of monitoring and optimizing LLM usage across different providers. This project represents my commitment to creating tools that help developers work more efficiently with AI technologies.
**Skills & Technologies:**
**Connect with me if you have questions about the project, want to collaborate, or want to discuss opportunities!**
This project represents months of dedicated development and continuous improvement. Here's what makes it special:
**If you find this project valuable, please:**
**Built with ❤️ by Prajeesh Chavan for the LLM developer community**
This project is the result of extensive research, development, and testing to provide the best LLM monitoring experience. If this project helped you, please consider giving it a ⭐ on GitHub and connecting with me!
**Creator:** Prajeesh Chavan • **License:** MIT • **Year:** 2024-2025