Quick Start
The fastest way to self-host Orca Memory is with Docker Compose.

1. Clone the Repository
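A typical clone step, assuming the code lives in a Git repository (the URL below is a placeholder, not the real one):

```bash
# Clone the repository and change into it (URL is illustrative)
git clone https://github.com/your-org/orca-memory.git
cd orca-memory
```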
2. Configure Environment
Copy the example environment file and edit .env with your configuration. See Environment Variables for all options.
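Assuming the repository ships an example file named .env.example (the exact filename may differ), the copy step is:

```bash
# Copy the example file, then open .env and fill in your values
cp .env.example .env
```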
3. Start Services

Bring the stack up with Docker Compose (the command is sketched after this list). This starts:
- Dashboard (port 3000)
- Embeddings service (port 8000)
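A sketch of the start command, run from the directory that contains the default docker-compose.yml:

```bash
# Build (if needed) and start all services in the background
docker-compose up -d

# Confirm both containers are up
docker-compose ps
```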
4. Initialize Convex
In a separate terminal, deploy the Convex backend:
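The Convex CLI is normally run through npx; whether this project expects convex deploy or convex dev depends on its configuration, so treat the following as a sketch:

```bash
# Deploy the Convex backend (requires a Convex login or deploy key)
npx convex deploy
```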
Docker Compose Configuration

The default docker-compose.yml defines the dashboard and embeddings services described above:
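The real file ships with the repository; the sketch below only illustrates the shape implied by this page (a dashboard on port 3000 and an embeddings service on port 8000). Service names, build contexts, and everything else are assumptions:

```yaml
# Illustrative shape only; see the docker-compose.yml in the repository for the real definition
services:
  dashboard:
    build: ./dashboard        # assumed build context
    ports:
      - "3000:3000"
    env_file:
      - .env
    depends_on:
      - embeddings

  embeddings:
    build: ./embeddings       # assumed build context
    ports:
      - "8000:8000"
    env_file:
      - .env
```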
Production Deployment
For production, consider the following.

Reverse Proxy
Use nginx or Traefik as a reverse proxy:
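A minimal nginx server block, assuming the dashboard is reachable on localhost:3000; the domain is a placeholder and TLS is omitted for brevity:

```nginx
# Forward external traffic to the dashboard container
server {
    listen 80;
    server_name memory.example.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```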
Health Checks

The embeddings service exposes a health endpoint:
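The exact path is not documented here; /health is used below as an assumption:

```bash
# Expect HTTP 200 once the service is ready
curl -f http://localhost:8000/health
```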
Scaling

For high availability:

- Run multiple dashboard instances behind a load balancer
- The embeddings service can be scaled horizontally (see the sketch after this list)
- Convex handles scaling automatically
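One way to add embeddings replicas with Compose alone is the --scale flag; note that a fixed host-port mapping like 8000:8000 has to be removed or turned into a range before replicas can coexist. The service name is an assumption:

```bash
# Start three embeddings replicas
docker-compose up -d --scale embeddings=3
```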
Updating
To update to the latest version:
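A typical update sequence, assuming the stack was deployed from a clone of the repository:

```bash
# Pull the latest source and images, then recreate the containers
git pull
docker-compose pull
docker-compose up -d --build
```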
Troubleshooting

Container won't start
Check logs with docker-compose logs -f. Common issues:

- Missing environment variables
- Port conflicts
- Insufficient memory
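To narrow things down, inspect one service at a time; the service names below assume the compose sketch earlier on this page:

```bash
# Follow logs for a single service
docker-compose logs -f dashboard
docker-compose logs -f embeddings

# Check whether another process already holds a published port
lsof -i :3000
```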
Can't connect to Convex
Ensure your CONVEX_URL is correct and the Convex project is deployed.
Embeddings service slow
The first request loads the model into memory. Subsequent requests are faster. Consider a GPU-enabled instance for better performance.

