Quickstart

Welcome to the oryelle quickstart guide! This guide walks you through setting up a local oryelle instance with Docker Compose, covering these essential components:

  • oryelle frontend and backend containers
  • Traefik as the reverse proxy
  • PostgreSQL database
  • Your choice of AI model provider:
    • Local Ollama installation, or
    • OpenAI-compatible endpoint (we'll use the OpenAI API in this example, but any OpenAI-compatible endpoint will work!)
  • A file system plugin for configuration management
Info: For detailed explanations of the configuration, see our Detailed Setup Guide.

The steps below follow the Ollama setup. If you'd rather use an OpenAI-compatible endpoint, skip step 1 and point the model configuration in step 2 at your provider instead.

1. Install Ollama and download models

First, install Ollama from ollama.com. Once installed, start the service:

Terminal
ollama serve
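
To confirm the service is up, you can hit its HTTP endpoint; it should reply with a short "Ollama is running" status message:

Terminal
curl http://localhost:11434/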

Next, we'll install llama3.2, a compact LLM by Meta. It's lightweight enough to run on most devices, but that small footprint comes at the cost of some response quality. Feel free to substitute any other model that supports tool calling - just remember to adjust the configuration accordingly.

Terminal
ollama pull llama3.2:latest
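
If you want a quick smoke test, you can send the model a one-off prompt straight from the terminal (the first response may take a moment while the model loads):

Terminal
ollama run llama3.2:latest "Reply with one word: ready"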

For chat title generation, we'll add Google's compact gemma3 model:

Terminal
ollama pull gemma3:1b

Finally, we'll pull an embedding model, which allows the model to search through files:

Terminal
ollama pull mxbai-embed-large:latest
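
All three models should now appear in your local model list:

Terminal
ollama list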

2. Create a config file for oryelle

Create your configuration file with these initial settings:

/your/setup/config.json
{
  "version": "1",
  "authentication": "password",
  "users": [
    {
      "name": "Admin",
      "role": "admin",
      "email": "[email protected]",
      "password": "password1234"
    }
  ],
  "requireMFA": false,
  "mcpServers": [
    {
      "id": "tools.oryelle.dev/filesystem/",
      "name": "Filesystem Tool",
      "description": "A tool that searches the filesystem for a given query.",
      "shortDescription": "A tool that searches the filesystem for a given query.",
      "type": "Local",
      "schemaVersion": "1.0.0",
      "version": "1.0.0",
      "requirements": null,
      "config": {
        "transport": "stdio",
        "command": "npx",
        "args": [
          "-y",
          "@modelcontextprotocol/server-filesystem",
          "/app/config"
        ],
        "restart": { "enabled": true, "maxAttempts": 3, "delayMs": 1000 }
      }
    }
  ],
  "models": [
    {
      "id": "models.oryelle.dev/llama3.2/",
      "name": "Llama 3.2",
      "description": "Local Llama 3.2 model - A powerful open source large language model developed by Meta, optimized for local deployment and inference. Provides strong natural language understanding and generation capabilities while maintaining reasonable resource requirements.",
      "shortDescription": "Local Llama 3.2 model developed by Meta",
      "supportsTools": true,
      "supportsImages": false,
      "type": "Local",
      "version": "1.0.0",
      "schemaVersion": "1.0.0",
      "requirements": null,
      "graph": "react",
      "adapter": "ollama",
      "adapterConfig": {
        "model": "llama3.2:latest",
        "temperature": 0.5,
        "baseUrl": "http://host.docker.internal:11434"
      }
    },
    {
      "id": "models.oryelle.dev/gemma3:1b/",
      "name": "Gemma Mini",
      "description": "Local Gemma3 1B model - A compact open source large language model developed by Google. Provides strong natural language understanding and generation capabilities while maintaining reasonable resource requirements.",
      "shortDescription": "Local Gemma3 1B model developed by Google",
      "supportsTools": false,
      "supportsImages": false,
      "type": "Local",
      "version": "1.0.0",
      "schemaVersion": "1.0.0",
      "requirements": null,
      "graph": "react",
      "adapter": "ollama",
      "adapterConfig": {
        "model": "gemma3:1b",
        "temperature": 0.3,
        "baseUrl": "http://host.docker.internal:11434"
      }
    }
  ],
  "chatNameGenerationModel": {
    "adapter": "ollama",
    "adapterConfig": {
      "model": "gemma3:1b",
      "temperature": 0.7,
      "baseUrl": "http://host.docker.internal:11434"
    }
  },
  "embeddingModel": {
    "adapter": "ollama",
    "adapterConfig": {
      "model": "mxbai-embed-large",
      "baseUrl": "http://host.docker.internal:11434",
      "maxRetries": 2
    }
  }
}
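
JSON is strict about trailing commas and quoting, so it's worth validating the file before starting the stack. If you have jq installed, an empty filter will surface any syntax errors:

Terminal
jq empty /your/setup/config.json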

3. Create the Docker Compose file

Create a Docker Compose file with this configuration:

/your/setup/docker-compose.yml
services:
  traefik:
    image: traefik:v2.10
    command:
      - "--api.insecure=false"
      - "--providers.docker=true"
      - "--providers.docker.exposedbydefault=false"
      - "--entrypoints.web.address=:80"
    networks:
      - default
    ports:
      - "3000:80"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
  frontend:
    container_name: frontend
    image: ghcr.io/oryelle/oryelle-frontend:latest
    restart: always
    depends_on:
      - backend
    networks:
      - default
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.frontend.rule=PathPrefix(`/`) && !PathPrefix(`/api/v1`)"
      - "traefik.http.routers.frontend.entrypoints=web"
      - "traefik.http.services.frontend.loadbalancer.server.port=3000"
  backend:
    container_name: backend
    image: ghcr.io/oryelle/oryelle-backend:latest
    restart: always
    environment:
      - DB_USER=postgres
      - DB_PASSWORD=password
      - DB_HOST=db
      - DB_PORT=5432
      - DB_NAME=oryelle
      - DB_SSL=false
      # Generate your own secret in production!
      - SESSION_SECRET=98df1c68fcd23d6f3ab08623e6985af421eb41b95e5cb6bf8f87c2edf893f469
      - COOKIE_MAX_AGE=2592000000
      - COOKIE_SAMESITE=lax
      - LOG_LEVEL=debug
      - LOG_FILES=false
    networks:
      - default
    volumes:
      - app_data:/app/data
      - ./:/app/config:ro
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.backend.rule=PathPrefix(`/api/v1`)"
      - "traefik.http.routers.backend.entrypoints=web"
      - "traefik.http.services.backend.loadbalancer.server.port=3333"
    depends_on:
      db:
        condition: service_healthy
  db:
    image: pgvector/pgvector:pg17
    restart: always
    shm_size: 128mb
    healthcheck:
      # $$ escapes Compose interpolation so the variables are expanded
      # inside the container, where POSTGRES_USER and POSTGRES_DB are set
      test: ["CMD-SHELL", "pg_isready -U $$POSTGRES_USER -d $$POSTGRES_DB"]
      interval: 30s
      timeout: 60s
      retries: 5
      start_period: 20s
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - default
    environment:
      POSTGRES_PASSWORD: password
      POSTGRES_USER: postgres
      POSTGRES_DB: oryelle
networks:
  default:
    internal: false
volumes:
  postgres_data:
  app_data:
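
Note: the baseUrl values in config.json point at host.docker.internal, which resolves to the host machine automatically on Docker Desktop (macOS and Windows). On a plain Linux Docker Engine it typically does not, so you may need to add an extra_hosts entry ("host.docker.internal:host-gateway") to the backend service so the containers can reach your local Ollama instance.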

4. Start oryelle

Launch your oryelle instance with Docker Compose:

Terminal
docker compose up -d
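
Once the containers are up, you can check their status and follow the backend logs if anything looks off:

Terminal
docker compose ps
docker compose logs -f backend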

Congratulations! Your local oryelle instance is now running and reachable via http://localhost:3000.

Important Security Notice

Before deploying oryelle to the internet, please ensure you:

  1. Review our Detailed Setup Guide for critical security configurations
  2. Follow best practices in our Security Considerations documentation
  3. Consider implementing Cloudflare Zero Trust to secure your instance with enterprise-grade protection

Proper security configuration is essential for protecting your oryelle deployment and its data from unauthorized access and potential threats.