localGPT/docker.env
fix: resolve Docker networking issue for Ollama connectivity (Devin AI, fb75541eb3)
- Modified OllamaClient to read OLLAMA_HOST environment variable
- Updated docker-compose.yml to pass OLLAMA_HOST to backend service
- Changed docker.env to use Docker gateway IP (172.18.0.1:11434)
- Configured Ollama service to bind to 0.0.0.0:11434 for container access
- Added test script to verify Ollama connectivity from within container
- All backend tests now pass including chat functionality

Co-Authored-By: PromptEngineer <jnfarooq@outlook.com>
2025-07-15 21:34:17 +00:00
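The first bullet above describes `OllamaClient` reading the `OLLAMA_HOST` environment variable. A minimal sketch of that resolution order follows; the class name comes from the commit message, but the constructor signature, attribute name, and localhost fallback are assumptions:

```python
import os


class OllamaClient:
    """Sketch: resolve the Ollama base URL from the environment."""

    def __init__(self, base_url=None):
        # Prefer an explicit argument, then OLLAMA_HOST (set via docker.env
        # and passed through docker-compose.yml), then a localhost fallback
        # for non-Docker runs. Trailing slashes are stripped for clean joins.
        self.base_url = (
            base_url
            or os.environ.get("OLLAMA_HOST")
            or "http://localhost:11434"
        ).rstrip("/")


if __name__ == "__main__":
    print(OllamaClient().base_url)
```

With this order, the containerized backend picks up the gateway address from `docker.env`, while the same code still works outside Docker.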


# Docker environment configuration
# Set OLLAMA_HOST to reach a local Ollama instance running on the host.
# Note: the Docker gateway IP is used instead of host.docker.internal for Linux
# compatibility. The gateway address depends on your Docker network and may
# differ on your machine; check it with `docker network inspect <network>`.
OLLAMA_HOST=http://172.18.0.1:11434
# Alternative: use the containerized Ollama service instead
# (uncomment this line and run with --profile with-ollama)
# OLLAMA_HOST=http://ollama:11434
# Other configuration
NODE_ENV=production
NEXT_PUBLIC_API_URL=http://localhost:8000
RAG_API_URL=http://rag-api:8001
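The commit also mentions a test script that verifies Ollama connectivity from within the container. A minimal sketch of such a check, assuming Ollama's standard `/api/tags` endpoint; the function name is hypothetical:

```python
import os
import urllib.error
import urllib.request


def ollama_reachable(base_url=None, timeout=3.0):
    """Return True if an Ollama server answers at <base_url>/api/tags."""
    base_url = (
        base_url
        or os.environ.get("OLLAMA_HOST")
        or "http://localhost:11434"
    ).rstrip("/")
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Covers DNS failures, refused connections, and timeouts.
        return False


if __name__ == "__main__":
    host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    state = "reachable" if ollama_reachable() else "unreachable"
    print(f"Ollama at {host}: {state}")
```

Running this inside the backend container confirms whether the gateway IP in `docker.env` actually routes to the host's Ollama instance.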