mirror of
https://github.com/zebrajr/ollama-webui.git
synced 2025-12-06 12:19:46 +01:00
* feat: improve ollama model management experience
This commit introduces several improvements to the Ollama model management modal:
- Adds a cancel button to the model pulling operation, using the existing 'x' button pattern.
- Adds a cancel button to the "Update All" models operation, allowing the user to cancel the update for the currently processing model.
- Cleans up toast notifications when updating all models. A single toast is now shown at the beginning and a summary toast at the end, preventing notification spam.
- Refactors the `ManageOllama.svelte` component to support these new cancellation features.
- Adds tooltips to all buttons in the modal to improve clarity.
- Disables buttons when their corresponding input fields are empty to prevent accidental clicks.
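The cancellation behaviour described above can be illustrated with an analogous asyncio sketch. This is only a hedged illustration — the real implementation lives in `ManageOllama.svelte` and aborts an in-flight frontend request; `pull_model` and the timings here are hypothetical placeholders:

```python
import asyncio


async def pull_model(name: str) -> str:
    # Placeholder for a long-running model pull.
    await asyncio.sleep(10)
    return f"{name} pulled"


async def main() -> str:
    task = asyncio.create_task(pull_model("llama3"))
    await asyncio.sleep(0)  # let the task start; the user then clicks the 'x' button
    task.cancel()           # cancel the in-flight pull
    try:
        await task
    except asyncio.CancelledError:
        return "cancelled"
    return "completed"


print(asyncio.run(main()))  # cancelled
```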
* fix
* i18n: improve Chinese translation
* fix: handle non‑UTF8 chars in third‑party responses without error
* German translation of new strings in i18n
* log web search queries only with level 'debug' instead of 'info'
* Tool calls now only include text and don't include other content such as base64-encoded images
* fix onedrive
* fix: discovery url
* fix: default permissions not being loaded
* fix: ai hallucination
* fix: non rich text input copy
* refac: rm print statements
* refac: disable direct models from model editors
* refac/fix: do not process xlsx files with azure doc intelligence
* Update pull_request_template.md
* Update generated image translation in DE-de
* added missing danish translations
* feat(onedrive): Enable search and "My Organization" pivot
* style(onedrive): Formatting fix
* feat: Implement toggling for vertical and horizontal flow layouts
This commit introduces the necessary logic and UI controls to allow users to switch the Flow component layout between vertical and horizontal orientations.
* **`Flow.svelte` Refactoring:**
* Updates logic for calculating level offsets and node positions to consistently respect the current flow orientation.
* Adds a control panel using `<Controls>` and `<SwitchButton>` components.
* Provides user interface elements to easily switch the flow layout between horizontal and vertical orientations.
* build(deps): bump pydantic from 2.11.7 to 2.11.9 in /backend
Bumps [pydantic](https://github.com/pydantic/pydantic) from 2.11.7 to 2.11.9.
- [Release notes](https://github.com/pydantic/pydantic/releases)
- [Changelog](https://github.com/pydantic/pydantic/blob/v2.11.9/HISTORY.md)
- [Commits](https://github.com/pydantic/pydantic/compare/v2.11.7...v2.11.9)
---
updated-dependencies:
- dependency-name: pydantic
dependency-version: 2.11.9
dependency-type: direct:production
update-type: version-update:semver-patch
...
Signed-off-by: dependabot[bot] <support@github.com>
* build(deps): bump black from 25.1.0 to 25.9.0 in /backend
Bumps [black](https://github.com/psf/black) from 25.1.0 to 25.9.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/25.1.0...25.9.0)
---
updated-dependencies:
- dependency-name: black
dependency-version: 25.9.0
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com>
* build(deps): bump markdown from 3.8.2 to 3.9 in /backend
Bumps [markdown](https://github.com/Python-Markdown/markdown) from 3.8.2 to 3.9.
- [Release notes](https://github.com/Python-Markdown/markdown/releases)
- [Changelog](https://github.com/Python-Markdown/markdown/blob/master/docs/changelog.md)
- [Commits](https://github.com/Python-Markdown/markdown/compare/3.8.2...3.9.0)
---
updated-dependencies:
- dependency-name: markdown
dependency-version: '3.9'
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com>
* build(deps): bump chromadb from 1.0.20 to 1.1.0 in /backend
Bumps [chromadb](https://github.com/chroma-core/chroma) from 1.0.20 to 1.1.0.
- [Release notes](https://github.com/chroma-core/chroma/releases)
- [Changelog](https://github.com/chroma-core/chroma/blob/main/RELEASE_PROCESS.md)
- [Commits](https://github.com/chroma-core/chroma/compare/1.0.20...1.1.0)
---
updated-dependencies:
- dependency-name: chromadb
dependency-version: 1.1.0
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com>
* build(deps): bump opentelemetry-api from 1.36.0 to 1.37.0
Bumps [opentelemetry-api](https://github.com/open-telemetry/opentelemetry-python) from 1.36.0 to 1.37.0.
- [Release notes](https://github.com/open-telemetry/opentelemetry-python/releases)
- [Changelog](https://github.com/open-telemetry/opentelemetry-python/blob/main/CHANGELOG.md)
- [Commits](https://github.com/open-telemetry/opentelemetry-python/compare/v1.36.0...v1.37.0)
---
updated-dependencies:
- dependency-name: opentelemetry-api
dependency-version: 1.37.0
dependency-type: direct:production
update-type: version-update:semver-minor
...
Signed-off-by: dependabot[bot] <support@github.com>
* refac: ollama embed form data
* fix: non rich text handling
* fix: oauth client registration
* refac
* chore: dep bump
* chore: fastapi bump
* chore/refac: bump bcrypt and remove passlib
* Improving Korean Translation
* refac
* Improving Korean Translation
* feat: PWA share_target implementation
Co-Authored-By: gjveld <19951982+gjveld@users.noreply.github.com>
* refac: message input mobile detection behaviour
* feat: model_ids per folder
* Update translation.json (pt-BR)
Includes new translations for items that were recently added
* refac
* refac
* refac
* refac
* refac/fix: temp chat
* refac
* refac: stop task
* refac/fix: azure audio escape
* refac: external tool validation
* refac/enh: start.sh additional args support
* refac
* refac: styling
* refac/fix: direct connection floating action buttons
* refac/fix: system prompt duplication
* refac/enh: openai tts additional params support
* refac
* feat: load data in parallel to accelerate page loading speed
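The commit above parallelises independent page-load requests. A hedged sketch of the idea using Python's asyncio, with made-up `fetch` placeholders (the actual change is in the frontend, where the equivalent is issuing the requests concurrently instead of awaiting them one by one):

```python
import asyncio


async def fetch(name: str) -> str:
    # Placeholder for an independent network request.
    await asyncio.sleep(0.01)
    return f"{name} loaded"


async def load_page_data() -> list[str]:
    # Issue the independent requests concurrently rather than sequentially,
    # so total latency is the slowest request instead of the sum of all.
    models, prompts, tools = await asyncio.gather(
        fetch("models"), fetch("prompts"), fetch("tools")
    )
    return [models, prompts, tools]


print(asyncio.run(load_page_data()))  # ['models loaded', 'prompts loaded', 'tools loaded']
```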
* i18n: improve Chinese translation
* refac
* refac: model selector
* UPD: i18n es-ES Translation v0.6.33
Updated new strings.
* refac
* improved query perf by querying only relevant columns
* refac/enh: docling params
* refac
* refac: openai additional headers support
* refac
* FEAT: Add Vega Chart Visualizer Renderer
Feature requested in https://github.com/open-webui/open-webui/discussions/18022
Added the npm vega library to package.json
Added a function for the visualization renderer to src/libs/utils/index.ts
Added logic to src/lib/components/chat/Messages/CodeBlock.svelte
The treatment is similar to that for mermaid diagrams.
Reference: https://vega.github.io/vega/
* refac
* chore
* refac
* FEAT: Add Vega-Lite Chart Visualizer Renderer
Adds support for Vega-Lite specifications.
Vega-Lite is a higher-level grammar that compiles down to Vega; to be rendered with Vega, a Vega-Lite spec must first be compiled.
This PR adds the check and compiles when necessary, complementing the recently added Vega renderer feature.
* refac
* refac/fix: switch
* enh/refac: url input handling
* refac
* refac: styling
* UPD: Add Validators & Error Toast for Mermaid & Vega diagrams
Description:
Generated or hand-entered diagrams often contain syntax errors, so they fail to render; without any notification it is difficult to know what happened.
This PR adds validation and a toast notification for errors in Mermaid and Vega/Vega-Lite diagrams, helping the user fix them.
* removed redundant knowledge API call
* Fix Code Format
* refac: model workspace view
* refac
* refac: knowledge
* refac: prompts
* refac: tools
* refac
* feat: attach folder
* refac: make tencentcloud-sdk-python optional
* refac/fix: oauth
* enh: ENABLE_OAUTH_EMAIL_FALLBACK
* refac/fix: folders
* Update requirements.txt
* Update pyproject.toml
* UPD: Add Validators & Error Toast for Mermaid & Vega diagrams
Description:
Generated or hand-entered diagrams often contain syntax errors, so they fail to render; without any notification it is difficult to know what happened.
This PR adds validation and a toast notification for errors in Mermaid and Vega/Vega-Lite diagrams, helping the user fix them.
Note:
Another option for integrating this graph visualizer is its Svelte component: https://github.com/vega/svelte-vega/tree/main/packages/svelte-vega
* Removed unused toast import & Code Format
* refac
* refac: external tool server view
* refac
* refac: overview
* refac: styling
* refac
* Update bug_report.yaml
* refac
* refac
* refac
* refac
* refac: oauth client fallback
* Fixed: Cannot handle batch sizes > 1 if no padding token is defined
Fixes the "Cannot handle batch sizes > 1 if no padding token is defined" error for reranker models whose config lacks a padding token, by using eos_token_id (when present) as pad_token_id.
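A minimal sketch of the fallback described above. `TokenizerConfig` is a hypothetical stand-in for the real model/tokenizer config, used here only to illustrate reusing the EOS token id as the pad token id:

```python
class TokenizerConfig:
    """Hypothetical stand-in for a reranker model config."""

    def __init__(self, pad_token_id=None, eos_token_id=2):
        self.pad_token_id = pad_token_id
        self.eos_token_id = eos_token_id


def ensure_pad_token(config):
    # Padding is required to batch sequences of different lengths;
    # if the config defines no pad token, fall back to the EOS token id.
    if config.pad_token_id is None and config.eos_token_id is not None:
        config.pad_token_id = config.eos_token_id
    return config


config = ensure_pad_token(TokenizerConfig())
print(config.pad_token_id)  # 2
```

An existing pad token is left untouched; the fallback only fires when none is defined.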
* refac: fallback to reasoning content
* fix(i18n): corrected typo in Spanish translation for "Reasoning Tags"
Typo fixed in Spanish translation file at line 1240 of `open-webui/src/lib/i18n/locales/es-ES/translation.json`:
- Incorrect: "Eriquetas de Razonamiento"
- Correct: "Etiquetas de Razonamiento"
This improves clarity and consistency in the UI.
* refac/fix: ENABLE_STAR_SESSIONS_MIDDLEWARE
* refac/fix: redirect
* refac
* refac
* refac
* refac: web search error handling
* refac: source parsing
* refac: functions
* refac
* refac/enh: note pdf export
* refac/fix: mcp oauth2.1
* chore: format
* chore: Changelog (#17995)
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* Update CHANGELOG.md
* refac
* chore: dep bump
---------
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: silentoplayz <jacwoo21@outlook.com>
Co-authored-by: Shirasawa <764798966@qq.com>
Co-authored-by: Jan Kessler <jakessle@uni-mainz.de>
Co-authored-by: Jacob Leksan <jacob.leksan@expedient.com>
Co-authored-by: Classic298 <27028174+Classic298@users.noreply.github.com>
Co-authored-by: sinejespersen <sinejespersen@protonmail.com>
Co-authored-by: Selene Blok <selene.blok@rws.nl>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Cyp <cypher9715@naver.com>
Co-authored-by: gjveld <19951982+gjveld@users.noreply.github.com>
Co-authored-by: joaoback <156559121+joaoback@users.noreply.github.com>
Co-authored-by: _00_ <131402327+rgaricano@users.noreply.github.com>
Co-authored-by: expruc <eygabi01@gmail.com>
Co-authored-by: YetheSamartaka <55753928+YetheSamartaka@users.noreply.github.com>
Co-authored-by: Akutangulo <akutangulo@gmail.com>
816 lines
24 KiB
Python
import importlib.metadata
|
|
import json
|
|
import logging
|
|
import os
|
|
import pkgutil
|
|
import sys
|
|
import shutil
|
|
from uuid import uuid4
|
|
from pathlib import Path
|
|
from cryptography.hazmat.primitives import serialization
|
|
|
|
import markdown
|
|
from bs4 import BeautifulSoup
|
|
from open_webui.constants import ERROR_MESSAGES
|
|
|
|
####################################
|
|
# Load .env file
|
|
####################################
|
|
|
|
# Use .resolve() to get the canonical path, removing any '..' or '.' components
|
|
ENV_FILE_PATH = Path(__file__).resolve()
|
|
|
|
# OPEN_WEBUI_DIR should be the directory where env.py resides (open_webui/)
|
|
OPEN_WEBUI_DIR = ENV_FILE_PATH.parent
|
|
|
|
# BACKEND_DIR is the parent of OPEN_WEBUI_DIR (backend/)
|
|
BACKEND_DIR = OPEN_WEBUI_DIR.parent
|
|
|
|
# BASE_DIR is the parent of BACKEND_DIR (open-webui-dev/)
|
|
BASE_DIR = BACKEND_DIR.parent
|
|
|
|
try:
|
|
from dotenv import find_dotenv, load_dotenv
|
|
|
|
load_dotenv(find_dotenv(str(BASE_DIR / ".env")))
|
|
except ImportError:
|
|
print("dotenv not installed, skipping...")
|
|
|
|
DOCKER = os.environ.get("DOCKER", "False").lower() == "true"
|
|
|
|
# device type embedding models - "cpu" (default), "cuda" (nvidia gpu required) or "mps" (apple silicon) - choosing this right can lead to better performance
|
|
USE_CUDA = os.environ.get("USE_CUDA_DOCKER", "false")
|
|
|
|
if USE_CUDA.lower() == "true":
|
|
try:
|
|
import torch
|
|
|
|
assert torch.cuda.is_available(), "CUDA not available"
|
|
DEVICE_TYPE = "cuda"
|
|
except Exception as e:
|
|
cuda_error = (
|
|
"Error when testing CUDA but USE_CUDA_DOCKER is true. "
|
|
f"Resetting USE_CUDA_DOCKER to false: {e}"
|
|
)
|
|
os.environ["USE_CUDA_DOCKER"] = "false"
|
|
USE_CUDA = "false"
|
|
DEVICE_TYPE = "cpu"
|
|
else:
|
|
DEVICE_TYPE = "cpu"
|
|
|
|
try:
|
|
import torch
|
|
|
|
if torch.backends.mps.is_available() and torch.backends.mps.is_built():
|
|
DEVICE_TYPE = "mps"
|
|
except Exception:
|
|
pass
|
|
|
|
####################################
|
|
# LOGGING
|
|
####################################
|
|
|
|
GLOBAL_LOG_LEVEL = os.environ.get("GLOBAL_LOG_LEVEL", "").upper()
|
|
if GLOBAL_LOG_LEVEL in logging.getLevelNamesMapping():
|
|
logging.basicConfig(stream=sys.stdout, level=GLOBAL_LOG_LEVEL, force=True)
|
|
else:
|
|
GLOBAL_LOG_LEVEL = "INFO"
|
|
|
|
log = logging.getLogger(__name__)
|
|
log.info(f"GLOBAL_LOG_LEVEL: {GLOBAL_LOG_LEVEL}")
|
|
|
|
if "cuda_error" in locals():
|
|
log.exception(cuda_error)
|
|
del cuda_error
|
|
|
|
log_sources = [
|
|
"AUDIO",
|
|
"COMFYUI",
|
|
"CONFIG",
|
|
"DB",
|
|
"IMAGES",
|
|
"MAIN",
|
|
"MODELS",
|
|
"OLLAMA",
|
|
"OPENAI",
|
|
"RAG",
|
|
"WEBHOOK",
|
|
"SOCKET",
|
|
"OAUTH",
|
|
]
|
|
|
|
SRC_LOG_LEVELS = {}
|
|
|
|
for source in log_sources:
|
|
log_env_var = source + "_LOG_LEVEL"
|
|
SRC_LOG_LEVELS[source] = os.environ.get(log_env_var, "").upper()
|
|
if SRC_LOG_LEVELS[source] not in logging.getLevelNamesMapping():
|
|
SRC_LOG_LEVELS[source] = GLOBAL_LOG_LEVEL
|
|
log.info(f"{log_env_var}: {SRC_LOG_LEVELS[source]}")
|
|
|
|
log.setLevel(SRC_LOG_LEVELS["CONFIG"])
|
|
|
|
WEBUI_NAME = os.environ.get("WEBUI_NAME", "Open WebUI")
|
|
if WEBUI_NAME != "Open WebUI":
|
|
WEBUI_NAME += " (Open WebUI)"
|
|
|
|
WEBUI_FAVICON_URL = "https://openwebui.com/favicon.png"
|
|
|
|
TRUSTED_SIGNATURE_KEY = os.environ.get("TRUSTED_SIGNATURE_KEY", "")
|
|
|
|
####################################
|
|
# ENV (dev,test,prod)
|
|
####################################
|
|
|
|
ENV = os.environ.get("ENV", "dev")
|
|
|
|
FROM_INIT_PY = os.environ.get("FROM_INIT_PY", "False").lower() == "true"
|
|
|
|
if FROM_INIT_PY:
|
|
PACKAGE_DATA = {"version": importlib.metadata.version("open-webui")}
|
|
else:
|
|
try:
|
|
PACKAGE_DATA = json.loads((BASE_DIR / "package.json").read_text())
|
|
except Exception:
|
|
PACKAGE_DATA = {"version": "0.0.0"}
|
|
|
|
VERSION = PACKAGE_DATA["version"]
|
|
INSTANCE_ID = os.environ.get("INSTANCE_ID", str(uuid4()))
|
|
|
|
|
|
# Function to parse each section
|
|
def parse_section(section):
|
|
items = []
|
|
for li in section.find_all("li"):
|
|
# Extract raw HTML string
|
|
raw_html = str(li)
|
|
|
|
# Extract text without HTML tags
|
|
text = li.get_text(separator=" ", strip=True)
|
|
|
|
# Split into title and content
|
|
parts = text.split(": ", 1)
|
|
title = parts[0].strip() if len(parts) > 1 else ""
|
|
content = parts[1].strip() if len(parts) > 1 else text
|
|
|
|
items.append({"title": title, "content": content, "raw": raw_html})
|
|
return items
|
|
|
|
|
|
try:
|
|
changelog_path = BASE_DIR / "CHANGELOG.md"
|
|
with open(str(changelog_path.absolute()), "r", encoding="utf8") as file:
|
|
changelog_content = file.read()
|
|
|
|
except Exception:
|
|
changelog_content = (pkgutil.get_data("open_webui", "CHANGELOG.md") or b"").decode()
|
|
|
|
# Convert markdown content to HTML
|
|
html_content = markdown.markdown(changelog_content)
|
|
|
|
# Parse the HTML content
|
|
soup = BeautifulSoup(html_content, "html.parser")
|
|
|
|
# Initialize JSON structure
|
|
changelog_json = {}
|
|
|
|
# Iterate over each version
|
|
for version in soup.find_all("h2"):
|
|
version_number = version.get_text().strip().split(" - ")[0][1:-1] # Remove brackets
|
|
date = version.get_text().strip().split(" - ")[1]
|
|
|
|
version_data = {"date": date}
|
|
|
|
# Find the next sibling that is a h3 tag (section title)
|
|
current = version.find_next_sibling()
|
|
|
|
while current and current.name != "h2":
|
|
if current.name == "h3":
|
|
section_title = current.get_text().lower() # e.g., "added", "fixed"
|
|
section_items = parse_section(current.find_next_sibling("ul"))
|
|
version_data[section_title] = section_items
|
|
|
|
# Move to the next element
|
|
current = current.find_next_sibling()
|
|
|
|
changelog_json[version_number] = version_data
|
|
|
|
CHANGELOG = changelog_json
|
|
|
|
####################################
|
|
# SAFE_MODE
|
|
####################################
|
|
|
|
SAFE_MODE = os.environ.get("SAFE_MODE", "false").lower() == "true"
|
|
|
|
|
|
####################################
|
|
# ENABLE_FORWARD_USER_INFO_HEADERS
|
|
####################################
|
|
|
|
ENABLE_FORWARD_USER_INFO_HEADERS = (
|
|
os.environ.get("ENABLE_FORWARD_USER_INFO_HEADERS", "False").lower() == "true"
|
|
)
|
|
|
|
# Experimental feature, may be removed in future
|
|
ENABLE_STAR_SESSIONS_MIDDLEWARE = (
|
|
os.environ.get("ENABLE_STAR_SESSIONS_MIDDLEWARE", "False").lower() == "true"
|
|
)
|
|
|
|
####################################
|
|
# WEBUI_BUILD_HASH
|
|
####################################
|
|
|
|
WEBUI_BUILD_HASH = os.environ.get("WEBUI_BUILD_HASH", "dev-build")
|
|
|
|
####################################
|
|
# DATA/FRONTEND BUILD DIR
|
|
####################################
|
|
|
|
DATA_DIR = Path(os.getenv("DATA_DIR", BACKEND_DIR / "data")).resolve()
|
|
|
|
if FROM_INIT_PY:
|
|
NEW_DATA_DIR = Path(os.getenv("DATA_DIR", OPEN_WEBUI_DIR / "data")).resolve()
|
|
NEW_DATA_DIR.mkdir(parents=True, exist_ok=True)
|
|
|
|
# Check if the data directory exists in the package directory
|
|
if DATA_DIR.exists() and DATA_DIR != NEW_DATA_DIR:
|
|
log.info(f"Moving {DATA_DIR} to {NEW_DATA_DIR}")
|
|
for item in DATA_DIR.iterdir():
|
|
dest = NEW_DATA_DIR / item.name
|
|
if item.is_dir():
|
|
shutil.copytree(item, dest, dirs_exist_ok=True)
|
|
else:
|
|
shutil.copy2(item, dest)
|
|
|
|
# Zip the data directory
|
|
shutil.make_archive(DATA_DIR.parent / "open_webui_data", "zip", DATA_DIR)
|
|
|
|
# Remove the old data directory
|
|
shutil.rmtree(DATA_DIR)
|
|
|
|
DATA_DIR = Path(os.getenv("DATA_DIR", OPEN_WEBUI_DIR / "data"))
|
|
|
|
STATIC_DIR = Path(os.getenv("STATIC_DIR", OPEN_WEBUI_DIR / "static"))
|
|
|
|
FONTS_DIR = Path(os.getenv("FONTS_DIR", OPEN_WEBUI_DIR / "static" / "fonts"))
|
|
|
|
FRONTEND_BUILD_DIR = Path(os.getenv("FRONTEND_BUILD_DIR", BASE_DIR / "build")).resolve()
|
|
|
|
if FROM_INIT_PY:
|
|
FRONTEND_BUILD_DIR = Path(
|
|
os.getenv("FRONTEND_BUILD_DIR", OPEN_WEBUI_DIR / "frontend")
|
|
).resolve()
|
|
|
|
####################################
|
|
# Database
|
|
####################################
|
|
|
|
# Check if the file exists
|
|
if os.path.exists(f"{DATA_DIR}/ollama.db"):
|
|
# Rename the file
|
|
os.rename(f"{DATA_DIR}/ollama.db", f"{DATA_DIR}/webui.db")
|
|
log.info("Database migrated from Ollama-WebUI successfully.")
|
|
else:
|
|
pass
|
|
|
|
DATABASE_URL = os.environ.get("DATABASE_URL", f"sqlite:///{DATA_DIR}/webui.db")
|
|
|
|
DATABASE_TYPE = os.environ.get("DATABASE_TYPE")
|
|
DATABASE_USER = os.environ.get("DATABASE_USER")
|
|
DATABASE_PASSWORD = os.environ.get("DATABASE_PASSWORD")
|
|
|
|
DATABASE_CRED = ""
|
|
if DATABASE_USER:
|
|
DATABASE_CRED += f"{DATABASE_USER}"
|
|
if DATABASE_PASSWORD:
|
|
DATABASE_CRED += f":{DATABASE_PASSWORD}"
|
|
|
|
DB_VARS = {
|
|
"db_type": DATABASE_TYPE,
|
|
"db_cred": DATABASE_CRED,
|
|
"db_host": os.environ.get("DATABASE_HOST"),
|
|
"db_port": os.environ.get("DATABASE_PORT"),
|
|
"db_name": os.environ.get("DATABASE_NAME"),
|
|
}
|
|
|
|
if all(DB_VARS.values()):
|
|
DATABASE_URL = f"{DB_VARS['db_type']}://{DB_VARS['db_cred']}@{DB_VARS['db_host']}:{DB_VARS['db_port']}/{DB_VARS['db_name']}"
|
|
elif DATABASE_TYPE == "sqlite+sqlcipher" and not os.environ.get("DATABASE_URL"):
|
|
# Handle SQLCipher with local file when DATABASE_URL wasn't explicitly set
|
|
DATABASE_URL = f"sqlite+sqlcipher:///{DATA_DIR}/webui.db"
|
|
|
|
# Replace the postgres:// with postgresql://
|
|
if "postgres://" in DATABASE_URL:
|
|
DATABASE_URL = DATABASE_URL.replace("postgres://", "postgresql://")
|
|
|
|
DATABASE_SCHEMA = os.environ.get("DATABASE_SCHEMA", None)
|
|
|
|
DATABASE_POOL_SIZE = os.environ.get("DATABASE_POOL_SIZE", None)
|
|
|
|
if DATABASE_POOL_SIZE != None:
|
|
try:
|
|
DATABASE_POOL_SIZE = int(DATABASE_POOL_SIZE)
|
|
except Exception:
|
|
DATABASE_POOL_SIZE = None
|
|
|
|
DATABASE_POOL_MAX_OVERFLOW = os.environ.get("DATABASE_POOL_MAX_OVERFLOW", 0)
|
|
|
|
if DATABASE_POOL_MAX_OVERFLOW == "":
|
|
DATABASE_POOL_MAX_OVERFLOW = 0
|
|
else:
|
|
try:
|
|
DATABASE_POOL_MAX_OVERFLOW = int(DATABASE_POOL_MAX_OVERFLOW)
|
|
except Exception:
|
|
DATABASE_POOL_MAX_OVERFLOW = 0
|
|
|
|
DATABASE_POOL_TIMEOUT = os.environ.get("DATABASE_POOL_TIMEOUT", 30)
|
|
|
|
if DATABASE_POOL_TIMEOUT == "":
|
|
DATABASE_POOL_TIMEOUT = 30
|
|
else:
|
|
try:
|
|
DATABASE_POOL_TIMEOUT = int(DATABASE_POOL_TIMEOUT)
|
|
except Exception:
|
|
DATABASE_POOL_TIMEOUT = 30
|
|
|
|
DATABASE_POOL_RECYCLE = os.environ.get("DATABASE_POOL_RECYCLE", 3600)
|
|
|
|
if DATABASE_POOL_RECYCLE == "":
|
|
DATABASE_POOL_RECYCLE = 3600
|
|
else:
|
|
try:
|
|
DATABASE_POOL_RECYCLE = int(DATABASE_POOL_RECYCLE)
|
|
except Exception:
|
|
DATABASE_POOL_RECYCLE = 3600
|
|
|
|
DATABASE_ENABLE_SQLITE_WAL = (
|
|
os.environ.get("DATABASE_ENABLE_SQLITE_WAL", "False").lower() == "true"
|
|
)
|
|
|
|
DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL = os.environ.get(
|
|
"DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL", None
|
|
)
|
|
if DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL is not None:
|
|
try:
|
|
DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL = float(
|
|
DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL
|
|
)
|
|
except Exception:
|
|
DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL = 0.0
|
|
|
|
RESET_CONFIG_ON_START = (
|
|
os.environ.get("RESET_CONFIG_ON_START", "False").lower() == "true"
|
|
)
|
|
|
|
ENABLE_REALTIME_CHAT_SAVE = (
|
|
os.environ.get("ENABLE_REALTIME_CHAT_SAVE", "False").lower() == "true"
|
|
)
|
|
|
|
ENABLE_QUERIES_CACHE = os.environ.get("ENABLE_QUERIES_CACHE", "False").lower() == "true"
|
|
|
|
####################################
|
|
# REDIS
|
|
####################################
|
|
|
|
REDIS_URL = os.environ.get("REDIS_URL", "")
|
|
REDIS_CLUSTER = os.environ.get("REDIS_CLUSTER", "False").lower() == "true"
|
|
|
|
REDIS_KEY_PREFIX = os.environ.get("REDIS_KEY_PREFIX", "open-webui")
|
|
|
|
REDIS_SENTINEL_HOSTS = os.environ.get("REDIS_SENTINEL_HOSTS", "")
|
|
REDIS_SENTINEL_PORT = os.environ.get("REDIS_SENTINEL_PORT", "26379")
|
|
|
|
# Maximum number of retries for Redis operations when using Sentinel fail-over
|
|
REDIS_SENTINEL_MAX_RETRY_COUNT = os.environ.get("REDIS_SENTINEL_MAX_RETRY_COUNT", "2")
|
|
try:
|
|
REDIS_SENTINEL_MAX_RETRY_COUNT = int(REDIS_SENTINEL_MAX_RETRY_COUNT)
|
|
if REDIS_SENTINEL_MAX_RETRY_COUNT < 1:
|
|
REDIS_SENTINEL_MAX_RETRY_COUNT = 2
|
|
except ValueError:
|
|
REDIS_SENTINEL_MAX_RETRY_COUNT = 2
|
|
|
|
####################################
|
|
# UVICORN WORKERS
|
|
####################################
|
|
|
|
# Number of uvicorn worker processes for handling requests
|
|
UVICORN_WORKERS = os.environ.get("UVICORN_WORKERS", "1")
|
|
try:
|
|
UVICORN_WORKERS = int(UVICORN_WORKERS)
|
|
if UVICORN_WORKERS < 1:
|
|
UVICORN_WORKERS = 1
|
|
except ValueError:
|
|
UVICORN_WORKERS = 1
|
|
log.info(f"Invalid UVICORN_WORKERS value, defaulting to {UVICORN_WORKERS}")
|
|
|
|
####################################
|
|
# WEBUI_AUTH (Required for security)
|
|
####################################
|
|
|
|
WEBUI_AUTH = os.environ.get("WEBUI_AUTH", "True").lower() == "true"
|
|
|
|
ENABLE_INITIAL_ADMIN_SIGNUP = (
|
|
os.environ.get("ENABLE_INITIAL_ADMIN_SIGNUP", "False").lower() == "true"
|
|
)
|
|
ENABLE_SIGNUP_PASSWORD_CONFIRMATION = (
|
|
os.environ.get("ENABLE_SIGNUP_PASSWORD_CONFIRMATION", "False").lower() == "true"
|
|
)
|
|
|
|
WEBUI_AUTH_TRUSTED_EMAIL_HEADER = os.environ.get(
|
|
"WEBUI_AUTH_TRUSTED_EMAIL_HEADER", None
|
|
)
|
|
WEBUI_AUTH_TRUSTED_NAME_HEADER = os.environ.get("WEBUI_AUTH_TRUSTED_NAME_HEADER", None)
|
|
WEBUI_AUTH_TRUSTED_GROUPS_HEADER = os.environ.get(
|
|
"WEBUI_AUTH_TRUSTED_GROUPS_HEADER", None
|
|
)
|
|
|
|
|
|
BYPASS_MODEL_ACCESS_CONTROL = (
|
|
os.environ.get("BYPASS_MODEL_ACCESS_CONTROL", "False").lower() == "true"
|
|
)
|
|
|
|
WEBUI_AUTH_SIGNOUT_REDIRECT_URL = os.environ.get(
|
|
"WEBUI_AUTH_SIGNOUT_REDIRECT_URL", None
|
|
)
|
|
|
|
####################################
|
|
# WEBUI_SECRET_KEY
|
|
####################################
|
|
|
|
WEBUI_SECRET_KEY = os.environ.get(
|
|
"WEBUI_SECRET_KEY",
|
|
os.environ.get(
|
|
"WEBUI_JWT_SECRET_KEY", "t0p-s3cr3t"
|
|
), # DEPRECATED: remove at next major version
|
|
)
|
|
|
|
WEBUI_SESSION_COOKIE_SAME_SITE = os.environ.get("WEBUI_SESSION_COOKIE_SAME_SITE", "lax")
|
|
|
|
WEBUI_SESSION_COOKIE_SECURE = (
|
|
os.environ.get("WEBUI_SESSION_COOKIE_SECURE", "false").lower() == "true"
|
|
)
|
|
|
|
WEBUI_AUTH_COOKIE_SAME_SITE = os.environ.get(
|
|
"WEBUI_AUTH_COOKIE_SAME_SITE", WEBUI_SESSION_COOKIE_SAME_SITE
|
|
)
|
|
|
|
WEBUI_AUTH_COOKIE_SECURE = (
|
|
os.environ.get(
|
|
"WEBUI_AUTH_COOKIE_SECURE",
|
|
os.environ.get("WEBUI_SESSION_COOKIE_SECURE", "false"),
|
|
).lower()
|
|
== "true"
|
|
)
|
|
|
|
if WEBUI_AUTH and WEBUI_SECRET_KEY == "":
|
|
raise ValueError(ERROR_MESSAGES.ENV_VAR_NOT_FOUND)
|
|
|
|
ENABLE_COMPRESSION_MIDDLEWARE = (
|
|
os.environ.get("ENABLE_COMPRESSION_MIDDLEWARE", "True").lower() == "true"
|
|
)
|
|
|
|
####################################
|
|
# OAUTH Configuration
|
|
####################################
|
|
ENABLE_OAUTH_EMAIL_FALLBACK = (
|
|
os.environ.get("ENABLE_OAUTH_EMAIL_FALLBACK", "False").lower() == "true"
|
|
)
|
|
|
|
ENABLE_OAUTH_ID_TOKEN_COOKIE = (
|
|
os.environ.get("ENABLE_OAUTH_ID_TOKEN_COOKIE", "True").lower() == "true"
|
|
)
|
|
|
|
OAUTH_CLIENT_INFO_ENCRYPTION_KEY = os.environ.get(
|
|
"OAUTH_CLIENT_INFO_ENCRYPTION_KEY", WEBUI_SECRET_KEY
|
|
)
|
|
|
|
OAUTH_SESSION_TOKEN_ENCRYPTION_KEY = os.environ.get(
|
|
"OAUTH_SESSION_TOKEN_ENCRYPTION_KEY", WEBUI_SECRET_KEY
|
|
)
|
|
|
|
####################################
|
|
# SCIM Configuration
|
|
####################################
|
|
|
|
SCIM_ENABLED = os.environ.get("SCIM_ENABLED", "False").lower() == "true"
|
|
SCIM_TOKEN = os.environ.get("SCIM_TOKEN", "")
|
|
|
|
####################################
|
|
# LICENSE_KEY
|
|
####################################
|
|
|
|
LICENSE_KEY = os.environ.get("LICENSE_KEY", "")
|
|
|
|
LICENSE_BLOB = None
|
|
LICENSE_BLOB_PATH = os.environ.get("LICENSE_BLOB_PATH", DATA_DIR / "l.data")
|
|
if LICENSE_BLOB_PATH and os.path.exists(LICENSE_BLOB_PATH):
|
|
with open(LICENSE_BLOB_PATH, "rb") as f:
|
|
LICENSE_BLOB = f.read()
|
|
|
|
LICENSE_PUBLIC_KEY = os.environ.get("LICENSE_PUBLIC_KEY", "")
|
|
|
|
pk = None
|
|
if LICENSE_PUBLIC_KEY:
|
|
pk = serialization.load_pem_public_key(
|
|
f"""
|
|
-----BEGIN PUBLIC KEY-----
|
|
{LICENSE_PUBLIC_KEY}
|
|
-----END PUBLIC KEY-----
|
|
""".encode(
|
|
"utf-8"
|
|
)
|
|
)
|
|
|
|
|
|
####################################
|
|
# MODELS
|
|
####################################
|
|
|
|
MODELS_CACHE_TTL = os.environ.get("MODELS_CACHE_TTL", "1")
|
|
if MODELS_CACHE_TTL == "":
|
|
MODELS_CACHE_TTL = None
|
|
else:
|
|
try:
|
|
MODELS_CACHE_TTL = int(MODELS_CACHE_TTL)
|
|
except Exception:
|
|
MODELS_CACHE_TTL = 1
|
|
|
|
|
|
####################################
|
|
# CHAT
|
|
####################################
|
|
|
|
CHAT_RESPONSE_STREAM_DELTA_CHUNK_SIZE = os.environ.get(
|
|
"CHAT_RESPONSE_STREAM_DELTA_CHUNK_SIZE", "1"
|
|
)
|
|
|
|
if CHAT_RESPONSE_STREAM_DELTA_CHUNK_SIZE == "":
|
|
CHAT_RESPONSE_STREAM_DELTA_CHUNK_SIZE = 1
|
|
else:
|
|
try:
|
|
CHAT_RESPONSE_STREAM_DELTA_CHUNK_SIZE = int(
|
|
CHAT_RESPONSE_STREAM_DELTA_CHUNK_SIZE
|
|
)
|
|
except Exception:
|
|
CHAT_RESPONSE_STREAM_DELTA_CHUNK_SIZE = 1
|
|
|
|
|
|
CHAT_RESPONSE_MAX_TOOL_CALL_RETRIES = os.environ.get(
|
|
"CHAT_RESPONSE_MAX_TOOL_CALL_RETRIES", "30"
|
|
)
|
|
|
|
if CHAT_RESPONSE_MAX_TOOL_CALL_RETRIES == "":
|
|
CHAT_RESPONSE_MAX_TOOL_CALL_RETRIES = 30
|
|
else:
|
|
try:
|
|
CHAT_RESPONSE_MAX_TOOL_CALL_RETRIES = int(CHAT_RESPONSE_MAX_TOOL_CALL_RETRIES)
|
|
except Exception:
|
|
CHAT_RESPONSE_MAX_TOOL_CALL_RETRIES = 30
|
|
|
|
|
|
####################################
|
|
# WEBSOCKET SUPPORT
|
|
####################################
|
|
|
|
ENABLE_WEBSOCKET_SUPPORT = (
|
|
os.environ.get("ENABLE_WEBSOCKET_SUPPORT", "True").lower() == "true"
|
|
)
|
|
|
|
|
|
WEBSOCKET_MANAGER = os.environ.get("WEBSOCKET_MANAGER", "")
|
|
|
|
WEBSOCKET_REDIS_URL = os.environ.get("WEBSOCKET_REDIS_URL", REDIS_URL)
|
|
WEBSOCKET_REDIS_CLUSTER = (
|
|
os.environ.get("WEBSOCKET_REDIS_CLUSTER", str(REDIS_CLUSTER)).lower() == "true"
|
|
)
|
|
|
|
websocket_redis_lock_timeout = os.environ.get("WEBSOCKET_REDIS_LOCK_TIMEOUT", "60")
|
|
|
|
try:
|
|
WEBSOCKET_REDIS_LOCK_TIMEOUT = int(websocket_redis_lock_timeout)
|
|
except ValueError:
|
|
WEBSOCKET_REDIS_LOCK_TIMEOUT = 60
|
|
|
|
WEBSOCKET_SENTINEL_HOSTS = os.environ.get("WEBSOCKET_SENTINEL_HOSTS", "")
|
|
WEBSOCKET_SENTINEL_PORT = os.environ.get("WEBSOCKET_SENTINEL_PORT", "26379")
|
|
|
|
|
|
AIOHTTP_CLIENT_TIMEOUT = os.environ.get("AIOHTTP_CLIENT_TIMEOUT", "")
|
|
|
|
if AIOHTTP_CLIENT_TIMEOUT == "":
|
|
AIOHTTP_CLIENT_TIMEOUT = None
|
|
else:
|
|
try:
|
|
AIOHTTP_CLIENT_TIMEOUT = int(AIOHTTP_CLIENT_TIMEOUT)
|
|
except Exception:
|
|
AIOHTTP_CLIENT_TIMEOUT = 300
|
|
|
|
|
|
AIOHTTP_CLIENT_SESSION_SSL = (
|
|
os.environ.get("AIOHTTP_CLIENT_SESSION_SSL", "True").lower() == "true"
|
|
)
|
|
|
|
AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = os.environ.get(
|
|
"AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST",
|
|
os.environ.get("AIOHTTP_CLIENT_TIMEOUT_OPENAI_MODEL_LIST", "10"),
|
|
)
|
|
|
|
if AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST == "":
|
|
AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = None
|
|
else:
|
|
try:
|
|
AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = int(AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST)
|
|
except Exception:
|
|
AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = 10
|
|
|
|
|
|
AIOHTTP_CLIENT_TIMEOUT_TOOL_SERVER_DATA = os.environ.get(
|
|
"AIOHTTP_CLIENT_TIMEOUT_TOOL_SERVER_DATA", "10"
|
|
)
|
|
|
|
if AIOHTTP_CLIENT_TIMEOUT_TOOL_SERVER_DATA == "":
|
|
AIOHTTP_CLIENT_TIMEOUT_TOOL_SERVER_DATA = None
|
|
else:
|
|
try:
|
|
AIOHTTP_CLIENT_TIMEOUT_TOOL_SERVER_DATA = int(
|
|
AIOHTTP_CLIENT_TIMEOUT_TOOL_SERVER_DATA
|
|
)
|
|
except Exception:
|
|
AIOHTTP_CLIENT_TIMEOUT_TOOL_SERVER_DATA = 10
|
|
|
|
|
|
AIOHTTP_CLIENT_SESSION_TOOL_SERVER_SSL = (
|
|
os.environ.get("AIOHTTP_CLIENT_SESSION_TOOL_SERVER_SSL", "True").lower() == "true"
|
|
)
|
|
|
|
|
|
####################################
# SENTENCE TRANSFORMERS
####################################

SENTENCE_TRANSFORMERS_BACKEND = os.environ.get("SENTENCE_TRANSFORMERS_BACKEND", "")
if SENTENCE_TRANSFORMERS_BACKEND == "":
    SENTENCE_TRANSFORMERS_BACKEND = "torch"

SENTENCE_TRANSFORMERS_MODEL_KWARGS = os.environ.get(
    "SENTENCE_TRANSFORMERS_MODEL_KWARGS", ""
)
if SENTENCE_TRANSFORMERS_MODEL_KWARGS == "":
    SENTENCE_TRANSFORMERS_MODEL_KWARGS = None
else:
    try:
        SENTENCE_TRANSFORMERS_MODEL_KWARGS = json.loads(
            SENTENCE_TRANSFORMERS_MODEL_KWARGS
        )
    except Exception:
        SENTENCE_TRANSFORMERS_MODEL_KWARGS = None

SENTENCE_TRANSFORMERS_CROSS_ENCODER_BACKEND = os.environ.get(
    "SENTENCE_TRANSFORMERS_CROSS_ENCODER_BACKEND", ""
)
if SENTENCE_TRANSFORMERS_CROSS_ENCODER_BACKEND == "":
    SENTENCE_TRANSFORMERS_CROSS_ENCODER_BACKEND = "torch"

SENTENCE_TRANSFORMERS_CROSS_ENCODER_MODEL_KWARGS = os.environ.get(
    "SENTENCE_TRANSFORMERS_CROSS_ENCODER_MODEL_KWARGS", ""
)
if SENTENCE_TRANSFORMERS_CROSS_ENCODER_MODEL_KWARGS == "":
    SENTENCE_TRANSFORMERS_CROSS_ENCODER_MODEL_KWARGS = None
else:
    try:
        SENTENCE_TRANSFORMERS_CROSS_ENCODER_MODEL_KWARGS = json.loads(
            SENTENCE_TRANSFORMERS_CROSS_ENCODER_MODEL_KWARGS
        )
    except Exception:
        SENTENCE_TRANSFORMERS_CROSS_ENCODER_MODEL_KWARGS = None
|
####################################
# OFFLINE_MODE
####################################

ENABLE_VERSION_UPDATE_CHECK = (
    os.environ.get("ENABLE_VERSION_UPDATE_CHECK", "true").lower() == "true"
)
OFFLINE_MODE = os.environ.get("OFFLINE_MODE", "false").lower() == "true"

if OFFLINE_MODE:
    os.environ["HF_HUB_OFFLINE"] = "1"
    ENABLE_VERSION_UPDATE_CHECK = False
|
####################################
# AUDIT LOGGING
####################################

# Where to store the audit log file
AUDIT_LOGS_FILE_PATH = f"{DATA_DIR}/audit.log"
# Maximum size a log file may reach before rotating into a new file
AUDIT_LOG_FILE_ROTATION_SIZE = os.getenv("AUDIT_LOG_FILE_ROTATION_SIZE", "10MB")

# Comma-separated list of logger names to use for audit logging.
# The default, "uvicorn.access", is Uvicorn's access log;
# add more logger names to the list to capture additional logs.
AUDIT_UVICORN_LOGGER_NAMES = os.getenv(
    "AUDIT_UVICORN_LOGGER_NAMES", "uvicorn.access"
).split(",")

# METADATA | REQUEST | REQUEST_RESPONSE
AUDIT_LOG_LEVEL = os.getenv("AUDIT_LOG_LEVEL", "NONE").upper()
try:
    MAX_BODY_LOG_SIZE = int(os.environ.get("MAX_BODY_LOG_SIZE") or 2048)
except ValueError:
    MAX_BODY_LOG_SIZE = 2048

# Comma-separated list of URL paths to exclude from auditing
AUDIT_EXCLUDED_PATHS = os.getenv("AUDIT_EXCLUDED_PATHS", "/chats,/chat,/folders").split(
    ","
)
AUDIT_EXCLUDED_PATHS = [path.strip() for path in AUDIT_EXCLUDED_PATHS]
AUDIT_EXCLUDED_PATHS = [path.lstrip("/") for path in AUDIT_EXCLUDED_PATHS]
|
####################################
# OPENTELEMETRY
####################################

ENABLE_OTEL = os.environ.get("ENABLE_OTEL", "False").lower() == "true"
ENABLE_OTEL_TRACES = os.environ.get("ENABLE_OTEL_TRACES", "False").lower() == "true"
ENABLE_OTEL_METRICS = os.environ.get("ENABLE_OTEL_METRICS", "False").lower() == "true"
ENABLE_OTEL_LOGS = os.environ.get("ENABLE_OTEL_LOGS", "False").lower() == "true"

OTEL_EXPORTER_OTLP_ENDPOINT = os.environ.get(
    "OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4317"
)
OTEL_METRICS_EXPORTER_OTLP_ENDPOINT = os.environ.get(
    "OTEL_METRICS_EXPORTER_OTLP_ENDPOINT", OTEL_EXPORTER_OTLP_ENDPOINT
)
OTEL_LOGS_EXPORTER_OTLP_ENDPOINT = os.environ.get(
    "OTEL_LOGS_EXPORTER_OTLP_ENDPOINT", OTEL_EXPORTER_OTLP_ENDPOINT
)
OTEL_EXPORTER_OTLP_INSECURE = (
    os.environ.get("OTEL_EXPORTER_OTLP_INSECURE", "False").lower() == "true"
)
OTEL_METRICS_EXPORTER_OTLP_INSECURE = (
    os.environ.get(
        "OTEL_METRICS_EXPORTER_OTLP_INSECURE", str(OTEL_EXPORTER_OTLP_INSECURE)
    ).lower()
    == "true"
)
OTEL_LOGS_EXPORTER_OTLP_INSECURE = (
    os.environ.get(
        "OTEL_LOGS_EXPORTER_OTLP_INSECURE", str(OTEL_EXPORTER_OTLP_INSECURE)
    ).lower()
    == "true"
)
OTEL_SERVICE_NAME = os.environ.get("OTEL_SERVICE_NAME", "open-webui")
OTEL_RESOURCE_ATTRIBUTES = os.environ.get(
    "OTEL_RESOURCE_ATTRIBUTES", ""
)  # e.g. key1=val1,key2=val2
OTEL_TRACES_SAMPLER = os.environ.get(
    "OTEL_TRACES_SAMPLER", "parentbased_always_on"
).lower()
OTEL_BASIC_AUTH_USERNAME = os.environ.get("OTEL_BASIC_AUTH_USERNAME", "")
OTEL_BASIC_AUTH_PASSWORD = os.environ.get("OTEL_BASIC_AUTH_PASSWORD", "")

OTEL_METRICS_BASIC_AUTH_USERNAME = os.environ.get(
    "OTEL_METRICS_BASIC_AUTH_USERNAME", OTEL_BASIC_AUTH_USERNAME
)
OTEL_METRICS_BASIC_AUTH_PASSWORD = os.environ.get(
    "OTEL_METRICS_BASIC_AUTH_PASSWORD", OTEL_BASIC_AUTH_PASSWORD
)
OTEL_LOGS_BASIC_AUTH_USERNAME = os.environ.get(
    "OTEL_LOGS_BASIC_AUTH_USERNAME", OTEL_BASIC_AUTH_USERNAME
)
OTEL_LOGS_BASIC_AUTH_PASSWORD = os.environ.get(
    "OTEL_LOGS_BASIC_AUTH_PASSWORD", OTEL_BASIC_AUTH_PASSWORD
)

OTEL_OTLP_SPAN_EXPORTER = os.environ.get(
    "OTEL_OTLP_SPAN_EXPORTER", "grpc"
).lower()  # grpc or http

OTEL_METRICS_OTLP_SPAN_EXPORTER = os.environ.get(
    "OTEL_METRICS_OTLP_SPAN_EXPORTER", OTEL_OTLP_SPAN_EXPORTER
).lower()  # grpc or http

OTEL_LOGS_OTLP_SPAN_EXPORTER = os.environ.get(
    "OTEL_LOGS_OTLP_SPAN_EXPORTER", OTEL_OTLP_SPAN_EXPORTER
).lower()  # grpc or http
|
####################################
# TOOLS/FUNCTIONS PIP OPTIONS
####################################

PIP_OPTIONS = os.getenv("PIP_OPTIONS", "").split()
PIP_PACKAGE_INDEX_OPTIONS = os.getenv("PIP_PACKAGE_INDEX_OPTIONS", "").split()
|
####################################
# PROGRESSIVE WEB APP OPTIONS
####################################

EXTERNAL_PWA_MANIFEST_URL = os.environ.get("EXTERNAL_PWA_MANIFEST_URL")