Compare commits


17 Commits

Author SHA1 Message Date
MickLesk
7cbde08586 fix(update-apps): dry-run uses check_for_gh_release args, not Source header
The # Source: header can point to a different repo than the one
check_for_gh_release actually queries (e.g. RustDesk uses the
lejianwen fork, not the official rustdesk repo).

Now parse both app name and source repo directly from the
check_for_gh_release call in the ct script:
  check_for_gh_release "appname" "owner/repo"

Also fix $HOME/.appname path expansion in pct exec context.
2026-05-05 21:46:18 +02:00
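The parsing described in this commit can be sketched roughly as follows; the sed pattern, sample line, and repo path are illustrative, not the script's actual implementation:

```shell
# Hypothetical sketch: pull the app name and source repo out of a ct
# script's check_for_gh_release call instead of the # Source: header.
line='if check_for_gh_release "rustdesk" "lejianwen/rustdesk-api-server"; then'
app=$(sed -n 's/.*check_for_gh_release "\([^"]*\)" "\([^"]*\)".*/\1/p' <<<"$line")
repo=$(sed -n 's/.*check_for_gh_release "\([^"]*\)" "\([^"]*\)".*/\2/p' <<<"$line")
echo "$app"   # rustdesk
echo "$repo"  # lejianwen/rustdesk-api-server
```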
MickLesk
4da0f5aaf1 feat(update-apps): add var_dry_run to check updates without applying
Adds dry-run mode (var_dry_run=yes) that reports available updates for
all selected containers without modifying anything:

- Extracts GitHub source repo from the ct script header (# Source:)
- Resolves the version file name from the check_for_gh_release app arg
- Reads the current installed version from ~/.appname inside the container
- Queries GitHub API /releases/latest for comparison
- Outputs color-coded status: up-to-date (green), update available (yellow),
  or unknown (blue/yellow with reason)

Non-GitHub sources (Codeberg, custom URLs) are skipped with a notice.
Resource scaling is suppressed entirely during dry-run.

Example usage:
  var_container=all_running var_skip_confirm=yes var_dry_run=yes \
    bash -c "$(curl -fsSL .../update-apps.sh)"
2026-05-05 21:44:23 +02:00
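The comparison step of the dry-run can be sketched in isolation; the variable names and sample values are illustrative (the real script reads ~/.appname inside the container and the tag_name from the GitHub /releases/latest response):

```shell
# Minimal sketch of the dry-run status classification.
current="1.2.3"   # installed version read from the container
latest="1.3.0"    # tag_name from the GitHub API
if [[ -z "$current" || -z "$latest" ]]; then
  status="unknown"
elif [[ "$current" == "$latest" ]]; then
  status="up-to-date"
else
  status="update available"
fi
echo "$status"   # update available
```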
MickLesk
b1148a486f feat(update-apps): add var_continue_on_error and TERM=dumb fix
- Add var_continue_on_error=yes to skip failed containers instead
  of aborting all remaining updates. Useful for cron/unattended runs
  where one disabled or broken script should not stop others.
  Containers that have a backup still attempt a restore on failure, regardless.

- Set TERM=dumb when running pct exec to prevent whiptail from
  hanging when no TTY is available (e.g. cron jobs redirecting
  stdout/stderr). This causes whiptail to fail fast instead of
  blocking indefinitely.

- Add var_continue_on_error to export_config_json, --help output,
  and usage examples (cron-style invocation example added).
2026-05-05 21:40:01 +02:00
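The TERM=dumb behaviour can be sketched as a small wrapper; the function name and container ID are illustrative, not the script's actual code:

```shell
# Illustrative wrapper: run a command inside a container with TERM=dumb
# so whiptail-based prompts fail fast instead of blocking without a TTY.
run_in_ct() {
  local ctid="$1"; shift
  pct exec "$ctid" -- env TERM=dumb bash -c "$*"
}
```

For example, `run_in_ct 101 'bash update.sh'` (hypothetical invocation) would run the update with a dumb terminal, which is what cron jobs effectively see.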
community-scripts-pr-app[bot]
a3e147cf20 Update CHANGELOG.md (#14270)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-05 19:06:04 +00:00
Donovan
4e9352572f Fix container count message in update-apps.sh (#14265)
Trivial update that corrects displayed container count by dividing by 3 (pct list displays 3 columns for each container)
2026-05-05 21:05:31 +02:00
community-scripts-pr-app[bot]
686657e8ec Update CHANGELOG.md (#14261)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-05 06:47:05 +00:00
push-app-to-main[bot]
9b8302cba0 LibreChat (#14247)
Co-authored-by: push-app-to-main[bot] <203845782+push-app-to-main[bot]@users.noreply.github.com>
2026-05-05 08:46:40 +02:00
community-scripts-pr-app[bot]
5a6392d95f Update CHANGELOG.md (#14260)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-05 06:34:49 +00:00
push-app-to-main[bot]
160c198731 Matomo (#14248)
* Add matomo (ct)

* Remove flatten_matomo_layout function

Removed the flatten_matomo_layout function and its calls.

---------

Co-authored-by: push-app-to-main[bot] <203845782+push-app-to-main[bot]@users.noreply.github.com>
Co-authored-by: CanbiZ (MickLesk) <47820557+MickLesk@users.noreply.github.com>
2026-05-05 08:34:24 +02:00
community-scripts-pr-app[bot]
b91ec6f7bc Update CHANGELOG.md (#14259)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-05 06:34:10 +00:00
push-app-to-main[bot]
a7ddc3502b Storyteller (#14122)
Co-authored-by: push-app-to-main[bot] <203845782+push-app-to-main[bot]@users.noreply.github.com>
Co-authored-by: CanbiZ (MickLesk) <47820557+MickLesk@users.noreply.github.com>
Co-authored-by: Slaviša Arežina <58952836+tremor021@users.noreply.github.com>
2026-05-05 08:33:46 +02:00
community-scripts-pr-app[bot]
9bf64f60b9 Update CHANGELOG.md (#14254)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-04 20:19:39 +00:00
Copilot
559cfff56a Databasus: move .env to filesystem root so service starts correctly (#14252)
* Initial plan

* Fix Databasus .env file location so service starts correctly

Agent-Logs-Url: https://github.com/community-scripts/ProxmoxVE/sessions/5b4ddcc8-18a3-49b4-9281-b14c712ebb7f

Co-authored-by: MickLesk <47820557+MickLesk@users.noreply.github.com>

* fix(databasus): update service EnvironmentFile path in update_script

Agent-Logs-Url: https://github.com/community-scripts/ProxmoxVE/sessions/b4dcde99-e021-40ce-bdbd-3afc224ab6d4

Co-authored-by: MickLesk <47820557+MickLesk@users.noreply.github.com>

* fix(databasus): only patch service EnvironmentFile if not already updated

Agent-Logs-Url: https://github.com/community-scripts/ProxmoxVE/sessions/47c48b0f-1527-4b9c-a6f5-74c789e79785

Co-authored-by: MickLesk <47820557+MickLesk@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: MickLesk <47820557+MickLesk@users.noreply.github.com>
2026-05-04 22:19:14 +02:00
community-scripts-pr-app[bot]
b353063720 Update CHANGELOG.md (#14253)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-04 20:14:29 +00:00
CanbiZ (MickLesk)
26b41d74ee tools.func get_latest_gh_tag - add pagination to find prefixed tags beyond first 50 (#14241)
Co-authored-by: MickLesk <mickey.leskowitz@levelbuild.com>
2026-05-04 22:13:49 +02:00
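The tag-selection step behind this fix can be shown standalone; the sample refs are illustrative (in the real helper they come from the GitHub git/matching-refs API):

```shell
# Given refs from /git/matching-refs/tags/<prefix>, strip the ref prefix
# and pick the highest version with sort -V (sample data is illustrative).
refs='refs/tags/100.9.5
refs/tags/100.10.0
refs/tags/100.16.1'
tag=$(printf '%s\n' "$refs" | sed 's|^refs/tags/||' | sort -V | tail -n1)
echo "$tag"   # 100.16.1
```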
community-scripts-pr-app[bot]
812f8ed1c7 Update CHANGELOG.md (#14250)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-04 18:32:54 +00:00
CanbiZ (MickLesk)
75c5aa3d5d tools.func: add GitLab release check/fetch/deploy helpers (#14242) 2026-05-04 20:32:15 +02:00
11 changed files with 1344 additions and 26 deletions


@@ -458,14 +458,34 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
</details>
## 2026-05-05
### 🆕 New Scripts
- LibreChat ([#14247](https://github.com/community-scripts/ProxmoxVE/pull/14247))
- Matomo ([#14248](https://github.com/community-scripts/ProxmoxVE/pull/14248))
- Storyteller ([#14122](https://github.com/community-scripts/ProxmoxVE/pull/14122))
### 🧰 Tools
- Fix container count message in update-apps.sh [@Quotacious](https://github.com/Quotacious) ([#14265](https://github.com/community-scripts/ProxmoxVE/pull/14265))
## 2026-05-04
### 🚀 Updated Scripts
- #### 🐞 Bug Fixes
- Databasus: move .env to filesystem root so service starts correctly [@Copilot](https://github.com/Copilot) ([#14252](https://github.com/community-scripts/ProxmoxVE/pull/14252))
- Databasus: update mongo-tools fallback to 100.16.1 and now use pnpm instead of npm ci [@MickLesk](https://github.com/MickLesk) ([#14240](https://github.com/community-scripts/ProxmoxVE/pull/14240))
### 💾 Core
- #### ✨ New Features
- tools.func get_latest_gh_tag - add pagination to find prefixed tags beyond first 50 [@MickLesk](https://github.com/MickLesk) ([#14241](https://github.com/community-scripts/ProxmoxVE/pull/14241))
- tools.func: add GitLab release check/fetch/deploy helpers [@MickLesk](https://github.com/MickLesk) ([#14242](https://github.com/community-scripts/ProxmoxVE/pull/14242))
## 2026-05-03
### 🚀 Updated Scripts


@@ -35,7 +35,8 @@ function update_script() {
msg_ok "Stopped Databasus"
msg_info "Backing up Configuration"
cp /opt/databasus/.env /opt/databasus.env.bak
cp /.env /opt/databasus.env.bak
chmod 600 /opt/databasus.env.bak
msg_ok "Backed up Configuration"
msg_info "Ensuring Database Clients"
@@ -84,11 +85,18 @@ function update_script() {
msg_ok "Updated Databasus"
msg_info "Restoring Configuration"
cp /opt/databasus.env.bak /opt/databasus/.env
cp /opt/databasus.env.bak /.env
rm -f /opt/databasus.env.bak
chown postgres:postgres /opt/databasus/.env
chmod 600 /.env
msg_ok "Restored Configuration"
if ! grep -q "EnvironmentFile=/.env" /etc/systemd/system/databasus.service; then
msg_info "Updating Service"
sed -i 's|EnvironmentFile=.*|EnvironmentFile=/.env|' /etc/systemd/system/databasus.service
$STD systemctl daemon-reload
msg_ok "Updated Service"
fi
msg_info "Starting Databasus"
$STD systemctl start databasus
msg_ok "Started Databasus"

ct/headers/librechat (new file, 6 lines)

@@ -0,0 +1,6 @@
__ _ __ ________ __
/ / (_) /_ ________ / ____/ /_ ____ _/ /_
/ / / / __ \/ ___/ _ \/ / / __ \/ __ `/ __/
/ /___/ / /_/ / / / __/ /___/ / / / /_/ / /_
/_____/_/_.___/_/ \___/\____/_/ /_/\__,_/\__/

ct/headers/storyteller (new file, 6 lines)

@@ -0,0 +1,6 @@
_____ __ __ ____
/ ___// /_____ _______ __/ /____ / / /__ _____
\__ \/ __/ __ \/ ___/ / / / __/ _ \/ / / _ \/ ___/
___/ / /_/ /_/ / / / /_/ / /_/ __/ / / __/ /
/____/\__/\____/_/ \__, /\__/\___/_/_/\___/_/
/____/

ct/librechat.sh (new file, 101 lines)

@@ -0,0 +1,101 @@
#!/usr/bin/env bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2026 community-scripts ORG
# Author: MickLesk (CanbiZ)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/danny-avila/LibreChat
APP="LibreChat"
var_tags="${var_tags:-ai;chat}"
var_cpu="${var_cpu:-4}"
var_ram="${var_ram:-6144}"
var_disk="${var_disk:-20}"
var_os="${var_os:-debian}"
var_version="${var_version:-13}"
var_unprivileged="${var_unprivileged:-1}"
header_info "$APP"
variables
color
catch_errors
function update_script() {
header_info
check_container_storage
check_container_resources
if [[ ! -d /opt/librechat ]]; then
msg_error "No ${APP} Installation Found!"
exit
fi
if check_for_gh_tag "librechat" "danny-avila/LibreChat" "v"; then
msg_info "Stopping Services"
systemctl stop librechat rag-api
msg_ok "Stopped Services"
msg_info "Backing up Configuration"
cp /opt/librechat/.env /opt/librechat.env.bak
msg_ok "Backed up Configuration"
CLEAN_INSTALL=1 fetch_and_deploy_gh_tag "librechat" "danny-avila/LibreChat"
msg_info "Installing Dependencies"
cd /opt/librechat
$STD npm ci
msg_ok "Installed Dependencies"
msg_info "Building Frontend"
$STD npm run frontend
$STD npm prune --production
$STD npm cache clean --force
msg_ok "Built Frontend"
msg_info "Restoring Configuration"
cp /opt/librechat.env.bak /opt/librechat/.env
rm -f /opt/librechat.env.bak
msg_ok "Restored Configuration"
msg_info "Starting Services"
systemctl start rag-api librechat
msg_ok "Started Services"
msg_ok "Updated LibreChat Successfully!"
fi
if check_for_gh_release "rag-api" "danny-avila/rag_api"; then
msg_info "Stopping RAG API"
systemctl stop rag-api
msg_ok "Stopped RAG API"
msg_info "Backing up RAG API Configuration"
cp /opt/rag-api/.env /opt/rag-api.env.bak
msg_ok "Backed up RAG API Configuration"
CLEAN_INSTALL=1 fetch_and_deploy_gh_release "rag-api" "danny-avila/rag_api" "tarball"
msg_info "Updating RAG API Dependencies"
cd /opt/rag-api
$STD .venv/bin/pip install -r requirements.lite.txt
msg_ok "Updated RAG API Dependencies"
msg_info "Restoring RAG API Configuration"
cp /opt/rag-api.env.bak /opt/rag-api/.env
rm -f /opt/rag-api.env.bak
msg_ok "Restored RAG API Configuration"
msg_info "Starting RAG API"
systemctl start rag-api
msg_ok "Started RAG API"
msg_ok "Updated RAG API Successfully!"
fi
exit
}
start
build_container
description
msg_ok "Completed Successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:3080${CL}"

ct/storyteller.sh (new file, 85 lines)

@@ -0,0 +1,85 @@
#!/usr/bin/env bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2026 community-scripts ORG
# Author: MickLesk (CanbiZ)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://gitlab.com/storyteller-platform/storyteller
APP="Storyteller"
var_tags="${var_tags:-media;ebook;audiobook}"
var_cpu="${var_cpu:-4}"
var_ram="${var_ram:-10240}"
var_disk="${var_disk:-20}"
var_os="${var_os:-debian}"
var_version="${var_version:-13}"
var_unprivileged="${var_unprivileged:-1}"
header_info "$APP"
variables
color
catch_errors
function update_script() {
header_info
check_container_storage
check_container_resources
if [[ ! -d /opt/storyteller ]]; then
msg_error "No ${APP} Installation Found!"
exit
fi
if check_for_gl_release "storyteller" "storyteller-platform/storyteller"; then
msg_info "Stopping Service"
systemctl stop storyteller
msg_ok "Stopped Service"
msg_info "Backing up Data"
cp /opt/storyteller/.env /opt/storyteller_env.bak
msg_ok "Backed up Data"
CLEAN_INSTALL=1 fetch_and_deploy_gl_release "storyteller" "storyteller-platform/storyteller" "tarball" "latest" "/opt/storyteller"
msg_info "Restoring Configuration"
mv /opt/storyteller_env.bak /opt/storyteller/.env
msg_ok "Restored Configuration"
msg_info "Rebuilding Storyteller"
cd /opt/storyteller
export NODE_OPTIONS="--max-old-space-size=4096"
$STD yarn install --network-timeout 600000
$STD gcc -g -fPIC -rdynamic -shared web/sqlite/uuid.c -o web/sqlite/uuid.c.so
export CI=1
export NODE_ENV=production
export NEXT_TELEMETRY_DISABLED=1
export SQLITE_NATIVE_BINDING=/opt/storyteller/node_modules/better-sqlite3/build/Release/better_sqlite3.node
$STD yarn workspaces foreach -Rpt --from @storyteller-platform/web --exclude @storyteller-platform/eslint run build
mkdir -p /opt/storyteller/web/.next/standalone/web/.next/static
cp -rT /opt/storyteller/web/.next/static /opt/storyteller/web/.next/standalone/web/.next/static
if [[ -d /opt/storyteller/web/public ]]; then
mkdir -p /opt/storyteller/web/.next/standalone/web/public
cp -rT /opt/storyteller/web/public /opt/storyteller/web/.next/standalone/web/public
fi
mkdir -p /opt/storyteller/web/.next/standalone/web/migrations
cp -rT /opt/storyteller/web/migrations /opt/storyteller/web/.next/standalone/web/migrations
mkdir -p /opt/storyteller/web/.next/standalone/web/sqlite
cp -rT /opt/storyteller/web/sqlite /opt/storyteller/web/.next/standalone/web/sqlite
ln -sf /opt/storyteller/.env /opt/storyteller/web/.next/standalone/web/.env
msg_ok "Rebuilt Storyteller"
msg_info "Starting Service"
systemctl start storyteller
msg_ok "Started Service"
msg_ok "Updated successfully!"
fi
exit
}
start
build_container
description
msg_ok "Completed Successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:8001${CL}"


@@ -79,7 +79,7 @@ ENCRYPTION_KEY=$(openssl rand -hex 32)
# Install goose for migrations
$STD go install github.com/pressly/goose/v3/cmd/goose@latest
ln -sf /root/go/bin/goose /usr/local/bin/goose
cat <<EOF >/opt/databasus/.env
cat <<EOF >/.env
# Environment
ENV_MODE=production
@@ -109,8 +109,7 @@ DATA_DIR=/databasus-data/data
BACKUP_DIR=/databasus-data/backups
LOG_DIR=/databasus-data/logs
EOF
chown postgres:postgres /opt/databasus/.env
chmod 600 /opt/databasus/.env
chmod 600 /.env
msg_ok "Configured Databasus"
msg_info "Configuring Valkey"
@@ -148,7 +147,7 @@ Requires=postgresql.service valkey.service
[Service]
Type=simple
WorkingDirectory=/opt/databasus
EnvironmentFile=/opt/databasus/.env
EnvironmentFile=/.env
ExecStart=/opt/databasus/databasus
Restart=always
RestartSec=5


@@ -0,0 +1,139 @@
#!/usr/bin/env bash
# Copyright (c) 2021-2026 community-scripts ORG
# Author: MickLesk (CanbiZ)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/danny-avila/LibreChat
source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
MONGO_VERSION="8.0" setup_mongodb
setup_meilisearch
PG_VERSION="17" PG_MODULES="pgvector" setup_postgresql
PG_DB_NAME="ragapi" PG_DB_USER="ragapi" PG_DB_EXTENSIONS="vector" setup_postgresql_db
NODE_VERSION="24" setup_nodejs
UV_PYTHON="3.12" setup_uv
fetch_and_deploy_gh_tag "librechat" "danny-avila/LibreChat"
fetch_and_deploy_gh_release "rag-api" "danny-avila/rag_api" "tarball"
msg_info "Installing LibreChat Dependencies"
cd /opt/librechat
$STD npm ci
msg_ok "Installed LibreChat Dependencies"
msg_info "Building Frontend"
$STD npm run frontend
$STD npm prune --production
$STD npm cache clean --force
msg_ok "Built Frontend"
msg_info "Installing RAG API Dependencies"
cd /opt/rag-api
$STD uv venv --python 3.12 --seed .venv
$STD .venv/bin/pip install -r requirements.lite.txt
mkdir -p /opt/rag-api/uploads
msg_ok "Installed RAG API Dependencies"
msg_info "Configuring LibreChat"
JWT_SECRET=$(openssl rand -hex 32)
JWT_REFRESH_SECRET=$(openssl rand -hex 32)
CREDS_KEY=$(openssl rand -hex 32)
CREDS_IV=$(openssl rand -hex 16)
cat <<EOF >/opt/librechat/.env
HOST=0.0.0.0
PORT=3080
MONGO_URI=mongodb://127.0.0.1:27017/LibreChat
DOMAIN_CLIENT=http://${LOCAL_IP}:3080
DOMAIN_SERVER=http://${LOCAL_IP}:3080
NO_INDEX=true
TRUST_PROXY=1
JWT_SECRET=${JWT_SECRET}
JWT_REFRESH_SECRET=${JWT_REFRESH_SECRET}
SESSION_EXPIRY=1000 * 60 * 15
REFRESH_TOKEN_EXPIRY=(1000 * 60 * 60 * 24) * 7
CREDS_KEY=${CREDS_KEY}
CREDS_IV=${CREDS_IV}
ALLOW_EMAIL_LOGIN=true
ALLOW_REGISTRATION=true
ALLOW_SOCIAL_LOGIN=false
ALLOW_SOCIAL_REGISTRATION=false
ALLOW_PASSWORD_RESET=false
ALLOW_UNVERIFIED_EMAIL_LOGIN=true
SEARCH=true
MEILI_NO_ANALYTICS=true
MEILI_HOST=http://127.0.0.1:7700
MEILI_MASTER_KEY=${MEILISEARCH_MASTER_KEY}
RAG_PORT=8000
RAG_API_URL=http://127.0.0.1:8000
APP_TITLE=LibreChat
ENDPOINTS=openAI,agents,assistants,anthropic,google
# OPENAI_API_KEY=your-key-here
# OPENAI_MODELS=
# ANTHROPIC_API_KEY=your-key-here
# GOOGLE_KEY=your-key-here
EOF
msg_ok "Configured LibreChat"
msg_info "Configuring RAG API"
cat <<EOF >/opt/rag-api/.env
VECTOR_DB_TYPE=pgvector
DB_HOST=127.0.0.1
DB_PORT=5432
POSTGRES_DB=${PG_DB_NAME}
POSTGRES_USER=${PG_DB_USER}
POSTGRES_PASSWORD=${PG_DB_PASS}
RAG_HOST=0.0.0.0
RAG_PORT=8000
JWT_SECRET=${JWT_SECRET}
RAG_UPLOAD_DIR=/opt/rag-api/uploads/
EOF
msg_ok "Configured RAG API"
msg_info "Creating Services"
cat <<EOF >/etc/systemd/system/librechat.service
[Unit]
Description=LibreChat
After=network.target mongod.service meilisearch.service rag-api.service
[Service]
Type=simple
User=root
WorkingDirectory=/opt/librechat
EnvironmentFile=/opt/librechat/.env
ExecStart=/usr/bin/npm run backend
Restart=on-failure
RestartSec=5
[Install]
WantedBy=multi-user.target
EOF
cat <<EOF >/etc/systemd/system/rag-api.service
[Unit]
Description=LibreChat RAG API
After=network.target postgresql.service
[Service]
Type=simple
User=root
WorkingDirectory=/opt/rag-api
EnvironmentFile=/opt/rag-api/.env
ExecStart=/opt/rag-api/.venv/bin/uvicorn main:app --host 0.0.0.0 --port 8000
Restart=on-failure
RestartSec=5
[Install]
WantedBy=multi-user.target
EOF
systemctl enable -q --now rag-api librechat
msg_ok "Created Services"
motd_ssh
customize
cleanup_lxc


@@ -0,0 +1,98 @@
#!/usr/bin/env bash
# Copyright (c) 2021-2026 community-scripts ORG
# Author: MickLesk (CanbiZ)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://gitlab.com/storyteller-platform/storyteller
source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
msg_info "Installing Dependencies"
$STD apt install -y \
build-essential \
git \
pkg-config \
libsqlite3-dev \
sqlite3 \
python3-setuptools \
ffmpeg
msg_ok "Installed Dependencies"
NODE_VERSION="22" NODE_MODULE="yarn" setup_nodejs
fetch_and_deploy_gh_release "readium" "readium/cli" "prebuild" "latest" "/opt/readium" "readium_linux_x86_64.tar.gz"
ln -sf /opt/readium/readium /usr/local/bin/readium
fetch_and_deploy_gl_release "storyteller" "storyteller-platform/storyteller" "tarball" "latest" "/opt/storyteller"
msg_info "Setting up Storyteller"
cd /opt/storyteller
$STD yarn install --network-timeout 600000
$STD gcc -g -fPIC -rdynamic -shared web/sqlite/uuid.c -o web/sqlite/uuid.c.so
STORYTELLER_SECRET_KEY=$(openssl rand -base64 32)
cat <<EOF >/opt/storyteller/.env
STORYTELLER_SECRET_KEY=${STORYTELLER_SECRET_KEY}
STORYTELLER_DATA_DIR=/opt/storyteller/data
PORT=8001
HOSTNAME=0.0.0.0
READIUM_PORT=9000
NODE_ENV=production
NEXT_TELEMETRY_DISABLED=1
EOF
mkdir -p /opt/storyteller/data
{
echo "Storyteller Credentials"
echo "======================="
echo "Secret Key: ${STORYTELLER_SECRET_KEY}"
} >~/storyteller.creds
msg_ok "Set up Storyteller"
msg_info "Building Storyteller"
cd /opt/storyteller
export CI=1
export NODE_ENV=production
export NEXT_TELEMETRY_DISABLED=1
export SQLITE_NATIVE_BINDING=/opt/storyteller/node_modules/better-sqlite3/build/Release/better_sqlite3.node
$STD yarn workspaces foreach -Rpt --from @storyteller-platform/web --exclude @storyteller-platform/eslint run build
mkdir -p /opt/storyteller/web/.next/standalone/web/.next/static
cp -rT /opt/storyteller/web/.next/static /opt/storyteller/web/.next/standalone/web/.next/static
if [[ -d /opt/storyteller/web/public ]]; then
mkdir -p /opt/storyteller/web/.next/standalone/web/public
cp -rT /opt/storyteller/web/public /opt/storyteller/web/.next/standalone/web/public
fi
mkdir -p /opt/storyteller/web/.next/standalone/web/migrations
cp -rT /opt/storyteller/web/migrations /opt/storyteller/web/.next/standalone/web/migrations
mkdir -p /opt/storyteller/web/.next/standalone/web/sqlite
cp -rT /opt/storyteller/web/sqlite /opt/storyteller/web/.next/standalone/web/sqlite
ln -sf /opt/storyteller/.env /opt/storyteller/web/.next/standalone/web/.env
msg_ok "Built Storyteller"
msg_info "Creating Service"
cat <<EOF >/etc/systemd/system/storyteller.service
[Unit]
Description=Storyteller
After=network.target
[Service]
Type=simple
User=root
WorkingDirectory=/opt/storyteller/web/.next/standalone/web
EnvironmentFile=/opt/storyteller/.env
ExecStart=/usr/bin/node --enable-source-maps server.js
Restart=on-failure
RestartSec=5
[Install]
WantedBy=multi-user.target
EOF
systemctl enable -q --now storyteller
msg_ok "Created Service"
motd_ssh
customize
cleanup_lxc


@@ -2079,15 +2079,33 @@ get_latest_gh_tag() {
local temp_file
temp_file=$(mktemp)
if ! github_api_call "https://api.github.com/repos/${repo}/tags?per_page=50" "$temp_file"; then
rm -f "$temp_file"
return 22
fi
local tag=""
if [[ -n "$prefix" ]]; then
tag=$(jq -r --arg p "$prefix" '[.[] | select(.name | startswith($p))][0].name // empty' "$temp_file")
# Use git/matching-refs API for server-side prefix filtering. This avoids
# paging through unrelated tags (e.g. mongodb/mongo-tools where 100.x tags
# only appear after page 4 of /tags). Returns ALL tags matching the prefix
# in a single call, sorted lexicographically ascending; we pick the
# highest version using `sort -V`.
if ! github_api_call "https://api.github.com/repos/${repo}/git/matching-refs/tags/${prefix}" "$temp_file"; then
rm -f "$temp_file"
return 22
fi
local count
count=$(jq 'length' "$temp_file" 2>/dev/null || echo 0)
if [[ "$count" -gt 0 ]]; then
tag=$(jq -r '.[].ref' "$temp_file" \
| sed 's|^refs/tags/||' \
| sort -V \
| tail -n1)
fi
else
# No prefix: just take the first (newest) tag from /tags
if ! github_api_call "https://api.github.com/repos/${repo}/tags?per_page=1" "$temp_file"; then
rm -f "$temp_file"
return 22
fi
tag=$(jq -r '.[0].name // empty' "$temp_file")
fi
@@ -8665,3 +8683,759 @@ EOF
$STD apt update
return 0
}
# ------------------------------------------------------------------------------
# Get latest GitLab release version.
# Usage: get_latest_gitlab_release "owner/repo" [strip_v]
# ------------------------------------------------------------------------------
get_latest_gitlab_release() {
local repo="$1"
local strip_v="${2:-true}"
local repo_encoded
repo_encoded=$(printf '%s' "$repo" | sed 's|/|%2F|g')
local header=()
[[ -n "${GITLAB_TOKEN:-}" ]] && header=(-H "PRIVATE-TOKEN: $GITLAB_TOKEN")
local temp_file
temp_file=$(mktemp)
local http_code
http_code=$(curl --connect-timeout 10 --max-time 30 -sSL \
-w "%{http_code}" -o "$temp_file" \
"${header[@]}" \
"https://gitlab.com/api/v4/projects/$repo_encoded/releases?per_page=1&order_by=released_at&sort=desc" 2>/dev/null) || true
if [[ "$http_code" != "200" ]]; then
rm -f "$temp_file"
msg_warn "GitLab API call failed for ${repo} (HTTP ${http_code})"
return 22
fi
local version
version=$(jq -r '.[0].tag_name // empty' "$temp_file")
rm -f "$temp_file"
if [[ -z "$version" ]]; then
msg_error "Could not determine latest version for ${repo}"
return 250
fi
if [[ "$strip_v" == "true" ]]; then
[[ "$version" =~ ^v[0-9] ]] && version="${version:1}"
fi
echo "$version"
}
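# The strip_v handling above only removes a leading "v" when a digit
# follows, which the following standalone sketch (names illustrative)
# demonstrates:
#
#   strip_v() {
#     local v="$1"
#     [[ "$v" =~ ^v[0-9] ]] && v="${v:1}"
#     echo "$v"
#   }
#   strip_v v1.2.3      # -> 1.2.3
#   strip_v version-2   # -> version-2 (unchanged)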
# ------------------------------------------------------------------------------
# Checks for new GitLab release (latest tag).
#
# Description:
# - Queries the GitLab API for the latest release tag
# - Compares it to a local cached version (~/.<app>)
# - If newer, sets global CHECK_UPDATE_RELEASE and returns 0
#
# Usage:
# if check_for_gl_release "myapp" "owner/repo" [optional pinned version, e.g. "v1.2.3"]; then
#   # trigger update...
# fi
# exit 0
# }  <- closing brace of the caller's update_script, not of this function
#
# Notes:
# - Requires `jq` (auto-installed if missing)
# - Supports GITLAB_TOKEN env var for private/rate-limited repos
# - Does not modify anything, only checks version state
# ------------------------------------------------------------------------------
check_for_gl_release() {
local app="$1"
local source="$2"
local pinned_version_in="${3:-}" # optional
local pin_reason="${4:-}" # optional reason shown to user
local app_lc="${app,,}"
local current_file="$HOME/.${app_lc}"
msg_info "Checking for update: ${app}"
# DNS check
if ! getent hosts gitlab.com >/dev/null 2>&1; then
msg_error "Network error: cannot resolve gitlab.com"
return 6
fi
ensure_dependencies jq
local repo_encoded
repo_encoded=$(printf '%s' "$source" | sed 's|/|%2F|g')
local header=()
[[ -n "${GITLAB_TOKEN:-}" ]] && header=(-H "PRIVATE-TOKEN: $GITLAB_TOKEN")
local releases_json="" http_code=""
# For pinned versions, try to fetch the specific release tag first
if [[ -n "$pinned_version_in" ]]; then
local pinned_encoded="${pinned_version_in//\//%2F}"
http_code=$(curl -sSL --max-time 20 -w "%{http_code}" -o /tmp/gl_check.json \
"${header[@]}" \
"https://gitlab.com/api/v4/projects/$repo_encoded/releases/$pinned_encoded" 2>/dev/null) || true
if [[ "$http_code" == "200" ]] && [[ -s /tmp/gl_check.json ]]; then
releases_json="[$(</tmp/gl_check.json)]"
fi
rm -f /tmp/gl_check.json
fi
# Fetch full releases list if needed
if [[ -z "$releases_json" ]]; then
http_code=$(curl -sSL --max-time 20 -w "%{http_code}" -o /tmp/gl_check.json \
"${header[@]}" \
"https://gitlab.com/api/v4/projects/$repo_encoded/releases?per_page=100&order_by=released_at&sort=desc" 2>/dev/null) || true
if [[ "$http_code" == "200" ]] && [[ -s /tmp/gl_check.json ]]; then
releases_json=$(</tmp/gl_check.json)
elif [[ "$http_code" == "401" ]]; then
msg_error "GitLab API authentication failed (HTTP 401)."
if [[ -n "${GITLAB_TOKEN:-}" ]]; then
msg_error "Your GITLAB_TOKEN appears to be invalid or expired."
else
msg_error "The repository may require authentication. Try: export GITLAB_TOKEN=\"glpat-your_token\""
fi
rm -f /tmp/gl_check.json
return 22
elif [[ "$http_code" == "404" ]]; then
msg_error "GitLab project not found (HTTP 404). Ensure '${source}' is correct and publicly accessible."
rm -f /tmp/gl_check.json
return 22
elif [[ "$http_code" == "429" ]]; then
msg_error "GitLab API rate limit exceeded (HTTP 429)."
msg_error "To increase the limit, export a GitLab token: export GITLAB_TOKEN=\"glpat-your_token_here\""
rm -f /tmp/gl_check.json
return 22
elif [[ "$http_code" == "000" || -z "$http_code" ]]; then
msg_error "GitLab API connection failed (no response)."
msg_error "Check your network/DNS: curl -sSL https://gitlab.com/api/v4/version"
rm -f /tmp/gl_check.json
return 7
else
msg_error "Unable to fetch releases for ${app} (HTTP ${http_code})"
rm -f /tmp/gl_check.json
return 22
fi
rm -f /tmp/gl_check.json
fi
mapfile -t raw_tags < <(jq -r '.[] | .tag_name' <<<"$releases_json")
if ((${#raw_tags[@]} == 0)); then
msg_error "No releases found for ${app} on GitLab"
return 250
fi
local clean_tags=()
for t in "${raw_tags[@]}"; do
# Only strip leading 'v' when followed by a digit (e.g. v1.2.3)
if [[ "$t" =~ ^v[0-9] ]]; then
clean_tags+=("${t:1}")
else
clean_tags+=("$t")
fi
done
local latest_raw="${raw_tags[0]}"
local latest_clean="${clean_tags[0]}"
# current installed (stored without v)
local current=""
if [[ -f "$current_file" ]]; then
current="$(<"$current_file")"
else
# Migration: search for any /opt/*_version.txt
local legacy_files
mapfile -t legacy_files < <(find /opt -maxdepth 1 -type f -name "*_version.txt" 2>/dev/null)
if ((${#legacy_files[@]} == 1)); then
current="$(<"${legacy_files[0]}")"
echo "${current#v}" >"$current_file"
rm -f "${legacy_files[0]}"
fi
fi
if [[ "$current" =~ ^v[0-9] ]]; then
current="${current:1}"
fi
# Pinned version handling
if [[ -n "$pinned_version_in" ]]; then
local pin_clean
if [[ "$pinned_version_in" =~ ^v[0-9] ]]; then
pin_clean="${pinned_version_in:1}"
else
pin_clean="$pinned_version_in"
fi
local match_raw=""
for i in "${!clean_tags[@]}"; do
if [[ "${clean_tags[$i]}" == "$pin_clean" ]]; then
match_raw="${raw_tags[$i]}"
break
fi
done
if [[ -z "$match_raw" ]]; then
msg_error "Pinned version ${pinned_version_in} not found upstream"
return 250
fi
if [[ "$current" != "$pin_clean" ]]; then
CHECK_UPDATE_RELEASE="$match_raw"
msg_ok "Update available: ${app} ${current:-not installed} → ${pin_clean}"
return 0
fi
if [[ -n "$pin_reason" ]]; then
msg_ok "No update available: ${app} (${current}) - update held back: ${pin_reason}"
else
msg_ok "No update available: ${app} (${current}) - update temporarily held back due to issues with newer releases"
fi
return 1
fi
# No pinning → use latest
if [[ -z "$current" || "$current" != "$latest_clean" ]]; then
CHECK_UPDATE_RELEASE="$latest_raw"
msg_ok "Update available: ${app} ${current:-not installed} → ${latest_clean}"
return 0
fi
msg_ok "No update available: ${app} (${latest_clean})"
return 1
}
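# The legacy version-file migration inside check_for_gl_release can be
# exercised in isolation; the temp directory and app name below are
# illustrative (the real code scans /opt and writes ~/.<app>):
#
#   tmp=$(mktemp -d)
#   echo "v2.5.0" >"$tmp/myapp_version.txt"
#   mapfile -t legacy < <(find "$tmp" -maxdepth 1 -type f -name "*_version.txt")
#   if ((${#legacy[@]} == 1)); then
#     current="$(<"${legacy[0]}")"
#     echo "${current#v}" >"$tmp/.myapp"
#     rm -f "${legacy[0]}"
#   fi
#   cat "$tmp/.myapp"   # 2.5.0, legacy file removed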
# ------------------------------------------------------------------------------
# Scan older GitLab releases for a matching asset (fallback helper).
#
# Description:
# When the latest release does not contain the expected asset
# (e.g. .deb for the current arch, or a custom pattern), walks back
# through up to 15 recent releases and returns the first release JSON
# that has a matching asset. Used internally by fetch_and_deploy_gl_release.
#
# Usage (internal):
# _gl_scan_older_releases "owner/repo" "owner%2Frepo" "https://gitlab.com" \
# "binary|prebuild|singlefile" "$asset_pattern" "$skip_tag"
#
# Returns:
# - stdout: JSON of the matching release (single object) on success
# - 0 on success, 22 on API error, 250 if no match found
# ------------------------------------------------------------------------------
_gl_scan_older_releases() {
local repo="$1"
local repo_encoded="$2"
local base_url="${3:-https://gitlab.com}"
local mode="$4"
local asset_pattern="$5"
local skip_tag="$6"
local header=()
[[ -n "${GITLAB_TOKEN:-}" ]] && header=(-H "PRIVATE-TOKEN: $GITLAB_TOKEN")
local releases_list
releases_list=$(curl --connect-timeout 10 --max-time 30 -fsSL \
"${header[@]}" \
"${base_url}/api/v4/projects/${repo_encoded}/releases?per_page=15&order_by=released_at&sort=desc" 2>/dev/null) || {
msg_warn "Failed to fetch older releases for ${repo}"
return 22
}
local count
count=$(echo "$releases_list" | jq 'length' 2>/dev/null || echo 0)
[[ "$count" -eq 0 ]] && return 250
for ((i = 0; i < count; i++)); do
local rel_tag
rel_tag=$(echo "$releases_list" | jq -r ".[$i].tag_name")
# Skip the tag we already checked
[[ "$rel_tag" == "$skip_tag" ]] && continue
# Asset URLs for this release (direct_asset_url preferred, fallback to url)
local asset_urls
asset_urls=$(echo "$releases_list" | jq -r ".[$i].assets.links // [] | .[] | .direct_asset_url // .url")
[[ -z "$asset_urls" ]] && continue
local has_match=false
if [[ "$mode" == "binary" ]]; then
local arch
arch=$(dpkg --print-architecture 2>/dev/null || uname -m)
[[ "$arch" == "x86_64" ]] && arch="amd64"
[[ "$arch" == "aarch64" ]] && arch="arm64"
# Check with explicit pattern first, then arch heuristic, then any .deb
if [[ -n "$asset_pattern" ]]; then
while read -r u; do
case "${u##*/}" in $asset_pattern)
has_match=true
break
;;
esac
done <<<"$asset_urls"
fi
if [[ "$has_match" != "true" ]]; then
echo "$asset_urls" | grep -qE "($arch|amd64|x86_64|aarch64|arm64).*\.deb$" && has_match=true
fi
if [[ "$has_match" != "true" ]]; then
echo "$asset_urls" | grep -qE '\.deb$' && has_match=true
fi
elif [[ "$mode" == "prebuild" || "$mode" == "singlefile" ]]; then
while read -r u; do
case "${u##*/}" in $asset_pattern)
has_match=true
break
;;
esac
done <<<"$asset_urls"
fi
if [[ "$has_match" == "true" ]]; then
local use_fallback="y"
if [[ -t 0 ]]; then
msg_warn "Release ${skip_tag} has no matching asset. Previous release ${rel_tag} has a compatible asset."
read -rp "Use version ${rel_tag} instead? [Y/n] (auto-yes in 60s): " -t 60 use_fallback || use_fallback="y"
use_fallback="${use_fallback:-y}"
fi
if [[ "${use_fallback,,}" == "y" || "${use_fallback,,}" == "yes" ]]; then
echo "$releases_list" | jq ".[$i]"
return 0
else
return 250
fi
fi
done
return 250
}
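The scanner above relies on unquoted `case` glob matching against each asset URL's basename. A minimal runnable sketch of that matching step (the function name and URLs are illustrative, not part of the library):

```shell
# Sketch of the asset-pattern matching used by _gl_scan_older_releases:
# test each URL's basename against a shell glob. The pattern must be
# expanded UNQUOTED inside `case` so it acts as a glob, not a literal.
match_asset() {
  local pattern="$1"
  shift
  local u
  for u in "$@"; do
    case "${u##*/}" in
      $pattern)
        printf '%s\n' "$u"
        return 0
        ;;
    esac
  done
  return 1
}

match_asset '*_amd64.deb' \
  "https://gitlab.com/x/-/releases/foo_1.0_arm64.deb" \
  "https://gitlab.com/x/-/releases/foo_1.0_amd64.deb"
```

Only the first matching URL is printed, mirroring the `break`-on-first-hit behaviour of the loops above.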
function fetch_and_deploy_gl_release() {
local app="$1"
local repo="$2"
local mode="${3:-tarball}"
local version="${var_appversion:-${4:-latest}}"
local target="${5:-/opt/$app}"
local asset_pattern="${6:-}"
if [[ -z "$app" ]]; then
app="${repo##*/}"
if [[ -z "$app" ]]; then
msg_error "fetch_and_deploy_gl_release requires app name or valid repo"
return 1
fi
fi
local app_lc
app_lc=$(echo "${app,,}" | tr -d ' ')
local version_file="$HOME/.${app_lc}"
local api_timeout="--connect-timeout 10 --max-time 60"
local download_timeout="--connect-timeout 15 --max-time 900"
local current_version=""
[[ -f "$version_file" ]] && current_version=$(<"$version_file")
ensure_dependencies jq
local repo_encoded
repo_encoded=$(printf '%s' "$repo" | sed 's|/|%2F|g')
local api_base="https://gitlab.com/api/v4/projects/$repo_encoded/releases"
local api_url
if [[ "$version" != "latest" ]]; then
api_url="$api_base/$version"
else
api_url="$api_base?per_page=1&order_by=released_at&sort=desc"
fi
local header=()
[[ -n "${GITLAB_TOKEN:-}" ]] && header=(-H "PRIVATE-TOKEN: $GITLAB_TOKEN")
local max_retries=3 retry_delay=2 attempt=1 success=false http_code
while ((attempt <= max_retries)); do
http_code=$(curl $api_timeout -sSL -w "%{http_code}" -o /tmp/gl_rel.json "${header[@]}" "$api_url" 2>/dev/null) || true
if [[ "$http_code" == "200" ]]; then
success=true
break
elif [[ "$http_code" == "429" ]]; then
if ((attempt < max_retries)); then
msg_warn "GitLab API rate limit hit, retrying in ${retry_delay}s... (attempt $attempt/$max_retries)"
sleep "$retry_delay"
retry_delay=$((retry_delay * 2))
fi
else
sleep "$retry_delay"
fi
((attempt++))
done
if ! $success; then
if [[ "$http_code" == "401" ]]; then
msg_error "GitLab API authentication failed (HTTP 401)."
if [[ -n "${GITLAB_TOKEN:-}" ]]; then
msg_error "Your GITLAB_TOKEN appears to be invalid or expired."
else
msg_error "The repository may require authentication. Try: export GITLAB_TOKEN=\"glpat-your_token\""
fi
elif [[ "$http_code" == "404" ]]; then
msg_error "GitLab project or release not found (HTTP 404)."
msg_error "Ensure '$repo' is correct and the project is accessible."
elif [[ "$http_code" == "429" ]]; then
msg_error "GitLab API rate limit exceeded (HTTP 429)."
msg_error "To increase the limit, export a GitLab token before running the script:"
msg_error " export GITLAB_TOKEN=\"glpat-your_token_here\""
elif [[ "$http_code" == "000" || -z "$http_code" ]]; then
msg_error "GitLab API connection failed (no response)."
msg_error "Check your network/DNS: curl -sSL https://gitlab.com/api/v4/version"
else
msg_error "Failed to fetch release metadata (HTTP $http_code)"
fi
return 1
fi
local json tag_name
json=$(</tmp/gl_rel.json)
if [[ "$version" == "latest" ]]; then
json=$(echo "$json" | jq '.[0] // empty')
if [[ -z "$json" || "$json" == "null" ]]; then
msg_error "No releases found for $repo on GitLab"
return 1
fi
fi
tag_name=$(echo "$json" | jq -r '.tag_name // empty')
if [[ -z "$tag_name" ]]; then
msg_error "Could not determine tag name from release metadata"
return 1
fi
[[ "$tag_name" =~ ^v[0-9] ]] && version="${tag_name:1}" || version="$tag_name"
local version_safe="${version//\//-}"
if [[ "$current_version" == "$version" ]]; then
$STD msg_ok "$app is already up-to-date (v$version)"
return 0
fi
local tmpdir
tmpdir=$(mktemp -d) || return 1
local filename=""
msg_info "Fetching GitLab release: $app ($version)"
_gl_asset_urls() {
local release_json="$1"
echo "$release_json" | jq -r '
(.assets.links // [])[] | .direct_asset_url // .url
'
}
### Tarball Mode ###
if [[ "$mode" == "tarball" || "$mode" == "source" ]]; then
local direct_tarball_url="https://gitlab.com/$repo/-/archive/$tag_name/${app_lc}-${version_safe}.tar.gz"
filename="${app_lc}-${version_safe}.tar.gz"
curl $download_timeout -fsSL "${header[@]}" -o "$tmpdir/$filename" "$direct_tarball_url" || {
msg_error "Download failed: $direct_tarball_url"
rm -rf "$tmpdir"
return 1
}
mkdir -p "$target"
if [[ "${CLEAN_INSTALL:-0}" == "1" ]]; then
rm -rf "${target:?}/"*
fi
tar --no-same-owner -xzf "$tmpdir/$filename" -C "$tmpdir" || {
msg_error "Failed to extract tarball"
rm -rf "$tmpdir"
return 1
}
local unpack_dir
unpack_dir=$(find "$tmpdir" -mindepth 1 -maxdepth 1 -type d | head -n1)
shopt -s dotglob nullglob
cp -r "$unpack_dir"/* "$target/"
shopt -u dotglob nullglob
### Binary Mode ###
elif [[ "$mode" == "binary" ]]; then
local arch
arch=$(dpkg --print-architecture 2>/dev/null || uname -m)
[[ "$arch" == "x86_64" ]] && arch="amd64"
[[ "$arch" == "aarch64" ]] && arch="arm64"
local assets url_match=""
assets=$(_gl_asset_urls "$json")
if [[ -n "$asset_pattern" ]]; then
for u in $assets; do
case "${u##*/}" in
$asset_pattern)
url_match="$u"
break
;;
esac
done
fi
if [[ -z "$url_match" ]]; then
for u in $assets; do
if [[ "$u" =~ ($arch|amd64|x86_64|aarch64|arm64).*\.deb$ ]]; then
url_match="$u"
break
fi
done
fi
if [[ -z "$url_match" ]]; then
for u in $assets; do
[[ "$u" =~ \.deb$ ]] && url_match="$u" && break
done
fi
if [[ -z "$url_match" ]]; then
local fallback_json
if fallback_json=$(_gl_scan_older_releases "$repo" "$repo_encoded" "https://gitlab.com" "binary" "$asset_pattern" "$tag_name"); then
json="$fallback_json"
tag_name=$(echo "$json" | jq -r '.tag_name // empty')
[[ "$tag_name" =~ ^v[0-9] ]] && version="${tag_name:1}" || version="$tag_name"
msg_info "Fetching GitLab release: $app ($version)"
assets=$(_gl_asset_urls "$json")
if [[ -n "$asset_pattern" ]]; then
for u in $assets; do
case "${u##*/}" in $asset_pattern)
url_match="$u"
break
;;
esac
done
fi
if [[ -z "$url_match" ]]; then
for u in $assets; do
[[ "$u" =~ ($arch|amd64|x86_64|aarch64|arm64).*\.deb$ ]] && url_match="$u" && break
done
fi
if [[ -z "$url_match" ]]; then
for u in $assets; do
[[ "$u" =~ \.deb$ ]] && url_match="$u" && break
done
fi
fi
fi
if [[ -z "$url_match" ]]; then
msg_error "No suitable .deb asset found for $app"
rm -rf "$tmpdir"
return 1
fi
filename="${url_match##*/}"
curl $download_timeout -fsSL "${header[@]}" -o "$tmpdir/$filename" "$url_match" || {
msg_error "Download failed: $url_match"
rm -rf "$tmpdir"
return 1
}
chmod 644 "$tmpdir/$filename"
local dpkg_opts=""
[[ "${DPKG_FORCE_CONFOLD:-}" == "1" ]] && dpkg_opts="-o Dpkg::Options::=--force-confold"
[[ "${DPKG_FORCE_CONFNEW:-}" == "1" ]] && dpkg_opts="-o Dpkg::Options::=--force-confnew"
DEBIAN_FRONTEND=noninteractive SYSTEMD_OFFLINE=1 $STD apt install -y $dpkg_opts "$tmpdir/$filename" || {
SYSTEMD_OFFLINE=1 $STD dpkg -i "$tmpdir/$filename" || {
msg_error "Both apt and dpkg installation failed"
rm -rf "$tmpdir"
return 1
}
}
### Prebuild Mode ###
elif [[ "$mode" == "prebuild" ]]; then
local pattern="${6%\"}"
pattern="${pattern#\"}"
[[ -z "$pattern" ]] && {
msg_error "Mode 'prebuild' requires 6th parameter (asset filename pattern)"
rm -rf "$tmpdir"
return 1
}
local asset_url=""
for u in $(_gl_asset_urls "$json"); do
filename_candidate="${u##*/}"
case "$filename_candidate" in
$pattern)
asset_url="$u"
break
;;
esac
done
if [[ -z "$asset_url" ]]; then
local fallback_json
if fallback_json=$(_gl_scan_older_releases "$repo" "$repo_encoded" "https://gitlab.com" "prebuild" "$pattern" "$tag_name"); then
json="$fallback_json"
tag_name=$(echo "$json" | jq -r '.tag_name // empty')
[[ "$tag_name" =~ ^v[0-9] ]] && version="${tag_name:1}" || version="$tag_name"
msg_info "Fetching GitLab release: $app ($version)"
for u in $(_gl_asset_urls "$json"); do
filename_candidate="${u##*/}"
case "$filename_candidate" in $pattern)
asset_url="$u"
break
;;
esac
done
fi
fi
[[ -z "$asset_url" ]] && {
msg_error "No asset matching '$pattern' found"
rm -rf "$tmpdir"
return 1
}
filename="${asset_url##*/}"
curl $download_timeout -fsSL "${header[@]}" -o "$tmpdir/$filename" "$asset_url" || {
msg_error "Download failed: $asset_url"
rm -rf "$tmpdir"
return 1
}
local unpack_tmp
unpack_tmp=$(mktemp -d)
mkdir -p "$target"
if [[ "${CLEAN_INSTALL:-0}" == "1" ]]; then
rm -rf "${target:?}/"*
fi
if [[ "$filename" == *.zip ]]; then
ensure_dependencies unzip
unzip -q "$tmpdir/$filename" -d "$unpack_tmp" || {
msg_error "Failed to extract ZIP archive"
rm -rf "$tmpdir" "$unpack_tmp"
return 1
}
elif [[ "$filename" == *.tar.* || "$filename" == *.tgz || "$filename" == *.txz ]]; then
tar --no-same-owner -xf "$tmpdir/$filename" -C "$unpack_tmp" || {
msg_error "Failed to extract TAR archive"
rm -rf "$tmpdir" "$unpack_tmp"
return 1
}
else
msg_error "Unsupported archive format: $filename"
rm -rf "$tmpdir" "$unpack_tmp"
return 1
fi
local top_entries inner_dir
top_entries=$(find "$unpack_tmp" -mindepth 1 -maxdepth 1)
if [[ "$(echo "$top_entries" | wc -l)" -eq 1 && -d "$top_entries" ]]; then
inner_dir="$top_entries"
shopt -s dotglob nullglob
if compgen -G "$inner_dir/*" >/dev/null; then
cp -r "$inner_dir"/* "$target/" || {
msg_error "Failed to copy contents from $inner_dir to $target"
rm -rf "$tmpdir" "$unpack_tmp"
return 1
}
else
msg_error "Inner directory is empty: $inner_dir"
rm -rf "$tmpdir" "$unpack_tmp"
return 1
fi
shopt -u dotglob nullglob
else
shopt -s dotglob nullglob
if compgen -G "$unpack_tmp/*" >/dev/null; then
cp -r "$unpack_tmp"/* "$target/" || {
msg_error "Failed to copy contents to $target"
rm -rf "$tmpdir" "$unpack_tmp"
return 1
}
else
msg_error "Unpacked archive is empty"
rm -rf "$tmpdir" "$unpack_tmp"
return 1
fi
shopt -u dotglob nullglob
fi
### Singlefile Mode ###
elif [[ "$mode" == "singlefile" ]]; then
local pattern="${6%\"}"
pattern="${pattern#\"}"
[[ -z "$pattern" ]] && {
msg_error "Mode 'singlefile' requires 6th parameter (asset filename pattern)"
rm -rf "$tmpdir"
return 1
}
local asset_url=""
for u in $(_gl_asset_urls "$json"); do
filename_candidate="${u##*/}"
case "$filename_candidate" in
$pattern)
asset_url="$u"
break
;;
esac
done
if [[ -z "$asset_url" ]]; then
local fallback_json
if fallback_json=$(_gl_scan_older_releases "$repo" "$repo_encoded" "https://gitlab.com" "singlefile" "$pattern" "$tag_name"); then
json="$fallback_json"
tag_name=$(echo "$json" | jq -r '.tag_name // empty')
[[ "$tag_name" =~ ^v[0-9] ]] && version="${tag_name:1}" || version="$tag_name"
msg_info "Fetching GitLab release: $app ($version)"
for u in $(_gl_asset_urls "$json"); do
filename_candidate="${u##*/}"
case "$filename_candidate" in $pattern)
asset_url="$u"
break
;;
esac
done
fi
fi
[[ -z "$asset_url" ]] && {
msg_error "No asset matching '$pattern' found"
rm -rf "$tmpdir"
return 1
}
filename="${asset_url##*/}"
mkdir -p "$target"
local use_filename="${USE_ORIGINAL_FILENAME:-false}"
local target_file="$app"
[[ "$use_filename" == "true" ]] && target_file="$filename"
curl $download_timeout -fsSL "${header[@]}" -o "$target/$target_file" "$asset_url" || {
msg_error "Download failed: $asset_url"
rm -rf "$tmpdir"
return 1
}
if [[ "$target_file" != *.jar && -f "$target/$target_file" ]]; then
chmod +x "$target/$target_file"
fi
else
msg_error "Unknown mode: $mode"
rm -rf "$tmpdir"
return 1
fi
echo "$version" >"$version_file"
msg_ok "Deployed: $app ($version)"
rm -rf "$tmpdir"
}
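The version handling above strips a leading `v` only when a digit follows, then makes the tag filesystem-safe for the archive name. A small standalone sketch of that normalization (the helper name is illustrative):

```shell
# Sketch of the tag -> version normalization used by
# fetch_and_deploy_gl_release: drop a leading "v" only when it precedes
# a digit (so "vanilla" stays intact), then replace "/" with "-" so the
# result is safe to use in filenames.
normalize_tag() {
  local tag="$1" version
  [[ "$tag" =~ ^v[0-9] ]] && version="${tag:1}" || version="$tag"
  printf '%s\n' "${version//\//-}"
}

normalize_tag "v1.2.3"         # -> 1.2.3
normalize_tag "release/2024.1" # -> release-2024.1
normalize_tag "vanilla"        # -> vanilla
```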

@@ -42,6 +42,17 @@ var_skip_confirm="${var_skip_confirm:-no}"
# Options: "yes" | "no" | "" (empty = interactive prompt)
var_auto_reboot="${var_auto_reboot:-}"
# var_continue_on_error: Continue updating remaining containers if one update fails
# Options: "yes" | "no" (default: no = stop on first error)
# Note: containers with backups always attempt restore on failure regardless of this setting
var_continue_on_error="${var_continue_on_error:-no}"
# var_dry_run: Check for available updates without applying them
# Options: "yes" | "no" (default: no)
# Output: lists each container with current vs. latest version
# Note: requires the container to be running; does not modify any container
var_dry_run="${var_dry_run:-no}"
# var_tags: Optionally override the tags used for auto-detection
# Options: "community-script|proxmox-helper-scripts" (default)
var_tags="${var_tags:-community-script|proxmox-helper-scripts}"
@@ -59,6 +70,8 @@ function export_config_json() {
"var_unattended": "${var_unattended}",
"var_skip_confirm": "${var_skip_confirm}",
"var_auto_reboot": "${var_auto_reboot}",
"var_continue_on_error": "${var_continue_on_error}",
"var_dry_run": "${var_dry_run}",
"var_tags": "${var_tags}"
}
EOF
@@ -78,10 +91,12 @@ Environment Variables:
var_backup Enable backup before update (yes/no)
var_backup_storage Storage location for backups
var_container Container selection (all/all_running/all_stopped/101,102,...)
var_unattended Run updates unattended (yes/no)
var_skip_confirm Skip initial confirmation (yes/no)
var_auto_reboot Auto-reboot containers if required (yes/no)
var_continue_on_error Continue to next container on update failure (yes/no)
var_dry_run Check for updates without applying them (yes/no)
var_tags Optionally override auto-detection tags ("prod|smb|community-script")
Examples:
# Run interactively
@@ -93,6 +108,12 @@ Examples:
# Update specific containers without backup
var_backup=no var_container=101,102,105 var_unattended=yes var_skip_confirm=yes $(basename "$0")
# Unattended cron-style: skip confirm, continue on error, no backup
var_backup=no var_container=all_running var_unattended=yes var_skip_confirm=yes var_continue_on_error=yes $(basename "$0")
# Dry-run: show available updates for all running containers without applying
var_container=all_running var_skip_confirm=yes var_dry_run=yes $(basename "$0")
# Export current configuration
$(basename "$0") --export-config
EOF
@@ -131,6 +152,56 @@ function detect_service() {
popd >/dev/null
}
function dry_run_container() {
local container="$1"
local service="$2"
# Extract app name and source repo directly from check_for_gh_release call in the ct script
# Pattern: check_for_gh_release "appname" "owner/repo"
local check_line app_name app_lc source_repo
check_line=$(echo "$script" | grep -m1 'check_for_gh_release')
if [[ -z "$check_line" ]]; then
echo -e "${YW}[DRY-RUN]${CL} Container $container ($service): no check_for_gh_release found — skipping"
return
fi
app_name=$(echo "$check_line" | cut -d'"' -f2)
source_repo=$(echo "$check_line" | cut -d'"' -f4)
app_lc=$(echo "${app_name,,}" | tr -d ' ')
if [[ -z "$source_repo" || "$source_repo" != *"/"* ]]; then
echo -e "${YW}[DRY-RUN]${CL} Container $container ($service): cannot parse source repo — skipping"
return
fi
# Read installed version from container (stored by check_for_gh_release as ~/.<appname>)
local current_version
current_version=$(pct exec "$container" -- bash -c "cat \$HOME/.${app_lc} 2>/dev/null" 2>/dev/null || true)
current_version="${current_version#v}"
# Query latest release from GitHub API
local latest_version
latest_version=$(curl -sSL --max-time 10 \
-H 'Accept: application/vnd.github+json' \
-H 'X-GitHub-Api-Version: 2022-11-28' \
"https://api.github.com/repos/${source_repo}/releases/latest" 2>/dev/null \
| grep '"tag_name"' | head -1 | cut -d'"' -f4 | sed 's/^v//')
if [[ -z "$latest_version" ]]; then
echo -e "${YW}[DRY-RUN]${CL} Container $container ($service): cannot fetch latest version from $source_repo"
return
fi
if [[ -z "$current_version" ]]; then
echo -e "${BL}[DRY-RUN]${CL} Container $container ($service): installed version unknown, latest: ${latest_version} (${source_repo})"
elif [[ "$current_version" == "$latest_version" ]]; then
echo -e "${GN}[DRY-RUN]${CL} Container $container ($service): up to date (${current_version})"
else
echo -e "${YW}[DRY-RUN]${CL} Container $container ($service): update available ${current_version} → ${latest_version}"
fi
}
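The field extraction above assumes the fixed layout `check_for_gh_release "appname" "owner/repo"`, so the 2nd and 4th double-quote-delimited fields are the app name and source repo. A tiny sketch of that parsing (the RustDesk values are illustrative, taken from the commit message):

```shell
# Sketch of the cut-based parsing in dry_run_container: split the line
# on double quotes; field 2 is the app name, field 4 the source repo.
check_line='check_for_gh_release "rustdesk" "lejianwen/rustdesk-server"'
app_name=$(echo "$check_line" | cut -d'"' -f2)
source_repo=$(echo "$check_line" | cut -d'"' -f4)
echo "$app_name $source_repo"
```

This is why the dry run queries the repo actually passed to `check_for_gh_release` rather than the `# Source:` header, which may point at a different upstream.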
function backup_container() {
msg_info "Creating backup for container $1"
vzdump "$1" --compress zstd --storage "$STORAGE_CHOICE" --notes-template "{{guestname}} - community-scripts backup updater" >/dev/null 2>&1
@@ -199,7 +270,7 @@ while read -r container; do
menu_items+=("$container_id" "$formatted_line" "OFF")
fi
done <<<"$containers"
msg_ok "Loaded $((${#menu_items[@]} / 3)) containers"
# Determine container selection based on var_container
if [[ -n "$var_container" ]]; then
@@ -391,17 +462,23 @@ for container in $CHOICE; do
fi
#3) if build resources are different than run resources, then:
if [ "$UPDATE_BUILD_RESOURCES" -eq "1" ] && [[ "$var_dry_run" != "yes" ]]; then
pct set "$container" --cores "$build_cpu" --memory "$build_ram"
fi
#3.5) Dry-run: report update availability without applying
if [[ "$var_dry_run" == "yes" ]]; then
dry_run_container "$container" "$service"
continue
fi
#4) Update service, using the update command
case "$os" in
alpine) pct exec "$container" -- ash -c "export TERM=dumb;$UPDATE_CMD" ;;
archlinux) pct exec "$container" -- bash -c "export TERM=dumb;$UPDATE_CMD" ;;
fedora | rocky | centos | alma) pct exec "$container" -- bash -c "export TERM=dumb;$UPDATE_CMD" ;;
ubuntu | debian | devuan) pct exec "$container" -- bash -c "export TERM=dumb;$UPDATE_CMD" ;;
opensuse) pct exec "$container" -- bash -c "export TERM=dumb;$UPDATE_CMD" ;;
esac
exit_code=$?
@@ -446,8 +523,13 @@ for container in $CHOICE; do
exit 235
fi
else
msg_error "Update failed for container $container (exit code: $exit_code)"
if [[ "$var_continue_on_error" == "yes" ]]; then
echo -e "${YW}[WARN]${CL} Continuing to next container (var_continue_on_error=yes)"
continue
else
exit "$exit_code"
fi
fi
done