Compare commits

..

8 Commits

Author SHA1 Message Date
CanbiZ (MickLesk)
cfc1fc3c6b fix(build): show telemetry status only in verbose mode
Telemetry reporting is an implementation detail that doesn't help
the user during failure recovery. Wrap echo statements with
VERBOSE check so they only appear when verbose mode is enabled.
2026-02-25 14:02:21 +01:00
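The gating this commit describes can be sketched as follows (function and variable names here are illustrative, not the real build.func identifiers):

```shell
# Only surface telemetry chatter when the user asked for verbose output;
# "${VERBOSE:-no}" defaults to quiet when the variable is unset.
report_telemetry() {
  if [[ "${VERBOSE:-no}" == "yes" ]]; then
    echo "Telemetry: $1"
  fi
}

VERBOSE=no report_telemetry "sent"    # prints nothing
VERBOSE=yes report_telemetry "sent"   # prints: Telemetry: sent
```

The temporary `VERBOSE=... report_telemetry` environment prefix scopes the setting to a single call, which keeps the default path quiet during failure recovery.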
CanbiZ (MickLesk)
6731001360 Revert "fix(zammad): configure Elasticsearch for LXC container startup"
This reverts commit 10e450b72f.
2026-02-25 14:00:21 +01:00
CanbiZ (MickLesk)
603cba8683 chore: remove test-recovery-dialog.sh from branch
2026-02-25 13:58:40 +01:00
CanbiZ (MickLesk)
7db8ddda52 fix(test): initialize colors and remove illegal local in test harness
- Call load_functions() after sourcing core.func to initialize
  color/formatting/icon variables (RD, GN, YW, CL, TAB, etc.)
- Remove 'local' keyword from top-level scope (not inside function)
- Default REPO_SOURCE to ref_api instead of main
2026-02-25 13:56:55 +01:00
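The `local` fix rests on a bash scoping rule worth demonstrating: `local` is only legal inside a function body, and at the top level of a script bash rejects it with a non-zero status (which can abort a harness running under `set -e`). A minimal illustration:

```shell
# 'local' inside a function: fine, the variable is scoped to the call.
inside() {
  local x="scoped"
  echo "$x"
}
inside   # prints: scoped

# 'local' at top level: bash errors out ("can only be used in a function"),
# so the fallback branch runs.
local y="oops" 2>/dev/null || echo "local: rejected at top level"
```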
CanbiZ (MickLesk)
6b107fc4d3 fix(build): prevent SIGTSTP from killing recovery dialog
- Replace msg_info/stop_spinner with plain echo for telemetry reporting
  The background spinner process in non-interactive shells (bash -c)
  can trigger SIGTSTP, stopping the entire process group before the
  recovery dialog appears. Plain echo avoids this.

- Add trap '' TSTP at failure path entry to ignore suspension signals
  Prevents Ctrl+Z or terminal-related SIGTSTP from interrupting the
  recovery menu. Restored with trap - TSTP before exit.

- Root cause: msg_info starts a background process (spinner &) that
  is not properly detached in non-interactive shells where job control
  (set -m) is OFF. The disown builtin has no effect without job
  control, leaving the spinner in the same process group. This can
  cause terminal I/O conflicts during the 33-second post_update_to_api
  retry window, resulting in [2]+ Stopped.
2026-02-25 13:34:44 +01:00
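The effect of `trap '' TSTP` can be shown in isolation: with SIGTSTP ignored, a stop signal delivered to a non-interactive shell no longer suspends it, so code after the signal still runs — which is what keeps the recovery dialog reachable. A minimal sketch:

```shell
# Without the trap, kill -TSTP would stop the child shell before the echo.
# With SIGTSTP ignored, the signal is discarded and execution continues.
out=$(bash -c "trap '' TSTP; kill -TSTP \$\$; echo survived")
echo "$out"   # -> survived

# The failure path later restores default handling with: trap - TSTP
```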
CanbiZ (MickLesk)
2cdeb07353 fix(build): show spinner during post_update_to_api to prevent Ctrl+Z abort
post_update_to_api can take up to 33 seconds worst-case (3 curl attempts
x 10s timeout + sleep delays). Without any terminal output during this
time, users think the script is stuck and press Ctrl+Z, which prevents
the recovery menu from ever appearing.

Add msg_info spinner before both post_update_to_api calls in the failure
path (initial report + final force retry after recovery menu).
2026-02-25 13:07:58 +01:00
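The 33-second figure follows from the retry parameters quoted above; a back-of-envelope check (the 3s total sleep here is an assumption chosen to match the commit's arithmetic, not a value read from the source):

```shell
# Worst case: every curl attempt runs to its full timeout, plus the
# sleep delays between retries.
attempts=3
curl_timeout=10   # seconds per attempt
sleep_total=3     # assumed combined sleep between attempts
worst_case=$((attempts * curl_timeout + sleep_total))
echo "${worst_case}s"   # -> 33s
```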
CanbiZ (MickLesk)
866c4062e0 refactor(api): eliminate duplicate traps, harden error handling & telemetry
Phase 1 - Structural:
- Remove api_exit_script() and 5 inline traps from build.func
- error_handler.func is now the sole trap owner via catch_errors()
- Update api.func comment reference (api_exit_script -> on_exit)

Phase 2 - Quality:
- Add stop_spinner() + cursor restore to error_handler(), on_interrupt(),
  on_terminate(), on_hangup() to prevent spinner/cursor artifacts
- Enhance _send_abort_telemetry() with error text (last 20 log lines),
  duration calculation, and 2 retry attempts (was fire-and-forget)
- Harden json_escape() to also strip DEL (0x7F) character
2026-02-25 12:58:06 +01:00
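A hardened `json_escape` of the kind described might look like this sketch — escape JSON specials, then strip ASCII control characters including DEL (0x7F). The exact behavior of the real api.func helper is not shown here, so treat names and details as assumptions:

```shell
# Escape backslashes and double quotes for embedding in a JSON string,
# then drop control characters 0x00-0x1F and DEL (0x7F).
json_escape() {
  local s="$1"
  s=${s//\\/\\\\}   # backslash first, so later escapes aren't doubled
  s=${s//\"/\\\"}   # double quotes
  s=$(printf '%s' "$s" | tr -d '\000-\037\177')
  printf '%s' "$s"
}

json_escape $'say "hi"\x7f'   # -> say \"hi\" (DEL stripped)
```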
CanbiZ (MickLesk)
10e450b72f fix(zammad): configure Elasticsearch for LXC container startup
- Set discovery.type: single-node (required for single-node ES)
- Set xpack.security.enabled: false (not needed in local LXC)
- Set bootstrap.memory_lock: false (fails in unprivileged LXC)
- Add startup wait loop (up to 60s) to ensure ES is ready before
  Zammad installation continues

Fixes #12301-related recurring Elasticsearch startup failures
2026-02-25 10:13:37 +01:00
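The startup wait this commit adds can be sketched as a generic readiness poll — try an HTTP endpoint for at most `tries × delay` seconds (30 × 2s = 60s by default, matching the "up to 60s" above). The function name is ours, not the installer's:

```shell
# Poll a URL until curl succeeds or the attempts are exhausted.
# Returns 0 once the service answers, 1 on timeout.
wait_for_http() {
  local url="$1" tries="${2:-30}" delay="${3:-2}"
  local i
  for ((i = 1; i <= tries; i++)); do
    if curl -s "$url" >/dev/null 2>&1; then
      return 0
    fi
    sleep "$delay"
  done
  return 1
}

# e.g. wait_for_http "http://localhost:9200"   # Elasticsearch readiness
```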
14 changed files with 34 additions and 185 deletions

View File

@@ -13,7 +13,7 @@ permissions:
 jobs:
   check-node-versions:
     if: github.repository == 'community-scripts/ProxmoxVE'
-    runs-on: coolify-runner
+    runs-on: ubuntu-latest
     steps:
       - name: Checkout Repository
@@ -110,94 +110,22 @@ jobs:
 }
 # Extract Node major from engines.node in package.json
-# Sets: ENGINES_NODE_RAW (raw string), ENGINES_MIN_MAJOR, ENGINES_IS_MINIMUM
+# Sets: ENGINES_NODE_RAW (raw string), ENGINES_MIN_MAJOR
 extract_engines_node() {
   local content="$1"
   ENGINES_NODE_RAW=""
   ENGINES_MIN_MAJOR=""
-  ENGINES_IS_MINIMUM="false"
   ENGINES_NODE_RAW=$(echo "$content" | jq -r '.engines.node // empty' 2>/dev/null || echo "")
   if [[ -z "$ENGINES_NODE_RAW" ]]; then
     return
   fi
-  # Detect if constraint is a minimum (>=, ^) vs exact pinning
-  if [[ "$ENGINES_NODE_RAW" =~ ^(\>=|\^|\~) ]]; then
-    ENGINES_IS_MINIMUM="true"
-  fi
   # Extract the first number (major) from the constraint
   # Handles: ">=24.13.1", "^22", ">=18.0.0", ">=18.15.0 <19 || ^20", etc.
   ENGINES_MIN_MAJOR=$(echo "$ENGINES_NODE_RAW" | grep -oP '\d+' | head -1 || echo "")
 }
-# Check if our_version satisfies an engines.node constraint
-# Returns 0 if satisfied, 1 if not
-# Usage: version_satisfies_engines "22" ">=18.0.0" "true"
-version_satisfies_engines() {
-  local our="$1"
-  local min_major="$2"
-  local is_minimum="$3"
-  if [[ -z "$min_major" || -z "$our" ]]; then
-    return 1
-  fi
-  if [[ "$is_minimum" == "true" ]]; then
-    # >= or ^ constraint: our version must be >= min_major
-    if [[ "$our" -ge "$min_major" ]]; then
-      return 0
-    fi
-  fi
-  return 1
-}
-# Search for files in subdirectories via GitHub API tree
-# Usage: find_repo_file "owner/repo" "branch" "filename" => sets REPLY to raw URL or empty
-find_repo_file() {
-  local repo="$1"
-  local branch="$2"
-  local filename="$3"
-  REPLY=""
-  # Try root first (fast)
-  local root_url="https://raw.githubusercontent.com/${repo}/${branch}/$(unknown)"
-  if curl -sfI "$root_url" >/dev/null 2>&1; then
-    REPLY="$root_url"
-    return
-  fi
-  # Search via GitHub API tree (recursive)
-  local tree_url="https://api.github.com/repos/${repo}/git/trees/${branch}?recursive=1"
-  local tree_json
-  tree_json=$(curl -sf -H "Authorization: token $GH_TOKEN" "$tree_url" 2>/dev/null || echo "")
-  if [[ -z "$tree_json" ]]; then
-    return
-  fi
-  # Find first matching path (prefer shorter/root-level paths)
-  local match_path
-  match_path=$(echo "$tree_json" | jq -r --arg fn "$filename" \
-    '.tree[]? | select(.path | endswith("/" + $fn) or . == $fn) | .path' 2>/dev/null \
-    | sort | head -1 || echo "")
-  if [[ -n "$match_path" ]]; then
-    REPLY="https://raw.githubusercontent.com/${repo}/${branch}/${match_path}"
-  fi
-}
-# Extract Node major from .nvmrc or .node-version
-# Sets: NVMRC_NODE_MAJOR
-extract_nvmrc_node() {
-  local content="$1"
-  NVMRC_NODE_MAJOR=""
-  # .nvmrc/.node-version typically has: "v22.9.0", "22", "lts/iron", etc.
-  local ver
-  ver=$(echo "$content" | tr -d '[:space:]' | grep -oP '^v?\K[0-9]+' | head -1 || echo "")
-  NVMRC_NODE_MAJOR="$ver"
-}
 # Collect results
 declare -a issue_scripts=()
 declare -a report_lines=()
@@ -215,10 +143,7 @@ jobs:
 slug=$(basename "$script" | sed 's/-install\.sh$//')
 # Extract Source URL (GitHub only)
-# Supports both:
-#   # Source: https://github.com/owner/repo
-#   # Source: https://example.com | Github: https://github.com/owner/repo
-source_url=$(head -20 "$script" | grep -oP 'https://github\.com/[^\s|]+' | head -1 || echo "")
+source_url=$(head -20 "$script" | grep -oP '(?<=# Source: )https://github\.com/[^\s]+' | head -1 || echo "")
 if [[ -z "$source_url" ]]; then
   report_lines+=("| \`$slug\` | — | — | — | — | ⏭️ No GitHub source |")
   continue
@@ -242,23 +167,12 @@ jobs:
   fi
 fi
-# Determine default branch via GitHub API (fast, single call)
-detected_branch=""
-api_default=$(curl -sf -H "Authorization: token $GH_TOKEN" \
-  "https://api.github.com/repos/${repo}" 2>/dev/null \
-  | jq -r '.default_branch // empty' 2>/dev/null || echo "")
-if [[ -n "$api_default" ]]; then
-  detected_branch="$api_default"
-else
-  detected_branch="main"
-fi
-# Fetch upstream Dockerfile (root + subdirectories)
+# Fetch upstream Dockerfile
 df_content=""
-find_repo_file "$repo" "$detected_branch" "Dockerfile"
-if [[ -n "$REPLY" ]]; then
-  df_content=$(curl -sf "$REPLY" 2>/dev/null || echo "")
-fi
+for branch in main master dev; do
+  df_content=$(curl -sf "https://raw.githubusercontent.com/${repo}/${branch}/Dockerfile" 2>/dev/null || echo "")
+  [[ -n "$df_content" ]] && break
+done
 DF_NODE_MAJOR=""
 DF_SOURCE=""
@@ -266,35 +180,19 @@ jobs:
   extract_dockerfile_node "$df_content"
 fi
-# Fetch upstream package.json (root + subdirectories)
+# Fetch upstream package.json
 pkg_content=""
-find_repo_file "$repo" "$detected_branch" "package.json"
-if [[ -n "$REPLY" ]]; then
-  pkg_content=$(curl -sf "$REPLY" 2>/dev/null || echo "")
-fi
+for branch in main master dev; do
+  pkg_content=$(curl -sf "https://raw.githubusercontent.com/${repo}/${branch}/package.json" 2>/dev/null || echo "")
+  [[ -n "$pkg_content" ]] && break
+done
 ENGINES_NODE_RAW=""
 ENGINES_MIN_MAJOR=""
-ENGINES_IS_MINIMUM="false"
 if [[ -n "$pkg_content" ]]; then
   extract_engines_node "$pkg_content"
 fi
-# Fallback: check .nvmrc or .node-version
-NVMRC_NODE_MAJOR=""
-if [[ -z "$DF_NODE_MAJOR" && -z "$ENGINES_MIN_MAJOR" ]]; then
-  for nvmfile in .nvmrc .node-version; do
-    find_repo_file "$repo" "$detected_branch" "$nvmfile"
-    if [[ -n "$REPLY" ]]; then
-      nvmrc_content=$(curl -sf "$REPLY" 2>/dev/null || echo "")
-      if [[ -n "$nvmrc_content" ]]; then
-        extract_nvmrc_node "$nvmrc_content"
-        [[ -n "$NVMRC_NODE_MAJOR" ]] && break
-      fi
-    fi
-  done
-fi
 # Determine upstream recommended major version
 upstream_major=""
 upstream_hint=""
@@ -305,9 +203,6 @@ jobs:
 elif [[ -n "$ENGINES_MIN_MAJOR" ]]; then
   upstream_major="$ENGINES_MIN_MAJOR"
   upstream_hint="engines: $ENGINES_NODE_RAW"
-elif [[ -n "$NVMRC_NODE_MAJOR" ]]; then
-  upstream_major="$NVMRC_NODE_MAJOR"
-  upstream_hint=".nvmrc/.node-version"
 fi
 # Build display values
@@ -319,23 +214,13 @@ jobs:
 if [[ "$our_version" == "dynamic" ]]; then
   status="🔄 Dynamic"
 elif [[ "$our_version" == "unset" ]]; then
-  if [[ -n "$upstream_major" ]]; then
-    status="⚠️ NODE_VERSION not set (upstream=$upstream_major via $upstream_hint)"
-  else
-    status="⚠️ NODE_VERSION not set (no upstream info found)"
-  fi
+  status="⚠️ NODE_VERSION not set"
   issue_scripts+=("$slug|$our_version|$upstream_major|$upstream_hint|$repo")
   drift_count=$((drift_count + 1))
 elif [[ -n "$upstream_major" && "$our_version" != "$upstream_major" ]]; then
-  # Check if engines.node is a minimum constraint that our version satisfies
-  if [[ -z "$DF_NODE_MAJOR" && "$ENGINES_IS_MINIMUM" == "true" ]] && \
-     version_satisfies_engines "$our_version" "$ENGINES_MIN_MAJOR" "$ENGINES_IS_MINIMUM"; then
-    status="✅ (engines: $ENGINES_NODE_RAW — ours: $our_version satisfies)"
-  else
-    status="🔸 Drift → upstream=$upstream_major ($upstream_hint)"
-    issue_scripts+=("$slug|$our_version|$upstream_major|$upstream_hint|$repo")
-    drift_count=$((drift_count + 1))
-  fi
+  status="🔸 Drift → upstream=$upstream_major ($upstream_hint)"
+  issue_scripts+=("$slug|$our_version|$upstream_major|$upstream_hint|$repo")
+  drift_count=$((drift_count + 1))
 fi
 report_lines+=("| \`$slug\` | $our_version | $engines_display | $dockerfile_display | [$repo](https://github.com/$repo) | $status |")

View File

@@ -411,11 +411,6 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 ### 🚀 Updated Scripts
-- #### 🐞 Bug Fixes
-  - Passbolt: Update Nginx config `client_max_body_size` [@tremor021](https://github.com/tremor021) ([#12313](https://github.com/community-scripts/ProxmoxVE/pull/12313))
-  - Zammad: configure Elasticsearch before zammad start [@MickLesk](https://github.com/MickLesk) ([#12308](https://github.com/community-scripts/ProxmoxVE/pull/12308))
 - #### 🔧 Refactor
   - OpenProject: Various fixes [@tremor021](https://github.com/tremor021) ([#12246](https://github.com/community-scripts/ProxmoxVE/pull/12246))
@@ -426,14 +421,6 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
   - Fix detection of ssh keys [@1-tempest](https://github.com/1-tempest) ([#12230](https://github.com/community-scripts/ProxmoxVE/pull/12230))
-- #### 🔧 Refactor
-  - core: remove duplicate traps, consolidate error handling and harden signal traps [@MickLesk](https://github.com/MickLesk) ([#12316](https://github.com/community-scripts/ProxmoxVE/pull/12316))
-### 📂 Github
-- github: improvements for node drift wf [@MickLesk](https://github.com/MickLesk) ([#12309](https://github.com/community-scripts/ProxmoxVE/pull/12309))
 ## 2026-02-24
 ### 🚀 Updated Scripts

View File

@@ -28,7 +28,7 @@ function update_script() {
   exit
 fi
-NODE_VERSION="24" NODE_MODULE="yarn,npm,pm2" setup_nodejs
+NODE_VERSION=24 NODE_MODULE="yarn,npm,pm2" setup_nodejs
 if check_for_gh_release "joplin-server" "laurent22/joplin"; then
   msg_info "Stopping Services"

View File

@@ -65,7 +65,6 @@ function update_script() {
 msg_ok "Stopped Service"
 fetch_and_deploy_gh_release "vikunja" "go-vikunja/vikunja" "binary"
-$STD systemctl daemon-reload
 msg_info "Starting Service"
 systemctl start vikunja

View File

@@ -29,7 +29,7 @@ function update_script() {
 fi
 if check_for_gh_release "Zigbee2MQTT" "Koenkk/zigbee2mqtt"; then
-  NODE_VERSION="24" NODE_MODULE="pnpm@$(curl -fsSL https://raw.githubusercontent.com/Koenkk/zigbee2mqtt/master/package.json | jq -r '.packageManager | split("@")[1]')" setup_nodejs
+  NODE_VERSION=24 NODE_MODULE="pnpm@$(curl -fsSL https://raw.githubusercontent.com/Koenkk/zigbee2mqtt/master/package.json | jq -r '.packageManager | split("@")[1]')" setup_nodejs
   msg_info "Stopping Service"
   systemctl stop zigbee2mqtt
   msg_ok "Stopped Service"

View File

@@ -1,5 +1,5 @@
 {
-  "generated": "2026-02-25T12:14:52Z",
+  "generated": "2026-02-25T06:25:10Z",
   "versions": [
     {
       "slug": "2fauth",
@@ -39,9 +39,9 @@
     {
       "slug": "ampache",
       "repo": "ampache/ampache",
-      "version": "7.9.1",
+      "version": "7.9.0",
       "pinned": false,
-      "date": "2026-02-25T08:52:58Z"
+      "date": "2026-02-19T07:01:25Z"
     },
     {
       "slug": "argus",
@@ -823,9 +823,9 @@
     {
       "slug": "manyfold",
       "repo": "manyfold3d/manyfold",
-      "version": "v0.133.0",
+      "version": "v0.132.1",
       "pinned": false,
-      "date": "2026-02-25T10:40:26Z"
+      "date": "2026-02-09T22:02:28Z"
     },
     {
       "slug": "mealie",
@@ -995,13 +995,6 @@
       "pinned": false,
       "date": "2026-02-03T09:00:43Z"
     },
-    {
-      "slug": "openproject",
-      "repo": "jemalloc/jemalloc",
-      "version": "5.3.0",
-      "pinned": false,
-      "date": "2022-05-06T19:14:21Z"
-    },
     {
       "slug": "ots",
       "repo": "Luzifer/ots",

View File

@@ -39,7 +39,7 @@ $STD apt install -y \
   texlive-xetex
 msg_ok "Installed Dependencies"
-NODE_VERSION="22" NODE_MODULE="bun" setup_nodejs
+NODE_VERSION=22 NODE_MODULE="bun" setup_nodejs
 fetch_and_deploy_gh_release "ConvertX" "C4illin/ConvertX" "tarball" "latest" "/opt/convertx"
 msg_info "Installing ConvertX"

View File

@@ -21,7 +21,7 @@ msg_ok "Installed Dependencies"
 PG_VERSION="17" setup_postgresql
 PG_DB_NAME="joplin" PG_DB_USER="joplin" setup_postgresql_db
-NODE_VERSION="24" NODE_MODULE="yarn,npm,pm2" setup_nodejs
+NODE_VERSION=24 NODE_MODULE="yarn,npm,pm2" setup_nodejs
 mkdir -p /opt/pm2
 export PM2_HOME=/opt/pm2
 $STD pm2 install pm2-logrotate

View File

@@ -44,8 +44,6 @@ echo passbolt-ce-server passbolt/nginx-domain string $LOCAL_IP | debconf-set-sel
 echo passbolt-ce-server passbolt/nginx-certificate-file string /etc/ssl/passbolt/passbolt.crt | debconf-set-selections
 echo passbolt-ce-server passbolt/nginx-certificate-key-file string /etc/ssl/passbolt/passbolt.key | debconf-set-selections
 $STD apt install -y --no-install-recommends passbolt-ce-server
-sed -i 's/client_max_body_size[[:space:]]\+[0-9]\+M;/client_max_body_size 15M;/' /etc/nginx/sites-enabled/nginx-passbolt.conf
-systemctl reload nginx
 msg_ok "Setup Passbolt"
 motd_ssh

View File

@@ -21,7 +21,7 @@ $STD apt install -y \
   expect
 msg_ok "Dependencies installed."
-NODE_VERSION="24" setup_nodejs
+NODE_VERSION=24 setup_nodejs
 fetch_and_deploy_gh_release "ProxmoxVE-Local" "community-scripts/ProxmoxVE-Local" "tarball"
 msg_info "Installing PVE Scripts local"

View File

@@ -28,23 +28,12 @@ setup_deb822_repo \
   "stable" \
   "main"
 $STD apt install -y elasticsearch
-sed -i 's/^#\{0,2\} *-Xms[0-9]*g.*/-Xms2g/' /etc/elasticsearch/jvm.options
-sed -i 's/^#\{0,2\} *-Xmx[0-9]*g.*/-Xmx2g/' /etc/elasticsearch/jvm.options
-cat <<EOF >>/etc/elasticsearch/elasticsearch.yml
-discovery.type: single-node
-xpack.security.enabled: false
-bootstrap.memory_lock: false
-EOF
+sed -i 's/^-Xms.*/-Xms2g/' /etc/elasticsearch/jvm.options
+sed -i 's/^-Xmx.*/-Xmx2g/' /etc/elasticsearch/jvm.options
 $STD /usr/share/elasticsearch/bin/elasticsearch-plugin install ingest-attachment -b
 systemctl daemon-reload
 systemctl enable -q elasticsearch
 systemctl restart -q elasticsearch
-for i in $(seq 1 30); do
-  if curl -s http://localhost:9200 >/dev/null 2>&1; then
-    break
-  fi
-  sleep 2
-done
 msg_ok "Setup Elasticsearch"
 msg_info "Installing Zammad"

View File

@@ -4098,11 +4098,10 @@ EOF'
 # Installation failed?
 if [[ $install_exit_code -ne 0 ]]; then
-  # Prevent job-control signals from suspending the script during recovery.
+  # Prevent SIGTSTP (Ctrl+Z) from suspending the script during recovery.
   # In non-interactive shells (bash -c), background processes (spinner) can
   # trigger terminal-related signals that stop the entire process group.
-  # TSTP = Ctrl+Z, TTIN = bg read from tty, TTOU = bg write to tty (tostop)
-  trap '' TSTP TTIN TTOU
+  trap '' TSTP
   msg_error "Installation failed in container ${CTID} (exit code: ${install_exit_code})"
@@ -4552,8 +4551,8 @@ EOF'
   post_update_to_api "failed" "$install_exit_code" "force"
   $STD echo -e "${TAB}${CM:-} Telemetry finalized"
-  # Restore default job-control signal handling before exit
-  trap - TSTP TTIN TTOU
+  # Restore default SIGTSTP handling before exit
+  trap - TSTP
   exit $install_exit_code
 fi

View File

@@ -607,7 +607,6 @@ stop_spinner() {
   unset SPINNER_PID SPINNER_MSG
   stty sane 2>/dev/null || true
-  stty -tostop 2>/dev/null || true
 }
 # ==============================================================================
# ============================================================================== # ==============================================================================

View File

@@ -28,7 +28,7 @@ INSTALL_PATH="/opt/immich-proxy"
 CONFIG_PATH="/opt/immich-proxy/app"
 DEFAULT_PORT=3000
-# Initialize all core functions (colors, formatting, icons, $STD mode)
+# Initialize all core functions (colors, formatting, icons, STD mode)
 load_functions
 init_tool_telemetry "" "addon"