Compare commits


17 Commits

Author SHA1 Message Date
CanbiZ (MickLesk)
3ea51f5a62 Add Alpine support and improve Tailscale install
Detect Alpine inside the LXC container and install Tailscale via apk (adding the community repo if it is missing), then enable and start the service. The Debian/Ubuntu install path is preserved but gains better DNS resolution checks: /etc/resolv.conf is temporarily overridden if DNS appears blocked and restored afterwards. Also switches pct exec to use sh -c, tightens command-existence checks and redirections, ensures curl and the keyring directory are present, adds the Tailscale apt source, and installs the package. Overall robustness and error-handling improvements for installing Tailscale in containers.
2026-02-26 10:01:41 +01:00
community-scripts-pr-app[bot]
40aa06940c Update CHANGELOG.md (#12347)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-26 07:46:30 +00:00
Chris
117786376a [QOL] Immich: add warning regarding library compilation time (#12345) 2026-02-26 08:45:59 +01:00
community-scripts-pr-app[bot]
c5a635cdd7 chore: update github-versions.json (#12346)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-26 06:22:51 +00:00
community-scripts-pr-app[bot]
165e3f22cd Update CHANGELOG.md (#12344)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-26 00:19:57 +00:00
community-scripts-pr-app[bot]
2561a50d05 chore: update github-versions.json (#12343)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-26 00:19:34 +00:00
community-scripts-pr-app[bot]
6db5479b26 Update CHANGELOG.md (#12341)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-25 21:47:40 +00:00
Tobias
e74ddff49a fix: overseer migration (#12340) 2026-02-25 22:47:14 +01:00
community-scripts-pr-app[bot]
80132b0332 Update CHANGELOG.md (#12338)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-25 20:02:12 +00:00
CanbiZ (MickLesk)
83a19adbb4 tools.func: Improve GitHub/Codeberg API error handling and error output (#12330) 2026-02-25 21:01:45 +01:00
community-scripts-pr-app[bot]
6b196a7c81 chore: update github-versions.json (#12334)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-25 18:24:21 +00:00
community-scripts-pr-app[bot]
30082a1ba7 Update CHANGELOG.md (#12332)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-25 17:27:38 +00:00
Tobias
7741caa6ba add: vikunja: daemon reload (#12323) 2026-02-25 18:27:11 +01:00
community-scripts-pr-app[bot]
a3841d3cef Update CHANGELOG.md (#12331)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-25 16:32:03 +00:00
CanbiZ (MickLesk)
89cdabd040 opnsense-VM: Use ip link to verify bridge existence (#12329) 2026-02-25 17:31:41 +01:00
CanbiZ (MickLesk)
cbb82812b2 wger: Use $http_host for proxy Host header (#12327) 2026-02-25 17:31:14 +01:00
community-scripts-pr-app[bot]
1c463369c7 Update .app files (#12325)
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2026-02-25 16:18:48 +01:00
12 changed files with 260 additions and 117 deletions


@@ -407,6 +407,14 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 </details>
+## 2026-02-26
+### 🚀 Updated Scripts
+- #### ✨ New Features
+- [QOL] Immich: add warning regarding library compilation time [@vhsdream](https://github.com/vhsdream) ([#12345](https://github.com/community-scripts/ProxmoxVE/pull/12345))
 ## 2026-02-25
 ### 🆕 New Scripts
@@ -417,6 +425,10 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 - #### 🐞 Bug Fixes
+- fix: overseer migration [@CrazyWolf13](https://github.com/CrazyWolf13) ([#12340](https://github.com/community-scripts/ProxmoxVE/pull/12340))
+- add: vikunja: daemon reload [@CrazyWolf13](https://github.com/CrazyWolf13) ([#12323](https://github.com/community-scripts/ProxmoxVE/pull/12323))
+- opnsense-VM: Use ip link to verify bridge existence [@MickLesk](https://github.com/MickLesk) ([#12329](https://github.com/community-scripts/ProxmoxVE/pull/12329))
+- wger: Use $http_host for proxy Host header [@MickLesk](https://github.com/MickLesk) ([#12327](https://github.com/community-scripts/ProxmoxVE/pull/12327))
 - Passbolt: Update Nginx config `client_max_body_size` [@tremor021](https://github.com/tremor021) ([#12313](https://github.com/community-scripts/ProxmoxVE/pull/12313))
 - Zammad: configure Elasticsearch before zammad start [@MickLesk](https://github.com/MickLesk) ([#12308](https://github.com/community-scripts/ProxmoxVE/pull/12308))
@@ -430,6 +442,10 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 - Fix detection of ssh keys [@1-tempest](https://github.com/1-tempest) ([#12230](https://github.com/community-scripts/ProxmoxVE/pull/12230))
+- #### ✨ New Features
+- tools.func: Improve GitHub/Codeberg API error handling and error output [@MickLesk](https://github.com/MickLesk) ([#12330](https://github.com/community-scripts/ProxmoxVE/pull/12330))
 - #### 🔧 Refactor
 - core: remove duplicate traps, consolidate error handling and harden signal traps [@MickLesk](https://github.com/MickLesk) ([#12316](https://github.com/community-scripts/ProxmoxVE/pull/12316))

ct/headers/zerobyte (new file, 6 lines)

@@ -0,0 +1,6 @@
_____ __ __
/__ / ___ _________ / /_ __ __/ /____
/ / / _ \/ ___/ __ \/ __ \/ / / / __/ _ \
/ /__/ __/ / / /_/ / /_/ / /_/ / /_/ __/
/____/\___/_/ \____/_.___/\__, /\__/\___/
/____/


@@ -97,7 +97,7 @@ EOF
 if [[ -f ~/.immich_library_revisions ]]; then
   libraries=("libjxl" "libheif" "libraw" "imagemagick" "libvips")
   cd "$BASE_DIR"
-  msg_info "Checking for updates to custom image-processing libraries"
+  msg_warn "Checking for updates to custom image-processing libraries (recompile time: 2-15min per library)"
   $STD git pull
   for library in "${libraries[@]}"; do
     compile_"$library"
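The `compile_"$library"` call in the hunk above is bash's dynamic-dispatch idiom: the loop variable is spliced into a function name at call time. A minimal, self-contained sketch of the pattern (the function names here are illustrative, not the real Immich helpers):

```shell
# Each compile_<name> function is looked up by constructing its name at runtime.
compile_libvips() { echo "building libvips"; }
compile_libheif() { echo "building libheif"; }

libraries="libvips libheif"
for library in $libraries; do
  compile_"$library"   # expands to compile_libvips, then compile_libheif
done
```

If a listed library has no matching function, the shell reports "command not found", so the library list and function names must stay in sync.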


@@ -28,7 +28,7 @@ function update_script() {
   exit
 fi
-if [[ -f "$HOME/.overseerr" ]] && [[ "$(cat "$HOME/.overseerr")" == "1.34.0" ]]; then
+if [[ -f "$HOME/.overseerr" ]] && [[ "$(printf '%s\n' "1.34.0" "$(cat "$HOME/.overseerr")" | sort -V | head -n1)" == "1.35.0" ]]; then
 echo
 echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
 echo "Overseerr v1.34.0 detected."
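The hunk above swaps an exact string match for a `sort -V` (version sort) comparison. The idiom is worth knowing on its own: piping two versions through `sort -V | head -n1` yields the smaller one, which gives a portable less-than-or-equal test. A small sketch (helper name is illustrative):

```shell
# version_lte A B: succeeds when A <= B under version-number ordering,
# so "1.9.0" correctly sorts below "1.10.0" (lexicographic compare would not).
version_lte() {
  [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}

version_lte "1.34.0" "1.35.0" && echo "needs migration"
version_lte "1.9.0" "1.10.0" && echo "sort -V is numeric, not lexicographic"
```

`sort -V` is a GNU coreutils extension; it is present in the Debian/Ubuntu containers these scripts target.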


@@ -65,6 +65,7 @@ function update_script() {
 msg_ok "Stopped Service"
 fetch_and_deploy_gh_release "vikunja" "go-vikunja/vikunja" "binary"
+$STD systemctl daemon-reload
 msg_info "Starting Service"
 systemctl start vikunja


@@ -1,5 +1,5 @@
 {
-  "generated": "2026-02-25T12:14:52Z",
+  "generated": "2026-02-26T06:22:43Z",
   "versions": [
     {
       "slug": "2fauth",
@@ -151,9 +151,9 @@
     {
       "slug": "booklore",
       "repo": "booklore-app/BookLore",
-      "version": "v2.0.1",
+      "version": "v2.0.2",
       "pinned": false,
-      "date": "2026-02-24T04:15:33Z"
+      "date": "2026-02-25T19:59:20Z"
     },
     {
       "slug": "bookstack",
@@ -242,9 +242,9 @@
     {
       "slug": "cosmos",
       "repo": "azukaar/Cosmos-Server",
-      "version": "v0.20.2",
+      "version": "v0.21.0",
       "pinned": false,
-      "date": "2026-01-24T00:12:39Z"
+      "date": "2026-02-25T17:26:37Z"
     },
     {
       "slug": "cronicle",
@@ -270,16 +270,16 @@
     {
       "slug": "databasus",
       "repo": "databasus/databasus",
-      "version": "v3.16.2",
+      "version": "v3.16.3",
       "pinned": false,
-      "date": "2026-02-22T21:10:12Z"
+      "date": "2026-02-25T19:57:26Z"
     },
     {
       "slug": "dawarich",
       "repo": "Freika/dawarich",
-      "version": "1.2.0",
+      "version": "1.3.0",
       "pinned": false,
-      "date": "2026-02-15T22:33:56Z"
+      "date": "2026-02-25T19:30:25Z"
     },
     {
       "slug": "discopanel",
@@ -452,9 +452,9 @@
     {
       "slug": "gitea-mirror",
       "repo": "RayLabsHQ/gitea-mirror",
-      "version": "v3.9.4",
+      "version": "v3.9.5",
       "pinned": false,
-      "date": "2026-02-24T06:17:56Z"
+      "date": "2026-02-26T05:32:12Z"
     },
     {
       "slug": "glance",
@@ -606,16 +606,16 @@
     {
       "slug": "invoiceninja",
       "repo": "invoiceninja/invoiceninja",
-      "version": "v5.12.66",
+      "version": "v5.12.68",
      "pinned": false,
-      "date": "2026-02-24T09:12:50Z"
+      "date": "2026-02-25T19:38:19Z"
     },
     {
       "slug": "jackett",
       "repo": "Jackett/Jackett",
-      "version": "v0.24.1205",
+      "version": "v0.24.1218",
       "pinned": false,
-      "date": "2026-02-25T05:49:14Z"
+      "date": "2026-02-26T05:55:11Z"
     },
     {
       "slug": "jellystat",
@@ -627,9 +627,9 @@
     {
       "slug": "joplin-server",
       "repo": "laurent22/joplin",
-      "version": "v3.5.12",
+      "version": "v3.5.13",
       "pinned": false,
-      "date": "2026-01-17T14:20:33Z"
+      "date": "2026-02-25T21:19:11Z"
     },
     {
       "slug": "jotty",
@@ -669,9 +669,9 @@
     {
       "slug": "kimai",
       "repo": "kimai/kimai",
-      "version": "2.49.0",
+      "version": "2.50.0",
       "pinned": false,
-      "date": "2026-02-15T20:40:19Z"
+      "date": "2026-02-25T20:13:51Z"
     },
     {
       "slug": "kitchenowl",
@@ -711,9 +711,9 @@
     {
       "slug": "kubo",
       "repo": "ipfs/kubo",
-      "version": "v0.39.0",
+      "version": "v0.40.0",
       "pinned": false,
-      "date": "2025-11-27T03:47:38Z"
+      "date": "2026-02-25T23:16:17Z"
     },
     {
       "slug": "kutt",
@@ -1166,9 +1166,9 @@
     {
       "slug": "prometheus",
       "repo": "prometheus/prometheus",
-      "version": "v3.9.1",
+      "version": "v3.10.0",
       "pinned": false,
-      "date": "2026-01-07T17:05:53Z"
+      "date": "2026-02-26T01:19:51Z"
     },
     {
       "slug": "prometheus-alertmanager",
@@ -1264,9 +1264,9 @@
     {
       "slug": "radicale",
       "repo": "Kozea/Radicale",
-      "version": "v3.6.0",
+      "version": "v3.6.1",
       "pinned": false,
-      "date": "2026-01-10T06:56:46Z"
+      "date": "2026-02-24T06:36:23Z"
     },
     {
       "slug": "rclone",
@@ -1292,9 +1292,9 @@
     {
       "slug": "recyclarr",
       "repo": "recyclarr/recyclarr",
-      "version": "v8.3.1",
+      "version": "v8.3.2",
       "pinned": false,
-      "date": "2026-02-25T01:01:31Z"
+      "date": "2026-02-25T22:39:51Z"
     },
     {
       "slug": "reitti",
@@ -1390,9 +1390,9 @@
     {
       "slug": "signoz",
       "repo": "SigNoz/signoz-otel-collector",
-      "version": "v0.144.1",
+      "version": "v0.144.2",
       "pinned": false,
-      "date": "2026-02-25T05:57:17Z"
+      "date": "2026-02-26T05:57:26Z"
     },
     {
       "slug": "silverbullet",
@@ -1600,9 +1600,9 @@
     {
       "slug": "tunarr",
       "repo": "chrisbenincasa/tunarr",
-      "version": "v1.1.16",
+      "version": "v1.1.17",
       "pinned": false,
-      "date": "2026-02-23T21:24:47Z"
+      "date": "2026-02-25T19:56:36Z"
     },
     {
       "slug": "uhf",
@@ -1663,9 +1663,9 @@
     {
       "slug": "vikunja",
       "repo": "go-vikunja/vikunja",
-      "version": "v1.1.0",
+      "version": "v2.0.0",
       "pinned": false,
-      "date": "2026-02-09T10:34:29Z"
+      "date": "2026-02-25T13:58:47Z"
     },
     {
       "slug": "wallabag",
@@ -1779,6 +1779,13 @@
       "pinned": false,
       "date": "2026-02-24T15:15:46Z"
     },
+    {
+      "slug": "zerobyte",
+      "repo": "restic/restic",
+      "version": "v0.18.1",
+      "pinned": false,
+      "date": "2025-09-21T18:24:38Z"
+    },
     {
       "slug": "zigbee2mqtt",
       "repo": "Koenkk/zigbee2mqtt",


@@ -51,6 +51,10 @@
     {
       "text": "Logs: `/var/log/immich`",
       "type": "info"
+    },
+    {
+      "text": "During first install, 5 custom libraries need to be compiled from source. Depending on your CPU, this can take anywhere between 15 minutes and 2 hours. Please be patient. Touch grass or something.",
+      "type": "warning"
     }
   ]
 }


@@ -154,7 +154,7 @@ sed -i -e "/^#shared_preload/s/^#//;/^shared_preload/s/''/'vchord.so'/" /etc/pos
 systemctl restart postgresql.service
 PG_DB_NAME="immich" PG_DB_USER="immich" PG_DB_GRANT_SUPERUSER="true" PG_DB_SKIP_ALTER_ROLE="true" setup_postgresql_db
-msg_info "Compiling Custom Photo-processing Library (extreme patience)"
+msg_warn "Compiling Custom Photo-processing Libraries (can take anywhere from 15min to 2h)"
 LD_LIBRARY_PATH=/usr/local/lib
 export LD_RUN_PATH=/usr/local/lib
 STAGING_DIR=/opt/staging


@@ -164,7 +164,7 @@ server {
 location / {
   proxy_pass http://127.0.0.1:8000;
-  proxy_set_header Host $host;
+  proxy_set_header Host $http_host;
   proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
   proxy_set_header X-Forwarded-Proto $scheme;
   proxy_redirect off;


@@ -783,16 +783,25 @@ github_api_call() {
   for attempt in $(seq 1 $max_retries); do
     local http_code
-    http_code=$(curl -fsSL -w "%{http_code}" -o "$output_file" \
+    http_code=$(curl -sSL -w "%{http_code}" -o "$output_file" \
       -H "Accept: application/vnd.github+json" \
       -H "X-GitHub-Api-Version: 2022-11-28" \
       "${header_args[@]}" \
-      "$url" 2>/dev/null || echo "000")
+      "$url" 2>/dev/null) || true
     case "$http_code" in
     200)
       return 0
       ;;
+    401)
+      msg_error "GitHub API authentication failed (HTTP 401)."
+      if [[ -n "${GITHUB_TOKEN:-}" ]]; then
+        msg_error "Your GITHUB_TOKEN appears to be invalid or expired."
+      else
+        msg_error "The repository may require authentication. Try: export GITHUB_TOKEN=\"ghp_your_token\""
+      fi
+      return 1
+      ;;
     403)
       # Rate limit - check if we can retry
       if [[ $attempt -lt $max_retries ]]; then
@@ -801,11 +810,22 @@ github_api_call() {
         retry_delay=$((retry_delay * 2))
         continue
       fi
-      msg_error "GitHub API rate limit exceeded. Set GITHUB_TOKEN to increase limits."
+      msg_error "GitHub API rate limit exceeded (HTTP 403)."
+      msg_error "To increase the limit, export a GitHub token before running the script:"
+      msg_error "  export GITHUB_TOKEN=\"ghp_your_token_here\""
       return 1
       ;;
     404)
-      msg_error "GitHub API endpoint not found: $url"
+      msg_error "GitHub repository or release not found (HTTP 404): $url"
+      return 1
+      ;;
+    000 | "")
+      if [[ $attempt -lt $max_retries ]]; then
+        sleep "$retry_delay"
+        continue
+      fi
+      msg_error "GitHub API connection failed (no response)."
+      msg_error "Check your network/DNS: curl -sSL https://api.github.com/rate_limit"
       return 1
       ;;
     *)
@@ -813,7 +833,7 @@ github_api_call() {
         sleep "$retry_delay"
         continue
       fi
-      msg_error "GitHub API call failed with HTTP $http_code"
+      msg_error "GitHub API call failed (HTTP $http_code)."
       return 1
       ;;
     esac
@@ -833,14 +853,18 @@ codeberg_api_call() {
   for attempt in $(seq 1 $max_retries); do
     local http_code
-    http_code=$(curl -fsSL -w "%{http_code}" -o "$output_file" \
+    http_code=$(curl -sSL -w "%{http_code}" -o "$output_file" \
       -H "Accept: application/json" \
-      "$url" 2>/dev/null || echo "000")
+      "$url" 2>/dev/null) || true
     case "$http_code" in
     200)
       return 0
       ;;
+    401)
+      msg_error "Codeberg API authentication failed (HTTP 401)."
+      return 1
+      ;;
     403)
       # Rate limit - retry
       if [[ $attempt -lt $max_retries ]]; then
@@ -849,11 +873,20 @@ codeberg_api_call() {
         retry_delay=$((retry_delay * 2))
         continue
       fi
-      msg_error "Codeberg API rate limit exceeded."
+      msg_error "Codeberg API rate limit exceeded (HTTP 403)."
       return 1
       ;;
     404)
-      msg_error "Codeberg API endpoint not found: $url"
+      msg_error "Codeberg repository or release not found (HTTP 404): $url"
+      return 1
+      ;;
+    000 | "")
+      if [[ $attempt -lt $max_retries ]]; then
+        sleep "$retry_delay"
+        continue
+      fi
+      msg_error "Codeberg API connection failed (no response)."
+      msg_error "Check your network/DNS: curl -sSL https://codeberg.org"
       return 1
       ;;
     *)
@@ -861,7 +894,7 @@ codeberg_api_call() {
       sleep "$retry_delay"
       continue
     fi
-    msg_error "Codeberg API call failed with HTTP $http_code"
+    msg_error "Codeberg API call failed (HTTP $http_code)."
     return 1
     ;;
   esac
@@ -1441,7 +1474,7 @@ get_latest_github_release() {
   if ! github_api_call "https://api.github.com/repos/${repo}/releases/latest" "$temp_file"; then
     rm -f "$temp_file"
-    return 1
+    return 0
   fi
   local version
@@ -1449,7 +1482,8 @@
   rm -f "$temp_file"
   if [[ -z "$version" ]]; then
-    return 1
+    msg_error "Could not determine latest version for ${repo}"
+    return 0
   fi
   echo "$version"
@@ -1466,7 +1500,7 @@ get_latest_codeberg_release() {
   # Codeberg API: get all releases and pick the first non-draft/non-prerelease
   if ! codeberg_api_call "https://codeberg.org/api/v1/repos/${repo}/releases" "$temp_file"; then
     rm -f "$temp_file"
-    return 1
+    return 0
   fi
   local version
@@ -1480,7 +1514,8 @@
   rm -f "$temp_file"
   if [[ -z "$version" ]]; then
-    return 1
+    msg_error "Could not determine latest version for ${repo}"
+    return 0
   fi
   echo "$version"
@@ -1567,13 +1602,34 @@ get_latest_gh_tag() {
     "${header_args[@]}" \
     "https://api.github.com/repos/${repo}/tags?per_page=100" 2>/dev/null) || true
+  if [[ "$http_code" == "401" ]]; then
+    msg_error "GitHub API authentication failed (HTTP 401)."
+    if [[ -n "${GITHUB_TOKEN:-}" ]]; then
+      msg_error "Your GITHUB_TOKEN appears to be invalid or expired."
+    else
+      msg_error "The repository may require authentication. Try: export GITHUB_TOKEN=\"ghp_your_token\""
+    fi
+    rm -f /tmp/gh_tags.json
+    return 1
+  fi
   if [[ "$http_code" == "403" ]]; then
-    msg_warn "GitHub API rate limit exceeded while fetching tags for ${repo}"
+    msg_error "GitHub API rate limit exceeded (HTTP 403)."
+    msg_error "To increase the limit, export a GitHub token before running the script:"
+    msg_error "  export GITHUB_TOKEN=\"ghp_your_token_here\""
+    rm -f /tmp/gh_tags.json
+    return 1
+  fi
+  if [[ "$http_code" == "000" || -z "$http_code" ]]; then
+    msg_error "GitHub API connection failed (no response)."
+    msg_error "Check your network/DNS: curl -sSL https://api.github.com/rate_limit"
     rm -f /tmp/gh_tags.json
     return 1
   fi
   if [[ "$http_code" != "200" ]] || [[ ! -s /tmp/gh_tags.json ]]; then
+    msg_error "Unable to fetch tags for ${repo} (HTTP ${http_code})"
     rm -f /tmp/gh_tags.json
     return 1
   fi
@@ -1659,6 +1715,15 @@ check_for_gh_release() {
   if [[ "$http_code" == "200" ]] && [[ -s /tmp/gh_check.json ]]; then
     releases_json="[$(</tmp/gh_check.json)]"
+  elif [[ "$http_code" == "401" ]]; then
+    msg_error "GitHub API authentication failed (HTTP 401)."
+    if [[ -n "${GITHUB_TOKEN:-}" ]]; then
+      msg_error "Your GITHUB_TOKEN appears to be invalid or expired."
+    else
+      msg_error "The repository may require authentication. Try: export GITHUB_TOKEN=\"ghp_your_token\""
+    fi
+    rm -f /tmp/gh_check.json
+    return 1
   elif [[ "$http_code" == "403" ]]; then
     msg_error "GitHub API rate limit exceeded (HTTP 403)."
     msg_error "To increase the limit, export a GitHub token before running the script:"
@@ -1679,12 +1744,26 @@ check_for_gh_release() {
   if [[ "$http_code" == "200" ]] && [[ -s /tmp/gh_check.json ]]; then
     releases_json=$(</tmp/gh_check.json)
+  elif [[ "$http_code" == "401" ]]; then
+    msg_error "GitHub API authentication failed (HTTP 401)."
+    if [[ -n "${GITHUB_TOKEN:-}" ]]; then
+      msg_error "Your GITHUB_TOKEN appears to be invalid or expired."
+    else
+      msg_error "The repository may require authentication. Try: export GITHUB_TOKEN=\"ghp_your_token\""
+    fi
+    rm -f /tmp/gh_check.json
+    return 1
   elif [[ "$http_code" == "403" ]]; then
     msg_error "GitHub API rate limit exceeded (HTTP 403)."
     msg_error "To increase the limit, export a GitHub token before running the script:"
     msg_error "  export GITHUB_TOKEN=\"ghp_your_token_here\""
     rm -f /tmp/gh_check.json
     return 1
+  elif [[ "$http_code" == "000" || -z "$http_code" ]]; then
+    msg_error "GitHub API connection failed (no response)."
+    msg_error "Check your network/DNS: curl -sSL https://api.github.com/rate_limit"
+    rm -f /tmp/gh_check.json
+    return 1
   else
     msg_error "Unable to fetch releases for ${app} (HTTP ${http_code})"
     rm -f /tmp/gh_check.json
@@ -2608,12 +2687,22 @@ function fetch_and_deploy_gh_release() {
   done
   if ! $success; then
-    if [[ "$http_code" == "403" ]]; then
+    if [[ "$http_code" == "401" ]]; then
+      msg_error "GitHub API authentication failed (HTTP 401)."
+      if [[ -n "${GITHUB_TOKEN:-}" ]]; then
+        msg_error "Your GITHUB_TOKEN appears to be invalid or expired."
+      else
+        msg_error "The repository may require authentication. Try: export GITHUB_TOKEN=\"ghp_your_token\""
+      fi
+    elif [[ "$http_code" == "403" ]]; then
       msg_error "GitHub API rate limit exceeded (HTTP 403)."
       msg_error "To increase the limit, export a GitHub token before running the script:"
       msg_error "  export GITHUB_TOKEN=\"ghp_your_token_here\""
+    elif [[ "$http_code" == "000" || -z "$http_code" ]]; then
+      msg_error "GitHub API connection failed (no response)."
+      msg_error "Check your network/DNS: curl -sSL https://api.github.com/rate_limit"
     else
-      msg_error "Failed to fetch release metadata from $api_url after $max_retries attempts (HTTP $http_code)"
+      msg_error "Failed to fetch release metadata (HTTP $http_code)"
     fi
     return 1
   fi
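The hunks above all follow one control-flow pattern: capture curl's `%{http_code}` without `-f` (so a non-2xx body still lands in the output file), retry rate limits and dead connections with exponential backoff, and fail fast on everything else. A hedged, self-contained sketch of that loop, with the HTTP call replaced by a stub so it can run on its own (all names here are illustrative, not the real tools.func helpers):

```shell
# Retry wrapper: "$@" is any command that prints an HTTP status code,
# standing in for: curl -sSL -w "%{http_code}" -o "$out" "$url"
fetch_with_retry() {
  max_retries=3
  retry_delay=1
  attempt=1
  while [ "$attempt" -le "$max_retries" ]; do
    http_code=$("$@") || true
    case "$http_code" in
      200) return 0 ;;                      # success
      403 | 000 | "")                       # retryable: rate limited / no response
        if [ "$attempt" -lt "$max_retries" ]; then
          sleep "$retry_delay"
          retry_delay=$((retry_delay * 2))  # exponential backoff
        fi
        ;;
      *) return 1 ;;                        # non-retryable (401, 404, ...)
    esac
    attempt=$((attempt + 1))
  done
  return 1
}

# Stub that returns no response twice, then HTTP 200:
calls_file="${TMPDIR:-/tmp}/retry_demo_calls"
echo 0 >"$calls_file"
flaky() {
  n=$(cat "$calls_file")
  echo $((n + 1)) >"$calls_file"
  [ "$n" -ge 2 ] && echo 200 || echo 000
}
fetch_with_retry flaky && echo "succeeded after retries"
```

The `|| true` after the capture mirrors the diff's change from `|| echo "000"`: curl without `-f` still exits non-zero on connection failure, but the empty/`000` code is then handled explicitly in the `case` instead of being conflated with HTTP errors.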


@@ -76,70 +76,90 @@ grep -q "lxc.mount.entry: /dev/net/tun" "$CTID_CONFIG_PATH" || echo "lxc.mount.e
 header_info
 msg_info "Installing Tailscale in CT $CTID"
-pct exec "$CTID" -- bash -c '
+pct exec "$CTID" -- sh -c '
 set -e
-export DEBIAN_FRONTEND=noninteractive
-# Source os-release properly (handles quoted values)
-source /etc/os-release
-# Fallback if DNS is poisoned or blocked
-ORIG_RESOLV="/etc/resolv.conf"
-BACKUP_RESOLV="/tmp/resolv.conf.backup"
-# Check DNS resolution using multiple methods (dig may not be installed)
-dns_check_failed=true
-if command -v dig &>/dev/null; then
-  if dig +short pkgs.tailscale.com 2>/dev/null | grep -qvE "^127\.|^0\.0\.0\.0$|^$"; then
-    dns_check_failed=false
-  fi
-elif command -v host &>/dev/null; then
-  if host pkgs.tailscale.com 2>/dev/null | grep -q "has address"; then
-    dns_check_failed=false
-  fi
-elif command -v nslookup &>/dev/null; then
-  if nslookup pkgs.tailscale.com 2>/dev/null | grep -q "Address:"; then
-    dns_check_failed=false
-  fi
-elif command -v getent &>/dev/null; then
-  if getent hosts pkgs.tailscale.com &>/dev/null; then
-    dns_check_failed=false
-  fi
-else
-  # No DNS tools available, try curl directly and assume DNS works
-  dns_check_failed=false
-fi
-if $dns_check_failed; then
-  echo "[INFO] DNS resolution for pkgs.tailscale.com failed (blocked or redirected)."
-  echo "[INFO] Temporarily overriding /etc/resolv.conf with Cloudflare DNS (1.1.1.1)"
-  cp "$ORIG_RESOLV" "$BACKUP_RESOLV"
-  echo "nameserver 1.1.1.1" >"$ORIG_RESOLV"
-fi
-if ! command -v curl &>/dev/null; then
-  echo "[INFO] curl not found, installing..."
-  apt-get update -qq
-  apt update -qq
-  apt install -y curl >/dev/null
-fi
-# Ensure keyrings directory exists
-mkdir -p /usr/share/keyrings
-curl -fsSL "https://pkgs.tailscale.com/stable/${ID}/${VERSION_CODENAME}.noarmor.gpg" \
-  | tee /usr/share/keyrings/tailscale-archive-keyring.gpg >/dev/null
-echo "deb [signed-by=/usr/share/keyrings/tailscale-archive-keyring.gpg] https://pkgs.tailscale.com/stable/${ID} ${VERSION_CODENAME} main" \
-  >/etc/apt/sources.list.d/tailscale.list
-apt-get update -qq
-apt update -qq
-apt install -y tailscale >/dev/null
-if [[ -f /tmp/resolv.conf.backup ]]; then
-  echo "[INFO] Restoring original /etc/resolv.conf"
-  mv /tmp/resolv.conf.backup /etc/resolv.conf
-fi
+# Detect OS inside container
+if [ -f /etc/alpine-release ]; then
+  # ── Alpine Linux ──
+  echo "[INFO] Alpine Linux detected, installing Tailscale via apk..."
+  # Enable community repo if not already enabled
+  if ! grep -q "^[^#].*community" /etc/apk/repositories 2>/dev/null; then
+    ALPINE_VERSION=$(cat /etc/alpine-release | cut -d. -f1,2)
+    echo "https://dl-cdn.alpinelinux.org/alpine/v${ALPINE_VERSION}/community" >> /etc/apk/repositories
+  fi
+  apk update
+  apk add --no-cache tailscale
+  # Enable and start Tailscale service
+  rc-update add tailscale default 2>/dev/null || true
+  rc-service tailscale start 2>/dev/null || true
+else
+  # ── Debian / Ubuntu ──
+  export DEBIAN_FRONTEND=noninteractive
+  # Source os-release properly (handles quoted values)
+  . /etc/os-release
+  # Fallback if DNS is poisoned or blocked
+  ORIG_RESOLV="/etc/resolv.conf"
+  BACKUP_RESOLV="/tmp/resolv.conf.backup"
+  # Check DNS resolution using multiple methods (dig may not be installed)
+  dns_check_failed=true
+  if command -v dig >/dev/null 2>&1; then
+    if dig +short pkgs.tailscale.com 2>/dev/null | grep -qvE "^127\.|^0\.0\.0\.0$|^$"; then
+      dns_check_failed=false
+    fi
+  elif command -v host >/dev/null 2>&1; then
+    if host pkgs.tailscale.com 2>/dev/null | grep -q "has address"; then
+      dns_check_failed=false
+    fi
+  elif command -v nslookup >/dev/null 2>&1; then
+    if nslookup pkgs.tailscale.com 2>/dev/null | grep -q "Address:"; then
+      dns_check_failed=false
+    fi
+  elif command -v getent >/dev/null 2>&1; then
+    if getent hosts pkgs.tailscale.com >/dev/null 2>&1; then
+      dns_check_failed=false
+    fi
+  else
+    # No DNS tools available, try curl directly and assume DNS works
+    dns_check_failed=false
+  fi
+  if $dns_check_failed; then
+    echo "[INFO] DNS resolution for pkgs.tailscale.com failed (blocked or redirected)."
+    echo "[INFO] Temporarily overriding /etc/resolv.conf with Cloudflare DNS (1.1.1.1)"
+    cp "$ORIG_RESOLV" "$BACKUP_RESOLV"
+    echo "nameserver 1.1.1.1" >"$ORIG_RESOLV"
+  fi
+  if ! command -v curl >/dev/null 2>&1; then
+    echo "[INFO] curl not found, installing..."
+    apt-get update -qq
+    apt-get install -y curl >/dev/null
+  fi
+  # Ensure keyrings directory exists
+  mkdir -p /usr/share/keyrings
+  curl -fsSL "https://pkgs.tailscale.com/stable/${ID}/${VERSION_CODENAME}.noarmor.gpg" \
+    | tee /usr/share/keyrings/tailscale-archive-keyring.gpg >/dev/null
+  echo "deb [signed-by=/usr/share/keyrings/tailscale-archive-keyring.gpg] https://pkgs.tailscale.com/stable/${ID} ${VERSION_CODENAME} main" \
+    >/etc/apt/sources.list.d/tailscale.list
+  apt-get update -qq
+  apt-get install -y tailscale >/dev/null
+  if [ -f /tmp/resolv.conf.backup ]; then
+    echo "[INFO] Restoring original /etc/resolv.conf"
+    mv /tmp/resolv.conf.backup /etc/resolv.conf
+  fi
+fi
 '
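The cascade of `command -v` checks in the Debian/Ubuntu branch above exists because minimal containers often ship without `dig`; the script probes for whichever lookup tool happens to be present before deciding whether DNS actually works. The probe order can be sketched on its own (tool list as in the diff; the variable name is illustrative):

```shell
# Pick the first available DNS lookup tool; fall back to "none" so the
# caller can decide to skip the check and just try curl directly.
resolver_tool="none"
for t in dig host nslookup getent; do
  if command -v "$t" >/dev/null 2>&1; then
    resolver_tool="$t"
    break
  fi
done
echo "DNS checks will use: $resolver_tool"
```

Note the diff also replaces bash-only `&>/dev/null` with POSIX `>/dev/null 2>&1`, which is what makes the `pct exec ... sh -c` switch safe on Alpine's busybox shell.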


@@ -288,8 +288,8 @@ function default_settings() {
 echo -e "${DGN}Using Hostname: ${BGN}${HN}${CL}"
 echo -e "${DGN}Allocated Cores: ${BGN}${CORE_COUNT}${CL}"
 echo -e "${DGN}Allocated RAM: ${BGN}${RAM_SIZE}${CL}"
-if ! grep -q "^iface ${BRG}" /etc/network/interfaces; then
-  msg_error "Bridge '${BRG}' does not exist in /etc/network/interfaces"
+if ! ip link show "${BRG}" &>/dev/null; then
+  msg_error "Bridge '${BRG}' does not exist"
   exit
 else
   echo -e "${DGN}Using LAN Bridge: ${BGN}${BRG}${CL}"
@@ -305,8 +305,8 @@ function default_settings() {
 if [ "$NETWORK_MODE" = "dual" ]; then
   echo -e "${DGN}Network Mode: ${BGN}Dual Interface (Firewall)${CL}"
   echo -e "${DGN}Using WAN MAC Address: ${BGN}${WAN_MAC}${CL}"
-  if ! grep -q "^iface ${WAN_BRG}" /etc/network/interfaces; then
-    msg_error "Bridge '${WAN_BRG}' does not exist in /etc/network/interfaces"
+  if ! ip link show "${WAN_BRG}" &>/dev/null; then
+    msg_error "Bridge '${WAN_BRG}' does not exist"
     exit
   else
     echo -e "${DGN}Using WAN Bridge: ${BGN}${WAN_BRG}${CL}"
@@ -424,8 +424,8 @@ function advanced_settings() {
 if [ -z $BRG ]; then
   BRG="vmbr0"
 fi
-if ! grep -q "^iface ${BRG}" /etc/network/interfaces; then
-  msg_error "Bridge '${BRG}' does not exist in /etc/network/interfaces"
+if ! ip link show "${BRG}" &>/dev/null; then
+  msg_error "Bridge '${BRG}' does not exist"
   exit
 fi
 echo -e "${DGN}Using LAN Bridge: ${BGN}$BRG${CL}"
@@ -474,8 +474,8 @@ function advanced_settings() {
 if [ -z $WAN_BRG ]; then
   WAN_BRG="vmbr1"
 fi
-if ! grep -q "^iface ${WAN_BRG}" /etc/network/interfaces; then
-  msg_error "WAN Bridge '${WAN_BRG}' does not exist in /etc/network/interfaces"
+if ! ip link show "${WAN_BRG}" &>/dev/null; then
+  msg_error "WAN Bridge '${WAN_BRG}' does not exist"
   exit
 fi
 echo -e "${DGN}Using WAN Bridge: ${BGN}$WAN_BRG${CL}"
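The four hunks above replace a grep of `/etc/network/interfaces` (which misses bridges created outside that file, e.g. via the Proxmox API or SDN) with a live query of the kernel's link table. The check reduces to one helper (name illustrative; `lo` is used below only because every Linux host has it):

```shell
# An interface exists iff the kernel knows the link; works for any link
# type, not just bridges declared in /etc/network/interfaces.
bridge_exists() {
  ip link show "$1" >/dev/null 2>&1
}

if bridge_exists lo; then
  echo "loopback link present"
fi
```

Querying `ip link` also means renamed or hot-created bridges are detected without re-parsing config files.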