Compare commits

...

84 Commits

Author SHA1 Message Date
CanbiZ (MickLesk)
44ec223d20 update ifupdown2 source 2026-03-11 14:37:43 +01:00
CanbiZ (MickLesk)
4ab3a24d03 Merge branch 'main' into arm64-build-support 2026-03-11 14:27:14 +01:00
CanbiZ (MickLesk)
231945dfa7 remove arm64 overlay 2026-03-11 14:26:03 +01:00
CanbiZ (MickLesk)
7c051fb648 Improve arm64 support and arch-specific downloads
Add clearer architecture error messages, gate arm64 usage, and implement architecture-aware behavior across the toolkit:
- Update exit-code messages to reference amd64/arm64 and refuse arm64 unless explicitly enabled.
- Show the architecture in summaries and use arch-specific package lists when installing in containers.
- Make FFmpeg and yq downloads choose the correct amd64/arm64 binaries.
- Tighten template download error handling and formatting; clean up minor whitespace/comment issues.
These changes aim to make arm64 handling explicit and downloads/installations more robust for non-amd64 systems.
2026-03-11 14:23:56 +01:00
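The arch-specific download behavior this commit describes can be sketched as follows. This is a minimal illustration, not the actual build.func logic: the `arch_suffix` helper name and the yq asset URL pattern are assumptions for the example.

```shell
#!/usr/bin/env bash
# Map a Debian architecture (or uname -m value) to the suffix commonly
# used in release asset names. Only amd64/arm64 are accepted, mirroring
# the commit's "refuse other architectures" behavior.
arch_suffix() {
  case "$1" in
    amd64 | x86_64) echo "amd64" ;;
    arm64 | aarch64) echo "arm64" ;;
    *)
      echo "unsupported architecture: $1 (only amd64/arm64 are handled)" >&2
      return 1
      ;;
  esac
}

# Example: pick an arch-appropriate yq binary for the current system.
ARCH="$(dpkg --print-architecture 2>/dev/null || uname -m)"
SUFFIX="$(arch_suffix "$ARCH")" || exit 1
URL="https://github.com/mikefarah/yq/releases/latest/download/yq_linux_${SUFFIX}"
echo "would download: $URL"
```

On an unsupported architecture the helper fails loudly instead of downloading a binary that cannot run, which is the point of gating arm64 explicitly.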
Michel Roegl-Brunner
968e96e2c7 Delete removed scripts from pocketbase 2026-03-11 13:37:35 +01:00
community-scripts-pr-app[bot]
02b5c7f7a8 chore: update github-versions.json (#12772)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-11 12:12:49 +00:00
Sam Heinz
6f8aa6eadc build.func: remove arm64 support for focal, bullseye
Legacy support removed since no cts use them anymore.
2026-03-11 21:17:38 +10:00
community-scripts-pr-app[bot]
f1f77e5283 Update CHANGELOG.md (#12769)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-11 11:03:53 +00:00
Michel Roegl-Brunner
1e726852df Tracearr: Increase default disk variable from 5 to 10 (#12762)
* Increase default disk variable from 5 to 10

* Increase HDD resource allocation from 5 to 10
2026-03-11 12:03:26 +01:00
Sam Heinz
35b3b93ca6 build.func: change back to VE 2026-03-11 20:59:33 +10:00
community-scripts-pr-app[bot]
ba85fad318 Update CHANGELOG.md (#12768)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-11 10:43:43 +00:00
Markus Zellner
6ae5eefdf5 Add -y flag to wgd.sh update command (#12767)
Otherwise the script waits for input, which looks like a hang when run without verbose mode
2026-03-11 11:43:18 +01:00
Michel Roegl-Brunner
b6805bb845 HotFix variable assignment syntax in coder-code-server.sh 2026-03-11 09:29:22 +01:00
community-scripts-pr-app[bot]
3117145e6c Update CHANGELOG.md (#12759)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-11 07:44:05 +00:00
Michel Roegl-Brunner
c144915a74 Coder-Code-Server: Check if config file exists (#12758)
* Check if config file exists

* Apply suggestion from @tremor021

---------

Co-authored-by: Slaviša Arežina <58952836+tremor021@users.noreply.github.com>
2026-03-11 08:43:42 +01:00
Michel Roegl-Brunner
ad1e207ee1 Add GitHub Actions workflow for Pages redirect 2026-03-11 08:37:46 +01:00
community-scripts-pr-app[bot]
f9cb07ee4c chore: update github-versions.json (#12756)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-11 06:17:43 +00:00
Sam Heinz
78979189c1 Update misc/build.func
Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-03-11 10:19:58 +10:00
community-scripts-pr-app[bot]
f5ec5b0e47 Update CHANGELOG.md (#12754)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-11 00:19:03 +00:00
community-scripts-pr-app[bot]
ba8b85972e chore: update github-versions.json (#12753)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-11 00:18:34 +00:00
Sam Heinz
a6a83b9541 Update misc/build.func
Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-03-11 09:59:42 +10:00
Sam Heinz
e4db6be257 Update misc/build.func
Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-03-11 09:56:34 +10:00
Sam Heinz
b9d401b178 Update misc/build.func
Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-03-11 09:56:18 +10:00
community-scripts-pr-app[bot]
766f678321 chore: update github-versions.json (#12749)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-10 18:17:20 +00:00
CanbiZ (MickLesk)
c69c4afd25 update rybbit link 2026-03-10 16:56:47 +01:00
Michel Roegl-Brunner
516d8d7a0f toggle is_dev to false when a new script gets merged 2026-03-10 16:07:51 +01:00
Michel Roegl-Brunner
a0c93900e9 Remove unnecessary blank line in 2fauth.sh 2026-03-10 15:45:15 +01:00
Michel Roegl-Brunner
a11f282a43 Update workflow 2026-03-10 15:44:28 +01:00
Michel Roegl-Brunner
cac6b4ec59 Check workflow to update date in Pocketbase 2026-03-10 15:39:07 +01:00
Michel Roegl-Brunner
eba96a55d2 New workflow to update last updated field in Database when a script gets changed, also adds last_update_link to link to the latest changes 2026-03-10 15:33:15 +01:00
Michel Roegl-Brunner
37f4585110 New workflow to update last updated field in Database when a script gets changed, also adds last_update_link to link to the latest changes 2026-03-10 15:33:15 +01:00
community-scripts-pr-app[bot]
e6931434b0 Update CHANGELOG.md (#12745)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-10 14:23:26 +00:00
Chris
eec763bed0 [Fix] Immich: Pin libvips to 8.17.3 (#12744) 2026-03-10 15:22:48 +01:00
community-scripts-pr-app[bot]
fa62363628 chore: update github-versions.json (#12743)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-10 12:12:48 +00:00
community-scripts-pr-app[bot]
0bbb5a1c74 chore: update github-versions.json (#12741)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-10 06:15:50 +00:00
community-scripts-pr-app[bot]
064e440d00 Update CHANGELOG.md (#12737)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-10 00:19:06 +00:00
community-scripts-pr-app[bot]
129b85a8cf chore: update github-versions.json (#12736)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-10 00:18:44 +00:00
community-scripts-pr-app[bot]
586154d4e1 Update CHANGELOG.md (#12734)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 22:27:24 +00:00
Bram
b819231a01 feat: add CopycatWarningToast component for user warnings (#12733)
Introduced a new CopycatWarningToast component that displays a warning about copycat sites. The toast appears at the top-center of the screen and can be dismissed, with the dismissal state stored in local storage to prevent reappearing. Integrated the component into the RootLayout for global visibility.
2026-03-09 23:27:01 +01:00
community-scripts-pr-app[bot]
3ec9eba736 chore: update github-versions.json (#12732)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 18:17:57 +00:00
community-scripts-pr-app[bot]
75d4bc2b61 Update CHANGELOG.md (#12731)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 17:59:58 +00:00
Chris
a5ac56be7a [Hotfix] qBittorrent: Disable UPnP port forwarding by default (#12728) 2026-03-09 18:59:35 +01:00
community-scripts-pr-app[bot]
fe46d8c22d Update CHANGELOG.md (#12730)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 17:39:24 +00:00
Chris
93cbd51d5b [Quickfix] Opencloud: ensure correct case for binary (#12729) 2026-03-09 18:38:58 +01:00
CanbiZ (MickLesk)
0b99873194 Add dependency check for zstd before backup
Ensure zstd dependency is installed before backup.
2026-03-09 18:36:28 +01:00
community-scripts-pr-app[bot]
8113c7da22 Update CHANGELOG.md (#12726)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 16:06:44 +00:00
CanbiZ (MickLesk)
85023dab51 Omada: Bump libssl (#12724) 2026-03-09 17:06:07 +01:00
community-scripts-pr-app[bot]
f8ab3dc4b9 Update CHANGELOG.md (#12722)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 14:23:41 +00:00
Chris
fedabe4889 Pin Opencloud to 5.2.0 (#12721) 2026-03-09 15:23:07 +01:00
community-scripts-pr-app[bot]
eba19d8e42 Update CHANGELOG.md (#12718)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 12:52:21 +00:00
CanbiZ (MickLesk)
e180a3bc44 openwebui: Ensure required dependencies (#12717)
* openwebui: Ensure required dependencies

Added zstd and build-essential as dependencies for the script.

* Update dependencies in openwebui-install.sh

Added build-essential and libmariadb-dev to dependencies.
2026-03-09 13:51:59 +01:00
community-scripts-pr-app[bot]
a1a465708f Update CHANGELOG.md (#12716)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 12:24:37 +00:00
CanbiZ (MickLesk)
346d6c6a0a feat: improve zigbee2mqtt backup handler (#12714)
- Name backups by installed version (e.g. Zigbee2MQTT_backup_2.5.1.tar.zst)
- Use zstd compression instead of gzip
- Keep last 5 backups instead of deleting all previous ones
2026-03-09 13:24:09 +01:00
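The retention policy this commit describes (version-named zstd archives, keep the newest five) can be sketched in isolation. The directory, file names, and versions below are illustrative stand-ins; the real script archives the install directory with `tar --zstd` and names the file after the installed version.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in backup directory; the real script uses a path on the container.
BACKUP_DIR="$(mktemp -d)"

# Simulate seven existing version-named backups with distinct mtimes.
day=1
for v in 2.1.0 2.2.0 2.3.0 2.4.0 2.5.0 2.5.1 2.6.0; do
  touch -d "2026-03-0${day}" "$BACKUP_DIR/Zigbee2MQTT_backup_${v}.tar.zst"
  day=$((day + 1))
done

# Keep only the five newest backups instead of deleting all previous ones:
# ls -t sorts newest first, tail -n +6 selects everything past the fifth.
ls -t "$BACKUP_DIR"/Zigbee2MQTT_backup_*.tar.zst | tail -n +6 | xargs -r rm --

ls "$BACKUP_DIR"
```

After the run, the two oldest archives (2.1.0 and 2.2.0 in this simulation) are gone and five remain.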
community-scripts-pr-app[bot]
c76813cbcb chore: update github-versions.json (#12715)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 12:12:25 +00:00
CanbiZ (MickLesk)
e7f551dab6 fix(hwaccel): install ROCm runtime only, reduce disk resize to +4GB
The full 'rocm' meta-package includes 15GB+ of dev tools (compilers,
debuggers, dev headers) which are unnecessary in LXC containers.
Install only runtime packages: rocm-opencl-runtime, rocm-hip-runtime,
rocm-smi-lib. Reduce disk resize from +8GB to +4GB accordingly.
2026-03-09 11:14:26 +01:00
CanbiZ (MickLesk)
d8b2a37228 fix(build): auto-resize disk +8GB when AMD GPU detected for ROCm 2026-03-09 10:18:23 +01:00
community-scripts-pr-app[bot]
af3950fafc Update CHANGELOG.md (#12710)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 09:03:30 +00:00
CanbiZ (MickLesk)
b20bf9c658 tools: add Alpine (apk) support to ensure_dependencies and is_package_installed (#12703) 2026-03-09 10:03:04 +01:00
CanbiZ (MickLesk)
8c5e340ad0 fix(hwaccel): use amdgpu/latest/ubuntu instead of versioned URL 2026-03-09 09:57:28 +01:00
community-scripts-pr-app[bot]
f3cd063816 Update CHANGELOG.md (#12709)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 08:48:16 +00:00
CanbiZ (MickLesk)
5adfa3cb45 Frigate: try an OpenVino model build fallback (#12704)
Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-03-09 09:47:50 +01:00
community-scripts-pr-app[bot]
3398fe9361 Update CHANGELOG.md (#12708)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 08:37:47 +00:00
CanbiZ (MickLesk)
2afc25d51f tools.func: extend hwaccel with ROCm (#12707) 2026-03-09 09:37:26 +01:00
CanbiZ (MickLesk)
8c5d5c6679 Downgrade Node.js version from 24 to 22 2026-03-09 09:13:45 +01:00
community-scripts-pr-app[bot]
fca66c5b56 Update CHANGELOG.md (#12706)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 08:05:40 +00:00
CanbiZ (MickLesk)
d38ca1a7fc Reactive Resume: rewrite for v5 using original repo amruthpilla/reactive-resume (#12705)
* fix(reactive-resume): rewrite for v5 using original repo amruthpillai/reactive-resume

Replaces lazy-media fork with original upstream repo (amruthpillai/reactive-resume).
Rewrites install script for v5 architecture:
- TanStack Start / Drizzle ORM single-package build
- Headless Chromium for PDF generation (replaces Browserless)
- Local filesystem storage (removes MinIO dependency)
- Node 24 + pnpm
- Runtime: node .output/server/index.mjs

Fixes #12672, Fixes #11651

* add clean_install
2026-03-09 09:05:14 +01:00
community-scripts-pr-app[bot]
b3bedd720f Update CHANGELOG.md (#12702)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 07:06:16 +00:00
Nícolas Pastorello
b0858920d5 Change cronjob setup to use www-data user (#12695) 2026-03-09 08:05:50 +01:00
community-scripts-pr-app[bot]
e4e365b701 Update CHANGELOG.md (#12701)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 06:35:35 +00:00
Slaviša Arežina
c4315713b5 Fix check_for_gh_release function call (#12694) 2026-03-09 07:35:11 +01:00
community-scripts-pr-app[bot]
047ea2c66d chore: update github-versions.json (#12700)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 06:22:52 +00:00
community-scripts-pr-app[bot]
5e6eb400b5 Update CHANGELOG.md (#12697)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 00:21:57 +00:00
community-scripts-pr-app[bot]
db7880cab5 chore: update github-versions.json (#12696)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-09 00:21:34 +00:00
community-scripts-pr-app[bot]
3909095a2c Update CHANGELOG.md (#12693)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-08 23:05:51 +00:00
Chris
6685b88695 [Fix] Immich: chown install dir before machine-learning update (#12684) 2026-03-09 00:05:29 +01:00
community-scripts-pr-app[bot]
6076a7ecc7 Update CHANGELOG.md (#12692)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-08 23:00:50 +00:00
Chris
cc351a4817 [Fix] Scanopy: Build generate-fixtures (#12686) 2026-03-09 00:00:30 +01:00
community-scripts-pr-app[bot]
9217a0fb79 chore: update github-versions.json (#12688)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-08 18:07:32 +00:00
community-scripts-pr-app[bot]
5abaa2e7e3 Update CHANGELOG.md (#12685)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-08 15:13:19 +00:00
Slaviša Arežina
c3b8285584 Change slug from 'lxc-execute' to 'execute' (#12681) 2026-03-08 16:12:53 +01:00
community-scripts-pr-app[bot]
bf2667827b Update CHANGELOG.md (#12683)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-08 14:09:52 +00:00
Tobias
3b7283a13f fix: rustdeskserver: use correct repo string (#12682) 2026-03-08 15:09:28 +01:00
Sam Heinz
6f62656c96 Replace tools.func call that checks arch 2026-03-07 15:55:21 +10:00
Sam Heinz
087f817bf6 misc scripts: add support for arm64 2026-03-07 15:44:44 +10:00
36 changed files with 1399 additions and 583 deletions


@@ -6,14 +6,14 @@ on:
jobs:
close_issue:
if: github.event.pull_request.merged == true && github.repository == 'community-scripts/ProxmoxVE'
- runs-on: ubuntu-latest
+ runs-on: self-hosted
steps:
- - name: Checkout target repo (main)
+ - name: Checkout target repo (merge commit)
uses: actions/checkout@v4
with:
repository: community-scripts/ProxmoxVE
- ref: main
+ ref: ${{ github.event.pull_request.merge_commit_sha }}
token: ${{ secrets.GITHUB_TOKEN }}
- name: Extract and Process PR Title
@@ -23,6 +23,39 @@ jobs:
echo "Processed Title: $title"
echo "title=$title" >> $GITHUB_ENV
- name: Get slugs from merged PR
id: get_slugs
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
pr_files=$(gh pr view ${{ github.event.pull_request.number }} --repo community-scripts/ProxmoxVE --json files -q '.files[].path' 2>/dev/null || true)
slugs=""
for path in $pr_files; do
[[ -f "$path" ]] || continue
if [[ "$path" == frontend/public/json/*.json ]]; then
s=$(jq -r '.slug // empty' "$path" 2>/dev/null)
[[ -n "$s" ]] && slugs="$slugs $s"
elif [[ "$path" == ct/*.sh ]] || [[ "$path" == install/*.sh ]] || [[ "$path" == tools/*.sh ]] || [[ "$path" == turnkey/*.sh ]] || [[ "$path" == vm/*.sh ]]; then
base=$(basename "$path" .sh)
if [[ "$path" == install/* && "$base" == *-install ]]; then
s="${base%-install}"
else
s="$base"
fi
[[ -n "$s" ]] && slugs="$slugs $s"
fi
done
slugs=$(echo $slugs | xargs -n1 | sort -u | tr '\n' ' ')
if [[ -z "$slugs" && -n "$title" ]]; then
slugs="$title"
fi
if [[ -z "$slugs" ]]; then
echo "count=0" >> "$GITHUB_OUTPUT"
exit 0
fi
echo "$slugs" > pocketbase_slugs.txt
echo "count=$(echo $slugs | wc -w)" >> "$GITHUB_OUTPUT"
- name: Search for Issues with Similar Titles
id: find_issue
env:
@@ -63,3 +96,104 @@ jobs:
run: |
gh issue comment $issue_number --repo community-scripts/ProxmoxVED --body "Merged with #${{ github.event.pull_request.number }} in ProxmoxVE"
gh issue close $issue_number --repo community-scripts/ProxmoxVED
- name: Set is_dev to false in PocketBase
if: steps.get_slugs.outputs.count != '0'
env:
POCKETBASE_URL: ${{ secrets.POCKETBASE_URL }}
POCKETBASE_COLLECTION: ${{ secrets.POCKETBASE_COLLECTION }}
POCKETBASE_ADMIN_EMAIL: ${{ secrets.POCKETBASE_ADMIN_EMAIL }}
POCKETBASE_ADMIN_PASSWORD: ${{ secrets.POCKETBASE_ADMIN_PASSWORD }}
PR_URL: ${{ github.server_url }}/${{ github.repository }}/pull/${{ github.event.pull_request.number }}
run: |
node << 'ENDSCRIPT'
(async function() {
const fs = require('fs');
const https = require('https');
const http = require('http');
const url = require('url');
function request(fullUrl, opts) {
return new Promise(function(resolve, reject) {
const u = url.parse(fullUrl);
const isHttps = u.protocol === 'https:';
const body = opts.body;
const options = {
hostname: u.hostname,
port: u.port || (isHttps ? 443 : 80),
path: u.path,
method: opts.method || 'GET',
headers: opts.headers || {}
};
if (body) options.headers['Content-Length'] = Buffer.byteLength(body);
const lib = isHttps ? https : http;
const req = lib.request(options, function(res) {
let data = '';
res.on('data', function(chunk) { data += chunk; });
res.on('end', function() {
resolve({ ok: res.statusCode >= 200 && res.statusCode < 300, statusCode: res.statusCode, body: data });
});
});
req.on('error', reject);
if (body) req.write(body);
req.end();
});
}
const raw = process.env.POCKETBASE_URL.replace(/\/$/, '');
const apiBase = /\/api$/i.test(raw) ? raw : raw + '/api';
const coll = process.env.POCKETBASE_COLLECTION;
const slugsText = fs.readFileSync('pocketbase_slugs.txt', 'utf8').trim();
const slugs = slugsText ? slugsText.split(/\s+/).filter(Boolean) : [];
if (slugs.length === 0) {
console.log('No slugs to update.');
return;
}
const authUrl = apiBase + '/collections/users/auth-with-password';
const authBody = JSON.stringify({
identity: process.env.POCKETBASE_ADMIN_EMAIL,
password: process.env.POCKETBASE_ADMIN_PASSWORD
});
const authRes = await request(authUrl, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: authBody
});
if (!authRes.ok) {
throw new Error('Auth failed: ' + authRes.body);
}
const token = JSON.parse(authRes.body).token;
const recordsUrl = apiBase + '/collections/' + encodeURIComponent(coll) + '/records';
const prUrl = process.env.PR_URL || '';
for (const slug of slugs) {
const filter = "(slug='" + slug.replace(/'/g, "''") + "')";
const listRes = await request(recordsUrl + '?filter=' + encodeURIComponent(filter) + '&perPage=1', {
headers: { 'Authorization': token }
});
const list = JSON.parse(listRes.body);
const record = list.items && list.items[0];
if (!record) {
console.log('Slug not in DB, skipping: ' + slug);
continue;
}
const patchRes = await request(recordsUrl + '/' + record.id, {
method: 'PATCH',
headers: { 'Authorization': token, 'Content-Type': 'application/json' },
body: JSON.stringify({
name: record.name || record.slug,
last_update_commit: prUrl,
is_dev: false
})
});
if (!patchRes.ok) {
console.warn('PATCH failed for slug ' + slug + ': ' + patchRes.body);
continue;
}
console.log('Set is_dev=false for slug: ' + slug);
}
console.log('Done.');
})().catch(e => { console.error(e); process.exit(1); });
ENDSCRIPT
shell: bash
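The slug-derivation rules in the "Get slugs from merged PR" step above can be exercised in isolation. This standalone helper mirrors the workflow's logic for script paths (basename minus `.sh`, with the `-install` suffix stripped only under `install/`):

```shell
# Derive a script slug from a changed file path, mirroring the workflow:
#   install/foo-install.sh -> foo
#   ct/foo.sh              -> foo
#   other trees            -> basename without .sh
slug_for() {
  local path="$1" base
  base="${path##*/}"
  base="${base%.sh}"
  if [[ "$path" == install/* && "$base" == *-install ]]; then
    echo "${base%-install}"
  else
    echo "$base"
  fi
}

slug_for ct/pocketbase.sh               # -> pocketbase
slug_for install/pocketbase-install.sh  # -> pocketbase
slug_for tools/clean-lxcs.sh            # -> clean-lxcs
```

Note that the `-install` strip is gated on the `install/` prefix, so a hypothetical `ct/foo-install.sh` would keep its full basename as the slug.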


@@ -0,0 +1,150 @@
name: Delete PocketBase entry on script/JSON removal
on:
push:
branches:
- main
paths:
- "frontend/public/json/**"
- "vm/**"
- "tools/**"
- "turnkey/**"
- "ct/**"
- "install/**"
jobs:
delete-pocketbase-entry:
runs-on: self-hosted
steps:
- name: Checkout Repository
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Get slugs from deleted JSON and script files
id: slugs
run: |
BEFORE="${{ github.event.before }}"
AFTER="${{ github.event.after }}"
slugs=""
# Deleted JSON files: get slug from previous commit
deleted_json=$(git diff --name-only --diff-filter=D "$BEFORE" "$AFTER" -- frontend/public/json/ | grep '\.json$' || true)
for f in $deleted_json; do
[[ -z "$f" ]] && continue
s=$(git show "$BEFORE:$f" 2>/dev/null | jq -r '.slug // empty' 2>/dev/null || true)
[[ -n "$s" ]] && slugs="$slugs $s"
done
# Deleted script files: derive slug from path
deleted_sh=$(git diff --name-only --diff-filter=D "$BEFORE" "$AFTER" -- ct/ install/ tools/ turnkey/ vm/ | grep '\.sh$' || true)
for f in $deleted_sh; do
[[ -z "$f" ]] && continue
base="${f##*/}"
base="${base%.sh}"
if [[ "$f" == install/* && "$base" == *-install ]]; then
s="${base%-install}"
else
s="$base"
fi
[[ -n "$s" ]] && slugs="$slugs $s"
done
slugs=$(echo $slugs | xargs -n1 | sort -u | tr '\n' ' ')
if [[ -z "$slugs" ]]; then
echo "No deleted JSON or script files to remove from PocketBase."
echo "count=0" >> "$GITHUB_OUTPUT"
exit 0
fi
echo "$slugs" > slugs_to_delete.txt
echo "count=$(echo $slugs | wc -w)" >> "$GITHUB_OUTPUT"
echo "Slugs to delete: $slugs"
- name: Delete from PocketBase
if: steps.slugs.outputs.count != '0'
env:
POCKETBASE_URL: ${{ secrets.POCKETBASE_URL }}
POCKETBASE_COLLECTION: ${{ secrets.POCKETBASE_COLLECTION }}
POCKETBASE_ADMIN_EMAIL: ${{ secrets.POCKETBASE_ADMIN_EMAIL }}
POCKETBASE_ADMIN_PASSWORD: ${{ secrets.POCKETBASE_ADMIN_PASSWORD }}
run: |
node << 'ENDSCRIPT'
(async function() {
const fs = require('fs');
const https = require('https');
const http = require('http');
const url = require('url');
function request(fullUrl, opts) {
return new Promise(function(resolve, reject) {
const u = url.parse(fullUrl);
const isHttps = u.protocol === 'https:';
const body = opts.body;
const options = {
hostname: u.hostname,
port: u.port || (isHttps ? 443 : 80),
path: u.path,
method: opts.method || 'GET',
headers: opts.headers || {}
};
if (body) options.headers['Content-Length'] = Buffer.byteLength(body);
const lib = isHttps ? https : http;
const req = lib.request(options, function(res) {
let data = '';
res.on('data', function(chunk) { data += chunk; });
res.on('end', function() {
resolve({ ok: res.statusCode >= 200 && res.statusCode < 300, statusCode: res.statusCode, body: data });
});
});
req.on('error', reject);
if (body) req.write(body);
req.end();
});
}
const raw = process.env.POCKETBASE_URL.replace(/\/$/, '');
const apiBase = /\/api$/i.test(raw) ? raw : raw + '/api';
const coll = process.env.POCKETBASE_COLLECTION;
const slugs = fs.readFileSync('slugs_to_delete.txt', 'utf8').trim().split(/\s+/).filter(Boolean);
const authUrl = apiBase + '/collections/users/auth-with-password';
const authBody = JSON.stringify({
identity: process.env.POCKETBASE_ADMIN_EMAIL,
password: process.env.POCKETBASE_ADMIN_PASSWORD
});
const authRes = await request(authUrl, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: authBody
});
if (!authRes.ok) {
throw new Error('Auth failed. Response: ' + authRes.body);
}
const token = JSON.parse(authRes.body).token;
const recordsUrl = apiBase + '/collections/' + encodeURIComponent(coll) + '/records';
for (const slug of slugs) {
const filter = "(slug='" + slug + "')";
const listRes = await request(recordsUrl + '?filter=' + encodeURIComponent(filter) + '&perPage=1', {
headers: { 'Authorization': token }
});
const list = JSON.parse(listRes.body);
const existingId = list.items && list.items[0] && list.items[0].id;
if (!existingId) {
console.log('No PocketBase record for slug "' + slug + '", skipping.');
continue;
}
const delRes = await request(recordsUrl + '/' + existingId, {
method: 'DELETE',
headers: { 'Authorization': token }
});
if (delRes.ok) {
console.log('Deleted PocketBase record for slug "' + slug + '" (id=' + existingId + ').');
} else {
console.warn('DELETE failed for slug "' + slug + '": ' + delRes.statusCode + ' ' + delRes.body);
}
}
console.log('Done.');
})().catch(e => { console.error(e); process.exit(1); });
ENDSCRIPT
shell: bash
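The deletion workflow above hinges on `git diff --diff-filter=D` between the push's before/after commits, plus `git show BEFORE:path` to read a file that no longer exists at HEAD. A self-contained sketch in a throwaway repository (identity and file contents are made up for the demo; the workflow pipes the blob into `jq` rather than printing it):

```shell
set -euo pipefail
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email "ci@example.com"   # throwaway identity for the demo
git config user.name "ci"

mkdir -p frontend/public/json
printf '{"slug": "demo"}\n' > frontend/public/json/demo.json
git add -A
git commit -qm "add demo json"
BEFORE="$(git rev-parse HEAD)"

git rm -q frontend/public/json/demo.json
git commit -qm "remove demo json"
AFTER="$(git rev-parse HEAD)"

# Files deleted between BEFORE and AFTER — what the workflow iterates over.
git diff --name-only --diff-filter=D "$BEFORE" "$AFTER" -- frontend/public/json/

# Recover the slug from the pre-deletion blob, as the workflow does.
git show "$BEFORE:frontend/public/json/demo.json"
```

Reading the blob at `$BEFORE` is what lets the workflow map a deleted JSON file back to its PocketBase slug even though the file is gone from the working tree.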

.github/workflows/trigger_github_pages_redirect.yml

@@ -0,0 +1,41 @@
name: Pages Redirect
on:
workflow_dispatch:
permissions:
pages: write
id-token: write
contents: read
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Create redirect page
run: |
mkdir site
cat <<EOF > site/index.html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="refresh" content="0; url=https://community-scripts.org/">
<link rel="canonical" href="https://community-scripts.org/">
<title>Redirecting...</title>
</head>
<body>
Redirecting...
</body>
</html>
EOF
- uses: actions/upload-pages-artifact@v3
with:
path: site
- name: Deploy
uses: actions/deploy-pages@v4


@@ -0,0 +1,167 @@
name: Update script timestamp on .sh changes
on:
push:
branches:
- main
paths:
- "ct/**/*.sh"
- "install/**/*.sh"
- "tools/**/*.sh"
- "turnkey/**/*.sh"
- "vm/**/*.sh"
jobs:
update-script-timestamp:
runs-on: self-hosted
steps:
- name: Checkout Repository
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Get changed .sh files and derive slugs
id: slugs
run: |
changed=$(git diff --name-only "${{ github.event.before }}" "${{ github.event.after }}" -- ct/ install/ tools/ turnkey/ vm/ | grep '\.sh$' || true)
if [[ -z "$changed" ]]; then
echo "No .sh files changed in ct/, install/, tools/, turnkey/, or vm/."
echo "count=0" >> "$GITHUB_OUTPUT"
exit 0
fi
declare -A seen
slugs=""
for f in $changed; do
[[ -f "$f" ]] || continue
base="${f##*/}"
base="${base%.sh}"
if [[ "$f" == install/* && "$base" == *-install ]]; then
slug="${base%-install}"
else
slug="$base"
fi
if [[ -z "${seen[$slug]:-}" ]]; then
seen[$slug]=1
slugs="$slugs $slug"
fi
done
slugs=$(echo $slugs | xargs -n1 | sort -u)
if [[ -z "$slugs" ]]; then
echo "No slugs to update."
echo "count=0" >> "$GITHUB_OUTPUT"
exit 0
fi
echo "$slugs" > changed_slugs.txt
echo "count=$(echo "$slugs" | wc -w)" >> "$GITHUB_OUTPUT"
- name: Parse PR number from merge commit
id: pr
run: |
re='#([0-9]+)'
if [[ "$COMMIT_MSG" =~ $re ]]; then
echo "number=${BASH_REMATCH[1]}" >> "$GITHUB_OUTPUT"
else
echo "number=" >> "$GITHUB_OUTPUT"
fi
env:
COMMIT_MSG: ${{ github.event.head_commit.message }}
- name: Update script timestamps in PocketBase
if: steps.slugs.outputs.count != '0'
env:
POCKETBASE_URL: ${{ secrets.POCKETBASE_URL }}
POCKETBASE_COLLECTION: ${{ secrets.POCKETBASE_COLLECTION }}
POCKETBASE_ADMIN_EMAIL: ${{ secrets.POCKETBASE_ADMIN_EMAIL }}
POCKETBASE_ADMIN_PASSWORD: ${{ secrets.POCKETBASE_ADMIN_PASSWORD }}
COMMIT_URL: ${{ github.server_url }}/${{ github.repository }}/commit/${{ github.sha }}
PR_URL: ${{ steps.pr.outputs.number != '' && format('{0}/{1}/pull/{2}', github.server_url, github.repository, steps.pr.outputs.number) || '' }}
run: |
node << 'ENDSCRIPT'
(async function() {
const fs = require('fs');
const https = require('https');
const http = require('http');
const url = require('url');
function request(fullUrl, opts) {
return new Promise(function(resolve, reject) {
const u = url.parse(fullUrl);
const isHttps = u.protocol === 'https:';
const body = opts.body;
const options = {
hostname: u.hostname,
port: u.port || (isHttps ? 443 : 80),
path: u.path,
method: opts.method || 'GET',
headers: opts.headers || {}
};
if (body) options.headers['Content-Length'] = Buffer.byteLength(body);
const lib = isHttps ? https : http;
const req = lib.request(options, function(res) {
let data = '';
res.on('data', function(chunk) { data += chunk; });
res.on('end', function() {
resolve({ ok: res.statusCode >= 200 && res.statusCode < 300, statusCode: res.statusCode, body: data });
});
});
req.on('error', reject);
if (body) req.write(body);
req.end();
});
}
const raw = process.env.POCKETBASE_URL.replace(/\/$/, '');
const apiBase = /\/api$/i.test(raw) ? raw : raw + '/api';
const coll = process.env.POCKETBASE_COLLECTION;
const slugsText = fs.readFileSync('changed_slugs.txt', 'utf8').trim();
const slugs = slugsText ? slugsText.split(/\s+/).filter(Boolean) : [];
if (slugs.length === 0) {
console.log('No slugs to update.');
return;
}
const authUrl = apiBase + '/collections/users/auth-with-password';
const authBody = JSON.stringify({
identity: process.env.POCKETBASE_ADMIN_EMAIL,
password: process.env.POCKETBASE_ADMIN_PASSWORD
});
const authRes = await request(authUrl, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: authBody
});
if (!authRes.ok) {
throw new Error('Auth failed: ' + authRes.body);
}
const token = JSON.parse(authRes.body).token;
const recordsUrl = apiBase + '/collections/' + encodeURIComponent(coll) + '/records';
for (const slug of slugs) {
const filter = "(slug='" + slug.replace(/'/g, "''") + "')";
const listRes = await request(recordsUrl + '?filter=' + encodeURIComponent(filter) + '&perPage=1', {
headers: { 'Authorization': token }
});
const list = JSON.parse(listRes.body);
const record = list.items && list.items[0];
if (!record) {
console.log('Slug not in DB, skipping: ' + slug);
continue;
}
const patchRes = await request(recordsUrl + '/' + record.id, {
method: 'PATCH',
headers: { 'Authorization': token, 'Content-Type': 'application/json' },
body: JSON.stringify({
name: record.name || record.slug,
last_update_commit: process.env.PR_URL || process.env.COMMIT_URL || ''
})
});
if (!patchRes.ok) {
console.warn('PATCH failed for slug ' + slug + ': ' + patchRes.body);
continue;
}
console.log('Updated timestamp for slug: ' + slug);
}
console.log('Done.');
})().catch(e => { console.error(e); process.exit(1); });
ENDSCRIPT
shell: bash
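The "Parse PR number from merge commit" step above extracts the first `#NNNN` reference from the head-commit message with a bash `[[ =~ ]]` regex. Wrapped in a helper function for illustration (the function name is an invention for this sketch):

```shell
# Extract the first "#<digits>" PR reference from a commit message,
# mirroring the workflow's regex; prints nothing when there is no match.
pr_number() {
  local re='#([0-9]+)'
  if [[ "$1" =~ $re ]]; then
    echo "${BASH_REMATCH[1]}"
  else
    echo ""
  fi
}

pr_number 'Update CHANGELOG.md (#12768)'   # -> 12768
pr_number 'direct push without a PR'       # -> (empty)
```

The empty-match case matters: when a commit lands on main without a squash-merge title, the workflow falls back from `PR_URL` to `COMMIT_URL` in the PocketBase PATCH body.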


@@ -420,14 +420,83 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
</details>
## 2026-03-11
### 🚀 Updated Scripts
- #### 🐞 Bug Fixes
- Tracearr: Increase default disk variable from 5 to 10 [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#12762](https://github.com/community-scripts/ProxmoxVE/pull/12762))
- Fix Wireguard Dashboard update [@odin568](https://github.com/odin568) ([#12767](https://github.com/community-scripts/ProxmoxVE/pull/12767))
### 🧰 Tools
- #### ✨ New Features
- Coder-Code-Server: Check if config file exists [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#12758](https://github.com/community-scripts/ProxmoxVE/pull/12758))
## 2026-03-10
### 🚀 Updated Scripts
- #### 🐞 Bug Fixes
- [Fix] Immich: Pin libvips to 8.17.3 [@vhsdream](https://github.com/vhsdream) ([#12744](https://github.com/community-scripts/ProxmoxVE/pull/12744))
## 2026-03-09
### 🚀 Updated Scripts
- Pin Opencloud to 5.2.0 [@vhsdream](https://github.com/vhsdream) ([#12721](https://github.com/community-scripts/ProxmoxVE/pull/12721))
- #### 🐞 Bug Fixes
- [Hotfix] qBittorrent: Disable UPnP port forwarding by default [@vhsdream](https://github.com/vhsdream) ([#12728](https://github.com/community-scripts/ProxmoxVE/pull/12728))
- [Quickfix] Opencloud: ensure correct case for binary [@vhsdream](https://github.com/vhsdream) ([#12729](https://github.com/community-scripts/ProxmoxVE/pull/12729))
- Omada: Bump libssl [@MickLesk](https://github.com/MickLesk) ([#12724](https://github.com/community-scripts/ProxmoxVE/pull/12724))
- openwebui: Ensure required dependencies [@MickLesk](https://github.com/MickLesk) ([#12717](https://github.com/community-scripts/ProxmoxVE/pull/12717))
- Frigate: try an OpenVino model build fallback [@MickLesk](https://github.com/MickLesk) ([#12704](https://github.com/community-scripts/ProxmoxVE/pull/12704))
- Change cronjob setup to use www-data user [@opastorello](https://github.com/opastorello) ([#12695](https://github.com/community-scripts/ProxmoxVE/pull/12695))
- RustDesk Server: Fix check_for_gh_release function call [@tremor021](https://github.com/tremor021) ([#12694](https://github.com/community-scripts/ProxmoxVE/pull/12694))
- #### ✨ New Features
- feat: improve zigbee2mqtt backup handler [@MickLesk](https://github.com/MickLesk) ([#12714](https://github.com/community-scripts/ProxmoxVE/pull/12714))
- #### 💥 Breaking Changes
- Reactive Resume: rewrite for v5 using original repo amruthpillai/reactive-resume [@MickLesk](https://github.com/MickLesk) ([#12705](https://github.com/community-scripts/ProxmoxVE/pull/12705))
### 💾 Core
- #### ✨ New Features
- tools: add Alpine (apk) support to ensure_dependencies and is_package_installed [@MickLesk](https://github.com/MickLesk) ([#12703](https://github.com/community-scripts/ProxmoxVE/pull/12703))
- tools.func: extend hwaccel with ROCm [@MickLesk](https://github.com/MickLesk) ([#12707](https://github.com/community-scripts/ProxmoxVE/pull/12707))
### 🌐 Website
- #### ✨ New Features
- feat: add CopycatWarningToast component for user warnings [@BramSuurdje](https://github.com/BramSuurdje) ([#12733](https://github.com/community-scripts/ProxmoxVE/pull/12733))
## 2026-03-08
### 🚀 Updated Scripts
- #### 🐞 Bug Fixes
- [Fix] Immich: chown install dir before machine-learning update [@vhsdream](https://github.com/vhsdream) ([#12684](https://github.com/community-scripts/ProxmoxVE/pull/12684))
- [Fix] Scanopy: Build generate-fixtures [@vhsdream](https://github.com/vhsdream) ([#12686](https://github.com/community-scripts/ProxmoxVE/pull/12686))
- fix: rustdeskserver: use correct repo string [@CrazyWolf13](https://github.com/CrazyWolf13) ([#12682](https://github.com/community-scripts/ProxmoxVE/pull/12682))
- NZBGet: Fixes for RAR5 handling [@tremor021](https://github.com/tremor021) ([#12675](https://github.com/community-scripts/ProxmoxVE/pull/12675))
### 🌐 Website
- #### 🐞 Bug Fixes
- LXC-Execute: Fix slug [@tremor021](https://github.com/tremor021) ([#12681](https://github.com/community-scripts/ProxmoxVE/pull/12681))
## 2026-03-07
### 🆕 New Scripts


@@ -213,7 +213,8 @@ EOF
msg_ok "Updated Immich server, web, cli and plugins"
cd "$SRC_DIR"/machine-learning
mkdir -p "$ML_DIR" && chown -R immich:immich "$ML_DIR"
mkdir -p "$ML_DIR"
chown -R immich:immich "$INSTALL_DIR"
chown immich:immich ./uv.lock
export VIRTUAL_ENV="${ML_DIR}"/ml-venv
if [[ -f ~/.openvino ]]; then
@@ -379,7 +380,7 @@ function compile_imagemagick() {
function compile_libvips() {
SOURCE=$SOURCE_DIR/libvips
: "${LIBVIPS_REVISION:=$(jq -cr '.revision' "$BASE_DIR"/server/sources/libvips.json)}"
LIBVIPS_REVISION="0c9151a4f416d2f8ae20a755db218f6637050eec"
if [[ "$LIBVIPS_REVISION" != "$(grep 'libvips' ~/.immich_library_revisions | awk '{print $2}')" ]]; then
msg_info "Recompiling libvips"
[[ -d "$SOURCE" ]] && rm -rf "$SOURCE"


@@ -29,7 +29,7 @@ function update_script() {
exit
fi
RELEASE="v5.1.0"
RELEASE="v5.2.0"
if check_for_gh_release "OpenCloud" "opencloud-eu/opencloud" "${RELEASE}"; then
msg_info "Stopping services"
systemctl stop opencloud opencloud-wopi
@@ -41,7 +41,9 @@ function update_script() {
ensure_dependencies "inotify-tools"
msg_ok "Updated packages"
rm -f /usr/bin/{OpenCloud,opencloud}
CLEAN_INSTALL=1 fetch_and_deploy_gh_release "OpenCloud" "opencloud-eu/opencloud" "singlefile" "${RELEASE}" "/usr/bin" "opencloud-*-linux-amd64"
mv /usr/bin/OpenCloud /usr/bin/opencloud
if ! grep -q 'POSIX_WATCH' /etc/opencloud/opencloud.env; then
sed -i '/^## External/i ## Uncomment below to enable PosixFS Collaborative Mode\


@@ -25,6 +25,8 @@ function update_script() {
check_container_storage
check_container_resources
ensure_dependencies zstd build-essential libmariadb-dev
if [[ -d /opt/open-webui ]]; then
msg_warn "Legacy installation detected — migrating to uv based install..."
msg_info "Stopping Service"
@@ -92,7 +94,6 @@ EOF
OLLAMA_VERSION=$(ollama -v | awk '{print $NF}')
RELEASE=$(curl -s https://api.github.com/repos/ollama/ollama/releases/latest | grep "tag_name" | awk '{print substr($2, 3, length($2)-4)}')
if [ "$OLLAMA_VERSION" != "$RELEASE" ]; then
ensure_dependencies zstd
msg_info "Ollama update available: v$OLLAMA_VERSION -> v$RELEASE"
msg_info "Downloading Ollama v$RELEASE \n"
curl -fS#LO https://github.com/ollama/ollama/releases/download/v${RELEASE}/ollama-linux-amd64.tar.zst


@@ -1,9 +1,9 @@
#!/usr/bin/env bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2026 community-scripts ORG
# Author: vhsdream
# Author: vhsdream | MickLesk (CanbiZ)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://rxresume.org | Github: https://github.com/lazy-media/Reactive-Resume
# Source: https://rxresume.org | Github: https://github.com/amruthpillai/reactive-resume
APP="Reactive-Resume"
var_tags="${var_tags:-documents}"
@@ -24,62 +24,29 @@ function update_script() {
check_container_storage
check_container_resources
if [[ ! -f /etc/systemd/system/Reactive-Resume.service ]]; then
if [[ ! -f /etc/systemd/system/reactive-resume.service ]]; then
msg_error "No $APP Installation Found!"
exit
fi
if check_for_gh_release "Reactive-Resume" "lazy-media/Reactive-Resume"; then
if check_for_gh_release "reactive-resume" "amruthpillai/reactive-resume"; then
msg_info "Stopping services"
systemctl stop Reactive-Resume
systemctl stop reactive-resume
msg_ok "Stopped services"
cp /opt/Reactive-Resume/.env /opt/rxresume.env
fetch_and_deploy_gh_release "Reactive-Resume" "lazy-media/Reactive-Resume" "tarball" "latest" "/opt/Reactive-Resume"
cp /opt/reactive-resume/.env /opt/reactive-resume.env.bak
CLEAN_INSTALL=1 fetch_and_deploy_gh_release "reactive-resume" "amruthpillai/reactive-resume" "tarball" "latest" "/opt/reactive-resume"
msg_info "Updating Reactive-Resume"
cd /opt/Reactive-Resume
export PUPPETEER_SKIP_DOWNLOAD="true"
export NEXT_TELEMETRY_DISABLED=1
msg_info "Updating Reactive Resume (Patience)"
cd /opt/reactive-resume
export CI="true"
export NODE_ENV="production"
$STD pnpm install --frozen-lockfile
$STD pnpm run build
$STD pnpm run prisma:generate
mv /opt/rxresume.env /opt/Reactive-Resume/.env
msg_ok "Updated Reactive-Resume"
msg_info "Updating Minio"
systemctl stop minio
cd /tmp
curl -fsSL https://dl.min.io/server/minio/release/linux-amd64/minio.deb -o minio.deb
$STD dpkg -i minio.deb
rm -f /tmp/minio.deb
msg_ok "Updated Minio"
msg_info "Updating Browserless (Patience)"
systemctl stop browserless
cp /opt/browserless/.env /opt/browserless.env
rm -rf /opt/browserless
brwsr_tmp=$(mktemp)
TAG=$(curl -fsSL https://api.github.com/repos/browserless/browserless/tags?per_page=1 | grep "name" | awk '{print substr($2, 3, length($2)-4) }')
curl -fsSL https://github.com/browserless/browserless/archive/refs/tags/v"$TAG".zip -o "$brwsr_tmp"
$STD unzip "$brwsr_tmp"
mv browserless-"$TAG"/ /opt/browserless
cd /opt/browserless
$STD npm install typescript
$STD npm install esbuild
$STD npm install
rm -rf src/routes/{chrome,edge,firefox,webkit}
$STD node_modules/playwright-core/cli.js install --with-deps chromium
$STD npm run build
$STD npm run build:function
$STD npm prune production
mv /opt/browserless.env /opt/browserless/.env
rm -f "$brwsr_tmp"
msg_ok "Updated Browserless"
mv /opt/reactive-resume.env.bak /opt/reactive-resume/.env
msg_ok "Updated Reactive Resume"
msg_info "Restarting services"
systemctl start minio Reactive-Resume browserless
systemctl start chromium-printer reactive-resume
msg_ok "Restarted services"
msg_ok "Updated successfully!"
fi


@@ -29,7 +29,7 @@ function update_script() {
exit
fi
if check_for_gh_release "rustdesk-api"; then
if check_for_gh_release "rustdesk-hbbs" "lejianwen/rustdesk-server"; then
msg_info "Stopping Service"
systemctl stop rustdesk-hbbr
systemctl stop rustdesk-hbbs


@@ -53,6 +53,13 @@ function update_script() {
fi
sed -i 's|_TARGET=.*$|_URL=http://127.0.0.1:60072|' /opt/scanopy/.env
msg_info "Building Scanopy Server (patience)"
cd /opt/scanopy/backend
$STD cargo build --release --bin server --bin generate-fixtures
$STD ./target/release/generate-fixtures --output-dir /opt/scanopy/ui/src/lib/data
mv ./target/release/server /usr/bin/scanopy-server
msg_ok "Built Scanopy Server"
msg_info "Creating frontend UI"
export PUBLIC_SERVER_HOSTNAME=default
export PUBLIC_SERVER_PORT=""
@@ -61,12 +68,6 @@ function update_script() {
$STD npm run build
msg_ok "Created frontend UI"
msg_info "Building Scanopy Server (patience)"
cd /opt/scanopy/backend
$STD cargo build --release --bin server
mv ./target/release/server /usr/bin/scanopy-server
msg_ok "Built Scanopy Server"
if [[ -f /etc/systemd/system/scanopy-daemon.service ]]; then
fetch_and_deploy_gh_release "Scanopy Daemon" "scanopy/scanopy" "singlefile" "latest" "/usr/local/bin" "scanopy-daemon-linux-amd64"
mv "/usr/local/bin/Scanopy Daemon" /usr/local/bin/scanopy-daemon


@@ -9,7 +9,7 @@ APP="Tracearr"
var_tags="${var_tags:-media}"
var_cpu="${var_cpu:-2}"
var_ram="${var_ram:-2048}"
var_disk="${var_disk:-5}"
var_disk="${var_disk:-10}"
var_os="${var_os:-debian}"
var_version="${var_version:-13}"
var_unprivileged="${var_unprivileged:-1}"


@@ -37,7 +37,7 @@ function update_script() {
if [[ -d /etc/wgdashboard ]]; then
sleep 2
cd /etc/wgdashboard/src
$STD ./wgd.sh update
$STD ./wgd.sh update -y
$STD ./wgd.sh start
fi
msg_ok "Updated LXC"


@@ -35,13 +35,16 @@ function update_script() {
msg_ok "Stopped Service"
msg_info "Creating Backup"
rm -rf /opt/${APP}_backup*.tar.gz
mkdir -p /opt/z2m_backup
$STD tar -czf /opt/z2m_backup/${APP}_backup_$(date +%Y%m%d%H%M%S).tar.gz -C /opt zigbee2mqtt
mv /opt/zigbee2mqtt/data /opt/z2m_backup
msg_ok "Backup Created"
ensure_dependencies zstd
mkdir -p /opt/{backups,z2m_backup}
BACKUP_VERSION="$(<"$HOME/.zigbee2mqtt")"
BACKUP_FILE="/opt/backups/${APP}_backup_${BACKUP_VERSION}.tar.zst"
$STD tar -cf - -C /opt zigbee2mqtt | zstd -q -o "$BACKUP_FILE"
ls -t /opt/backups/${APP}_backup_*.tar.zst 2>/dev/null | tail -n +6 | xargs -r rm -f
mv /opt/zigbee2mqtt/data /opt/z2m_backup/data
msg_ok "Backup Created (${BACKUP_VERSION})"
fetch_and_deploy_gh_release "Zigbee2MQTT" "Koenkk/zigbee2mqtt" "tarball" "latest" "/opt/zigbee2mqtt"
CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Zigbee2MQTT" "Koenkk/zigbee2mqtt" "tarball" "latest" "/opt/zigbee2mqtt"
msg_info "Updating Zigbee2MQTT"
rm -rf /opt/zigbee2mqtt/data
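The zstd backup step in this hunk pairs a streamed `tar | zstd` archive with a keep-newest-five retention rule (`ls -t … | tail -n +6 | xargs -r rm -f`). A minimal, standalone sketch of that rotation pattern, using empty placeholder files instead of real archives (paths and the `app_backup_` prefix here are hypothetical):

```shell
#!/usr/bin/env bash
# Keep-newest-N rotation, as used for the Zigbee2MQTT backup archives.
KEEP=5
tmp="$(mktemp -d)"

# Create eight placeholder "archives" with strictly increasing mtimes.
for i in 1 2 3 4 5 6 7 8; do
  touch "$tmp/app_backup_$i.tar.zst"
  sleep 0.01 # ensure ls -t sees distinct modification times
done

# ls -t lists newest first; tail -n +$((KEEP + 1)) selects everything past
# the newest KEEP; xargs -r skips rm entirely when nothing is selected.
ls -t "$tmp"/app_backup_*.tar.zst | tail -n +$((KEEP + 1)) | xargs -r rm -f

ls "$tmp" # the five newest placeholders remain (app_backup_4 through 8)
```

In the script itself the archive is first produced with `tar -cf - -C /opt zigbee2mqtt | zstd -q -o "$BACKUP_FILE"`, then the same rotation runs against `/opt/backups`.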


@@ -1,6 +1,6 @@
{
"name": "PVE LXC Execute Command",
"slug": "lxc-execute",
"slug": "execute",
"categories": [
1
],


@@ -1,19 +1,19 @@
{
"generated": "2026-03-08T12:08:10Z",
"generated": "2026-03-11T12:12:40Z",
"versions": [
{
"slug": "2fauth",
"repo": "Bubka/2FAuth",
"version": "v6.0.0",
"version": "v6.1.0",
"pinned": false,
"date": "2026-01-14T16:00:58Z"
"date": "2026-03-11T07:48:27Z"
},
{
"slug": "adguard",
"repo": "AdguardTeam/AdGuardHome",
"version": "v0.107.72",
"version": "v0.107.73",
"pinned": false,
"date": "2026-02-19T15:37:49Z"
"date": "2026-03-10T17:23:23Z"
},
{
"slug": "adguardhome-sync",
@@ -67,9 +67,9 @@
{
"slug": "autobrr",
"repo": "autobrr/autobrr",
"version": "v1.73.0",
"version": "v1.74.0",
"pinned": false,
"date": "2026-02-13T16:37:28Z"
"date": "2026-03-08T21:45:41Z"
},
{
"slug": "autocaliweb",
@@ -88,9 +88,9 @@
{
"slug": "backrest",
"repo": "garethgeorge/backrest",
"version": "v1.12.0",
"version": "v1.12.1",
"pinned": false,
"date": "2026-02-22T06:49:49Z"
"date": "2026-03-11T06:16:22Z"
},
{
"slug": "baikal",
@@ -116,9 +116,9 @@
{
"slug": "bentopdf",
"repo": "alam00000/bentopdf",
"version": "v2.4.1",
"version": "v2.5.0",
"pinned": false,
"date": "2026-03-07T09:14:39Z"
"date": "2026-03-10T08:40:54Z"
},
{
"slug": "beszel",
@@ -151,9 +151,9 @@
{
"slug": "booklore",
"repo": "booklore-app/BookLore",
"version": "v2.0.6",
"version": "v2.1.0",
"pinned": false,
"date": "2026-03-06T19:16:29Z"
"date": "2026-03-08T20:27:24Z"
},
{
"slug": "bookstack",
@@ -200,9 +200,9 @@
{
"slug": "cleanuparr",
"repo": "Cleanuparr/Cleanuparr",
"version": "v2.7.7",
"version": "v2.7.9",
"pinned": false,
"date": "2026-03-02T13:08:32Z"
"date": "2026-03-10T18:51:23Z"
},
{
"slug": "cloudreve",
@@ -228,9 +228,9 @@
{
"slug": "configarr",
"repo": "raydak-labs/configarr",
"version": "v1.23.0",
"version": "v1.24.0",
"pinned": false,
"date": "2026-02-23T12:28:13Z"
"date": "2026-03-09T15:16:08Z"
},
{
"slug": "convertx",
@@ -270,23 +270,23 @@
{
"slug": "databasus",
"repo": "databasus/databasus",
"version": "v3.17.0",
"version": "v3.19.1",
"pinned": false,
"date": "2026-03-06T07:07:22Z"
"date": "2026-03-11T10:25:28Z"
},
{
"slug": "dawarich",
"repo": "Freika/dawarich",
"version": "1.3.1",
"version": "1.3.2",
"pinned": false,
"date": "2026-02-27T19:47:40Z"
"date": "2026-03-08T20:37:50Z"
},
{
"slug": "discopanel",
"repo": "nickheyer/discopanel",
"version": "v2.0.1",
"version": "v2.0.3",
"pinned": false,
"date": "2026-03-07T02:43:33Z"
"date": "2026-03-11T07:29:10Z"
},
{
"slug": "dispatcharr",
@@ -312,16 +312,16 @@
{
"slug": "domain-monitor",
"repo": "Hosteroid/domain-monitor",
"version": "v1.1.4",
"version": "v1.1.5",
"pinned": false,
"date": "2026-03-02T09:25:01Z"
"date": "2026-03-08T19:17:09Z"
},
{
"slug": "donetick",
"repo": "donetick/donetick",
"version": "v0.1.74",
"version": "v0.1.75-beta.3",
"pinned": false,
"date": "2026-02-14T23:21:45Z"
"date": ""
},
{
"slug": "drawio",
@@ -347,9 +347,9 @@
{
"slug": "elementsynapse",
"repo": "etkecc/synapse-admin",
"version": "v0.11.1-etke53",
"version": "v0.11.4-etke54",
"pinned": false,
"date": "2026-02-03T20:38:15Z"
"date": "2026-03-08T12:37:07Z"
},
{
"slug": "emby",
@@ -389,9 +389,9 @@
{
"slug": "fladder",
"repo": "DonutWare/Fladder",
"version": "v0.10.1",
"version": "v0.10.2",
"pinned": false,
"date": "2026-02-21T12:45:53Z"
"date": "2026-03-08T15:28:11Z"
},
{
"slug": "flaresolverr",
@@ -438,9 +438,9 @@
{
"slug": "ghostfolio",
"repo": "ghostfolio/ghostfolio",
"version": "2.248.0",
"version": "2.249.0",
"pinned": false,
"date": "2026-03-07T17:24:24Z"
"date": "2026-03-10T19:26:50Z"
},
{
"slug": "gitea",
@@ -473,9 +473,9 @@
{
"slug": "gokapi",
"repo": "Forceu/Gokapi",
"version": "v2.2.3",
"version": "v2.2.4",
"pinned": false,
"date": "2026-03-04T21:29:16Z"
"date": "2026-03-10T15:44:19Z"
},
{
"slug": "gotify",
@@ -557,9 +557,9 @@
{
"slug": "homebox",
"repo": "sysadminsmedia/homebox",
"version": "v0.24.1",
"version": "v0.24.2",
"pinned": false,
"date": "2026-03-07T15:41:21Z"
"date": "2026-03-09T19:54:02Z"
},
{
"slug": "homepage",
@@ -613,16 +613,16 @@
{
"slug": "invoiceninja",
"repo": "invoiceninja/invoiceninja",
"version": "v5.12.70",
"version": "v5.13.1",
"pinned": false,
"date": "2026-03-06T01:53:38Z"
"date": "2026-03-10T23:45:05Z"
},
{
"slug": "jackett",
"repo": "Jackett/Jackett",
"version": "v0.24.1316",
"version": "v0.24.1341",
"pinned": false,
"date": "2026-03-08T05:59:08Z"
"date": "2026-03-11T05:55:00Z"
},
{
"slug": "jellystat",
@@ -676,9 +676,9 @@
{
"slug": "kima-hub",
"repo": "Chevron7Locked/kima-hub",
"version": "v1.6.2",
"version": "v1.6.3",
"pinned": false,
"date": "2026-03-05T05:38:02Z"
"date": "2026-03-10T22:26:12Z"
},
{
"slug": "kimai",
@@ -823,9 +823,9 @@
{
"slug": "mail-archiver",
"repo": "s1t5/mail-archiver",
"version": "2602.4",
"version": "2603.1",
"pinned": false,
"date": "2026-02-26T08:43:01Z"
"date": "2026-03-10T11:51:08Z"
},
{
"slug": "managemydamnlife",
@@ -837,9 +837,9 @@
{
"slug": "manyfold",
"repo": "manyfold3d/manyfold",
"version": "v0.133.1",
"version": "v0.134.0",
"pinned": false,
"date": "2026-02-26T15:50:34Z"
"date": "2026-03-09T13:20:45Z"
},
{
"slug": "mealie",
@@ -879,9 +879,9 @@
{
"slug": "metube",
"repo": "alexta69/metube",
"version": "2026.03.07",
"version": "2026.03.08",
"pinned": false,
"date": "2026-03-07T14:14:57Z"
"date": "2026-03-08T20:28:19Z"
},
{
"slug": "miniflux",
@@ -998,9 +998,9 @@
{
"slug": "opencloud",
"repo": "opencloud-eu/opencloud",
"version": "v5.1.0",
"version": "v5.2.0",
"pinned": true,
"date": "2026-02-16T15:04:28Z"
"date": "2026-03-09T13:32:31Z"
},
{
"slug": "opengist",
@@ -1236,9 +1236,9 @@
{
"slug": "pulse",
"repo": "rcourtman/Pulse",
"version": "v5.1.21",
"version": "v5.1.23",
"pinned": false,
"date": "2026-03-06T12:13:08Z"
"date": "2026-03-09T22:22:12Z"
},
{
"slug": "pve-scripts-local",
@@ -1305,10 +1305,10 @@
},
{
"slug": "reactive-resume",
"repo": "lazy-media/Reactive-Resume",
"version": "v1.2.7",
"repo": "amruthpillai/reactive-resume",
"version": "v5.0.11",
"pinned": false,
"date": "2026-01-20T11:59:40Z"
"date": "2026-03-04T20:39:11Z"
},
{
"slug": "recyclarr",
@@ -1327,9 +1327,9 @@
{
"slug": "revealjs",
"repo": "hakimel/reveal.js",
"version": "5.2.1",
"version": "6.0.0",
"pinned": false,
"date": "2025-03-28T13:00:23Z"
"date": "2026-03-11T11:54:59Z"
},
{
"slug": "romm",
@@ -1362,9 +1362,9 @@
{
"slug": "scanopy",
"repo": "scanopy/scanopy",
"version": "v0.14.16",
"version": "v0.14.17",
"pinned": false,
"date": "2026-03-08T06:39:25Z"
"date": "2026-03-09T05:04:49Z"
},
{
"slug": "scraparr",
@@ -1376,9 +1376,9 @@
{
"slug": "seaweedfs",
"repo": "seaweedfs/seaweedfs",
"version": "4.15",
"version": "4.17",
"pinned": false,
"date": "2026-03-05T06:30:30Z"
"date": "2026-03-11T09:30:38Z"
},
{
"slug": "seelf",
@@ -1397,9 +1397,9 @@
{
"slug": "semaphore",
"repo": "semaphoreui/semaphore",
"version": "v2.17.16",
"version": "v2.17.21",
"pinned": false,
"date": "2026-03-05T12:39:05Z"
"date": "2026-03-09T09:33:06Z"
},
{
"slug": "shelfmark",
@@ -1516,9 +1516,9 @@
{
"slug": "tasmoadmin",
"repo": "TasmoAdmin/TasmoAdmin",
"version": "v4.3.4",
"version": "v5.0.0",
"pinned": false,
"date": "2026-01-25T22:16:41Z"
"date": "2026-03-09T20:51:03Z"
},
{
"slug": "tautulli",
@@ -1537,9 +1537,9 @@
{
"slug": "termix",
"repo": "Termix-SSH/Termix",
"version": "release-1.11.1-tag",
"version": "release-1.11.2-tag",
"pinned": false,
"date": "2026-02-13T04:49:16Z"
"date": "2026-03-08T23:27:30Z"
},
{
"slug": "the-lounge",
@@ -1551,9 +1551,9 @@
{
"slug": "thingsboard",
"repo": "thingsboard/thingsboard",
"version": "v4.3.0.1",
"version": "v4.3.1",
"pinned": false,
"date": "2026-02-03T12:39:14Z"
"date": "2026-03-10T09:25:25Z"
},
{
"slug": "threadfin",
@@ -1572,9 +1572,9 @@
{
"slug": "tinyauth",
"repo": "steveiliop56/tinyauth",
"version": "v5.0.1",
"version": "v5.0.2",
"pinned": false,
"date": "2026-03-04T21:05:05Z"
"date": "2026-03-08T15:46:59Z"
},
{
"slug": "traccar",
@@ -1586,9 +1586,9 @@
{
"slug": "tracearr",
"repo": "connorgallopo/Tracearr",
"version": "v1.4.21",
"version": "v1.4.22",
"pinned": false,
"date": "2026-03-03T18:43:20Z"
"date": "2026-03-09T17:39:52Z"
},
{
"slug": "tracktor",
@@ -1628,9 +1628,9 @@
{
"slug": "tunarr",
"repo": "chrisbenincasa/tunarr",
"version": "v1.1.18",
"version": "v1.1.19",
"pinned": false,
"date": "2026-02-26T22:09:44Z"
"date": "2026-03-11T02:21:06Z"
},
{
"slug": "uhf",
@@ -1670,9 +1670,9 @@
{
"slug": "uptimekuma",
"repo": "louislam/uptime-kuma",
"version": "2.2.0",
"version": "2.2.1",
"pinned": false,
"date": "2026-03-05T02:08:14Z"
"date": "2026-03-10T02:25:33Z"
},
{
"slug": "vaultwarden",
@@ -1712,9 +1712,9 @@
{
"slug": "wanderer",
"repo": "meilisearch/meilisearch",
"version": "v1.37.0",
"version": "v1.38.2",
"pinned": false,
"date": "2026-03-02T09:16:36Z"
"date": "2026-03-11T11:36:01Z"
},
{
"slug": "warracker",
@@ -1726,9 +1726,9 @@
{
"slug": "watcharr",
"repo": "sbondCo/Watcharr",
"version": "v3.0.0",
"version": "v3.0.1",
"pinned": false,
"date": "2026-03-04T09:29:14Z"
"date": "2026-03-09T11:33:44Z"
},
{
"slug": "watchyourlan",
@@ -1803,9 +1803,9 @@
{
"slug": "yubal",
"repo": "guillevc/yubal",
"version": "v0.6.3",
"version": "v0.7.0",
"pinned": false,
"date": "2026-03-07T03:24:05Z"
"date": "2026-03-08T13:37:49Z"
},
{
"slug": "zerobyte",
@@ -1831,16 +1831,16 @@
{
"slug": "zitadel",
"repo": "zitadel/zitadel",
"version": "v4.12.1",
"version": "v4.12.2",
"pinned": false,
"date": "2026-03-04T12:40:17Z"
"date": "2026-03-11T07:50:10Z"
},
{
"slug": "zoraxy",
"repo": "tobychui/zoraxy",
"version": "v3.3.2-rc2",
"version": "v3.3.2-rc3",
"pinned": false,
"date": "2026-02-27T03:31:25Z"
"date": "2026-03-09T13:56:45Z"
},
{
"slug": "zwave-js-ui",


@@ -12,7 +12,7 @@
"documentation": "https://docs.rxresume.org/",
"website": "https://rxresume.org",
"logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/reactive-resume.webp",
"config_path": "/opt/Reactive-Resume/.env",
"config_path": "/opt/reactive-resume/.env",
"description": "A one-of-a-kind resume builder that keeps your privacy in mind. Completely secure, customizable, portable, open-source and free forever.",
"install_methods": [
{


@@ -21,7 +21,7 @@
"resources": {
"cpu": 2,
"ram": 2048,
"hdd": 5,
"hdd": 10,
"os": "Debian",
"version": "13"
}


@@ -5,6 +5,7 @@ import { Inter } from "next/font/google";
import Script from "next/script";
import React from "react";
import { CopycatWarningToast } from "@/components/copycat-warning-toast";
import { ThemeProvider } from "@/components/theme-provider";
import { analytics, basePath } from "@/config/site-config";
import QueryProvider from "@/components/query-provider";
@@ -116,6 +117,7 @@ export default function RootLayout({
<div className="w-full max-w-[1440px] ">
{children}
<Toaster richColors />
<CopycatWarningToast />
</div>
</div>
<Footer />


@@ -0,0 +1,24 @@
"use client";
import { useEffect } from "react";
import { toast } from "sonner";
const STORAGE_KEY = "copycat-warning-dismissed";
export function CopycatWarningToast() {
useEffect(() => {
if (typeof window === "undefined")
return;
if (localStorage.getItem(STORAGE_KEY) === "true")
return;
toast.warning("Beware of copycat sites. Always verify the URL is correct before trusting or running scripts.", {
position: "top-center",
duration: Number.POSITIVE_INFINITY,
closeButton: true,
onDismiss: () => localStorage.setItem(STORAGE_KEY, "true"),
});
}, []);
return null;
}


@@ -45,8 +45,8 @@ export const navbarLinks = [
export const mostPopularScripts = ["post-pve-install", "docker", "homeassistant"];
export const analytics = {
url: "analytics.bramsuurd.nl",
token: "f9eee289f931",
url: "analytics.community-scripts.org",
token: "e9f14e1e7232",
};
export const AlertColors = {


@@ -208,7 +208,7 @@ msg_info "Building OpenVino Model"
cd /models
wget -q http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz
$STD tar -zxf ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz --no-same-owner
if $STD python3 /opt/frigate/docker/main/build_ov_model.py; then
if python3 /opt/frigate/docker/main/build_ov_model.py &>/dev/null; then
cp /models/ssdlite_mobilenet_v2.xml /openvino-model/
cp /models/ssdlite_mobilenet_v2.bin /openvino-model/
wget -q https://github.com/openvinotoolkit/open_model_zoo/raw/master/data/dataset_classes/coco_91cl_bkgr.txt -O /openvino-model/coco_91cl_bkgr.txt


@@ -137,7 +137,7 @@ rm -rf /opt/glpi-${RELEASE}.tgz
msg_ok "Setup Service"
msg_info "Setup Cronjob"
echo "* * * * * php /opt/glpi/front/cron.php" | crontab -
echo "* * * * * php /opt/glpi/front/cron.php" | crontab -u www-data -
msg_ok "Setup Cronjob"
msg_info "Update PHP Params"


@@ -260,7 +260,7 @@ msg_ok "(4/5) Compiled imagemagick"
msg_info "(5/5) Compiling libvips"
SOURCE=$SOURCE_DIR/libvips
: "${LIBVIPS_REVISION:=$(jq -cr '.revision' $BASE_DIR/server/sources/libvips.json)}"
LIBVIPS_REVISION="0c9151a4f416d2f8ae20a755db218f6637050eec"
$STD git clone https://github.com/libvips/libvips.git "$SOURCE"
cd "$SOURCE"
$STD git reset --hard "$LIBVIPS_REVISION"


@@ -28,7 +28,7 @@ fi
if ! dpkg -l | grep -q 'libssl1.1'; then
msg_info "Installing libssl (if needed)"
curl -fsSL "https://security.debian.org/debian-security/pool/updates/main/o/openssl/libssl1.1_1.1.1w-0+deb11u4_amd64.deb" -o "/tmp/libssl.deb"
curl -fsSL "https://security.debian.org/debian-security/pool/updates/main/o/openssl/libssl1.1_1.1.1w-0+deb11u5_amd64.deb" -o "/tmp/libssl.deb"
$STD dpkg -i /tmp/libssl.deb
rm -f /tmp/libssl.deb
msg_ok "Installed libssl1.1"


@@ -64,7 +64,8 @@ $STD sudo -u cool coolconfig set-admin-password --user=admin --password="$COOLPA
echo "$COOLPASS" >~/.coolpass
msg_ok "Installed Collabora Online"
fetch_and_deploy_gh_release "opencloud" "opencloud-eu/opencloud" "singlefile" "v5.1.0" "/usr/bin" "opencloud-*-linux-amd64"
fetch_and_deploy_gh_release "OpenCloud" "opencloud-eu/opencloud" "singlefile" "v5.2.0" "/usr/bin" "opencloud-*-linux-amd64"
mv /usr/bin/OpenCloud /usr/bin/opencloud
msg_info "Configuring OpenCloud"
DATA_DIR="/var/lib/opencloud"


@@ -16,7 +16,9 @@ update_os
msg_info "Installing Dependencies"
$STD apt install -y \
ffmpeg \
zstd
zstd \
build-essential \
libmariadb-dev
msg_ok "Installed Dependencies"
setup_hwaccel


@@ -27,6 +27,9 @@ WebUI\Password_PBKDF2="@ByteArray(amjeuVrF3xRbgzqWQmes5A==:XK3/Ra9jUmqUc4RwzCtrh
WebUI\Port=8090
WebUI\UseUPnP=false
WebUI\Username=admin
[Network]
PortForwardingEnabled=false
EOF
msg_ok "Setup qBittorrent-nox"


@@ -1,9 +1,9 @@
#!/usr/bin/env bash
# Copyright (c) 2021-2026 community-scripts ORG
# Author: vhsdream
# Author: vhsdream | Rewrite: MickLesk (CanbiZ)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://rxresume.org | Github: https://github.com/lazy-media/Reactive-Resume
# Source: https://rxresume.org | Github: https://github.com/amruthpillai/reactive-resume
source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
color
@@ -13,149 +13,106 @@ setting_up_container
network_check
update_os
PG_VERSION="16" setup_postgresql
PG_DB_NAME="reactive_resume" PG_DB_USER="reactive_resume" setup_postgresql_db
NODE_VERSION="22" NODE_MODULE="pnpm@latest" setup_nodejs
msg_info "Installing Dependencies"
cd /tmp
curl -fsSL https://dl.min.io/server/minio/release/linux-amd64/minio.deb -o minio.deb
$STD dpkg -i minio.deb
$STD apt install -y chromium
msg_ok "Installed Dependencies"
PG_VERSION="16" setup_postgresql
PG_DB_NAME="rxresume" PG_DB_USER="rxresume" PG_DB_GRANT_SUPERUSER="true" setup_postgresql_db
NODE_VERSION="24" NODE_MODULE="pnpm@latest" setup_nodejs
fetch_and_deploy_gh_release "Reactive-Resume" "lazy-media/Reactive-Resume" "tarball"
fetch_and_deploy_gh_release "reactive-resume" "amruthpillai/reactive-resume" "tarball"
msg_info "Setting up Reactive-Resume"
MINIO_PASS=$(openssl rand -base64 48)
ACCESS_TOKEN=$(openssl rand -base64 48)
REFRESH_TOKEN=$(openssl rand -base64 48)
CHROME_TOKEN=$(openssl rand -hex 32)
TAG=$(curl -fsSL https://api.github.com/repos/browserless/browserless/tags?per_page=1 | grep "name" | awk '{print substr($2, 3, length($2)-4) }')
cd /opt/Reactive-Resume
export CI="true"
export PUPPETEER_SKIP_DOWNLOAD="true"
msg_info "Building Reactive Resume (Patience)"
cd /opt/reactive-resume
export NODE_ENV="production"
export NEXT_TELEMETRY_DISABLED=1
export CI="true"
$STD pnpm install --frozen-lockfile
$STD pnpm run build
$STD pnpm run prisma:generate
msg_ok "Setup Reactive-Resume"
mkdir -p /opt/reactive-resume/data
msg_ok "Built Reactive Resume"
msg_info "Installing Browserless (Patience)"
cd /tmp
curl -fsSL https://github.com/browserless/browserless/archive/refs/tags/v"$TAG".zip -o v"$TAG".zip
$STD unzip v"$TAG".zip
mv browserless-"$TAG" /opt/browserless
cd /opt/browserless
$STD npm install
rm -rf src/routes/{chrome,edge,firefox,webkit}
$STD node_modules/playwright-core/cli.js install --with-deps chromium
$STD npm install typescript --save-dev
$STD npm install esbuild --save-dev
$STD npm run build
$STD npm run build:function
$STD npm prune production
msg_ok "Installed Browserless"
msg_info "Configuring Reactive Resume"
AUTH_SECRET=$(openssl rand -hex 32)
msg_info "Configuring applications"
mkdir -p /opt/minio
cat <<EOF >/opt/minio/.env
MINIO_ROOT_USER="storageadmin"
MINIO_ROOT_PASSWORD="${MINIO_PASS}"
MINIO_VOLUMES=/opt/minio
MINIO_OPTS="--address :9000 --console-address 127.0.0.1:9001"
EOF
cat <<EOF >/opt/Reactive-Resume/.env
cat <<EOF >/opt/reactive-resume/.env
# Reactive Resume v5 Configuration
NODE_ENV=production
PORT=3000
# for use behind a reverse proxy, use your FQDN for PUBLIC_URL and STORAGE_URL
# To avoid issues when behind a reverse proxy with downloading PDFs, ensure that the
# storage path is accessible via a subdomain (i.e storage.yourapp.xyz) or you set your
# reverse proxy to properly rewrite the subpath (/rxresume) to point to the service
# running on port 9000 (minio).
PUBLIC_URL=http://${LOCAL_IP}:3000
STORAGE_URL=http://${LOCAL_IP}:9000/rxresume
DATABASE_URL=postgresql://${PG_DB_USER}:${PG_DB_PASS}@localhost:5432/${PG_DB_NAME}?schema=public
ACCESS_TOKEN_SECRET=${ACCESS_TOKEN}
REFRESH_TOKEN_SECRET=${REFRESH_TOKEN}
CHROME_PORT=8080
CHROME_TOKEN=${CHROME_TOKEN}
CHROME_URL=ws://localhost:8080
CHROME_IGNORE_HTTPS_ERRORS=true
MAIL_FROM=noreply@locahost
# SMTP_URL=smtp://username:password@smtp.server.mail:587 #
STORAGE_ENDPOINT=localhost
STORAGE_PORT=9000
STORAGE_REGION=us-east-1
STORAGE_BUCKET=rxresume
STORAGE_ACCESS_KEY=storageadmin
STORAGE_SECRET_KEY=${MINIO_PASS}
STORAGE_USE_SSL=false
STORAGE_SKIP_BUCKET_CHECK=false
# GitHub (OAuth, Optional)
# Public URL (change to your FQDN when using a reverse proxy)
APP_URL=http://${LOCAL_IP}:3000
# Database
DATABASE_URL=postgresql://${PG_DB_USER}:${PG_DB_PASS}@localhost:5432/${PG_DB_NAME}
# Authentication Secret (do not change after initial setup)
AUTH_SECRET=${AUTH_SECRET}
# Printer (headless Chromium for PDF generation)
PRINTER_ENDPOINT=http://localhost:9222
# Storage: uses local filesystem (/opt/reactive-resume/data) when S3 is not configured
# S3_ACCESS_KEY_ID=
# S3_SECRET_ACCESS_KEY=
# S3_REGION=us-east-1
# S3_ENDPOINT=
# S3_BUCKET=
# S3_FORCE_PATH_STYLE=false
# Email (optional, logs to console if not configured)
# SMTP_HOST=
# SMTP_PORT=465
# SMTP_USER=
# SMTP_PASS=
# SMTP_FROM=Reactive Resume <noreply@localhost>
# OAuth (optional)
# GITHUB_CLIENT_ID=
# GITHUB_CLIENT_SECRET=
# GITHUB_CALLBACK_URL=http://localhost:5173/api/auth/github/callback
# Google (OAuth, Optional)
# GOOGLE_CLIENT_ID=
# GOOGLE_CLIENT_SECRET=
# GOOGLE_CALLBACK_URL=http://localhost:5173/api/auth/google/callback
EOF
cat <<EOF >/opt/browserless/.env
DEBUG=browserless*,-**:verbose
HOST=localhost
PORT=8080
TOKEN=${CHROME_TOKEN}
# Feature Flags
# FLAG_DISABLE_SIGNUPS=false
# FLAG_DISABLE_EMAIL_AUTH=false
EOF
rm -f /tmp/v"$TAG".zip
rm -f /tmp/minio.deb
msg_ok "Configured applications"
msg_ok "Configured Reactive Resume"
msg_info "Creating Services"
mkdir -p /etc/systemd/system/minio.service.d/
cat <<EOF >/etc/systemd/system/minio.service.d/override.conf
[Service]
User=root
Group=root
WorkingDirectory=/usr/local/bin
EnvironmentFile=/opt/minio/.env
EOF
cat <<EOF >/etc/systemd/system/Reactive-Resume.service
cat <<EOF >/etc/systemd/system/chromium-printer.service
[Unit]
Description=Reactive-Resume Service
After=network.target postgresql.service minio.service
Wants=postgresql.service minio.service
Description=Headless Chromium for Reactive Resume PDF generation
After=network.target
[Service]
WorkingDirectory=/opt/Reactive-Resume
EnvironmentFile=/opt/Reactive-Resume/.env
ExecStart=/usr/bin/pnpm run start
Type=simple
ExecStart=/usr/bin/chromium --headless --disable-gpu --no-sandbox --disable-dev-shm-usage --remote-debugging-address=127.0.0.1 --remote-debugging-port=9222
Restart=always
RestartSec=5
[Install]
WantedBy=multi-user.target
EOF
cat <<EOF >/etc/systemd/system/browserless.service
cat <<EOF >/etc/systemd/system/reactive-resume.service
[Unit]
Description=Browserless service
After=network.target Reactive-Resume.service
Description=Reactive Resume
After=network.target postgresql.service chromium-printer.service
Wants=postgresql.service chromium-printer.service
[Service]
WorkingDirectory=/opt/browserless
EnvironmentFile=/opt/browserless/.env
ExecStart=/usr/bin/npm run start
Restart=unless-stopped
WorkingDirectory=/opt/reactive-resume
EnvironmentFile=/opt/reactive-resume/.env
ExecStart=/usr/bin/node .output/server/index.mjs
Restart=always
RestartSec=5
[Install]
WantedBy=multi-user.target
EOF
systemctl daemon-reload
systemctl enable -q --now minio.service Reactive-Resume.service browserless.service
systemctl enable -q --now chromium-printer.service reactive-resume.service
msg_ok "Created Services"
motd_ssh


@@ -27,6 +27,13 @@ fetch_and_deploy_gh_release "Scanopy" "scanopy/scanopy" "tarball" "latest" "/opt
TOOLCHAIN="$(grep "channel" /opt/scanopy/backend/rust-toolchain.toml | awk -F\" '{print $2}')"
RUST_TOOLCHAIN=$TOOLCHAIN setup_rust
msg_info "Building Scanopy Server (patience)"
cd /opt/scanopy/backend
$STD cargo build --release --bin server --bin generate-fixtures
$STD ./target/release/generate-fixtures --output-dir /opt/scanopy/ui/src/lib/data
mv ./target/release/server /usr/bin/scanopy-server
msg_ok "Built Scanopy Server"
msg_info "Creating frontend UI"
export PUBLIC_SERVER_HOSTNAME=default
export PUBLIC_SERVER_PORT=""
@@ -35,12 +42,6 @@ $STD npm ci --no-fund --no-audit
$STD npm run build
msg_ok "Created frontend UI"
msg_info "Building Scanopy Server (patience)"
cd /opt/scanopy/backend
$STD cargo build --release --bin server
mv ./target/release/server /usr/bin/scanopy-server
msg_ok "Built Scanopy Server"
msg_info "Configuring server for first-run"
cat <<EOF >/opt/scanopy/.env
### - SERVER

View File

@@ -196,7 +196,7 @@ explain_exit_code() {
103) echo "Validation: Shell is not Bash" ;;
104) echo "Validation: Not running as root (or invoked via sudo)" ;;
105) echo "Validation: Proxmox VE version not supported" ;;
106) echo "Validation: Architecture not supported (ARM / PiMox)" ;;
106) echo "Validation: Unsupported architecture (requires amd64 or arm64)" ;;
107) echo "Validation: Kernel key parameters unreadable" ;;
108) echo "Validation: Kernel key limits exceeded" ;;
109) echo "Proxmox: No available container ID after max attempts" ;;
@@ -348,10 +348,10 @@ explain_exit_code() {
json_escape() {
# Escape a string for safe JSON embedding using awk (handles any input size).
# Pipeline: strip ANSI → remove control chars → escape \ " TAB → join lines with \n
printf '%s' "$1" \
| sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' \
| tr -d '\000-\010\013\014\016-\037\177\r' \
| awk '
printf '%s' "$1" |
sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' |
tr -d '\000-\010\013\014\016-\037\177\r' |
awk '
BEGIN { ORS = "" }
{
gsub(/\\/, "\\\\") # backslash → \\

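The json_escape pipeline can be approximated end to end as follows. This is a simplified sketch that does the backslash/quote escaping in sed rather than awk `gsub` (sed's replacement escaping is less subtle), keeping the same stages: strip ANSI, drop control characters, escape, then join lines with a literal `\n`:

```shell
# Sketch of json_escape: ANSI strip -> control-char strip -> escape -> join.
json_escape_sketch() {
  printf '%s' "$1" |
    sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' |
    tr -d '\000-\010\013\014\016-\037\177\r' |
    sed 's/\\/\\\\/g; s/"/\\"/g' |
    awk 'BEGIN { ORS = "" } { if (NR > 1) printf "\\n"; print }'
}
```

The awk stage only joins records: `ORS=""` suppresses awk's own newline, and a literal backslash-n is emitted between records, so multi-line input becomes a single JSON-safe line.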
View File

@@ -1826,9 +1826,9 @@ advanced_settings() {
while [ $STEP -le $MAX_STEP ]; do
case $STEP in
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 1: Container Type
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
1)
local default_on="ON"
local default_off="OFF"
@@ -1851,9 +1851,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 2: Root Password
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
2)
if PW1=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "ROOT PASSWORD" \
@@ -1905,9 +1905,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 3: Container ID
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
3)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "CONTAINER ID" \
@@ -1939,9 +1939,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 4: Hostname
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
4)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "HOSTNAME" \
@@ -1962,9 +1962,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 5: Disk Size
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
5)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "DISK SIZE" \
@@ -1983,9 +1983,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 6: CPU Cores
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
6)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "CPU CORES" \
@@ -2004,9 +2004,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 7: RAM Size
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
7)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "RAM SIZE" \
@@ -2025,9 +2025,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 8: Network Bridge
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
8)
if [[ ${#BRIDGE_MENU_OPTIONS[@]} -eq 0 ]]; then
# Validate default bridge exists
@@ -2063,9 +2063,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 9: IPv4 Configuration
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
9)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "IPv4 CONFIGURATION" \
@@ -2160,9 +2160,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 10: IPv6 Configuration
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
10)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "IPv6 CONFIGURATION" \
@@ -2235,9 +2235,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 11: MTU Size
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
11)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "MTU SIZE" \
@@ -2255,9 +2255,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 12: DNS Search Domain
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
12)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "DNS SEARCH DOMAIN" \
@@ -2271,9 +2271,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 13: DNS Server
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
13)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "DNS SERVER" \
@@ -2287,9 +2287,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 14: MAC Address
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
14)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "MAC ADDRESS" \
@@ -2307,9 +2307,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 15: VLAN Tag
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
15)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "VLAN TAG" \
@@ -2327,9 +2327,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 16: Tags
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
16)
if result=$(whiptail --backtitle "Proxmox VE Helper Scripts [Step $STEP/$MAX_STEP]" \
--title "CONTAINER TAGS" \
@@ -2349,18 +2349,18 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 17: SSH Settings
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
17)
configure_ssh_settings "Step $STEP/$MAX_STEP"
# configure_ssh_settings handles its own flow, always advance
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 18: FUSE Support
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
18)
local fuse_default_flag="--defaultno"
[[ "$_enable_fuse" == "yes" || "$_enable_fuse" == "1" ]] && fuse_default_flag=""
@@ -2382,9 +2382,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 19: TUN/TAP Support
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
19)
local tun_default_flag="--defaultno"
[[ "$_enable_tun" == "yes" || "$_enable_tun" == "1" ]] && tun_default_flag=""
@@ -2406,9 +2406,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 20: Nesting Support
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
20)
local nesting_default_flag=""
[[ "$_enable_nesting" == "0" || "$_enable_nesting" == "no" ]] && nesting_default_flag="--defaultno"
@@ -2436,9 +2436,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 21: GPU Passthrough
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
21)
local gpu_default_flag="--defaultno"
[[ "$_enable_gpu" == "yes" ]] && gpu_default_flag=""
@@ -2460,9 +2460,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 22: Keyctl Support (Docker/systemd)
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
22)
local keyctl_default_flag="--defaultno"
[[ "$_enable_keyctl" == "1" ]] && keyctl_default_flag=""
@@ -2484,9 +2484,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 23: APT Cacher Proxy
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
23)
local apt_cacher_default_flag="--defaultno"
[[ "$_apt_cacher" == "yes" ]] && apt_cacher_default_flag=""
@@ -2516,9 +2516,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 24: Container Timezone
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
24)
local tz_hint="$_ct_timezone"
[[ -z "$tz_hint" ]] && tz_hint="(empty - will use host timezone)"
@@ -2541,9 +2541,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 25: Container Protection
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
25)
local protect_default_flag="--defaultno"
[[ "$_protect_ct" == "yes" || "$_protect_ct" == "1" ]] && protect_default_flag=""
@@ -2565,9 +2565,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 26: Device Node Creation (mknod)
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
26)
local mknod_default_flag="--defaultno"
[[ "$_enable_mknod" == "1" ]] && mknod_default_flag=""
@@ -2589,9 +2589,9 @@ advanced_settings() {
((STEP++))
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 27: Mount Filesystems
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
27)
local mount_hint=""
[[ -n "$_mount_fs" ]] && mount_hint="$_mount_fs" || mount_hint="(none)"
@@ -2608,9 +2608,9 @@ advanced_settings() {
fi
;;
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# STEP 28: Verbose Mode & Confirmation
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
28)
local verbose_default_flag="--defaultno"
[[ "$_verbose" == "yes" ]] && verbose_default_flag=""
@@ -2676,9 +2676,9 @@ Advanced:
esac
done
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
# Apply all collected values to global variables
# ═══════════════════════════════════════════════════════════════════════════
# ------------------------------------------------------------------------------
CT_TYPE="$_ct_type"
PW="$_pw"
CT_ID="$_ct_id"
@@ -2895,6 +2895,9 @@ echo_default() {
echo -e "${DISKSIZE}${BOLD}${DGN}Disk Size: ${BGN}${DISK_SIZE} GB${CL}"
echo -e "${CPUCORE}${BOLD}${DGN}CPU Cores: ${BGN}${CORE_COUNT}${CL}"
echo -e "${RAMSIZE}${BOLD}${DGN}RAM Size: ${BGN}${RAM_SIZE} MiB${CL}"
if [[ "$(dpkg --print-architecture)" == "arm64" ]]; then
echo -e "${INFO}${BOLD}${DGN}Architecture: ${BGN}arm64${CL}"
fi
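The summary above keys off `dpkg --print-architecture`, which only exists on Debian-family hosts. As a hedged aside, the same amd64/arm64 decision can be approximated from `uname -m` on other systems (this helper is illustrative, not part of the script):

```shell
# Map `uname -m` machine names onto the dpkg architecture names the
# script compares against. Anything else is treated as unsupported,
# mirroring exit code 106 above.
to_deb_arch() {
  case "$1" in
    x86_64) echo "amd64" ;;
    aarch64 | arm64) echo "arm64" ;;
    *) echo "unsupported" ;;
  esac
}
to_deb_arch "$(uname -m)"
```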
if [[ -n "${var_gpu:-}" && "${var_gpu}" == "yes" ]]; then
echo -e "${GPU}${BOLD}${DGN}GPU Passthrough: ${BGN}Enabled${CL}"
fi
@@ -3922,6 +3925,17 @@ EOF
configure_gpu_passthrough
configure_additional_devices
# Increase disk size for AMD ROCm runtime (~4GB extra needed)
if [[ "${GPU_TYPE:-}" == "AMD" ]]; then
local rocm_extra=4
local new_disk_size=$((PCT_DISK_SIZE + rocm_extra))
if pct resize "$CTID" rootfs "${new_disk_size}G" >/dev/null 2>&1; then
msg_ok "Disk resized ${PCT_DISK_SIZE}GB → ${new_disk_size}GB for ROCm"
else
msg_warn "Failed to resize disk for ROCm — installation may fail if space is insufficient"
fi
fi
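The ROCm headroom math above, shown with example values (the real script takes `PCT_DISK_SIZE` from the user's disk-size choice and passes the container ID to `pct resize`):

```shell
# ~4 GB of extra rootfs is added for the AMD ROCm runtime before resize.
PCT_DISK_SIZE=10
rocm_extra=4
new_disk_size=$((PCT_DISK_SIZE + rocm_extra))
echo "pct resize <ctid> rootfs ${new_disk_size}G"
```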
# ============================================================================
# START CONTAINER AND INSTALL USERLAND
# ============================================================================
@@ -4065,7 +4079,11 @@ EOF'
msg_warn "Skipping timezone setup: zone '$tz' not found in container"
fi
pct exec "$CTID" -- bash -c "apt-get update 2>&1 && apt-get install -y sudo curl mc gnupg2 jq 2>&1" >>"$BUILD_LOG" 2>&1 || {
local _base_pkgs="sudo curl mc gnupg2 jq"
if [[ "${ARCH:-amd64}" == "arm64" ]]; then
_base_pkgs+=" openssh-server wget gcc"
fi
pct exec "$CTID" -- bash -c "apt-get update 2>&1 && apt-get install -y ${_base_pkgs} 2>&1" >>"$BUILD_LOG" 2>&1 || {
msg_error "apt-get base packages installation failed"
install_exit_code=1
}
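The arch-aware package list above can be sketched as a small helper: arm64 templates ship a more minimal userland, so extra packages are appended before the single apt-get call (the function name is illustrative; the script inlines this logic):

```shell
# Build the base package list, widening it for arm64 containers.
build_base_pkgs() {
  local pkgs="sudo curl mc gnupg2 jq"
  if [ "$1" = "arm64" ]; then
    pkgs+=" openssh-server wget gcc"
  fi
  echo "$pkgs"
}
build_base_pkgs amd64
build_base_pkgs arm64
```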
@@ -4096,7 +4114,9 @@ EOF'
# that sends "configuring" status AFTER the host already reported "failed"
export CONTAINER_INSTALLING=true
lxc-attach -n "$CTID" -- bash -c "$(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/install/${var_install}.sh)"
local _install_script
_install_script="$(curl -fsSL "https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/install/${var_install}.sh")"
lxc-attach -n "$CTID" -- bash -c "$_install_script"
local lxc_exit=$?
unset CONTAINER_INSTALLING
@@ -4474,7 +4494,9 @@ EOF'
# Re-run install script in existing container (don't destroy/recreate)
set +Eeuo pipefail
trap - ERR
lxc-attach -n "$CTID" -- bash -c "$(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/install/${var_install}.sh)"
local _install_script
_install_script="$(curl -fsSL "https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/install/${var_install}.sh")"
lxc-attach -n "$CTID" -- bash -c "$_install_script"
local apt_retry_exit=$?
set -Eeuo pipefail
trap 'error_handler' ERR
@@ -4993,6 +5015,72 @@ create_lxc_container() {
esac
}
ARCH="$(dpkg --print-architecture)"
# Maps OS type + version to the release variant name used by ARM64 template sources.
arm64_template_variant() {
case "$1" in
debian)
case "$2" in
12 | 12.*) echo "bookworm" ;; 13 | 13.*) echo "trixie" ;;
*) echo "trixie" ;;
esac
;;
alpine) echo "3.22" ;;
ubuntu)
case "$2" in
24.04* | noble) echo "noble" ;; 24.10* | oracular) echo "oracular" ;;
*) echo "jammy" ;;
esac
;;
*) return 1 ;;
esac
}
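For illustration, the mapping above restated as a standalone function, handy for eyeballing which release variant a given OS/version pair resolves to (same logic, independent name):

```shell
# OS type + version -> ARM64 template release variant.
variant_for() {
  case "$1" in
    debian)
      case "$2" in
        12 | 12.*) echo "bookworm" ;;
        *) echo "trixie" ;;
      esac
      ;;
    alpine) echo "3.22" ;;
    ubuntu)
      case "$2" in
        24.04* | noble) echo "noble" ;;
        24.10* | oracular) echo "oracular" ;;
        *) echo "jammy" ;;
      esac
      ;;
    *) return 1 ;;  # no mapping: caller aborts with exit 207
  esac
}
```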
# Downloads an ARM64 LXC rootfs template to $1.
# Debian: fetches latest release from community-scripts/debian-arm64-lxc on GitHub.
# Others: fetches from jenkins.linuxcontainers.org.
download_arm64_template() {
local dest="$1" url
mkdir -p "$(dirname "$dest")" || {
msg_error "Cannot create template dir."
exit 207
}
if [[ "$PCT_OSTYPE" == "debian" ]]; then
url=$(curl -fsSL "https://api.github.com/repos/community-scripts/debian-arm64-lxc/releases/latest" |
grep -Eo "https://[^\"]*debian-${CUSTOM_TEMPLATE_VARIANT}-arm64-rootfs\.tar\.xz" | head -n1)
[[ -n "$url" ]] || {
msg_error "Could not find Debian ${CUSTOM_TEMPLATE_VARIANT} ARM64 template URL."
exit 207
}
else
url="https://jenkins.linuxcontainers.org/job/image-${PCT_OSTYPE}/architecture=arm64,release=${CUSTOM_TEMPLATE_VARIANT},variant=default/lastStableBuild/artifact/rootfs.tar.xz"
fi
msg_info "Downloading ${PCT_OSTYPE^} ${CUSTOM_TEMPLATE_VARIANT} ARM64 template"
if ! curl -fsSL -o "$dest" "$url"; then
msg_error "Failed to download ARM64 template from: $url"
exit 208
fi
msg_ok "Downloaded ARM64 LXC template"
}
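For the non-Debian branch, the jenkins.linuxcontainers.org URL is pure string assembly from the OS type and resolved variant. A sketch with example values (no network access, just the construction):

```shell
# Assemble the linuxcontainers.org artifact URL for an arm64 rootfs.
PCT_OSTYPE="ubuntu"
CUSTOM_TEMPLATE_VARIANT="noble"
url="https://jenkins.linuxcontainers.org/job/image-${PCT_OSTYPE}/architecture=arm64,release=${CUSTOM_TEMPLATE_VARIANT},variant=default/lastStableBuild/artifact/rootfs.tar.xz"
echo "$url"
```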
# Architecture-aware template download wrapper.
# Optional $1 overrides destination path (for local-storage fallback).
download_template() {
local dest="${1:-$TEMPLATE_PATH}"
if [[ "$ARCH" == "arm64" ]]; then
download_arm64_template "$dest"
else
pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >>"${BUILD_LOG:-/dev/null}" 2>&1 || {
msg_error "Failed to download template '$TEMPLATE' to storage '$TEMPLATE_STORAGE'"
exit 222
}
fi
}
# ------------------------------------------------------------------------------
# Required input variables
# ------------------------------------------------------------------------------
@@ -5129,153 +5217,116 @@ create_lxc_container() {
# ------------------------------------------------------------------------------
# Template discovery & validation
# ------------------------------------------------------------------------------
TEMPLATE_SEARCH="${PCT_OSTYPE}-${PCT_OSVERSION:-}"
case "$PCT_OSTYPE" in
debian | ubuntu) TEMPLATE_PATTERN="-standard_" ;;
alpine | fedora | rocky | centos) TEMPLATE_PATTERN="-default_" ;;
*) TEMPLATE_PATTERN="" ;;
esac
CUSTOM_TEMPLATE_VARIANT=""
msg_info "Searching for template '$TEMPLATE_SEARCH'"
if [[ "$ARCH" == "arm64" ]]; then
# ARM64: use custom template download from linuxcontainers.org / GitHub
msg_info "Preparing ARM64 template"
# Initialize variables
ONLINE_TEMPLATE=""
ONLINE_TEMPLATES=()
CUSTOM_TEMPLATE_VARIANT=$(arm64_template_variant "$PCT_OSTYPE" "${PCT_OSVERSION:-}") || {
msg_error "No ARM64 template mapping for ${PCT_OSTYPE} ${PCT_OSVERSION:-latest}"
exit 207
}
# Step 1: Check local templates first (instant)
mapfile -t LOCAL_TEMPLATES < <(
pveam list "$TEMPLATE_STORAGE" 2>/dev/null |
awk -v search="${TEMPLATE_SEARCH}" -v pattern="${TEMPLATE_PATTERN}" '$1 ~ search && $1 ~ pattern {print $1}' |
sed 's|.*/||' | sort -t - -k 2 -V
)
TEMPLATE="${PCT_OSTYPE}-${CUSTOM_TEMPLATE_VARIANT}-rootfs.tar.xz"
TEMPLATE_SOURCE="custom-arm64"
# Step 2: If local template found, use it immediately (skip pveam update)
if [[ ${#LOCAL_TEMPLATES[@]} -gt 0 ]]; then
TEMPLATE="${LOCAL_TEMPLATES[-1]}"
TEMPLATE_SOURCE="local"
msg_ok "Template search completed"
else
# Step 3: No local template - need to check online (this may be slow)
msg_info "No local template found, checking online catalog..."
# Update catalog with timeout to prevent long hangs
if command -v timeout &>/dev/null; then
if ! timeout 30 pveam update >/dev/null 2>&1; then
msg_warn "Template catalog update timed out (possible network/DNS issue). Run 'pveam update' manually to diagnose."
fi
else
pveam update >/dev/null 2>&1 || msg_warn "Could not update template catalog (pveam update failed)"
# Resolve template path: pvesm → storage.cfg fallback → default
TEMPLATE_PATH="$(pvesm path "${TEMPLATE_STORAGE}:vztmpl/${TEMPLATE}" 2>/dev/null || true)"
if [[ -z "$TEMPLATE_PATH" ]]; then
local _tpl_base
_tpl_base=$(awk -v s="$TEMPLATE_STORAGE" '$1==s {f=1} f && /path/ {print $2; exit}' /etc/pve/storage.cfg)
TEMPLATE_PATH="${_tpl_base:-/var/lib/vz}/template/cache/$TEMPLATE"
fi
# Download if missing, too small, or corrupt (single pass)
if [[ ! -f "$TEMPLATE_PATH" ]]; then
download_arm64_template "$TEMPLATE_PATH"
elif [[ "$(stat -c%s "$TEMPLATE_PATH")" -lt 1000000 ]] || ! tar -tf "$TEMPLATE_PATH" &>/dev/null; then
msg_warn "Local template invalid, re-downloading."
rm -f "$TEMPLATE_PATH"
download_arm64_template "$TEMPLATE_PATH"
else
msg_ok "Template ${BL}$TEMPLATE${CL} found locally."
fi
else
TEMPLATE_SEARCH="${PCT_OSTYPE}-${PCT_OSVERSION:-}"
case "$PCT_OSTYPE" in
debian | ubuntu) TEMPLATE_PATTERN="-standard_" ;;
alpine | fedora | rocky | centos) TEMPLATE_PATTERN="-default_" ;;
*) TEMPLATE_PATTERN="" ;;
esac
msg_info "Searching for template '$TEMPLATE_SEARCH'"
# Initialize variables
ONLINE_TEMPLATE=""
ONLINE_TEMPLATES=()
mapfile -t ONLINE_TEMPLATES < <(pveam available -section system 2>/dev/null | grep -E '\.(tar\.zst|tar\.xz|tar\.gz)$' | awk '{print $2}' | grep -E "^${TEMPLATE_SEARCH}.*${TEMPLATE_PATTERN}" | sort -t - -k 2 -V 2>/dev/null || true)
[[ ${#ONLINE_TEMPLATES[@]} -gt 0 ]] && ONLINE_TEMPLATE="${ONLINE_TEMPLATES[-1]}"
TEMPLATE="$ONLINE_TEMPLATE"
TEMPLATE_SOURCE="online"
msg_ok "Template search completed"
fi
# If still no template, try to find alternatives
if [[ -z "$TEMPLATE" ]]; then
msg_warn "No template found for ${PCT_OSTYPE} ${PCT_OSVERSION}, searching for alternatives..."
# Get all available versions for this OS type
AVAILABLE_VERSIONS=()
mapfile -t AVAILABLE_VERSIONS < <(
pveam available -section system 2>/dev/null |
grep -E '\.(tar\.zst|tar\.xz|tar\.gz)$' |
awk -F'\t' '{print $1}' |
grep "^${PCT_OSTYPE}-" |
sed -E "s/.*${PCT_OSTYPE}-([0-9]+(\.[0-9]+)?).*/\1/" |
sort -u -V 2>/dev/null
# Step 1: Check local templates first (instant)
mapfile -t LOCAL_TEMPLATES < <(
pveam list "$TEMPLATE_STORAGE" 2>/dev/null |
awk -v search="${TEMPLATE_SEARCH}" -v pattern="${TEMPLATE_PATTERN}" '$1 ~ search && $1 ~ pattern {print $1}' |
sed 's|.*/||' | sort -t - -k 2 -V
)
if [[ ${#AVAILABLE_VERSIONS[@]} -gt 0 ]]; then
echo ""
echo "${BL}Available ${PCT_OSTYPE} versions:${CL}"
for i in "${!AVAILABLE_VERSIONS[@]}"; do
echo " [$((i + 1))] ${AVAILABLE_VERSIONS[$i]}"
done
echo ""
read -p "Select version [1-${#AVAILABLE_VERSIONS[@]}] or press Enter to cancel: " choice </dev/tty
# Step 2: If local template found, use it immediately (skip pveam update)
if [[ ${#LOCAL_TEMPLATES[@]} -gt 0 ]]; then
TEMPLATE="${LOCAL_TEMPLATES[-1]}"
TEMPLATE_SOURCE="local"
msg_ok "Template search completed"
else
# Step 3: No local template - need to check online (this may be slow)
msg_info "No local template found, checking online catalog..."
if [[ "$choice" =~ ^[0-9]+$ ]] && [[ "$choice" -ge 1 ]] && [[ "$choice" -le ${#AVAILABLE_VERSIONS[@]} ]]; then
PCT_OSVERSION="${AVAILABLE_VERSIONS[$((choice - 1))]}"
TEMPLATE_SEARCH="${PCT_OSTYPE}-${PCT_OSVERSION}"
ONLINE_TEMPLATES=()
mapfile -t ONLINE_TEMPLATES < <(
pveam available -section system 2>/dev/null |
grep -E '\.(tar\.zst|tar\.xz|tar\.gz)$' |
awk '{print $2}' |
grep -E "^${TEMPLATE_SEARCH}-.*${TEMPLATE_PATTERN}" |
sort -t - -k 2 -V 2>/dev/null || true
)
if [[ ${#ONLINE_TEMPLATES[@]} -gt 0 ]]; then
TEMPLATE="${ONLINE_TEMPLATES[-1]}"
TEMPLATE_SOURCE="online"
else
msg_error "No templates available for ${PCT_OSTYPE} ${PCT_OSVERSION}"
exit 225
# Update catalog with timeout to prevent long hangs
if command -v timeout &>/dev/null; then
if ! timeout 30 pveam update >/dev/null 2>&1; then
msg_warn "Template catalog update timed out (possible network/DNS issue). Run 'pveam update' manually to diagnose."
fi
else
msg_custom "🚫" "${YW}" "Installation cancelled"
exit 0
pveam update >/dev/null 2>&1 || msg_warn "Could not update template catalog (pveam update failed)"
fi
else
msg_error "No ${PCT_OSTYPE} templates available at all"
exit 225
ONLINE_TEMPLATES=()
mapfile -t ONLINE_TEMPLATES < <(pveam available -section system 2>/dev/null | grep -E '\.(tar\.zst|tar\.xz|tar\.gz)$' | awk '{print $2}' | grep -E "^${TEMPLATE_SEARCH}.*${TEMPLATE_PATTERN}" | sort -t - -k 2 -V 2>/dev/null || true)
[[ ${#ONLINE_TEMPLATES[@]} -gt 0 ]] && ONLINE_TEMPLATE="${ONLINE_TEMPLATES[-1]}"
TEMPLATE="$ONLINE_TEMPLATE"
TEMPLATE_SOURCE="online"
msg_ok "Template search completed"
fi
fi
TEMPLATE_PATH="$(pvesm path $TEMPLATE_STORAGE:vztmpl/$TEMPLATE 2>/dev/null || true)"
if [[ -z "$TEMPLATE_PATH" ]]; then
TEMPLATE_BASE=$(awk -v s="$TEMPLATE_STORAGE" '$1==s {f=1} f && /path/ {print $2; exit}' /etc/pve/storage.cfg)
[[ -n "$TEMPLATE_BASE" ]] && TEMPLATE_PATH="$TEMPLATE_BASE/template/cache/$TEMPLATE"
fi
# If we still don't have a path but have a valid template name, construct it
if [[ -z "$TEMPLATE_PATH" && -n "$TEMPLATE" ]]; then
TEMPLATE_PATH="/var/lib/vz/template/cache/$TEMPLATE"
fi
[[ -n "$TEMPLATE_PATH" ]] || {
# If still no template, try to find alternatives
if [[ -z "$TEMPLATE" ]]; then
msg_error "Template ${PCT_OSTYPE} ${PCT_OSVERSION} not available"
msg_warn "No template found for ${PCT_OSTYPE} ${PCT_OSVERSION}, searching for alternatives..."
# Get available versions
# Get all available versions for this OS type
AVAILABLE_VERSIONS=()
mapfile -t AVAILABLE_VERSIONS < <(
pveam available -section system 2>/dev/null |
grep -E '\.(tar\.zst|tar\.xz|tar\.gz)$' |
awk -F'\t' '{print $1}' |
grep "^${PCT_OSTYPE}-" |
sed -E 's/.*'"${PCT_OSTYPE}"'-([0-9]+\.[0-9]+).*/\1/' |
grep -E '^[0-9]+\.[0-9]+$' |
sort -u -V 2>/dev/null || sort -u
sed -E "s/.*${PCT_OSTYPE}-([0-9]+(\.[0-9]+)?).*/\1/" |
sort -u -V 2>/dev/null
)
if [[ ${#AVAILABLE_VERSIONS[@]} -gt 0 ]]; then
echo -e "\n${BL}Available versions:${CL}"
echo ""
echo "${BL}Available ${PCT_OSTYPE} versions:${CL}"
for i in "${!AVAILABLE_VERSIONS[@]}"; do
echo " [$((i + 1))] ${AVAILABLE_VERSIONS[$i]}"
done
echo ""
read -p "Select version [1-${#AVAILABLE_VERSIONS[@]}] or Enter to exit: " choice </dev/tty
read -p "Select version [1-${#AVAILABLE_VERSIONS[@]}] or press Enter to cancel: " choice </dev/tty
if [[ "$choice" =~ ^[0-9]+$ ]] && [[ "$choice" -ge 1 ]] && [[ "$choice" -le ${#AVAILABLE_VERSIONS[@]} ]]; then
export var_version="${AVAILABLE_VERSIONS[$((choice - 1))]}"
export PCT_OSVERSION="$var_version"
msg_ok "Switched to ${PCT_OSTYPE} ${var_version}"
PCT_OSVERSION="${AVAILABLE_VERSIONS[$((choice - 1))]}"
TEMPLATE_SEARCH="${PCT_OSTYPE}-${PCT_OSVERSION}"
# Retry template search with new version
TEMPLATE_SEARCH="${PCT_OSTYPE}-${PCT_OSVERSION:-}"
mapfile -t LOCAL_TEMPLATES < <(
pveam list "$TEMPLATE_STORAGE" 2>/dev/null |
awk -v search="${TEMPLATE_SEARCH}-" -v pattern="${TEMPLATE_PATTERN}" '$1 ~ search && $1 ~ pattern {print $1}' |
sed 's|.*/||' | sort -t - -k 2 -V
)
ONLINE_TEMPLATES=()
mapfile -t ONLINE_TEMPLATES < <(
pveam available -section system 2>/dev/null |
grep -E '\.(tar\.zst|tar\.xz|tar\.gz)$' |
@@ -5283,109 +5334,181 @@ create_lxc_container() {
grep -E "^${TEMPLATE_SEARCH}-.*${TEMPLATE_PATTERN}" |
sort -t - -k 2 -V 2>/dev/null || true
)
ONLINE_TEMPLATE=""
[[ ${#ONLINE_TEMPLATES[@]} -gt 0 ]] && ONLINE_TEMPLATE="${ONLINE_TEMPLATES[-1]}"
if [[ ${#LOCAL_TEMPLATES[@]} -gt 0 ]]; then
TEMPLATE="${LOCAL_TEMPLATES[-1]}"
TEMPLATE_SOURCE="local"
else
TEMPLATE="$ONLINE_TEMPLATE"
if [[ ${#ONLINE_TEMPLATES[@]} -gt 0 ]]; then
TEMPLATE="${ONLINE_TEMPLATES[-1]}"
TEMPLATE_SOURCE="online"
else
msg_error "No templates available for ${PCT_OSTYPE} ${PCT_OSVERSION}"
exit 225
fi
TEMPLATE_PATH="$(pvesm path $TEMPLATE_STORAGE:vztmpl/$TEMPLATE 2>/dev/null || true)"
if [[ -z "$TEMPLATE_PATH" ]]; then
TEMPLATE_BASE=$(awk -v s="$TEMPLATE_STORAGE" '$1==s {f=1} f && /path/ {print $2; exit}' /etc/pve/storage.cfg)
[[ -n "$TEMPLATE_BASE" ]] && TEMPLATE_PATH="$TEMPLATE_BASE/template/cache/$TEMPLATE"
fi
# If we still don't have a path but have a valid template name, construct it
if [[ -z "$TEMPLATE_PATH" && -n "$TEMPLATE" ]]; then
TEMPLATE_PATH="/var/lib/vz/template/cache/$TEMPLATE"
fi
[[ -n "$TEMPLATE_PATH" ]] || {
msg_error "Template still not found after version change"
exit 220
}
else
msg_custom "🚫" "${YW}" "Installation cancelled"
exit 0
fi
else
msg_error "No ${PCT_OSTYPE} templates available"
exit 220
exit 225
fi
fi
}
# Validate that we found a template
if [[ -z "$TEMPLATE" ]]; then
msg_error "No template found for ${PCT_OSTYPE} ${PCT_OSVERSION}"
msg_custom "" "${YW}" "Please check:"
msg_custom " •" "${YW}" "Is pveam catalog available? (run: pveam available -section system)"
msg_custom " •" "${YW}" "Does the template exist for your OS version?"
exit 225
fi
msg_ok "Template ${BL}$TEMPLATE${CL} [$TEMPLATE_SOURCE]"
msg_debug "Resolved TEMPLATE_PATH=$TEMPLATE_PATH"
NEED_DOWNLOAD=0
if [[ ! -f "$TEMPLATE_PATH" ]]; then
msg_info "Template not present locally, will download."
NEED_DOWNLOAD=1
elif [[ ! -r "$TEMPLATE_PATH" ]]; then
msg_error "Template file exists but is not readable, check permissions."
exit 221
elif [[ "$(stat -c%s "$TEMPLATE_PATH")" -lt 1000000 ]]; then
if [[ -n "$ONLINE_TEMPLATE" ]]; then
msg_warn "Template file too small (<1MB), re-downloading."
NEED_DOWNLOAD=1
else
msg_warn "Template looks too small, but no online version exists. Keeping local file."
fi
elif ! tar -tf "$TEMPLATE_PATH" &>/dev/null; then
if [[ -n "$ONLINE_TEMPLATE" ]]; then
msg_warn "Template appears corrupted, re-downloading."
NEED_DOWNLOAD=1
else
msg_warn "Template appears corrupted, but no online version exists. Keeping local file."
fi
else
$STD msg_ok "Template $TEMPLATE is present and valid."
fi
if [[ "$TEMPLATE_SOURCE" == "local" && -n "$ONLINE_TEMPLATE" && "$TEMPLATE" != "$ONLINE_TEMPLATE" ]]; then
msg_warn "Local template is outdated: $TEMPLATE (latest available: $ONLINE_TEMPLATE)"
if whiptail --yesno "A newer template is available:\n$ONLINE_TEMPLATE\n\nDo you want to download and use it instead?" 12 70; then
TEMPLATE="$ONLINE_TEMPLATE"
NEED_DOWNLOAD=1
else
msg_custom "" "${BL}" "Continuing with local template $TEMPLATE"
fi
fi
[[ -n "$TEMPLATE_PATH" ]] || {
if [[ -z "$TEMPLATE" ]]; then
msg_error "Template ${PCT_OSTYPE} ${PCT_OSVERSION} not available"
# Get available versions
mapfile -t AVAILABLE_VERSIONS < <(
pveam available -section system 2>/dev/null |
grep "^${PCT_OSTYPE}-" |
sed -E 's/.*'"${PCT_OSTYPE}"'-([0-9]+\.[0-9]+).*/\1/' |
grep -E '^[0-9]+\.[0-9]+$' |
sort -u -V 2>/dev/null || sort -u
)
if [[ ${#AVAILABLE_VERSIONS[@]} -gt 0 ]]; then
echo -e "\n${BL}Available versions:${CL}"
for i in "${!AVAILABLE_VERSIONS[@]}"; do
echo " [$((i + 1))] ${AVAILABLE_VERSIONS[$i]}"
done
echo ""
read -p "Select version [1-${#AVAILABLE_VERSIONS[@]}] or press Enter to exit: " choice </dev/tty
if [[ "$choice" =~ ^[0-9]+$ ]] && [[ "$choice" -ge 1 ]] && [[ "$choice" -le ${#AVAILABLE_VERSIONS[@]} ]]; then
export var_version="${AVAILABLE_VERSIONS[$((choice - 1))]}"
export PCT_OSVERSION="$var_version"
msg_ok "Switched to ${PCT_OSTYPE} ${var_version}"
# Retry template search with new version
TEMPLATE_SEARCH="${PCT_OSTYPE}-${PCT_OSVERSION:-}"
mapfile -t LOCAL_TEMPLATES < <(
pveam list "$TEMPLATE_STORAGE" 2>/dev/null |
awk -v search="${TEMPLATE_SEARCH}-" -v pattern="${TEMPLATE_PATTERN}" '$1 ~ search && $1 ~ pattern {print $1}' |
sed 's|.*/||' | sort -t - -k 2 -V
)
mapfile -t ONLINE_TEMPLATES < <(
pveam available -section system 2>/dev/null |
grep -E '\.(tar\.zst|tar\.xz|tar\.gz)$' |
awk '{print $2}' |
grep -E "^${TEMPLATE_SEARCH}-.*${TEMPLATE_PATTERN}" |
sort -t - -k 2 -V 2>/dev/null || true
)
ONLINE_TEMPLATE=""
[[ ${#ONLINE_TEMPLATES[@]} -gt 0 ]] && ONLINE_TEMPLATE="${ONLINE_TEMPLATES[-1]}"
if [[ ${#LOCAL_TEMPLATES[@]} -gt 0 ]]; then
TEMPLATE="${LOCAL_TEMPLATES[-1]}"
TEMPLATE_SOURCE="local"
else
TEMPLATE="$ONLINE_TEMPLATE"
TEMPLATE_SOURCE="online"
fi
TEMPLATE_PATH="$(pvesm path $TEMPLATE_STORAGE:vztmpl/$TEMPLATE 2>/dev/null || true)"
if [[ -z "$TEMPLATE_PATH" ]]; then
TEMPLATE_BASE=$(awk -v s="$TEMPLATE_STORAGE" '$1==s {f=1} f && /path/ {print $2; exit}' /etc/pve/storage.cfg)
[[ -n "$TEMPLATE_BASE" ]] && TEMPLATE_PATH="$TEMPLATE_BASE/template/cache/$TEMPLATE"
fi
# If we still don't have a path but have a valid template name, construct it
if [[ -z "$TEMPLATE_PATH" && -n "$TEMPLATE" ]]; then
TEMPLATE_PATH="/var/lib/vz/template/cache/$TEMPLATE"
fi
[[ -n "$TEMPLATE_PATH" ]] || {
msg_error "Template still not found after version change"
exit 220
}
else
msg_custom "🚫" "${YW}" "Installation cancelled"
exit 0
fi
else
msg_error "No ${PCT_OSTYPE} templates available"
exit 220
fi
fi
}
# Validate that we found a template
if [[ -z "$TEMPLATE" ]]; then
msg_error "No template found for ${PCT_OSTYPE} ${PCT_OSVERSION}"
msg_custom "" "${YW}" "Please check:"
msg_custom " •" "${YW}" "Is pveam catalog available? (run: pveam available -section system)"
msg_custom " •" "${YW}" "Does the template exist for your OS version?"
exit 225
fi
msg_ok "Template ${BL}$TEMPLATE${CL} [$TEMPLATE_SOURCE]"
msg_debug "Resolved TEMPLATE_PATH=$TEMPLATE_PATH"
NEED_DOWNLOAD=0
if [[ ! -f "$TEMPLATE_PATH" ]]; then
msg_info "Template not present locally, will download it."
NEED_DOWNLOAD=1
elif [[ ! -r "$TEMPLATE_PATH" ]]; then
msg_error "Template file exists but is not readable, check permissions."
exit 221
elif [[ "$(stat -c%s "$TEMPLATE_PATH")" -lt 1000000 ]]; then
if [[ -n "$ONLINE_TEMPLATE" ]]; then
msg_warn "Template file too small (<1MB), re-downloading."
NEED_DOWNLOAD=1
else
msg_warn "Template looks too small, but no online version exists. Keeping local file."
fi
elif ! tar -tf "$TEMPLATE_PATH" &>/dev/null; then
if [[ -n "$ONLINE_TEMPLATE" ]]; then
msg_warn "Template appears corrupted, re-downloading."
NEED_DOWNLOAD=1
else
msg_warn "Template appears corrupted, but no online version exists. Keeping local file."
fi
else
$STD msg_ok "Template $TEMPLATE is present and valid."
fi
if [[ "$TEMPLATE_SOURCE" == "local" && -n "$ONLINE_TEMPLATE" && "$TEMPLATE" != "$ONLINE_TEMPLATE" ]]; then
msg_warn "Local template is outdated: $TEMPLATE (latest available: $ONLINE_TEMPLATE)"
if whiptail --yesno "A newer template is available:\n$ONLINE_TEMPLATE\n\nDo you want to download and use it instead?" 12 70; then
TEMPLATE="$ONLINE_TEMPLATE"
NEED_DOWNLOAD=1
else
msg_custom "" "${BL}" "Continuing with local template $TEMPLATE"
fi
fi
if [[ "$NEED_DOWNLOAD" -eq 1 ]]; then
[[ -f "$TEMPLATE_PATH" ]] && rm -f "$TEMPLATE_PATH"
for attempt in {1..3}; do
msg_info "Attempt $attempt: Downloading template $TEMPLATE to $TEMPLATE_STORAGE"
if pveam download "$TEMPLATE_STORAGE" "$TEMPLATE" >>"${BUILD_LOG:-/dev/null}" 2>&1; then
msg_ok "Template download successful."
break
fi
if [[ $attempt -eq 3 ]]; then
msg_error "Failed after 3 attempts. Please check network access, permissions, or manually run:\n pveam download $TEMPLATE_STORAGE $TEMPLATE"
exit 222
fi
sleep $((attempt * 5))
done
fi
if ! pveam list "$TEMPLATE_STORAGE" 2>/dev/null | grep -q "$TEMPLATE"; then
msg_error "Template $TEMPLATE not available in storage $TEMPLATE_STORAGE after download."
exit 223
fi
fi
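The selection logic above version-sorts template file names with `sort -t - -k 2 -V` and then keeps the last array element. A minimal standalone sketch of that pick-latest step (the `pick_latest` helper name is illustrative, not part of build.func):

```shell
#!/usr/bin/env bash
# Pick the newest template name by version-sorting on the second dash-field,
# mirroring the `sort -t - -k 2 -V` + last-element pattern used above.
pick_latest() {
  printf '%s\n' "$@" | sort -t - -k 2 -V | tail -n 1
}
```

With `-V`, `12.7` sorts after `12.2` numerically rather than lexically, which is why the last line of the sorted list is the newest template.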
# ------------------------------------------------------------------------------
@@ -5464,19 +5587,13 @@ create_lxc_container() {
if [[ ! -s "$TEMPLATE_PATH" || "$(stat -c%s "$TEMPLATE_PATH" 2>/dev/null || echo 0)" -lt 1000000 ]]; then
msg_info "Template file missing or too small, downloading"
rm -f "$TEMPLATE_PATH"
download_template
msg_ok "Template downloaded"
elif ! tar -tf "$TEMPLATE_PATH" &>/dev/null; then
if [[ "$ARCH" == "arm64" || -n "$ONLINE_TEMPLATE" ]]; then
msg_info "Template appears corrupted, re-downloading"
rm -f "$TEMPLATE_PATH"
download_template
msg_ok "Template re-downloaded"
else
msg_warn "Template appears corrupted, but no online version exists. Skipping re-download."
@@ -5497,7 +5614,7 @@ create_lxc_container() {
if grep -qiE 'unable to open|corrupt|invalid' "$LOGFILE"; then
msg_info "Template may be corrupted, re-downloading"
rm -f "$TEMPLATE_PATH"
download_template
msg_ok "Template re-downloaded"
fi
@@ -5510,7 +5627,11 @@ create_lxc_container() {
if [[ ! -f "$LOCAL_TEMPLATE_PATH" ]]; then
msg_ok "Trying local storage fallback"
msg_info "Downloading template to local"
pveam download local "$TEMPLATE" >>"${BUILD_LOG:-/dev/null}" 2>&1
if [[ "$ARCH" == "arm64" ]]; then
download_arm64_template "$LOCAL_TEMPLATE_PATH"
else
pveam download local "$TEMPLATE" >>"${BUILD_LOG:-/dev/null}" 2>&1
fi
msg_ok "Template downloaded to local"
else
msg_ok "Trying local storage fallback"


@@ -344,9 +344,15 @@ pve_check() {
# - Provides link to ARM64-compatible scripts
# ------------------------------------------------------------------------------
arch_check() {
local arch
arch="$(dpkg --print-architecture)"
if [[ "$arch" != "amd64" && "$arch" != "arm64" ]]; then
msg_error "This script requires amd64 or arm64 (detected: $arch)."
sleep 2
exit 106
fi
if [[ "$arch" == "arm64" && "${var_arm64:-}" != "yes" ]]; then
msg_error "This script does not yet support arm64."
sleep 2
exit 106
fi
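The gate above can be exercised in isolation. A hedged sketch of the same decision table (the `arch_gate` helper and its `allow_arm64` flag, standing in for `var_arm64`, are illustrative names, not part of the script):

```shell
#!/bin/sh
# Reproduce arch_check's decision table: amd64 always passes, arm64 only
# when explicitly enabled, and anything else is rejected (exit 106 above).
arch_gate() {
  arch="$1"
  allow_arm64="${2:-no}"
  case "$arch" in
    amd64) echo allowed ;;
    arm64)
      if [ "$allow_arm64" = "yes" ]; then echo allowed; else echo blocked; fi
      ;;
    *) echo unsupported ;;
  esac
}
```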


@@ -99,7 +99,7 @@ if ! declare -f explain_exit_code &>/dev/null; then
103) echo "Validation: Shell is not Bash" ;;
104) echo "Validation: Not running as root (or invoked via sudo)" ;;
105) echo "Validation: Proxmox VE version not supported" ;;
106) echo "Validation: Unsupported architecture (requires amd64 or arm64)" ;;
107) echo "Validation: Kernel key parameters unreadable" ;;
108) echo "Validation: Kernel key limits exceeded" ;;
109) echo "Proxmox: No available container ID after max attempts" ;;


@@ -969,13 +969,43 @@ verify_repo_available() {
}
# ------------------------------------------------------------------------------
# Ensure dependencies are installed (with apt/apk update caching)
# Supports both Debian (apt/dpkg) and Alpine (apk) systems
# ------------------------------------------------------------------------------
ensure_dependencies() {
local deps=("$@")
local missing=()
# Detect Alpine Linux
if [[ -f /etc/alpine-release ]]; then
for dep in "${deps[@]}"; do
if command -v "$dep" &>/dev/null; then
continue
fi
if apk info -e "$dep" &>/dev/null; then
continue
fi
missing+=("$dep")
done
if [[ ${#missing[@]} -gt 0 ]]; then
$STD apk add --no-cache "${missing[@]}" || {
local failed=()
for pkg in "${missing[@]}"; do
if ! $STD apk add --no-cache "$pkg" 2>/dev/null; then
failed+=("$pkg")
fi
done
if [[ ${#failed[@]} -gt 0 ]]; then
msg_error "Failed to install dependencies: ${failed[*]}"
return 1
fi
}
fi
return 0
fi
# Debian/Ubuntu: Fast batch check using dpkg-query
local installed_pkgs
installed_pkgs=$(dpkg-query -W -f='${Package}\n' 2>/dev/null | sort -u)
@@ -1072,11 +1102,15 @@ create_temp_dir() {
}
# ------------------------------------------------------------------------------
# Check if package is installed (supports both Debian and Alpine)
# ------------------------------------------------------------------------------
is_package_installed() {
local package="$1"
if [[ -f /etc/alpine-release ]]; then
apk info -e "$package" &>/dev/null
else
dpkg-query -W -f='${Status}' "$package" 2>/dev/null | grep -q "^install ok installed$"
fi
}
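The `/etc/alpine-release` probe is the entire Debian-vs-Alpine dispatch. A testable sketch with an injectable root directory (the `root` parameter is added here purely for illustration; the real code always checks `/`):

```shell
#!/bin/sh
# Decide which package backend to query, as is_package_installed does,
# but with the filesystem root as a parameter so the probe can be tested.
detect_pkg_backend() {
  root="${1:-}"
  if [ -f "$root/etc/alpine-release" ]; then
    echo apk
  else
    echo dpkg
  fi
}
```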
# ------------------------------------------------------------------------------
@@ -2788,10 +2822,13 @@ function fetch_and_deploy_codeberg_release() {
# Fall back to architecture heuristic
if [[ -z "$url_match" ]]; then
for u in $assets; do
if [[ "${arch,,}" =~ ^(amd64|x86_64)$ ]]; then
[[ "$u" =~ (amd64|x86_64).*\.deb$ ]] || continue
elif [[ "${arch,,}" =~ ^(arm64|aarch64)$ ]]; then
[[ "$u" =~ (arm64|aarch64).*\.deb$ ]] || continue
fi
url_match="$u"
break
done
fi
@@ -3088,7 +3125,11 @@ _gh_scan_older_releases() {
done)
fi
if [[ "$has_match" != "true" ]]; then
if [[ "${arch,,}" =~ ^(amd64|x86_64)$ ]]; then
has_match=$(echo "$releases_list" | jq -r ".[$i].assets[].browser_download_url" | grep -qE '(amd64|x86_64).*\.deb$' && echo true)
elif [[ "${arch,,}" =~ ^(arm64|aarch64)$ ]]; then
has_match=$(echo "$releases_list" | jq -r ".[$i].assets[].browser_download_url" | grep -qE '(arm64|aarch64).*\.deb$' && echo true)
fi
fi
if [[ "$has_match" != "true" ]]; then
has_match=$(echo "$releases_list" | jq -r ".[$i].assets[].browser_download_url" | grep -qE '\.deb$' && echo true)
@@ -3294,10 +3335,13 @@ function fetch_and_deploy_gh_release() {
# If no match via explicit pattern, fall back to architecture heuristic
if [[ -z "$url_match" ]]; then
for u in $assets; do
if [[ "${arch,,}" =~ ^(amd64|x86_64)$ ]]; then
[[ "$u" =~ (amd64|x86_64).*\.deb$ ]] || continue
elif [[ "${arch,,}" =~ ^(arm64|aarch64)$ ]]; then
[[ "$u" =~ (arm64|aarch64).*\.deb$ ]] || continue
fi
url_match="$u"
break
done
fi
@@ -3328,10 +3372,13 @@ function fetch_and_deploy_gh_release() {
fi
if [[ -z "$url_match" ]]; then
for u in $assets; do
if [[ "${arch,,}" =~ ^(amd64|x86_64)$ ]]; then
[[ "$u" =~ (amd64|x86_64).*\.deb$ ]] || continue
elif [[ "${arch,,}" =~ ^(arm64|aarch64)$ ]]; then
[[ "$u" =~ (arm64|aarch64).*\.deb$ ]] || continue
fi
url_match="$u"
break
done
fi
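The stricter heuristic above keeps an arm64 host from matching an `amd64.deb` asset (and vice versa). A POSIX sketch of the same family filter (the `match_deb_asset` name is illustrative):

```shell
#!/bin/sh
# Return the first .deb asset whose name matches the requested architecture
# family, mirroring the amd64/x86_64 vs arm64/aarch64 split used above.
match_deb_asset() {
  arch="$1"
  shift
  for u in "$@"; do
    case "$arch" in
      amd64 | x86_64)
        case "$u" in *amd64*.deb | *x86_64*.deb) echo "$u"; return 0 ;; esac ;;
      arm64 | aarch64)
        case "$u" in *arm64*.deb | *aarch64*.deb) echo "$u"; return 0 ;; esac ;;
    esac
  done
  return 1
}
```

The two-branch split matters because a plain `$arch` substring match would also accept the other family's packages whenever the alternation listed every architecture.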
if [[ -z "$url_match" ]]; then
@@ -3699,7 +3746,12 @@ function setup_ffmpeg() {
# Binary fallback mode
if [[ "$TYPE" == "binary" ]]; then
local ffmpeg_arch
case "$(dpkg --print-architecture 2>/dev/null || echo amd64)" in
arm64) ffmpeg_arch="arm64" ;;
*) ffmpeg_arch="amd64" ;;
esac
if ! CURL_TIMEOUT=300 curl_with_retry "https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-${ffmpeg_arch}-static.tar.xz" "$TMP_DIR/ffmpeg.tar.xz"; then
msg_error "Failed to download FFmpeg binary"
rm -rf "$TMP_DIR"
return 1
@@ -3781,7 +3833,12 @@ function setup_ffmpeg() {
# If no source download (either VERSION empty or download failed), use binary
if [[ -z "$VERSION" ]]; then
msg_info "Setup FFmpeg from pre-built binary"
local ffmpeg_arch
case "$(dpkg --print-architecture 2>/dev/null || echo amd64)" in
arm64) ffmpeg_arch="arm64" ;;
*) ffmpeg_arch="amd64" ;;
esac
if ! CURL_TIMEOUT=300 curl_with_retry "https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-${ffmpeg_arch}-static.tar.xz" "$TMP_DIR/ffmpeg.tar.xz"; then
msg_error "Failed to download FFmpeg pre-built binary"
rm -rf "$TMP_DIR"
return 1
@@ -4512,9 +4569,8 @@ _setup_amd_gpu() {
fi
# Ubuntu includes AMD firmware in linux-firmware by default
# ROCm for compute (optional - large download)
# ROCm compute stack (OpenCL + HIP)
_setup_rocm "$os_id" "$os_codename"
msg_ok "AMD GPU configured"
}
@@ -4539,9 +4595,103 @@ _setup_amd_apu() {
$STD apt -y install firmware-amd-graphics 2>/dev/null || true
fi
# ROCm compute stack (OpenCL + HIP) - also works for many APUs
_setup_rocm "$os_id" "$os_codename"
msg_ok "AMD APU configured"
}
# ══════════════════════════════════════════════════════════════════════════════
# AMD ROCm Compute Setup
# Adds ROCm repository and installs the ROCm compute stack for AMD GPUs/APUs.
# Provides: OpenCL, HIP, rocm-smi, rocminfo
# Supported: Debian 12/13, Ubuntu 22.04/24.04 (amd64 only)
# ══════════════════════════════════════════════════════════════════════════════
_setup_rocm() {
local os_id="$1" os_codename="$2"
# Only amd64 is supported
if [[ "$(dpkg --print-architecture 2>/dev/null)" != "amd64" ]]; then
msg_warn "ROCm is only available for amd64 — skipping"
return 0
fi
local ROCM_VERSION="7.2"
local ROCM_REPO_CODENAME
# Map OS codename to ROCm repository codename (Ubuntu-based repos)
case "${os_id}-${os_codename}" in
debian-bookworm) ROCM_REPO_CODENAME="jammy" ;;
debian-trixie | debian-sid) ROCM_REPO_CODENAME="noble" ;;
ubuntu-jammy) ROCM_REPO_CODENAME="jammy" ;;
ubuntu-noble) ROCM_REPO_CODENAME="noble" ;;
*)
msg_warn "ROCm not supported on ${os_id} ${os_codename} — skipping"
return 0
;;
esac
msg_info "Installing ROCm ${ROCM_VERSION} compute stack"
# ROCm main repository (userspace compute libs)
setup_deb822_repo \
"rocm" \
"https://repo.radeon.com/rocm/rocm.gpg.key" \
"https://repo.radeon.com/rocm/apt/${ROCM_VERSION}" \
"${ROCM_REPO_CODENAME}" \
"main" \
"amd64" || {
msg_warn "Failed to add ROCm repository — skipping ROCm"
return 0
}
# AMDGPU driver repository (append to same keyring)
{
echo ""
echo "Types: deb"
echo "URIs: https://repo.radeon.com/amdgpu/latest/ubuntu"
echo "Suites: ${ROCM_REPO_CODENAME}"
echo "Components: main"
echo "Architectures: amd64"
echo "Signed-By: /etc/apt/keyrings/rocm.gpg"
} >>/etc/apt/sources.list.d/rocm.sources
# Pin ROCm packages to prefer radeon repo
cat <<EOF >/etc/apt/preferences.d/rocm-pin-600
Package: *
Pin: release o=repo.radeon.com
Pin-Priority: 600
EOF
$STD apt update
# Install only runtime packages — full 'rocm' meta-package includes 15GB+ dev tools
$STD apt install -y rocm-opencl-runtime rocm-hip-runtime rocm-smi-lib 2>/dev/null || {
msg_warn "ROCm runtime install failed — trying minimal set"
$STD apt install -y rocm-opencl-runtime rocm-smi-lib 2>/dev/null || msg_warn "ROCm minimal install also failed"
}
# Group membership for GPU access
usermod -aG render,video root 2>/dev/null || true
# Environment (PATH + LD_LIBRARY_PATH)
if [[ -d /opt/rocm ]]; then
cat <<'ENVEOF' >/etc/profile.d/rocm.sh
export PATH="$PATH:/opt/rocm/bin"
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}/opt/rocm/lib"
ENVEOF
chmod +x /etc/profile.d/rocm.sh
# Also make available for current session / systemd services
echo "/opt/rocm/lib" >/etc/ld.so.conf.d/rocm.conf
ldconfig 2>/dev/null || true
fi
if [[ -x /opt/rocm/bin/rocminfo ]]; then
msg_ok "ROCm ${ROCM_VERSION} installed"
else
msg_warn "ROCm installed but rocminfo not found — GPU may not be available in container"
fi
}
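The profile snippet relies on the `${VAR:+word}` expansion so an empty `LD_LIBRARY_PATH` does not produce a stray leading colon. The same logic as a small testable POSIX function (the `append_path` name is illustrative):

```shell
#!/bin/sh
# Append a directory to a colon-separated path, avoiding a leading colon
# when the existing value is empty -- the ${VAR:+...} trick from the
# rocm.sh profile snippet above.
append_path() {
  current="$1"
  dir="$2"
  echo "${current:+$current:}$dir"
}
```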
# ══════════════════════════════════════════════════════════════════════════════
# NVIDIA GPU Setup
# ══════════════════════════════════════════════════════════════════════════════
@@ -7754,7 +7904,12 @@ function setup_yq() {
msg_info "Setup yq $LATEST_VERSION"
fi
local yq_arch
case "$(dpkg --print-architecture 2>/dev/null || echo amd64)" in
arm64) yq_arch="arm64" ;;
*) yq_arch="amd64" ;;
esac
if ! curl_with_retry "https://github.com/${GITHUB_REPO}/releases/download/v${LATEST_VERSION}/yq_linux_${yq_arch}" "$TMP_DIR/yq"; then
msg_error "Failed to download yq"
rm -rf "$TMP_DIR"
return 1

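Both the FFmpeg and yq downloads now map `dpkg --print-architecture` onto the release-asset suffix with the same two-way case. Extracted as a sketch (the `map_binary_arch` name is illustrative):

```shell
#!/bin/sh
# Map a dpkg architecture string to the binary-asset suffix used by the
# FFmpeg/yq download URLs above: arm64 stays arm64, everything else
# falls back to amd64.
map_binary_arch() {
  case "$1" in
    arm64) echo arm64 ;;
    *) echo amd64 ;;
  esac
}
```

The amd64 fallback matches the pre-existing behavior: before this change the URLs were hardcoded to the amd64 builds for every architecture.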

@@ -89,17 +89,25 @@ VERSION=$(curl -fsSL https://api.github.com/repos/coder/code-server/releases/lat
awk '{print substr($2, 3, length($2)-4) }')
msg_info "Installing Code-Server v${VERSION}"
existing_config=false
if [ -f ~/.config/code-server/config.yaml ]; then
existing_config=true
fi
curl -fOL https://github.com/coder/code-server/releases/download/v"$VERSION"/code-server_"${VERSION}"_amd64.deb &>/dev/null
dpkg -i code-server_"${VERSION}"_amd64.deb &>/dev/null
rm -f code-server_"${VERSION}"_amd64.deb
mkdir -p ~/.config/code-server/
systemctl enable -q --now code-server@"$USER"
if [ "$existing_config" = false ]; then
cat <<EOF >~/.config/code-server/config.yaml
bind-addr: 0.0.0.0:8680
auth: none
password:
cert: false
EOF
fi
systemctl restart code-server@"$USER"
msg_ok "Installed Code-Server v${VERSION} on $hostname"