Compare commits

...

31 Commits

Author SHA1 Message Date
github-actions[bot]
5cd1786bf3 chore: update github-versions.json 2026-02-14 06:14:47 +00:00
community-scripts-pr-app[bot]
1f735cb31f Update CHANGELOG.md (#11893)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-14 00:22:13 +00:00
community-scripts-pr-app[bot]
adcbe8dae2 chore: update github-versions.json (#11892)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-14 00:21:47 +00:00
community-scripts-pr-app[bot]
bba520dbbf Update CHANGELOG.md (#11890)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 18:14:34 +00:00
community-scripts-pr-app[bot]
b43963d352 chore: update github-versions.json (#11889)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 18:14:07 +00:00
CanbiZ (MickLesk)
0957a23366 JSON-escape CPU and GPU model strings
Apply json_escape to GPU_MODEL and CPU_MODEL before assigning to gpu_model and cpu_model to ensure values are safe for inclusion in API JSON payloads. Updated in post_to_api, post_to_api_vm, and post_update_to_api; variable declarations were adjusted to call json_escape on the existing environment values (fallbacks unchanged). This prevents raw model strings from breaking the API payload.
2026-02-13 14:18:36 +01:00
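Note: a minimal sketch of the pattern this commit describes, assuming the json_escape helper added in the misc diff further down; hardware model strings can contain quotes or backslashes that would otherwise terminate a hand-built JSON string early:
cpu_model=$(json_escape "${CPU_MODEL:-}")   # escape before embedding in the payload
gpu_model=$(json_escape "${GPU_MODEL:-}")
payload="{\"cpu_model\": \"${cpu_model}\", \"gpu_model\": \"${gpu_model}\"}"   # now safe to POST as JSON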
community-scripts-pr-app[bot]
9f3588dd8d Update CHANGELOG.md (#11886)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 12:50:14 +00:00
CanbiZ (MickLesk)
f23414a1a8 Retry reporting with fallback payloads (#11885)
Enhance post_update_to_api to support a "force" mode and robust retry logic: add a 3rd-arg bypass to duplicate suppression, capture a short error summary, and perform up to three POST attempts (full payload, shortened error payload, minimal payload) with HTTP code checks and small backoffs. Mark POST_UPDATE_DONE on success (or after three attempts) to avoid infinite retries. Also invoke post_update_to_api with the "force" flag from cleanup paths in build.func and error_handler.func so a final status update is attempted after cleanup.
2026-02-13 13:49:49 +01:00
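Note: the retry ladder, compressed into a loop for illustration only (the real function posts three explicit payloads; FULL_PAYLOAD, SHORT_PAYLOAD and MINIMAL_PAYLOAD are placeholder names, see the misc diff further down):
for payload in "$FULL_PAYLOAD" "$SHORT_PAYLOAD" "$MINIMAL_PAYLOAD"; do
  http_code=$(curl -sS -w "%{http_code}" -m "$TELEMETRY_TIMEOUT" -X POST "$TELEMETRY_URL" \
    -H "Content-Type: application/json" -d "$payload" -o /dev/null 2>/dev/null) || http_code="000"
  [[ "$http_code" =~ ^2[0-9]{2}$ ]] && break   # accepted: stop retrying
  sleep 1                                      # small backoff before the next, smaller payload
done
POST_UPDATE_DONE=true                          # mark done after success or the final attempt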
community-scripts-pr-app[bot]
2a8bb76dcf chore: update github-versions.json (#11884)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 12:11:43 +00:00
CanbiZ (MickLesk)
bf85ef2a8b Merge branch 'main' of https://github.com/community-scripts/ProxmoxVE 2026-02-13 12:29:11 +01:00
CanbiZ (MickLesk)
cc89cdbab1 Copy install log to host before API report
Copy the container install log to a host path before reporting a failure to the telemetry API so get_error_text() can read it. Introduce host_install_log and point INSTALL_LOG to the host copy when pulled via pct, move post_update_to_api after the log copy, and update the displayed installation-log path.
2026-02-13 12:29:03 +01:00
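Note: the ordering this commit establishes, sketched under the assumption of the paths shown in the build.func diff further down:
host_install_log="/tmp/install-lxc-${CTID}-${SESSION_ID}.log"
if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "$host_install_log" 2>/dev/null; then
  INSTALL_LOG="$host_install_log"              # get_error_text() tails INSTALL_LOG, so it must point at a host path
fi
post_update_to_api "failed" "$install_exit_code"   # report only after the log copy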
community-scripts-pr-app[bot]
d6f3f03f8a Update CHANGELOG.md (#11883)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 11:04:20 +00:00
CanbiZ (MickLesk)
55e35d7f11 qf 2026-02-13 12:03:47 +01:00
community-scripts-pr-app[bot]
3b9f8d4a93 Update CHANGELOG.md (#11882)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:27:20 +00:00
community-scripts-pr-app[bot]
6c5377adec Update CHANGELOG.md (#11881)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:27:12 +00:00
Léon Zimmermann
eeb349346b Planka: add migrate step to update function (#11877)
Added database migration commands after restoring data.
2026-02-13 11:26:56 +01:00
community-scripts-pr-app[bot]
d271c16799 Update CHANGELOG.md (#11880)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:26:48 +00:00
CanbiZ (MickLesk)
4774c54861 Openwebui: pin numba constraint (#11874)
Update scripts to use Python 3.12 for uv tool setup and Open-WebUI installs/upgrades. Add a numba constraint (--constraint <(echo "numba>=0.60")) to uv tool install/upgrade commands to ensure compatibility. Changes applied to ct/openwebui.sh and install/openwebui-install.sh for both fresh installs and update paths.
2026-02-13 11:26:19 +01:00
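Note: the constraint does not install numba itself; it only bounds the version uv may resolve while installing open-webui[all]. The process substitution <(echo "numba>=0.60") stands in for a one-line constraints file; an equivalent with a real file (hypothetical path) would be:
printf 'numba>=0.60\n' >/tmp/openwebui-constraints.txt
uv tool install --python 3.12 --constraint /tmp/openwebui-constraints.txt open-webui[all]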
community-scripts-pr-app[bot]
4bf63bae35 Update CHANGELOG.md (#11879)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:17:12 +00:00
CanbiZ (MickLesk)
f2b7c9638d error-handler: Implement json_escape and enhance error handling (#11875)
Added json_escape function for safe JSON embedding and updated error handling to include user abort messages.
2026-02-13 11:16:40 +01:00
community-scripts-pr-app[bot]
551f89e46f Update CHANGELOG.md (#11878)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:02:31 +00:00
Tom Frenzel
4f571a1eb6 fix(donetick): add config entry for v0.1.73 (#11872) 2026-02-13 11:02:02 +01:00
CanbiZ (MickLesk)
3156e8e363 downgrade openwebui 2026-02-13 10:12:04 +01:00
CanbiZ (MickLesk)
60ebdc97a5 fix unifi gpg 2026-02-13 09:24:13 +01:00
community-scripts-pr-app[bot]
20ec369338 Update CHANGELOG.md (#11871)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 08:18:50 +00:00
Chris
4907a906c3 Refactor: Radicale (#11850)
* Refactor: Radicale

* Create explicit config at `/etc/radicale/config`

* grammar

---------

Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-02-13 09:18:29 +01:00
community-scripts-pr-app[bot]
27e3a4301e Update CHANGELOG.md (#11870)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 08:16:37 +00:00
CanbiZ (MickLesk)
43fb75f2b4 Switch sqlite-specific db scripts to generic (#11868)
Replace npm script calls to db:sqlite:generate and db:sqlite:push with db:generate and db:push in ct/pangolin.sh and install/pangolin-install.sh. This makes the build/install steps use the generic DB task names for consistency across update and install workflows.
2026-02-13 09:16:13 +01:00
community-scripts-pr-app[bot]
899d0e4baa Update CHANGELOG.md (#11869)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 08:11:20 +00:00
CanbiZ (MickLesk)
85584b105d SQLServer-2025: add PVE9/Kernel 6.x incompatibility warning (#11829)
* docs(sqlserver2025): add PVE9/Kernel 6.x incompatibility warning

* Update warning note for SQL Server SQLPAL compatibility

* Update frontend/public/json/sqlserver2025.json

Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>

---------

Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-02-13 09:10:50 +01:00
CanbiZ (MickLesk)
3fe6f50414 hotfix unifi wrong url 2026-02-13 08:50:41 +01:00
16 changed files with 355 additions and 99 deletions

View File

@@ -401,14 +401,42 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 </details>
+## 2026-02-14
 ## 2026-02-13
 ### 🚀 Updated Scripts
 - #### 🐞 Bug Fixes
+- OpenWebUI: pin numba constraint [@MickLesk](https://github.com/MickLesk) ([#11874](https://github.com/community-scripts/ProxmoxVE/pull/11874))
+- Planka: add migrate step to update function [@ZimmermannLeon](https://github.com/ZimmermannLeon) ([#11877](https://github.com/community-scripts/ProxmoxVE/pull/11877))
+- Pangolin: switch sqlite-specific back to generic [@MickLesk](https://github.com/MickLesk) ([#11868](https://github.com/community-scripts/ProxmoxVE/pull/11868))
 - [Hotfix] Jotty: Copy contents of config backup into /opt/jotty/config [@vhsdream](https://github.com/vhsdream) ([#11864](https://github.com/community-scripts/ProxmoxVE/pull/11864))
+- #### 🔧 Refactor
+- Refactor: Radicale [@vhsdream](https://github.com/vhsdream) ([#11850](https://github.com/community-scripts/ProxmoxVE/pull/11850))
+- chore(donetick): add config entry for v0.1.73 [@tomfrenzel](https://github.com/tomfrenzel) ([#11872](https://github.com/community-scripts/ProxmoxVE/pull/11872))
+### 💾 Core
+- #### 🔧 Refactor
+- core: retry reporting with fallback payloads [@MickLesk](https://github.com/MickLesk) ([#11885](https://github.com/community-scripts/ProxmoxVE/pull/11885))
+### 📡 API
+- #### ✨ New Features
+- error-handler: Implement json_escape and enhance error handling [@MickLesk](https://github.com/MickLesk) ([#11875](https://github.com/community-scripts/ProxmoxVE/pull/11875))
+### 🌐 Website
+- #### 📝 Script Information
+- SQLServer-2025: add PVE9/Kernel 6.x incompatibility warning [@MickLesk](https://github.com/MickLesk) ([#11829](https://github.com/community-scripts/ProxmoxVE/pull/11829))
 ## 2026-02-12
 ### 🚀 Updated Scripts

View File

@@ -42,7 +42,8 @@ function update_script() {
 msg_info "Restoring Configurations"
 mv /opt/selfhosted.yaml /opt/donetick/config
-sed -i '/capacitor:\/\/localhost/d' /opt/donetick/config/selfhosted.yaml
+grep -q 'http://localhost"$' /opt/donetick/config/selfhosted.yaml || sed -i '/https:\/\/localhost"$/a\ - "http://localhost"' /opt/donetick/config/selfhosted.yaml
+grep -q 'capacitor://localhost' /opt/donetick/config/selfhosted.yaml || sed -i '/http:\/\/localhost"$/a\ - "capacitor://localhost"' /opt/donetick/config/selfhosted.yaml
 mv /opt/donetick.db /opt/donetick
 msg_ok "Restored Configurations"

View File

@@ -44,7 +44,7 @@ function update_script() {
 msg_info "Installing uv-based Open-WebUI"
 PYTHON_VERSION="3.12" setup_uv
-$STD uv tool install --python 3.12 open-webui[all]
+$STD uv tool install --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
 msg_ok "Installed uv-based Open-WebUI"
 msg_info "Restoring data"
@@ -126,7 +126,7 @@ EOF
 msg_info "Updating Open WebUI via uv"
 PYTHON_VERSION="3.12" setup_uv
-$STD uv tool upgrade --python 3.12 open-webui[all]
+$STD uv tool install --force --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
 systemctl restart open-webui
 msg_ok "Updated Open WebUI"
 msg_ok "Updated successfully!"

View File

@@ -51,7 +51,7 @@ function update_script() {
 $STD npm run db:generate
 $STD npm run build
 $STD npm run build:cli
-$STD npm run db:sqlite:push
+$STD npm run db:push
 cp -R .next/standalone ./
 chmod +x ./dist/cli.mjs
 cp server/db/names.json ./dist/names.json

View File

@@ -61,6 +61,12 @@ function update_script() {
 rm -rf "$BK"
 msg_ok "Restored data"
+msg_ok "Migrate Database"
+cd /opt/planka
+$STD npm run db:upgrade
+$STD npm run db:migrate
+msg_ok "Migrated Database"
 msg_info "Starting Service"
 systemctl start planka
 msg_ok "Started Service"

View File

@@ -28,16 +28,55 @@ function update_script() {
 exit
 fi
-msg_info "Updating ${APP}"
-$STD python3 -m venv /opt/radicale
-source /opt/radicale/bin/activate
-$STD python3 -m pip install --upgrade https://github.com/Kozea/Radicale/archive/master.tar.gz
-msg_ok "Updated ${APP}"
-msg_info "Starting Service"
-systemctl enable -q --now radicale
-msg_ok "Started Service"
-msg_ok "Updated successfully!"
+if check_for_gh_release "Radicale" "Kozea/Radicale"; then
+msg_info "Stopping service"
+systemctl stop radicale
+msg_ok "Stopped service"
+msg_info "Backing up users file"
+cp /opt/radicale/users /opt/radicale_users_backup
+msg_ok "Backed up users file"
+PYTHON_VERSION="3.13" setup_uv
+CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Radicale" "Kozea/Radicale" "tarball" "latest" "/opt/radicale"
+msg_info "Restoring users file"
+rm -f /opt/radicale/users
+mv /opt/radicale_users_backup /opt/radicale/users
+msg_ok "Restored users file"
+if grep -q 'start.sh' /etc/systemd/system/radicale.service; then
+sed -i -e '/^Description/i[Unit]' \
+-e '\|^ExecStart|iWorkingDirectory=/opt/radicale' \
+-e 's|^ExecStart=.*|ExecStart=/usr/local/bin/uv run -m radicale --config /etc/radicale/config|' /etc/systemd/system/radicale.service
+systemctl daemon-reload
+fi
+if [[ ! -f /etc/radicale/config ]]; then
+msg_info "Migrating to config file (/etc/radicale/config)"
+mkdir -p /etc/radicale
+cat <<EOF >/etc/radicale/config
+[server]
+hosts = 0.0.0.0:5232
+[auth]
+type = htpasswd
+htpasswd_filename = /opt/radicale/users
+htpasswd_encryption = sha512
+[storage]
+type = multifilesystem
+filesystem_folder = /var/lib/radicale/collections
+[web]
+type = internal
+EOF
+msg_ok "Migrated to config (/etc/radicale/config)"
+fi
+msg_info "Starting service"
+systemctl start radicale
+msg_ok "Started service"
+msg_ok "Updated Successfully!"
+fi
 exit
 }

View File

@@ -1,5 +1,5 @@
 {
-"generated": "2026-02-13T06:23:00Z",
+"generated": "2026-02-14T06:14:47Z",
 "versions": [
 {
 "slug": "2fauth",
@@ -67,9 +67,9 @@
 {
 "slug": "autobrr",
 "repo": "autobrr/autobrr",
-"version": "v1.72.1",
+"version": "v1.73.0",
 "pinned": false,
-"date": "2026-01-30T12:57:58Z"
+"date": "2026-02-13T16:37:28Z"
 },
 {
 "slug": "autocaliweb",
@@ -193,16 +193,16 @@
 {
 "slug": "cleanuparr",
 "repo": "Cleanuparr/Cleanuparr",
-"version": "v2.6.0",
+"version": "v2.6.1",
 "pinned": false,
-"date": "2026-02-13T00:14:21Z"
+"date": "2026-02-13T10:00:19Z"
 },
 {
 "slug": "cloudreve",
 "repo": "cloudreve/cloudreve",
-"version": "4.13.0",
+"version": "4.14.0",
 "pinned": false,
-"date": "2026-02-05T12:53:24Z"
+"date": "2026-02-14T06:05:06Z"
 },
 {
 "slug": "comfyui",
@@ -445,9 +445,9 @@
 {
 "slug": "gotify",
 "repo": "gotify/server",
-"version": "v2.8.0",
+"version": "v2.9.0",
 "pinned": false,
-"date": "2026-01-02T11:56:16Z"
+"date": "2026-02-13T15:22:31Z"
 },
 {
 "slug": "grist",
@@ -508,9 +508,9 @@
 {
 "slug": "homarr",
 "repo": "homarr-labs/homarr",
-"version": "v1.53.0",
+"version": "v1.53.1",
 "pinned": false,
-"date": "2026-02-06T19:42:58Z"
+"date": "2026-02-13T19:47:11Z"
 },
 {
 "slug": "homebox",
@@ -578,9 +578,9 @@
 {
 "slug": "jackett",
 "repo": "Jackett/Jackett",
-"version": "v0.24.1103",
+"version": "v0.24.1109",
 "pinned": false,
-"date": "2026-02-13T05:53:23Z"
+"date": "2026-02-14T05:54:26Z"
 },
 {
 "slug": "jellystat",
@@ -823,16 +823,16 @@
 {
 "slug": "metube",
 "repo": "alexta69/metube",
-"version": "2026.02.12",
+"version": "2026.02.13",
 "pinned": false,
-"date": "2026-02-12T21:05:49Z"
+"date": "2026-02-13T15:18:17Z"
 },
 {
 "slug": "miniflux",
 "repo": "miniflux/v2",
-"version": "2.2.16",
+"version": "2.2.17",
 "pinned": false,
-"date": "2026-01-07T03:26:27Z"
+"date": "2026-02-13T20:30:17Z"
 },
 {
 "slug": "monica",
@@ -1000,7 +1000,7 @@
 "repo": "fosrl/pangolin",
 "version": "1.15.4",
 "pinned": false,
-"date": "2026-02-13T00:54:02Z"
+"date": "2026-02-13T23:01:29Z"
 },
 {
 "slug": "paperless-ai",
@@ -1026,9 +1026,9 @@
 {
 "slug": "patchmon",
 "repo": "PatchMon/PatchMon",
-"version": "v1.3.7",
+"version": "v1.4.0",
 "pinned": false,
-"date": "2025-12-25T11:08:14Z"
+"date": "2026-02-13T10:39:03Z"
 },
 {
 "slug": "paymenter",
@@ -1096,9 +1096,9 @@
 {
 "slug": "pocketbase",
 "repo": "pocketbase/pocketbase",
-"version": "v0.36.2",
+"version": "v0.36.3",
 "pinned": false,
-"date": "2026-02-01T08:12:42Z"
+"date": "2026-02-13T18:38:58Z"
 },
 {
 "slug": "pocketid",
@@ -1219,6 +1219,13 @@
 "pinned": false,
 "date": "2025-11-16T22:39:01Z"
 },
+{
+"slug": "radicale",
+"repo": "Kozea/Radicale",
+"version": "v3.6.0",
+"pinned": false,
+"date": "2026-01-10T06:56:46Z"
+},
 {
 "slug": "rclone",
 "repo": "rclone/rclone",
@@ -1306,9 +1313,9 @@
 {
 "slug": "semaphore",
 "repo": "semaphoreui/semaphore",
-"version": "v2.16.51",
+"version": "v2.17.0",
 "pinned": false,
-"date": "2026-01-12T16:26:38Z"
+"date": "2026-02-13T21:08:30Z"
 },
 {
 "slug": "shelfmark",
@@ -1404,9 +1411,9 @@
 {
 "slug": "tandoor",
 "repo": "TandoorRecipes/recipes",
-"version": "2.5.0",
+"version": "2.5.1",
 "pinned": false,
-"date": "2026-02-08T13:23:02Z"
+"date": "2026-02-13T15:57:27Z"
 },
 {
 "slug": "tasmoadmin",
@@ -1460,9 +1467,9 @@
 {
 "slug": "tianji",
 "repo": "msgbyte/tianji",
-"version": "v1.31.12",
+"version": "v1.31.13",
 "pinned": false,
-"date": "2026-02-12T19:06:14Z"
+"date": "2026-02-13T16:30:09Z"
 },
 {
 "slug": "traccar",
@@ -1509,9 +1516,9 @@
 {
 "slug": "tududi",
 "repo": "chrisvel/tududi",
-"version": "v0.88.4",
+"version": "v0.88.5",
 "pinned": false,
-"date": "2026-01-20T15:11:58Z"
+"date": "2026-02-13T13:54:14Z"
 },
 {
 "slug": "tunarr",
@@ -1558,9 +1565,9 @@
 {
 "slug": "uptimekuma",
 "repo": "louislam/uptime-kuma",
-"version": "2.1.0",
+"version": "2.1.1",
 "pinned": false,
-"date": "2026-02-07T02:31:49Z"
+"date": "2026-02-13T16:07:33Z"
 },
 {
 "slug": "vaultwarden",

View File

@@ -12,7 +12,7 @@
 "documentation": "https://radicale.org/master.html#documentation-1",
 "website": "https://radicale.org/",
 "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/radicale.webp",
-"config_path": "/etc/radicale/config or ~/.config/radicale/config",
+"config_path": "/etc/radicale/config",
 "description": "Radicale is a small but powerful CalDAV (calendars, to-do lists) and CardDAV (contacts)",
 "install_methods": [
 {

View File

@@ -32,6 +32,10 @@
 "password": null
 },
 "notes": [
+{
+"text": "SQL Server (2025) SQLPAL is incompatible with Proxmox VE 9 (Kernel 6.12+) in LXC containers. Use a VM instead or the SQL-Server 2022 LXC.",
+"type": "warning"
+},
 {
 "text": "If you choose not to run the installation setup, execute: `/opt/mssql/bin/mssql-conf setup` in LXC shell.",
 "type": "info"

View File

@@ -24,7 +24,7 @@ setup_hwaccel
 PYTHON_VERSION="3.12" setup_uv
 msg_info "Installing Open WebUI"
-$STD uv tool install --python 3.12 open-webui[all]
+$STD uv tool install --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
 msg_ok "Installed Open WebUI"
 read -r -p "${TAB3}Would you like to add Ollama? <y/N> " prompt

View File

@@ -36,7 +36,7 @@ $STD npm ci
 $STD npm run set:sqlite
 $STD npm run set:oss
 rm -rf server/private
-$STD npm run db:sqlite:generate
+$STD npm run db:generate
 $STD npm run build
 $STD npm run build:cli
 cp -R .next/standalone ./
@@ -178,7 +178,7 @@ http:
 servers:
 - url: "http://$LOCAL_IP:3000"
 EOF
-$STD npm run db:sqlite:push
+$STD npm run db:push
 . /etc/os-release
 if [ "$VERSION_CODENAME" = "trixie" ]; then

View File

@@ -14,42 +14,51 @@ network_check
 update_os
 msg_info "Installing Dependencies"
-$STD apt install -y \
-apache2-utils \
-python3-pip \
-python3-venv
+$STD apt install -y apache2-utils
 msg_ok "Installed Dependencies"
+PYTHON_VERSION="3.13" setup_uv
+fetch_and_deploy_gh_release "Radicale" "Kozea/Radicale" "tarball" "latest" "/opt/radicale"
 msg_info "Setting up Radicale"
-python3 -m venv /opt/radicale
-source /opt/radicale/bin/activate
-$STD python3 -m pip install --upgrade https://github.com/Kozea/Radicale/archive/master.tar.gz
+cd /opt/radicale
 RNDPASS=$(openssl rand -base64 18 | tr -dc 'a-zA-Z0-9' | head -c13)
-$STD htpasswd -c -b -5 /opt/radicale/users admin $RNDPASS
+$STD htpasswd -c -b -5 /opt/radicale/users admin "$RNDPASS"
 {
 echo "Radicale Credentials"
 echo "Admin User: admin"
 echo "Admin Password: $RNDPASS"
 } >>~/radicale.creds
-msg_ok "Done setting up Radicale"
-msg_info "Setup Service"
-cat <<EOF >/opt/radicale/start.sh
-#!/usr/bin/env bash
-source /opt/radicale/bin/activate
-python3 -m radicale --storage-filesystem-folder=/var/lib/radicale/collections --hosts 0.0.0.0:5232 --auth-type htpasswd --auth-htpasswd-filename /opt/radicale/users --auth-htpasswd-encryption sha512
+mkdir -p /etc/radicale
+cat <<EOF >/etc/radicale/config
+[server]
+hosts = 0.0.0.0:5232
+[auth]
+type = htpasswd
+htpasswd_filename = /opt/radicale/users
+htpasswd_encryption = sha512
+[storage]
+type = multifilesystem
+filesystem_folder = /var/lib/radicale/collections
+[web]
+type = internal
 EOF
-chmod +x /opt/radicale/start.sh
+msg_ok "Set up Radicale"
+msg_info "Creating Service"
 cat <<EOF >/etc/systemd/system/radicale.service
+[Unit]
 Description=A simple CalDAV (calendar) and CardDAV (contact) server
 After=network.target
 Requires=network.target
 [Service]
-ExecStart=/opt/radicale/start.sh
+WorkingDirectory=/opt/radicale
+ExecStart=/usr/local/bin/uv run -m radicale --config /etc/radicale/config
 Restart=on-failure
 # User=radicale
 # Deny other users access to the calendar data

View File

@@ -15,16 +15,18 @@ update_os
 msg_info "Installing Dependencies"
 $STD apt install -y apt-transport-https
-curl -fsSL "https://dl.ui.com/unifi/unifi-repo.gpg" -o "/usr/share/keyrings/unifi-repo.gpg"
-cat <<EOF | sudo tee /etc/apt/sources.list.d/100-ubnt-unifi.sources >/dev/null
-Types: deb
-URIs: https://www.ui.com/downloads/unifi/debian
-Suites: stable
-Components: ubiquiti
-Architectures: amd64
-Signed-By: /usr/share/keyrings/unifi-repo.gpg
-EOF
-$STD apt update
 msg_ok "Installed Dependencies"
+setup_deb822_repo \
+"unifi" \
+"https://dl.ui.com/unifi/unifi-repo.gpg" \
+"https://www.ui.com/downloads/unifi/debian" \
+"stable" \
+"ubiquiti" \
+"amd64"
 JAVA_VERSION="21" setup_java
 if lscpu | grep -q 'avx'; then

View File

@@ -153,7 +153,7 @@ explain_exit_code() {
 126) echo "Command invoked cannot execute (permission problem?)" ;;
 127) echo "Command not found" ;;
 128) echo "Invalid argument to exit" ;;
-130) echo "Terminated by Ctrl+C (SIGINT)" ;;
+130) echo "Aborted by user (SIGINT)" ;;
 134) echo "Process aborted (SIGABRT - possibly Node.js heap overflow)" ;;
 137) echo "Killed (SIGKILL / Out of memory?)" ;;
 139) echo "Segmentation fault (core dumped)" ;;
@@ -233,6 +233,43 @@
 esac
 }
+# ------------------------------------------------------------------------------
+# json_escape()
+#
+# - Escapes a string for safe JSON embedding
+# - Handles backslashes, quotes, newlines, tabs, and carriage returns
+# ------------------------------------------------------------------------------
+json_escape() {
+local s="$1"
+s=${s//\\/\\\\}
+s=${s//\"/\\\"}
+s=${s//$'\n'/\\n}
+s=${s//$'\r'/}
+s=${s//$'\t'/\\t}
+echo "$s"
+}
+# ------------------------------------------------------------------------------
+# get_error_text()
+#
+# - Returns last 20 lines of the active log (INSTALL_LOG or BUILD_LOG)
+# - Falls back to empty string if no log is available
+# ------------------------------------------------------------------------------
+get_error_text() {
+local logfile=""
+if declare -f get_active_logfile >/dev/null 2>&1; then
+logfile=$(get_active_logfile)
+elif [[ -n "${INSTALL_LOG:-}" ]]; then
+logfile="$INSTALL_LOG"
+elif [[ -n "${BUILD_LOG:-}" ]]; then
+logfile="$BUILD_LOG"
+fi
+if [[ -n "$logfile" && -s "$logfile" ]]; then
+tail -n 20 "$logfile" 2>/dev/null | sed 's/\r$//'
+fi
+}
 # ==============================================================================
 # SECTION 2: TELEMETRY FUNCTIONS
 # ==============================================================================
@@ -385,7 +422,8 @@ post_to_api() {
 detect_gpu
 fi
 local gpu_vendor="${GPU_VENDOR:-unknown}"
-local gpu_model="${GPU_MODEL:-}"
+local gpu_model
+gpu_model=$(json_escape "${GPU_MODEL:-}")
 local gpu_passthrough="${GPU_PASSTHROUGH:-unknown}"
 # Detect CPU if not already set
@@ -393,7 +431,8 @@ post_to_api() {
 detect_cpu
 fi
 local cpu_vendor="${CPU_VENDOR:-unknown}"
-local cpu_model="${CPU_MODEL:-}"
+local cpu_model
+cpu_model=$(json_escape "${CPU_MODEL:-}")
 # Detect RAM if not already set
 if [[ -z "${RAM_SPEED:-}" ]]; then
@@ -484,7 +523,8 @@ post_to_api_vm() {
 detect_gpu
 fi
 local gpu_vendor="${GPU_VENDOR:-unknown}"
-local gpu_model="${GPU_MODEL:-}"
+local gpu_model
+gpu_model=$(json_escape "${GPU_MODEL:-}")
 local gpu_passthrough="${GPU_PASSTHROUGH:-unknown}"
 # Detect CPU if not already set
@@ -492,7 +532,8 @@ post_to_api_vm() {
 detect_cpu
 fi
 local cpu_vendor="${CPU_VENDOR:-unknown}"
-local cpu_model="${CPU_MODEL:-}"
+local cpu_model
+cpu_model=$(json_escape "${CPU_MODEL:-}")
 # Detect RAM if not already set
 if [[ -z "${RAM_SPEED:-}" ]]; then
@@ -555,9 +596,12 @@ post_update_to_api() {
 # Silent fail - telemetry should never break scripts
 command -v curl &>/dev/null || return 0
-# Prevent duplicate submissions
+# Support "force" mode (3rd arg) to bypass duplicate check for retries after cleanup
+local force="${3:-}"
 POST_UPDATE_DONE=${POST_UPDATE_DONE:-false}
-[[ "$POST_UPDATE_DONE" == "true" ]] && return 0
+if [[ "$POST_UPDATE_DONE" == "true" && "$force" != "force" ]]; then
+return 0
+fi
 [[ "${DIAGNOSTICS:-no}" == "no" ]] && return 0
 [[ -z "${RANDOM_UUID:-}" ]] && return 0
@@ -568,12 +612,14 @@ post_update_to_api() {
 # Get GPU info (if detected)
 local gpu_vendor="${GPU_VENDOR:-unknown}"
-local gpu_model="${GPU_MODEL:-}"
+local gpu_model
+gpu_model=$(json_escape "${GPU_MODEL:-}")
 local gpu_passthrough="${GPU_PASSTHROUGH:-unknown}"
 # Get CPU info (if detected)
 local cpu_vendor="${CPU_VENDOR:-unknown}"
-local cpu_model="${CPU_MODEL:-}"
+local cpu_model
+cpu_model=$(json_escape "${CPU_MODEL:-}")
 # Get RAM info (if detected)
 local ram_speed="${RAM_SPEED:-}"
@@ -595,13 +641,21 @@ post_update_to_api() {
 esac
 # For failed/unknown status, resolve exit code and error description
+local short_error=""
 if [[ "$pb_status" == "failed" ]] || [[ "$pb_status" == "unknown" ]]; then
 if [[ "$raw_exit_code" =~ ^[0-9]+$ ]]; then
 exit_code="$raw_exit_code"
 else
 exit_code=1
 fi
-error=$(explain_exit_code "$exit_code")
+local error_text=""
+error_text=$(get_error_text)
+if [[ -n "$error_text" ]]; then
+error=$(json_escape "$error_text")
+else
+error=$(json_escape "$(explain_exit_code "$exit_code")")
+fi
+short_error=$(json_escape "$(explain_exit_code "$exit_code")")
 error_category=$(categorize_error "$exit_code")
 [[ -z "$error" ]] && error="Unknown error"
 fi
@@ -618,8 +672,9 @@ post_update_to_api() {
 pve_version=$(pveversion 2>/dev/null | awk -F'[/ ]' '{print $2}') || true
 fi
-# Full payload including all fields - allows record creation if initial call failed
-# The Go service will find the record by random_id and PATCH, or create if not found
+local http_code=""
+# ── Attempt 1: Full payload with complete error text ──
 local JSON_PAYLOAD
 JSON_PAYLOAD=$(
 cat <<EOF
@@ -651,11 +706,80 @@ post_update_to_api() {
 EOF
 )
-# Fire-and-forget: never block, never fail
+http_code=$(curl -sS -w "%{http_code}" -m "${TELEMETRY_TIMEOUT}" -X POST "${TELEMETRY_URL}" \
+-H "Content-Type: application/json" \
+-d "$JSON_PAYLOAD" -o /dev/null 2>/dev/null) || http_code="000"
+if [[ "$http_code" =~ ^2[0-9]{2}$ ]]; then
+POST_UPDATE_DONE=true
+return 0
+fi
+# ── Attempt 2: Short error text (no full log) ──
+sleep 1
+local RETRY_PAYLOAD
+RETRY_PAYLOAD=$(
+cat <<EOF
+{
+"random_id": "${RANDOM_UUID}",
+"type": "${TELEMETRY_TYPE:-lxc}",
+"nsapp": "${NSAPP:-unknown}",
+"status": "${pb_status}",
+"ct_type": ${CT_TYPE:-1},
+"disk_size": ${DISK_SIZE:-0},
+"core_count": ${CORE_COUNT:-0},
+"ram_size": ${RAM_SIZE:-0},
+"os_type": "${var_os:-}",
+"os_version": "${var_version:-}",
+"pve_version": "${pve_version}",
+"method": "${METHOD:-default}",
+"exit_code": ${exit_code},
+"error": "${short_error}",
+"error_category": "${error_category}",
+"install_duration": ${duration},
+"cpu_vendor": "${cpu_vendor}",
+"cpu_model": "${cpu_model}",
+"gpu_vendor": "${gpu_vendor}",
+"gpu_model": "${gpu_model}",
+"gpu_passthrough": "${gpu_passthrough}",
+"ram_speed": "${ram_speed}",
+"repo_source": "${REPO_SOURCE}"
+}
+EOF
+)
+http_code=$(curl -sS -w "%{http_code}" -m "${TELEMETRY_TIMEOUT}" -X POST "${TELEMETRY_URL}" \
+-H "Content-Type: application/json" \
+-d "$RETRY_PAYLOAD" -o /dev/null 2>/dev/null) || http_code="000"
+if [[ "$http_code" =~ ^2[0-9]{2}$ ]]; then
+POST_UPDATE_DONE=true
+return 0
+fi
+# ── Attempt 3: Minimal payload (bare minimum to set status) ──
+sleep 2
+local MINIMAL_PAYLOAD
+MINIMAL_PAYLOAD=$(
+cat <<EOF
+{
+"random_id": "${RANDOM_UUID}",
+"type": "${TELEMETRY_TYPE:-lxc}",
+"nsapp": "${NSAPP:-unknown}",
+"status": "${pb_status}",
+"exit_code": ${exit_code},
+"error": "${short_error}",
+"error_category": "${error_category}",
+"install_duration": ${duration}
+}
+EOF
+)
 curl -sS -w "%{http_code}" -m "${TELEMETRY_TIMEOUT}" -X POST "${TELEMETRY_URL}" \
 -H "Content-Type: application/json" \
--d "$JSON_PAYLOAD" -o /dev/null 2>&1 || true
+-d "$MINIMAL_PAYLOAD" -o /dev/null 2>/dev/null || true
+# Tried 3 times - mark as done regardless to prevent infinite loops
 POST_UPDATE_DONE=true
 }
@@ -691,6 +815,9 @@ categorize_error() {
 # Configuration errors
 203 | 204 | 205 | 206 | 207 | 208) echo "config" ;;
+# Aborted by user
+130) echo "aborted" ;;
 # Resource errors (OOM, etc)
 137 | 134) echo "resource" ;;
@@ -755,7 +882,13 @@ post_tool_to_api() {
 if [[ "$status" == "failed" ]]; then
 [[ ! "$exit_code" =~ ^[0-9]+$ ]] && exit_code=1
-error=$(explain_exit_code "$exit_code")
+local error_text=""
+error_text=$(get_error_text)
+if [[ -n "$error_text" ]]; then
+error=$(json_escape "$error_text")
+else
+error=$(json_escape "$(explain_exit_code "$exit_code")")
+fi
 error_category=$(categorize_error "$exit_code")
 fi
@@ -816,7 +949,13 @@ post_addon_to_api() {
 if [[ "$status" == "failed" ]]; then
 [[ ! "$exit_code" =~ ^[0-9]+$ ]] && exit_code=1
-error=$(explain_exit_code "$exit_code")
+local error_text=""
+error_text=$(get_error_text)
+if [[ -n "$error_text" ]]; then
+error=$(json_escape "$error_text")
+else
+error=$(json_escape "$(explain_exit_code "$exit_code")")
+fi
 error_category=$(categorize_error "$exit_code")
 fi
@@ -909,7 +1048,13 @@ post_update_to_api_extended() {
 else
 exit_code=1
 fi
-error=$(explain_exit_code "$exit_code")
+local error_text=""
+error_text=$(get_error_text)
+if [[ -n "$error_text" ]]; then
+error=$(json_escape "$error_text")
+else
+error=$(json_escape "$(explain_exit_code "$exit_code")")
+fi
 error_category=$(categorize_error "$exit_code")
 [[ -z "$error" ]] && error="Unknown error"
 fi

View File

@@ -4046,12 +4046,10 @@ EOF'
 if [[ $install_exit_code -ne 0 ]]; then
 msg_error "Installation failed in container ${CTID} (exit code: ${install_exit_code})"
-# Report failure to telemetry API
-post_update_to_api "failed" "$install_exit_code"
-# Copy both logs from container before potential deletion
+# Copy install log from container BEFORE API call so get_error_text() can read it
 local build_log_copied=false
 local install_log_copied=false
+local host_install_log="/tmp/install-lxc-${CTID}-${SESSION_ID}.log"
 if [[ -n "$CTID" && -n "${SESSION_ID:-}" ]]; then
 # Copy BUILD_LOG (creation log) if it exists
@@ -4059,15 +4057,22 @@ EOF'
 cp "${BUILD_LOG}" "/tmp/create-lxc-${CTID}-${SESSION_ID}.log" 2>/dev/null && build_log_copied=true
 fi
-# Copy INSTALL_LOG from container
-if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "/tmp/install-lxc-${CTID}-${SESSION_ID}.log" 2>/dev/null; then
+# Copy INSTALL_LOG from container to host
+if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "$host_install_log" 2>/dev/null; then
 install_log_copied=true
+# Point INSTALL_LOG to host copy so get_error_text() finds it
+INSTALL_LOG="$host_install_log"
 fi
 fi
+fi
-# Show available logs
+# Report failure to telemetry API (now with log available on host)
+post_update_to_api "failed" "$install_exit_code"
+# Show available logs
+if [[ -n "$CTID" && -n "${SESSION_ID:-}" ]]; then
 echo ""
 [[ "$build_log_copied" == true ]] && echo -e "${GN}${CL} Container creation log: ${BL}/tmp/create-lxc-${CTID}-${SESSION_ID}.log${CL}"
-[[ "$install_log_copied" == true ]] && echo -e "${GN}${CL} Installation log: ${BL}/tmp/install-lxc-${CTID}-${SESSION_ID}.log${CL}"
+[[ "$install_log_copied" == true ]] && echo -e "${GN}${CL} Installation log: ${BL}${host_install_log}${CL}"
 fi
 # Dev mode: Keep container or open breakpoint shell
@@ -4125,6 +4130,10 @@ EOF'
 echo -e "${BFR}${CM}${GN}Container ${CTID} removed${CL}"
 fi
+# Force one final status update attempt after cleanup
+# This ensures status is updated even if the first attempt failed (e.g., HTTP 400)
+post_update_to_api "failed" "$install_exit_code" "force"
 exit $install_exit_code
 fi
 }

View File

@@ -222,6 +222,12 @@ error_handler() {
 pct destroy "$CTID" &>/dev/null || true
 echo -e "${GN}${CL} Container ${CTID} removed"
 fi
+# Force one final status update attempt after cleanup
+# This ensures status is updated even if the first attempt failed (e.g., HTTP 400)
+if declare -f post_update_to_api &>/dev/null; then
+post_update_to_api "failed" "$exit_code" "force"
+fi
 fi
 fi
 fi