Compare commits


124 Commits

Author SHA1 Message Date
github-actions[bot]
acab1a0424 chore: update github-versions.json 2026-02-13 12:11:36 +00:00
CanbiZ (MickLesk)
bf85ef2a8b Merge branch 'main' of https://github.com/community-scripts/ProxmoxVE 2026-02-13 12:29:11 +01:00
CanbiZ (MickLesk)
cc89cdbab1 Copy install log to host before API report
Copy the container install log to a host path before reporting a failure to the telemetry API so get_error_text() can read it. Introduce host_install_log and point INSTALL_LOG to the host copy when pulled via pct, move post_update_to_api after the log copy, and update the displayed installation-log path.
2026-02-13 12:29:03 +01:00
community-scripts-pr-app[bot]
d6f3f03f8a Update CHANGELOG.md (#11883)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 11:04:20 +00:00
CanbiZ (MickLesk)
55e35d7f11 qf 2026-02-13 12:03:47 +01:00
community-scripts-pr-app[bot]
3b9f8d4a93 Update CHANGELOG.md (#11882)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:27:20 +00:00
community-scripts-pr-app[bot]
6c5377adec Update CHANGELOG.md (#11881)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:27:12 +00:00
Léon Zimmermann
eeb349346b Planka: add migrate step to update function (#11877)
Added database migration commands after restoring data.
2026-02-13 11:26:56 +01:00
community-scripts-pr-app[bot]
d271c16799 Update CHANGELOG.md (#11880)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:26:48 +00:00
CanbiZ (MickLesk)
4774c54861 Openwebui: pin numba constraint (#11874)
Update scripts to use Python 3.12 for uv tool setup and Open-WebUI installs/upgrades. Add a numba constraint (--constraint <(echo "numba>=0.60")) to uv tool install/upgrade commands to ensure compatibility. Changes applied to ct/openwebui.sh and install/openwebui-install.sh for both fresh installs and update paths.
2026-02-13 11:26:19 +01:00
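The constraint in the message above is passed via bash process substitution, which hands uv a throwaway one-line constraints "file". A minimal illustration of the mechanism only (uv itself is not invoked here):

```shell
# <(cmd) expands to a path (e.g. /dev/fd/63) whose contents are cmd's output,
# so `uv tool install --constraint <(echo "numba>=0.60") ...` reads a
# constraints file that never touches disk.
constraint_line=$(cat <(echo "numba>=0.60"))
echo "$constraint_line"
```

Because the file descriptor only lives for the duration of the command, there is nothing to clean up afterwards.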
community-scripts-pr-app[bot]
4bf63bae35 Update CHANGELOG.md (#11879)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:17:12 +00:00
CanbiZ (MickLesk)
f2b7c9638d error-handler: Implement json_escape and enhance error handling (#11875)
Added json_escape function for safe JSON embedding and updated error handling to include user abort messages.
2026-02-13 11:16:40 +01:00
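The commit does not show the implementation; a minimal pure-bash sketch of such a json_escape (hypothetical, not the repo's actual error_handler.func code):

```shell
# Escape a string for safe embedding inside a JSON double-quoted value.
# Hypothetical sketch; the real json_escape may cover more control chars.
json_escape() {
  local s=$1
  s=${s//\\/\\\\}      # backslash first, so later escapes aren't doubled
  s=${s//\"/\\\"}      # double quotes
  s=${s//$'\n'/\\n}    # newlines
  s=${s//$'\r'/\\r}    # carriage returns
  s=${s//$'\t'/\\t}    # tabs
  printf '%s' "$s"
}
```

Ordering matters: escaping backslashes last would corrupt the escapes produced for quotes and newlines.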
community-scripts-pr-app[bot]
551f89e46f Update CHANGELOG.md (#11878)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 10:02:31 +00:00
Tom Frenzel
4f571a1eb6 fix(donetick): add config entry for v0.1.73 (#11872) 2026-02-13 11:02:02 +01:00
CanbiZ (MickLesk)
3156e8e363 downgrade openwebui 2026-02-13 10:12:04 +01:00
CanbiZ (MickLesk)
60ebdc97a5 fix unifi gpg 2026-02-13 09:24:13 +01:00
community-scripts-pr-app[bot]
20ec369338 Update CHANGELOG.md (#11871)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 08:18:50 +00:00
Chris
4907a906c3 Refactor: Radicale (#11850)
* Refactor: Radicale

* Create explicit config at `/etc/radicale/config`

* grammar

---------

Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-02-13 09:18:29 +01:00
community-scripts-pr-app[bot]
27e3a4301e Update CHANGELOG.md (#11870)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 08:16:37 +00:00
CanbiZ (MickLesk)
43fb75f2b4 Switch sqlite-specific db scripts to generic (#11868)
Replace npm script calls to db:sqlite:generate and db:sqlite:push with db:generate and db:push in ct/pangolin.sh and install/pangolin-install.sh. This makes the build/install steps use the generic DB task names for consistency across update and install workflows.
2026-02-13 09:16:13 +01:00
community-scripts-pr-app[bot]
899d0e4baa Update CHANGELOG.md (#11869)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 08:11:20 +00:00
CanbiZ (MickLesk)
85584b105d SQLServer-2025: add PVE9/Kernel 6.x incompatibility warning (#11829)
* docs(sqlserver2025): add PVE9/Kernel 6.x incompatibility warning

* Update warning note for SQL Server SQLPAL compatibility

* Update frontend/public/json/sqlserver2025.json

Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>

---------

Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-02-13 09:10:50 +01:00
CanbiZ (MickLesk)
3fe6f50414 hotfix unifi wrong url 2026-02-13 08:50:41 +01:00
community-scripts-pr-app[bot]
724a066aed chore: update github-versions.json (#11867)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 06:23:08 +00:00
community-scripts-pr-app[bot]
cd6e8ecbbe Update CHANGELOG.md (#11866)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 06:17:38 +00:00
Chris
8083c0c0e1 [Hotfix] Jotty: Copy contents of config backup into /opt/jotty/config (#11864) 2026-02-13 07:17:08 +01:00
community-scripts-pr-app[bot]
29836f35ed Update CHANGELOG.md (#11863)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 00:24:04 +00:00
community-scripts-pr-app[bot]
17d3d4297c chore: update github-versions.json (#11862)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-13 00:23:36 +00:00
MickLesk
2b921736e6 fix(tools.func): fix GPG key format detection in setup_deb822_repo
The previous logic using 'file | grep PGP' was inverted — both
ASCII-armored and binary GPG keys matched the pattern, causing
ASCII-armored keys to be copied directly instead of being
dearmored. This resulted in APT failing with NO_PUBKEY errors
on Debian 12 (bookworm).

Use 'grep BEGIN PGP' to reliably detect ASCII-armored keys and
dearmor them, otherwise copy binary keys directly.
2026-02-12 22:30:34 +01:00
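Taken together, this fix and the permissions fix below amount to: detect armoring with grep, dearmor only armored keys, then relax permissions. A hedged sketch of that logic (function names illustrative, not the exact tools.func code):

```shell
# True if the key file is ASCII-armored (text with a BEGIN PGP header).
key_is_armored() { grep -q "BEGIN PGP" "$1"; }

# Install a downloaded key as a binary keyring readable by APT/sqv.
install_gpg_key() {
  local src=$1 dst=$2
  if key_is_armored "$src"; then
    gpg --dearmor <"$src" >"$dst"   # armored: convert to binary keyring
  else
    cp "$src" "$dst"                # already binary: copy as-is
  fi
  chmod 644 "$dst"                  # dearmored keyrings can end up 0600,
                                    # unreadable to Debian 13's sqv verifier
}
```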
MickLesk
ddabe81dd8 fix(tools.func): set GPG keyring files to 644 permissions
gpg --dearmor creates files with restrictive permissions (600),
which prevents Debian 13's sqv signature verifier from reading
the keyring files. This causes apt update to fail with
'Permission denied' errors for all repositories using custom
GPG keys (adoptium, pgdg, pdm, etc.).

Set chmod 644 after creating .gpg files in both setup_deb822_repo()
and the MongoDB GPG key import in manage_tool_repository().
2026-02-12 22:24:24 +01:00
community-scripts-pr-app[bot]
19c5671d3f Update CHANGELOG.md (#11854)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 19:06:26 +00:00
CanbiZ (MickLesk)
2326520d17 Archlinux-VM: fix LVM/LVM-thin storage and improve error reporting | VMs add correct exit_code for analytics (#11842)
* fix(archlinux-vm): fix LVM/LVM-thin storage and improve error reporting

- Add catch-all (*) case for storage types (LVM, LVM-thin, zfspool)
  Previously only nfs/dir/cifs and btrfs were handled, leaving
  DISK_EXT, DISK_REF, and DISK_IMPORT unset on LVM/LVM-thin storage
- Fix error_handler to send numeric exit_code to API instead of
  bash command text (which caused 'Unknown error' in telemetry)
- Replace fragile pvesm alloc for EFI disk with Proxmox-managed
  :0,efitype=4m (consistent with docker-vm.sh)
- Modernize disk import: auto-detect qm disk import vs qm importdisk,
  parse output to get correct disk reference instead of guessing names
- Use --format flag (double dash) consistent with modern Proxmox API
- Remove unused FORMAT variable (EFI type now always set correctly)
- Remove fragile eval-based disk name construction

* fix(vm): fix LVM/LVM-thin storage and error reporting for all VM scripts

- Add catch-all (*) case to storage type detection in all VM scripts
  that were missing it (debian-vm, debian-13-vm, ubuntu2204/2404/2504,
  nextcloud-vm, owncloud-vm, opnsense-vm, pimox-haos-vm)
- Add catch-all to mikrotik-routeros (had zfspool but not lvm/lvmthin)
- Fix error_handler in ALL 14 VM scripts to send numeric exit_code
  to post_update_to_api instead of bash command text, which caused
  'Unknown error' in telemetry because the API expects a number
2026-02-12 20:06:02 +01:00
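The catch-all described above is a plain bash case fall-through; a sketch under assumed variable names (the real VM scripts set DISK_EXT/DISK_REF/DISK_IMPORT with script-specific values):

```shell
# Map the storage backend to disk naming/import parameters.
set_disk_vars() {
  local storage_type=$1
  case "$storage_type" in
    nfs | dir | cifs)
      DISK_EXT=".qcow2"; DISK_IMPORT="--format qcow2" ;;
    btrfs)
      DISK_EXT=".raw";   DISK_IMPORT="--format raw" ;;
    *)
      # LVM, LVM-thin, zfspool and any future backend: raw block volumes,
      # no file extension. Before the fix this branch was missing, leaving
      # the variables unset on LVM/LVM-thin storage.
      DISK_EXT="";       DISK_IMPORT="--format raw" ;;
  esac
}
```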
community-scripts-pr-app[bot]
7964d39e32 Update CHANGELOG.md (#11853)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 19:05:55 +00:00
CanbiZ (MickLesk)
f7cf7c8adc fix(telemetry): prevent stuck 'installing' status in error_handler.func (#11845)
The catch_errors() function in CT scripts overrides the API telemetry
traps set by build.func. This caused on_exit, on_interrupt, and
on_terminate to never call post_update_to_api, leaving telemetry
records permanently stuck on 'installing'.

Changes:
- on_exit: Report orphaned 'installing' records on ANY exit where
  post_to_api was called but post_update_to_api was not
- on_interrupt: Call post_update_to_api('failed', '130') before exit
- on_terminate: Call post_update_to_api('failed', '143') before exit

All calls are guarded by POST_UPDATE_DONE flag to prevent duplicates.
2026-02-12 20:05:27 +01:00
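The guard pattern described above can be sketched as follows; post_update_to_api is stubbed here (the real function POSTs status and exit code to the telemetry API), so whichever trap reports first wins and later calls become no-ops:

```shell
POST_UPDATE_DONE=false
REPORTED=""

post_update_to_api() {
  [ "$POST_UPDATE_DONE" = true ] && return 0   # idempotency guard
  POST_UPDATE_DONE=true
  REPORTED="$1:$2"                             # stub for the real HTTP call
}

on_interrupt() { post_update_to_api "failed" "130"; exit 130; }
on_terminate() { post_update_to_api "failed" "143"; exit 143; }
trap on_interrupt INT
trap on_terminate TERM
```

The same guard shape also prevents a normal-exit handler from double-reporting after an interrupt already did.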
community-scripts-pr-app[bot]
744191cb84 Update CHANGELOG.md (#11852)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 19:05:20 +00:00
CanbiZ (MickLesk)
291ed4c5ad fix(emqx): increase disk to 6GB and add optional MQ disable prompt (#11844)
EMQX 6.1+ preallocates significant disk space for the MQ feature,
causing high CPU/disk usage on small containers (emqx/emqx#16649).

- Increase default disk from 4GB to 6GB
- Add read -rp prompt during install to optionally disable MQ feature
  via mq.enable=false in emqx.conf (reduces disk/CPU overhead)
- Setting is in install script (not CT script) per reviewer feedback

Co-authored-by: sim-san <sim-san@users.noreply.github.com>
2026-02-12 20:04:47 +01:00
community-scripts-pr-app[bot]
f9612c5aba Update CHANGELOG.md (#11851)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 19:04:18 +00:00
CanbiZ (MickLesk)
403a839ac0 fix(tools): auto-detect binary vs armored GPG keys in setup_deb822_repo (#11841)
The UniFi GPG key at dl.ui.com/unifi/unifi-repo.gpg is already in binary
format. setup_deb822_repo unconditionally ran gpg --dearmor which expects
ASCII-armored input, corrupting binary keys and causing apt to fail with
'Unable to locate package unifi'.

setup_deb822_repo now downloads the key to a temp file first and uses
the file command to detect whether it is already a binary PGP/GPG key.
Binary keys are copied directly; armored keys are dearmored as before.

This also reverts unifi-install.sh back to using setup_deb822_repo for
consistency with all other install scripts.
2026-02-12 20:03:33 +01:00
community-scripts-pr-app[bot]
41c89413ef chore: update github-versions.json (#11848)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 18:21:07 +00:00
community-scripts-pr-app[bot]
fa11528a7b Update CHANGELOG.md (#11847)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 18:12:53 +00:00
CanbiZ (MickLesk)
2a03c86384 Tailscale: fix DNS check and keyrings directory issues (#11837)
* fix(tailscale-addon): fix DNS check and keyrings directory issues

- Source /etc/os-release instead of grep to handle quoted values properly
- Use VERSION_CODENAME variable instead of VER for correct URL
- Add fallback DNS resolution methods (host, nslookup, getent) when dig is missing
- Create /usr/share/keyrings directory if it doesn't exist
- Skip DNS check gracefully when no DNS tools are available

Fixes installation failures with 'dig: command not found' and
'No such file or directory' for keyrings path

* Update tools/addon/add-tailscale-lxc.sh

Co-authored-by: Chris <punk.sand7393@fastmail.com>

* Update tools/addon/add-tailscale-lxc.sh

Co-authored-by: Chris <punk.sand7393@fastmail.com>

---------

Co-authored-by: Chris <punk.sand7393@fastmail.com>
2026-02-12 19:12:23 +01:00
community-scripts-pr-app[bot]
57b4e10b93 Update CHANGELOG.md (#11846)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 17:40:58 +00:00
Stefan Tomas
4b22c7cc2d Increased the Grafana container default disk size. (#11840)
Co-authored-by: Stefan Tomas <stefan.tomas@proton.me>
2026-02-12 18:40:31 +01:00
community-scripts-pr-app[bot]
79fd0d1dda Update CHANGELOG.md (#11839)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 13:51:22 +00:00
community-scripts-pr-app[bot]
280778d53b Update CHANGELOG.md (#11838)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 13:51:13 +00:00
CanbiZ (MickLesk)
1c3a3107f1 Deluge: add python3-setuptools as dep (#11833)
* fix(deluge): add python3-setuptools for pkg_resources on Python 3.13

* Ensure dependencies before updating Deluge
2026-02-12 14:50:56 +01:00
CanbiZ (MickLesk)
ee2c3a20ee fix(dispatcharr): migrate from requirements.txt to uv sync (pyproject.toml) (#11831) 2026-02-12 14:50:42 +01:00
CanbiZ (MickLesk)
5ee4f4e34b fix(api): prevent duplicate post_to_api submissions (#11836)
Add POST_TO_API_DONE idempotency guard to post_to_api() to prevent
the same telemetry record from being submitted twice with the same
RANDOM_UUID. This mirrors the existing POST_UPDATE_DONE pattern in
post_update_to_api().

post_to_api() is called twice in build.func:
- After storage validation (inside CONTAINER_STORAGE check)
- After create_lxc_container() completes

When both execute, the second call fails with a random_id uniqueness
violation on PocketBase, generating server-side errors.
2026-02-12 13:45:32 +01:00
community-scripts-pr-app[bot]
4b0e893bf1 Update CHANGELOG.md (#11834)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 12:26:51 +00:00
CanbiZ (MickLesk)
3676157a7c Debian13-VM: Optimize First Boot & add noCloud/Cloud Selection (#11810)
* fix(debian-13-vm): disable systemd-firstboot and add autologin via virt-customize

Newer Debian 13 (Trixie) nocloud images (Jan 29+) have systemd-firstboot
enabled, which blocks the boot process by prompting for timezone and root
password on the serial console. The noVNC console stalls after auditd checks.

Fix by using virt-customize (like docker-vm.sh) to:
- Disable systemd-firstboot.service and mask it
- Pre-seed timezone to Etc/UTC
- Configure autologin on ttyS0 and tty1 for nocloud images
- Set hostname and clear machine-id for unique IDs
- Install libguestfs-tools if not present

Fixes #11807

* feat(debian-13-vm): add cloud-init selection dialog for default and advanced settings

Add select_cloud_init() function consistent with docker-vm.sh that prompts
the user to choose between Cloud-Init (genericcloud image) and nocloud
(with auto-login) in both default and advanced settings.

Previously, default settings hardcoded CLOUD_INIT=no without asking.
2026-02-12 13:26:20 +01:00
CanbiZ (MickLesk)
e437e50882 Merge branch 'main' of https://github.com/community-scripts/ProxmoxVE 2026-02-12 13:25:49 +01:00
CanbiZ (MickLesk)
6e9a94b46d quickfix: missing fields for db 2026-02-12 13:25:42 +01:00
community-scripts-pr-app[bot]
137ae6775e chore: update github-versions.json (#11832)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 12:15:25 +00:00
community-scripts-pr-app[bot]
dd8c998d43 Update CHANGELOG.md (#11827)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 11:48:41 +00:00
Slaviša Arežina
b215bac01d Update database generation command in install script (#11825) 2026-02-12 12:48:16 +01:00
community-scripts-pr-app[bot]
c3cd9df12f Update CHANGELOG.md (#11826)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 10:55:38 +00:00
CanbiZ (MickLesk)
406d53ea2f core: remove old Go API and extend misc/api.func with new backend (#11822)
* Remove Go API and extend misc/api.func

Delete the Go-based API (api/main.go, api/go.mod, api/go.sum, api/.env.example) and significantly enhance misc/api.func. The shell telemetry file now includes telemetry configuration, repo source detection, GPU/CPU/RAM detection, expanded explain_exit_code mappings, and refactored post_to_api/post_to_api_vm to send non-blocking telemetry to telemetry.community-scripts.org while respecting DIAGNOSTICS/DEV_MODE and adding richer metadata (cpu/gpu/ram/repo_source). Also updates header/author info and improves privacy/robustness and error handling.

* Start install timer and refine error reporting

Call start_install_timer during build startup and overhaul exit/error reporting.

Changes:
- Invoke start_install_timer early in misc/build.func to track install duration.
- Update api_exit_script comments to reference PocketBase/api.func and adjust ERR/SIGINT/SIGTERM traps to post numeric exit codes (use $? / 130 / 143) instead of command strings.
- Replace the previous explain_exit_code implementation with a conditional fallback: only define explain_exit_code if not already provided (api.func is the canonical source). Expanded and reorganized exit code mappings (curl, timeout, systemd, Node/Python/Postgres/MySQL/MongoDB, Proxmox, etc.).
- In error_handler: stop echoing the container log path (host shows combined log), and post a "failed" update to the API with the exit code before offering container cleanup.

Rationale: these changes make telemetry more consistent and robust (numeric codes), provide a safe fallback for exit descriptions when api.func isn't loaded, and ensure failures are reported to the API prior to any automatic cleanup.

* Report install start/failure to telemetry API

Add telemetry hooks in misc/build.func: call post_to_api at installation start to capture early or immediately-failing installs, and call post_update_to_api with status "failed" and the install exit code when a container installation fails. This improves visibility into install failures for monitoring/telemetry.
2026-02-12 11:55:13 +01:00
community-scripts-pr-app[bot]
c8b278f26f chore: update github-versions.json (#11821)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 06:25:48 +00:00
community-scripts-pr-app[bot]
a6f0d7233e Update CHANGELOG.md (#11818)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 00:21:24 +00:00
community-scripts-pr-app[bot]
079a436286 chore: update github-versions.json (#11817)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 00:21:01 +00:00
community-scripts-pr-app[bot]
1c2ed6ff10 Update CHANGELOG.md (#11813)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 21:30:42 +00:00
Chris
c1d7f23a17 [Feature] OpenCloud: support PosixFS Collaborative Mode (#11806)
* [Feature] OpenCloud: support PosixFS Collaborative Mode

* Use ensure_dependencies in opencloud.sh
2026-02-11 22:30:09 +01:00
community-scripts-pr-app[bot]
fdbe48badb Update CHANGELOG.md (#11812)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 21:05:59 +00:00
CanbiZ (MickLesk)
d09dd0b664 dispatcharr: include port 9191 in success-message (#11808) 2026-02-11 22:05:32 +01:00
community-scripts-pr-app[bot]
cba6717469 Update CHANGELOG.md (#11809)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 19:48:18 +00:00
Tom Frenzel
0a12acf6bd fix: make donetick 0.1.71 compatible (#11804) 2026-02-11 20:47:52 +01:00
community-scripts-pr-app[bot]
4e4defa236 chore: update github-versions.json (#11805)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 18:22:35 +00:00
community-scripts-pr-app[bot]
c15f69712f Update CHANGELOG.md (#11802)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 15:39:16 +00:00
ls-root
b53a731c42 fix(core): respect EDITOR variable for config editing (#11693)
- Replace hardcoded nano with ${EDITOR:-nano} for file editing
- Default to nano if no EDITOR environment variable is set
2026-02-11 16:38:46 +01:00
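The substitution above is a one-line parameter expansion; a tiny sketch (helper name illustrative):

```shell
# Pick the user's editor, defaulting to nano when $EDITOR is unset or empty.
choose_editor() {
  echo "${EDITOR:-nano}"   # real code would run: "$(choose_editor)" "$config_file"
}
```

Note `:-` (rather than `-`) also falls back when EDITOR is set but empty.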
community-scripts-pr-app[bot]
ddfe9166a1 Update .app files (#11793)
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2026-02-11 13:33:31 +01:00
community-scripts-pr-app[bot]
1b1c84ad4f Update CHANGELOG.md (#11797)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 12:33:15 +00:00
CanbiZ (MickLesk)
db69c7b0f8 fix(kasm): Support new version URL format without hash suffix (#11787)
Kasm changed their release URL format starting with v1.18.1:
- Old format: kasm_release_1.18.0.09f70a.tar.gz (with hash)
- New format: kasm_release_1.18.1.tar.gz (without hash)

The script now tries both detection methods:
1. First tries to find URL with hash suffix (old format)
2. Falls back to detecting version from service_images URLs and
   constructing the new URL format

This fixes the update detection for Kasm v1.18.1 and future versions.

Fixes #11785
2026-02-11 13:32:48 +01:00
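Both URL formats can be handled by a single regex with an optional hash group; a hedged sketch (function name illustrative, not the actual ct/kasm.sh code, which also probes the download server):

```shell
# Extract the Kasm version from a release tarball name, accepting both the
# old hash-suffixed format and the new plain format.
kasm_version_from_url() {
  local url=$1
  if [[ $url =~ kasm_release_([0-9]+\.[0-9]+\.[0-9]+)(\.[0-9a-f]{6})?\.tar\.gz ]]; then
    echo "${BASH_REMATCH[1]}"
  else
    return 1
  fi
}
```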
community-scripts-pr-app[bot]
53b3b4bf9f chore: update github-versions.json (#11796)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 12:17:32 +00:00
community-scripts-pr-app[bot]
8fadcc0130 Update CHANGELOG.md (#11794)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 11:27:00 +00:00
community-scripts-pr-app[bot]
5aff8dc2f1 Update CHANGELOG.md (#11792)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 11:26:39 +00:00
community-scripts-pr-app[bot]
e7ed841361 Update date in json (#11791)
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2026-02-11 11:26:32 +00:00
push-app-to-main[bot]
7e49c222e5 Draw.io (#11788)
Co-authored-by: push-app-to-main[bot] <203845782+push-app-to-main[bot]@users.noreply.github.com>
2026-02-11 12:26:14 +01:00
community-scripts-pr-app[bot]
9f31012598 Update CHANGELOG.md (#11786)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 10:52:47 +00:00
Slaviša Arežina
811062f958 remove Torch (#11783) 2026-02-11 11:52:20 +01:00
community-scripts-pr-app[bot]
893b0bfb4a Update CHANGELOG.md (#11784)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 09:34:25 +00:00
Romain PINSOLLE
f34f994560 Snowshare: fix update script (#11726)
* Snowshare: fix update error

* Implement upload backup and restore in update script

Added backup and restore functionality for uploads during the Snowshare update process.

---------

Co-authored-by: CanbiZ (MickLesk) <47820557+MickLesk@users.noreply.github.com>
2026-02-11 10:33:55 +01:00
community-scripts-pr-app[bot]
216b389635 chore: update github-versions.json (#11781)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 06:26:01 +00:00
community-scripts-pr-app[bot]
d062baf8c9 Update CHANGELOG.md (#11780)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 06:25:27 +00:00
Tiago Noronha
e09e244c3d Fix formatting in kutt.json notes section (#11774) 2026-02-11 07:25:02 +01:00
community-scripts-pr-app[bot]
2645f4cf4d Update CHANGELOG.md (#11779)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 00:26:43 +00:00
community-scripts-pr-app[bot]
a0b55b6934 chore: update github-versions.json (#11778)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-11 00:26:14 +00:00
community-scripts-pr-app[bot]
b263dc25fe Update CHANGELOG.md (#11777)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 22:05:48 +00:00
Chris
ac308c931e Refactor: Slskd & Soularr (#11674) 2026-02-10 23:04:56 +01:00
Chris
a16dfb6d82 Immich: Pin version to 2.5.6 (#11775) 2026-02-10 22:43:15 +01:00
community-scripts-pr-app[bot]
63e9bc3729 Update CHANGELOG.md (#11773)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 19:55:15 +00:00
Slaviša Arežina
3735f9251b Fix setuptools (#11772) 2026-02-10 20:54:30 +01:00
community-scripts-pr-app[bot]
fc2559c702 Update CHANGELOG.md (#11770)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 18:31:37 +00:00
carlosmaroot
5f2d463408 feat: enhance backup notes with guest name and improve whitespace handling and content line matching in container parsing. (#11752) 2026-02-10 19:30:54 +01:00
community-scripts-pr-app[bot]
69e0dc6968 chore: update github-versions.json (#11769)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 18:25:11 +00:00
community-scripts-pr-app[bot]
fccb8a923a Update CHANGELOG.md (#11766)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 14:27:13 +00:00
CanbiZ (MickLesk)
53dbb9d705 fix(elementsynapse): prevent systemd invoke failure during apt install in LXC (#11758) 2026-02-10 15:26:22 +01:00
community-scripts-pr-app[bot]
236c5296b8 Update CHANGELOG.md (#11764)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 14:19:46 +00:00
CanbiZ (MickLesk)
76c7e3a67f fix(workflow): include addon scripts in github-versions extraction (#11757) 2026-02-10 15:19:09 +01:00
community-scripts-pr-app[bot]
4dbb139c60 Update CHANGELOG.md (#11759)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 12:33:09 +00:00
BirdMakingStuff
c581704fdd Snowshare: fix typo in config file path on website (#11754)
* fix: Fixed typo in snowshare systemd service

* Revert "fix: Fixed typo in snowshare systemd service"

This reverts commit cce1449caa.

* update config_path of snowshare.json instead

* fix formatting issue
2026-02-10 13:32:42 +01:00
community-scripts-pr-app[bot]
e0641d1573 chore: update github-versions.json (#11756)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 12:19:10 +00:00
community-scripts-pr-app[bot]
058e860f6d Update .app files (#11751)
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2026-02-10 08:27:06 +01:00
community-scripts-pr-app[bot]
2b87aa6d52 Update CHANGELOG.md (#11750)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 07:24:02 +00:00
push-app-to-main[bot]
9640caf7bc paperless-exporter (#11737) 2026-02-10 08:23:33 +01:00
community-scripts-pr-app[bot]
fc044a73ca chore: update github-versions.json (#11749)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 06:27:33 +00:00
community-scripts-pr-app[bot]
c97ccbe9bb Update CHANGELOG.md (#11747)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 00:26:38 +00:00
community-scripts-pr-app[bot]
5f69d8c315 chore: update github-versions.json (#11746)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-10 00:26:07 +00:00
community-scripts-pr-app[bot]
e8cc7ce8ff chore: update github-versions.json (#11741)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 18:20:11 +00:00
CanbiZ (MickLesk)
1357a6f26e fix(msg_menu): redirect menu display to /dev/tty to prevent capture in command substitution 2026-02-09 17:12:56 +01:00
community-scripts-pr-app[bot]
d2da5af858 Update CHANGELOG.md (#11736)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 14:54:50 +00:00
CanbiZ (MickLesk)
14755d5efe fix: add --clear to uv venv calls for uv 0.10 compatibility (#11723)
uv 0.10 requires --clear flag to overwrite existing virtual environments.
Without it, update scripts fail when the venv already exists.

Affected: 13 ct/ update scripts, 25 install/ scripts, glances addon
2026-02-09 15:54:22 +01:00
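A version gate for this behavior change could look like the sketch below; it is purely illustrative — the affected scripts simply pass --clear unconditionally, and uv is not invoked here:

```shell
# True if the given uv version (major.minor[.patch]) is >= 0.10, i.e.
# `uv venv` needs --clear to overwrite an existing virtual environment.
uv_needs_clear() {
  local major minor
  IFS=. read -r major minor _ <<<"$1"
  (( major > 0 || minor >= 10 ))
}
```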
community-scripts-pr-app[bot]
927c3a7c48 Update CHANGELOG.md (#11735)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 14:53:53 +00:00
CanbiZ (MickLesk)
9ed365e9eb fix(koillection): add missing setup_composer in update script (#11734) 2026-02-09 15:53:21 +01:00
community-scripts-pr-app[bot]
d5cdfc7405 Update CHANGELOG.md (#11732)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 13:16:31 +00:00
Slaviša Arežina
3d2bc05092 Refactor: FileFlows (#11108)
* Refactor

* Apply suggestion from @tremor021

* Apply suggestion from @tremor021
2026-02-09 14:16:04 +01:00
CanbiZ (MickLesk)
a330afde03 fix(umlautadaptarr): use release appsettings.json instead of hardcoded copy (#11725)
The install script overwrote the correct appsettings.json shipped in the
release archive with a hardcoded copy that was missing newer required
fields (ApiKey, ProxyPort, EnableChangedTitleCache) and had structural
differences (Lidarr/Readarr as arrays instead of objects), causing the
service to fail on startup.

- Remove hardcoded appsettings.json from install script (release archive
  already ships the correct version)
- Backup and restore appsettings.json during updates to preserve user
  configuration

Closes #11665
2026-02-09 14:15:29 +01:00
community-scripts-pr-app[bot]
f19bc7722b Update CHANGELOG.md (#11731)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 13:12:46 +00:00
CanbiZ (MickLesk)
03571fb26c Refactor: wger (#11722)
* Refactor wger installation script for new services

Updated installation script for wger, changing dependencies and configuration for PostgreSQL, Gunicorn, and Celery. Adjusted paths and service configurations for better compatibility.

* Fix license URL in wger-install.sh

* Fix resource defaults and enhance update process

Updated default resource values and improved backup and restore process for wger installation.

* add json

* Update ct/wger.sh

Co-authored-by: Slaviša Arežina <58952836+tremor021@users.noreply.github.com>

* Update wger.sh

* Update install/wger-install.sh

Co-authored-by: Slaviša Arežina <58952836+tremor021@users.noreply.github.com>

---------

Co-authored-by: Slaviša Arežina <58952836+tremor021@users.noreply.github.com>
2026-02-09 14:12:22 +01:00
community-scripts-pr-app[bot]
c58ca1a70a Update CHANGELOG.md (#11730)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 13:12:15 +00:00
CanbiZ (MickLesk)
157e69b365 fix(addons): ensure curl is installed before use in all addon scripts (#11718) 2026-02-09 14:11:51 +01:00
community-scripts-pr-app[bot]
ee3e53a1a2 Update CHANGELOG.md (#11729)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 13:11:35 +00:00
CanbiZ (MickLesk)
c2cb89ddc5 fix(netbird): add systemd ordering to start after Docker (#11716)
When Docker is installed in the same LXC, Docker sets the FORWARD chain
policy to DROP on startup. If Netbird starts before Docker finishes
initializing its iptables rules, Docker overrides the Netbird routing
rules, causing traffic routing to fail despite the tunnel being up.

Add a systemd drop-in override that ensures netbird.service starts after
docker.service (only if Docker is installed). This prevents the race
condition and ensures correct iptables ordering after reboot.

Closes #11354
2026-02-09 14:11:07 +01:00
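The drop-in described above is a small unit override; a runnable sketch that writes it under a scratch root (the real fix targets /etc/systemd/system, applies only when Docker is installed, and its exact directives may differ):

```shell
# Write a systemd drop-in ordering netbird.service after docker.service,
# so Docker's iptables FORWARD policy is in place before Netbird adds its
# routing rules. A scratch root keeps the sketch unprivileged.
root=$(mktemp -d)
dropin="$root/etc/systemd/system/netbird.service.d/after-docker.conf"
mkdir -p "$(dirname "$dropin")"
cat >"$dropin" <<'EOF'
[Unit]
After=docker.service
Wants=docker.service
EOF
echo "wrote $dropin"
# on a real system, follow with: systemctl daemon-reload
```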
community-scripts-pr-app[bot]
d21df736fd chore: update github-versions.json (#11727)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-09 12:17:04 +00:00
CanbiZ (MickLesk)
59e27bfc8a hotfix nginxui remove jwt 2026-02-09 11:39:46 +01:00
129 changed files with 2326 additions and 1429 deletions

View File

@@ -89,9 +89,15 @@ jobs:
   slug=$(jq -r '.slug // empty' "$json_file" 2>/dev/null)
   [[ -z "$slug" ]] && continue
-  # Find corresponding install script
-  install_script="install/${slug}-install.sh"
-  [[ ! -f "$install_script" ]] && continue
+  # Find corresponding script (install script or addon script)
+  install_script=""
+  if [[ -f "install/${slug}-install.sh" ]]; then
+    install_script="install/${slug}-install.sh"
+  elif [[ -f "tools/addon/${slug}.sh" ]]; then
+    install_script="tools/addon/${slug}.sh"
+  else
+    continue
+  fi
   # Look for fetch_and_deploy_gh_release calls
   # Pattern: fetch_and_deploy_gh_release "app" "owner/repo" ["mode"] ["version"]

View File

@@ -401,14 +401,143 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 </details>
+## 2026-02-13
+### 🚀 Updated Scripts
+- #### 🐞 Bug Fixes
+  - OpenWebUI: pin numba constraint [@MickLesk](https://github.com/MickLesk) ([#11874](https://github.com/community-scripts/ProxmoxVE/pull/11874))
+  - Planka: add migrate step to update function [@ZimmermannLeon](https://github.com/ZimmermannLeon) ([#11877](https://github.com/community-scripts/ProxmoxVE/pull/11877))
+  - Pangolin: switch sqlite-specific back to generic [@MickLesk](https://github.com/MickLesk) ([#11868](https://github.com/community-scripts/ProxmoxVE/pull/11868))
+  - [Hotfix] Jotty: Copy contents of config backup into /opt/jotty/config [@vhsdream](https://github.com/vhsdream) ([#11864](https://github.com/community-scripts/ProxmoxVE/pull/11864))
+- #### 🔧 Refactor
+  - chore(donetick): add config entry for v0.1.73 [@tomfrenzel](https://github.com/tomfrenzel) ([#11872](https://github.com/community-scripts/ProxmoxVE/pull/11872))
+  - Refactor: Radicale [@vhsdream](https://github.com/vhsdream) ([#11850](https://github.com/community-scripts/ProxmoxVE/pull/11850))
+### 📡 API
+- #### ✨ New Features
+  - error-handler: Implement json_escape and enhance error handling [@MickLesk](https://github.com/MickLesk) ([#11875](https://github.com/community-scripts/ProxmoxVE/pull/11875))
+### 🌐 Website
+- #### 📝 Script Information
+  - SQLServer-2025: add PVE9/Kernel 6.x incompatibility warning [@MickLesk](https://github.com/MickLesk) ([#11829](https://github.com/community-scripts/ProxmoxVE/pull/11829))
+## 2026-02-12
+### 🚀 Updated Scripts
+- #### 🐞 Bug Fixes
+  - EMQX: increase disk to 6GB and add optional MQ disable prompt [@MickLesk](https://github.com/MickLesk) ([#11844](https://github.com/community-scripts/ProxmoxVE/pull/11844))
+  - Increased the Grafana container default disk size. [@shtefko](https://github.com/shtefko) ([#11840](https://github.com/community-scripts/ProxmoxVE/pull/11840))
+  - Pangolin: Update database generation command in install script [@tremor021](https://github.com/tremor021) ([#11825](https://github.com/community-scripts/ProxmoxVE/pull/11825))
+  - Deluge: add python3-setuptools as dep [@MickLesk](https://github.com/MickLesk) ([#11833](https://github.com/community-scripts/ProxmoxVE/pull/11833))
+  - Dispatcharr: migrate to uv sync [@MickLesk](https://github.com/MickLesk) ([#11831](https://github.com/community-scripts/ProxmoxVE/pull/11831))
+- #### ✨ New Features
+  - Archlinux-VM: fix LVM/LVM-thin storage and improve error reporting | VM's add correct exit_code for analytics [@MickLesk](https://github.com/MickLesk) ([#11842](https://github.com/community-scripts/ProxmoxVE/pull/11842))
+  - Debian13-VM: Optimize First Boot & add noCloud/Cloud Selection [@MickLesk](https://github.com/MickLesk) ([#11810](https://github.com/community-scripts/ProxmoxVE/pull/11810))
+### 💾 Core
+- #### ✨ New Features
+  - tools.func: auto-detect binary vs armored GPG keys in setup_deb822_repo [@MickLesk](https://github.com/MickLesk) ([#11841](https://github.com/community-scripts/ProxmoxVE/pull/11841))
+  - core: remove old Go API and extend misc/api.func with new backend [@MickLesk](https://github.com/MickLesk) ([#11822](https://github.com/community-scripts/ProxmoxVE/pull/11822))
+- #### 🔧 Refactor
+  - error_handler: prevent stuck 'installing' status [@MickLesk](https://github.com/MickLesk) ([#11845](https://github.com/community-scripts/ProxmoxVE/pull/11845))
+### 🧰 Tools
+- #### 🐞 Bug Fixes
+  - Tailscale: fix DNS check and keyrings directory issues [@MickLesk](https://github.com/MickLesk) ([#11837](https://github.com/community-scripts/ProxmoxVE/pull/11837))
+## 2026-02-11
+### 🆕 New Scripts
+- Draw.io ([#11788](https://github.com/community-scripts/ProxmoxVE/pull/11788))
+### 🚀 Updated Scripts
+- #### 🐞 Bug Fixes
+  - dispatcharr: include port 9191 in success-message [@MickLesk](https://github.com/MickLesk) ([#11808](https://github.com/community-scripts/ProxmoxVE/pull/11808))
+  - fix: make donetick 0.1.71 compatible [@tomfrenzel](https://github.com/tomfrenzel) ([#11804](https://github.com/community-scripts/ProxmoxVE/pull/11804))
+  - Kasm: Support new version URL format without hash suffix [@MickLesk](https://github.com/MickLesk) ([#11787](https://github.com/community-scripts/ProxmoxVE/pull/11787))
+  - LibreTranslate: Remove Torch [@tremor021](https://github.com/tremor021) ([#11783](https://github.com/community-scripts/ProxmoxVE/pull/11783))
+  - Snowshare: fix update script [@TuroYT](https://github.com/TuroYT) ([#11726](https://github.com/community-scripts/ProxmoxVE/pull/11726))
+- #### ✨ New Features
+  - [Feature] OpenCloud: support PosixFS Collaborative Mode [@vhsdream](https://github.com/vhsdream) ([#11806](https://github.com/community-scripts/ProxmoxVE/pull/11806))
+### 💾 Core
+- #### 🔧 Refactor
+  - core: respect EDITOR variable for config editing [@ls-root](https://github.com/ls-root) ([#11693](https://github.com/community-scripts/ProxmoxVE/pull/11693))
+### 📚 Documentation
+- Fix formatting in kutt.json notes section [@tiagodenoronha](https://github.com/tiagodenoronha) ([#11774](https://github.com/community-scripts/ProxmoxVE/pull/11774))
+## 2026-02-10
+### 🚀 Updated Scripts
+- #### 🐞 Bug Fixes
+  - Immich: Pin version to 2.5.6 [@vhsdream](https://github.com/vhsdream) ([#11775](https://github.com/community-scripts/ProxmoxVE/pull/11775))
+  - Libretranslate: Fix setuptools [@tremor021](https://github.com/tremor021) ([#11772](https://github.com/community-scripts/ProxmoxVE/pull/11772))
+  - Element Synapse: prevent systemd invoke failure during apt install [@MickLesk](https://github.com/MickLesk) ([#11758](https://github.com/community-scripts/ProxmoxVE/pull/11758))
+- #### ✨ New Features
+  - Refactor: Slskd & Soularr [@vhsdream](https://github.com/vhsdream) ([#11674](https://github.com/community-scripts/ProxmoxVE/pull/11674))
+### 🗑️ Deleted Scripts
+- move paperless-exporter from LXC to addon ([#11737](https://github.com/community-scripts/ProxmoxVE/pull/11737))
+### 🧰 Tools
+- #### 🐞 Bug Fixes
+  - feat: improve storage parsing & add guestname [@carlosmaroot](https://github.com/carlosmaroot) ([#11752](https://github.com/community-scripts/ProxmoxVE/pull/11752))
+### 📂 Github
+- Github-Version Workflow: include addon scripts in extraction [@MickLesk](https://github.com/MickLesk) ([#11757](https://github.com/community-scripts/ProxmoxVE/pull/11757))
+### 🌐 Website
+- #### 📝 Script Information
+  - Snowshare: fix typo in config file path on website [@BirdMakingStuff](https://github.com/BirdMakingStuff) ([#11754](https://github.com/community-scripts/ProxmoxVE/pull/11754))
 ## 2026-02-09
 ### 🚀 Updated Scripts
 - #### 🐞 Bug Fixes
-  - tracearr: prepare for next stable release [@durzo](https://github.com/durzo) ([#11673](https://github.com/community-scripts/ProxmoxVE/pull/11673))
+  - several scripts: add --clear to uv venv calls for uv 0.10 compatibility [@MickLesk](https://github.com/MickLesk) ([#11723](https://github.com/community-scripts/ProxmoxVE/pull/11723))
+  - Koillection: ensure setup_composer is in update script [@MickLesk](https://github.com/MickLesk) ([#11734](https://github.com/community-scripts/ProxmoxVE/pull/11734))
   - PeaNUT: symlink server.js after update [@vhsdream](https://github.com/vhsdream) ([#11696](https://github.com/community-scripts/ProxmoxVE/pull/11696))
+  - Umlautadaptarr: use release appsettings.json instead of hardcoded copy [@MickLesk](https://github.com/MickLesk) ([#11725](https://github.com/community-scripts/ProxmoxVE/pull/11725))
+  - tracearr: prepare for next stable release [@durzo](https://github.com/durzo) ([#11673](https://github.com/community-scripts/ProxmoxVE/pull/11673))
 - #### ✨ New Features
@@ -416,6 +545,8 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 - #### 🔧 Refactor
+  - Refactor: FileFlows [@tremor021](https://github.com/tremor021) ([#11108](https://github.com/community-scripts/ProxmoxVE/pull/11108))
+  - Refactor: wger [@MickLesk](https://github.com/MickLesk) ([#11722](https://github.com/community-scripts/ProxmoxVE/pull/11722))
   - Nginx-UI: better User Handling | ACME [@MickLesk](https://github.com/MickLesk) ([#11715](https://github.com/community-scripts/ProxmoxVE/pull/11715))
   - NginxProxymanager: use better-sqlite3 [@MickLesk](https://github.com/MickLesk) ([#11708](https://github.com/community-scripts/ProxmoxVE/pull/11708))
@@ -425,6 +556,13 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
   - hwaccel: add libmfx-gen1.2 to Intel Arc setup for QSV support [@MickLesk](https://github.com/MickLesk) ([#11707](https://github.com/community-scripts/ProxmoxVE/pull/11707))
+### 🧰 Tools
+- #### 🐞 Bug Fixes
+  - addons: ensure curl is installed before use [@MickLesk](https://github.com/MickLesk) ([#11718](https://github.com/community-scripts/ProxmoxVE/pull/11718))
+  - Netbird (addon): add systemd ordering to start after Docker [@MickLesk](https://github.com/MickLesk) ([#11716](https://github.com/community-scripts/ProxmoxVE/pull/11716))
 ### ❔ Uncategorized
 - Bichon: Update website [@tremor021](https://github.com/tremor021) ([#11711](https://github.com/community-scripts/ProxmoxVE/pull/11711))

View File

@@ -1,5 +0,0 @@
MONGO_USER=
MONGO_PASSWORD=
MONGO_IP=
MONGO_PORT=
MONGO_DATABASE=

View File

@@ -1,23 +0,0 @@
module proxmox-api
go 1.24.0
require (
github.com/gorilla/mux v1.8.1
github.com/joho/godotenv v1.5.1
github.com/rs/cors v1.11.1
go.mongodb.org/mongo-driver v1.17.2
)
require (
github.com/golang/snappy v0.0.4 // indirect
github.com/klauspost/compress v1.16.7 // indirect
github.com/montanaflynn/stats v0.7.1 // indirect
github.com/xdg-go/pbkdf2 v1.0.0 // indirect
github.com/xdg-go/scram v1.1.2 // indirect
github.com/xdg-go/stringprep v1.0.4 // indirect
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78 // indirect
golang.org/x/crypto v0.45.0 // indirect
golang.org/x/sync v0.18.0 // indirect
golang.org/x/text v0.31.0 // indirect
)

View File

@@ -1,56 +0,0 @@
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/golang/snappy v0.0.4 h1:yAGX7huGHXlcLOEtBnF4w7FQwA26wojNCwOYAEhLjQM=
github.com/golang/snappy v0.0.4/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=
github.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=
github.com/joho/godotenv v1.5.1 h1:7eLL/+HRGLY0ldzfGMeQkb7vMd0as4CfYvUVzLqw0N0=
github.com/joho/godotenv v1.5.1/go.mod h1:f4LDr5Voq0i2e/R5DDNOoa2zzDfwtkZa6DnEwAbqwq4=
github.com/klauspost/compress v1.16.7 h1:2mk3MPGNzKyxErAw8YaohYh69+pa4sIQSC0fPGCFR9I=
github.com/klauspost/compress v1.16.7/go.mod h1:ntbaceVETuRiXiv4DpjP66DpAtAGkEQskQzEyD//IeE=
github.com/montanaflynn/stats v0.7.1 h1:etflOAAHORrCC44V+aR6Ftzort912ZU+YLiSTuV8eaE=
github.com/montanaflynn/stats v0.7.1/go.mod h1:etXPPgVO6n31NxCd9KQUMvCM+ve0ruNzt6R8Bnaayow=
github.com/rs/cors v1.11.1 h1:eU3gRzXLRK57F5rKMGMZURNdIG4EoAmX8k94r9wXWHA=
github.com/rs/cors v1.11.1/go.mod h1:XyqrcTp5zjWr1wsJ8PIRZssZ8b/WMcMf71DJnit4EMU=
github.com/xdg-go/pbkdf2 v1.0.0 h1:Su7DPu48wXMwC3bs7MCNG+z4FhcyEuz5dlvchbq0B0c=
github.com/xdg-go/pbkdf2 v1.0.0/go.mod h1:jrpuAogTd400dnrH08LKmI/xc1MbPOebTwRqcT5RDeI=
github.com/xdg-go/scram v1.1.2 h1:FHX5I5B4i4hKRVRBCFRxq1iQRej7WO3hhBuJf+UUySY=
github.com/xdg-go/scram v1.1.2/go.mod h1:RT/sEzTbU5y00aCK8UOx6R7YryM0iF1N2MOmC3kKLN4=
github.com/xdg-go/stringprep v1.0.4 h1:XLI/Ng3O1Atzq0oBs3TWm+5ZVgkq2aqdlvP9JtoZ6c8=
github.com/xdg-go/stringprep v1.0.4/go.mod h1:mPGuuIYwz7CmR2bT9j4GbQqutWS1zV24gijq1dTyGkM=
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78 h1:ilQV1hzziu+LLM3zUTJ0trRztfwgjqKnBWNtSRkbmwM=
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78/go.mod h1:aL8wCCfTfSfmXjznFBSZNN13rSJjlIOI1fUNAtF7rmI=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
go.mongodb.org/mongo-driver v1.17.2 h1:gvZyk8352qSfzyZ2UMWcpDpMSGEr1eqE4T793SqyhzM=
go.mongodb.org/mongo-driver v1.17.2/go.mod h1:Hy04i7O2kC4RS06ZrhPRqj/u4DTYkFDAAccj+rVKqgQ=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.3.8/go.mod h1:E6s5w1FMmriuDzIBO73fBruAKo1PCIq6d2Q6DHfQ8WQ=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=

View File

@@ -1,450 +0,0 @@
// Copyright (c) 2021-2026 community-scripts ORG
// Author: Michel Roegl-Brunner (michelroegl-brunner)
// License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"net/http"
"os"
"strconv"
"time"
"github.com/gorilla/mux"
"github.com/joho/godotenv"
"github.com/rs/cors"
"go.mongodb.org/mongo-driver/bson"
"go.mongodb.org/mongo-driver/bson/primitive"
"go.mongodb.org/mongo-driver/mongo"
"go.mongodb.org/mongo-driver/mongo/options"
)
var client *mongo.Client
var collection *mongo.Collection
func loadEnv() {
if err := godotenv.Load(); err != nil {
log.Fatal("Error loading .env file")
}
}
// DataModel represents a single document in MongoDB
type DataModel struct {
ID primitive.ObjectID `json:"id" bson:"_id,omitempty"`
CT_TYPE uint `json:"ct_type" bson:"ct_type"`
DISK_SIZE float32 `json:"disk_size" bson:"disk_size"`
CORE_COUNT uint `json:"core_count" bson:"core_count"`
RAM_SIZE uint `json:"ram_size" bson:"ram_size"`
OS_TYPE string `json:"os_type" bson:"os_type"`
OS_VERSION string `json:"os_version" bson:"os_version"`
DISABLEIP6 string `json:"disableip6" bson:"disableip6"`
NSAPP string `json:"nsapp" bson:"nsapp"`
METHOD string `json:"method" bson:"method"`
CreatedAt time.Time `json:"created_at" bson:"created_at"`
PVEVERSION string `json:"pve_version" bson:"pve_version"`
STATUS string `json:"status" bson:"status"`
RANDOM_ID string `json:"random_id" bson:"random_id"`
TYPE string `json:"type" bson:"type"`
ERROR string `json:"error" bson:"error"`
}
type StatusModel struct {
RANDOM_ID string `json:"random_id" bson:"random_id"`
ERROR string `json:"error" bson:"error"`
STATUS string `json:"status" bson:"status"`
}
type CountResponse struct {
TotalEntries int64 `json:"total_entries"`
StatusCount map[string]int64 `json:"status_count"`
NSAPPCount map[string]int64 `json:"nsapp_count"`
}
// ConnectDatabase initializes the MongoDB connection
func ConnectDatabase() {
loadEnv()
mongoURI := fmt.Sprintf("mongodb://%s:%s@%s:%s",
os.Getenv("MONGO_USER"),
os.Getenv("MONGO_PASSWORD"),
os.Getenv("MONGO_IP"),
os.Getenv("MONGO_PORT"))
database := os.Getenv("MONGO_DATABASE")
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
var err error
client, err = mongo.Connect(ctx, options.Client().ApplyURI(mongoURI))
if err != nil {
log.Fatal("Failed to connect to MongoDB!", err)
}
collection = client.Database(database).Collection("data_models")
fmt.Println("Connected to MongoDB on 10.10.10.18")
}
// UploadJSON handles API requests and stores data as a document in MongoDB
func UploadJSON(w http.ResponseWriter, r *http.Request) {
var input DataModel
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
http.Error(w, err.Error(), http.StatusBadRequest)
return
}
input.CreatedAt = time.Now()
_, err := collection.InsertOne(context.Background(), input)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
log.Println("Received data:", input)
w.WriteHeader(http.StatusCreated)
json.NewEncoder(w).Encode(map[string]string{"message": "Data saved successfully"})
}
// UpdateStatus updates the status of a record based on RANDOM_ID
func UpdateStatus(w http.ResponseWriter, r *http.Request) {
var input StatusModel
if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
http.Error(w, err.Error(), http.StatusBadRequest)
return
}
filter := bson.M{"random_id": input.RANDOM_ID}
update := bson.M{"$set": bson.M{"status": input.STATUS, "error": input.ERROR}}
_, err := collection.UpdateOne(context.Background(), filter, update)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
log.Println("Updated data:", input)
w.WriteHeader(http.StatusOK)
json.NewEncoder(w).Encode(map[string]string{"message": "Record updated successfully"})
}
// GetDataJSON fetches all data from MongoDB
func GetDataJSON(w http.ResponseWriter, r *http.Request) {
var records []DataModel
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
cursor, err := collection.Find(ctx, bson.M{})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
var record DataModel
if err := cursor.Decode(&record); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
records = append(records, record)
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(records)
}
func GetPaginatedData(w http.ResponseWriter, r *http.Request) {
page, _ := strconv.Atoi(r.URL.Query().Get("page"))
limit, _ := strconv.Atoi(r.URL.Query().Get("limit"))
if page < 1 {
page = 1
}
if limit < 1 {
limit = 10
}
skip := (page - 1) * limit
var records []DataModel
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
options := options.Find().SetSkip(int64(skip)).SetLimit(int64(limit))
cursor, err := collection.Find(ctx, bson.M{}, options)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
var record DataModel
if err := cursor.Decode(&record); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
records = append(records, record)
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(records)
}
func GetSummary(w http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
totalCount, err := collection.CountDocuments(ctx, bson.M{})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
statusCount := make(map[string]int64)
nsappCount := make(map[string]int64)
pipeline := []bson.M{
{"$group": bson.M{"_id": "$status", "count": bson.M{"$sum": 1}}},
}
cursor, err := collection.Aggregate(ctx, pipeline)
if err == nil {
for cursor.Next(ctx) {
var result struct {
ID string `bson:"_id"`
Count int64 `bson:"count"`
}
if err := cursor.Decode(&result); err == nil {
statusCount[result.ID] = result.Count
}
}
}
pipeline = []bson.M{
{"$group": bson.M{"_id": "$nsapp", "count": bson.M{"$sum": 1}}},
}
cursor, err = collection.Aggregate(ctx, pipeline)
if err == nil {
for cursor.Next(ctx) {
var result struct {
ID string `bson:"_id"`
Count int64 `bson:"count"`
}
if err := cursor.Decode(&result); err == nil {
nsappCount[result.ID] = result.Count
}
}
}
response := CountResponse{
TotalEntries: totalCount,
StatusCount: statusCount,
NSAPPCount: nsappCount,
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(response)
}
func GetByNsapp(w http.ResponseWriter, r *http.Request) {
nsapp := r.URL.Query().Get("nsapp")
var records []DataModel
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
cursor, err := collection.Find(ctx, bson.M{"nsapp": nsapp})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
var record DataModel
if err := cursor.Decode(&record); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
records = append(records, record)
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(records)
}
func GetByDateRange(w http.ResponseWriter, r *http.Request) {
startDate := r.URL.Query().Get("start_date")
endDate := r.URL.Query().Get("end_date")
if startDate == "" || endDate == "" {
http.Error(w, "Both start_date and end_date are required", http.StatusBadRequest)
return
}
start, err := time.Parse("2006-01-02T15:04:05.999999+00:00", startDate+"T00:00:00+00:00")
if err != nil {
http.Error(w, "Invalid start_date format", http.StatusBadRequest)
return
}
end, err := time.Parse("2006-01-02T15:04:05.999999+00:00", endDate+"T23:59:59+00:00")
if err != nil {
http.Error(w, "Invalid end_date format", http.StatusBadRequest)
return
}
var records []DataModel
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
cursor, err := collection.Find(ctx, bson.M{
"created_at": bson.M{
"$gte": start,
"$lte": end,
},
})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
var record DataModel
if err := cursor.Decode(&record); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
records = append(records, record)
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(records)
}
func GetByStatus(w http.ResponseWriter, r *http.Request) {
status := r.URL.Query().Get("status")
var records []DataModel
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
cursor, err := collection.Find(ctx, bson.M{"status": status})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
var record DataModel
if err := cursor.Decode(&record); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
records = append(records, record)
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(records)
}
func GetByOS(w http.ResponseWriter, r *http.Request) {
osType := r.URL.Query().Get("os_type")
osVersion := r.URL.Query().Get("os_version")
var records []DataModel
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
cursor, err := collection.Find(ctx, bson.M{"os_type": osType, "os_version": osVersion})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
var record DataModel
if err := cursor.Decode(&record); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
records = append(records, record)
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(records)
}
func GetErrors(w http.ResponseWriter, r *http.Request) {
errorCount := make(map[string]int)
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
cursor, err := collection.Find(ctx, bson.M{"error": bson.M{"$ne": ""}})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
var record DataModel
if err := cursor.Decode(&record); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
if record.ERROR != "" {
errorCount[record.ERROR]++
}
}
type ErrorCountResponse struct {
Error string `json:"error"`
Count int `json:"count"`
}
var errorCounts []ErrorCountResponse
for err, count := range errorCount {
errorCounts = append(errorCounts, ErrorCountResponse{
Error: err,
Count: count,
})
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(struct {
ErrorCounts []ErrorCountResponse `json:"error_counts"`
}{
ErrorCounts: errorCounts,
})
}
func main() {
ConnectDatabase()
router := mux.NewRouter()
router.HandleFunc("/upload", UploadJSON).Methods("POST")
router.HandleFunc("/upload/updatestatus", UpdateStatus).Methods("POST")
router.HandleFunc("/data/json", GetDataJSON).Methods("GET")
router.HandleFunc("/data/paginated", GetPaginatedData).Methods("GET")
router.HandleFunc("/data/summary", GetSummary).Methods("GET")
router.HandleFunc("/data/nsapp", GetByNsapp).Methods("GET")
router.HandleFunc("/data/date", GetByDateRange).Methods("GET")
router.HandleFunc("/data/status", GetByStatus).Methods("GET")
router.HandleFunc("/data/os", GetByOS).Methods("GET")
router.HandleFunc("/data/errors", GetErrors).Methods("GET")
c := cors.New(cors.Options{
AllowedOrigins: []string{"*"},
AllowedMethods: []string{"GET", "POST"},
AllowedHeaders: []string{"Content-Type", "Authorization"},
AllowCredentials: true,
})
handler := c.Handler(router)
fmt.Println("Server running on port 8080")
log.Fatal(http.ListenAndServe(":8080", handler))
}

View File

@@ -51,7 +51,7 @@ function update_script() {
 cp -r /opt/adventurelog-backup/backend/server/media /opt/adventurelog/backend/server/media
 cd /opt/adventurelog/backend/server
 if [[ ! -x .venv/bin/python ]]; then
-  $STD uv venv .venv
+  $STD uv venv --clear .venv
   $STD .venv/bin/python -m ensurepip --upgrade
 fi
 $STD .venv/bin/python -m pip install --upgrade pip

View File

@@ -9,7 +9,7 @@ APP="Alpine-Grafana"
 var_tags="${var_tags:-alpine;monitoring}"
 var_cpu="${var_cpu:-1}"
 var_ram="${var_ram:-256}"
-var_disk="${var_disk:-1}"
+var_disk="${var_disk:-2}"
 var_os="${var_os:-alpine}"
 var_version="${var_version:-3.23}"
 var_unprivileged="${var_unprivileged:-1}"

View File

@@ -44,7 +44,7 @@ function update_script() {
 msg_info "Updating Autocaliweb"
 cd "$INSTALL_DIR"
 if [[ ! -d "$VIRTUAL_ENV" ]]; then
-  $STD uv venv "$VIRTUAL_ENV"
+  $STD uv venv --clear "$VIRTUAL_ENV"
 fi
 $STD uv sync --all-extras --active
 cd "$INSTALL_DIR"/koreader/plugins

View File

@@ -40,7 +40,7 @@ function update_script() {
 chmod 775 /opt/bazarr /var/lib/bazarr/
 # Always ensure venv exists
 if [[ ! -d /opt/bazarr/venv/ ]]; then
-  $STD uv venv /opt/bazarr/venv --python 3.12
+  $STD uv venv --clear /opt/bazarr/venv --python 3.12
 fi
 # Always check and fix service file if needed

View File

@@ -28,6 +28,7 @@ function update_script() {
   exit
 fi
 msg_info "Updating Deluge"
+ensure_dependencies python3-setuptools
 $STD apt update
 $STD pip3 install deluge[all] --upgrade
 msg_ok "Updated Deluge"

View File

@@ -103,8 +103,8 @@ function update_script() {
 cd /opt/dispatcharr
 rm -rf .venv
-$STD uv venv
-$STD uv pip install -r requirements.txt --index-strategy unsafe-best-match
+$STD uv venv --clear
+$STD uv sync
 $STD uv pip install gunicorn gevent celery redis daphne
 msg_ok "Updated Dispatcharr Backend"
@@ -144,4 +144,4 @@ description
 msg_ok "Completed successfully!\n"
 echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
 echo -e "${INFO}${YW} Access it using the following URL:${CL}"
-echo -e "${TAB}${GATEWAY}${BGN}http://${IP}${CL}"
+echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:9191${CL}"

View File

@@ -35,13 +35,15 @@ function update_script() {
     msg_ok "Stopped Service"
     msg_info "Backing Up Configurations"
-    mv /opt/donetick/config/selfhosted.yml /opt/donetick/donetick.db /opt
+    mv /opt/donetick/config/selfhosted.yaml /opt/donetick/donetick.db /opt
     msg_ok "Backed Up Configurations"
     CLEAN_INSTALL=1 fetch_and_deploy_gh_release "donetick" "donetick/donetick" "prebuild" "latest" "/opt/donetick" "donetick_Linux_x86_64.tar.gz"
     msg_info "Restoring Configurations"
-    mv /opt/selfhosted.yml /opt/donetick/config
+    mv /opt/selfhosted.yaml /opt/donetick/config
+    grep -q 'http://localhost"$' /opt/donetick/config/selfhosted.yaml || sed -i '/https:\/\/localhost"$/a\ - "http://localhost"' /opt/donetick/config/selfhosted.yaml
+    grep -q 'capacitor://localhost' /opt/donetick/config/selfhosted.yaml || sed -i '/http:\/\/localhost"$/a\ - "capacitor://localhost"' /opt/donetick/config/selfhosted.yaml
     mv /opt/donetick.db /opt/donetick
     msg_ok "Restored Configurations"
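The two `grep -q … || sed -i '/…/a\ …'` additions in the donetick hunk use a guard-then-append idiom: the origin line is only appended if it is not already present, so re-running the update script never duplicates entries. A minimal standalone sketch of the same idiom, using a throwaway temp file rather than the real selfhosted.yaml:

```shell
# Illustrative config fragment with only the https origin present
cfg=$(mktemp)
printf '  - "https://localhost"\n' >"$cfg"

append_http_origin() {
  # Append the http origin after the https line only if it is missing
  grep -q 'http://localhost"$' "$cfg" ||
    sed -i '/https:\/\/localhost"$/a\ - "http://localhost"' "$cfg"
}

append_http_origin # adds the line
append_http_origin # no-op on the second run
```

Running the function twice leaves exactly one `http://localhost` entry, which is what makes the update path safe to repeat.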

ct/drawio.sh (new file, 58 lines)

@@ -0,0 +1,58 @@
#!/usr/bin/env bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2026 community-scripts ORG
# Author: Slaviša Arežina (tremor021)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://www.drawio.com/
APP="DrawIO"
var_tags="${var_tags:-diagrams}"
var_cpu="${var_cpu:-1}"
var_ram="${var_ram:-2048}"
var_disk="${var_disk:-4}"
var_os="${var_os:-debian}"
var_version="${var_version:-13}"
var_unprivileged="${var_unprivileged:-1}"
header_info "$APP"
variables
color
catch_errors
function update_script() {
header_info
check_container_storage
check_container_resources
if [[ ! -f /var/lib/tomcat11/webapps/draw.war ]]; then
msg_error "No ${APP} Installation Found!"
exit
fi
if check_for_gh_release "drawio" "jgraph/drawio"; then
msg_info "Stopping service"
systemctl stop tomcat11
msg_ok "Service stopped"
msg_info "Updating Debian LXC"
$STD apt update
$STD apt upgrade -y
msg_ok "Updated Debian LXC"
USE_ORIGINAL_FILENAME=true fetch_and_deploy_gh_release "drawio" "jgraph/drawio" "singlefile" "latest" "/var/lib/tomcat11/webapps" "draw.war"
msg_info "Starting service"
systemctl start tomcat11
msg_ok "Service started"
msg_ok "Updated successfully!"
fi
exit
}
start
build_container
description
msg_ok "Completed Successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:8080/draw${CL}"


@@ -9,7 +9,7 @@ APP="EMQX"
 var_tags="${var_tags:-mqtt}"
 var_cpu="${var_cpu:-2}"
 var_ram="${var_ram:-1024}"
-var_disk="${var_disk:-4}"
+var_disk="${var_disk:-6}"
 var_os="${var_os:-debian}"
 var_version="${var_version:-13}"
 var_unprivileged="${var_unprivileged:-1}"

@@ -61,7 +61,7 @@ function update_script() {
     msg_info "Updating Backend"
     cd /opt/endurain/backend
     $STD poetry export -f requirements.txt --output requirements.txt --without-hashes
-    $STD uv venv
+    $STD uv venv --clear
     $STD uv pip install -r requirements.txt
     msg_ok "Backend Updated"

@@ -42,7 +42,7 @@ function update_script() {
     rm -rf "$VENV_PATH"
     mkdir -p /opt/esphome
     cd /opt/esphome
-    $STD uv venv "$VENV_PATH"
+    $STD uv venv --clear "$VENV_PATH"
     $STD "$VENV_PATH/bin/python" -m ensurepip --upgrade
     $STD "$VENV_PATH/bin/python" -m pip install --upgrade pip
     $STD "$VENV_PATH/bin/python" -m pip install esphome tornado esptool

@@ -37,17 +37,12 @@ function update_script() {
     msg_info "Stopped Service"
     msg_info "Creating Backup"
+    ls /opt/*.tar.gz &>/dev/null && rm -f /opt/*.tar.gz
     backup_filename="/opt/${APP}_backup_$(date +%F).tar.gz"
     tar -czf "$backup_filename" -C /opt/fileflows Data
     msg_ok "Backup Created"
-    msg_info "Updating $APP to latest version"
-    temp_file=$(mktemp)
-    curl -fsSL https://fileflows.com/downloads/zip -o "$temp_file"
-    $STD unzip -o -d /opt/fileflows "$temp_file"
-    rm -rf "$temp_file"
-    rm -rf "$backup_filename"
-    msg_ok "Updated $APP to latest version"
+    fetch_and_deploy_from_url "https://fileflows.com/downloads/zip" "/opt/fileflows"
     msg_info "Starting Service"
     systemctl start fileflows
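The retained context line builds the backup name from `date +%F`; since `%F` expands to `YYYY-MM-DD`, the name is date-stamped and the new `rm -f /opt/*.tar.gz` cleanup keeps at most one archive around. A quick sketch of just the naming (the path is illustrative and no tar archive is created here):

```shell
APP="fileflows"
# date +%F is ISO 8601 (YYYY-MM-DD), so backup names sort chronologically
backup_filename="/opt/${APP}_backup_$(date +%F).tar.gz"

# Verify the generated name matches the expected date-stamped pattern
echo "$backup_filename" |
  grep -Eq '^/opt/fileflows_backup_[0-9]{4}-[0-9]{2}-[0-9]{2}\.tar\.gz$'
```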

@@ -9,7 +9,7 @@ APP="Grafana"
 var_tags="${var_tags:-monitoring;visualization}"
 var_cpu="${var_cpu:-1}"
 var_ram="${var_ram:-512}"
-var_disk="${var_disk:-2}"
+var_disk="${var_disk:-4}"
 var_os="${var_os:-debian}"
 var_version="${var_version:-13}"
 var_unprivileged="${var_unprivileged:-1}"

ct/headers/drawio (new file, 6 lines)

@@ -0,0 +1,6 @@
____ ________
/ __ \_________ __ __/ _/ __ \
/ / / / ___/ __ `/ | /| / // // / / /
/ /_/ / / / /_/ /| |/ |/ // // /_/ /
/_____/_/ \__,_/ |__/|__/___/\____/


@@ -1,6 +0,0 @@
____ __ __ ____ __ _ _________ __ ______ __
/ __ \_________ ____ ___ ___ / /_/ /_ ___ __ _______ / __ \____ _____ ___ _____/ /__ __________ / | / / ____/ |/ / / ____/ ______ ____ _____/ /____ _____
/ /_/ / ___/ __ \/ __ `__ \/ _ \/ __/ __ \/ _ \/ / / / ___/_____/ /_/ / __ `/ __ \/ _ \/ ___/ / _ \/ ___/ ___/_____/ |/ / / __ | /_____/ __/ | |/_/ __ \/ __ \/ ___/ __/ _ \/ ___/
/ ____/ / / /_/ / / / / / / __/ /_/ / / / __/ /_/ (__ )_____/ ____/ /_/ / /_/ / __/ / / / __(__ |__ )_____/ /| / /_/ // /_____/ /____> </ /_/ / /_/ / / / /_/ __/ /
/_/ /_/ \____/_/ /_/ /_/\___/\__/_/ /_/\___/\__,_/____/ /_/ \__,_/ .___/\___/_/ /_/\___/____/____/ /_/ |_/\____//_/|_| /_____/_/|_/ .___/\____/_/ \__/\___/_/
/_/ /_/


@@ -105,7 +105,7 @@ EOF
     msg_ok "Image-processing libraries up to date"
   fi
-  RELEASE="2.5.5"
+  RELEASE="2.5.6"
   if check_for_gh_release "Immich" "immich-app/immich" "${RELEASE}"; then
     if [[ $(cat ~/.immich) > "2.5.1" ]]; then
       msg_info "Enabling Maintenance Mode"

@@ -46,7 +46,7 @@ function update_script() {
     msg_info "Restoring configuration & data"
     mv /opt/app.env /opt/jotty/.env
     [[ -d /opt/data ]] && mv /opt/data /opt/jotty/data
-    [[ -d /opt/jotty/config ]] && mv /opt/config/* /opt/jotty/config
+    [[ -d /opt/jotty/config ]] && cp -a /opt/config/* /opt/jotty/config && rm -rf /opt/config
     msg_ok "Restored configuration & data"
     msg_info "Starting Service"
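The switch from `mv` to `cp -a … && rm -rf` matters when the destination already contains same-named directories: `mv` refuses to replace a non-empty directory, while `cp -a` merges the trees and preserves modes and timestamps. A sketch with throwaway directories (names and contents are illustrative; `"$src"/.` is used instead of `"$src"/*` so dotfiles are included too):

```shell
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/themes" "$dst/themes"
echo new >"$src/themes/a.css"
echo old >"$dst/themes/a.css"

# 'mv "$src/themes" "$dst"' would fail here with "Directory not empty";
# cp -a merges into the existing tree, then the source is removed
cp -a "$src"/. "$dst" && rm -rf "$src"
```

After the merge the destination holds the new file contents and the source directory is gone, which is exactly what the jotty restore step needs.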

@@ -34,7 +34,7 @@ function update_script() {
     PYTHON_VERSION="3.12" setup_uv
     mkdir -p "$INSTALL_DIR"
     cd "$INSTALL_DIR"
-    $STD uv venv .venv
+    $STD uv venv --clear .venv
     $STD "$VENV_PYTHON" -m ensurepip --upgrade
     $STD "$VENV_PYTHON" -m pip install --upgrade pip
     $STD "$VENV_PYTHON" -m pip install jupyter

@@ -34,10 +34,19 @@ function update_script() {
   CURRENT_VERSION=$(readlink -f /opt/kasm/current | awk -F'/' '{print $4}')
   KASM_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_[0-9]+\.[0-9]+\.[0-9]+\.[a-z0-9]+\.tar\.gz' | head -n 1)
   if [[ -z "$KASM_URL" ]]; then
+    SERVICE_IMAGE_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_service_images_amd64_[0-9]+\.[0-9]+\.[0-9]+\.tar\.gz' | head -n 1)
+    if [[ -n "$SERVICE_IMAGE_URL" ]]; then
+      KASM_VERSION=$(echo "$SERVICE_IMAGE_URL" | sed -E 's/.*kasm_release_service_images_amd64_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
+      KASM_URL="https://kasm-static-content.s3.amazonaws.com/kasm_release_${KASM_VERSION}.tar.gz"
+    fi
+  else
+    KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
+  fi
+  if [[ -z "$KASM_URL" ]] || [[ -z "$KASM_VERSION" ]]; then
     msg_error "Unable to detect latest Kasm release URL."
     exit 1
   fi
-  KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
   msg_info "Checked for new version"
   msg_info "Removing outdated docker-compose plugin"
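Both new branches rely on the same `sed -E` capture group to pull the `x.y.z` version out of a release URL. A sketch with a made-up URL (the real one is scraped from the Kasm downloads page at runtime):

```shell
# Hypothetical URL in the kasm_release_<version>.<hash>.tar.gz format
KASM_URL="https://kasm-static-content.s3.amazonaws.com/kasm_release_1.16.1.abc123.tar.gz"

# The parenthesised group keeps only the three numeric components;
# the surrounding .* swallows the prefix and the trailing hash/extension
KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
```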

@@ -33,6 +33,7 @@ function update_script() {
     msg_ok "Stopped Service"
     PHP_VERSION="8.5" PHP_APACHE="YES" setup_php
+    setup_composer
     msg_info "Creating a backup"
     mv /opt/koillection/ /opt/koillection-backup

@@ -30,7 +30,7 @@ function update_script() {
   fi
   RELEASE="v5.0.2"
-  if check_for_gh_release "opencloud" "opencloud-eu/opencloud" "${RELEASE}"; then
+  if check_for_gh_release "OpenCloud" "opencloud-eu/opencloud" "${RELEASE}"; then
     msg_info "Stopping services"
     systemctl stop opencloud opencloud-wopi
     msg_ok "Stopped services"
@@ -38,9 +38,21 @@ function update_script() {
     msg_info "Updating packages"
     $STD apt-get update
     $STD apt-get dist-upgrade -y
+    ensure_dependencies "inotify-tools"
     msg_ok "Updated packages"
-    CLEAN_INSTALL=1 fetch_and_deploy_gh_release "opencloud" "opencloud-eu/opencloud" "singlefile" "${RELEASE}" "/usr/bin" "opencloud-*-linux-amd64"
+    CLEAN_INSTALL=1 fetch_and_deploy_gh_release "OpenCloud" "opencloud-eu/opencloud" "singlefile" "${RELEASE}" "/usr/bin" "opencloud-*-linux-amd64"
+    if ! grep -q 'POSIX_WATCH' /etc/opencloud/opencloud.env; then
+      sed -i '/^## External/i ## Uncomment below to enable PosixFS Collaborative Mode\
+## Increase inotify watch/instance limits on your PVE host:\
+### sysctl -w fs.inotify.max_user_watches=1048576\
+### sysctl -w fs.inotify.max_user_instances=1024\
+# STORAGE_USERS_POSIX_ENABLE_COLLABORATION=true\
+# STORAGE_USERS_POSIX_WATCH_TYPE=inotifywait\
+# STORAGE_USERS_POSIX_WATCH_FS=true\
+# STORAGE_USERS_POSIX_WATCH_PATH=<path-to-storage-or-bind-mount>' /etc/opencloud/opencloud.env
+    fi
     msg_info "Starting services"
     systemctl start opencloud opencloud-wopi

@@ -44,7 +44,7 @@ function update_script() {
     msg_info "Installing uv-based Open-WebUI"
     PYTHON_VERSION="3.12" setup_uv
-    $STD uv tool install --python 3.12 open-webui[all]
+    $STD uv tool install --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
     msg_ok "Installed uv-based Open-WebUI"
     msg_info "Restoring data"
@@ -126,7 +126,7 @@ EOF
     msg_info "Updating Open WebUI via uv"
     PYTHON_VERSION="3.12" setup_uv
-    $STD uv tool upgrade --python 3.12 open-webui[all]
+    $STD uv tool install --force --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
     systemctl restart open-webui
     msg_ok "Updated Open WebUI"
     msg_ok "Updated successfully!"
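The `--constraint <(echo "numba>=0.60")` trick uses bash process substitution: `<(…)` exposes the command's output as a `/dev/fd/N` path, so uv can read it like a constraints file without anything being written to disk. A minimal sketch of the construct itself, with `cat` standing in for uv:

```shell
# cat receives a /dev/fd path and reads the pinned requirement from it,
# just as uv reads a --constraint file
pinned=$(cat <(echo "numba>=0.60"))
```

This requires bash (process substitution is not POSIX sh), which the ct/ scripts already declare via their shebang.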

@@ -51,7 +51,7 @@ function update_script() {
   $STD npm run db:generate
   $STD npm run build
   $STD npm run build:cli
-  $STD npm run db:sqlite:push
+  $STD npm run db:push
   cp -R .next/standalone ./
   chmod +x ./dist/cli.mjs
   cp server/db/names.json ./dist/names.json

@@ -61,6 +61,12 @@ function update_script() {
     rm -rf "$BK"
     msg_ok "Restored data"
+    msg_ok "Migrate Database"
+    cd /opt/planka
+    $STD npm run db:upgrade
+    $STD npm run db:migrate
+    msg_ok "Migrated Database"
     msg_info "Starting Service"
     systemctl start planka
     msg_ok "Started Service"

@@ -1,52 +0,0 @@
#!/usr/bin/env bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2026 community-scripts ORG
# Author: Andy Grunwald (andygrunwald)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/hansmi/prometheus-paperless-exporter
APP="Prometheus-Paperless-NGX-Exporter"
var_tags="${var_tags:-monitoring;alerting}"
var_cpu="${var_cpu:-1}"
var_ram="${var_ram:-256}"
var_disk="${var_disk:-2}"
var_os="${var_os:-debian}"
var_version="${var_version:-13}"
var_unprivileged="${var_unprivileged:-1}"
header_info "$APP"
variables
color
catch_errors
function update_script() {
header_info
check_container_storage
check_container_resources
if [[ ! -f /etc/systemd/system/prometheus-paperless-ngx-exporter.service ]]; then
msg_error "No ${APP} Installation Found!"
exit
fi
if check_for_gh_release "prom-paperless-exp" "hansmi/prometheus-paperless-exporter"; then
msg_info "Stopping Service"
systemctl stop prometheus-paperless-ngx-exporter
msg_ok "Stopped Service"
fetch_and_deploy_gh_release "prom-paperless-exp" "hansmi/prometheus-paperless-exporter" "binary"
msg_info "Starting Service"
systemctl start prometheus-paperless-ngx-exporter
msg_ok "Started Service"
msg_ok "Updated successfully!"
fi
exit
}
start
build_container
description
msg_ok "Completed successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:8081/metrics${CL}"


@@ -41,7 +41,7 @@ function update_script() {
     rm -rf "$PVE_VENV_PATH"
     mkdir -p /opt/prometheus-pve-exporter
     cd /opt/prometheus-pve-exporter
-    $STD uv venv "$PVE_VENV_PATH"
+    $STD uv venv --clear "$PVE_VENV_PATH"
     $STD "$PVE_VENV_PATH/bin/python" -m ensurepip --upgrade
     $STD "$PVE_VENV_PATH/bin/python" -m pip install --upgrade pip
     $STD "$PVE_VENV_PATH/bin/python" -m pip install prometheus-pve-exporter

@@ -28,16 +28,55 @@ function update_script() {
     exit
   fi
-  msg_info "Updating ${APP}"
-  $STD python3 -m venv /opt/radicale
-  source /opt/radicale/bin/activate
-  $STD python3 -m pip install --upgrade https://github.com/Kozea/Radicale/archive/master.tar.gz
-  msg_ok "Updated ${APP}"
-  msg_info "Starting Service"
-  systemctl enable -q --now radicale
-  msg_ok "Started Service"
-  msg_ok "Updated successfully!"
+  if check_for_gh_release "Radicale" "Kozea/Radicale"; then
+    msg_info "Stopping service"
+    systemctl stop radicale
+    msg_ok "Stopped service"
+    msg_info "Backing up users file"
+    cp /opt/radicale/users /opt/radicale_users_backup
+    msg_ok "Backed up users file"
+    PYTHON_VERSION="3.13" setup_uv
+    CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Radicale" "Kozea/Radicale" "tarball" "latest" "/opt/radicale"
+    msg_info "Restoring users file"
+    rm -f /opt/radicale/users
+    mv /opt/radicale_users_backup /opt/radicale/users
+    msg_ok "Restored users file"
+    if grep -q 'start.sh' /etc/systemd/system/radicale.service; then
+      sed -i -e '/^Description/i[Unit]' \
+        -e '\|^ExecStart|iWorkingDirectory=/opt/radicale' \
+        -e 's|^ExecStart=.*|ExecStart=/usr/local/bin/uv run -m radicale --config /etc/radicale/config|' /etc/systemd/system/radicale.service
+      systemctl daemon-reload
+    fi
+    if [[ ! -f /etc/radicale/config ]]; then
+      msg_info "Migrating to config file (/etc/radicale/config)"
+      mkdir -p /etc/radicale
+      cat <<EOF >/etc/radicale/config
+[server]
+hosts = 0.0.0.0:5232
+[auth]
+type = htpasswd
+htpasswd_filename = /opt/radicale/users
+htpasswd_encryption = sha512
+[storage]
+type = multifilesystem
+filesystem_folder = /var/lib/radicale/collections
+[web]
+type = internal
+EOF
+      msg_ok "Migrated to config (/etc/radicale/config)"
+    fi
+    msg_info "Starting service"
+    systemctl start radicale
+    msg_ok "Started service"
+    msg_ok "Updated Successfully!"
+  fi
   exit
 }
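The Radicale service-file migration leans on two sed address forms: `/^Description/i` inserts a line before the matching line, and `\|^ExecStart|` uses `|` as the address delimiter so that slashes in the pattern or inserted path need no escaping. A sketch against a throwaway unit file (contents illustrative, not the real radicale.service):

```shell
unit=$(mktemp)
printf 'Description=Radicale server\nExecStart=/opt/radicale/start.sh\n' >"$unit"

# Insert a [Unit] header before Description, and a WorkingDirectory
# line before ExecStart, mirroring the migration's sed invocation
sed -i -e '/^Description/i[Unit]' \
  -e '\|^ExecStart|iWorkingDirectory=/opt/radicale' "$unit"
```

The GNU-sed one-line forms `i[Unit]` and `iWorkingDirectory=…` used here (text directly after the `i` command) match what the migration itself relies on.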

@@ -41,7 +41,7 @@ function update_script() {
   # Always ensure venv exists
   if [[ ! -d /opt/sabnzbd/venv ]]; then
     msg_info "Migrating SABnzbd to uv virtual environment"
-    $STD uv venv /opt/sabnzbd/venv
+    $STD uv venv --clear /opt/sabnzbd/venv
     msg_ok "Created uv venv at /opt/sabnzbd/venv"
   fi

@@ -38,7 +38,7 @@ function update_script() {
     msg_info "Updating Scraparr"
     cd /opt/scraparr
-    $STD uv venv /opt/scraparr/.venv
+    $STD uv venv --clear /opt/scraparr/.venv
     $STD /opt/scraparr/.venv/bin/python -m ensurepip --upgrade
     $STD /opt/scraparr/.venv/bin/python -m pip install --upgrade pip
     $STD /opt/scraparr/.venv/bin/python -m pip install -r /opt/scraparr/src/scraparr/requirements.txt

@@ -3,7 +3,7 @@ source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
 # Copyright (c) 2021-2026 community-scripts ORG
 # Author: vhsdream
 # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
-# Source: https://github.com/slskd/slskd, https://soularr.net
+# Source: https://github.com/slskd/slskd, https://github.com/mrusse/soularr
 APP="slskd"
 var_tags="${var_tags:-arr;p2p}"
@@ -24,50 +24,65 @@ function update_script() {
   check_container_storage
   check_container_resources
-  if [[ ! -d /opt/slskd ]] || [[ ! -d /opt/soularr ]]; then
-    msg_error "No ${APP} Installation Found!"
-    exit
-  fi
-  RELEASE=$(curl -s https://api.github.com/repos/slskd/slskd/releases/latest | grep "tag_name" | awk '{print substr($2, 2, length($2)-3) }')
-  if [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]] || [[ ! -f /opt/${APP}_version.txt ]]; then
-    msg_info "Stopping Service"
-    systemctl stop slskd soularr.timer soularr.service
-    msg_info "Stopped Service"
-    msg_info "Updating $APP to v${RELEASE}"
-    tmp_file=$(mktemp)
-    curl -fsSL "https://github.com/slskd/slskd/releases/download/${RELEASE}/slskd-${RELEASE}-linux-x64.zip" -o $tmp_file
-    $STD unzip -oj $tmp_file slskd -d /opt/${APP}
-    echo "${RELEASE}" >/opt/${APP}_version.txt
-    msg_ok "Updated $APP to v${RELEASE}"
-    msg_info "Starting Service"
-    systemctl start slskd
-    msg_ok "Started Service"
-    rm -rf $tmp_file
-  else
-    msg_ok "No ${APP} update required. ${APP} is already at v${RELEASE}"
-  fi
-  msg_info "Updating Soularr"
-  cp /opt/soularr/config.ini /opt/config.ini.bak
-  cp /opt/soularr/run.sh /opt/run.sh.bak
-  cd /tmp
-  rm -rf /opt/soularr
-  curl -fsSL -o main.zip https://github.com/mrusse/soularr/archive/refs/heads/main.zip
-  $STD unzip main.zip
-  mv soularr-main /opt/soularr
-  cd /opt/soularr
-  $STD pip install -r requirements.txt
-  mv /opt/config.ini.bak /opt/soularr/config.ini
-  mv /opt/run.sh.bak /opt/soularr/run.sh
-  rm -rf /tmp/main.zip
-  msg_ok "Updated soularr"
-  msg_info "Starting soularr timer"
-  systemctl start soularr.timer
-  msg_ok "Started soularr timer"
-  exit
+  if [[ ! -d /opt/slskd ]]; then
+    msg_error "No Slskd Installation Found!"
+    exit
+  fi
+  if check_for_gh_release "Slskd" "slskd/slskd"; then
+    msg_info "Stopping Service(s)"
+    systemctl stop slskd
+    [[ -f /etc/systemd/system/soularr.service ]] && systemctl stop soularr.timer soularr.service
+    msg_ok "Stopped Service(s)"
+    msg_info "Backing up config"
+    cp /opt/slskd/config/slskd.yml /opt/slskd.yml.bak
+    msg_ok "Backed up config"
+    CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Slskd" "slskd/slskd" "prebuild" "latest" "/opt/slskd" "slskd-*-linux-x64.zip"
+    msg_info "Restoring config"
+    mv /opt/slskd.yml.bak /opt/slskd/config/slskd.yml
+    msg_ok "Restored config"
+    msg_info "Starting Service(s)"
+    systemctl start slskd
+    [[ -f /etc/systemd/system/soularr.service ]] && systemctl start soularr.timer
+    msg_ok "Started Service(s)"
+    msg_ok "Updated Slskd successfully!"
+  fi
+  [[ -d /opt/soularr ]] && if check_for_gh_release "Soularr" "mrusse/soularr"; then
+    if systemctl is-active soularr.timer >/dev/null; then
+      msg_info "Stopping Timer and Service"
+      systemctl stop soularr.timer soularr.service
+      msg_ok "Stopped Timer and Service"
+    fi
+    msg_info "Backing up Soularr config"
+    cp /opt/soularr/config.ini /opt/soularr_config.ini.bak
+    cp /opt/soularr/run.sh /opt/soularr_run.sh.bak
+    msg_ok "Backed up Soularr config"
+    PYTHON_VERSION="3.11" setup_uv
+    CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Soularr" "mrusse/soularr" "tarball" "latest" "/opt/soularr"
+    msg_info "Updating Soularr"
+    cd /opt/soularr
+    $STD uv venv -c venv
+    $STD source venv/bin/activate
+    $STD uv pip install -r requirements.txt
+    deactivate
+    msg_ok "Updated Soularr"
+    msg_info "Restoring Soularr config"
+    mv /opt/soularr_config.ini.bak /opt/soularr/config.ini
+    mv /opt/soularr_run.sh.bak /opt/soularr/run.sh
+    msg_ok "Restored Soularr config"
+    msg_info "Starting Soularr Timer"
+    systemctl restart soularr.timer
+    msg_ok "Started Soularr Timer"
+    msg_ok "Updated Soularr successfully!"
+  fi
 }
 start

@@ -33,7 +33,15 @@ function update_script() {
     systemctl stop snowshare
     msg_ok "Stopped Service"
-    fetch_and_deploy_gh_release "snowshare" "TuroYT/snowshare" "tarball"
+    msg_info "Backing up uploads"
+    [ -d /opt/snowshare/uploads ] && cp -a /opt/snowshare/uploads /opt/.snowshare_uploads_backup
+    msg_ok "Uploads backed up"
+    CLEAN_INSTALL=1 fetch_and_deploy_gh_release "snowshare" "TuroYT/snowshare" "tarball"
+    msg_info "Restoring uploads"
+    [ -d /opt/.snowshare_uploads_backup ] && rm -rf /opt/snowshare/uploads && cp -a /opt/.snowshare_uploads_backup /opt/snowshare/uploads
+    msg_ok "Uploads restored"
     msg_info "Updating Snowshare"
     cd /opt/snowshare

@@ -39,7 +39,7 @@ function update_script() {
     CLEAN_INSTALL=1 fetch_and_deploy_gh_release "streamlink-webui" "CrazyWolf13/streamlink-webui" "tarball"
     msg_info "Updating streamlink-webui"
-    $STD uv venv /opt/streamlink-webui/backend/src/.venv
+    $STD uv venv --clear /opt/streamlink-webui/backend/src/.venv
     source /opt/streamlink-webui/backend/src/.venv/bin/activate
     $STD uv pip install -r /opt/streamlink-webui/backend/src/requirements.txt --python=/opt/streamlink-webui/backend/src/.venv
     cd /opt/streamlink-webui/frontend/src

@@ -50,7 +50,7 @@ function update_script() {
     cp -r /opt/tandoor.bak/{config,api,mediafiles,staticfiles} /opt/tandoor/
     mv /opt/tandoor.bak/.env /opt/tandoor/.env
     cd /opt/tandoor
-    $STD uv venv .venv --python=python3
+    $STD uv venv --clear .venv --python=python3
     $STD uv pip install -r requirements.txt --python .venv/bin/python
     cd /opt/tandoor/vue3
     $STD yarn install

@@ -33,7 +33,9 @@ function update_script() {
     systemctl stop umlautadaptarr
     msg_ok "Stopped Service"
+    cp /opt/UmlautAdaptarr/appsettings.json /opt/UmlautAdaptarr/appsettings.json.bak
     fetch_and_deploy_gh_release "UmlautAdaptarr" "PCJones/Umlautadaptarr" "prebuild" "latest" "/opt/UmlautAdaptarr" "linux-x64.zip"
+    cp /opt/UmlautAdaptarr/appsettings.json.bak /opt/UmlautAdaptarr/appsettings.json
     msg_info "Starting Service"
     systemctl start umlautadaptarr

@@ -39,7 +39,7 @@ function update_script() {
     msg_info "Updating Warracker"
     cd /opt/warracker/backend
-    $STD uv venv .venv
+    $STD uv venv --clear .venv
     $STD source .venv/bin/activate
     $STD uv pip install -r requirements.txt
     msg_ok "Updated Warracker"

@@ -0,0 +1,35 @@
{
"name": "Draw.IO",
"slug": "drawio",
"categories": [
12
],
"date_created": "2026-02-11",
"type": "ct",
"updateable": true,
"privileged": false,
"interface_port": 8080,
"documentation": "https://www.drawio.com/doc/",
"website": "https://www.drawio.com/",
"logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/draw-io.webp",
"config_path": "",
"description": "draw.io is a configurable diagramming and whiteboarding application, jointly owned and developed by draw.io Ltd (previously named JGraph) and draw.io AG.",
"install_methods": [
{
"type": "default",
"script": "ct/drawio.sh",
"resources": {
"cpu": 1,
"ram": 2048,
"hdd": 4,
"os": "Debian",
"version": "13"
}
}
],
"default_credentials": {
"username": null,
"password": null
},
"notes": []
}


@@ -21,7 +21,7 @@
       "resources": {
         "cpu": 2,
         "ram": 1024,
-        "hdd": 4,
+        "hdd": 6,
         "os": "debian",
         "version": "13"
       }

@@ -1,5 +1,5 @@
 {
-  "generated": "2026-02-09T06:27:10Z",
+  "generated": "2026-02-13T12:11:36Z",
   "versions": [
     {
       "slug": "2fauth",
@@ -15,6 +15,13 @@
       "pinned": false,
       "date": "2025-12-08T14:34:55Z"
     },
+    {
+      "slug": "adguardhome-sync",
+      "repo": "bakito/adguardhome-sync",
+      "version": "v0.8.2",
+      "pinned": false,
+      "date": "2025-10-24T17:13:47Z"
+    },
     {
       "slug": "adventurelog",
       "repo": "seanmorley15/adventurelog",
@@ -109,9 +116,9 @@
     {
       "slug": "bentopdf",
       "repo": "alam00000/bentopdf",
-      "version": "v2.1.0",
+      "version": "v2.2.0",
       "pinned": false,
-      "date": "2026-02-02T14:30:55Z"
+      "date": "2026-02-09T07:07:40Z"
     },
     {
       "slug": "beszel",
@@ -186,9 +193,9 @@
     {
       "slug": "cleanuparr",
       "repo": "Cleanuparr/Cleanuparr",
-      "version": "v2.5.1",
+      "version": "v2.6.1",
       "pinned": false,
-      "date": "2026-01-11T00:46:17Z"
+      "date": "2026-02-13T10:00:19Z"
     },
     {
       "slug": "cloudreve",
@@ -200,16 +207,16 @@
     {
       "slug": "comfyui",
       "repo": "comfyanonymous/ComfyUI",
-      "version": "v0.12.3",
+      "version": "v0.13.0",
       "pinned": false,
-      "date": "2026-02-05T07:04:07Z"
+      "date": "2026-02-10T20:27:38Z"
     },
     {
       "slug": "commafeed",
      "repo": "Athou/commafeed",
-      "version": "6.1.1",
+      "version": "6.2.0",
       "pinned": false,
-      "date": "2026-01-26T15:14:16Z"
+      "date": "2026-02-09T19:44:58Z"
     },
     {
       "slug": "configarr",
@@ -235,16 +242,16 @@
     {
       "slug": "cronicle",
       "repo": "jhuckaby/Cronicle",
-      "version": "v0.9.105",
+      "version": "v0.9.106",
       "pinned": false,
-      "date": "2026-02-05T18:16:11Z"
+      "date": "2026-02-11T17:11:46Z"
     },
     {
       "slug": "cryptpad",
       "repo": "cryptpad/cryptpad",
-      "version": "2025.9.0",
+      "version": "2026.2.0",
       "pinned": false,
-      "date": "2025-10-22T10:06:29Z"
+      "date": "2026-02-11T15:39:05Z"
     },
     {
       "slug": "dawarich",
@@ -256,44 +263,51 @@
     {
       "slug": "discopanel",
       "repo": "nickheyer/discopanel",
-      "version": "v1.0.35",
+      "version": "v1.0.36",
       "pinned": false,
-      "date": "2026-02-02T05:20:12Z"
+      "date": "2026-02-09T21:15:44Z"
     },
     {
       "slug": "dispatcharr",
       "repo": "Dispatcharr/Dispatcharr",
-      "version": "v0.18.1",
+      "version": "v0.19.0",
       "pinned": false,
-      "date": "2026-01-27T17:09:11Z"
+      "date": "2026-02-10T21:18:10Z"
     },
     {
       "slug": "docmost",
       "repo": "docmost/docmost",
-      "version": "v0.25.2",
+      "version": "v0.25.3",
       "pinned": false,
-      "date": "2026-02-06T19:50:55Z"
+      "date": "2026-02-10T02:58:23Z"
     },
     {
       "slug": "domain-locker",
       "repo": "Lissy93/domain-locker",
-      "version": "v0.1.2",
+      "version": "v0.1.3",
       "pinned": false,
-      "date": "2025-11-14T22:08:23Z"
+      "date": "2026-02-11T10:03:32Z"
     },
     {
       "slug": "domain-monitor",
       "repo": "Hosteroid/domain-monitor",
-      "version": "v1.1.1",
+      "version": "v1.1.3",
       "pinned": false,
-      "date": "2025-11-18T11:32:30Z"
+      "date": "2026-02-11T15:48:18Z"
     },
     {
       "slug": "donetick",
       "repo": "donetick/donetick",
-      "version": "v0.1.64",
+      "version": "v0.1.73",
       "pinned": false,
-      "date": "2025-10-03T05:18:24Z"
+      "date": "2026-02-12T23:42:30Z"
+    },
+    {
+      "slug": "drawio",
+      "repo": "jgraph/drawio",
+      "version": "v29.3.6",
+      "pinned": false,
+      "date": "2026-01-28T18:25:02Z"
     },
     {
       "slug": "duplicati",
@@ -319,9 +333,9 @@
     {
       "slug": "endurain",
       "repo": "endurain-project/endurain",
-      "version": "v0.17.3",
+      "version": "v0.17.4",
       "pinned": false,
-      "date": "2026-01-23T22:02:05Z"
+      "date": "2026-02-11T04:54:22Z"
     },
     {
       "slug": "ersatztv",
@@ -389,9 +403,9 @@
     {
       "slug": "ghostfolio",
       "repo": "ghostfolio/ghostfolio",
-      "version": "2.237.0",
+      "version": "2.238.0",
       "pinned": false,
-      "date": "2026-02-08T13:59:53Z"
+      "date": "2026-02-12T18:28:55Z"
     },
     {
       "slug": "gitea",
@@ -529,9 +543,16 @@
     {
       "slug": "huntarr",
       "repo": "plexguide/Huntarr.io",
-      "version": "9.2.3",
+      "version": "9.2.4.1",
       "pinned": false,
-      "date": "2026-02-07T04:44:20Z"
+      "date": "2026-02-12T22:17:47Z"
+    },
+    {
+      "slug": "immich-public-proxy",
+      "repo": "alangrainger/immich-public-proxy",
+      "version": "v1.15.1",
+      "pinned": false,
+      "date": "2026-01-26T08:04:27Z"
     },
     {
       "slug": "inspircd",
@@ -550,16 +571,23 @@
     {
       "slug": "invoiceninja",
       "repo": "invoiceninja/invoiceninja",
-      "version": "v5.12.55",
+      "version": "v5.12.59",
       "pinned": false,
-      "date": "2026-02-05T01:06:15Z"
+      "date": "2026-02-13T02:26:13Z"
     },
     {
       "slug": "jackett",
       "repo": "Jackett/Jackett",
-      "version": "v0.24.1074",
+      "version": "v0.24.1103",
       "pinned": false,
-      "date": "2026-02-09T06:01:19Z"
+      "date": "2026-02-13T05:53:23Z"
+    },
+    {
+      "slug": "jellystat",
+      "repo": "CyferShepard/Jellystat",
+      "version": "V1.1.8",
+      "pinned": false,
+      "date": "2026-02-08T08:15:00Z"
     },
     {
       "slug": "joplin-server",
@@ -571,9 +599,9 @@
     {
       "slug": "jotty",
       "repo": "fccview/jotty",
-      "version": "1.19.1",
+      "version": "1.20.0",
       "pinned": false,
-      "date": "2026-01-26T21:30:39Z"
+      "date": "2026-02-12T09:23:30Z"
     },
     {
       "slug": "kapowarr",
@@ -599,9 +627,9 @@
     {
       "slug": "keycloak",
       "repo": "keycloak/keycloak",
-      "version": "26.5.2",
+      "version": "26.5.3",
       "pinned": false,
-      "date": "2026-01-23T14:26:58Z"
+      "date": "2026-02-10T07:30:08Z"
     },
     {
       "slug": "kimai",
@@ -634,9 +662,9 @@
     {
       "slug": "kometa",
       "repo": "Kometa-Team/Kometa",
-      "version": "v2.2.2",
+      "version": "v2.3.0",
       "pinned": false,
-      "date": "2025-10-06T21:31:07Z"
+      "date": "2026-02-09T21:26:56Z"
}, },
{ {
"slug": "komga", "slug": "komga",
@@ -683,9 +711,9 @@
{ {
"slug": "libretranslate", "slug": "libretranslate",
"repo": "LibreTranslate/LibreTranslate", "repo": "LibreTranslate/LibreTranslate",
"version": "v1.8.4", "version": "v1.9.0",
"pinned": false, "pinned": false,
"date": "2026-02-02T17:45:16Z" "date": "2026-02-10T19:05:48Z"
}, },
{ {
"slug": "lidarr", "slug": "lidarr",
@@ -718,9 +746,9 @@
{ {
"slug": "lubelogger", "slug": "lubelogger",
"repo": "hargata/lubelog", "repo": "hargata/lubelog",
"version": "v1.5.8", "version": "v1.6.0",
"pinned": false, "pinned": false,
"date": "2026-01-26T18:18:03Z" "date": "2026-02-10T20:16:32Z"
}, },
{ {
"slug": "mafl", "slug": "mafl",
@@ -739,9 +767,9 @@
{ {
"slug": "mail-archiver", "slug": "mail-archiver",
"repo": "s1t5/mail-archiver", "repo": "s1t5/mail-archiver",
"version": "2601.3", "version": "2602.1",
"pinned": false, "pinned": false,
"date": "2026-01-25T12:52:24Z" "date": "2026-02-11T06:23:11Z"
}, },
{ {
"slug": "managemydamnlife", "slug": "managemydamnlife",
@@ -753,9 +781,9 @@
{ {
"slug": "manyfold", "slug": "manyfold",
"repo": "manyfold3d/manyfold", "repo": "manyfold3d/manyfold",
"version": "v0.132.0", "version": "v0.132.1",
"pinned": false, "pinned": false,
"date": "2026-01-29T13:53:21Z" "date": "2026-02-09T22:02:28Z"
}, },
{ {
"slug": "mealie", "slug": "mealie",
@@ -767,9 +795,9 @@
{ {
"slug": "mediamanager", "slug": "mediamanager",
"repo": "maxdorninger/MediaManager", "repo": "maxdorninger/MediaManager",
"version": "v1.12.2", "version": "v1.12.3",
"pinned": false, "pinned": false,
"date": "2026-02-08T19:18:29Z" "date": "2026-02-11T16:45:40Z"
}, },
{ {
"slug": "mediamtx", "slug": "mediamtx",
@@ -795,9 +823,9 @@
{ {
"slug": "metube", "slug": "metube",
"repo": "alexta69/metube", "repo": "alexta69/metube",
"version": "2026.02.08", "version": "2026.02.12",
"pinned": false, "pinned": false,
"date": "2026-02-08T17:01:37Z" "date": "2026-02-12T21:05:49Z"
}, },
{ {
"slug": "miniflux", "slug": "miniflux",
@@ -816,9 +844,9 @@
{ {
"slug": "myip", "slug": "myip",
"repo": "jason5ng32/MyIP", "repo": "jason5ng32/MyIP",
"version": "v5.2.0", "version": "v5.2.1",
"pinned": false, "pinned": false,
"date": "2026-01-05T05:56:57Z" "date": "2026-02-10T07:38:47Z"
}, },
{ {
"slug": "mylar3", "slug": "mylar3",
@@ -837,9 +865,9 @@
{ {
"slug": "navidrome", "slug": "navidrome",
"repo": "navidrome/navidrome", "repo": "navidrome/navidrome",
"version": "v0.60.2", "version": "v0.60.3",
"pinned": false, "pinned": false,
"date": "2026-02-07T19:42:33Z" "date": "2026-02-10T23:55:04Z"
}, },
{ {
"slug": "netbox", "slug": "netbox",
@@ -848,6 +876,13 @@
"pinned": false, "pinned": false,
"date": "2026-02-03T13:54:26Z" "date": "2026-02-03T13:54:26Z"
}, },
{
"slug": "nextcloud-exporter",
"repo": "xperimental/nextcloud-exporter",
"version": "v0.9.0",
"pinned": false,
"date": "2025-10-12T20:03:10Z"
},
{ {
"slug": "nginx-ui", "slug": "nginx-ui",
"repo": "0xJacky/nginx-ui", "repo": "0xJacky/nginx-ui",
@@ -963,9 +998,9 @@
{ {
"slug": "pangolin", "slug": "pangolin",
"repo": "fosrl/pangolin", "repo": "fosrl/pangolin",
"version": "1.15.2", "version": "1.15.4",
"pinned": false, "pinned": false,
"date": "2026-02-05T19:23:58Z" "date": "2026-02-13T00:54:02Z"
}, },
{ {
"slug": "paperless-ai", "slug": "paperless-ai",
@@ -991,9 +1026,9 @@
{ {
"slug": "patchmon", "slug": "patchmon",
"repo": "PatchMon/PatchMon", "repo": "PatchMon/PatchMon",
"version": "v1.3.7", "version": "v1.4.0",
"pinned": false, "pinned": false,
"date": "2025-12-25T11:08:14Z" "date": "2026-02-13T10:39:03Z"
}, },
{ {
"slug": "paymenter", "slug": "paymenter",
@@ -1012,16 +1047,16 @@
{ {
"slug": "pelican-panel", "slug": "pelican-panel",
"repo": "pelican-dev/panel", "repo": "pelican-dev/panel",
"version": "v1.0.0-beta31", "version": "v1.0.0-beta32",
"pinned": false, "pinned": false,
"date": "2026-01-18T22:43:24Z" "date": "2026-02-09T22:15:44Z"
}, },
{ {
"slug": "pelican-wings", "slug": "pelican-wings",
"repo": "pelican-dev/wings", "repo": "pelican-dev/wings",
"version": "v1.0.0-beta22", "version": "v1.0.0-beta23",
"pinned": false, "pinned": false,
"date": "2026-01-18T22:38:36Z" "date": "2026-02-09T22:10:26Z"
}, },
{ {
"slug": "pf2etools", "slug": "pf2etools",
@@ -1037,12 +1072,19 @@
"pinned": false, "pinned": false,
"date": "2025-12-01T05:07:31Z" "date": "2025-12-01T05:07:31Z"
}, },
{
"slug": "pihole-exporter",
"repo": "eko/pihole-exporter",
"version": "v1.2.0",
"pinned": false,
"date": "2025-07-29T19:15:37Z"
},
{ {
"slug": "planka", "slug": "planka",
"repo": "plankanban/planka", "repo": "plankanban/planka",
"version": "v2.0.0-rc.4", "version": "v2.0.0",
"pinned": false, "pinned": false,
"date": "2025-09-04T12:41:17Z" "date": "2026-02-11T13:50:10Z"
}, },
{ {
"slug": "plant-it", "slug": "plant-it",
@@ -1089,9 +1131,9 @@
{ {
"slug": "prometheus-alertmanager", "slug": "prometheus-alertmanager",
"repo": "prometheus/alertmanager", "repo": "prometheus/alertmanager",
"version": "v0.31.0", "version": "v0.31.1",
"pinned": false, "pinned": false,
"date": "2026-02-02T13:34:15Z" "date": "2026-02-11T21:28:26Z"
}, },
{ {
"slug": "prometheus-blackbox-exporter", "slug": "prometheus-blackbox-exporter",
@@ -1131,9 +1173,9 @@
{ {
"slug": "pulse", "slug": "pulse",
"repo": "rcourtman/Pulse", "repo": "rcourtman/Pulse",
"version": "v5.1.5", "version": "v5.1.9",
"pinned": false, "pinned": false,
"date": "2026-02-08T12:19:53Z" "date": "2026-02-11T15:34:40Z"
}, },
{ {
"slug": "pve-scripts-local", "slug": "pve-scripts-local",
@@ -1149,6 +1191,13 @@
"pinned": false, "pinned": false,
"date": "2025-11-19T23:54:34Z" "date": "2025-11-19T23:54:34Z"
}, },
{
"slug": "qbittorrent-exporter",
"repo": "martabal/qbittorrent-exporter",
"version": "v1.13.2",
"pinned": false,
"date": "2025-12-13T22:59:03Z"
},
{ {
"slug": "qdrant", "slug": "qdrant",
"repo": "qdrant/qdrant", "repo": "qdrant/qdrant",
@@ -1170,6 +1219,13 @@
"pinned": false, "pinned": false,
"date": "2025-11-16T22:39:01Z" "date": "2025-11-16T22:39:01Z"
}, },
{
"slug": "radicale",
"repo": "Kozea/Radicale",
"version": "v3.6.0",
"pinned": false,
"date": "2026-01-10T06:56:46Z"
},
{ {
"slug": "rclone", "slug": "rclone",
"repo": "rclone/rclone", "repo": "rclone/rclone",
@@ -1180,9 +1236,9 @@
{ {
"slug": "rdtclient", "slug": "rdtclient",
"repo": "rogerfar/rdt-client", "repo": "rogerfar/rdt-client",
"version": "v2.0.119", "version": "v2.0.120",
"pinned": false, "pinned": false,
"date": "2025-10-13T23:15:11Z" "date": "2026-02-12T02:53:51Z"
}, },
{ {
"slug": "reactive-resume", "slug": "reactive-resume",
@@ -1236,16 +1292,16 @@
{ {
"slug": "scanopy", "slug": "scanopy",
"repo": "scanopy/scanopy", "repo": "scanopy/scanopy",
"version": "v0.14.3", "version": "v0.14.4",
"pinned": false, "pinned": false,
"date": "2026-02-04T01:41:01Z" "date": "2026-02-10T03:57:28Z"
}, },
{ {
"slug": "scraparr", "slug": "scraparr",
"repo": "thecfu/scraparr", "repo": "thecfu/scraparr",
"version": "v2.2.5", "version": "v3.0.3",
"pinned": false, "pinned": false,
"date": "2025-10-07T12:34:31Z" "date": "2026-02-12T14:20:56Z"
}, },
{ {
"slug": "seelf", "slug": "seelf",
@@ -1282,6 +1338,13 @@
"pinned": false, "pinned": false,
"date": "2026-01-16T12:08:28Z" "date": "2026-01-16T12:08:28Z"
}, },
{
"slug": "slskd",
"repo": "slskd/slskd",
"version": "0.24.3",
"pinned": false,
"date": "2026-01-15T14:40:15Z"
},
{ {
"slug": "snipeit", "slug": "snipeit",
"repo": "grokability/snipe-it", "repo": "grokability/snipe-it",
@@ -1292,9 +1355,9 @@
{ {
"slug": "snowshare", "slug": "snowshare",
"repo": "TuroYT/snowshare", "repo": "TuroYT/snowshare",
"version": "v1.2.12", "version": "v1.3.5",
"pinned": false, "pinned": false,
"date": "2026-01-30T13:35:56Z" "date": "2026-02-11T10:24:51Z"
}, },
{ {
"slug": "sonarr", "slug": "sonarr",
@@ -1327,9 +1390,9 @@
{ {
"slug": "stirling-pdf", "slug": "stirling-pdf",
"repo": "Stirling-Tools/Stirling-PDF", "repo": "Stirling-Tools/Stirling-PDF",
"version": "v2.4.5", "version": "v2.4.6",
"pinned": false, "pinned": false,
"date": "2026-02-06T23:12:20Z" "date": "2026-02-12T00:01:19Z"
}, },
{ {
"slug": "streamlink-webui", "slug": "streamlink-webui",
@@ -1376,9 +1439,9 @@
{ {
"slug": "termix", "slug": "termix",
"repo": "Termix-SSH/Termix", "repo": "Termix-SSH/Termix",
"version": "release-1.11.0-tag", "version": "release-1.11.1-tag",
"pinned": false, "pinned": false,
"date": "2026-01-25T02:09:52Z" "date": "2026-02-13T04:49:16Z"
}, },
{ {
"slug": "the-lounge", "slug": "the-lounge",
@@ -1404,9 +1467,9 @@
{ {
"slug": "tianji", "slug": "tianji",
"repo": "msgbyte/tianji", "repo": "msgbyte/tianji",
"version": "v1.31.10", "version": "v1.31.12",
"pinned": false, "pinned": false,
"date": "2026-02-04T17:21:04Z" "date": "2026-02-12T19:06:14Z"
}, },
{ {
"slug": "traccar", "slug": "traccar",
@@ -1418,9 +1481,9 @@
{ {
"slug": "tracearr", "slug": "tracearr",
"repo": "connorgallopo/Tracearr", "repo": "connorgallopo/Tracearr",
"version": "v1.4.12", "version": "v1.4.17",
"pinned": false, "pinned": false,
"date": "2026-01-28T23:29:37Z" "date": "2026-02-11T01:33:21Z"
}, },
{ {
"slug": "tracktor", "slug": "tracktor",
@@ -1432,9 +1495,9 @@
{ {
"slug": "traefik", "slug": "traefik",
"repo": "traefik/traefik", "repo": "traefik/traefik",
"version": "v3.6.7", "version": "v3.6.8",
"pinned": false, "pinned": false,
"date": "2026-01-14T14:11:45Z" "date": "2026-02-11T16:44:37Z"
}, },
{ {
"slug": "trilium", "slug": "trilium",
@@ -1446,9 +1509,9 @@
{ {
"slug": "trip", "slug": "trip",
"repo": "itskovacs/TRIP", "repo": "itskovacs/TRIP",
"version": "1.39.0", "version": "1.40.0",
"pinned": false, "pinned": false,
"date": "2026-02-07T16:59:51Z" "date": "2026-02-10T20:12:53Z"
}, },
{ {
"slug": "tududi", "slug": "tududi",
@@ -1495,9 +1558,9 @@
{ {
"slug": "upsnap", "slug": "upsnap",
"repo": "seriousm4x/UpSnap", "repo": "seriousm4x/UpSnap",
"version": "5.2.7", "version": "5.2.8",
"pinned": false, "pinned": false,
"date": "2026-01-07T23:48:00Z" "date": "2026-02-13T00:02:37Z"
}, },
{ {
"slug": "uptimekuma", "slug": "uptimekuma",
@@ -1509,9 +1572,9 @@
{ {
"slug": "vaultwarden", "slug": "vaultwarden",
"repo": "dani-garcia/vaultwarden", "repo": "dani-garcia/vaultwarden",
"version": "1.35.2", "version": "1.35.3",
"pinned": false, "pinned": false,
"date": "2026-01-09T18:37:04Z" "date": "2026-02-10T20:37:03Z"
}, },
{ {
"slug": "victoriametrics", "slug": "victoriametrics",
@@ -1523,9 +1586,9 @@
{ {
"slug": "vikunja", "slug": "vikunja",
"repo": "go-vikunja/vikunja", "repo": "go-vikunja/vikunja",
"version": "v1.0.0", "version": "v1.1.0",
"pinned": false, "pinned": false,
"date": "2026-01-28T11:12:59Z" "date": "2026-02-09T10:34:29Z"
}, },
{ {
"slug": "wallabag", "slug": "wallabag",
@@ -1537,9 +1600,9 @@
{ {
"slug": "wallos", "slug": "wallos",
"repo": "ellite/Wallos", "repo": "ellite/Wallos",
"version": "v4.6.0", "version": "v4.6.1",
"pinned": false, "pinned": false,
"date": "2025-12-20T15:57:51Z" "date": "2026-02-10T21:06:46Z"
}, },
{ {
"slug": "wanderer", "slug": "wanderer",
@@ -1572,9 +1635,9 @@
{ {
"slug": "wavelog", "slug": "wavelog",
"repo": "wavelog/wavelog", "repo": "wavelog/wavelog",
"version": "2.2.2", "version": "2.3",
"pinned": false, "pinned": false,
"date": "2025-12-31T16:53:34Z" "date": "2026-02-11T15:46:40Z"
}, },
{ {
"slug": "wealthfolio", "slug": "wealthfolio",
@@ -1590,19 +1653,26 @@
"pinned": false, "pinned": false,
"date": "2025-11-11T14:30:28Z" "date": "2025-11-11T14:30:28Z"
}, },
{
"slug": "wger",
"repo": "wger-project/wger",
"version": "2.4",
"pinned": false,
"date": "2026-01-18T12:12:02Z"
},
{ {
"slug": "wikijs", "slug": "wikijs",
"repo": "requarks/wiki", "repo": "requarks/wiki",
"version": "v2.5.311", "version": "v2.5.312",
"pinned": false, "pinned": false,
"date": "2026-01-08T09:50:00Z" "date": "2026-02-12T02:45:22Z"
}, },
{ {
"slug": "wishlist", "slug": "wishlist",
"repo": "cmintey/wishlist", "repo": "cmintey/wishlist",
"version": "v0.59.0", "version": "v0.60.0",
"pinned": false, "pinned": false,
"date": "2026-01-19T16:42:14Z" "date": "2026-02-10T04:05:26Z"
}, },
{ {
"slug": "wizarr", "slug": "wizarr",
@@ -1628,9 +1698,9 @@
{ {
"slug": "yubal", "slug": "yubal",
"repo": "guillevc/yubal", "repo": "guillevc/yubal",
"version": "v0.4.2", "version": "v0.5.0",
"pinned": false, "pinned": false,
"date": "2026-02-08T21:35:13Z" "date": "2026-02-09T22:11:32Z"
}, },
{ {
"slug": "zigbee2mqtt", "slug": "zigbee2mqtt",
@@ -1642,9 +1712,9 @@
{ {
"slug": "zipline", "slug": "zipline",
"repo": "diced/zipline", "repo": "diced/zipline",
"version": "v4.4.1", "version": "v4.4.2",
"pinned": false, "pinned": false,
"date": "2026-01-20T01:29:01Z" "date": "2026-02-11T04:58:54Z"
}, },
{ {
"slug": "zitadel", "slug": "zitadel",


@@ -21,7 +21,7 @@
       "resources": {
         "cpu": 1,
         "ram": 512,
-        "hdd": 2,
+        "hdd": 4,
         "os": "debian",
         "version": "13"
       }
@@ -32,7 +32,7 @@
       "resources": {
         "cpu": 1,
         "ram": 256,
-        "hdd": 1,
+        "hdd": 2,
         "os": "alpine",
         "version": "3.23"
       }


@@ -33,7 +33,7 @@
   },
   "notes": [
     {
-      "text": "Kutt needs so be served with an SSL certificate for its login to work. During install, you will be prompted to choose if you want to have Caddy installed for SSL termination or if you want to use your own reverse proxy (in that case point your reverse porxy to port 3000).",
+      "text": "Kutt needs to be served with an SSL certificate for its login to work. During install, you will be prompted to choose if you want to have Caddy installed for SSL termination or if you want to use your own reverse proxy (in that case point your reverse proxy to port 3000).",
       "type": "info"
     }
   ]


@@ -1,44 +1,35 @@
 {
   "name": "Prometheus Paperless NGX Exporter",
   "slug": "prometheus-paperless-ngx-exporter",
   "categories": [
     9
   ],
   "date_created": "2025-02-07",
-  "type": "ct",
+  "type": "addon",
   "updateable": true,
   "privileged": false,
   "interface_port": 8081,
-  "documentation": null,
+  "documentation": "https://github.com/hansmi/prometheus-paperless-exporter",
   "website": "https://github.com/hansmi/prometheus-paperless-exporter",
   "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/paperless-ngx.webp",
-  "config_path": "",
+  "config_path": "/etc/prometheus-paperless-ngx-exporter/config.env",
   "description": "Prometheus metrics exporter for Paperless-NGX, a document management system transforming physical documents into a searchable online archive. The exporter relies on Paperless' REST API.",
   "install_methods": [
     {
       "type": "default",
-      "script": "ct/prometheus-paperless-ngx-exporter.sh",
+      "script": "tools/addon/prometheus-paperless-ngx-exporter.sh",
       "resources": {
-        "cpu": 1,
-        "ram": 256,
-        "hdd": 2,
-        "os": "debian",
-        "version": "13"
+        "cpu": null,
+        "ram": null,
+        "hdd": null,
+        "os": null,
+        "version": null
       }
     }
   ],
   "default_credentials": {
     "username": null,
     "password": null
   },
-  "notes": [
-    {
-      "text": "Please adjust the Paperless URL in the systemd unit file: /etc/systemd/system/prometheus-paperless-ngx-exporter.service",
-      "type": "info"
-    },
-    {
-      "text": "Please adjust the Paperless authentication token in the configuration file: /etc/prometheus-paperless-ngx-exporter/paperless_auth_token_file",
-      "type": "info"
-    }
-  ]
+  "notes": []
 }


@@ -12,7 +12,7 @@
   "documentation": "https://radicale.org/master.html#documentation-1",
   "website": "https://radicale.org/",
   "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/radicale.webp",
-  "config_path": "/etc/radicale/config or ~/.config/radicale/config",
+  "config_path": "/etc/radicale/config",
   "description": "Radicale is a small but powerful CalDAV (calendars, to-do lists) and CardDAV (contacts)",
   "install_methods": [
     {


@@ -1,5 +1,5 @@
 {
-  "name": "slskd",
+  "name": "Slskd",
   "slug": "slskd",
   "categories": [
     11
@@ -35,10 +35,6 @@
     {
       "text": "See /opt/slskd/config/slskd.yml to add your Soulseek credentials",
       "type": "info"
-    },
-    {
-      "text": "This LXC includes Soularr; it needs to be configured (/opt/soularr/config.ini) before it will work",
-      "type": "info"
     }
   ]
 }


@@ -10,7 +10,7 @@
   "privileged": false,
   "interface_port": 3000,
   "documentation": "https://github.com/TuroYT/snowshare",
-  "config_path": "/opt/snowshare/.env",
+  "config_path": "/opt/snowshare.env",
   "website": "https://github.com/TuroYT/snowshare",
   "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/png/snowshare.png",
   "description": "A modern, secure file and link sharing platform built with Next.js, Prisma, and NextAuth. Share URLs, code snippets, and files with customizable expiration, privacy, and QR codes.",


@@ -32,6 +32,10 @@
     "password": null
   },
   "notes": [
+    {
+      "text": "SQL Server (2025) SQLPAL is incompatible with Proxmox VE 9 (Kernel 6.12+) in LXC containers. Use a VM instead or the SQL-Server 2022 LXC.",
+      "type": "warning"
+    },
     {
       "text": "If you choose not to run the installation setup, execute: `/opt/mssql/bin/mssql-conf setup` in LXC shell.",
       "type": "info"


@@ -58,7 +58,7 @@ DISABLE_REGISTRATION=False
 EOF
 cd /opt/adventurelog/backend/server
 mkdir -p /opt/adventurelog/backend/server/media
-$STD uv venv /opt/adventurelog/backend/server/.venv
+$STD uv venv --clear /opt/adventurelog/backend/server/.venv
 $STD /opt/adventurelog/backend/server/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/adventurelog/backend/server/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/adventurelog/backend/server/.venv/bin/python -m pip install -r requirements.txt


@@ -77,7 +77,7 @@ echo "${KEPUB_VERSION#v}" >"$INSTALL_DIR"/KEPUBIFY_RELEASE
 sed 's/^/v/' ~/.autocaliweb >"$INSTALL_DIR"/ACW_RELEASE
 cd "$INSTALL_DIR"
-$STD uv venv "$VIRTUAL_ENV"
+$STD uv venv --clear "$VIRTUAL_ENV"
 $STD uv sync --all-extras --active
 cat <<EOF >./dirs.json
 {


@@ -29,7 +29,7 @@ fetch_and_deploy_gh_release "babybuddy" "babybuddy/babybuddy" "tarball"
 msg_info "Installing Babybuddy"
 mkdir -p /opt/data
 cd /opt/babybuddy
-$STD uv venv .venv
+$STD uv venv --clear .venv
 $STD source .venv/bin/activate
 $STD uv pip install -r requirements.txt
 cp babybuddy/settings/production.example.py babybuddy/settings/production.py


@@ -20,7 +20,7 @@ msg_info "Installing Bazarr"
 mkdir -p /var/lib/bazarr/
 chmod 775 /opt/bazarr /var/lib/bazarr/
 sed -i.bak 's/--only-binary=Pillow//g' /opt/bazarr/requirements.txt
-$STD uv venv /opt/bazarr/venv --python 3.12
+$STD uv venv --clear /opt/bazarr/venv --python 3.12
 $STD uv pip install -r /opt/bazarr/requirements.txt --python /opt/bazarr/venv/bin/python3
 msg_ok "Installed Bazarr"


@@ -36,7 +36,7 @@ PYTHON_VERSION="3.12" setup_uv
 fetch_and_deploy_gh_release "ComfyUI" "comfyanonymous/ComfyUI" "tarball" "latest" "/opt/ComfyUI"
 msg_info "Python dependencies"
-$STD uv venv "/opt/ComfyUI/venv"
+$STD uv venv --clear "/opt/ComfyUI/venv"
 if [[ "${comfyui_gpu_type,,}" == "nvidia" ]]; then
   pytorch_url="https://download.pytorch.org/whl/cu130"
View File

@@ -16,7 +16,8 @@ update_os
msg_info "Installing Dependencies" msg_info "Installing Dependencies"
$STD apt install -y \ $STD apt install -y \
python3-pip \ python3-pip \
python3-libtorrent python3-libtorrent \
python3-setuptools
msg_ok "Installed Dependencies" msg_ok "Installed Dependencies"
msg_info "Installing Deluge" msg_info "Installing Deluge"


@@ -36,8 +36,8 @@ fetch_and_deploy_gh_release "dispatcharr" "Dispatcharr/Dispatcharr" "tarball"
 msg_info "Installing Python Dependencies with uv"
 cd /opt/dispatcharr
-$STD uv venv
-$STD uv pip install -r requirements.txt --index-strategy unsafe-best-match
+$STD uv venv --clear
+$STD uv sync
 $STD uv pip install gunicorn gevent celery redis daphne
 msg_ok "Installed Python Dependencies"

install/drawio-install.sh (new file, 25 lines)

@@ -0,0 +1,25 @@
+#!/usr/bin/env bash
+# Copyright (c) 2021-2026 community-scripts ORG
+# Author: Slaviša Arežina (tremor021)
+# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
+# Source: https://www.drawio.com/
+source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
+color
+verb_ip6
+catch_errors
+setting_up_container
+network_check
+update_os
+setup_hwaccel
+msg_info "Installing Dependencies"
+$STD apt install -y tomcat11
+msg_ok "Installed Dependencies"
+USE_ORIGINAL_FILENAME=true fetch_and_deploy_gh_release "drawio" "jgraph/drawio" "singlefile" "latest" "/var/lib/tomcat11/webapps" "draw.war"
+motd_ssh
+customize
+cleanup_lxc


@@ -31,8 +31,10 @@ setup_deb822_repo "matrix-org" \
   "main"
 echo "matrix-synapse-py3 matrix-synapse/server-name string $servername" | debconf-set-selections
 echo "matrix-synapse-py3 matrix-synapse/report-stats boolean false" | debconf-set-selections
+echo "exit 101" >/usr/sbin/policy-rc.d
+chmod +x /usr/sbin/policy-rc.d
 $STD apt install matrix-synapse-py3 -y
-systemctl stop matrix-synapse
+rm -f /usr/sbin/policy-rc.d
 sed -i 's/127.0.0.1/0.0.0.0/g' /etc/matrix-synapse/homeserver.yaml
 sed -i 's/'\''::1'\'', //g' /etc/matrix-synapse/homeserver.yaml
 SECRET=$(openssl rand -hex 32)


@@ -38,6 +38,18 @@ rm -f "$DEB_FILE"
 echo "$LATEST_VERSION" >~/.emqx
 msg_ok "Installed EMQX"
+read -r -p "${TAB3}Would you like to disable the EMQX MQ feature? (reduces disk/CPU usage) <y/N> " prompt
+if [[ ${prompt,,} =~ ^(y|yes)$ ]]; then
+  msg_info "Disabling EMQX MQ feature"
+  mkdir -p /etc/emqx
+  if ! grep -q "^mq.enable" /etc/emqx/emqx.conf 2>/dev/null; then
+    echo "mq.enable = false" >>/etc/emqx/emqx.conf
+  else
+    sed -i 's/^mq.enable.*/mq.enable = false/' /etc/emqx/emqx.conf
+  fi
+  msg_ok "Disabled EMQX MQ feature"
+fi
 msg_info "Starting EMQX service"
 $STD systemctl enable -q --now emqx
 msg_ok "Enabled EMQX service"


@@ -86,7 +86,7 @@ $STD uv tool update-shell
 export PATH="/root/.local/bin:$PATH"
 $STD poetry self add poetry-plugin-export
 $STD poetry export -f requirements.txt --output requirements.txt --without-hashes
-$STD uv venv
+$STD uv venv --clear
 $STD uv pip install -r requirements.txt
 msg_ok "Setup Backend"


@@ -23,7 +23,7 @@ msg_info "Setting up Virtual Environment"
 mkdir -p /opt/esphome
 mkdir -p /root/config
 cd /opt/esphome
-$STD uv venv /opt/esphome/.venv
+$STD uv venv --clear /opt/esphome/.venv
 $STD /opt/esphome/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/esphome/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/esphome/.venv/bin/python -m pip install esphome tornado esptool


@@ -15,31 +15,30 @@ network_check
 update_os
 msg_info "Installing Dependencies"
-$STD apt-get install -y \
+$STD apt install -y \
   ffmpeg \
-  jq \
   imagemagick
 msg_ok "Installed Dependencies"
 setup_hwaccel
 msg_info "Installing ASP.NET Core Runtime"
-curl -fsSL https://packages.microsoft.com/config/debian/13/packages-microsoft-prod.deb -o packages-microsoft-prod.deb
-$STD dpkg -i packages-microsoft-prod.deb
-rm -rf packages-microsoft-prod.deb
-$STD apt-get update
-$STD apt-get install -y aspnetcore-runtime-8.0
+setup_deb822_repo \
+  "microsoft" \
+  "https://packages.microsoft.com/keys/microsoft-2025.asc" \
+  "https://packages.microsoft.com/debian/13/prod/" \
+  "trixie"
+$STD apt install -y aspnetcore-runtime-8.0
 msg_ok "Installed ASP.NET Core Runtime"
+fetch_and_deploy_from_url "https://fileflows.com/downloads/zip" "/opt/fileflows"
 msg_info "Setup FileFlows"
 $STD ln -svf /usr/bin/ffmpeg /usr/local/bin/ffmpeg
 $STD ln -svf /usr/bin/ffprobe /usr/local/bin/ffprobe
-temp_file=$(mktemp)
-curl -fsSL https://fileflows.com/downloads/zip -o "$temp_file"
-$STD unzip -d /opt/fileflows "$temp_file"
-$STD bash -c "cd /opt/fileflows/Server && dotnet FileFlows.Server.dll --systemd install --root true"
+cd /opt/fileflows/Server
+dotnet FileFlows.Server.dll --systemd install --root true
 systemctl enable -q --now fileflows
-rm -f "$temp_file"
 msg_ok "Setup FileFlows"
 motd_ssh


@@ -17,7 +17,7 @@ PYTHON_VERSION="3.12" setup_uv
 fetch_and_deploy_gh_release "huntarr" "plexguide/Huntarr.io" "tarball"
 msg_info "Configure Huntarr"
-$STD uv venv /opt/huntarr/.venv
+$STD uv venv --clear /opt/huntarr/.venv
 $STD uv pip install --python /opt/huntarr/.venv/bin/python -r /opt/huntarr/requirements.txt
 msg_ok "Configured Huntrarr"


@@ -289,7 +289,7 @@ ML_DIR="${APP_DIR}/machine-learning"
 GEO_DIR="${INSTALL_DIR}/geodata"
 mkdir -p {"${APP_DIR}","${UPLOAD_DIR}","${GEO_DIR}","${INSTALL_DIR}"/cache}
-fetch_and_deploy_gh_release "Immich" "immich-app/immich" "tarball" "v2.5.5" "$SRC_DIR"
+fetch_and_deploy_gh_release "Immich" "immich-app/immich" "tarball" "v2.5.6" "$SRC_DIR"
 PNPM_VERSION="$(jq -r '.packageManager | split("@")[1]' ${SRC_DIR}/package.json)"
 NODE_VERSION="24" NODE_MODULE="pnpm@${PNPM_VERSION}" setup_nodejs


@@ -18,7 +18,7 @@ PYTHON_VERSION="3.12" setup_uv
 msg_info "Installing Jupyter"
 mkdir -p /opt/jupyter
 cd /opt/jupyter
-$STD uv venv /opt/jupyter/.venv
+$STD uv venv --clear /opt/jupyter/.venv
 $STD /opt/jupyter/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/jupyter/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/jupyter/.venv/bin/python -m pip install jupyter

View File

@@ -22,7 +22,7 @@ fetch_and_deploy_gh_release "kapowarr" "Casvt/Kapowarr" "tarball"
 msg_info "Setup Kapowarr"
 cd /opt/kapowarr
-$STD uv venv .venv
+$STD uv venv --clear .venv
 $STD source .venv/bin/activate
 $STD uv pip install --upgrade pip
 $STD uv pip install --no-cache-dir -r requirements.txt

View File

@@ -20,10 +20,19 @@ msg_ok "Installed Docker"
 msg_info "Detecting latest Kasm Workspaces release"
 KASM_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_[0-9]+\.[0-9]+\.[0-9]+\.[a-z0-9]+\.tar\.gz' | head -n 1)
 if [[ -z "$KASM_URL" ]]; then
+  SERVICE_IMAGE_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_service_images_amd64_[0-9]+\.[0-9]+\.[0-9]+\.tar\.gz' | head -n 1)
+  if [[ -n "$SERVICE_IMAGE_URL" ]]; then
+    KASM_VERSION=$(echo "$SERVICE_IMAGE_URL" | sed -E 's/.*kasm_release_service_images_amd64_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
+    KASM_URL="https://kasm-static-content.s3.amazonaws.com/kasm_release_${KASM_VERSION}.tar.gz"
+  fi
+else
+  KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
+fi
+if [[ -z "$KASM_URL" ]] || [[ -z "$KASM_VERSION" ]]; then
   msg_error "Unable to detect latest Kasm release URL."
   exit 1
 fi
-KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
 msg_ok "Detected Kasm Workspaces version $KASM_VERSION"
 msg_warn "WARNING: This script will run an external installer from a third-party source (https://www.kasmweb.com/)."
msg_warn "WARNING: This script will run an external installer from a third-party source (https://www.kasmweb.com/)." msg_warn "WARNING: This script will run an external installer from a third-party source (https://www.kasmweb.com/)."

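The Kasm fallback above derives the version number from the service-images URL with a sed capture group. The extraction can be checked standalone against a sample URL (the version in the URL is illustrative; the real one is scraped from kasm.com at install time):

```shell
# Capture the x.y.z version embedded in a Kasm service-images release URL.
url="https://kasm-static-content.s3.amazonaws.com/kasm_release_service_images_amd64_1.16.1.tar.gz"
ver=$(echo "$url" | sed -E 's/.*kasm_release_service_images_amd64_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
echo "$ver"
```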
View File

@@ -50,7 +50,7 @@ $STD useradd librenms -d /opt/librenms -M -r -s "$(which bash)"
 mkdir -p /opt/librenms/{rrd,logs,bootstrap/cache,storage,html}
 cd /opt/librenms
 APP_KEY=$(openssl rand -base64 40 | tr -dc 'a-zA-Z0-9')
-$STD uv venv .venv
+$STD uv venv --clear .venv
 $STD source .venv/bin/activate
 $STD uv pip install -r requirements.txt
 cat <<EOF >/opt/librenms/.env

View File

@@ -37,18 +37,13 @@ PYTHON_VERSION="3.12" setup_uv
 fetch_and_deploy_gh_release "libretranslate" "LibreTranslate/LibreTranslate" "tarball"
 msg_info "Setup LibreTranslate (Patience)"
-TORCH_VERSION=$(grep -Eo '"torch ==[0-9]+\.[0-9]+\.[0-9]+' /opt/libretranslate/pyproject.toml |
-  tail -n1 | sed 's/.*==//')
-if [[ -z "$TORCH_VERSION" ]]; then
-  TORCH_VERSION="2.5.0"
-fi
 cd /opt/libretranslate
-$STD uv venv .venv --python 3.12
+$STD uv venv --clear .venv --python 3.12
 $STD source .venv/bin/activate
-$STD uv pip install --upgrade pip setuptools
+$STD uv pip install --upgrade pip
+$STD uv pip install "setuptools<81"
 $STD uv pip install Babel==2.12.1
 $STD .venv/bin/python scripts/compile_locales.py
-$STD uv pip install "torch==${TORCH_VERSION}" --extra-index-url https://download.pytorch.org/whl/cpu
 $STD uv pip install "numpy<2"
 $STD uv pip install .
 $STD uv pip install libretranslate

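The removed LibreTranslate block parsed a torch pin out of `pyproject.toml` before installing it explicitly. The grep/sed extraction pipeline itself, run against an illustrative file, looks like this:

```shell
# Extract the "torch ==x.y.z" pin from a pyproject-style dependency list.
toml="$(mktemp)"
printf 'dependencies = [\n  "torch ==2.5.0",\n]\n' >"$toml"
torch_version=$(grep -Eo '"torch ==[0-9]+\.[0-9]+\.[0-9]+' "$toml" | tail -n1 | sed 's/.*==//')
echo "$torch_version"
```

The new version of the script drops this entirely and lets `uv pip install .` resolve torch from the project metadata instead.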
View File

@@ -42,7 +42,7 @@ msg_ok "Set up PostgreSQL"
 msg_info "Setting up Virtual Environment"
 mkdir -p /opt/litellm
 cd /opt/litellm
-$STD uv venv /opt/litellm/.venv
+$STD uv venv --clear /opt/litellm/.venv
 $STD /opt/litellm/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/litellm/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/litellm/.venv/bin/python -m pip install litellm[proxy] prisma

View File

@@ -29,7 +29,7 @@ fetch_and_deploy_gh_release "mylar3" "mylar3/mylar3" "tarball"
 msg_info "Installing ${APPLICATION}"
 mkdir -p /opt/mylar3-data
-$STD uv venv /opt/mylar3/.venv
+$STD uv venv --clear /opt/mylar3/.venv
 $STD /opt/mylar3/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/mylar3/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/mylar3/.venv/bin/python -m pip install --no-cache-dir -r /opt/mylar3/requirements.txt

View File

@@ -37,7 +37,6 @@ PageSize = 10
 Host = 0.0.0.0
 Port = 9000
 RunMode = release
-JwtSecret = $(openssl rand -hex 32)
 [cert]
 HTTPChallengePort = 9180

View File

@@ -38,6 +38,10 @@ for server in "${servers[@]}"; do
 fi
 done
+msg_info "Installing dependencies"
+$STD apt install -y inotify-tools
+msg_ok "Installed dependencies"
 msg_info "Installing Collabora Online"
 curl -fsSL https://collaboraoffice.com/downloads/gpg/collaboraonline-release-keyring.gpg -o /etc/apt/keyrings/collaboraonline-release-keyring.gpg
 cat <<EOF >/etc/apt/sources.list.d/colloboraonline.sources
@@ -148,8 +152,15 @@ COLLABORATION_JWT_SECRET=
 # FRONTEND_FULL_TEXT_SEARCH_ENABLED=true
 # SEARCH_EXTRACTOR_TIKA_TIKA_URL=<your-tika-url>
-## External storage test - Only NFS v4.2+ is supported
-## User files
+## Uncomment below to enable PosixFS Collaborative Mode
+## Increase inotify watch/instance limits on your PVE host:
+### sysctl -w fs.inotify.max_user_watches=1048576
+### sysctl -w fs.inotify.max_user_instances=1024
+# STORAGE_USERS_POSIX_ENABLE_COLLABORATION=true
+# STORAGE_USERS_POSIX_WATCH_TYPE=inotifywait
+# STORAGE_USERS_POSIX_WATCH_FS=true
+# STORAGE_USERS_POSIX_WATCH_PATH=<path-to-storage-or-bind-mount>
+## User files location - experimental - use at your own risk! - ZFS, NFS v4.2+ supported - CIFS/SMB not supported
 # STORAGE_USERS_POSIX_ROOT=<path-to-your-bind_mount>
 EOF

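The new PosixFS comments tell users to raise the host's inotify limits before enabling watch mode. A tiny helper for deciding whether a current sysctl value already meets a target (the function name and sample values are illustrative, not part of the script):

```shell
# Compare a current fs.inotify value against a recommended minimum.
needs_raise() {
  local current=$1 target=$2
  if [ "$current" -lt "$target" ]; then echo raise; else echo ok; fi
}
needs_raise 8192 1048576      # a typical default vs the recommended watches
needs_raise 2097152 1048576   # already above the target
```

On a real PVE host the current values come from `/proc/sys/fs/inotify/max_user_watches` and `max_user_instances`.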
View File

@@ -24,7 +24,7 @@ setup_hwaccel
 PYTHON_VERSION="3.12" setup_uv
 msg_info "Installing Open WebUI"
-$STD uv tool install --python 3.12 open-webui[all]
+$STD uv tool install --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
 msg_ok "Installed Open WebUI"
 read -r -p "${TAB3}Would you like to add Ollama? <y/N> " prompt

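The numba pin above is passed via process substitution: `<(echo "numba>=0.60")` exposes the command's output under a temporary file path, so `--constraint` gets a one-line constraints "file" without anything being written to disk. The mechanism in isolation, with a generic file-consuming command:

```shell
# Process substitution: the <(...) expression expands to a path (e.g.
# /dev/fd/63) that exists only for the duration of the consuming command.
line_count() { wc -l <"$1"; }
n=$(line_count <(printf 'numba>=0.60\n'))
n=$((n))   # normalize any whitespace padding from wc
echo "$n"
```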
View File

@@ -178,7 +178,7 @@ http:
 servers:
 - url: "http://$LOCAL_IP:3000"
 EOF
-$STD npm run db:sqlite:push
+$STD npm run db:push
 . /etc/os-release
 if [ "$VERSION_CODENAME" = "trixie" ]; then

View File

@@ -1,47 +0,0 @@
#!/usr/bin/env bash
# Copyright (c) 2021-2026 community-scripts ORG
# Author: Andy Grunwald (andygrunwald)
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/hansmi/prometheus-paperless-exporter
source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
fetch_and_deploy_gh_release "prom-paperless-exp" "hansmi/prometheus-paperless-exporter" "binary"
msg_info "Configuring Prometheus Paperless NGX Exporter"
mkdir -p /etc/prometheus-paperless-ngx-exporter
echo "SECRET_AUTH_TOKEN" >/etc/prometheus-paperless-ngx-exporter/paperless_auth_token_file
msg_ok "Configured Prometheus Paperless NGX Exporter"
msg_info "Creating Service"
cat <<EOF >/etc/systemd/system/prometheus-paperless-ngx-exporter.service
[Unit]
Description=Prometheus Paperless NGX Exporter
Wants=network-online.target
After=network-online.target
[Service]
User=root
Restart=always
Type=simple
ExecStart=/usr/bin/prometheus-paperless-exporter \
--paperless_url=http://paperless.example.org \
--paperless_auth_token_file=/etc/prometheus-paperless-ngx-exporter/paperless_auth_token_file
ExecReload=/bin/kill -HUP \$MAINPID
[Install]
WantedBy=multi-user.target
EOF
systemctl enable -q --now prometheus-paperless-ngx-exporter
msg_ok "Created Service"
motd_ssh
customize
cleanup_lxc

View File

@@ -19,7 +19,7 @@ msg_info "Installing Prometheus Proxmox VE Exporter"
 mkdir -p /opt/prometheus-pve-exporter
 cd /opt/prometheus-pve-exporter
-$STD uv venv /opt/prometheus-pve-exporter/.venv
+$STD uv venv --clear /opt/prometheus-pve-exporter/.venv
 $STD /opt/prometheus-pve-exporter/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/prometheus-pve-exporter/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/prometheus-pve-exporter/.venv/bin/python -m pip install prometheus-pve-exporter

View File

@@ -14,42 +14,51 @@ network_check
 update_os
 msg_info "Installing Dependencies"
-$STD apt install -y \
-  apache2-utils \
-  python3-pip \
-  python3-venv
+$STD apt install -y apache2-utils
 msg_ok "Installed Dependencies"
+PYTHON_VERSION="3.13" setup_uv
+fetch_and_deploy_gh_release "Radicale" "Kozea/Radicale" "tarball" "latest" "/opt/radicale"
 msg_info "Setting up Radicale"
-python3 -m venv /opt/radicale
-source /opt/radicale/bin/activate
-$STD python3 -m pip install --upgrade https://github.com/Kozea/Radicale/archive/master.tar.gz
+cd /opt/radicale
 RNDPASS=$(openssl rand -base64 18 | tr -dc 'a-zA-Z0-9' | head -c13)
-$STD htpasswd -c -b -5 /opt/radicale/users admin $RNDPASS
+$STD htpasswd -c -b -5 /opt/radicale/users admin "$RNDPASS"
 {
   echo "Radicale Credentials"
   echo "Admin User: admin"
   echo "Admin Password: $RNDPASS"
 } >>~/radicale.creds
-msg_ok "Done setting up Radicale"
-msg_info "Setup Service"
-cat <<EOF >/opt/radicale/start.sh
-#!/usr/bin/env bash
-source /opt/radicale/bin/activate
-python3 -m radicale --storage-filesystem-folder=/var/lib/radicale/collections --hosts 0.0.0.0:5232 --auth-type htpasswd --auth-htpasswd-filename /opt/radicale/users --auth-htpasswd-encryption sha512
-EOF
-chmod +x /opt/radicale/start.sh
+mkdir -p /etc/radicale
+cat <<EOF >/etc/radicale/config
+[server]
+hosts = 0.0.0.0:5232
+[auth]
+type = htpasswd
+htpasswd_filename = /opt/radicale/users
+htpasswd_encryption = sha512
+[storage]
+type = multifilesystem
+filesystem_folder = /var/lib/radicale/collections
+[web]
+type = internal
+EOF
+msg_ok "Set up Radicale"
+msg_info "Creating Service"
 cat <<EOF >/etc/systemd/system/radicale.service
 [Unit]
 Description=A simple CalDAV (calendar) and CardDAV (contact) server
 After=network.target
 Requires=network.target
 [Service]
-ExecStart=/opt/radicale/start.sh
+WorkingDirectory=/opt/radicale
+ExecStart=/usr/local/bin/uv run -m radicale --config /etc/radicale/config
 Restart=on-failure
 # User=radicale
 # Deny other users access to the calendar data

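The Radicale credentials above come from trimming a base64 string down to alphanumerics. The same generator in isolation (requires `openssl`; base64 of 18 random bytes yields 24 characters, of which at most 13 are kept after stripping `+`, `/`, and `=`):

```shell
# Generate a short random alphanumeric admin password.
RNDPASS=$(openssl rand -base64 18 | tr -dc 'a-zA-Z0-9' | head -c13)
echo "length: ${#RNDPASS}"
```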
View File

@@ -36,7 +36,7 @@ msg_ok "Setup Unrar"
 fetch_and_deploy_gh_release "sabnzbd-org" "sabnzbd/sabnzbd" "prebuild" "latest" "/opt/sabnzbd" "SABnzbd-*-src.tar.gz"
 msg_info "Installing SABnzbd"
-$STD uv venv /opt/sabnzbd/venv
+$STD uv venv --clear /opt/sabnzbd/venv
 $STD uv pip install -r /opt/sabnzbd/requirements.txt --python=/opt/sabnzbd/venv/bin/python
 msg_ok "Installed SABnzbd"

View File

@@ -18,7 +18,7 @@ fetch_and_deploy_gh_release "scrappar" "thecfu/scraparr" "tarball" "latest" "/op
 msg_info "Installing Scraparr"
 cd /opt/scraparr
-$STD uv venv /opt/scraparr/.venv
+$STD uv venv --clear /opt/scraparr/.venv
 $STD /opt/scraparr/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/scraparr/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/scraparr/.venv/bin/python -m pip install -r /opt/scraparr/src/scraparr/requirements.txt

View File

@@ -131,7 +131,7 @@ msg_ok "Built Shelfmark frontend"
 msg_info "Configuring Shelfmark"
 cd /opt/shelfmark
-$STD uv venv ./venv
+$STD uv venv --clear ./venv
 $STD source ./venv/bin/activate
 $STD uv pip install -r ./requirements-base.txt
 [[ "$DEPLOYMENT_TYPE" == "1" ]] && $STD uv pip install -r ./requirements-shelfmark.txt

View File

@@ -3,7 +3,7 @@
 # Copyright (c) 2021-2026 community-scripts ORG
 # Author: vhsdream
 # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
-# Source: https://github.com/slskd/slskd/, https://soularr.net
+# Source: https://github.com/slskd/slskd/, https://github.com/mrusse/soularr
 source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
 color
@@ -13,71 +13,71 @@ setting_up_container
 network_check
 update_os
-msg_info "Installing Dependencies"
-$STD apt install -y \
-  python3-pip
-msg_ok "Installed Dependencies"
-msg_info "Setup ${APPLICATION}"
-tmp_file=$(mktemp)
-RELEASE=$(curl -s https://api.github.com/repos/slskd/slskd/releases/latest | grep "tag_name" | awk '{print substr($2, 2, length($2)-3) }')
-curl -fsSL "https://github.com/slskd/slskd/releases/download/${RELEASE}/slskd-${RELEASE}-linux-x64.zip" -o $tmp_file
-$STD unzip $tmp_file -d /opt/${APPLICATION}
-echo "${RELEASE}" >/opt/${APPLICATION}_version.txt
+fetch_and_deploy_gh_release "Slskd" "slskd/slskd" "prebuild" "latest" "/opt/slskd" "slskd-*-linux-x64.zip"
+msg_info "Configuring Slskd"
 JWT_KEY=$(openssl rand -base64 44)
 SLSKD_API_KEY=$(openssl rand -base64 44)
-cp /opt/${APPLICATION}/config/slskd.example.yml /opt/${APPLICATION}/config/slskd.yml
+cp /opt/slskd/config/slskd.example.yml /opt/slskd/config/slskd.yml
 sed -i \
-  -e "\|web:|,\|cidr|s|^#||" \
-  -e "\|https:|,\|5031|s|false|true|" \
+  -e '/web:/,/cidr/s/^# //' \
+  -e '/https:/,/port: 5031/s/false/true/' \
+  -e '/port: 5030/,/socket/s/,.*$//' \
+  -e '/content_path:/,/authentication/s/false/true/' \
   -e "\|api_keys|,\|cidr|s|<some.*$|$SLSKD_API_KEY|; \
   s|role: readonly|role: readwrite|; \
   s|0.0.0.0/0,::/0|& # Replace this with your subnet|" \
-  -e "\|soulseek|,\|write_queue|s|^#||" \
   -e "\|jwt:|,\|ttl|s|key: ~|key: $JWT_KEY|" \
-  -e "s|^ picture|# picture|" \
-  /opt/${APPLICATION}/config/slskd.yml
-msg_ok "Setup ${APPLICATION}"
-msg_info "Installing Soularr"
-rm -rf /usr/lib/python3.*/EXTERNALLY-MANAGED
-cd /tmp
-curl -fsSL -o main.zip https://github.com/mrusse/soularr/archive/refs/heads/main.zip
-$STD unzip main.zip
-mv soularr-main /opt/soularr
-cd /opt/soularr
-$STD pip install -r requirements.txt
+  -e '/soulseek/,/write_queue/s/^# //' \
+  -e 's/^.*picture/#&/' /opt/slskd/config/slskd.yml
+msg_ok "Configured Slskd"
+read -rp "${TAB3}Do you want to install Soularr? y/N " soularr
+if [[ ${soularr,,} =~ ^(y|yes)$ ]]; then
+  PYTHON_VERSION="3.11" setup_uv
+  fetch_and_deploy_gh_release "Soularr" "mrusse/soularr" "tarball" "latest" "/opt/soularr"
+  cd /opt/soularr
+  $STD uv venv venv
+  $STD source venv/bin/activate
+  $STD uv pip install -r requirements.txt
 sed -i \
   -e "\|[Slskd]|,\|host_url|s|yourslskdapikeygoeshere|$SLSKD_API_KEY|" \
   -e "/host_url/s/slskd/localhost/" \
   /opt/soularr/config.ini
-sed -i \
-  -e "/#This\|#Default\|INTERVAL/{N;d;}" \
-  -e "/while\|#Pass/d" \
-  -e "\|python|s|app|opt/soularr|; s|python|python3|" \
-  -e "/dt/,+2d" \
-  /opt/soularr/run.sh
-sed -i -E "/(soularr.py)/s/.{5}$//; /if/,/fi/s/.{4}//" /opt/soularr/run.sh
-chmod +x /opt/soularr/run.sh
-msg_ok "Installed Soularr"
-msg_info "Creating Services"
-cat <<EOF >/etc/systemd/system/${APPLICATION}.service
+  cat <<EOF >/opt/soularr/run.sh
+#!/usr/bin/env bash
+if ps aux | grep "[s]oularr.py" >/dev/null; then
+  echo "Soularr is already running. Exiting..."
+  exit 1
+else
+  source /opt/soularr/venv/bin/activate
+  uv run python3 -u /opt/soularr/soularr.py --config-dir /opt/soularr
+fi
+EOF
+  chmod +x /opt/soularr/run.sh
+  deactivate
+  msg_ok "Installed Soularr"
+fi
+msg_info "Creating Service"
+cat <<EOF >/etc/systemd/system/slskd.service
 [Unit]
-Description=${APPLICATION} Service
+Description=Slskd Service
 After=network.target
 Wants=network.target
 [Service]
-WorkingDirectory=/opt/${APPLICATION}
-ExecStart=/opt/${APPLICATION}/slskd --config /opt/${APPLICATION}/config/slskd.yml
+WorkingDirectory=/opt/slskd
+ExecStart=/opt/slskd/slskd --config /opt/slskd/config/slskd.yml
 Restart=always
 [Install]
 WantedBy=multi-user.target
 EOF
-cat <<EOF >/etc/systemd/system/soularr.timer
+if [[ -d /opt/soularr ]]; then
+  cat <<EOF >/etc/systemd/system/soularr.timer
 [Unit]
 Description=Soularr service timer
 RefuseManualStart=no
@@ -85,15 +85,15 @@ RefuseManualStop=no
 [Timer]
 Persistent=true
-# run every 5 minutes
-OnCalendar=*-*-* *:0/5:00
+# run every 10 minutes
+OnCalendar=*-*-* *:0/10:00
 Unit=soularr.service
 [Install]
 WantedBy=timers.target
 EOF
 cat <<EOF >/etc/systemd/system/soularr.service
 [Unit]
 Description=Soularr service
 After=network.target slskd.service
@@ -106,10 +106,9 @@ ExecStart=/bin/bash -c /opt/soularr/run.sh
 [Install]
 WantedBy=multi-user.target
 EOF
-systemctl enable -q --now ${APPLICATION}
-systemctl enable -q soularr.timer
-rm -rf $tmp_file
-rm -rf /tmp/main.zip
+  msg_warn "Add your Lidarr API key to Soularr in '/opt/soularr/config.ini', then run 'systemctl enable --now soularr.timer'"
+fi
+systemctl enable -q --now slskd
 msg_ok "Created Services"
 motd_ssh

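The timer change above switches `OnCalendar=*-*-* *:0/5:00` to `*:0/10:00`, i.e. from every 5 to every 10 minutes. The `0/10` step in the minute field expands to the trigger minutes of each hour; a quick expansion of the same arithmetic:

```shell
# Enumerate the minutes matched by a 0/10 step in a systemd calendar spec.
minutes=$(for m in $(seq 0 10 59); do printf '%02d ' "$m"; done)
echo "$minutes"
```

On a systemd host, `systemd-analyze calendar '*-*-* *:0/10:00'` prints the same schedule along with the next elapse time.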
View File

@@ -61,7 +61,7 @@ msg_ok "Installed LibreOffice Components"
 msg_info "Installing Python Dependencies"
 mkdir -p /tmp/stirling-pdf
-$STD uv venv /opt/.venv
+$STD uv venv --clear /opt/.venv
 export PATH="/opt/.venv/bin:$PATH"
 source /opt/.venv/bin/activate
 $STD uv pip install --upgrade pip

View File

@@ -22,7 +22,7 @@ fetch_and_deploy_gh_release "streamlink-webui" "CrazyWolf13/streamlink-webui" "t
 msg_info "Setup ${APPLICATION}"
 mkdir -p "/opt/${APPLICATION}-download"
-$STD uv venv /opt/"${APPLICATION}"/backend/src/.venv
+$STD uv venv --clear /opt/"${APPLICATION}"/backend/src/.venv
 source /opt/"${APPLICATION}"/backend/src/.venv/bin/activate
 $STD uv pip install -r /opt/streamlink-webui/backend/src/requirements.txt --python=/opt/"${APPLICATION}"/backend/src/.venv
 cd /opt/"${APPLICATION}"/frontend/src

View File

@@ -40,7 +40,7 @@ SECRET_KEY=$(openssl rand -base64 45 | sed 's/\//\\\//g')
 msg_info "Setup Tandoor"
 mkdir -p /opt/tandoor/{config,api,mediafiles,staticfiles}
 cd /opt/tandoor
-$STD uv venv .venv --python=python3
+$STD uv venv --clear .venv --python=python3
 $STD uv pip install -r requirements.txt --python .venv/bin/python
 cd /opt/tandoor/vue3
 $STD yarn install

View File

@@ -25,7 +25,7 @@ cd /opt/Tautulli
 TAUTULLI_VERSION=$(get_latest_github_release "Tautulli/Tautulli" "false")
 echo "${TAUTULLI_VERSION}" >/opt/Tautulli/version.txt
 echo "master" >/opt/Tautulli/branch.txt
-$STD uv venv
+$STD uv venv --clear
 $STD source /opt/Tautulli/.venv/bin/activate
 $STD uv pip install -r requirements.txt
 $STD uv pip install pyopenssl

View File

@@ -30,7 +30,7 @@ msg_ok "Built Frontend"
 msg_info "Setting up Backend"
 cd /opt/trip/backend
-$STD uv venv /opt/trip/.venv
+$STD uv venv --clear /opt/trip/.venv
 $STD uv pip install --python /opt/trip/.venv/bin/python -r trip/requirements.txt
 msg_ok "Set up Backend"

View File

@@ -27,68 +27,6 @@ msg_ok "Installed Dependencies"
fetch_and_deploy_gh_release "UmlautAdaptarr" "PCJones/Umlautadaptarr" "prebuild" "latest" "/opt/UmlautAdaptarr" "linux-x64.zip"
msg_info "Setting up UmlautAdaptarr"
cat <<EOF >/opt/UmlautAdaptarr/appsettings.json
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
},
"Console": {
"TimestampFormat": "yyyy-MM-dd HH:mm:ss::"
}
},
"AllowedHosts": "*",
"Kestrel": {
"Endpoints": {
"Http": {
"Url": "http://[::]:5005"
}
}
},
"Settings": {
"UserAgent": "UmlautAdaptarr/1.0",
"UmlautAdaptarrApiHost": "https://umlautadaptarr.pcjones.de/api/v1",
"IndexerRequestsCacheDurationInMinutes": 12
},
"Sonarr": [
{
"Enabled": false,
"Name": "Sonarr",
"Host": "http://192.168.1.100:8989",
"ApiKey": "dein_sonarr_api_key"
}
],
"Radarr": [
{
"Enabled": false,
"Name": "Radarr",
"Host": "http://192.168.1.101:7878",
"ApiKey": "dein_radarr_api_key"
}
],
"Lidarr": [
{
"Enabled": false,
"Host": "http://192.168.1.102:8686",
"ApiKey": "dein_lidarr_api_key"
},
],
"Readarr": [
{
"Enabled": false,
"Host": "http://192.168.1.103:8787",
"ApiKey": "dein_readarr_api_key"
},
],
"IpLeakTest": {
"Enabled": false
}
}
EOF
msg_ok "Setup UmlautAdaptarr"
msg_info "Creating Service"
cat <<EOF >/etc/systemd/system/umlautadaptarr.service
[Unit]

View File

@@ -15,16 +15,18 @@ update_os
 msg_info "Installing Dependencies"
 $STD apt install -y apt-transport-https
-curl -fsSL "https://dl.ui.com/unifi/unifi-repo.gpg" -o "/usr/share/keyrings/unifi-repo.gpg"
-cat <<EOF | sudo tee /etc/apt/sources.list.d/100-ubnt-unifi.sources >/dev/null
-Types: deb
-URIs: https://www.ui.com/downloads/unifi/debian
-Suites: stable
-Components: ubiquiti
-Architectures: amd64
-Signed-By: /usr/share/keyrings/unifi-repo.gpg
-EOF
-$STD apt update
 msg_ok "Installed Dependencies"
+setup_deb822_repo \
+  "unifi" \
+  "https://dl.ui.com/unifi/unifi-repo.gpg" \
+  "https://www.ui.com/downloads/unifi/debian" \
+  "stable" \
+  "ubiquiti" \
+  "amd64"
 JAVA_VERSION="21" setup_java
 if lscpu | grep -q 'avx'; then

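`setup_deb822_repo` replaces the hand-written heredoc that the UniFi hunk removes. A minimal sketch of what such a helper emits, modeled on the removed heredoc (the function name and argument order here are assumptions; the real helper lives in the repo's shared functions and may differ):

```shell
# Emit a deb822 .sources stanza from positional arguments.
write_deb822() {
  local uri=$1 suite=$2 comp=$3 arch=$4 key=$5
  printf 'Types: deb\nURIs: %s\nSuites: %s\nComponents: %s\nArchitectures: %s\nSigned-By: %s\n' \
    "$uri" "$suite" "$comp" "$arch" "$key"
}
write_deb822 "https://www.ui.com/downloads/unifi/debian" stable ubiquiti amd64 \
  /usr/share/keyrings/unifi-repo.gpg
```

Centralizing the stanza in one helper keeps key paths and field names consistent across all scripts that add third-party repos.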
View File

@@ -49,7 +49,7 @@ fetch_and_deploy_gh_release "warracker" "sassanix/Warracker" "tarball" "latest"
 msg_info "Installing Warracker"
 cd /opt/warracker/backend
-$STD uv venv .venv
+$STD uv venv --clear .venv
 $STD source .venv/bin/activate
 $STD uv pip install -r requirements.txt
 mv /opt/warracker/env.example /opt/.env

File diff suppressed because it is too large

View File

@@ -3078,10 +3078,10 @@ settings_menu() {
 case "$choice" in
 1) diagnostics_menu ;;
-2) nano /usr/local/community-scripts/default.vars ;;
+2) ${EDITOR:-nano} /usr/local/community-scripts/default.vars ;;
 3)
   if [ -f "$(get_app_defaults_path)" ]; then
-    nano "$(get_app_defaults_path)"
+    ${EDITOR:-nano} "$(get_app_defaults_path)"
   else
     # Back was selected (no app.vars available)
     return
@@ -3351,19 +3351,21 @@ msg_menu() {
   return 0
 fi
-# Display menu
-echo ""
-msg_custom "📋" "${BL}" "${title}"
-echo ""
-for i in "${!tags[@]}"; do
-  local marker=" "
-  [[ $i -eq 0 ]] && marker="* "
-  printf "${TAB3}${marker}%s) %s\n" "${tags[$i]}" "${descs[$i]}"
-done
-echo ""
+# Display menu to /dev/tty so it doesn't get captured by command substitution
+{
+  echo ""
+  msg_custom "📋" "${BL}" "${title}"
+  echo ""
+  for i in "${!tags[@]}"; do
+    local marker=" "
+    [[ $i -eq 0 ]] && marker="* "
+    printf "${TAB3}${marker}%s) %s\n" "${tags[$i]}" "${descs[$i]}"
+  done
+  echo ""
+} >/dev/tty
 local selection=""
-read -r -t 10 -p "${TAB3}Select [default=${default_tag}, timeout 10s]: " selection || true
+read -r -t 10 -p "${TAB3}Select [default=${default_tag}, timeout 10s]: " selection </dev/tty >/dev/tty || true
 # Validate selection
 if [[ -n "$selection" ]]; then
@@ -3634,6 +3636,9 @@ $PCT_OPTIONS_STRING"
   exit 214
 fi
 msg_ok "Storage space validated"
+# Report installation start to API (early - captures failed installs too)
+post_to_api
 fi
 create_lxc_container || exit $?
@@ -4008,6 +4013,9 @@ EOF'
 # Install SSH keys
 install_ssh_keys_into_ct
+# Start timer for duration tracking
+start_install_timer
 # Run application installer
 # Disable error trap - container errors are handled internally via flag file
 set +Eeuo pipefail # Disable ALL error handling temporarily
@@ -4038,9 +4046,10 @@ EOF'
 if [[ $install_exit_code -ne 0 ]]; then
   msg_error "Installation failed in container ${CTID} (exit code: ${install_exit_code})"
-  # Copy both logs from container before potential deletion
+  # Copy install log from container BEFORE API call so get_error_text() can read it
   local build_log_copied=false
   local install_log_copied=false
+  local host_install_log="/tmp/install-lxc-${CTID}-${SESSION_ID}.log"
   if [[ -n "$CTID" && -n "${SESSION_ID:-}" ]]; then
     # Copy BUILD_LOG (creation log) if it exists
@@ -4048,15 +4057,22 @@ EOF'
     cp "${BUILD_LOG}" "/tmp/create-lxc-${CTID}-${SESSION_ID}.log" 2>/dev/null && build_log_copied=true
   fi
-  # Copy INSTALL_LOG from container
-  if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "/tmp/install-lxc-${CTID}-${SESSION_ID}.log" 2>/dev/null; then
+  # Copy INSTALL_LOG from container to host
+  if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "$host_install_log" 2>/dev/null; then
     install_log_copied=true
+    # Point INSTALL_LOG to host copy so get_error_text() finds it
+    INSTALL_LOG="$host_install_log"
   fi
 fi
+# Report failure to telemetry API (now with log available on host)
+post_update_to_api "failed" "$install_exit_code"
-# Show available logs
+# Show available logs
+if [[ -n "$CTID" && -n "${SESSION_ID:-}" ]]; then
   echo ""
   [[ "$build_log_copied" == true ]] && echo -e "${GN}${CL} Container creation log: ${BL}/tmp/create-lxc-${CTID}-${SESSION_ID}.log${CL}"
-  [[ "$install_log_copied" == true ]] && echo -e "${GN}${CL} Installation log: ${BL}/tmp/install-lxc-${CTID}-${SESSION_ID}.log${CL}"
+  [[ "$install_log_copied" == true ]] && echo -e "${GN}${CL} Installation log: ${BL}${host_install_log}${CL}"
 fi
 # Dev mode: Keep container or open breakpoint shell
@@ -5121,9 +5137,9 @@ EOF
 # api_exit_script()
 #
 # - Exit trap handler for reporting to API telemetry
-# - Captures exit code and reports to API using centralized error descriptions
-# - Uses explain_exit_code() from error_handler.func for consistent error messages
-# - Posts failure status with exit code to API (error description added automatically)
+# - Captures exit code and reports to PocketBase using centralized error descriptions
+# - Uses explain_exit_code() from api.func for consistent error messages
+# - Posts failure status with exit code to API (error description resolved automatically)
 # - Only executes on non-zero exit codes
 # ------------------------------------------------------------------------------
 api_exit_script() {
@@ -5136,6 +5152,6 @@ api_exit_script() {
if command -v pveversion >/dev/null 2>&1; then if command -v pveversion >/dev/null 2>&1; then
trap 'api_exit_script' EXIT trap 'api_exit_script' EXIT
fi fi
trap 'post_update_to_api "failed" "$BASH_COMMAND"' ERR trap 'post_update_to_api "failed" "$?"' ERR
trap 'post_update_to_api "failed" "INTERRUPTED"' SIGINT trap 'post_update_to_api "failed" "130"' SIGINT
trap 'post_update_to_api "failed" "TERMINATED"' SIGTERM trap 'post_update_to_api "failed" "143"' SIGTERM
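The reworked traps report fixed numeric codes instead of strings like "INTERRUPTED"; the 130 and 143 values follow the shell's 128+N signal convention. A minimal illustration of where those numbers come from (nothing here is from the script itself):

```shell
#!/usr/bin/env bash
# Illustration only: the 128+N convention behind the 130/143 codes above.
sigint_code=$((128 + 2))    # SIGINT  = signal 2  -> 130
sigterm_code=$((128 + 15))  # SIGTERM = signal 15 -> 143
echo "SIGINT=$sigint_code SIGTERM=$sigterm_code"   # -> SIGINT=130 SIGTERM=143
```

Using numeric codes keeps the telemetry field machine-comparable with ordinary command exit statuses.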

View File

@@ -27,100 +27,90 @@
# ------------------------------------------------------------------------------
# explain_exit_code()
#
# - Canonical version is defined in api.func (sourced before this file)
# - This section only provides a fallback if api.func was not loaded
# - See api.func SECTION 1 for the authoritative exit code mappings
# ------------------------------------------------------------------------------
if ! declare -f explain_exit_code &>/dev/null; then
  explain_exit_code() {
    local code="$1"
    case "$code" in
      1) echo "General error / Operation not permitted" ;;
      2) echo "Misuse of shell builtins (e.g. syntax error)" ;;
      6) echo "curl: DNS resolution failed (could not resolve host)" ;;
      7) echo "curl: Failed to connect (network unreachable / host down)" ;;
      22) echo "curl: HTTP error returned (404, 429, 500+)" ;;
      28) echo "curl: Operation timeout (network slow or server not responding)" ;;
      35) echo "curl: SSL/TLS handshake failed (certificate error)" ;;
      100) echo "APT: Package manager error (broken packages / dependency problems)" ;;
      101) echo "APT: Configuration error (bad sources.list, malformed config)" ;;
      102) echo "APT: Lock held by another process (dpkg/apt still running)" ;;
      124) echo "Command timed out (timeout command)" ;;
      126) echo "Command invoked cannot execute (permission problem?)" ;;
      127) echo "Command not found" ;;
      128) echo "Invalid argument to exit" ;;
      130) echo "Terminated by Ctrl+C (SIGINT)" ;;
      134) echo "Process aborted (SIGABRT - possibly Node.js heap overflow)" ;;
      137) echo "Killed (SIGKILL / Out of memory?)" ;;
      139) echo "Segmentation fault (core dumped)" ;;
      141) echo "Broken pipe (SIGPIPE - output closed prematurely)" ;;
      143) echo "Terminated (SIGTERM)" ;;
      150) echo "Systemd: Service failed to start" ;;
      151) echo "Systemd: Service unit not found" ;;
      152) echo "Permission denied (EACCES)" ;;
      153) echo "Build/compile failed (make/gcc/cmake)" ;;
      154) echo "Node.js: Native addon build failed (node-gyp)" ;;
      160) echo "Python: Virtualenv / uv environment missing or broken" ;;
      161) echo "Python: Dependency resolution failed" ;;
      162) echo "Python: Installation aborted (permissions or EXTERNALLY-MANAGED)" ;;
      170) echo "PostgreSQL: Connection failed (server not running / wrong socket)" ;;
      171) echo "PostgreSQL: Authentication failed (bad user/password)" ;;
      172) echo "PostgreSQL: Database does not exist" ;;
      173) echo "PostgreSQL: Fatal error in query / syntax" ;;
      180) echo "MySQL/MariaDB: Connection failed (server not running / wrong socket)" ;;
      181) echo "MySQL/MariaDB: Authentication failed (bad user/password)" ;;
      182) echo "MySQL/MariaDB: Database does not exist" ;;
      183) echo "MySQL/MariaDB: Fatal error in query / syntax" ;;
      190) echo "MongoDB: Connection failed (server not running)" ;;
      191) echo "MongoDB: Authentication failed (bad user/password)" ;;
      192) echo "MongoDB: Database not found" ;;
      193) echo "MongoDB: Fatal query error" ;;
      200) echo "Proxmox: Failed to create lock file" ;;
      203) echo "Proxmox: Missing CTID variable" ;;
      204) echo "Proxmox: Missing PCT_OSTYPE variable" ;;
      205) echo "Proxmox: Invalid CTID (<100)" ;;
      206) echo "Proxmox: CTID already in use" ;;
      207) echo "Proxmox: Password contains unescaped special characters" ;;
      208) echo "Proxmox: Invalid configuration (DNS/MAC/Network format)" ;;
      209) echo "Proxmox: Container creation failed" ;;
      210) echo "Proxmox: Cluster not quorate" ;;
      211) echo "Proxmox: Timeout waiting for template lock" ;;
      212) echo "Proxmox: Storage type 'iscsidirect' does not support containers (VMs only)" ;;
      213) echo "Proxmox: Storage type does not support 'rootdir' content" ;;
      214) echo "Proxmox: Not enough storage space" ;;
      215) echo "Proxmox: Container created but not listed (ghost state)" ;;
      216) echo "Proxmox: RootFS entry missing in config" ;;
      217) echo "Proxmox: Storage not accessible" ;;
      218) echo "Proxmox: Template file corrupted or incomplete" ;;
      219) echo "Proxmox: CephFS does not support containers - use RBD" ;;
      220) echo "Proxmox: Unable to resolve template path" ;;
      221) echo "Proxmox: Template file not readable" ;;
      222) echo "Proxmox: Template download failed" ;;
      223) echo "Proxmox: Template not available after download" ;;
      224) echo "Proxmox: PBS storage is for backups only" ;;
      225) echo "Proxmox: No template available for OS/Version" ;;
      231) echo "Proxmox: LXC stack upgrade failed" ;;
      243) echo "Node.js: Out of memory (JavaScript heap out of memory)" ;;
      245) echo "Node.js: Invalid command-line option" ;;
      246) echo "Node.js: Internal JavaScript Parse Error" ;;
      247) echo "Node.js: Fatal internal error" ;;
      248) echo "Node.js: Invalid C++ addon / N-API failure" ;;
      249) echo "npm/pnpm/yarn: Unknown fatal error" ;;
      255) echo "DPKG: Fatal internal error" ;;
      *) echo "Unknown error" ;;
    esac
  }
fi
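The `declare -f` guard above is a standard define-if-absent pattern: the fallback body only takes effect when no earlier sourced file defined the function. A self-contained sketch under the assumption that a canonical definition was loaded first (the one-line function bodies are illustrative, not the real implementations):

```shell
#!/usr/bin/env bash
# Pretend api.func already provided the canonical function:
explain_exit_code() { echo "canonical: $1"; }

# The fallback is skipped because declare -f finds the existing definition.
if ! declare -f explain_exit_code &>/dev/null; then
  explain_exit_code() { echo "fallback: $1"; }
fi

explain_exit_code 127   # -> canonical: 127
```

Source order therefore matters: api.func must be sourced before error_handler.func for the canonical table to win.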
# ==============================================================================
# SECTION 2: ERROR HANDLERS
@@ -197,12 +187,7 @@ error_handler() {
  # Create error flag file with exit code for host detection
  echo "$exit_code" >"/root/.install-${SESSION_ID:-error}.failed" 2>/dev/null || true
  # Log path is shown by host as combined log - no need to show container path
else
  # HOST CONTEXT: Show local log path and offer container cleanup
  if declare -f msg_custom >/dev/null 2>&1; then
@@ -213,6 +198,11 @@ error_handler() {
  # Offer to remove container if it exists (build errors after container creation)
  if [[ -n "${CTID:-}" ]] && command -v pct &>/dev/null && pct status "$CTID" &>/dev/null; then
    # Report failure to API before container cleanup
    if declare -f post_update_to_api &>/dev/null; then
      post_update_to_api "failed" "$exit_code"
    fi
    echo ""
    echo -en "${YW}Remove broken container ${CTID}? (Y/n) [auto-remove in 60s]: ${CL}"
@@ -253,6 +243,18 @@ error_handler() {
# ------------------------------------------------------------------------------
on_exit() {
  local exit_code=$?
  # Report orphaned "installing" records to telemetry API
  # Catches ALL exit paths: errors (non-zero), signals, AND clean exits where
  # post_to_api was called ("installing" sent) but post_update_to_api was never called
  if [[ "${POST_TO_API_DONE:-}" == "true" && "${POST_UPDATE_DONE:-}" != "true" ]]; then
    if declare -f post_update_to_api >/dev/null 2>&1; then
      if [[ $exit_code -ne 0 ]]; then
        post_update_to_api "failed" "$exit_code"
      else
        post_update_to_api "failed" "1"
      fi
    fi
  fi
  [[ -n "${lockfile:-}" && -e "$lockfile" ]] && rm -f "$lockfile"
  exit "$exit_code"
}
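The `on_exit` handler depends on two details: `$?` must be captured as the very first statement (any later command would overwrite it), and the original status must be re-raised after cleanup. A stripped-down sketch of that contract (the lock path and function name are illustrative):

```shell
#!/usr/bin/env bash
# Sketch: cleanup on EXIT while preserving the original exit status.
lockfile=$(mktemp)

cleanup_and_reraise() {
  local exit_code=$?                       # capture first, before any other command
  [[ -e "$lockfile" ]] && rm -f "$lockfile"
  exit "$exit_code"                        # re-raise the original status
}
trap cleanup_and_reraise EXIT
```

A caller running the script still observes the real failure code, while the lock file is reliably removed on every exit path.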
@@ -265,6 +267,10 @@ on_exit() {
# - Exits with code 130 (128 + SIGINT=2) # - Exits with code 130 (128 + SIGINT=2)
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
on_interrupt() { on_interrupt() {
# Report interruption to telemetry API (prevents stuck "installing" records)
if declare -f post_update_to_api >/dev/null 2>&1; then
post_update_to_api "failed" "130"
fi
if declare -f msg_error >/dev/null 2>&1; then if declare -f msg_error >/dev/null 2>&1; then
msg_error "Interrupted by user (SIGINT)" msg_error "Interrupted by user (SIGINT)"
else else
@@ -282,6 +288,10 @@ on_interrupt() {
# - Triggered by external process termination # - Triggered by external process termination
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
on_terminate() { on_terminate() {
# Report termination to telemetry API (prevents stuck "installing" records)
if declare -f post_update_to_api >/dev/null 2>&1; then
post_update_to_api "failed" "143"
fi
if declare -f msg_error >/dev/null 2>&1; then if declare -f msg_error >/dev/null 2>&1; then
msg_error "Terminated by signal (SIGTERM)" msg_error "Terminated by signal (SIGTERM)"
else else

View File

@@ -465,6 +465,7 @@ manage_tool_repository() {
    msg_error "Failed to download MongoDB GPG key"
    return 1
  fi
  chmod 644 "/etc/apt/keyrings/mongodb-server-${version}.gpg"

  # Setup repository
  local distro_codename
@@ -1294,12 +1295,33 @@ setup_deb822_repo() {
    return 1
  }

  # Import GPG key (auto-detect binary vs ASCII-armored format)
  local tmp_gpg
  tmp_gpg=$(mktemp) || return 1
  curl -fsSL "$gpg_url" -o "$tmp_gpg" || {
    msg_error "Failed to download GPG key for ${name}"
    rm -f "$tmp_gpg"
    return 1
  }
  if grep -q "BEGIN PGP" "$tmp_gpg" 2>/dev/null; then
    # ASCII-armored - dearmor to binary
    gpg --dearmor --yes -o "/etc/apt/keyrings/${name}.gpg" <"$tmp_gpg" || {
      msg_error "Failed to dearmor GPG key for ${name}"
      rm -f "$tmp_gpg"
      return 1
    }
  else
    # Already in binary GPG format - copy directly
    cp "$tmp_gpg" "/etc/apt/keyrings/${name}.gpg" || {
      msg_error "Failed to install GPG key for ${name}"
      rm -f "$tmp_gpg"
      return 1
    }
  fi
  rm -f "$tmp_gpg"
  chmod 644 "/etc/apt/keyrings/${name}.gpg"

  # Write deb822
  {
    echo "Types: deb"
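The armored-vs-binary branch above hinges on one observation: ASCII-armored keys are text files carrying a `BEGIN PGP` header, while binary OpenPGP keys are not. A standalone sketch of that classification (the file and function names are illustrative, not from tools.func):

```shell
#!/usr/bin/env bash
# Sketch: classify a downloaded OpenPGP key by format.
key_format() {
  if grep -q "BEGIN PGP" "$1" 2>/dev/null; then
    echo "armored"   # would need: gpg --dearmor
  else
    echo "binary"    # would need: plain copy + chmod 644
  fi
}

tmp=$(mktemp)
printf -- '-----BEGIN PGP PUBLIC KEY BLOCK-----\n' >"$tmp"
key_format "$tmp"   # -> armored
rm -f "$tmp"
```

Auto-detecting the format avoids the failure mode where `gpg --dearmor` is applied to an already-binary key, which corrupts the keyring file apt later reads.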

View File

@@ -79,11 +79,24 @@ EOF
header_info
msg "Installing NetBird..."
pct exec "$CTID" -- bash -c '
  if ! command -v curl &>/dev/null; then
    apt-get update -qq
    apt-get install -y curl >/dev/null
  fi
  apt install -y ca-certificates gpg &>/dev/null
  curl -fsSL "https://pkgs.netbird.io/debian/public.key" | gpg --dearmor >/usr/share/keyrings/netbird-archive-keyring.gpg
  echo "deb [signed-by=/usr/share/keyrings/netbird-archive-keyring.gpg] https://pkgs.netbird.io/debian stable main" >/etc/apt/sources.list.d/netbird.list
  apt-get update &>/dev/null
  apt-get install -y netbird-ui &>/dev/null
  if systemctl list-unit-files docker.service &>/dev/null; then
    mkdir -p /etc/systemd/system/netbird.service.d
    cat <<OVERRIDE >/etc/systemd/system/netbird.service.d/after-docker.conf
[Unit]
After=docker.service
Wants=docker.service
OVERRIDE
    systemctl daemon-reload
  fi
'
msg "\e[1;32m ✔ Installed NetBird.\e[0m"
sleep 2
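The drop-in written above only adds ordering on top of the packaged unit; it does not replace it. The same override can be generated into a temporary directory and inspected without a running systemd, as in this sketch (paths are illustrative):

```shell
#!/usr/bin/env bash
# Sketch: generate a systemd drop-in like the NetBird step, in a temp dir.
unit_dir="$(mktemp -d)/netbird.service.d"
mkdir -p "$unit_dir"
cat <<'OVERRIDE' >"$unit_dir/after-docker.conf"
[Unit]
After=docker.service
Wants=docker.service
OVERRIDE
grep -c '=docker.service' "$unit_dir/after-docker.conf"   # -> 2
```

On a real system, `systemctl daemon-reload` is what makes systemd pick up the new drop-in; `Wants=` pulls docker in without making netbird fail if docker is absent.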

View File

@@ -75,28 +75,62 @@ pct exec "$CTID" -- bash -c '
set -e
export DEBIAN_FRONTEND=noninteractive

# Source os-release properly (handles quoted values)
source /etc/os-release

# Fallback if DNS is poisoned or blocked
ORIG_RESOLV="/etc/resolv.conf"
BACKUP_RESOLV="/tmp/resolv.conf.backup"

# Check DNS resolution using multiple methods (dig may not be installed)
dns_check_failed=true
if command -v dig &>/dev/null; then
  if dig +short pkgs.tailscale.com 2>/dev/null | grep -qvE "^127\.|^0\.0\.0\.0$|^$"; then
    dns_check_failed=false
  fi
elif command -v host &>/dev/null; then
  if host pkgs.tailscale.com 2>/dev/null | grep -q "has address"; then
    dns_check_failed=false
  fi
elif command -v nslookup &>/dev/null; then
  if nslookup pkgs.tailscale.com 2>/dev/null | grep -q "Address:"; then
    dns_check_failed=false
  fi
elif command -v getent &>/dev/null; then
  if getent hosts pkgs.tailscale.com &>/dev/null; then
    dns_check_failed=false
  fi
else
  # No DNS tools available, try curl directly and assume DNS works
  dns_check_failed=false
fi

if $dns_check_failed; then
  echo "[INFO] DNS resolution for pkgs.tailscale.com failed (blocked or redirected)."
  echo "[INFO] Temporarily overriding /etc/resolv.conf with Cloudflare DNS (1.1.1.1)"
  cp "$ORIG_RESOLV" "$BACKUP_RESOLV"
  echo "nameserver 1.1.1.1" >"$ORIG_RESOLV"
fi

if ! command -v curl &>/dev/null; then
  echo "[INFO] curl not found, installing..."
  apt-get update -qq
  apt-get install -y curl >/dev/null
fi

# Ensure keyrings directory exists
mkdir -p /usr/share/keyrings

curl -fsSL "https://pkgs.tailscale.com/stable/${ID}/${VERSION_CODENAME}.noarmor.gpg" \
  | tee /usr/share/keyrings/tailscale-archive-keyring.gpg >/dev/null
echo "deb [signed-by=/usr/share/keyrings/tailscale-archive-keyring.gpg] https://pkgs.tailscale.com/stable/${ID} ${VERSION_CODENAME} main" \
  >/etc/apt/sources.list.d/tailscale.list
apt-get update -qq
apt-get install -y tailscale >/dev/null

if [[ -f /tmp/resolv.conf.backup ]]; then
  echo "[INFO] Restoring original /etc/resolv.conf"
View File

@@ -5,6 +5,11 @@
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/bakito/adguardhome-sync

if ! command -v curl &>/dev/null; then
  printf "\r\e[2K%b" '\033[93m Setup Source \033[m' >&2
  apt-get update >/dev/null 2>&1
  apt-get install -y curl >/dev/null 2>&1
fi

source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/core.func)
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/tools.func)
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/error_handler.func)

View File

@@ -5,6 +5,11 @@
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/9001/copyparty

if ! command -v curl &>/dev/null; then
  printf "\r\e[2K%b" '\033[93m Setup Source \033[m' >&2
  apt-get update >/dev/null 2>&1
  apt-get install -y curl >/dev/null 2>&1
fi

source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/core.func)
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/tools.func)
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/error_handler.func)

Some files were not shown because too many files have changed in this diff.