Mirror of https://github.com/community-scripts/ProxmoxVE.git, synced 2026-02-15 01:33:25 +01:00.

# Compare commits

7 commits: `github-act...` → `refactor/w...`
| Author | SHA1 | Date |
|---|---|---|
| | 34e897def2 | |
| | ec3d7a3c31 | |
| | 24013fe87c | |
| | 7e2ede1508 | |
| | b590fb41fa | |
| | c65d41e26d | |
| | fc2c191ced | |
**`.github/changelogs/2026/02.md`** (generated, vendored; 216 changed lines)
@@ -1,219 +1,3 @@

## 2026-02-14

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - Increase disk allocation for OpenWebUI and Ollama to prevent installation failures [@Copilot](https://github.com/Copilot) ([#11920](https://github.com/community-scripts/ProxmoxVE/pull/11920))

### 💾 Core

- #### 🐞 Bug Fixes

  - core: handle missing RAM speed in nested VMs [@MickLesk](https://github.com/MickLesk) ([#11913](https://github.com/community-scripts/ProxmoxVE/pull/11913))

- #### ✨ New Features

  - core: overwriteable app version [@CrazyWolf13](https://github.com/CrazyWolf13) ([#11753](https://github.com/community-scripts/ProxmoxVE/pull/11753))
  - core: validate container IDs cluster-wide across all nodes [@MickLesk](https://github.com/MickLesk) ([#11906](https://github.com/community-scripts/ProxmoxVE/pull/11906))
  - core: improve error reporting with structured error strings and better categorization + output formatting [@MickLesk](https://github.com/MickLesk) ([#11907](https://github.com/community-scripts/ProxmoxVE/pull/11907))
  - core: unified logging system with combined logs [@MickLesk](https://github.com/MickLesk) ([#11761](https://github.com/community-scripts/ProxmoxVE/pull/11761))

### 🧰 Tools

- lxc-updater: add patchmon aware [@failure101](https://github.com/failure101) ([#11905](https://github.com/community-scripts/ProxmoxVE/pull/11905))

### 🌐 Website

- #### 📝 Script Information

  - Disable UniFi script - APT packages no longer available [@Copilot](https://github.com/Copilot) ([#11898](https://github.com/community-scripts/ProxmoxVE/pull/11898))
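The cluster-wide container-ID validation listed above boils down to checking a candidate CTID against every guest ID known to the cluster, not just the local node. A minimal sketch of the idea (illustrative only; the helper name `ctid_in_use` and the static ID list are assumptions, and in real use the list would come from something like `pvesh get /cluster/resources --type vm`):

```shell
# Return 0 if the candidate CTID is already used anywhere in the cluster.
# In real use, the ID list would come from:
#   pvesh get /cluster/resources --type vm --output-format json | jq -r '.[].vmid'
ctid_in_use() {
  local candidate="$1"; shift
  local id
  for id in "$@"; do
    [[ "$id" == "$candidate" ]] && return 0
  done
  return 1
}

used_ids=(100 101 205)   # stand-in for the cluster-wide list
if ctid_in_use 101 "${used_ids[@]}"; then
  echo "CTID 101 is taken cluster-wide"
else
  echo "CTID 101 is free"
fi
```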

## 2026-02-13

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - OpenWebUI: pin numba constraint [@MickLesk](https://github.com/MickLesk) ([#11874](https://github.com/community-scripts/ProxmoxVE/pull/11874))
  - Planka: add migrate step to update function [@ZimmermannLeon](https://github.com/ZimmermannLeon) ([#11877](https://github.com/community-scripts/ProxmoxVE/pull/11877))
  - Pangolin: switch sqlite-specific back to generic [@MickLesk](https://github.com/MickLesk) ([#11868](https://github.com/community-scripts/ProxmoxVE/pull/11868))
  - [Hotfix] Jotty: Copy contents of config backup into /opt/jotty/config [@vhsdream](https://github.com/vhsdream) ([#11864](https://github.com/community-scripts/ProxmoxVE/pull/11864))

- #### 🔧 Refactor

  - Refactor: Radicale [@vhsdream](https://github.com/vhsdream) ([#11850](https://github.com/community-scripts/ProxmoxVE/pull/11850))
  - chore(donetick): add config entry for v0.1.73 [@tomfrenzel](https://github.com/tomfrenzel) ([#11872](https://github.com/community-scripts/ProxmoxVE/pull/11872))

### 💾 Core

- #### 🔧 Refactor

  - core: retry reporting with fallback payloads [@MickLesk](https://github.com/MickLesk) ([#11885](https://github.com/community-scripts/ProxmoxVE/pull/11885))

### 📡 API

- #### ✨ New Features

  - error-handler: Implement json_escape and enhance error handling [@MickLesk](https://github.com/MickLesk) ([#11875](https://github.com/community-scripts/ProxmoxVE/pull/11875))

### 🌐 Website

- #### 📝 Script Information

  - SQLServer-2025: add PVE9/Kernel 6.x incompatibility warning [@MickLesk](https://github.com/MickLesk) ([#11829](https://github.com/community-scripts/ProxmoxVE/pull/11829))
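The `json_escape` mentioned in the error-handler entry above is, in spirit, a routine that makes arbitrary strings safe to embed in a JSON payload. A minimal pure-bash sketch of such a function (illustrative; the actual implementation in the repo's error handler may differ):

```shell
# Escape a string for safe embedding inside a JSON string value.
json_escape() {
  local s="$1"
  s="${s//\\/\\\\}"   # backslash first, so later escapes aren't doubled
  s="${s//\"/\\\"}"   # double quote
  s="${s//$'\n'/\\n}" # newline
  s="${s//$'\r'/\\r}" # carriage return
  s="${s//$'\t'/\\t}" # tab
  printf '%s' "$s"
}

json_escape $'path "C:\\tmp"\nline2'
```

The order matters: escaping backslashes first prevents the `\"` and `\n` sequences introduced later from being escaped again.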

## 2026-02-12

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - EMQX: increase disk to 6GB and add optional MQ disable prompt [@MickLesk](https://github.com/MickLesk) ([#11844](https://github.com/community-scripts/ProxmoxVE/pull/11844))
  - Increased the Grafana container default disk size. [@shtefko](https://github.com/shtefko) ([#11840](https://github.com/community-scripts/ProxmoxVE/pull/11840))
  - Pangolin: Update database generation command in install script [@tremor021](https://github.com/tremor021) ([#11825](https://github.com/community-scripts/ProxmoxVE/pull/11825))
  - Deluge: add python3-setuptools as dep [@MickLesk](https://github.com/MickLesk) ([#11833](https://github.com/community-scripts/ProxmoxVE/pull/11833))
  - Dispatcharr: migrate to uv sync [@MickLesk](https://github.com/MickLesk) ([#11831](https://github.com/community-scripts/ProxmoxVE/pull/11831))

- #### ✨ New Features

  - Archlinux-VM: fix LVM/LVM-thin storage and improve error reporting | VM's add correct exit_code for analytics [@MickLesk](https://github.com/MickLesk) ([#11842](https://github.com/community-scripts/ProxmoxVE/pull/11842))
  - Debian13-VM: Optimize First Boot & add noCloud/Cloud Selection [@MickLesk](https://github.com/MickLesk) ([#11810](https://github.com/community-scripts/ProxmoxVE/pull/11810))

### 💾 Core

- #### ✨ New Features

  - tools.func: auto-detect binary vs armored GPG keys in setup_deb822_repo [@MickLesk](https://github.com/MickLesk) ([#11841](https://github.com/community-scripts/ProxmoxVE/pull/11841))
  - core: remove old Go API and extend misc/api.func with new backend [@MickLesk](https://github.com/MickLesk) ([#11822](https://github.com/community-scripts/ProxmoxVE/pull/11822))

- #### 🔧 Refactor

  - error_handler: prevent stuck 'installing' status [@MickLesk](https://github.com/MickLesk) ([#11845](https://github.com/community-scripts/ProxmoxVE/pull/11845))

### 🧰 Tools

- #### 🐞 Bug Fixes

  - Tailscale: fix DNS check and keyrings directory issues [@MickLesk](https://github.com/MickLesk) ([#11837](https://github.com/community-scripts/ProxmoxVE/pull/11837))
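The binary-vs-armored key auto-detection in the `setup_deb822_repo` entry above can be sketched as follows (a minimal illustration, not the actual tools.func code; the function name `is_armored_key` is an assumption). ASCII-armored keys start with a `-----BEGIN PGP` header, while binary keyrings do not:

```shell
# Detect whether a downloaded GPG key is ASCII-armored or binary.
is_armored_key() {
  head -c 64 "$1" | grep -q -- '-----BEGIN PGP'
}

keyfile=$(mktemp)
printf -- '-----BEGIN PGP PUBLIC KEY BLOCK-----\n' > "$keyfile"

if is_armored_key "$keyfile"; then
  echo "armored: dearmor (gpg --dearmor) before placing under /etc/apt/keyrings"
else
  echo "binary: install as-is"
fi
rm -f "$keyfile"
```

Auto-detecting the format lets one repo helper accept both key styles without per-repo special cases.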

## 2026-02-11

### 🆕 New Scripts

- Draw.io ([#11788](https://github.com/community-scripts/ProxmoxVE/pull/11788))

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - dispatcharr: include port 9191 in success-message [@MickLesk](https://github.com/MickLesk) ([#11808](https://github.com/community-scripts/ProxmoxVE/pull/11808))
  - fix: make donetick 0.1.71 compatible [@tomfrenzel](https://github.com/tomfrenzel) ([#11804](https://github.com/community-scripts/ProxmoxVE/pull/11804))
  - Kasm: Support new version URL format without hash suffix [@MickLesk](https://github.com/MickLesk) ([#11787](https://github.com/community-scripts/ProxmoxVE/pull/11787))
  - LibreTranslate: Remove Torch [@tremor021](https://github.com/tremor021) ([#11783](https://github.com/community-scripts/ProxmoxVE/pull/11783))
  - Snowshare: fix update script [@TuroYT](https://github.com/TuroYT) ([#11726](https://github.com/community-scripts/ProxmoxVE/pull/11726))

- #### ✨ New Features

  - [Feature] OpenCloud: support PosixFS Collaborative Mode [@vhsdream](https://github.com/vhsdream) ([#11806](https://github.com/community-scripts/ProxmoxVE/pull/11806))

### 💾 Core

- #### 🔧 Refactor

  - core: respect EDITOR variable for config editing [@ls-root](https://github.com/ls-root) ([#11693](https://github.com/community-scripts/ProxmoxVE/pull/11693))

### 📚 Documentation

- Fix formatting in kutt.json notes section [@tiagodenoronha](https://github.com/tiagodenoronha) ([#11774](https://github.com/community-scripts/ProxmoxVE/pull/11774))
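Respecting the user's `EDITOR` variable, as in the core refactor above, is typically a one-line fallback expansion. A sketch of the pattern (illustrative; the `nano` default is an assumption, not necessarily what the repo chose):

```shell
# Open a config file in the user's preferred editor, falling back to nano.
edit_config() {
  local file="$1"
  "${EDITOR:-nano}" "$file"
}

# 'true' stands in for a real editor here, so the call returns immediately.
EDITOR=true edit_config /tmp/example.conf
```

The `${EDITOR:-nano}` form also covers the case where `EDITOR` is set but empty.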

## 2026-02-10

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - Immich: Pin version to 2.5.6 [@vhsdream](https://github.com/vhsdream) ([#11775](https://github.com/community-scripts/ProxmoxVE/pull/11775))
  - Libretranslate: Fix setuptools [@tremor021](https://github.com/tremor021) ([#11772](https://github.com/community-scripts/ProxmoxVE/pull/11772))
  - Element Synapse: prevent systemd invoke failure during apt install [@MickLesk](https://github.com/MickLesk) ([#11758](https://github.com/community-scripts/ProxmoxVE/pull/11758))

- #### ✨ New Features

  - Refactor: Slskd & Soularr [@vhsdream](https://github.com/vhsdream) ([#11674](https://github.com/community-scripts/ProxmoxVE/pull/11674))

### 🗑️ Deleted Scripts

- move paperless-exporter from LXC to addon ([#11737](https://github.com/community-scripts/ProxmoxVE/pull/11737))

### 🧰 Tools

- #### 🐞 Bug Fixes

  - feat: improve storage parsing & add guestname [@carlosmaroot](https://github.com/carlosmaroot) ([#11752](https://github.com/community-scripts/ProxmoxVE/pull/11752))

### 📂 Github

- Github-Version Workflow: include addon scripts in extraction [@MickLesk](https://github.com/MickLesk) ([#11757](https://github.com/community-scripts/ProxmoxVE/pull/11757))

### 🌐 Website

- #### 📝 Script Information

  - Snowshare: fix typo in config file path on website [@BirdMakingStuff](https://github.com/BirdMakingStuff) ([#11754](https://github.com/community-scripts/ProxmoxVE/pull/11754))
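The storage-parsing improvement above deals with Proxmox volume specs of the form `storage:volume,key=value,...`, as found in `rootfs:`/`mp:` lines of a container config. A sketch of splitting such a spec with bash parameter expansion (illustrative only; the helper name is an assumption):

```shell
# Split a Proxmox rootfs/mp spec like "local-lvm:vm-100-disk-0,size=8G"
# into storage name, volume ID, and size.
parse_storage_spec() {
  local spec="$1"
  local volume="${spec%%,*}"     # strip trailing options -> "local-lvm:vm-100-disk-0"
  local storage="${volume%%:*}"  # part before ':'       -> "local-lvm"
  local size=""
  [[ "$spec" == *,size=* ]] && { size="${spec#*,size=}"; size="${size%%,*}"; }
  printf '%s %s %s\n' "$storage" "$volume" "${size:-unknown}"
}

parse_storage_spec "local-lvm:vm-100-disk-0,size=8G"   # prints: local-lvm local-lvm:vm-100-disk-0 8G
```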

## 2026-02-09

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - several scripts: add --clear to uv venv calls for uv 0.10 compatibility [@MickLesk](https://github.com/MickLesk) ([#11723](https://github.com/community-scripts/ProxmoxVE/pull/11723))
  - Koillection: ensure setup_composer is in update script [@MickLesk](https://github.com/MickLesk) ([#11734](https://github.com/community-scripts/ProxmoxVE/pull/11734))
  - PeaNUT: symlink server.js after update [@vhsdream](https://github.com/vhsdream) ([#11696](https://github.com/community-scripts/ProxmoxVE/pull/11696))
  - Umlautadaptarr: use release appsettings.json instead of hardcoded copy [@MickLesk](https://github.com/MickLesk) ([#11725](https://github.com/community-scripts/ProxmoxVE/pull/11725))
  - tracearr: prepare for next stable release [@durzo](https://github.com/durzo) ([#11673](https://github.com/community-scripts/ProxmoxVE/pull/11673))

- #### ✨ New Features

  - remove whiptail from update scripts for unattended update support [@MickLesk](https://github.com/MickLesk) ([#11712](https://github.com/community-scripts/ProxmoxVE/pull/11712))

- #### 🔧 Refactor

  - Refactor: FileFlows [@tremor021](https://github.com/tremor021) ([#11108](https://github.com/community-scripts/ProxmoxVE/pull/11108))
  - Refactor: wger [@MickLesk](https://github.com/MickLesk) ([#11722](https://github.com/community-scripts/ProxmoxVE/pull/11722))
  - Nginx-UI: better User Handling | ACME [@MickLesk](https://github.com/MickLesk) ([#11715](https://github.com/community-scripts/ProxmoxVE/pull/11715))
  - NginxProxymanager: use better-sqlite3 [@MickLesk](https://github.com/MickLesk) ([#11708](https://github.com/community-scripts/ProxmoxVE/pull/11708))

### 💾 Core

- #### 🔧 Refactor

  - hwaccel: add libmfx-gen1.2 to Intel Arc setup for QSV support [@MickLesk](https://github.com/MickLesk) ([#11707](https://github.com/community-scripts/ProxmoxVE/pull/11707))

### 🧰 Tools

- #### 🐞 Bug Fixes

  - addons: ensure curl is installed before use [@MickLesk](https://github.com/MickLesk) ([#11718](https://github.com/community-scripts/ProxmoxVE/pull/11718))
  - Netbird (addon): add systemd ordering to start after Docker [@MickLesk](https://github.com/MickLesk) ([#11716](https://github.com/community-scripts/ProxmoxVE/pull/11716))

### ❔ Uncategorized

- Bichon: Update website [@tremor021](https://github.com/tremor021) ([#11711](https://github.com/community-scripts/ProxmoxVE/pull/11711))
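Dropping whiptail from update scripts, as noted above, usually means replacing an interactive dialog with an environment-driven default. A minimal sketch of the pattern (illustrative; the variable name `UNATTENDED` and the prompt wording are assumptions, not the repo's actual mechanism):

```shell
# Ask the user only when a prompt makes sense; otherwise take the default.
confirm_update() {
  if [[ "${UNATTENDED:-no}" == "yes" ]]; then
    return 0                           # unattended: always proceed
  fi
  read -r -p "Update now? [y/N] " answer
  [[ "$answer" =~ ^[Yy]$ ]]
}

UNATTENDED=yes confirm_update && echo "proceeding with update"
```

Because the fallback is a plain `read`, the script still works in terminals without whiptail, and cron or CI runs can set the variable to skip all prompts.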

## 2026-02-08

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - feat(healthchecks): add sendalerts service [@Mika56](https://github.com/Mika56) ([#11694](https://github.com/community-scripts/ProxmoxVE/pull/11694))
  - ComfyUI: Dynamic Fetch PyTorch Versions [@MickLesk](https://github.com/MickLesk) ([#11657](https://github.com/community-scripts/ProxmoxVE/pull/11657))

- #### 💥 Breaking Changes

  - Semaphore: switch from Debian to Ubuntu 24.04 [@MickLesk](https://github.com/MickLesk) ([#11670](https://github.com/community-scripts/ProxmoxVE/pull/11670))

## 2026-02-07

### 🆕 New Scripts
**`.github/workflows/update-versions-github.yml`** (generated, vendored; 12 changed lines)
```diff
@@ -89,15 +89,9 @@ jobs:
           slug=$(jq -r '.slug // empty' "$json_file" 2>/dev/null)
           [[ -z "$slug" ]] && continue
 
-          # Find corresponding script (install script or addon script)
-          install_script=""
-          if [[ -f "install/${slug}-install.sh" ]]; then
-            install_script="install/${slug}-install.sh"
-          elif [[ -f "tools/addon/${slug}.sh" ]]; then
-            install_script="tools/addon/${slug}.sh"
-          else
-            continue
-          fi
+          # Find corresponding install script
+          install_script="install/${slug}-install.sh"
+          [[ ! -f "$install_script" ]] && continue
 
           # Look for fetch_and_deploy_gh_release calls
           # Pattern: fetch_and_deploy_gh_release "app" "owner/repo" ["mode"] ["version"]
```
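The pattern comment in the hunk above, `fetch_and_deploy_gh_release "app" "owner/repo" ["mode"] ["version"]`, implies a text-extraction step over each install script. A sketch of how such a call could be pulled out (illustrative only; the workflow's actual regex is not shown in this diff):

```shell
# Extract the "owner/repo" argument from the first
# fetch_and_deploy_gh_release call found in a script.
extract_repo() {
  local script="$1"
  sed -n 's/.*fetch_and_deploy_gh_release "[^"]*" "\([^"]*\)".*/\1/p' "$script" | head -n1
}

sample=$(mktemp)
cat <<'EOF' > "$sample"
fetch_and_deploy_gh_release "myapp" "owner/myrepo" "tarball" "latest"
EOF
extract_repo "$sample"   # prints: owner/myrepo
rm -f "$sample"
```

Because both `mode` and `version` are optional in the pattern, matching only the first two quoted arguments keeps the extraction robust across call variants.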
**`CHANGELOG.md`** (342 changed lines)
```diff
@@ -15,9 +15,6 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 
 
 
-
-
-
 <details>
 <summary><h2>📜 History</h2></summary>
 
```
```diff
@@ -27,7 +24,7 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 
 <details>
-<summary><h4>February (14 entries)</h4></summary>
+<summary><h4>February (7 entries)</h4></summary>
 
 [View February 2026 Changelog](.github/changelogs/2026/02.md)
 
```
```diff
@@ -404,180 +401,14 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 </details>
@@ -585,8 +416,6 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
@@ -596,13 +425,6 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
```

These three hunks drop the 2026-02-08 through 2026-02-14 entries from the top-level CHANGELOG.md; the removed lines duplicate, verbatim, the `.github/changelogs/2026/02.md` content shown earlier.
@@ -1355,4 +1177,162 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
|
|||||||
|
|
||||||
### ❔ Uncategorized
|
### ❔ Uncategorized
|
||||||
|
|
||||||
- qui: fix: category [@CrazyWolf13](https://github.com/CrazyWolf13) ([#10847](https://github.com/community-scripts/ProxmoxVE/pull/10847))
|
- qui: fix: category [@CrazyWolf13](https://github.com/CrazyWolf13) ([#10847](https://github.com/community-scripts/ProxmoxVE/pull/10847))
|
||||||
|
|
||||||
|
## 2026-01-15
|
||||||
|
|
||||||
|
### 🆕 New Scripts
|
||||||
|
|
||||||
|
- Qui ([#10829](https://github.com/community-scripts/ProxmoxVE/pull/10829))
|
||||||
|
|
||||||
|
### 🚀 Updated Scripts
|
||||||
|
|
||||||
|
- #### ✨ New Features
|
||||||
|
|
||||||
|
- Refactor: FreshRSS + Bump to Debian 13 [@MickLesk](https://github.com/MickLesk) ([#10824](https://github.com/community-scripts/ProxmoxVE/pull/10824))
|
||||||
|
|
||||||
|
## 2026-01-14

### 🆕 New Scripts

- Kutt ([#10812](https://github.com/community-scripts/ProxmoxVE/pull/10812))

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - Switch Ollama install to .tar.zst and add zstd dependency [@MickLesk](https://github.com/MickLesk) ([#10814](https://github.com/community-scripts/ProxmoxVE/pull/10814))
  - Immich: Install libde265-dev from Debian Testing [@vhsdream](https://github.com/vhsdream) ([#10810](https://github.com/community-scripts/ProxmoxVE/pull/10810))
  - nginxproxymanager: allow updates now the build is fixed [@durzo](https://github.com/durzo) ([#10796](https://github.com/community-scripts/ProxmoxVE/pull/10796))
  - Fixed Apache Guacamole installer [@horvatbenjamin](https://github.com/horvatbenjamin) ([#10798](https://github.com/community-scripts/ProxmoxVE/pull/10798))

### 💾 Core

- #### ✨ New Features

  - core: Improve NVIDIA GPU setup (5000x Series) [@MickLesk](https://github.com/MickLesk) ([#10807](https://github.com/community-scripts/ProxmoxVE/pull/10807))

### 🧰 Tools

- Fix whiptail dialog hanging in Proxmox web console [@comk22](https://github.com/comk22) ([#10794](https://github.com/community-scripts/ProxmoxVE/pull/10794))

### 🌐 Website

- #### 🐞 Bug Fixes

  - Add search filtering to CommandDialog for improved script search functionality [@BramSuurdje](https://github.com/BramSuurdje) ([#10800](https://github.com/community-scripts/ProxmoxVE/pull/10800))
## 2026-01-13

### 🆕 New Scripts

- Investbrain ([#10774](https://github.com/community-scripts/ProxmoxVE/pull/10774))
- Fladder ([#10768](https://github.com/community-scripts/ProxmoxVE/pull/10768))

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - Immich: Fix Intel version check; install legacy Intel packages during new install [@vhsdream](https://github.com/vhsdream) ([#10787](https://github.com/community-scripts/ProxmoxVE/pull/10787))
  - Openwrt: Remove default VLAN for LAN [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#10782](https://github.com/community-scripts/ProxmoxVE/pull/10782))
  - Refactor: Joplin Server [@tremor021](https://github.com/tremor021) ([#10769](https://github.com/community-scripts/ProxmoxVE/pull/10769))
  - Fix Zammad nginx configuration causing installation failure [@Copilot](https://github.com/Copilot) ([#10757](https://github.com/community-scripts/ProxmoxVE/pull/10757))

- #### 🔧 Refactor

  - Backrest: Bump to Trixie [@tremor021](https://github.com/tremor021) ([#10758](https://github.com/community-scripts/ProxmoxVE/pull/10758))
  - Refactor: Caddy [@tremor021](https://github.com/tremor021) ([#10759](https://github.com/community-scripts/ProxmoxVE/pull/10759))
  - Refactor: Leantime [@tremor021](https://github.com/tremor021) ([#10760](https://github.com/community-scripts/ProxmoxVE/pull/10760))

### 🧰 Tools

- #### 🐞 Bug Fixes

  - update_lxcs.sh: Add the option to skip stopped LXC [@michelroegl-brunner](https://github.com/michelroegl-brunner) ([#10783](https://github.com/community-scripts/ProxmoxVE/pull/10783))
## 2026-01-12

### 🆕 New Scripts

- Jellystat ([#10628](https://github.com/community-scripts/ProxmoxVE/pull/10628))

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - InfluxDB: fix if / fi [@chrnie](https://github.com/chrnie) ([#10753](https://github.com/community-scripts/ProxmoxVE/pull/10753))
  - Cockpit: Downgrade to Debian 12 Bookworm (45Drives Issue) [@MickLesk](https://github.com/MickLesk) ([#10717](https://github.com/community-scripts/ProxmoxVE/pull/10717))

- #### ✨ New Features

  - InfluxDB: add setup for InfluxDB v3 [@victorlap](https://github.com/victorlap) ([#10736](https://github.com/community-scripts/ProxmoxVE/pull/10736))
  - Apache Guacamole: add schema upgrades and extension updates [@MickLesk](https://github.com/MickLesk) ([#10746](https://github.com/community-scripts/ProxmoxVE/pull/10746))
  - Apache Tomcat: update support and refactor install script + Debian 13 [@MickLesk](https://github.com/MickLesk) ([#10739](https://github.com/community-scripts/ProxmoxVE/pull/10739))
  - Apache Guacamole: Function Bump + update_script [@MickLesk](https://github.com/MickLesk) ([#10728](https://github.com/community-scripts/ProxmoxVE/pull/10728))
  - Apache CouchDB: bump to Debian 13 and add update support [@MickLesk](https://github.com/MickLesk) ([#10721](https://github.com/community-scripts/ProxmoxVE/pull/10721))
  - Apache Cassandra: bump to Debian 13 and add update support [@MickLesk](https://github.com/MickLesk) ([#10720](https://github.com/community-scripts/ProxmoxVE/pull/10720))

- #### 🔧 Refactor

  - Refactor: Booklore [@MickLesk](https://github.com/MickLesk) ([#10742](https://github.com/community-scripts/ProxmoxVE/pull/10742))
  - Bump Argus to Debian 13 [@MickLesk](https://github.com/MickLesk) ([#10718](https://github.com/community-scripts/ProxmoxVE/pull/10718))
  - Refactor Docker/Dockge & Bump to Debian 13 [@MickLesk](https://github.com/MickLesk) ([#10719](https://github.com/community-scripts/ProxmoxVE/pull/10719))

### 💾 Core

- #### 🐞 Bug Fixes

  - core: remove duplicated pve_version in advanced installs [@MickLesk](https://github.com/MickLesk) ([#10743](https://github.com/community-scripts/ProxmoxVE/pull/10743))

- #### ✨ New Features

  - core: add storage validation & fix GB/MB display [@MickLesk](https://github.com/MickLesk) ([#10745](https://github.com/community-scripts/ProxmoxVE/pull/10745))
  - core: validate container ID before pct create to prevent failures [@MickLesk](https://github.com/MickLesk) ([#10729](https://github.com/community-scripts/ProxmoxVE/pull/10729))

- #### 🔧 Refactor

  - Enforce non-interactive apt mode in DB setup scripts [@MickLesk](https://github.com/MickLesk) ([#10714](https://github.com/community-scripts/ProxmoxVE/pull/10714))
## 2026-01-11

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - Fix Invoice Ninja Error 500 by restoring file ownership after artisan commands [@Copilot](https://github.com/Copilot) ([#10709](https://github.com/community-scripts/ProxmoxVE/pull/10709))

- #### 🔧 Refactor

  - Refactor: Infisical [@tremor021](https://github.com/tremor021) ([#10693](https://github.com/community-scripts/ProxmoxVE/pull/10693))
  - Refactor: HortusFox [@tremor021](https://github.com/tremor021) ([#10697](https://github.com/community-scripts/ProxmoxVE/pull/10697))
  - Refactor: Homer [@tremor021](https://github.com/tremor021) ([#10698](https://github.com/community-scripts/ProxmoxVE/pull/10698))
## 2026-01-10

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - [Endurain] Increase default RAM from 2048 to 4096 [@FutureCow](https://github.com/FutureCow) ([#10690](https://github.com/community-scripts/ProxmoxVE/pull/10690))

### 💾 Core

- #### 🐞 Bug Fixes

  - tools.func: hwaccel - make beignet-opencl-icd optional for legacy Intel GPUs [@MickLesk](https://github.com/MickLesk) ([#10677](https://github.com/community-scripts/ProxmoxVE/pull/10677))
## 2026-01-09

### 🚀 Updated Scripts

- #### 🐞 Bug Fixes

  - Jenkins: Fix application repository setup [@tremor021](https://github.com/tremor021) ([#10671](https://github.com/community-scripts/ProxmoxVE/pull/10671))
  - deCONZ: Fix sources check in update script [@tremor021](https://github.com/tremor021) ([#10664](https://github.com/community-scripts/ProxmoxVE/pull/10664))
  - Remove '--cpu' option from ExecStart command [@sethgregory](https://github.com/sethgregory) ([#10659](https://github.com/community-scripts/ProxmoxVE/pull/10659))

### 💾 Core

- #### 💥 Breaking Changes

  - fix: setup_mariadb hangs on [@CrazyWolf13](https://github.com/CrazyWolf13) ([#10672](https://github.com/community-scripts/ProxmoxVE/pull/10672))
api/.env.example (new file, 5 lines)

MONGO_USER=
MONGO_PASSWORD=
MONGO_IP=
MONGO_PORT=
MONGO_DATABASE=
api/go.mod (new file, 23 lines)

module proxmox-api

go 1.24.0

require (
    github.com/gorilla/mux v1.8.1
    github.com/joho/godotenv v1.5.1
    github.com/rs/cors v1.11.1
    go.mongodb.org/mongo-driver v1.17.2
)

require (
    github.com/golang/snappy v0.0.4 // indirect
    github.com/klauspost/compress v1.16.7 // indirect
    github.com/montanaflynn/stats v0.7.1 // indirect
    github.com/xdg-go/pbkdf2 v1.0.0 // indirect
    github.com/xdg-go/scram v1.1.2 // indirect
    github.com/xdg-go/stringprep v1.0.4 // indirect
    github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78 // indirect
    golang.org/x/crypto v0.45.0 // indirect
    golang.org/x/sync v0.18.0 // indirect
    golang.org/x/text v0.31.0 // indirect
)
api/go.sum (new file, 56 lines)

github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/golang/snappy v0.0.4 h1:yAGX7huGHXlcLOEtBnF4w7FQwA26wojNCwOYAEhLjQM=
github.com/golang/snappy v0.0.4/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=
github.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=
github.com/joho/godotenv v1.5.1 h1:7eLL/+HRGLY0ldzfGMeQkb7vMd0as4CfYvUVzLqw0N0=
github.com/joho/godotenv v1.5.1/go.mod h1:f4LDr5Voq0i2e/R5DDNOoa2zzDfwtkZa6DnEwAbqwq4=
github.com/klauspost/compress v1.16.7 h1:2mk3MPGNzKyxErAw8YaohYh69+pa4sIQSC0fPGCFR9I=
github.com/klauspost/compress v1.16.7/go.mod h1:ntbaceVETuRiXiv4DpjP66DpAtAGkEQskQzEyD//IeE=
github.com/montanaflynn/stats v0.7.1 h1:etflOAAHORrCC44V+aR6Ftzort912ZU+YLiSTuV8eaE=
github.com/montanaflynn/stats v0.7.1/go.mod h1:etXPPgVO6n31NxCd9KQUMvCM+ve0ruNzt6R8Bnaayow=
github.com/rs/cors v1.11.1 h1:eU3gRzXLRK57F5rKMGMZURNdIG4EoAmX8k94r9wXWHA=
github.com/rs/cors v1.11.1/go.mod h1:XyqrcTp5zjWr1wsJ8PIRZssZ8b/WMcMf71DJnit4EMU=
github.com/xdg-go/pbkdf2 v1.0.0 h1:Su7DPu48wXMwC3bs7MCNG+z4FhcyEuz5dlvchbq0B0c=
github.com/xdg-go/pbkdf2 v1.0.0/go.mod h1:jrpuAogTd400dnrH08LKmI/xc1MbPOebTwRqcT5RDeI=
github.com/xdg-go/scram v1.1.2 h1:FHX5I5B4i4hKRVRBCFRxq1iQRej7WO3hhBuJf+UUySY=
github.com/xdg-go/scram v1.1.2/go.mod h1:RT/sEzTbU5y00aCK8UOx6R7YryM0iF1N2MOmC3kKLN4=
github.com/xdg-go/stringprep v1.0.4 h1:XLI/Ng3O1Atzq0oBs3TWm+5ZVgkq2aqdlvP9JtoZ6c8=
github.com/xdg-go/stringprep v1.0.4/go.mod h1:mPGuuIYwz7CmR2bT9j4GbQqutWS1zV24gijq1dTyGkM=
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78 h1:ilQV1hzziu+LLM3zUTJ0trRztfwgjqKnBWNtSRkbmwM=
github.com/youmark/pkcs8 v0.0.0-20240726163527-a2c0da244d78/go.mod h1:aL8wCCfTfSfmXjznFBSZNN13rSJjlIOI1fUNAtF7rmI=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
go.mongodb.org/mongo-driver v1.17.2 h1:gvZyk8352qSfzyZ2UMWcpDpMSGEr1eqE4T793SqyhzM=
go.mongodb.org/mongo-driver v1.17.2/go.mod h1:Hy04i7O2kC4RS06ZrhPRqj/u4DTYkFDAAccj+rVKqgQ=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.3.8/go.mod h1:E6s5w1FMmriuDzIBO73fBruAKo1PCIq6d2Q6DHfQ8WQ=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
api/main.go (new file, 450 lines)

// Copyright (c) 2021-2026 community-scripts ORG
// Author: Michel Roegl-Brunner (michelroegl-brunner)
// License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "log"
    "net/http"
    "os"
    "strconv"
    "time"

    "github.com/gorilla/mux"
    "github.com/joho/godotenv"
    "github.com/rs/cors"
    "go.mongodb.org/mongo-driver/bson"
    "go.mongodb.org/mongo-driver/bson/primitive"
    "go.mongodb.org/mongo-driver/mongo"
    "go.mongodb.org/mongo-driver/mongo/options"
)

var client *mongo.Client
var collection *mongo.Collection

func loadEnv() {
    if err := godotenv.Load(); err != nil {
        log.Fatal("Error loading .env file")
    }
}

// DataModel represents a single document in MongoDB
type DataModel struct {
    ID         primitive.ObjectID `json:"id" bson:"_id,omitempty"`
    CT_TYPE    uint               `json:"ct_type" bson:"ct_type"`
    DISK_SIZE  float32            `json:"disk_size" bson:"disk_size"`
    CORE_COUNT uint               `json:"core_count" bson:"core_count"`
    RAM_SIZE   uint               `json:"ram_size" bson:"ram_size"`
    OS_TYPE    string             `json:"os_type" bson:"os_type"`
    OS_VERSION string             `json:"os_version" bson:"os_version"`
    DISABLEIP6 string             `json:"disableip6" bson:"disableip6"`
    NSAPP      string             `json:"nsapp" bson:"nsapp"`
    METHOD     string             `json:"method" bson:"method"`
    CreatedAt  time.Time          `json:"created_at" bson:"created_at"`
    PVEVERSION string             `json:"pve_version" bson:"pve_version"`
    STATUS     string             `json:"status" bson:"status"`
    RANDOM_ID  string             `json:"random_id" bson:"random_id"`
    TYPE       string             `json:"type" bson:"type"`
    ERROR      string             `json:"error" bson:"error"`
}

type StatusModel struct {
    RANDOM_ID string `json:"random_id" bson:"random_id"`
    ERROR     string `json:"error" bson:"error"`
    STATUS    string `json:"status" bson:"status"`
}

type CountResponse struct {
    TotalEntries int64            `json:"total_entries"`
    StatusCount  map[string]int64 `json:"status_count"`
    NSAPPCount   map[string]int64 `json:"nsapp_count"`
}

// ConnectDatabase initializes the MongoDB connection
func ConnectDatabase() {
    loadEnv()

    mongoURI := fmt.Sprintf("mongodb://%s:%s@%s:%s",
        os.Getenv("MONGO_USER"),
        os.Getenv("MONGO_PASSWORD"),
        os.Getenv("MONGO_IP"),
        os.Getenv("MONGO_PORT"))

    database := os.Getenv("MONGO_DATABASE")
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    var err error
    client, err = mongo.Connect(ctx, options.Client().ApplyURI(mongoURI))
    if err != nil {
        log.Fatal("Failed to connect to MongoDB!", err)
    }
    collection = client.Database(database).Collection("data_models")
    fmt.Println("Connected to MongoDB at", os.Getenv("MONGO_IP"))
}

// UploadJSON handles API requests and stores data as a document in MongoDB
func UploadJSON(w http.ResponseWriter, r *http.Request) {
    var input DataModel

    if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    input.CreatedAt = time.Now()

    _, err := collection.InsertOne(context.Background(), input)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    log.Println("Received data:", input)
    w.WriteHeader(http.StatusCreated)
    json.NewEncoder(w).Encode(map[string]string{"message": "Data saved successfully"})
}

// UpdateStatus updates the status of a record based on RANDOM_ID
func UpdateStatus(w http.ResponseWriter, r *http.Request) {
    var input StatusModel

    if err := json.NewDecoder(r.Body).Decode(&input); err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }

    filter := bson.M{"random_id": input.RANDOM_ID}
    update := bson.M{"$set": bson.M{"status": input.STATUS, "error": input.ERROR}}

    _, err := collection.UpdateOne(context.Background(), filter, update)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    log.Println("Updated data:", input)
    w.WriteHeader(http.StatusOK)
    json.NewEncoder(w).Encode(map[string]string{"message": "Record updated successfully"})
}

// GetDataJSON fetches all data from MongoDB
func GetDataJSON(w http.ResponseWriter, r *http.Request) {
    var records []DataModel
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := collection.Find(ctx, bson.M{})
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer cursor.Close(ctx)

    for cursor.Next(ctx) {
        var record DataModel
        if err := cursor.Decode(&record); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        records = append(records, record)
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(records)
}

func GetPaginatedData(w http.ResponseWriter, r *http.Request) {
    page, _ := strconv.Atoi(r.URL.Query().Get("page"))
    limit, _ := strconv.Atoi(r.URL.Query().Get("limit"))
    if page < 1 {
        page = 1
    }
    if limit < 1 {
        limit = 10
    }
    skip := (page - 1) * limit
    var records []DataModel
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    findOpts := options.Find().SetSkip(int64(skip)).SetLimit(int64(limit))
    cursor, err := collection.Find(ctx, bson.M{}, findOpts)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer cursor.Close(ctx)

    for cursor.Next(ctx) {
        var record DataModel
        if err := cursor.Decode(&record); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        records = append(records, record)
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(records)
}

func GetSummary(w http.ResponseWriter, r *http.Request) {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    totalCount, err := collection.CountDocuments(ctx, bson.M{})
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    statusCount := make(map[string]int64)
    nsappCount := make(map[string]int64)

    pipeline := []bson.M{
        {"$group": bson.M{"_id": "$status", "count": bson.M{"$sum": 1}}},
    }
    cursor, err := collection.Aggregate(ctx, pipeline)
    if err == nil {
        for cursor.Next(ctx) {
            var result struct {
                ID    string `bson:"_id"`
                Count int64  `bson:"count"`
            }
            if err := cursor.Decode(&result); err == nil {
                statusCount[result.ID] = result.Count
            }
        }
        cursor.Close(ctx)
    }

    pipeline = []bson.M{
        {"$group": bson.M{"_id": "$nsapp", "count": bson.M{"$sum": 1}}},
    }
    cursor, err = collection.Aggregate(ctx, pipeline)
    if err == nil {
        for cursor.Next(ctx) {
            var result struct {
                ID    string `bson:"_id"`
                Count int64  `bson:"count"`
            }
            if err := cursor.Decode(&result); err == nil {
                nsappCount[result.ID] = result.Count
            }
        }
        cursor.Close(ctx)
    }

    response := CountResponse{
        TotalEntries: totalCount,
        StatusCount:  statusCount,
        NSAPPCount:   nsappCount,
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(response)
}

func GetByNsapp(w http.ResponseWriter, r *http.Request) {
    nsapp := r.URL.Query().Get("nsapp")
    var records []DataModel
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := collection.Find(ctx, bson.M{"nsapp": nsapp})
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer cursor.Close(ctx)

    for cursor.Next(ctx) {
        var record DataModel
        if err := cursor.Decode(&record); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        records = append(records, record)
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(records)
}

func GetByDateRange(w http.ResponseWriter, r *http.Request) {
    startDate := r.URL.Query().Get("start_date")
    endDate := r.URL.Query().Get("end_date")

    if startDate == "" || endDate == "" {
        http.Error(w, "Both start_date and end_date are required", http.StatusBadRequest)
        return
    }

    start, err := time.Parse("2006-01-02T15:04:05.999999+00:00", startDate+"T00:00:00+00:00")
    if err != nil {
        http.Error(w, "Invalid start_date format", http.StatusBadRequest)
        return
    }

    end, err := time.Parse("2006-01-02T15:04:05.999999+00:00", endDate+"T23:59:59+00:00")
    if err != nil {
        http.Error(w, "Invalid end_date format", http.StatusBadRequest)
        return
    }

    var records []DataModel
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := collection.Find(ctx, bson.M{
        "created_at": bson.M{
            "$gte": start,
            "$lte": end,
        },
    })
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer cursor.Close(ctx)

    for cursor.Next(ctx) {
        var record DataModel
        if err := cursor.Decode(&record); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        records = append(records, record)
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(records)
}

func GetByStatus(w http.ResponseWriter, r *http.Request) {
    status := r.URL.Query().Get("status")
    var records []DataModel
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := collection.Find(ctx, bson.M{"status": status})
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer cursor.Close(ctx)

    for cursor.Next(ctx) {
        var record DataModel
        if err := cursor.Decode(&record); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        records = append(records, record)
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(records)
}

func GetByOS(w http.ResponseWriter, r *http.Request) {
    osType := r.URL.Query().Get("os_type")
    osVersion := r.URL.Query().Get("os_version")
    var records []DataModel
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := collection.Find(ctx, bson.M{"os_type": osType, "os_version": osVersion})
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer cursor.Close(ctx)

    for cursor.Next(ctx) {
        var record DataModel
        if err := cursor.Decode(&record); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        records = append(records, record)
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(records)
}

func GetErrors(w http.ResponseWriter, r *http.Request) {
    errorCount := make(map[string]int)

    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := collection.Find(ctx, bson.M{"error": bson.M{"$ne": ""}})
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer cursor.Close(ctx)

    for cursor.Next(ctx) {
        var record DataModel
        if err := cursor.Decode(&record); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }

        if record.ERROR != "" {
            errorCount[record.ERROR]++
        }
    }

    type ErrorCountResponse struct {
        Error string `json:"error"`
        Count int    `json:"count"`
    }

    var errorCounts []ErrorCountResponse
    for msg, count := range errorCount {
        errorCounts = append(errorCounts, ErrorCountResponse{
            Error: msg,
            Count: count,
        })
    }

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(struct {
        ErrorCounts []ErrorCountResponse `json:"error_counts"`
    }{
        ErrorCounts: errorCounts,
    })
}

func main() {
    ConnectDatabase()

    router := mux.NewRouter()
    router.HandleFunc("/upload", UploadJSON).Methods("POST")
    router.HandleFunc("/upload/updatestatus", UpdateStatus).Methods("POST")
    router.HandleFunc("/data/json", GetDataJSON).Methods("GET")
    router.HandleFunc("/data/paginated", GetPaginatedData).Methods("GET")
    router.HandleFunc("/data/summary", GetSummary).Methods("GET")
    router.HandleFunc("/data/nsapp", GetByNsapp).Methods("GET")
    router.HandleFunc("/data/date", GetByDateRange).Methods("GET")
    router.HandleFunc("/data/status", GetByStatus).Methods("GET")
    router.HandleFunc("/data/os", GetByOS).Methods("GET")
    router.HandleFunc("/data/errors", GetErrors).Methods("GET")

    c := cors.New(cors.Options{
        AllowedOrigins:   []string{"*"},
        AllowedMethods:   []string{"GET", "POST"},
        AllowedHeaders:   []string{"Content-Type", "Authorization"},
        AllowCredentials: true,
    })

    handler := c.Handler(router)

    fmt.Println("Server running on port 8080")
    log.Fatal(http.ListenAndServe(":8080", handler))
}
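`GetPaginatedData` above clamps its `page`/`limit` query parameters and converts them to a Mongo skip offset. The arithmetic in isolation (the function name `pageWindow` is a hypothetical extraction for illustration; the clamping logic is copied from the handler):

```go
package main

import "fmt"

// pageWindow reproduces the page/limit handling in GetPaginatedData:
// page < 1 falls back to 1, limit < 1 falls back to 10, and the
// Mongo skip offset is (page-1)*limit.
func pageWindow(page, limit int) (skip, lim int) {
	if page < 1 {
		page = 1
	}
	if limit < 1 {
		limit = 10
	}
	return (page - 1) * limit, limit
}

func main() {
	skip, lim := pageWindow(3, 25)
	fmt.Println(skip, lim) // third page of 25 entries starts at offset 50
}
```

Because unparsable query values decode to 0 via `strconv.Atoi`'s ignored error, a request with no parameters at all still yields a sane first page of ten records.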
@@ -51,7 +51,7 @@ function update_script() {
 cp -r /opt/adventurelog-backup/backend/server/media /opt/adventurelog/backend/server/media
 cd /opt/adventurelog/backend/server
 if [[ ! -x .venv/bin/python ]]; then
-$STD uv venv --clear .venv
+$STD uv venv .venv
 $STD .venv/bin/python -m ensurepip --upgrade
 fi
 $STD .venv/bin/python -m pip install --upgrade pip
@@ -9,7 +9,7 @@ APP="Alpine-Grafana"
 var_tags="${var_tags:-alpine;monitoring}"
 var_cpu="${var_cpu:-1}"
 var_ram="${var_ram:-256}"
-var_disk="${var_disk:-2}"
+var_disk="${var_disk:-1}"
 var_os="${var_os:-alpine}"
 var_version="${var_version:-3.23}"
 var_unprivileged="${var_unprivileged:-1}"
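The `var_*="${var_*:-…}"` lines above rely on bash default-value expansion, so an override exported before the script runs wins over the built-in default. A small sketch of the pattern:

```shell
# ${var:-default}: use the existing value if set, otherwise the default,
# mirroring the var_disk="${var_disk:-2}" pattern in the scripts above
unset var_disk
disk="${var_disk:-2}"
echo "$disk"   # prints 2

var_disk=8     # a user-provided override, as the environment would supply
disk="${var_disk:-2}"
echo "$disk"   # prints 8
```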
@@ -44,7 +44,7 @@ function update_script() {
 msg_info "Updating Autocaliweb"
 cd "$INSTALL_DIR"
 if [[ ! -d "$VIRTUAL_ENV" ]]; then
-$STD uv venv --clear "$VIRTUAL_ENV"
+$STD uv venv "$VIRTUAL_ENV"
 fi
 $STD uv sync --all-extras --active
 cd "$INSTALL_DIR"/koreader/plugins
@@ -40,7 +40,7 @@ function update_script() {
 chmod 775 /opt/bazarr /var/lib/bazarr/
 # Always ensure venv exists
 if [[ ! -d /opt/bazarr/venv/ ]]; then
-$STD uv venv --clear /opt/bazarr/venv --python 3.12
+$STD uv venv /opt/bazarr/venv --python 3.12
 fi
 
 # Always check and fix service file if needed
@@ -28,7 +28,6 @@ function update_script() {
 exit
 fi
 msg_info "Updating Deluge"
-ensure_dependencies python3-setuptools
 $STD apt update
 $STD pip3 install deluge[all] --upgrade
 msg_ok "Updated Deluge"
@@ -103,8 +103,8 @@ function update_script() {
 
 cd /opt/dispatcharr
 rm -rf .venv
-$STD uv venv --clear
-$STD uv sync
+$STD uv venv
+$STD uv pip install -r requirements.txt --index-strategy unsafe-best-match
 $STD uv pip install gunicorn gevent celery redis daphne
 msg_ok "Updated Dispatcharr Backend"
 
@@ -144,4 +144,4 @@ description
 msg_ok "Completed successfully!\n"
 echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
 echo -e "${INFO}${YW} Access it using the following URL:${CL}"
-echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:9191${CL}"
+echo -e "${TAB}${GATEWAY}${BGN}http://${IP}${CL}"
@@ -35,15 +35,13 @@ function update_script() {
 msg_ok "Stopped Service"
 
 msg_info "Backing Up Configurations"
-mv /opt/donetick/config/selfhosted.yaml /opt/donetick/donetick.db /opt
+mv /opt/donetick/config/selfhosted.yml /opt/donetick/donetick.db /opt
 msg_ok "Backed Up Configurations"
 
 CLEAN_INSTALL=1 fetch_and_deploy_gh_release "donetick" "donetick/donetick" "prebuild" "latest" "/opt/donetick" "donetick_Linux_x86_64.tar.gz"
 
 msg_info "Restoring Configurations"
-mv /opt/selfhosted.yaml /opt/donetick/config
-grep -q 'http://localhost"$' /opt/donetick/config/selfhosted.yaml || sed -i '/https:\/\/localhost"$/a\ - "http://localhost"' /opt/donetick/config/selfhosted.yaml
-grep -q 'capacitor://localhost' /opt/donetick/config/selfhosted.yaml || sed -i '/http:\/\/localhost"$/a\ - "capacitor://localhost"' /opt/donetick/config/selfhosted.yaml
+mv /opt/selfhosted.yml /opt/donetick/config
 mv /opt/donetick.db /opt/donetick
 msg_ok "Restored Configurations"
 
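The donetick hunk above uses the `grep -q pattern file || sed -i '/anchor/a …' file` idiom to append a config line only when it is missing, which keeps repeated update runs idempotent. A self-contained sketch of the same pattern on a throwaway file (the origin values are illustrative, not donetick's real config):

```shell
# Idempotent append: only insert the line if it is not already present
cfg=$(mktemp)
printf 'origins:\n  - "https://localhost"\n' > "$cfg"

add_origin() {
  grep -qF "\"$1\"" "$cfg" || printf '  - "%s"\n' "$1" >> "$cfg"
}

add_origin "http://localhost"
add_origin "http://localhost"   # second call is a no-op

n=$(grep -cF '"http://localhost"' "$cfg")
echo "$n"   # prints 1
rm -f "$cfg"
```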
ct/drawio.sh (deleted, 58 lines)
@@ -1,58 +0,0 @@
-#!/usr/bin/env bash
-source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
-# Copyright (c) 2021-2026 community-scripts ORG
-# Author: Slaviša Arežina (tremor021)
-# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
-# Source: https://www.drawio.com/
-
-APP="DrawIO"
-var_tags="${var_tags:-diagrams}"
-var_cpu="${var_cpu:-1}"
-var_ram="${var_ram:-2048}"
-var_disk="${var_disk:-4}"
-var_os="${var_os:-debian}"
-var_version="${var_version:-13}"
-var_unprivileged="${var_unprivileged:-1}"
-
-header_info "$APP"
-variables
-color
-catch_errors
-
-function update_script() {
-header_info
-check_container_storage
-check_container_resources
-if [[ ! -f /var/lib/tomcat11/webapps/draw.war ]]; then
-msg_error "No ${APP} Installation Found!"
-exit
-fi
-
-if check_for_gh_release "drawio" "jgraph/drawio"; then
-msg_info "Stopping service"
-systemctl stop tomcat11
-msg_ok "Service stopped"
-
-msg_info "Updating Debian LXC"
-$STD apt update
-$STD apt upgrade -y
-msg_ok "Updated Debian LXC"
-
-USE_ORIGINAL_FILENAME=true fetch_and_deploy_gh_release "drawio" "jgraph/drawio" "singlefile" "latest" "/var/lib/tomcat11/webapps" "draw.war"
-
-msg_info "Starting service"
-systemctl start tomcat11
-msg_ok "Service started"
-msg_ok "Updated successfully!"
-fi
-exit
-}
-
-start
-build_container
-description
-
-msg_ok "Completed Successfully!\n"
-echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
-echo -e "${INFO}${YW} Access it using the following URL:${CL}"
-echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:8080/draw${CL}"
@@ -9,7 +9,7 @@ APP="EMQX"
 var_tags="${var_tags:-mqtt}"
 var_cpu="${var_cpu:-2}"
 var_ram="${var_ram:-1024}"
-var_disk="${var_disk:-6}"
+var_disk="${var_disk:-4}"
 var_os="${var_os:-debian}"
 var_version="${var_version:-13}"
 var_unprivileged="${var_unprivileged:-1}"
@@ -61,7 +61,7 @@ function update_script() {
 msg_info "Updating Backend"
 cd /opt/endurain/backend
 $STD poetry export -f requirements.txt --output requirements.txt --without-hashes
-$STD uv venv --clear
+$STD uv venv
 $STD uv pip install -r requirements.txt
 msg_ok "Backend Updated"
 
@@ -42,7 +42,7 @@ function update_script() {
 rm -rf "$VENV_PATH"
 mkdir -p /opt/esphome
 cd /opt/esphome
-$STD uv venv --clear "$VENV_PATH"
+$STD uv venv "$VENV_PATH"
 $STD "$VENV_PATH/bin/python" -m ensurepip --upgrade
 $STD "$VENV_PATH/bin/python" -m pip install --upgrade pip
 $STD "$VENV_PATH/bin/python" -m pip install esphome tornado esptool
@@ -37,12 +37,17 @@ function update_script() {
 msg_info "Stopped Service"
 
 msg_info "Creating Backup"
-ls /opt/*.tar.gz &>/dev/null && rm -f /opt/*.tar.gz
 backup_filename="/opt/${APP}_backup_$(date +%F).tar.gz"
 tar -czf "$backup_filename" -C /opt/fileflows Data
 msg_ok "Backup Created"
 
-fetch_and_deploy_from_url "https://fileflows.com/downloads/zip" "/opt/fileflows"
+msg_info "Updating $APP to latest version"
+temp_file=$(mktemp)
+curl -fsSL https://fileflows.com/downloads/zip -o "$temp_file"
+$STD unzip -o -d /opt/fileflows "$temp_file"
+rm -rf "$temp_file"
+rm -rf "$backup_filename"
+msg_ok "Updated $APP to latest version"
 
 msg_info "Starting Service"
 systemctl start fileflows
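The FileFlows hunk above snapshots the `Data` directory into a date-stamped tarball before replacing the install. A self-contained sketch of that backup step, using a throwaway directory instead of `/opt/fileflows`:

```shell
# Date-stamped archive, mirroring the FileFlows backup step above
src=$(mktemp -d)
echo "demo" > "$src/data.txt"

backup="$src/demo_backup_$(date +%F).tar.gz"
tar -czf "$backup" -C "$src" data.txt

# The archive lists entries relative to the -C directory
listing=$(tar -tzf "$backup")
echo "$listing"   # prints data.txt
rm -rf "$src"
```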
@@ -9,7 +9,7 @@ APP="Grafana"
 var_tags="${var_tags:-monitoring;visualization}"
 var_cpu="${var_cpu:-1}"
 var_ram="${var_ram:-512}"
-var_disk="${var_disk:-4}"
+var_disk="${var_disk:-2}"
 var_os="${var_os:-debian}"
 var_version="${var_version:-13}"
 var_unprivileged="${var_unprivileged:-1}"
@@ -1,6 +0,0 @@
-    ____                        ________
-   / __ \_________ __      __  / _/ __ \
-  / / / / ___/ __ `/ | /| / / / // / / /
- / /_/ / /  / /_/ /| |/ |/ /_/ // /_/ /
-/_____/_/   \__,_/ |__/|__/___/\____/
-
ct/headers/prometheus-paperless-ngx-exporter (new file, 6 lines)
@@ -0,0 +1,6 @@
+    ____                             __  __                          ____                        __                            _   _________  __      ______                      __
+   / __ \_________  ____ ___  ___  / /_/ /_  ___  __  _______      / __ \____ _____  ___  _____/ /__  __________             / | / / ____/ |/ /     / ____/  ______  ____  _____/ /____  _____
+  / /_/ / ___/ __ \/ __ `__ \/ _ \/ __/ __ \/ _ \/ / / / ___/_____/ /_/ / __ `/ __ \/ _ \/ ___/ / _ \/ ___/ ___/_____/ |/ / / __ |  /_____/ __/ | |/_/ __ \/ __ \/ ___/ __/ _ \/ ___/
+ / ____/ /  / /_/ / / / / / /  __/ /_/ / / /  __/ /_/ (__  )_____/ ____/ /_/ / /_/ /  __/ /  / /  __(__  |__  )_____/ /|  / /_/ / /| /_____/ /____>  </ /_/ / /_/ / /  / /_/  __/ /
+/_/   /_/   \____/_/ /_/ /_/\___/\__/_/ /_/\___/\__,_/____/     /_/    \__,_/ .___/\___/_/  /_/\___/____/____/     /_/ |_/\____//_/|_|   /_____/_/|_/ .___/\____/_/   \__/\___/_/
+                                                                            /_/                                                                    /_/
@@ -105,7 +105,7 @@ EOF
 msg_ok "Image-processing libraries up to date"
 fi
 
-RELEASE="2.5.6"
+RELEASE="2.5.5"
 if check_for_gh_release "Immich" "immich-app/immich" "${RELEASE}"; then
 if [[ $(cat ~/.immich) > "2.5.1" ]]; then
 msg_info "Enabling Maintenance Mode"
@@ -46,7 +46,7 @@ function update_script() {
 msg_info "Restoring configuration & data"
 mv /opt/app.env /opt/jotty/.env
 [[ -d /opt/data ]] && mv /opt/data /opt/jotty/data
-[[ -d /opt/jotty/config ]] && cp -a /opt/config/* /opt/jotty/config && rm -rf /opt/config
+[[ -d /opt/jotty/config ]] && mv /opt/config/* /opt/jotty/config
 msg_ok "Restored configuration & data"
 
 msg_info "Starting Service"
@@ -34,7 +34,7 @@ function update_script() {
 PYTHON_VERSION="3.12" setup_uv
 mkdir -p "$INSTALL_DIR"
 cd "$INSTALL_DIR"
-$STD uv venv --clear .venv
+$STD uv venv .venv
 $STD "$VENV_PYTHON" -m ensurepip --upgrade
 $STD "$VENV_PYTHON" -m pip install --upgrade pip
 $STD "$VENV_PYTHON" -m pip install jupyter
ct/kasm.sh (11 lines changed)
@@ -34,19 +34,10 @@ function update_script() {
 CURRENT_VERSION=$(readlink -f /opt/kasm/current | awk -F'/' '{print $4}')
 KASM_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_[0-9]+\.[0-9]+\.[0-9]+\.[a-z0-9]+\.tar\.gz' | head -n 1)
 if [[ -z "$KASM_URL" ]]; then
-SERVICE_IMAGE_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_service_images_amd64_[0-9]+\.[0-9]+\.[0-9]+\.tar\.gz' | head -n 1)
-if [[ -n "$SERVICE_IMAGE_URL" ]]; then
-KASM_VERSION=$(echo "$SERVICE_IMAGE_URL" | sed -E 's/.*kasm_release_service_images_amd64_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
-KASM_URL="https://kasm-static-content.s3.amazonaws.com/kasm_release_${KASM_VERSION}.tar.gz"
-fi
-else
-KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
-fi
-
-if [[ -z "$KASM_URL" ]] || [[ -z "$KASM_VERSION" ]]; then
 msg_error "Unable to detect latest Kasm release URL."
 exit 1
 fi
+KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
 msg_info "Checked for new version"
 
 msg_info "Removing outdated docker-compose plugin"
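The Kasm hunk above pulls the version number out of a release URL with a `sed -E` capture group. A self-contained sketch of that extraction; the URL here is a made-up example in the same shape, not a real release:

```shell
# Version extraction with a sed -E capture group, as in the Kasm update above
url="https://kasm-static-content.s3.amazonaws.com/kasm_release_1.15.0.abcdef.tar.gz"
version=$(echo "$url" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
echo "$version"   # prints 1.15.0
```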
@@ -33,8 +33,7 @@ function update_script() {
 msg_ok "Stopped Service"
 
 PHP_VERSION="8.5" PHP_APACHE="YES" setup_php
-setup_composer
 
 msg_info "Creating a backup"
 mv /opt/koillection/ /opt/koillection-backup
 msg_ok "Backup created"
@@ -9,7 +9,7 @@ APP="Ollama"
 var_tags="${var_tags:-ai}"
 var_cpu="${var_cpu:-4}"
 var_ram="${var_ram:-4096}"
-var_disk="${var_disk:-40}"
+var_disk="${var_disk:-35}"
 var_os="${var_os:-ubuntu}"
 var_version="${var_version:-24.04}"
 var_gpu="${var_gpu:-yes}"
@@ -30,7 +30,7 @@ function update_script() {
 fi
 
 RELEASE="v5.0.2"
-if check_for_gh_release "OpenCloud" "opencloud-eu/opencloud" "${RELEASE}"; then
+if check_for_gh_release "opencloud" "opencloud-eu/opencloud" "${RELEASE}"; then
 msg_info "Stopping services"
 systemctl stop opencloud opencloud-wopi
 msg_ok "Stopped services"
@@ -38,21 +38,9 @@ function update_script() {
 msg_info "Updating packages"
 $STD apt-get update
 $STD apt-get dist-upgrade -y
-ensure_dependencies "inotify-tools"
 msg_ok "Updated packages"
 
-CLEAN_INSTALL=1 fetch_and_deploy_gh_release "OpenCloud" "opencloud-eu/opencloud" "singlefile" "${RELEASE}" "/usr/bin" "opencloud-*-linux-amd64"
+CLEAN_INSTALL=1 fetch_and_deploy_gh_release "opencloud" "opencloud-eu/opencloud" "singlefile" "${RELEASE}" "/usr/bin" "opencloud-*-linux-amd64"
 
-if ! grep -q 'POSIX_WATCH' /etc/opencloud/opencloud.env; then
-sed -i '/^## External/i ## Uncomment below to enable PosixFS Collaborative Mode\
-## Increase inotify watch/instance limits on your PVE host:\
-### sysctl -w fs.inotify.max_user_watches=1048576\
-### sysctl -w fs.inotify.max_user_instances=1024\
-# STORAGE_USERS_POSIX_ENABLE_COLLABORATION=true\
-# STORAGE_USERS_POSIX_WATCH_TYPE=inotifywait\
-# STORAGE_USERS_POSIX_WATCH_FS=true\
-# STORAGE_USERS_POSIX_WATCH_PATH=<path-to-storage-or-bind-mount>' /etc/opencloud/opencloud.env
-fi
-
 msg_info "Starting services"
 systemctl start opencloud opencloud-wopi
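The OpenCloud hunk above uses GNU sed's `i` (insert-before-address) command to place new lines ahead of a marker line in the env file. A self-contained sketch of the same technique on a throwaway file; the file name and contents are illustrative, and the one-line `i text` form is a GNU sed extension:

```shell
# GNU sed's i command inserts before the first line matching the address,
# like the /etc/opencloud/opencloud.env edit above (contents are illustrative)
env_file=$(mktemp)
printf 'A=1\n## External\nB=2\n' > "$env_file"

grep -q '^C=3' "$env_file" || sed -i '/^## External/i C=3' "$env_file"

line2=$(sed -n '2p' "$env_file")
echo "$line2"   # prints C=3
rm -f "$env_file"
```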
@@ -9,7 +9,7 @@ APP="Open WebUI"
 var_tags="${var_tags:-ai;interface}"
 var_cpu="${var_cpu:-4}"
 var_ram="${var_ram:-8192}"
-var_disk="${var_disk:-50}"
+var_disk="${var_disk:-25}"
 var_os="${var_os:-debian}"
 var_version="${var_version:-13}"
 var_unprivileged="${var_unprivileged:-1}"
@@ -44,7 +44,7 @@ function update_script() {
 
 msg_info "Installing uv-based Open-WebUI"
 PYTHON_VERSION="3.12" setup_uv
-$STD uv tool install --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
+$STD uv tool install --python 3.12 open-webui[all]
 msg_ok "Installed uv-based Open-WebUI"
 
 msg_info "Restoring data"
@@ -126,7 +126,7 @@ EOF
 
 msg_info "Updating Open WebUI via uv"
 PYTHON_VERSION="3.12" setup_uv
-$STD uv tool install --force --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
+$STD uv tool upgrade --python 3.12 open-webui[all]
 systemctl restart open-webui
 msg_ok "Updated Open WebUI"
 msg_ok "Updated successfully!"
@@ -51,7 +51,7 @@ function update_script() {
 $STD npm run db:generate
 $STD npm run build
 $STD npm run build:cli
-$STD npm run db:push
+$STD npm run db:sqlite:push
 cp -R .next/standalone ./
 chmod +x ./dist/cli.mjs
 cp server/db/names.json ./dist/names.json
@@ -61,12 +61,6 @@ function update_script() {
 rm -rf "$BK"
 msg_ok "Restored data"
 
-msg_ok "Migrate Database"
-cd /opt/planka
-$STD npm run db:upgrade
-$STD npm run db:migrate
-msg_ok "Migrated Database"
-
 msg_info "Starting Service"
 systemctl start planka
 msg_ok "Started Service"
ct/prometheus-paperless-ngx-exporter.sh (new executable file, 52 lines)
@@ -0,0 +1,52 @@
+#!/usr/bin/env bash
+source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
+# Copyright (c) 2021-2026 community-scripts ORG
+# Author: Andy Grunwald (andygrunwald)
+# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
+# Source: https://github.com/hansmi/prometheus-paperless-exporter
+
+APP="Prometheus-Paperless-NGX-Exporter"
+var_tags="${var_tags:-monitoring;alerting}"
+var_cpu="${var_cpu:-1}"
+var_ram="${var_ram:-256}"
+var_disk="${var_disk:-2}"
+var_os="${var_os:-debian}"
+var_version="${var_version:-13}"
+var_unprivileged="${var_unprivileged:-1}"
+
+header_info "$APP"
+variables
+color
+catch_errors
+
+function update_script() {
+header_info
+check_container_storage
+check_container_resources
+if [[ ! -f /etc/systemd/system/prometheus-paperless-ngx-exporter.service ]]; then
+msg_error "No ${APP} Installation Found!"
+exit
+fi
+if check_for_gh_release "prom-paperless-exp" "hansmi/prometheus-paperless-exporter"; then
+msg_info "Stopping Service"
+systemctl stop prometheus-paperless-ngx-exporter
+msg_ok "Stopped Service"
+
+fetch_and_deploy_gh_release "prom-paperless-exp" "hansmi/prometheus-paperless-exporter" "binary"
+
+msg_info "Starting Service"
+systemctl start prometheus-paperless-ngx-exporter
+msg_ok "Started Service"
+msg_ok "Updated successfully!"
+fi
+exit
+}
+
+start
+build_container
+description
+
+msg_ok "Completed successfully!\n"
+echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
+echo -e "${INFO}${YW} Access it using the following URL:${CL}"
+echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:8081/metrics${CL}"
@@ -41,7 +41,7 @@ function update_script() {
 rm -rf "$PVE_VENV_PATH"
 mkdir -p /opt/prometheus-pve-exporter
 cd /opt/prometheus-pve-exporter
-$STD uv venv --clear "$PVE_VENV_PATH"
+$STD uv venv "$PVE_VENV_PATH"
 $STD "$PVE_VENV_PATH/bin/python" -m ensurepip --upgrade
 $STD "$PVE_VENV_PATH/bin/python" -m pip install --upgrade pip
 $STD "$PVE_VENV_PATH/bin/python" -m pip install prometheus-pve-exporter
@@ -28,55 +28,16 @@ function update_script() {
 exit
 fi
 
-if check_for_gh_release "Radicale" "Kozea/Radicale"; then
-msg_info "Stopping service"
-systemctl stop radicale
-msg_ok "Stopped service"
+msg_info "Updating ${APP}"
+$STD python3 -m venv /opt/radicale
+source /opt/radicale/bin/activate
+$STD python3 -m pip install --upgrade https://github.com/Kozea/Radicale/archive/master.tar.gz
+msg_ok "Updated ${APP}"
 
-msg_info "Backing up users file"
-cp /opt/radicale/users /opt/radicale_users_backup
-msg_ok "Backed up users file"
-
-PYTHON_VERSION="3.13" setup_uv
-CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Radicale" "Kozea/Radicale" "tarball" "latest" "/opt/radicale"
-
-msg_info "Restoring users file"
-rm -f /opt/radicale/users
-mv /opt/radicale_users_backup /opt/radicale/users
-msg_ok "Restored users file"
-
-if grep -q 'start.sh' /etc/systemd/system/radicale.service; then
-sed -i -e '/^Description/i[Unit]' \
--e '\|^ExecStart|iWorkingDirectory=/opt/radicale' \
--e 's|^ExecStart=.*|ExecStart=/usr/local/bin/uv run -m radicale --config /etc/radicale/config|' /etc/systemd/system/radicale.service
-systemctl daemon-reload
-fi
-if [[ ! -f /etc/radicale/config ]]; then
-msg_info "Migrating to config file (/etc/radicale/config)"
-mkdir -p /etc/radicale
-cat <<EOF >/etc/radicale/config
-[server]
-hosts = 0.0.0.0:5232
-
-[auth]
-type = htpasswd
-htpasswd_filename = /opt/radicale/users
-htpasswd_encryption = sha512
-
-[storage]
-type = multifilesystem
-filesystem_folder = /var/lib/radicale/collections
-
-[web]
-type = internal
-EOF
-msg_ok "Migrated to config (/etc/radicale/config)"
-fi
-msg_info "Starting service"
-systemctl start radicale
-msg_ok "Started service"
-msg_ok "Updated Successfully!"
-fi
+msg_info "Starting Service"
+systemctl enable -q --now radicale
+msg_ok "Started Service"
+msg_ok "Updated successfully!"
 exit
 }
 
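The Radicale hunk above writes `/etc/radicale/config` with a `cat <<EOF >file` heredoc. A self-contained sketch of the idiom on a temp file; it uses a quoted delimiter (`<<'EOF'`) to suppress expansion, whereas the script's unquoted `<<EOF` deliberately allows it:

```shell
# Writing a config file with a heredoc, as in the Radicale migration above
conf=$(mktemp)
cat <<'EOF' >"$conf"
[server]
hosts = 0.0.0.0:5232
EOF

# Read the value back to confirm the file landed as expected
hosts=$(sed -n 's/^hosts = //p' "$conf")
echo "$hosts"   # prints 0.0.0.0:5232
rm -f "$conf"
```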
@@ -41,7 +41,7 @@ function update_script() {
 # Always ensure venv exists
 if [[ ! -d /opt/sabnzbd/venv ]]; then
 msg_info "Migrating SABnzbd to uv virtual environment"
-$STD uv venv --clear /opt/sabnzbd/venv
+$STD uv venv /opt/sabnzbd/venv
 msg_ok "Created uv venv at /opt/sabnzbd/venv"
 fi
 
@@ -38,7 +38,7 @@ function update_script() {
 
 msg_info "Updating Scraparr"
 cd /opt/scraparr
-$STD uv venv --clear /opt/scraparr/.venv
+$STD uv venv /opt/scraparr/.venv
 $STD /opt/scraparr/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/scraparr/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/scraparr/.venv/bin/python -m pip install -r /opt/scraparr/src/scraparr/requirements.txt
ct/slskd.sh (89 lines changed)
@@ -3,7 +3,7 @@ source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxV
 # Copyright (c) 2021-2026 community-scripts ORG
 # Author: vhsdream
 # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
-# Source: https://github.com/slskd/slskd, https://github.com/mrusse/soularr
+# Source: https://github.com/slskd/slskd, https://soularr.net
 
 APP="slskd"
 var_tags="${var_tags:-arr;p2p}"
@@ -24,65 +24,50 @@ function update_script() {
 check_container_storage
 check_container_resources
 
-if [[ ! -d /opt/slskd ]]; then
-msg_error "No Slskd Installation Found!"
+if [[ ! -d /opt/slskd ]] || [[ ! -d /opt/soularr ]]; then
+msg_error "No ${APP} Installation Found!"
 exit
 fi
 
-if check_for_gh_release "Slskd" "slskd/slskd"; then
-msg_info "Stopping Service(s)"
-systemctl stop slskd
-[[ -f /etc/systemd/system/soularr.service ]] && systemctl stop soularr.timer soularr.service
-msg_ok "Stopped Service(s)"
+RELEASE=$(curl -s https://api.github.com/repos/slskd/slskd/releases/latest | grep "tag_name" | awk '{print substr($2, 2, length($2)-3) }')
+if [[ "${RELEASE}" != "$(cat /opt/${APP}_version.txt)" ]] || [[ ! -f /opt/${APP}_version.txt ]]; then
+msg_info "Stopping Service"
+systemctl stop slskd soularr.timer soularr.service
+msg_info "Stopped Service"
 
-msg_info "Backing up config"
-cp /opt/slskd/config/slskd.yml /opt/slskd.yml.bak
-msg_ok "Backed up config"
+msg_info "Updating $APP to v${RELEASE}"
+tmp_file=$(mktemp)
+curl -fsSL "https://github.com/slskd/slskd/releases/download/${RELEASE}/slskd-${RELEASE}-linux-x64.zip" -o $tmp_file
+$STD unzip -oj $tmp_file slskd -d /opt/${APP}
+echo "${RELEASE}" >/opt/${APP}_version.txt
+msg_ok "Updated $APP to v${RELEASE}"
 
-CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Slskd" "slskd/slskd" "prebuild" "latest" "/opt/slskd" "slskd-*-linux-x64.zip"
-
-msg_info "Restoring config"
-mv /opt/slskd.yml.bak /opt/slskd/config/slskd.yml
-msg_ok "Restored config"
-
-msg_info "Starting Service(s)"
+msg_info "Starting Service"
 systemctl start slskd
-[[ -f /etc/systemd/system/soularr.service ]] && systemctl start soularr.timer
-msg_ok "Started Service(s)"
-msg_ok "Updated Slskd successfully!"
+msg_ok "Started Service"
+rm -rf $tmp_file
+else
+msg_ok "No ${APP} update required. ${APP} is already at v${RELEASE}"
 fi
-[[ -d /opt/soularr ]] && if check_for_gh_release "Soularr" "mrusse/soularr"; then
-if systemctl is-active soularr.timer >/dev/null; then
-msg_info "Stopping Timer and Service"
-systemctl stop soularr.timer soularr.service
-msg_ok "Stopped Timer and Service"
-fi
+msg_info "Updating Soularr"
+cp /opt/soularr/config.ini /opt/config.ini.bak
+cp /opt/soularr/run.sh /opt/run.sh.bak
+cd /tmp
+rm -rf /opt/soularr
+curl -fsSL -o main.zip https://github.com/mrusse/soularr/archive/refs/heads/main.zip
+$STD unzip main.zip
+mv soularr-main /opt/soularr
+cd /opt/soularr
+$STD pip install -r requirements.txt
+mv /opt/config.ini.bak /opt/soularr/config.ini
+mv /opt/run.sh.bak /opt/soularr/run.sh
+rm -rf /tmp/main.zip
+msg_ok "Updated soularr"
 
-msg_info "Backing up Soularr config"
-cp /opt/soularr/config.ini /opt/soularr_config.ini.bak
-cp /opt/soularr/run.sh /opt/soularr_run.sh.bak
-msg_ok "Backed up Soularr config"
-
-PYTHON_VERSION="3.11" setup_uv
-CLEAN_INSTALL=1 fetch_and_deploy_gh_release "Soularr" "mrusse/soularr" "tarball" "latest" "/opt/soularr"
-
-msg_info "Updating Soularr"
-cd /opt/soularr
-$STD uv venv -c venv
-$STD source venv/bin/activate
-$STD uv pip install -r requirements.txt
-deactivate
-msg_ok "Updated Soularr"
-
-msg_info "Restoring Soularr config"
-mv /opt/soularr_config.ini.bak /opt/soularr/config.ini
-mv /opt/soularr_run.sh.bak /opt/soularr/run.sh
-msg_ok "Restored Soularr config"
-
-msg_info "Starting Soularr Timer"
-systemctl restart soularr.timer
-msg_ok "Started Soularr Timer"
-msg_ok "Updated Soularr successfully!"
-fi
+msg_info "Starting soularr timer"
+systemctl start soularr.timer
+msg_ok "Started soularr timer"
+exit
 }
 
 start
```diff
@@ -33,15 +33,7 @@ function update_script() {
 systemctl stop snowshare
 msg_ok "Stopped Service"

-msg_info "Backing up uploads"
-[ -d /opt/snowshare/uploads ] && cp -a /opt/snowshare/uploads /opt/.snowshare_uploads_backup
-msg_ok "Uploads backed up"
-
-CLEAN_INSTALL=1 fetch_and_deploy_gh_release "snowshare" "TuroYT/snowshare" "tarball"
-
-msg_info "Restoring uploads"
-[ -d /opt/.snowshare_uploads_backup ] && rm -rf /opt/snowshare/uploads && cp -a /opt/.snowshare_uploads_backup /opt/snowshare/uploads
-msg_ok "Uploads restored"
+fetch_and_deploy_gh_release "snowshare" "TuroYT/snowshare" "tarball"

 msg_info "Updating Snowshare"
 cd /opt/snowshare
```
```diff
@@ -39,7 +39,7 @@ function update_script() {
 CLEAN_INSTALL=1 fetch_and_deploy_gh_release "streamlink-webui" "CrazyWolf13/streamlink-webui" "tarball"

 msg_info "Updating streamlink-webui"
-$STD uv venv --clear /opt/streamlink-webui/backend/src/.venv
+$STD uv venv /opt/streamlink-webui/backend/src/.venv
 source /opt/streamlink-webui/backend/src/.venv/bin/activate
 $STD uv pip install -r /opt/streamlink-webui/backend/src/requirements.txt --python=/opt/streamlink-webui/backend/src/.venv
 cd /opt/streamlink-webui/frontend/src
```
```diff
@@ -50,7 +50,7 @@ function update_script() {
 cp -r /opt/tandoor.bak/{config,api,mediafiles,staticfiles} /opt/tandoor/
 mv /opt/tandoor.bak/.env /opt/tandoor/.env
 cd /opt/tandoor
-$STD uv venv --clear .venv --python=python3
+$STD uv venv .venv --python=python3
 $STD uv pip install -r requirements.txt --python .venv/bin/python
 cd /opt/tandoor/vue3
 $STD yarn install
```
```diff
@@ -33,9 +33,7 @@ function update_script() {
 systemctl stop umlautadaptarr
 msg_ok "Stopped Service"

-cp /opt/UmlautAdaptarr/appsettings.json /opt/UmlautAdaptarr/appsettings.json.bak
 fetch_and_deploy_gh_release "UmlautAdaptarr" "PCJones/Umlautadaptarr" "prebuild" "latest" "/opt/UmlautAdaptarr" "linux-x64.zip"
-cp /opt/UmlautAdaptarr/appsettings.json.bak /opt/UmlautAdaptarr/appsettings.json

 msg_info "Starting Service"
 systemctl start umlautadaptarr
```
```diff
@@ -39,7 +39,7 @@ function update_script() {

 msg_info "Updating Warracker"
 cd /opt/warracker/backend
-$STD uv venv --clear .venv
+$STD uv venv .venv
 $STD source .venv/bin/activate
 $STD uv pip install -r requirements.txt
 msg_ok "Updated Warracker"
```
```diff
@@ -1,35 +0,0 @@
-{
-  "name": "Draw.IO",
-  "slug": "drawio",
-  "categories": [
-    12
-  ],
-  "date_created": "2026-02-11",
-  "type": "ct",
-  "updateable": true,
-  "privileged": false,
-  "interface_port": 8080,
-  "documentation": "https://www.drawio.com/doc/",
-  "website": "https://www.drawio.com/",
-  "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/draw-io.webp",
-  "config_path": "",
-  "description": "draw.io is a configurable diagramming and whiteboarding application, jointly owned and developed by draw.io Ltd (previously named JGraph) and draw.io AG.",
-  "install_methods": [
-    {
-      "type": "default",
-      "script": "ct/drawio.sh",
-      "resources": {
-        "cpu": 1,
-        "ram": 2048,
-        "hdd": 4,
-        "os": "Debian",
-        "version": "13"
-      }
-    }
-  ],
-  "default_credentials": {
-    "username": null,
-    "password": null
-  },
-  "notes": []
-}
```
```diff
@@ -21,7 +21,7 @@
       "resources": {
         "cpu": 2,
         "ram": 1024,
-        "hdd": 6,
+        "hdd": 4,
         "os": "debian",
         "version": "13"
       }
```
```diff
@@ -1,5 +1,5 @@
 {
-  "generated": "2026-02-14T18:07:29Z",
+  "generated": "2026-02-09T06:27:10Z",
   "versions": [
     {
       "slug": "2fauth",
@@ -15,13 +15,6 @@
       "pinned": false,
       "date": "2025-12-08T14:34:55Z"
     },
-    {
-      "slug": "adguardhome-sync",
-      "repo": "bakito/adguardhome-sync",
-      "version": "v0.8.2",
-      "pinned": false,
-      "date": "2025-10-24T17:13:47Z"
-    },
     {
       "slug": "adventurelog",
       "repo": "seanmorley15/adventurelog",
@@ -67,9 +60,9 @@
     {
       "slug": "autobrr",
       "repo": "autobrr/autobrr",
-      "version": "v1.73.0",
+      "version": "v1.72.1",
       "pinned": false,
-      "date": "2026-02-13T16:37:28Z"
+      "date": "2026-01-30T12:57:58Z"
     },
     {
       "slug": "autocaliweb",
@@ -116,9 +109,9 @@
     {
       "slug": "bentopdf",
       "repo": "alam00000/bentopdf",
-      "version": "v2.2.1",
+      "version": "v2.1.0",
       "pinned": false,
-      "date": "2026-02-14T16:33:47Z"
+      "date": "2026-02-02T14:30:55Z"
     },
     {
       "slug": "beszel",
@@ -193,30 +186,30 @@
     {
       "slug": "cleanuparr",
       "repo": "Cleanuparr/Cleanuparr",
-      "version": "v2.6.1",
+      "version": "v2.5.1",
       "pinned": false,
-      "date": "2026-02-13T10:00:19Z"
+      "date": "2026-01-11T00:46:17Z"
     },
     {
       "slug": "cloudreve",
       "repo": "cloudreve/cloudreve",
-      "version": "4.14.0",
+      "version": "4.13.0",
       "pinned": false,
-      "date": "2026-02-14T06:05:06Z"
+      "date": "2026-02-05T12:53:24Z"
     },
     {
       "slug": "comfyui",
       "repo": "comfyanonymous/ComfyUI",
-      "version": "v0.13.0",
+      "version": "v0.12.3",
       "pinned": false,
-      "date": "2026-02-10T20:27:38Z"
+      "date": "2026-02-05T07:04:07Z"
     },
     {
       "slug": "commafeed",
       "repo": "Athou/commafeed",
-      "version": "6.2.0",
+      "version": "6.1.1",
       "pinned": false,
-      "date": "2026-02-09T19:44:58Z"
+      "date": "2026-01-26T15:14:16Z"
     },
     {
       "slug": "configarr",
@@ -242,16 +235,16 @@
     {
       "slug": "cronicle",
       "repo": "jhuckaby/Cronicle",
-      "version": "v0.9.106",
+      "version": "v0.9.105",
       "pinned": false,
-      "date": "2026-02-11T17:11:46Z"
+      "date": "2026-02-05T18:16:11Z"
     },
     {
       "slug": "cryptpad",
       "repo": "cryptpad/cryptpad",
-      "version": "2026.2.0",
+      "version": "2025.9.0",
       "pinned": false,
-      "date": "2026-02-11T15:39:05Z"
+      "date": "2025-10-22T10:06:29Z"
     },
     {
       "slug": "dawarich",
@@ -263,51 +256,44 @@
     {
       "slug": "discopanel",
       "repo": "nickheyer/discopanel",
-      "version": "v1.0.36",
+      "version": "v1.0.35",
       "pinned": false,
-      "date": "2026-02-09T21:15:44Z"
+      "date": "2026-02-02T05:20:12Z"
     },
     {
       "slug": "dispatcharr",
       "repo": "Dispatcharr/Dispatcharr",
-      "version": "v0.19.0",
+      "version": "v0.18.1",
       "pinned": false,
-      "date": "2026-02-10T21:18:10Z"
+      "date": "2026-01-27T17:09:11Z"
     },
     {
       "slug": "docmost",
       "repo": "docmost/docmost",
-      "version": "v0.25.3",
+      "version": "v0.25.2",
       "pinned": false,
-      "date": "2026-02-10T02:58:23Z"
+      "date": "2026-02-06T19:50:55Z"
     },
     {
       "slug": "domain-locker",
       "repo": "Lissy93/domain-locker",
-      "version": "v0.1.4",
+      "version": "v0.1.2",
       "pinned": false,
-      "date": "2026-02-14T07:41:29Z"
+      "date": "2025-11-14T22:08:23Z"
     },
     {
       "slug": "domain-monitor",
       "repo": "Hosteroid/domain-monitor",
-      "version": "v1.1.3",
+      "version": "v1.1.1",
       "pinned": false,
-      "date": "2026-02-11T15:48:18Z"
+      "date": "2025-11-18T11:32:30Z"
     },
     {
       "slug": "donetick",
       "repo": "donetick/donetick",
-      "version": "v0.1.73",
+      "version": "v0.1.64",
       "pinned": false,
-      "date": "2026-02-12T23:42:30Z"
-    },
-    {
-      "slug": "drawio",
-      "repo": "jgraph/drawio",
-      "version": "v29.3.6",
-      "pinned": false,
-      "date": "2026-01-28T18:25:02Z"
+      "date": "2025-10-03T05:18:24Z"
     },
     {
       "slug": "duplicati",
@@ -333,9 +319,9 @@
     {
       "slug": "endurain",
       "repo": "endurain-project/endurain",
-      "version": "v0.17.4",
+      "version": "v0.17.3",
       "pinned": false,
-      "date": "2026-02-11T04:54:22Z"
+      "date": "2026-01-23T22:02:05Z"
     },
     {
       "slug": "ersatztv",
@@ -354,9 +340,9 @@
     {
       "slug": "firefly",
       "repo": "firefly-iii/firefly-iii",
-      "version": "v6.4.20",
+      "version": "v6.4.18",
       "pinned": false,
-      "date": "2026-02-14T12:39:02Z"
+      "date": "2026-02-08T07:28:00Z"
     },
     {
       "slug": "fladder",
@@ -403,9 +389,9 @@
     {
       "slug": "ghostfolio",
       "repo": "ghostfolio/ghostfolio",
-      "version": "2.238.0",
+      "version": "2.237.0",
       "pinned": false,
-      "date": "2026-02-12T18:28:55Z"
+      "date": "2026-02-08T13:59:53Z"
     },
     {
       "slug": "gitea",
@@ -445,9 +431,9 @@
     {
       "slug": "gotify",
       "repo": "gotify/server",
-      "version": "v2.9.0",
+      "version": "v2.8.0",
       "pinned": false,
-      "date": "2026-02-13T15:22:31Z"
+      "date": "2026-01-02T11:56:16Z"
     },
     {
       "slug": "grist",
@@ -508,9 +494,9 @@
     {
       "slug": "homarr",
       "repo": "homarr-labs/homarr",
-      "version": "v1.53.1",
+      "version": "v1.53.0",
       "pinned": false,
-      "date": "2026-02-13T19:47:11Z"
+      "date": "2026-02-06T19:42:58Z"
     },
     {
       "slug": "homebox",
@@ -543,16 +529,9 @@
     {
       "slug": "huntarr",
       "repo": "plexguide/Huntarr.io",
-      "version": "9.2.4.1",
+      "version": "9.2.3",
       "pinned": false,
-      "date": "2026-02-12T22:17:47Z"
-    },
-    {
-      "slug": "immich-public-proxy",
-      "repo": "alangrainger/immich-public-proxy",
-      "version": "v1.15.1",
-      "pinned": false,
-      "date": "2026-01-26T08:04:27Z"
+      "date": "2026-02-07T04:44:20Z"
     },
     {
       "slug": "inspircd",
@@ -571,23 +550,16 @@
     {
       "slug": "invoiceninja",
       "repo": "invoiceninja/invoiceninja",
-      "version": "v5.12.59",
+      "version": "v5.12.55",
       "pinned": false,
-      "date": "2026-02-13T02:26:13Z"
+      "date": "2026-02-05T01:06:15Z"
     },
     {
       "slug": "jackett",
       "repo": "Jackett/Jackett",
-      "version": "v0.24.1113",
+      "version": "v0.24.1074",
       "pinned": false,
-      "date": "2026-02-14T17:46:58Z"
-    },
-    {
-      "slug": "jellystat",
-      "repo": "CyferShepard/Jellystat",
-      "version": "V1.1.8",
-      "pinned": false,
-      "date": "2026-02-08T08:15:00Z"
+      "date": "2026-02-09T06:01:19Z"
     },
     {
       "slug": "joplin-server",
@@ -599,9 +571,9 @@
     {
       "slug": "jotty",
       "repo": "fccview/jotty",
-      "version": "1.20.0",
+      "version": "1.19.1",
       "pinned": false,
-      "date": "2026-02-12T09:23:30Z"
+      "date": "2026-01-26T21:30:39Z"
     },
     {
       "slug": "kapowarr",
@@ -627,9 +599,9 @@
     {
       "slug": "keycloak",
       "repo": "keycloak/keycloak",
-      "version": "26.5.3",
+      "version": "26.5.2",
       "pinned": false,
-      "date": "2026-02-10T07:30:08Z"
+      "date": "2026-01-23T14:26:58Z"
     },
     {
       "slug": "kimai",
@@ -662,9 +634,9 @@
     {
       "slug": "kometa",
       "repo": "Kometa-Team/Kometa",
-      "version": "v2.3.0",
+      "version": "v2.2.2",
       "pinned": false,
-      "date": "2026-02-09T21:26:56Z"
+      "date": "2025-10-06T21:31:07Z"
     },
     {
       "slug": "komga",
@@ -711,9 +683,9 @@
     {
       "slug": "libretranslate",
       "repo": "LibreTranslate/LibreTranslate",
-      "version": "v1.9.0",
+      "version": "v1.8.4",
       "pinned": false,
-      "date": "2026-02-10T19:05:48Z"
+      "date": "2026-02-02T17:45:16Z"
     },
     {
       "slug": "lidarr",
@@ -746,9 +718,9 @@
     {
       "slug": "lubelogger",
       "repo": "hargata/lubelog",
-      "version": "v1.6.0",
+      "version": "v1.5.8",
       "pinned": false,
-      "date": "2026-02-10T20:16:32Z"
+      "date": "2026-01-26T18:18:03Z"
     },
     {
       "slug": "mafl",
@@ -767,9 +739,9 @@
     {
       "slug": "mail-archiver",
       "repo": "s1t5/mail-archiver",
-      "version": "2602.1",
+      "version": "2601.3",
       "pinned": false,
-      "date": "2026-02-11T06:23:11Z"
+      "date": "2026-01-25T12:52:24Z"
     },
     {
       "slug": "managemydamnlife",
@@ -781,9 +753,9 @@
     {
       "slug": "manyfold",
       "repo": "manyfold3d/manyfold",
-      "version": "v0.132.1",
+      "version": "v0.132.0",
       "pinned": false,
-      "date": "2026-02-09T22:02:28Z"
+      "date": "2026-01-29T13:53:21Z"
     },
     {
       "slug": "mealie",
@@ -795,9 +767,9 @@
     {
       "slug": "mediamanager",
       "repo": "maxdorninger/MediaManager",
-      "version": "v1.12.3",
+      "version": "v1.12.2",
       "pinned": false,
-      "date": "2026-02-11T16:45:40Z"
+      "date": "2026-02-08T19:18:29Z"
     },
     {
       "slug": "mediamtx",
@@ -823,16 +795,16 @@
     {
       "slug": "metube",
       "repo": "alexta69/metube",
-      "version": "2026.02.14",
+      "version": "2026.02.08",
       "pinned": false,
-      "date": "2026-02-14T07:49:11Z"
+      "date": "2026-02-08T17:01:37Z"
     },
     {
       "slug": "miniflux",
       "repo": "miniflux/v2",
-      "version": "2.2.17",
+      "version": "2.2.16",
       "pinned": false,
-      "date": "2026-02-13T20:30:17Z"
+      "date": "2026-01-07T03:26:27Z"
     },
     {
       "slug": "monica",
@@ -844,9 +816,9 @@
     {
       "slug": "myip",
       "repo": "jason5ng32/MyIP",
-      "version": "v5.2.1",
+      "version": "v5.2.0",
       "pinned": false,
-      "date": "2026-02-10T07:38:47Z"
+      "date": "2026-01-05T05:56:57Z"
     },
     {
       "slug": "mylar3",
@@ -865,9 +837,9 @@
     {
       "slug": "navidrome",
       "repo": "navidrome/navidrome",
-      "version": "v0.60.3",
+      "version": "v0.60.2",
       "pinned": false,
-      "date": "2026-02-10T23:55:04Z"
+      "date": "2026-02-07T19:42:33Z"
     },
     {
       "slug": "netbox",
@@ -876,13 +848,6 @@
       "pinned": false,
       "date": "2026-02-03T13:54:26Z"
     },
-    {
-      "slug": "nextcloud-exporter",
-      "repo": "xperimental/nextcloud-exporter",
-      "version": "v0.9.0",
-      "pinned": false,
-      "date": "2025-10-12T20:03:10Z"
-    },
     {
       "slug": "nginx-ui",
       "repo": "0xJacky/nginx-ui",
@@ -998,9 +963,9 @@
     {
       "slug": "pangolin",
       "repo": "fosrl/pangolin",
-      "version": "1.15.4",
+      "version": "1.15.2",
       "pinned": false,
-      "date": "2026-02-13T23:01:29Z"
+      "date": "2026-02-05T19:23:58Z"
     },
     {
       "slug": "paperless-ai",
@@ -1026,9 +991,9 @@
     {
       "slug": "patchmon",
       "repo": "PatchMon/PatchMon",
-      "version": "v1.4.0",
+      "version": "v1.3.7",
       "pinned": false,
-      "date": "2026-02-13T10:39:03Z"
+      "date": "2025-12-25T11:08:14Z"
     },
     {
       "slug": "paymenter",
@@ -1047,16 +1012,16 @@
     {
       "slug": "pelican-panel",
       "repo": "pelican-dev/panel",
-      "version": "v1.0.0-beta32",
+      "version": "v1.0.0-beta31",
       "pinned": false,
-      "date": "2026-02-09T22:15:44Z"
+      "date": "2026-01-18T22:43:24Z"
     },
     {
       "slug": "pelican-wings",
       "repo": "pelican-dev/wings",
-      "version": "v1.0.0-beta23",
+      "version": "v1.0.0-beta22",
       "pinned": false,
-      "date": "2026-02-09T22:10:26Z"
+      "date": "2026-01-18T22:38:36Z"
     },
     {
       "slug": "pf2etools",
@@ -1072,19 +1037,12 @@
       "pinned": false,
       "date": "2025-12-01T05:07:31Z"
     },
-    {
-      "slug": "pihole-exporter",
-      "repo": "eko/pihole-exporter",
-      "version": "v1.2.0",
-      "pinned": false,
-      "date": "2025-07-29T19:15:37Z"
-    },
     {
       "slug": "planka",
       "repo": "plankanban/planka",
-      "version": "v2.0.0",
+      "version": "v2.0.0-rc.4",
       "pinned": false,
-      "date": "2026-02-11T13:50:10Z"
+      "date": "2025-09-04T12:41:17Z"
     },
     {
       "slug": "plant-it",
@@ -1096,9 +1054,9 @@
     {
       "slug": "pocketbase",
       "repo": "pocketbase/pocketbase",
-      "version": "v0.36.3",
+      "version": "v0.36.2",
       "pinned": false,
-      "date": "2026-02-13T18:38:58Z"
+      "date": "2026-02-01T08:12:42Z"
     },
     {
       "slug": "pocketid",
@@ -1131,9 +1089,9 @@
     {
       "slug": "prometheus-alertmanager",
       "repo": "prometheus/alertmanager",
-      "version": "v0.31.1",
+      "version": "v0.31.0",
       "pinned": false,
-      "date": "2026-02-11T21:28:26Z"
+      "date": "2026-02-02T13:34:15Z"
     },
     {
       "slug": "prometheus-blackbox-exporter",
@@ -1173,9 +1131,9 @@
     {
       "slug": "pulse",
       "repo": "rcourtman/Pulse",
-      "version": "v5.1.9",
+      "version": "v5.1.5",
       "pinned": false,
-      "date": "2026-02-11T15:34:40Z"
+      "date": "2026-02-08T12:19:53Z"
     },
     {
       "slug": "pve-scripts-local",
@@ -1191,13 +1149,6 @@
       "pinned": false,
       "date": "2025-11-19T23:54:34Z"
     },
-    {
-      "slug": "qbittorrent-exporter",
-      "repo": "martabal/qbittorrent-exporter",
-      "version": "v1.13.2",
-      "pinned": false,
-      "date": "2025-12-13T22:59:03Z"
-    },
     {
       "slug": "qdrant",
       "repo": "qdrant/qdrant",
@@ -1219,13 +1170,6 @@
       "pinned": false,
       "date": "2025-11-16T22:39:01Z"
     },
-    {
-      "slug": "radicale",
-      "repo": "Kozea/Radicale",
-      "version": "v3.6.0",
-      "pinned": false,
-      "date": "2026-01-10T06:56:46Z"
-    },
     {
       "slug": "rclone",
       "repo": "rclone/rclone",
@@ -1236,9 +1180,9 @@
     {
       "slug": "rdtclient",
       "repo": "rogerfar/rdt-client",
-      "version": "v2.0.120",
+      "version": "v2.0.119",
       "pinned": false,
-      "date": "2026-02-12T02:53:51Z"
+      "date": "2025-10-13T23:15:11Z"
     },
     {
       "slug": "reactive-resume",
@@ -1292,16 +1236,16 @@
     {
       "slug": "scanopy",
       "repo": "scanopy/scanopy",
-      "version": "v0.14.4",
+      "version": "v0.14.3",
       "pinned": false,
-      "date": "2026-02-10T03:57:28Z"
+      "date": "2026-02-04T01:41:01Z"
     },
     {
       "slug": "scraparr",
       "repo": "thecfu/scraparr",
-      "version": "v3.0.3",
+      "version": "v2.2.5",
       "pinned": false,
-      "date": "2026-02-12T14:20:56Z"
+      "date": "2025-10-07T12:34:31Z"
     },
     {
       "slug": "seelf",
@@ -1313,9 +1257,9 @@
     {
       "slug": "semaphore",
       "repo": "semaphoreui/semaphore",
-      "version": "v2.17.0",
+      "version": "v2.16.51",
       "pinned": false,
-      "date": "2026-02-13T21:08:30Z"
+      "date": "2026-01-12T16:26:38Z"
     },
     {
       "slug": "shelfmark",
@@ -1338,13 +1282,6 @@
       "pinned": false,
       "date": "2026-01-16T12:08:28Z"
     },
-    {
-      "slug": "slskd",
-      "repo": "slskd/slskd",
-      "version": "0.24.3",
-      "pinned": false,
-      "date": "2026-01-15T14:40:15Z"
-    },
     {
       "slug": "snipeit",
       "repo": "grokability/snipe-it",
@@ -1355,9 +1292,9 @@
     {
       "slug": "snowshare",
       "repo": "TuroYT/snowshare",
-      "version": "v1.3.5",
+      "version": "v1.2.12",
       "pinned": false,
-      "date": "2026-02-11T10:24:51Z"
+      "date": "2026-01-30T13:35:56Z"
     },
     {
       "slug": "sonarr",
@@ -1390,9 +1327,9 @@
     {
       "slug": "stirling-pdf",
       "repo": "Stirling-Tools/Stirling-PDF",
-      "version": "v2.4.6",
+      "version": "v2.4.5",
       "pinned": false,
-      "date": "2026-02-12T00:01:19Z"
+      "date": "2026-02-06T23:12:20Z"
     },
     {
       "slug": "streamlink-webui",
@@ -1411,9 +1348,9 @@
     {
       "slug": "tandoor",
       "repo": "TandoorRecipes/recipes",
-      "version": "2.5.3",
+      "version": "2.5.0",
       "pinned": false,
-      "date": "2026-02-14T12:42:14Z"
+      "date": "2026-02-08T13:23:02Z"
     },
     {
       "slug": "tasmoadmin",
@@ -1439,9 +1376,9 @@
     {
       "slug": "termix",
       "repo": "Termix-SSH/Termix",
-      "version": "release-1.11.1-tag",
+      "version": "release-1.11.0-tag",
       "pinned": false,
-      "date": "2026-02-13T04:49:16Z"
+      "date": "2026-01-25T02:09:52Z"
     },
     {
       "slug": "the-lounge",
@@ -1467,9 +1404,9 @@
     {
       "slug": "tianji",
       "repo": "msgbyte/tianji",
-      "version": "v1.31.13",
+      "version": "v1.31.10",
       "pinned": false,
-      "date": "2026-02-13T16:30:09Z"
+      "date": "2026-02-04T17:21:04Z"
     },
     {
       "slug": "traccar",
@@ -1481,9 +1418,9 @@
     {
       "slug": "tracearr",
       "repo": "connorgallopo/Tracearr",
-      "version": "v1.4.17",
+      "version": "v1.4.12",
       "pinned": false,
-      "date": "2026-02-11T01:33:21Z"
+      "date": "2026-01-28T23:29:37Z"
     },
     {
       "slug": "tracktor",
@@ -1495,9 +1432,9 @@
     {
       "slug": "traefik",
       "repo": "traefik/traefik",
-      "version": "v3.6.8",
+      "version": "v3.6.7",
       "pinned": false,
-      "date": "2026-02-11T16:44:37Z"
+      "date": "2026-01-14T14:11:45Z"
     },
     {
       "slug": "trilium",
@@ -1509,16 +1446,16 @@
     {
       "slug": "trip",
       "repo": "itskovacs/TRIP",
-      "version": "1.40.0",
+      "version": "1.39.0",
       "pinned": false,
-      "date": "2026-02-10T20:12:53Z"
+      "date": "2026-02-07T16:59:51Z"
     },
     {
       "slug": "tududi",
       "repo": "chrisvel/tududi",
-      "version": "v0.88.5",
+      "version": "v0.88.4",
       "pinned": false,
-      "date": "2026-02-13T13:54:14Z"
+      "date": "2026-01-20T15:11:58Z"
     },
     {
       "slug": "tunarr",
@@ -1558,23 +1495,23 @@
     {
       "slug": "upsnap",
       "repo": "seriousm4x/UpSnap",
-      "version": "5.2.8",
+      "version": "5.2.7",
       "pinned": false,
-      "date": "2026-02-13T00:02:37Z"
+      "date": "2026-01-07T23:48:00Z"
     },
     {
       "slug": "uptimekuma",
       "repo": "louislam/uptime-kuma",
-      "version": "2.1.1",
+      "version": "2.1.0",
       "pinned": false,
-      "date": "2026-02-13T16:07:33Z"
+      "date": "2026-02-07T02:31:49Z"
     },
     {
       "slug": "vaultwarden",
       "repo": "dani-garcia/vaultwarden",
-      "version": "1.35.3",
+      "version": "1.35.2",
       "pinned": false,
-      "date": "2026-02-10T20:37:03Z"
+      "date": "2026-01-09T18:37:04Z"
     },
     {
       "slug": "victoriametrics",
@@ -1586,9 +1523,9 @@
     {
       "slug": "vikunja",
       "repo": "go-vikunja/vikunja",
```
|
||||||
"version": "v1.1.0",
|
"version": "v1.0.0",
|
||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2026-02-09T10:34:29Z"
|
"date": "2026-01-28T11:12:59Z"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"slug": "wallabag",
|
"slug": "wallabag",
|
||||||
@@ -1600,9 +1537,9 @@
|
|||||||
{
|
{
|
||||||
"slug": "wallos",
|
"slug": "wallos",
|
||||||
"repo": "ellite/Wallos",
|
"repo": "ellite/Wallos",
|
||||||
"version": "v4.6.1",
|
"version": "v4.6.0",
|
||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2026-02-10T21:06:46Z"
|
"date": "2025-12-20T15:57:51Z"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"slug": "wanderer",
|
"slug": "wanderer",
|
||||||
@@ -1635,9 +1572,9 @@
|
|||||||
{
|
{
|
||||||
"slug": "wavelog",
|
"slug": "wavelog",
|
||||||
"repo": "wavelog/wavelog",
|
"repo": "wavelog/wavelog",
|
||||||
"version": "2.3",
|
"version": "2.2.2",
|
||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2026-02-11T15:46:40Z"
|
"date": "2025-12-31T16:53:34Z"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"slug": "wealthfolio",
|
"slug": "wealthfolio",
|
||||||
@@ -1653,26 +1590,19 @@
|
|||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2025-11-11T14:30:28Z"
|
"date": "2025-11-11T14:30:28Z"
|
||||||
},
|
},
|
||||||
{
|
|
||||||
"slug": "wger",
|
|
||||||
"repo": "wger-project/wger",
|
|
||||||
"version": "2.4",
|
|
||||||
"pinned": false,
|
|
||||||
"date": "2026-01-18T12:12:02Z"
|
|
||||||
},
|
|
||||||
{
|
{
|
||||||
"slug": "wikijs",
|
"slug": "wikijs",
|
||||||
"repo": "requarks/wiki",
|
"repo": "requarks/wiki",
|
||||||
"version": "v2.5.312",
|
"version": "v2.5.311",
|
||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2026-02-12T02:45:22Z"
|
"date": "2026-01-08T09:50:00Z"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"slug": "wishlist",
|
"slug": "wishlist",
|
||||||
"repo": "cmintey/wishlist",
|
"repo": "cmintey/wishlist",
|
||||||
"version": "v0.60.0",
|
"version": "v0.59.0",
|
||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2026-02-10T04:05:26Z"
|
"date": "2026-01-19T16:42:14Z"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"slug": "wizarr",
|
"slug": "wizarr",
|
||||||
@@ -1698,9 +1628,9 @@
|
|||||||
{
|
{
|
||||||
"slug": "yubal",
|
"slug": "yubal",
|
||||||
"repo": "guillevc/yubal",
|
"repo": "guillevc/yubal",
|
||||||
"version": "v0.5.0",
|
"version": "v0.4.2",
|
||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2026-02-09T22:11:32Z"
|
"date": "2026-02-08T21:35:13Z"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"slug": "zigbee2mqtt",
|
"slug": "zigbee2mqtt",
|
||||||
@@ -1712,9 +1642,9 @@
|
|||||||
{
|
{
|
||||||
"slug": "zipline",
|
"slug": "zipline",
|
||||||
"repo": "diced/zipline",
|
"repo": "diced/zipline",
|
||||||
"version": "v4.4.2",
|
"version": "v4.4.1",
|
||||||
"pinned": false,
|
"pinned": false,
|
||||||
"date": "2026-02-11T04:58:54Z"
|
"date": "2026-01-20T01:29:01Z"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"slug": "zitadel",
|
"slug": "zitadel",
|
||||||
|
|||||||
@@ -21,7 +21,7 @@
 "resources": {
 "cpu": 1,
 "ram": 512,
-"hdd": 4,
+"hdd": 2,
 "os": "debian",
 "version": "13"
 }
@@ -32,7 +32,7 @@
 "resources": {
 "cpu": 1,
 "ram": 256,
-"hdd": 2,
+"hdd": 1,
 "os": "alpine",
 "version": "3.23"
 }
@@ -33,7 +33,7 @@
 },
 "notes": [
 {
-"text": "Kutt needs so be served with an SSL certificate for its login to work. During install, you will be prompted to choose if you want to have Caddy installed for SSL termination or if you want to use your own reverse proxy (in that case point your reverse proxy to port 3000).",
+"text": "Kutt needs so be served with an SSL certificate for its login to work. During install, you will be prompted to choose if you want to have Caddy installed for SSL termination or if you want to use your own reverse proxy (in that case point your reverse porxy to port 3000).",
 "type": "info"
 }
 ]
@@ -21,7 +21,7 @@
 "resources": {
 "cpu": 4,
 "ram": 4096,
-"hdd": 40,
+"hdd": 35,
 "os": "Ubuntu",
 "version": "24.04"
 }
@@ -21,7 +21,7 @@
 "resources": {
 "cpu": 4,
 "ram": 8192,
-"hdd": 50,
+"hdd": 25,
 "os": "debian",
 "version": "13"
 }
@@ -1,35 +1,44 @@
 {
 "name": "Prometheus Paperless NGX Exporter",
 "slug": "prometheus-paperless-ngx-exporter",
 "categories": [
 9
 ],
 "date_created": "2025-02-07",
-"type": "addon",
+"type": "ct",
 "updateable": true,
 "privileged": false,
 "interface_port": 8081,
-"documentation": "https://github.com/hansmi/prometheus-paperless-exporter",
+"documentation": null,
 "website": "https://github.com/hansmi/prometheus-paperless-exporter",
 "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/paperless-ngx.webp",
-"config_path": "/etc/prometheus-paperless-ngx-exporter/config.env",
+"config_path": "",
 "description": "Prometheus metrics exporter for Paperless-NGX, a document management system transforming physical documents into a searchable online archive. The exporter relies on Paperless' REST API.",
 "install_methods": [
 {
 "type": "default",
-"script": "tools/addon/prometheus-paperless-ngx-exporter.sh",
+"script": "ct/prometheus-paperless-ngx-exporter.sh",
 "resources": {
-"cpu": null,
+"cpu": 1,
-"ram": null,
+"ram": 256,
-"hdd": null,
+"hdd": 2,
-"os": null,
+"os": "debian",
-"version": null
+"version": "13"
 }
 }
 ],
 "default_credentials": {
 "username": null,
 "password": null
 },
-"notes": []
+"notes": [
+{
+"text": "Please adjust the Paperless URL in the systemd unit file: /etc/systemd/system/prometheus-paperless-ngx-exporter.service",
+"type": "info"
+},
+{
+"text": "Please adjust the Paperless authentication token in the configuration file: /etc/prometheus-paperless-ngx-exporter/paperless_auth_token_file",
+"type": "info"
+}
+]
 }
@@ -12,7 +12,7 @@
 "documentation": "https://radicale.org/master.html#documentation-1",
 "website": "https://radicale.org/",
 "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/radicale.webp",
-"config_path": "/etc/radicale/config",
+"config_path": "/etc/radicale/config or ~/.config/radicale/config",
 "description": "Radicale is a small but powerful CalDAV (calendars, to-do lists) and CardDAV (contacts)",
 "install_methods": [
 {
@@ -1,5 +1,5 @@
 {
-"name": "Slskd",
+"name": "slskd",
 "slug": "slskd",
 "categories": [
 11
@@ -35,6 +35,10 @@
 {
 "text": "See /opt/slskd/config/slskd.yml to add your Soulseek credentials",
 "type": "info"
+},
+{
+"text": "This LXC includes Soularr; it needs to be configured (/opt/soularr/config.ini) before it will work",
+"type": "info"
 }
 ]
 }
@@ -10,7 +10,7 @@
 "privileged": false,
 "interface_port": 3000,
 "documentation": "https://github.com/TuroYT/snowshare",
-"config_path": "/opt/snowshare.env",
+"config_path": "/opt/snowshare/.env",
 "website": "https://github.com/TuroYT/snowshare",
 "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/png/snowshare.png",
 "description": "A modern, secure file and link sharing platform built with Next.js, Prisma, and NextAuth. Share URLs, code snippets, and files with customizable expiration, privacy, and QR codes.",
@@ -32,10 +32,6 @@
 "password": null
 },
 "notes": [
-{
-"text": "SQL Server (2025) SQLPAL is incompatible with Proxmox VE 9 (Kernel 6.12+) in LXC containers. Use a VM instead or the SQL-Server 2022 LXC.",
-"type": "warning"
-},
 {
 "text": "If you choose not to run the installation setup, execute: `/opt/mssql/bin/mssql-conf setup` in LXC shell.",
 "type": "info"
@@ -14,8 +14,6 @@
 "logo": "https://cdn.jsdelivr.net/gh/selfhst/icons@main/webp/ubiquiti-unifi.webp",
 "config_path": "",
 "description": "UniFi Network Server is a software that helps manage and monitor UniFi networks (Wi-Fi, Ethernet, etc.) by providing an intuitive user interface and advanced features. It allows network administrators to configure, monitor, and upgrade network devices, as well as view network statistics, client devices, and historical events. The aim of the application is to make the management of UniFi networks easier and more efficient.",
-"disable": true,
-"disable_description": "This script is disabled because UniFi no longer delivers APT packages for Debian systems. The installation relies on APT repositories that are no longer maintained or available. For more details, see: https://github.com/community-scripts/ProxmoxVE/issues/11876",
 "install_methods": [
 {
 "type": "default",
@@ -58,7 +58,7 @@ DISABLE_REGISTRATION=False
 EOF
 cd /opt/adventurelog/backend/server
 mkdir -p /opt/adventurelog/backend/server/media
-$STD uv venv --clear /opt/adventurelog/backend/server/.venv
+$STD uv venv /opt/adventurelog/backend/server/.venv
 $STD /opt/adventurelog/backend/server/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/adventurelog/backend/server/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/adventurelog/backend/server/.venv/bin/python -m pip install -r requirements.txt
@@ -77,7 +77,7 @@ echo "${KEPUB_VERSION#v}" >"$INSTALL_DIR"/KEPUBIFY_RELEASE
 sed 's/^/v/' ~/.autocaliweb >"$INSTALL_DIR"/ACW_RELEASE

 cd "$INSTALL_DIR"
-$STD uv venv --clear "$VIRTUAL_ENV"
+$STD uv venv "$VIRTUAL_ENV"
 $STD uv sync --all-extras --active
 cat <<EOF >./dirs.json
 {
@@ -29,7 +29,7 @@ fetch_and_deploy_gh_release "babybuddy" "babybuddy/babybuddy" "tarball"
 msg_info "Installing Babybuddy"
 mkdir -p /opt/data
 cd /opt/babybuddy
-$STD uv venv --clear .venv
+$STD uv venv .venv
 $STD source .venv/bin/activate
 $STD uv pip install -r requirements.txt
 cp babybuddy/settings/production.example.py babybuddy/settings/production.py
@@ -20,7 +20,7 @@ msg_info "Installing Bazarr"
 mkdir -p /var/lib/bazarr/
 chmod 775 /opt/bazarr /var/lib/bazarr/
 sed -i.bak 's/--only-binary=Pillow//g' /opt/bazarr/requirements.txt
-$STD uv venv --clear /opt/bazarr/venv --python 3.12
+$STD uv venv /opt/bazarr/venv --python 3.12
 $STD uv pip install -r /opt/bazarr/requirements.txt --python /opt/bazarr/venv/bin/python3
 msg_ok "Installed Bazarr"

@@ -36,7 +36,7 @@ PYTHON_VERSION="3.12" setup_uv
 fetch_and_deploy_gh_release "ComfyUI" "comfyanonymous/ComfyUI" "tarball" "latest" "/opt/ComfyUI"

 msg_info "Python dependencies"
-$STD uv venv --clear "/opt/ComfyUI/venv"
+$STD uv venv "/opt/ComfyUI/venv"

 if [[ "${comfyui_gpu_type,,}" == "nvidia" ]]; then
 pytorch_url="https://download.pytorch.org/whl/cu130"
@@ -16,8 +16,7 @@ update_os
 msg_info "Installing Dependencies"
 $STD apt install -y \
 python3-pip \
-python3-libtorrent \
-python3-setuptools
+python3-libtorrent
 msg_ok "Installed Dependencies"

 msg_info "Installing Deluge"
@@ -36,8 +36,8 @@ fetch_and_deploy_gh_release "dispatcharr" "Dispatcharr/Dispatcharr" "tarball"

 msg_info "Installing Python Dependencies with uv"
 cd /opt/dispatcharr
-$STD uv venv --clear
-$STD uv sync
+$STD uv venv
+$STD uv pip install -r requirements.txt --index-strategy unsafe-best-match
 $STD uv pip install gunicorn gevent celery redis daphne
 msg_ok "Installed Python Dependencies"

@@ -1,25 +0,0 @@
-#!/usr/bin/env bash
-
-# Copyright (c) 2021-2026 community-scripts ORG
-# Author: Slaviša Arežina (tremor021)
-# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
-# Source: https://www.drawio.com/
-
-source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
-color
-verb_ip6
-catch_errors
-setting_up_container
-network_check
-update_os
-setup_hwaccel
-
-msg_info "Installing Dependencies"
-$STD apt install -y tomcat11
-msg_ok "Installed Dependencies"
-
-USE_ORIGINAL_FILENAME=true fetch_and_deploy_gh_release "drawio" "jgraph/drawio" "singlefile" "latest" "/var/lib/tomcat11/webapps" "draw.war"
-
-motd_ssh
-customize
-cleanup_lxc
@@ -31,10 +31,8 @@ setup_deb822_repo "matrix-org" \
 "main"
 echo "matrix-synapse-py3 matrix-synapse/server-name string $servername" | debconf-set-selections
 echo "matrix-synapse-py3 matrix-synapse/report-stats boolean false" | debconf-set-selections
-echo "exit 101" >/usr/sbin/policy-rc.d
-chmod +x /usr/sbin/policy-rc.d
 $STD apt install matrix-synapse-py3 -y
-rm -f /usr/sbin/policy-rc.d
+systemctl stop matrix-synapse
 sed -i 's/127.0.0.1/0.0.0.0/g' /etc/matrix-synapse/homeserver.yaml
 sed -i 's/'\''::1'\'', //g' /etc/matrix-synapse/homeserver.yaml
 SECRET=$(openssl rand -hex 32)
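The matrix-synapse hunk above drops the `policy-rc.d` guard that kept the package from auto-starting its service during `apt install`. As a hedged sketch of how that guard works (the 101 exit code is Debian's invoke-rc.d "action forbidden by policy" answer; this demo writes a temp file instead of the real /usr/sbin/policy-rc.d):

```shell
# While /usr/sbin/policy-rc.d exists and exits 101, invoke-rc.d (called by
# package maintainer scripts) refuses to start services, so packages install
# without their daemons coming up. Demonstrated against a temp file here.
stub="$(mktemp)"
echo "exit 101" >"$stub"
chmod +x "$stub"
sh "$stub"
echo "policy answer: $?"   # prints "policy answer: 101"
rm -f "$stub"
```

The replacement in the new code simply lets the service start and then runs `systemctl stop matrix-synapse` afterwards.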
@@ -38,18 +38,6 @@ rm -f "$DEB_FILE"
 echo "$LATEST_VERSION" >~/.emqx
 msg_ok "Installed EMQX"

-read -r -p "${TAB3}Would you like to disable the EMQX MQ feature? (reduces disk/CPU usage) <y/N> " prompt
-if [[ ${prompt,,} =~ ^(y|yes)$ ]]; then
-msg_info "Disabling EMQX MQ feature"
-mkdir -p /etc/emqx
-if ! grep -q "^mq.enable" /etc/emqx/emqx.conf 2>/dev/null; then
-echo "mq.enable = false" >>/etc/emqx/emqx.conf
-else
-sed -i 's/^mq.enable.*/mq.enable = false/' /etc/emqx/emqx.conf
-fi
-msg_ok "Disabled EMQX MQ feature"
-fi
-
 msg_info "Starting EMQX service"
 $STD systemctl enable -q --now emqx
 msg_ok "Enabled EMQX service"
@@ -86,7 +86,7 @@ $STD uv tool update-shell
 export PATH="/root/.local/bin:$PATH"
 $STD poetry self add poetry-plugin-export
 $STD poetry export -f requirements.txt --output requirements.txt --without-hashes
-$STD uv venv --clear
+$STD uv venv
 $STD uv pip install -r requirements.txt
 msg_ok "Setup Backend"

@@ -23,7 +23,7 @@ msg_info "Setting up Virtual Environment"
 mkdir -p /opt/esphome
 mkdir -p /root/config
 cd /opt/esphome
-$STD uv venv --clear /opt/esphome/.venv
+$STD uv venv /opt/esphome/.venv
 $STD /opt/esphome/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/esphome/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/esphome/.venv/bin/python -m pip install esphome tornado esptool
@@ -15,30 +15,31 @@ network_check
 update_os

 msg_info "Installing Dependencies"
-$STD apt install -y \
+$STD apt-get install -y \
 ffmpeg \
+jq \
 imagemagick
 msg_ok "Installed Dependencies"

 setup_hwaccel

 msg_info "Installing ASP.NET Core Runtime"
-setup_deb822_repo \
-"microsoft" \
-"https://packages.microsoft.com/keys/microsoft-2025.asc" \
-"https://packages.microsoft.com/debian/13/prod/" \
-"trixie"
-$STD apt install -y aspnetcore-runtime-8.0
+curl -fsSL https://packages.microsoft.com/config/debian/13/packages-microsoft-prod.deb -o packages-microsoft-prod.deb
+$STD dpkg -i packages-microsoft-prod.deb
+rm -rf packages-microsoft-prod.deb
+$STD apt-get update
+$STD apt-get install -y aspnetcore-runtime-8.0
 msg_ok "Installed ASP.NET Core Runtime"

-fetch_and_deploy_from_url "https://fileflows.com/downloads/zip" "/opt/fileflows"
-
 msg_info "Setup FileFlows"
 $STD ln -svf /usr/bin/ffmpeg /usr/local/bin/ffmpeg
 $STD ln -svf /usr/bin/ffprobe /usr/local/bin/ffprobe
-cd /opt/fileflows/Server
-dotnet FileFlows.Server.dll --systemd install --root true
+temp_file=$(mktemp)
+curl -fsSL https://fileflows.com/downloads/zip -o "$temp_file"
+$STD unzip -d /opt/fileflows "$temp_file"
+$STD bash -c "cd /opt/fileflows/Server && dotnet FileFlows.Server.dll --systemd install --root true"
 systemctl enable -q --now fileflows
+rm -f "$temp_file"
 msg_ok "Setup FileFlows"

 motd_ssh
@@ -17,7 +17,7 @@ PYTHON_VERSION="3.12" setup_uv
 fetch_and_deploy_gh_release "huntarr" "plexguide/Huntarr.io" "tarball"

 msg_info "Configure Huntarr"
-$STD uv venv --clear /opt/huntarr/.venv
+$STD uv venv /opt/huntarr/.venv
 $STD uv pip install --python /opt/huntarr/.venv/bin/python -r /opt/huntarr/requirements.txt
 msg_ok "Configured Huntrarr"

@@ -289,7 +289,7 @@ ML_DIR="${APP_DIR}/machine-learning"
 GEO_DIR="${INSTALL_DIR}/geodata"
 mkdir -p {"${APP_DIR}","${UPLOAD_DIR}","${GEO_DIR}","${INSTALL_DIR}"/cache}

-fetch_and_deploy_gh_release "Immich" "immich-app/immich" "tarball" "v2.5.6" "$SRC_DIR"
+fetch_and_deploy_gh_release "Immich" "immich-app/immich" "tarball" "v2.5.5" "$SRC_DIR"
 PNPM_VERSION="$(jq -r '.packageManager | split("@")[1]' ${SRC_DIR}/package.json)"
 NODE_VERSION="24" NODE_MODULE="pnpm@${PNPM_VERSION}" setup_nodejs

@@ -18,7 +18,7 @@ PYTHON_VERSION="3.12" setup_uv
 msg_info "Installing Jupyter"
 mkdir -p /opt/jupyter
 cd /opt/jupyter
-$STD uv venv --clear /opt/jupyter/.venv
+$STD uv venv /opt/jupyter/.venv
 $STD /opt/jupyter/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/jupyter/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/jupyter/.venv/bin/python -m pip install jupyter
@@ -22,7 +22,7 @@ fetch_and_deploy_gh_release "kapowarr" "Casvt/Kapowarr" "tarball"

 msg_info "Setup Kapowarr"
 cd /opt/kapowarr
-$STD uv venv --clear .venv
+$STD uv venv .venv
 $STD source .venv/bin/activate
 $STD uv pip install --upgrade pip
 $STD uv pip install --no-cache-dir -r requirements.txt
@@ -20,19 +20,10 @@ msg_ok "Installed Docker"
 msg_info "Detecting latest Kasm Workspaces release"
 KASM_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_[0-9]+\.[0-9]+\.[0-9]+\.[a-z0-9]+\.tar\.gz' | head -n 1)
 if [[ -z "$KASM_URL" ]]; then
-SERVICE_IMAGE_URL=$(curl -fsSL "https://www.kasm.com/downloads" | tr '\n' ' ' | grep -oE 'https://kasm-static-content[^"]*kasm_release_service_images_amd64_[0-9]+\.[0-9]+\.[0-9]+\.tar\.gz' | head -n 1)
-if [[ -n "$SERVICE_IMAGE_URL" ]]; then
-KASM_VERSION=$(echo "$SERVICE_IMAGE_URL" | sed -E 's/.*kasm_release_service_images_amd64_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
-KASM_URL="https://kasm-static-content.s3.amazonaws.com/kasm_release_${KASM_VERSION}.tar.gz"
-fi
-else
-KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
-fi
-
-if [[ -z "$KASM_URL" ]] || [[ -z "$KASM_VERSION" ]]; then
 msg_error "Unable to detect latest Kasm release URL."
 exit 1
 fi
+KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
 msg_ok "Detected Kasm Workspaces version $KASM_VERSION"

 msg_warn "WARNING: This script will run an external installer from a third-party source (https://www.kasmweb.com/)."
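The Kasm hunk above collapses version detection into a single `sed -E` capture on the discovered release URL. A minimal sketch of that extraction, run against a made-up example URL (the real script discovers the URL with curl, so the version here is purely illustrative):

```shell
# Pull "major.minor.patch" out of a Kasm release tarball URL with sed -E.
# The URL below is a hypothetical example, not fetched from kasm.com.
KASM_URL="https://kasm-static-content.s3.amazonaws.com/kasm_release_1.16.1.tar.gz"
KASM_VERSION=$(echo "$KASM_URL" | sed -E 's/.*kasm_release_([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
echo "$KASM_VERSION"   # prints 1.16.1
```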
@@ -50,7 +50,7 @@ $STD useradd librenms -d /opt/librenms -M -r -s "$(which bash)"
 mkdir -p /opt/librenms/{rrd,logs,bootstrap/cache,storage,html}
 cd /opt/librenms
 APP_KEY=$(openssl rand -base64 40 | tr -dc 'a-zA-Z0-9')
-$STD uv venv --clear .venv
+$STD uv venv .venv
 $STD source .venv/bin/activate
 $STD uv pip install -r requirements.txt
 cat <<EOF >/opt/librenms/.env
@@ -37,13 +37,18 @@ PYTHON_VERSION="3.12" setup_uv
 fetch_and_deploy_gh_release "libretranslate" "LibreTranslate/LibreTranslate" "tarball"

 msg_info "Setup LibreTranslate (Patience)"
+TORCH_VERSION=$(grep -Eo '"torch ==[0-9]+\.[0-9]+\.[0-9]+' /opt/libretranslate/pyproject.toml |
+tail -n1 | sed 's/.*==//')
+if [[ -z "$TORCH_VERSION" ]]; then
+TORCH_VERSION="2.5.0"
+fi
 cd /opt/libretranslate
-$STD uv venv --clear .venv --python 3.12
+$STD uv venv .venv --python 3.12
 $STD source .venv/bin/activate
-$STD uv pip install --upgrade pip
-$STD uv pip install "setuptools<81"
+$STD uv pip install --upgrade pip setuptools
 $STD uv pip install Babel==2.12.1
 $STD .venv/bin/python scripts/compile_locales.py
+$STD uv pip install "torch==${TORCH_VERSION}" --extra-index-url https://download.pytorch.org/whl/cpu
 $STD uv pip install "numpy<2"
 $STD uv pip install .
 $STD uv pip install libretranslate
@@ -42,7 +42,7 @@ msg_ok "Set up PostgreSQL"
 msg_info "Setting up Virtual Environment"
 mkdir -p /opt/litellm
 cd /opt/litellm
-$STD uv venv --clear /opt/litellm/.venv
+$STD uv venv /opt/litellm/.venv
 $STD /opt/litellm/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/litellm/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/litellm/.venv/bin/python -m pip install litellm[proxy] prisma
@@ -29,7 +29,7 @@ fetch_and_deploy_gh_release "mylar3" "mylar3/mylar3" "tarball"

 msg_info "Installing ${APPLICATION}"
 mkdir -p /opt/mylar3-data
-$STD uv venv --clear /opt/mylar3/.venv
+$STD uv venv /opt/mylar3/.venv
 $STD /opt/mylar3/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/mylar3/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/mylar3/.venv/bin/python -m pip install --no-cache-dir -r /opt/mylar3/requirements.txt
@@ -37,6 +37,7 @@ PageSize = 10
 Host = 0.0.0.0
 Port = 9000
 RunMode = release
+JwtSecret = $(openssl rand -hex 32)

 [cert]
 HTTPChallengePort = 9180
@@ -38,10 +38,6 @@ for server in "${servers[@]}"; do
   fi
 done

-msg_info "Installing dependencies"
-$STD apt install -y inotify-tools
-msg_ok "Installed dependencies"
-
 msg_info "Installing Collabora Online"
 curl -fsSL https://collaboraoffice.com/downloads/gpg/collaboraonline-release-keyring.gpg -o /etc/apt/keyrings/collaboraonline-release-keyring.gpg
 cat <<EOF >/etc/apt/sources.list.d/colloboraonline.sources
@@ -152,15 +148,8 @@ COLLABORATION_JWT_SECRET=
 # FRONTEND_FULL_TEXT_SEARCH_ENABLED=true
 # SEARCH_EXTRACTOR_TIKA_TIKA_URL=<your-tika-url>

-## Uncomment below to enable PosixFS Collaborative Mode
-## Increase inotify watch/instance limits on your PVE host:
-### sysctl -w fs.inotify.max_user_watches=1048576
-### sysctl -w fs.inotify.max_user_instances=1024
-# STORAGE_USERS_POSIX_ENABLE_COLLABORATION=true
-# STORAGE_USERS_POSIX_WATCH_TYPE=inotifywait
-# STORAGE_USERS_POSIX_WATCH_FS=true
-# STORAGE_USERS_POSIX_WATCH_PATH=<path-to-storage-or-bind-mount>
-## User files location - experimental - use at your own risk! - ZFS, NFS v4.2+ supported - CIFS/SMB not supported
+## External storage test - Only NFS v4.2+ is supported
+## User files
 # STORAGE_USERS_POSIX_ROOT=<path-to-your-bind_mount>
 EOF

@@ -24,7 +24,7 @@ setup_hwaccel
 PYTHON_VERSION="3.12" setup_uv

 msg_info "Installing Open WebUI"
-$STD uv tool install --python 3.12 --constraint <(echo "numba>=0.60") open-webui[all]
+$STD uv tool install --python 3.12 open-webui[all]
 msg_ok "Installed Open WebUI"

 read -r -p "${TAB3}Would you like to add Ollama? <y/N> " prompt
@@ -178,7 +178,7 @@ http:
 servers:
   - url: "http://$LOCAL_IP:3000"
 EOF
-$STD npm run db:push
+$STD npm run db:sqlite:push

 . /etc/os-release
 if [ "$VERSION_CODENAME" = "trixie" ]; then
install/prometheus-paperless-ngx-exporter-install.sh (new executable file, 47 lines)
@@ -0,0 +1,47 @@
+#!/usr/bin/env bash
+
+# Copyright (c) 2021-2026 community-scripts ORG
+# Author: Andy Grunwald (andygrunwald)
+# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
+# Source: https://github.com/hansmi/prometheus-paperless-exporter
+
+source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
+color
+verb_ip6
+catch_errors
+setting_up_container
+network_check
+update_os
+
+fetch_and_deploy_gh_release "prom-paperless-exp" "hansmi/prometheus-paperless-exporter" "binary"
+
+msg_info "Configuring Prometheus Paperless NGX Exporter"
+mkdir -p /etc/prometheus-paperless-ngx-exporter
+echo "SECRET_AUTH_TOKEN" >/etc/prometheus-paperless-ngx-exporter/paperless_auth_token_file
+msg_ok "Configured Prometheus Paperless NGX Exporter"
+
+msg_info "Creating Service"
+cat <<EOF >/etc/systemd/system/prometheus-paperless-ngx-exporter.service
+[Unit]
+Description=Prometheus Paperless NGX Exporter
+Wants=network-online.target
+After=network-online.target
+
+[Service]
+User=root
+Restart=always
+Type=simple
+ExecStart=/usr/bin/prometheus-paperless-exporter \
+  --paperless_url=http://paperless.example.org \
+  --paperless_auth_token_file=/etc/prometheus-paperless-ngx-exporter/paperless_auth_token_file
+ExecReload=/bin/kill -HUP \$MAINPID
+
+[Install]
+WantedBy=multi-user.target
+EOF
+systemctl enable -q --now prometheus-paperless-ngx-exporter
+msg_ok "Created Service"
+
+motd_ssh
+customize
+cleanup_lxc
@@ -19,7 +19,7 @@ msg_info "Installing Prometheus Proxmox VE Exporter"
 mkdir -p /opt/prometheus-pve-exporter
 cd /opt/prometheus-pve-exporter

-$STD uv venv --clear /opt/prometheus-pve-exporter/.venv
+$STD uv venv /opt/prometheus-pve-exporter/.venv
 $STD /opt/prometheus-pve-exporter/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/prometheus-pve-exporter/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/prometheus-pve-exporter/.venv/bin/python -m pip install prometheus-pve-exporter
@@ -14,51 +14,42 @@ network_check
 update_os

 msg_info "Installing Dependencies"
-$STD apt install -y apache2-utils
+$STD apt install -y \
+  apache2-utils \
+  python3-pip \
+  python3-venv
 msg_ok "Installed Dependencies"

-PYTHON_VERSION="3.13" setup_uv
-fetch_and_deploy_gh_release "Radicale" "Kozea/Radicale" "tarball" "latest" "/opt/radicale"
 msg_info "Setting up Radicale"
-cd /opt/radicale
+python3 -m venv /opt/radicale
+source /opt/radicale/bin/activate
+$STD python3 -m pip install --upgrade https://github.com/Kozea/Radicale/archive/master.tar.gz
 RNDPASS=$(openssl rand -base64 18 | tr -dc 'a-zA-Z0-9' | head -c13)
-$STD htpasswd -c -b -5 /opt/radicale/users admin "$RNDPASS"
+$STD htpasswd -c -b -5 /opt/radicale/users admin $RNDPASS
 {
 echo "Radicale Credentials"
 echo "Admin User: admin"
 echo "Admin Password: $RNDPASS"
 } >>~/radicale.creds
+msg_ok "Done setting up Radicale"

-mkdir -p /etc/radicale
-cat <<EOF >/etc/radicale/config
-[server]
-hosts = 0.0.0.0:5232
-
-[auth]
-type = htpasswd
-htpasswd_filename = /opt/radicale/users
-htpasswd_encryption = sha512
-
-[storage]
-type = multifilesystem
-filesystem_folder = /var/lib/radicale/collections
-
-[web]
-type = internal
+msg_info "Setup Service"
+cat <<EOF >/opt/radicale/start.sh
+#!/usr/bin/env bash
+source /opt/radicale/bin/activate
+python3 -m radicale --storage-filesystem-folder=/var/lib/radicale/collections --hosts 0.0.0.0:5232 --auth-type htpasswd --auth-htpasswd-filename /opt/radicale/users --auth-htpasswd-encryption sha512
 EOF
-msg_ok "Set up Radicale"

-msg_info "Creating Service"
+chmod +x /opt/radicale/start.sh
+
 cat <<EOF >/etc/systemd/system/radicale.service
-[Unit]
 Description=A simple CalDAV (calendar) and CardDAV (contact) server
 After=network.target
 Requires=network.target

 [Service]
-WorkingDirectory=/opt/radicale
-ExecStart=/usr/local/bin/uv run -m radicale --config /etc/radicale/config
+ExecStart=/opt/radicale/start.sh
 Restart=on-failure
 # User=radicale
 # Deny other users access to the calendar data
@@ -36,7 +36,7 @@ msg_ok "Setup Unrar"
 fetch_and_deploy_gh_release "sabnzbd-org" "sabnzbd/sabnzbd" "prebuild" "latest" "/opt/sabnzbd" "SABnzbd-*-src.tar.gz"

 msg_info "Installing SABnzbd"
-$STD uv venv --clear /opt/sabnzbd/venv
+$STD uv venv /opt/sabnzbd/venv
 $STD uv pip install -r /opt/sabnzbd/requirements.txt --python=/opt/sabnzbd/venv/bin/python
 msg_ok "Installed SABnzbd"

@@ -18,7 +18,7 @@ fetch_and_deploy_gh_release "scrappar" "thecfu/scraparr" "tarball" "latest" "/op

 msg_info "Installing Scraparr"
 cd /opt/scraparr
-$STD uv venv --clear /opt/scraparr/.venv
+$STD uv venv /opt/scraparr/.venv
 $STD /opt/scraparr/.venv/bin/python -m ensurepip --upgrade
 $STD /opt/scraparr/.venv/bin/python -m pip install --upgrade pip
 $STD /opt/scraparr/.venv/bin/python -m pip install -r /opt/scraparr/src/scraparr/requirements.txt
@@ -131,7 +131,7 @@ msg_ok "Built Shelfmark frontend"

 msg_info "Configuring Shelfmark"
 cd /opt/shelfmark
-$STD uv venv --clear ./venv
+$STD uv venv ./venv
 $STD source ./venv/bin/activate
 $STD uv pip install -r ./requirements-base.txt
 [[ "$DEPLOYMENT_TYPE" == "1" ]] && $STD uv pip install -r ./requirements-shelfmark.txt
@@ -3,7 +3,7 @@
 # Copyright (c) 2021-2026 community-scripts ORG
 # Author: vhsdream
 # License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
-# Source: https://github.com/slskd/slskd/, https://github.com/mrusse/soularr
+# Source: https://github.com/slskd/slskd/, https://soularr.net

 source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
 color
@@ -13,71 +13,71 @@ setting_up_container
 network_check
 update_os

-fetch_and_deploy_gh_release "Slskd" "slskd/slskd" "prebuild" "latest" "/opt/slskd" "slskd-*-linux-x64.zip"
+msg_info "Installing Dependencies"
+$STD apt install -y \
+  python3-pip
+msg_ok "Installed Dependencies"

-msg_info "Configuring Slskd"
+msg_info "Setup ${APPLICATION}"
+tmp_file=$(mktemp)
+RELEASE=$(curl -s https://api.github.com/repos/slskd/slskd/releases/latest | grep "tag_name" | awk '{print substr($2, 2, length($2)-3) }')
+curl -fsSL "https://github.com/slskd/slskd/releases/download/${RELEASE}/slskd-${RELEASE}-linux-x64.zip" -o $tmp_file
+$STD unzip $tmp_file -d /opt/${APPLICATION}
+echo "${RELEASE}" >/opt/${APPLICATION}_version.txt
 JWT_KEY=$(openssl rand -base64 44)
 SLSKD_API_KEY=$(openssl rand -base64 44)
-cp /opt/slskd/config/slskd.example.yml /opt/slskd/config/slskd.yml
+cp /opt/${APPLICATION}/config/slskd.example.yml /opt/${APPLICATION}/config/slskd.yml
 sed -i \
-  -e '/web:/,/cidr/s/^# //' \
-  -e '/https:/,/port: 5031/s/false/true/' \
-  -e '/port: 5030/,/socket/s/,.*$//' \
-  -e '/content_path:/,/authentication/s/false/true/' \
+  -e "\|web:|,\|cidr|s|^#||" \
+  -e "\|https:|,\|5031|s|false|true|" \
   -e "\|api_keys|,\|cidr|s|<some.*$|$SLSKD_API_KEY|; \
     s|role: readonly|role: readwrite|; \
     s|0.0.0.0/0,::/0|& # Replace this with your subnet|" \
+  -e "\|soulseek|,\|write_queue|s|^#||" \
   -e "\|jwt:|,\|ttl|s|key: ~|key: $JWT_KEY|" \
-  -e '/soulseek/,/write_queue/s/^# //' \
-  -e 's/^.*picture/#&/' /opt/slskd/config/slskd.yml
-msg_ok "Configured Slskd"
+  -e "s|^ picture|# picture|" \
+  /opt/${APPLICATION}/config/slskd.yml
+msg_ok "Setup ${APPLICATION}"

-read -rp "${TAB3}Do you want to install Soularr? y/N " soularr
-if [[ ${soularr,,} =~ ^(y|yes)$ ]]; then
-  PYTHON_VERSION="3.11" setup_uv
-  fetch_and_deploy_gh_release "Soularr" "mrusse/soularr" "tarball" "latest" "/opt/soularr"
-  cd /opt/soularr
-  $STD uv venv venv
-  $STD source venv/bin/activate
-  $STD uv pip install -r requirements.txt
+msg_info "Installing Soularr"
+rm -rf /usr/lib/python3.*/EXTERNALLY-MANAGED
+cd /tmp
+curl -fsSL -o main.zip https://github.com/mrusse/soularr/archive/refs/heads/main.zip
+$STD unzip main.zip
+mv soularr-main /opt/soularr
+cd /opt/soularr
+$STD pip install -r requirements.txt
 sed -i \
   -e "\|[Slskd]|,\|host_url|s|yourslskdapikeygoeshere|$SLSKD_API_KEY|" \
   -e "/host_url/s/slskd/localhost/" \
   /opt/soularr/config.ini
-cat <<EOF >/opt/soularr/run.sh
-#!/usr/bin/env bash
-
-if ps aux | grep "[s]oularr.py" >/dev/null; then
-  echo "Soularr is already running. Exiting..."
-  exit 1
-else
-  source /opt/soularr/venv/bin/activate
-  uv run python3 -u /opt/soularr/soularr.py --config-dir /opt/soularr
-fi
-EOF
-chmod +x /opt/soularr/run.sh
-deactivate
-msg_ok "Installed Soularr"
-fi
+sed -i \
+  -e "/#This\|#Default\|INTERVAL/{N;d;}" \
+  -e "/while\|#Pass/d" \
+  -e "\|python|s|app|opt/soularr|; s|python|python3|" \
+  -e "/dt/,+2d" \
+  /opt/soularr/run.sh
+sed -i -E "/(soularr.py)/s/.{5}$//; /if/,/fi/s/.{4}//" /opt/soularr/run.sh
+chmod +x /opt/soularr/run.sh
+msg_ok "Installed Soularr"

-msg_info "Creating Service"
-cat <<EOF >/etc/systemd/system/slskd.service
+msg_info "Creating Services"
+cat <<EOF >/etc/systemd/system/${APPLICATION}.service
 [Unit]
-Description=Slskd Service
+Description=${APPLICATION} Service
 After=network.target
 Wants=network.target

 [Service]
-WorkingDirectory=/opt/slskd
-ExecStart=/opt/slskd/slskd --config /opt/slskd/config/slskd.yml
+WorkingDirectory=/opt/${APPLICATION}
+ExecStart=/opt/${APPLICATION}/slskd --config /opt/${APPLICATION}/config/slskd.yml
 Restart=always

 [Install]
 WantedBy=multi-user.target
 EOF

-if [[ -d /opt/soularr ]]; then
-cat <<EOF >/etc/systemd/system/soularr.timer
+cat <<EOF >/etc/systemd/system/soularr.timer
 [Unit]
 Description=Soularr service timer
 RefuseManualStart=no
@@ -85,15 +85,15 @@ RefuseManualStop=no

 [Timer]
 Persistent=true
-# run every 10 minutes
-OnCalendar=*-*-* *:0/10:00
+# run every 5 minutes
+OnCalendar=*-*-* *:0/5:00
 Unit=soularr.service

 [Install]
 WantedBy=timers.target
 EOF

 cat <<EOF >/etc/systemd/system/soularr.service
 [Unit]
 Description=Soularr service
 After=network.target slskd.service
@@ -106,9 +106,10 @@ ExecStart=/bin/bash -c /opt/soularr/run.sh
 [Install]
 WantedBy=multi-user.target
 EOF
-msg_warn "Add your Lidarr API key to Soularr in '/opt/soularr/config.ini', then run 'systemctl enable --now soularr.timer'"
-fi
-systemctl enable -q --now slskd
+systemctl enable -q --now ${APPLICATION}
+systemctl enable -q soularr.timer
+rm -rf $tmp_file
+rm -rf /tmp/main.zip
 msg_ok "Created Services"

 motd_ssh
@@ -61,7 +61,7 @@ msg_ok "Installed LibreOffice Components"

 msg_info "Installing Python Dependencies"
 mkdir -p /tmp/stirling-pdf
-$STD uv venv --clear /opt/.venv
+$STD uv venv /opt/.venv
 export PATH="/opt/.venv/bin:$PATH"
 source /opt/.venv/bin/activate
 $STD uv pip install --upgrade pip
@@ -22,7 +22,7 @@ fetch_and_deploy_gh_release "streamlink-webui" "CrazyWolf13/streamlink-webui" "t

 msg_info "Setup ${APPLICATION}"
 mkdir -p "/opt/${APPLICATION}-download"
-$STD uv venv --clear /opt/"${APPLICATION}"/backend/src/.venv
+$STD uv venv /opt/"${APPLICATION}"/backend/src/.venv
 source /opt/"${APPLICATION}"/backend/src/.venv/bin/activate
 $STD uv pip install -r /opt/streamlink-webui/backend/src/requirements.txt --python=/opt/"${APPLICATION}"/backend/src/.venv
 cd /opt/"${APPLICATION}"/frontend/src
@@ -40,7 +40,7 @@ SECRET_KEY=$(openssl rand -base64 45 | sed 's/\//\\\//g')
 msg_info "Setup Tandoor"
 mkdir -p /opt/tandoor/{config,api,mediafiles,staticfiles}
 cd /opt/tandoor
-$STD uv venv --clear .venv --python=python3
+$STD uv venv .venv --python=python3
 $STD uv pip install -r requirements.txt --python .venv/bin/python
 cd /opt/tandoor/vue3
 $STD yarn install
@@ -25,7 +25,7 @@ cd /opt/Tautulli
 TAUTULLI_VERSION=$(get_latest_github_release "Tautulli/Tautulli" "false")
 echo "${TAUTULLI_VERSION}" >/opt/Tautulli/version.txt
 echo "master" >/opt/Tautulli/branch.txt
-$STD uv venv --clear
+$STD uv venv
 $STD source /opt/Tautulli/.venv/bin/activate
 $STD uv pip install -r requirements.txt
 $STD uv pip install pyopenssl
@@ -30,7 +30,7 @@ msg_ok "Built Frontend"

 msg_info "Setting up Backend"
 cd /opt/trip/backend
-$STD uv venv --clear /opt/trip/.venv
+$STD uv venv /opt/trip/.venv
 $STD uv pip install --python /opt/trip/.venv/bin/python -r trip/requirements.txt
 msg_ok "Set up Backend"

@@ -27,6 +27,68 @@ msg_ok "Installed Dependencies"

 fetch_and_deploy_gh_release "UmlautAdaptarr" "PCJones/Umlautadaptarr" "prebuild" "latest" "/opt/UmlautAdaptarr" "linux-x64.zip"

+msg_info "Setting up UmlautAdaptarr"
+cat <<EOF >/opt/UmlautAdaptarr/appsettings.json
+{
+  "Logging": {
+    "LogLevel": {
+      "Default": "Information",
+      "Microsoft.AspNetCore": "Warning"
+    },
+    "Console": {
+      "TimestampFormat": "yyyy-MM-dd HH:mm:ss::"
+    }
+  },
+  "AllowedHosts": "*",
+  "Kestrel": {
+    "Endpoints": {
+      "Http": {
+        "Url": "http://[::]:5005"
+      }
+    }
+  },
+  "Settings": {
+    "UserAgent": "UmlautAdaptarr/1.0",
+    "UmlautAdaptarrApiHost": "https://umlautadaptarr.pcjones.de/api/v1",
+    "IndexerRequestsCacheDurationInMinutes": 12
+  },
+  "Sonarr": [
+    {
+      "Enabled": false,
+      "Name": "Sonarr",
+      "Host": "http://192.168.1.100:8989",
+      "ApiKey": "dein_sonarr_api_key"
+    }
+  ],
+  "Radarr": [
+    {
+      "Enabled": false,
+      "Name": "Radarr",
+      "Host": "http://192.168.1.101:7878",
+      "ApiKey": "dein_radarr_api_key"
+    }
+  ],
+  "Lidarr": [
+    {
+      "Enabled": false,
+      "Host": "http://192.168.1.102:8686",
+      "ApiKey": "dein_lidarr_api_key"
+    },
+  ],
+  "Readarr": [
+    {
+      "Enabled": false,
+      "Host": "http://192.168.1.103:8787",
+      "ApiKey": "dein_readarr_api_key"
+    },
+  ],
+  "IpLeakTest": {
+    "Enabled": false
+  }
+}
+EOF
+msg_ok "Setup UmlautAdaptarr"
+
 msg_info "Creating Service"
 cat <<EOF >/etc/systemd/system/umlautadaptarr.service
 [Unit]
@@ -15,18 +15,16 @@ update_os

 msg_info "Installing Dependencies"
 $STD apt install -y apt-transport-https
-curl -fsSL "https://dl.ui.com/unifi/unifi-repo.gpg" -o "/usr/share/keyrings/unifi-repo.gpg"
-cat <<EOF | sudo tee /etc/apt/sources.list.d/100-ubnt-unifi.sources >/dev/null
-Types: deb
-URIs: https://www.ui.com/downloads/unifi/debian
-Suites: stable
-Components: ubiquiti
-Architectures: amd64
-Signed-By: /usr/share/keyrings/unifi-repo.gpg
-EOF
-$STD apt update
 msg_ok "Installed Dependencies"

+setup_deb822_repo \
+  "unifi" \
+  "https://dl.ui.com/unifi/unifi-repo.gpg" \
+  "https://www.ui.com/downloads/unifi/debian" \
+  "stable" \
+  "ubiquiti" \
+  "amd64"
+
 JAVA_VERSION="21" setup_java

 if lscpu | grep -q 'avx'; then
@@ -49,7 +49,7 @@ fetch_and_deploy_gh_release "warracker" "sassanix/Warracker" "tarball" "latest"

 msg_info "Installing Warracker"
 cd /opt/warracker/backend
-$STD uv venv --clear .venv
+$STD uv venv .venv
 $STD source .venv/bin/activate
 $STD uv pip install -r requirements.txt
 mv /opt/warracker/env.example /opt/.env
misc/api.func (1167 lines changed): diff suppressed because it is too large
561
misc/build.func
561
misc/build.func
@@ -38,16 +38,15 @@
|
|||||||
# - Captures app-declared resource defaults (CPU, RAM, Disk)
|
# - Captures app-declared resource defaults (CPU, RAM, Disk)
|
||||||
# ------------------------------------------------------------------------------
|
# ------------------------------------------------------------------------------
|
||||||
variables() {
|
variables() {
|
||||||
NSAPP=$(echo "${APP,,}" | tr -d ' ') # This function sets the NSAPP variable by converting the value of the APP variable to lowercase and removing any spaces.
|
NSAPP=$(echo "${APP,,}" | tr -d ' ') # This function sets the NSAPP variable by converting the value of the APP variable to lowercase and removing any spaces.
|
||||||
var_install="${NSAPP}-install" # sets the var_install variable by appending "-install" to the value of NSAPP.
|
var_install="${NSAPP}-install" # sets the var_install variable by appending "-install" to the value of NSAPP.
|
||||||
INTEGER='^[0-9]+([.][0-9]+)?$' # it defines the INTEGER regular expression pattern.
|
INTEGER='^[0-9]+([.][0-9]+)?$' # it defines the INTEGER regular expression pattern.
|
||||||
PVEHOST_NAME=$(hostname) # gets the Proxmox Hostname and sets it to Uppercase
|
PVEHOST_NAME=$(hostname) # gets the Proxmox Hostname and sets it to Uppercase
|
||||||
DIAGNOSTICS="yes" # sets the DIAGNOSTICS variable to "yes", used for the API call.
|
DIAGNOSTICS="yes" # sets the DIAGNOSTICS variable to "yes", used for the API call.
|
||||||
METHOD="default" # sets the METHOD variable to "default", used for the API call.
|
METHOD="default" # sets the METHOD variable to "default", used for the API call.
|
||||||
RANDOM_UUID="$(cat /proc/sys/kernel/random/uuid)" # generates a random UUID and sets it to the RANDOM_UUID variable.
|
RANDOM_UUID="$(cat /proc/sys/kernel/random/uuid)" # generates a random UUID and sets it to the RANDOM_UUID variable.
|
||||||
SESSION_ID="${RANDOM_UUID:0:8}" # Short session ID (first 8 chars of UUID) for log files
|
SESSION_ID="${RANDOM_UUID:0:8}" # Short session ID (first 8 chars of UUID) for log files
|
||||||
BUILD_LOG="/tmp/create-lxc-${SESSION_ID}.log" # Host-side container creation log
|
BUILD_LOG="/tmp/create-lxc-${SESSION_ID}.log" # Host-side container creation log
|
||||||
combined_log="/tmp/install-${SESSION_ID}-combined.log" # Combined log (build + install) for failed installations
|
|
||||||
CTTYPE="${CTTYPE:-${CT_TYPE:-1}}"
|
CTTYPE="${CTTYPE:-${CT_TYPE:-1}}"
|
||||||
|
|
||||||
# Parse dev_mode early
|
# Parse dev_mode early
|
||||||
@@ -218,7 +217,7 @@ update_motd_ip() {
|
|||||||
local current_os="$(grep ^NAME /etc/os-release | cut -d= -f2 | tr -d '"') - Version: $(grep ^VERSION_ID /etc/os-release | cut -d= -f2 | tr -d '"')"
|
local current_os="$(grep ^NAME /etc/os-release | cut -d= -f2 | tr -d '"') - Version: $(grep ^VERSION_ID /etc/os-release | cut -d= -f2 | tr -d '"')"
|
||||||
local current_hostname="$(hostname)"
|
local current_hostname="$(hostname)"
|
||||||
local current_ip="$(hostname -I | awk '{print $1}')"
|
local current_ip="$(hostname -I | awk '{print $1}')"
|
||||||
|
|
||||||
# Update only if values actually changed
|
# Update only if values actually changed
|
||||||
if ! grep -q "OS:.*$current_os" "$PROFILE_FILE" 2>/dev/null; then
|
if ! grep -q "OS:.*$current_os" "$PROFILE_FILE" 2>/dev/null; then
|
||||||
sed -i "s|OS:.*|OS: \${GN}$current_os\${CL}\\\"|" "$PROFILE_FILE"
|
sed -i "s|OS:.*|OS: \${GN}$current_os\${CL}\\\"|" "$PROFILE_FILE"
|
||||||
@@ -277,9 +276,8 @@ install_ssh_keys_into_ct() {
 # ------------------------------------------------------------------------------
 # validate_container_id()
 #
-# - Validates if a container ID is available for use (CLUSTER-WIDE)
-# - Checks cluster resources via pvesh for VMs/CTs on ALL nodes
-# - Falls back to local config file check if pvesh unavailable
+# - Validates if a container ID is available for use
+# - Checks if ID is already used by VM or LXC container
 # - Checks if ID is used in LVM logical volumes
 # - Returns 0 if ID is available, 1 if already in use
 # ------------------------------------------------------------------------------
@@ -291,35 +289,11 @@ validate_container_id() {
 return 1
 fi

-# CLUSTER-WIDE CHECK: Query all VMs/CTs across all nodes
-# This catches IDs used on other nodes in the cluster
-# NOTE: Works on single-node too - Proxmox always has internal cluster structure
-# Falls back gracefully if pvesh unavailable or returns empty
-if command -v pvesh &>/dev/null; then
-local cluster_ids
-cluster_ids=$(pvesh get /cluster/resources --type vm --output-format json 2>/dev/null |
-grep -oP '"vmid":\s*\K[0-9]+' 2>/dev/null || true)
-if [[ -n "$cluster_ids" ]] && echo "$cluster_ids" | grep -qw "$ctid"; then
-return 1
-fi
-fi
-
-# LOCAL FALLBACK: Check if config file exists for VM or LXC
-# This handles edge cases where pvesh might not return all info
+# Check if config file exists for VM or LXC
 if [[ -f "/etc/pve/qemu-server/${ctid}.conf" ]] || [[ -f "/etc/pve/lxc/${ctid}.conf" ]]; then
 return 1
 fi

-# Check ALL nodes in cluster for config files (handles pmxcfs sync delays)
-# NOTE: On single-node, /etc/pve/nodes/ contains just the one node - still works
-if [[ -d "/etc/pve/nodes" ]]; then
-for node_dir in /etc/pve/nodes/*/; do
-if [[ -f "${node_dir}qemu-server/${ctid}.conf" ]] || [[ -f "${node_dir}lxc/${ctid}.conf" ]]; then
-return 1
-fi
-done
-fi
-
 # Check if ID is used in LVM logical volumes
 if lvs --noheadings -o lv_name 2>/dev/null | grep -qE "(^|[-_])${ctid}($|[-_])"; then
 return 1
@@ -331,30 +305,63 @@ validate_container_id() {
 # ------------------------------------------------------------------------------
 # get_valid_container_id()
 #
-# - Returns a valid, unused container ID (CLUSTER-AWARE)
-# - Uses pvesh /cluster/nextid as starting point (already cluster-aware)
+# - Returns a valid, unused container ID
 # - If provided ID is valid, returns it
-# - Otherwise increments until a free one is found across entire cluster
+# - Otherwise increments from suggested ID until a free one is found
 # - Calls validate_container_id() to check availability
 # ------------------------------------------------------------------------------
 get_valid_container_id() {
-local suggested_id="${1:-$(pvesh get /cluster/nextid 2>/dev/null || echo 100)}"
+local suggested_id="${1:-$(pvesh get /cluster/nextid)}"

-# Ensure we have a valid starting ID
-if ! [[ "$suggested_id" =~ ^[0-9]+$ ]]; then
-suggested_id=$(pvesh get /cluster/nextid 2>/dev/null || echo 100)
-fi
-local max_attempts=1000
-local attempts=0
+while ! validate_container_id "$suggested_id"; do
+suggested_id=$((suggested_id + 1))
+done
+
+echo "$suggested_id"
+}
+
+# ------------------------------------------------------------------------------
+# validate_container_id()
+#
+# - Validates if a container ID is available for use
+# - Checks if ID is already used by VM or LXC container
+# - Checks if ID is used in LVM logical volumes
+# - Returns 0 if ID is available, 1 if already in use
+# ------------------------------------------------------------------------------
+validate_container_id() {
+local ctid="$1"
+
+# Check if ID is numeric
+if ! [[ "$ctid" =~ ^[0-9]+$ ]]; then
+return 1
+fi
+
+# Check if config file exists for VM or LXC
+if [[ -f "/etc/pve/qemu-server/${ctid}.conf" ]] || [[ -f "/etc/pve/lxc/${ctid}.conf" ]]; then
+return 1
+fi
+
+# Check if ID is used in LVM logical volumes
+if lvs --noheadings -o lv_name 2>/dev/null | grep -qE "(^|[-_])${ctid}($|[-_])"; then
+return 1
+fi
+
+return 0
+}
+
+# ------------------------------------------------------------------------------
+# get_valid_container_id()
+#
+# - Returns a valid, unused container ID
+# - If provided ID is valid, returns it
+# - Otherwise increments from suggested ID until a free one is found
+# - Calls validate_container_id() to check availability
+# ------------------------------------------------------------------------------
+get_valid_container_id() {
+local suggested_id="${1:-$(pvesh get /cluster/nextid)}"
+
 while ! validate_container_id "$suggested_id"; do
 suggested_id=$((suggested_id + 1))
-attempts=$((attempts + 1))
-if [[ $attempts -ge $max_attempts ]]; then
-msg_error "Could not find available container ID after $max_attempts attempts"
-exit 1
-fi
 done

 echo "$suggested_id"
@@ -378,7 +385,7 @@ validate_hostname() {

 # Split by dots and validate each label
 local IFS='.'
-read -ra labels <<<"$hostname"
+read -ra labels <<< "$hostname"
 for label in "${labels[@]}"; do
 # Each label: 1-63 chars, alphanumeric, hyphens allowed (not at start/end)
 if [[ -z "$label" ]] || [[ ${#label} -gt 63 ]]; then
@@ -482,7 +489,7 @@ validate_ipv6_address() {
 # Check that no segment exceeds 4 hex chars
 local IFS=':'
 local -a segments
-read -ra segments <<<"$addr"
+read -ra segments <<< "$addr"
 for seg in "${segments[@]}"; do
 if [[ ${#seg} -gt 4 ]]; then
 return 1
@@ -532,14 +539,14 @@ validate_gateway_in_subnet() {

 # Convert IPs to integers
 local IFS='.'
-read -r i1 i2 i3 i4 <<<"$ip"
-read -r g1 g2 g3 g4 <<<"$gateway"
+read -r i1 i2 i3 i4 <<< "$ip"
+read -r g1 g2 g3 g4 <<< "$gateway"

-local ip_int=$(((i1 << 24) + (i2 << 16) + (i3 << 8) + i4))
-local gw_int=$(((g1 << 24) + (g2 << 16) + (g3 << 8) + g4))
+local ip_int=$(( (i1 << 24) + (i2 << 16) + (i3 << 8) + i4 ))
+local gw_int=$(( (g1 << 24) + (g2 << 16) + (g3 << 8) + g4 ))

 # Check if both are in same network
-if (((ip_int & mask) != (gw_int & mask))); then
+if (( (ip_int & mask) != (gw_int & mask) )); then
 return 1
 fi

@@ -1072,117 +1079,117 @@ load_vars_file() {
 # Validate values before setting (skip empty values - they use defaults)
 if [[ -n "$var_val" ]]; then
 case "$var_key" in
 var_mac)
 if ! validate_mac_address "$var_val"; then
 msg_warn "Invalid MAC address '$var_val' in $file, ignoring"
-continue
-fi
-;;
-var_vlan)
-if ! validate_vlan_tag "$var_val"; then
-msg_warn "Invalid VLAN tag '$var_val' in $file (must be 1-4094), ignoring"
-continue
-fi
-;;
-var_mtu)
-if ! validate_mtu "$var_val"; then
-msg_warn "Invalid MTU '$var_val' in $file (must be 576-65535), ignoring"
-continue
-fi
-;;
-var_tags)
-if ! validate_tags "$var_val"; then
-msg_warn "Invalid tags '$var_val' in $file (alphanumeric, -, _, ; only), ignoring"
-continue
-fi
-;;
-var_timezone)
-if ! validate_timezone "$var_val"; then
-msg_warn "Invalid timezone '$var_val' in $file, ignoring"
-continue
-fi
-;;
-var_brg)
-if ! validate_bridge "$var_val"; then
-msg_warn "Bridge '$var_val' not found in $file, ignoring"
-continue
-fi
-;;
-var_gateway)
-if ! validate_gateway_ip "$var_val"; then
-msg_warn "Invalid gateway IP '$var_val' in $file, ignoring"
-continue
-fi
-;;
-var_hostname)
-if ! validate_hostname "$var_val"; then
-msg_warn "Invalid hostname '$var_val' in $file, ignoring"
-continue
-fi
-;;
-var_cpu)
-if ! [[ "$var_val" =~ ^[0-9]+$ ]] || ((var_val < 1 || var_val > 128)); then
-msg_warn "Invalid CPU count '$var_val' in $file (must be 1-128), ignoring"
-continue
-fi
-;;
-var_ram)
-if ! [[ "$var_val" =~ ^[0-9]+$ ]] || ((var_val < 256)); then
-msg_warn "Invalid RAM '$var_val' in $file (must be >= 256 MiB), ignoring"
-continue
-fi
-;;
-var_disk)
-if ! [[ "$var_val" =~ ^[0-9]+$ ]] || ((var_val < 1)); then
-msg_warn "Invalid disk size '$var_val' in $file (must be >= 1 GB), ignoring"
-continue
-fi
-;;
-var_unprivileged)
-if [[ "$var_val" != "0" && "$var_val" != "1" ]]; then
-msg_warn "Invalid unprivileged value '$var_val' in $file (must be 0 or 1), ignoring"
-continue
-fi
-;;
-var_nesting)
-if [[ "$var_val" != "0" && "$var_val" != "1" ]]; then
-msg_warn "Invalid nesting value '$var_val' in $file (must be 0 or 1), ignoring"
-continue
-fi
-# Warn about potential issues with systemd-based OS when nesting is disabled via vars file
-if [[ "$var_val" == "0" && "${var_os:-debian}" != "alpine" ]]; then
-msg_warn "Nesting disabled in $file - modern systemd-based distributions may require nesting for proper operation"
-fi
-;;
-var_keyctl)
-if [[ "$var_val" != "0" && "$var_val" != "1" ]]; then
-msg_warn "Invalid keyctl value '$var_val' in $file (must be 0 or 1), ignoring"
-continue
-fi
-;;
-var_net)
-# var_net can be: dhcp, static IP/CIDR, or IP range
-if [[ "$var_val" != "dhcp" ]]; then
-if is_ip_range "$var_val"; then
-: # IP range is valid, will be resolved at runtime
-elif ! validate_ip_address "$var_val"; then
-msg_warn "Invalid network '$var_val' in $file (must be dhcp or IP/CIDR), ignoring"
 continue
 fi
-fi
-;;
-var_fuse | var_tun | var_gpu | var_ssh | var_verbose | var_protection)
-if [[ "$var_val" != "yes" && "$var_val" != "no" ]]; then
-msg_warn "Invalid boolean '$var_val' for $var_key in $file (must be yes/no), ignoring"
-continue
-fi
-;;
-var_ipv6_method)
-if [[ "$var_val" != "auto" && "$var_val" != "dhcp" && "$var_val" != "static" && "$var_val" != "none" ]]; then
-msg_warn "Invalid IPv6 method '$var_val' in $file (must be auto/dhcp/static/none), ignoring"
-continue
-fi
-;;
+;;
+var_vlan)
+if ! validate_vlan_tag "$var_val"; then
+msg_warn "Invalid VLAN tag '$var_val' in $file (must be 1-4094), ignoring"
+continue
+fi
+;;
+var_mtu)
+if ! validate_mtu "$var_val"; then
+msg_warn "Invalid MTU '$var_val' in $file (must be 576-65535), ignoring"
+continue
+fi
+;;
+var_tags)
+if ! validate_tags "$var_val"; then
+msg_warn "Invalid tags '$var_val' in $file (alphanumeric, -, _, ; only), ignoring"
+continue
+fi
+;;
+var_timezone)
+if ! validate_timezone "$var_val"; then
+msg_warn "Invalid timezone '$var_val' in $file, ignoring"
+continue
+fi
+;;
+var_brg)
+if ! validate_bridge "$var_val"; then
+msg_warn "Bridge '$var_val' not found in $file, ignoring"
+continue
+fi
+;;
+var_gateway)
+if ! validate_gateway_ip "$var_val"; then
+msg_warn "Invalid gateway IP '$var_val' in $file, ignoring"
+continue
+fi
+;;
+var_hostname)
+if ! validate_hostname "$var_val"; then
+msg_warn "Invalid hostname '$var_val' in $file, ignoring"
+continue
+fi
+;;
+var_cpu)
+if ! [[ "$var_val" =~ ^[0-9]+$ ]] || ((var_val < 1 || var_val > 128)); then
+msg_warn "Invalid CPU count '$var_val' in $file (must be 1-128), ignoring"
+continue
+fi
+;;
+var_ram)
+if ! [[ "$var_val" =~ ^[0-9]+$ ]] || ((var_val < 256)); then
+msg_warn "Invalid RAM '$var_val' in $file (must be >= 256 MiB), ignoring"
+continue
+fi
+;;
+var_disk)
+if ! [[ "$var_val" =~ ^[0-9]+$ ]] || ((var_val < 1)); then
+msg_warn "Invalid disk size '$var_val' in $file (must be >= 1 GB), ignoring"
+continue
+fi
+;;
+var_unprivileged)
+if [[ "$var_val" != "0" && "$var_val" != "1" ]]; then
+msg_warn "Invalid unprivileged value '$var_val' in $file (must be 0 or 1), ignoring"
+continue
+fi
+;;
+var_nesting)
+if [[ "$var_val" != "0" && "$var_val" != "1" ]]; then
+msg_warn "Invalid nesting value '$var_val' in $file (must be 0 or 1), ignoring"
+continue
+fi
+# Warn about potential issues with systemd-based OS when nesting is disabled via vars file
+if [[ "$var_val" == "0" && "${var_os:-debian}" != "alpine" ]]; then
+msg_warn "Nesting disabled in $file - modern systemd-based distributions may require nesting for proper operation"
+fi
+;;
+var_keyctl)
+if [[ "$var_val" != "0" && "$var_val" != "1" ]]; then
+msg_warn "Invalid keyctl value '$var_val' in $file (must be 0 or 1), ignoring"
+continue
+fi
+;;
+var_net)
+# var_net can be: dhcp, static IP/CIDR, or IP range
+if [[ "$var_val" != "dhcp" ]]; then
+if is_ip_range "$var_val"; then
+: # IP range is valid, will be resolved at runtime
+elif ! validate_ip_address "$var_val"; then
+msg_warn "Invalid network '$var_val' in $file (must be dhcp or IP/CIDR), ignoring"
+continue
+fi
+fi
+;;
+var_fuse|var_tun|var_gpu|var_ssh|var_verbose|var_protection)
+if [[ "$var_val" != "yes" && "$var_val" != "no" ]]; then
+msg_warn "Invalid boolean '$var_val' for $var_key in $file (must be yes/no), ignoring"
+continue
+fi
+;;
+var_ipv6_method)
+if [[ "$var_val" != "auto" && "$var_val" != "dhcp" && "$var_val" != "static" && "$var_val" != "none" ]]; then
+msg_warn "Invalid IPv6 method '$var_val' in $file (must be auto/dhcp/static/none), ignoring"
+continue
+fi
+;;
 esac
 fi

@@ -2757,26 +2764,6 @@ Advanced:
 [[ "$APT_CACHER" == "yes" ]] && echo -e "${INFO}${BOLD}${DGN}APT Cacher: ${BGN}$APT_CACHER_IP${CL}"
 echo -e "${SEARCH}${BOLD}${DGN}Verbose Mode: ${BGN}$VERBOSE${CL}"
 echo -e "${CREATING}${BOLD}${RD}Creating an LXC of ${APP} using the above advanced settings${CL}"

-# Log settings to file
-log_section "CONTAINER SETTINGS (ADVANCED) - ${APP}"
-log_msg "Application: ${APP}"
-log_msg "PVE Version: ${PVEVERSION} (Kernel: ${KERNEL_VERSION})"
-log_msg "Operating System: $var_os ($var_version)"
-log_msg "Container Type: $([ "$CT_TYPE" == "1" ] && echo "Unprivileged" || echo "Privileged")"
-log_msg "Container ID: $CT_ID"
-log_msg "Hostname: $HN"
-log_msg "Disk Size: ${DISK_SIZE} GB"
-log_msg "CPU Cores: $CORE_COUNT"
-log_msg "RAM Size: ${RAM_SIZE} MiB"
-log_msg "Bridge: $BRG"
-log_msg "IPv4: $NET"
-log_msg "IPv6: $IPV6_METHOD"
-log_msg "FUSE Support: ${ENABLE_FUSE:-no}"
-log_msg "Nesting: $([ "${ENABLE_NESTING:-1}" == "1" ] && echo "Enabled" || echo "Disabled")"
-log_msg "GPU Passthrough: ${ENABLE_GPU:-no}"
-log_msg "Verbose Mode: $VERBOSE"
-log_msg "Session ID: ${SESSION_ID}"
 }

 # ==============================================================================
@@ -2884,7 +2871,6 @@ diagnostics_menu() {
 # - Prints summary of default values (ID, OS, type, disk, RAM, CPU, etc.)
 # - Uses icons and formatting for readability
 # - Convert CT_TYPE to description
-# - Also logs settings to log file for debugging
 # ------------------------------------------------------------------------------
 echo_default() {
 CT_TYPE_DESC="Unprivileged"
@@ -2906,20 +2892,6 @@ echo_default() {
 fi
 echo -e "${CREATING}${BOLD}${BL}Creating a ${APP} LXC using the above default settings${CL}"
 echo -e " "

-# Log settings to file
-log_section "CONTAINER SETTINGS - ${APP}"
-log_msg "Application: ${APP}"
-log_msg "PVE Version: ${PVEVERSION} (Kernel: ${KERNEL_VERSION})"
-log_msg "Container ID: ${CT_ID}"
-log_msg "Operating System: $var_os ($var_version)"
-log_msg "Container Type: $CT_TYPE_DESC"
-log_msg "Disk Size: ${DISK_SIZE} GB"
-log_msg "CPU Cores: ${CORE_COUNT}"
-log_msg "RAM Size: ${RAM_SIZE} MiB"
-[[ -n "${var_gpu:-}" && "${var_gpu}" == "yes" ]] && log_msg "GPU Passthrough: Enabled"
-[[ "$VERBOSE" == "yes" ]] && log_msg "Verbose Mode: Enabled"
-log_msg "Session ID: ${SESSION_ID}"
 }

 # ------------------------------------------------------------------------------
@@ -3106,10 +3078,10 @@ settings_menu() {

 case "$choice" in
 1) diagnostics_menu ;;
-2) ${EDITOR:-nano} /usr/local/community-scripts/default.vars ;;
+2) nano /usr/local/community-scripts/default.vars ;;
 3)
 if [ -f "$(get_app_defaults_path)" ]; then
-${EDITOR:-nano} "$(get_app_defaults_path)"
+nano "$(get_app_defaults_path)"
 else
 # Back was selected (no app.vars available)
 return
@@ -3379,21 +3351,19 @@ msg_menu() {
 return 0
 fi

-# Display menu to /dev/tty so it doesn't get captured by command substitution
-{
-echo ""
-msg_custom "📋" "${BL}" "${title}"
-echo ""
-for i in "${!tags[@]}"; do
-local marker=" "
-[[ $i -eq 0 ]] && marker="* "
-printf "${TAB3}${marker}%s) %s\n" "${tags[$i]}" "${descs[$i]}"
-done
-echo ""
-} >/dev/tty
+# Display menu
+echo ""
+msg_custom "📋" "${BL}" "${title}"
+echo ""
+for i in "${!tags[@]}"; do
+local marker=" "
+[[ $i -eq 0 ]] && marker="* "
+printf "${TAB3}${marker}%s) %s\n" "${tags[$i]}" "${descs[$i]}"
+done
+echo ""

 local selection=""
-read -r -t 10 -p "${TAB3}Select [default=${default_tag}, timeout 10s]: " selection </dev/tty >/dev/tty || true
+read -r -t 10 -p "${TAB3}Select [default=${default_tag}, timeout 10s]: " selection || true

 # Validate selection
 if [[ -n "$selection" ]]; then
@@ -3664,9 +3634,6 @@ $PCT_OPTIONS_STRING"
 exit 214
 fi
 msg_ok "Storage space validated"

-# Report installation start to API (early - captures failed installs too)
-post_to_api
 fi

 create_lxc_container || exit $?
@@ -4041,9 +4008,6 @@ EOF'
 # Install SSH keys
 install_ssh_keys_into_ct

-# Start timer for duration tracking
-start_install_timer
-
 # Run application installer
 # Disable error trap - container errors are handled internally via flag file
 set +Eeuo pipefail # Disable ALL error handling temporarily
@@ -4074,58 +4038,25 @@ EOF'
 if [[ $install_exit_code -ne 0 ]]; then
 msg_error "Installation failed in container ${CTID} (exit code: ${install_exit_code})"

-# Copy install log from container BEFORE API call so get_error_text() can read it
+# Copy both logs from container before potential deletion
 local build_log_copied=false
 local install_log_copied=false
-local combined_log="/tmp/${NSAPP:-lxc}-${CTID}-${SESSION_ID}.log"

 if [[ -n "$CTID" && -n "${SESSION_ID:-}" ]]; then
-# Create combined log with header
-{
-echo "================================================================================"
-echo "COMBINED INSTALLATION LOG - ${APP:-LXC}"
-echo "Container ID: ${CTID}"
-echo "Session ID: ${SESSION_ID}"
-echo "Timestamp: $(date '+%Y-%m-%d %H:%M:%S')"
-echo "================================================================================"
-echo ""
-} >"$combined_log"
-
-# Append BUILD_LOG (host-side creation log) if it exists
+# Copy BUILD_LOG (creation log) if it exists
 if [[ -f "${BUILD_LOG}" ]]; then
-{
-echo "================================================================================"
-echo "PHASE 1: CONTAINER CREATION (Host)"
-echo "================================================================================"
-cat "${BUILD_LOG}"
-echo ""
-} >>"$combined_log"
-build_log_copied=true
+cp "${BUILD_LOG}" "/tmp/create-lxc-${CTID}-${SESSION_ID}.log" 2>/dev/null && build_log_copied=true
 fi

-# Copy and append INSTALL_LOG from container
-local temp_install_log="/tmp/.install-temp-${SESSION_ID}.log"
-if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "$temp_install_log" 2>/dev/null; then
-{
-echo "================================================================================"
-echo "PHASE 2: APPLICATION INSTALLATION (Container)"
-echo "================================================================================"
-cat "$temp_install_log"
-echo ""
-} >>"$combined_log"
-rm -f "$temp_install_log"
+# Copy INSTALL_LOG from container
+if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "/tmp/install-lxc-${CTID}-${SESSION_ID}.log" 2>/dev/null; then
 install_log_copied=true
-# Point INSTALL_LOG to combined log so get_error_text() finds it
-INSTALL_LOG="$combined_log"
 fi
-fi

-# Report failure to telemetry API (now with log available on host)
-post_update_to_api "failed" "$install_exit_code"
-
-# Show combined log location
-if [[ -n "$CTID" && -n "${SESSION_ID:-}" ]]; then
-msg_custom "📋" "${YW}" "Installation log: ${combined_log}"
+# Show available logs
+echo ""
+[[ "$build_log_copied" == true ]] && echo -e "${GN}✔${CL} Container creation log: ${BL}/tmp/create-lxc-${CTID}-${SESSION_ID}.log${CL}"
+[[ "$install_log_copied" == true ]] && echo -e "${GN}✔${CL} Installation log: ${BL}/tmp/install-lxc-${CTID}-${SESSION_ID}.log${CL}"
 fi

 # Dev mode: Keep container or open breakpoint shell
@@ -4148,21 +4079,19 @@ EOF'
 exit $install_exit_code
 fi

-# Prompt user for cleanup with 60s timeout
+# Prompt user for cleanup with 60s timeout (plain echo - no msg_info to avoid spinner)
 echo ""
-echo -en "${TAB}❓${TAB}${YW}Remove broken container ${CTID}? (Y/n) [auto-remove in 60s]: ${CL}"
+echo -en "${YW}Remove broken container ${CTID}? (Y/n) [auto-remove in 60s]: ${CL}"

 if read -t 60 -r response; then
 if [[ -z "$response" || "$response" =~ ^[Yy]$ ]]; then
 # Remove container
-echo ""
-msg_info "Removing container ${CTID}"
+echo -e "\n${TAB}${HOLD}${YW}Removing container ${CTID}${CL}"
 pct stop "$CTID" &>/dev/null || true
 pct destroy "$CTID" &>/dev/null || true
-msg_ok "Container ${CTID} removed"
+echo -e "${BFR}${CM}${GN}Container ${CTID} removed${CL}"
 elif [[ "$response" =~ ^[Nn]$ ]]; then
-echo ""
-msg_warn "Container ${CTID} kept for debugging"
+echo -e "\n${TAB}${YW}Container ${CTID} kept for debugging${CL}"

 # Dev mode: Setup MOTD/SSH for debugging access to broken container
 if [[ "${DEV_MODE_MOTD:-false}" == "true" ]]; then
@@ -4178,17 +4107,13 @@ EOF'
 fi
 else
 # Timeout - auto-remove
-echo ""
-msg_info "No response - removing container ${CTID}"
+echo -e "\n${YW}No response - auto-removing container${CL}"
+echo -e "${TAB}${HOLD}${YW}Removing container ${CTID}${CL}"
 pct stop "$CTID" &>/dev/null || true
 pct destroy "$CTID" &>/dev/null || true
-msg_ok "Container ${CTID} removed"
+echo -e "${BFR}${CM}${GN}Container ${CTID} removed${CL}"
 fi

-# Force one final status update attempt after cleanup
-# This ensures status is updated even if the first attempt failed (e.g., HTTP 400)
-post_update_to_api "failed" "$install_exit_code" "force"
-
 exit $install_exit_code
 fi
 }
@@ -5192,74 +5117,18 @@ EOF
 # SECTION 10: ERROR HANDLING & EXIT TRAPS
 # ==============================================================================
 
-# ------------------------------------------------------------------------------
-# ensure_log_on_host()
-#
-# - Ensures INSTALL_LOG points to a readable file on the host
-# - If INSTALL_LOG points to a container path (e.g. /root/.install-*),
-#   tries to pull it from the container and create a combined log
-# - This allows get_error_text() to find actual error output for telemetry
-# ------------------------------------------------------------------------------
-ensure_log_on_host() {
-  # Already readable on host? Nothing to do.
-  [[ -n "${INSTALL_LOG:-}" && -s "${INSTALL_LOG}" ]] && return 0
-
-  # Try pulling from container and creating combined log
-  if [[ -n "${CTID:-}" && -n "${SESSION_ID:-}" ]] && command -v pct &>/dev/null; then
-    local combined_log="/tmp/${NSAPP:-lxc}-${CTID}-${SESSION_ID}.log"
-    if [[ ! -s "$combined_log" ]]; then
-      # Create combined log
-      {
-        echo "================================================================================"
-        echo "COMBINED INSTALLATION LOG - ${APP:-LXC}"
-        echo "Container ID: ${CTID}"
-        echo "Session ID: ${SESSION_ID}"
-        echo "Timestamp: $(date '+%Y-%m-%d %H:%M:%S')"
-        echo "================================================================================"
-        echo ""
-      } >"$combined_log" 2>/dev/null || return 0
-      # Append BUILD_LOG if it exists
-      if [[ -f "${BUILD_LOG:-}" ]]; then
-        {
-          echo "================================================================================"
-          echo "PHASE 1: CONTAINER CREATION (Host)"
-          echo "================================================================================"
-          cat "${BUILD_LOG}"
-          echo ""
-        } >>"$combined_log"
-      fi
-      # Pull INSTALL_LOG from container
-      local temp_log="/tmp/.install-temp-${SESSION_ID}.log"
-      if pct pull "$CTID" "/root/.install-${SESSION_ID}.log" "$temp_log" 2>/dev/null; then
-        {
-          echo "================================================================================"
-          echo "PHASE 2: APPLICATION INSTALLATION (Container)"
-          echo "================================================================================"
-          cat "$temp_log"
-          echo ""
-        } >>"$combined_log"
-        rm -f "$temp_log"
-      fi
-    fi
-    if [[ -s "$combined_log" ]]; then
-      INSTALL_LOG="$combined_log"
-    fi
-  fi
-}
-
 # ------------------------------------------------------------------------------
 # api_exit_script()
 #
 # - Exit trap handler for reporting to API telemetry
-# - Captures exit code and reports to PocketBase using centralized error descriptions
-# - Uses explain_exit_code() from api.func for consistent error messages
-# - Posts failure status with exit code to API (error description resolved automatically)
+# - Captures exit code and reports to API using centralized error descriptions
+# - Uses explain_exit_code() from error_handler.func for consistent error messages
+# - Posts failure status with exit code to API (error description added automatically)
 # - Only executes on non-zero exit codes
 # ------------------------------------------------------------------------------
 api_exit_script() {
   exit_code=$?
   if [ $exit_code -ne 0 ]; then
-    ensure_log_on_host
     post_update_to_api "failed" "$exit_code"
   fi
 }
@@ -5267,6 +5136,6 @@ api_exit_script() {
 if command -v pveversion >/dev/null 2>&1; then
   trap 'api_exit_script' EXIT
 fi
-trap 'ensure_log_on_host; post_update_to_api "failed" "$?"' ERR
-trap 'ensure_log_on_host; post_update_to_api "failed" "130"' SIGINT
-trap 'ensure_log_on_host; post_update_to_api "failed" "143"' SIGTERM
+trap 'post_update_to_api "failed" "$BASH_COMMAND"' ERR
+trap 'post_update_to_api "failed" "INTERRUPTED"' SIGINT
+trap 'post_update_to_api "failed" "TERMINATED"' SIGTERM
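Note: the hunk above switches the ERR-trap payload from the numeric exit status (`$?`) to the text of the failing command (Bash's `BASH_COMMAND`). A minimal, self-contained sketch of the difference between the two styles; `report` is a hypothetical stand-in for `post_update_to_api`, not the project's code:

```shell
# Contrast the two ERR-trap payloads: the numeric "$?" vs. the failing
# command text in BASH_COMMAND. No set -e needed; trap ERR fires on any
# unguarded simple-command failure.
report() { echo "report: $1"; }

# Old style: the handler only sees the numeric exit code.
run_old() ( trap 'report "$?"' ERR; false; true )

# New style: the handler sees which command failed.
run_new() ( trap 'report "$BASH_COMMAND"' ERR; false; true )

run_old   # -> report: 1
run_new   # -> report: false
```

The string form trades machine-readable exit codes for a human-readable hint about where the failure occurred.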
misc/core.func (652)
@@ -115,7 +115,7 @@ icons() {
 BRIDGE="${TAB}🌉${TAB}${CL}"
 NETWORK="${TAB}📡${TAB}${CL}"
 GATEWAY="${TAB}🌐${TAB}${CL}"
-ICON_DISABLEIPV6="${TAB}🚫${TAB}${CL}"
+DISABLEIPV6="${TAB}🚫${TAB}${CL}"
 DEFAULT="${TAB}⚙️${TAB}${CL}"
 MACADDRESS="${TAB}🔗${TAB}${CL}"
 VLANTAG="${TAB}🏷️${TAB}${CL}"
@@ -413,69 +413,6 @@ get_active_logfile() {
 # Legacy compatibility: SILENT_LOGFILE points to active log
 SILENT_LOGFILE="$(get_active_logfile)"
 
-# ------------------------------------------------------------------------------
-# strip_ansi()
-#
-# - Removes ANSI escape sequences from input text
-# - Used to clean colored output for log files
-# - Handles both piped input and arguments
-# ------------------------------------------------------------------------------
-strip_ansi() {
-  if [[ $# -gt 0 ]]; then
-    echo -e "$*" | sed 's/\x1b\[[0-9;]*m//g; s/\x1b\[[0-9;]*[a-zA-Z]//g'
-  else
-    sed 's/\x1b\[[0-9;]*m//g; s/\x1b\[[0-9;]*[a-zA-Z]//g'
-  fi
-}
-
-# ------------------------------------------------------------------------------
-# log_msg()
-#
-# - Writes message to active log file without ANSI codes
-# - Adds timestamp prefix for log correlation
-# - Creates log file if it doesn't exist
-# - Arguments: message text (can include ANSI codes, will be stripped)
-# ------------------------------------------------------------------------------
-log_msg() {
-  local msg="$*"
-  local logfile
-  logfile="$(get_active_logfile)"
-
-  [[ -z "$msg" ]] && return
-  [[ -z "$logfile" ]] && return
-
-  # Ensure log directory exists
-  mkdir -p "$(dirname "$logfile")" 2>/dev/null || true
-
-  # Strip ANSI codes and write with timestamp
-  local clean_msg
-  clean_msg=$(strip_ansi "$msg")
-  echo "[$(date '+%Y-%m-%d %H:%M:%S')] $clean_msg" >>"$logfile"
-}
-
-# ------------------------------------------------------------------------------
-# log_section()
-#
-# - Writes a section header to the log file
-# - Used for separating different phases of installation
-# - Arguments: section name
-# ------------------------------------------------------------------------------
-log_section() {
-  local section="$1"
-  local logfile
-  logfile="$(get_active_logfile)"
-
-  [[ -z "$logfile" ]] && return
-  mkdir -p "$(dirname "$logfile")" 2>/dev/null || true
-
-  {
-    echo ""
-    echo "================================================================================"
-    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $section"
-    echo "================================================================================"
-  } >>"$logfile"
-}
-
 # ------------------------------------------------------------------------------
 # silent()
 #
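The removed `strip_ansi()` above cleans colored output with two sed passes: one for SGR color sequences (`ESC[...m`) and one for the remaining CSI sequences (cursor movement and similar). A standalone sketch of that approach, usable outside the repo (assumes GNU sed for the `\x1b` escape):

```shell
# Strip ANSI escape sequences from arguments or from stdin.
# Pass 1 removes color/SGR codes (ending in m); pass 2 removes any other
# CSI sequence (ESC [ params letter), e.g. cursor movement.
strip_ansi() {
  if [[ $# -gt 0 ]]; then
    echo -e "$*" | sed 's/\x1b\[[0-9;]*m//g; s/\x1b\[[0-9;]*[a-zA-Z]//g'
  else
    sed 's/\x1b\[[0-9;]*m//g; s/\x1b\[[0-9;]*[a-zA-Z]//g'
  fi
}

strip_ansi $'\e[1;32mOK\e[0m done'   # -> OK done
```

Supporting both argument and pipe input lets the same helper serve `strip_ansi "$msg"` calls and `... | strip_ansi` pipelines.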
@@ -522,9 +459,15 @@ silent() {
 msg_custom "→" "${YWB}" "${cmd}"
 
 if [[ -s "$logfile" ]]; then
-  echo -e "\n${TAB}--- Last 10 lines of log ---"
+  local log_lines=$(wc -l <"$logfile")
+  echo "--- Last 10 lines of silent log ---"
   tail -n 10 "$logfile"
-  echo -e "${TAB}-----------------------------------\n"
+  echo "-----------------------------------"
+
+  # Show how to view full log if there are more lines
+  if [[ $log_lines -gt 10 ]]; then
+    msg_custom "📋" "${YW}" "View full log (${log_lines} lines): ${logfile}"
+  fi
 fi
 
 exit "$rc"
@@ -545,7 +488,7 @@ spinner() {
 local i=0
 while true; do
   local index=$((i++ % ${#chars[@]}))
-  printf "\r\033[2K%s %b" "${CS_YWB}${chars[$index]}${CS_CL}" "${CS_YWB}${msg}${CS_CL}"
+  printf "\r\033[2K%s %b" "${CS_YWB}${TAB}${chars[$index]}${TAB}${CS_CL}" "${CS_YWB}${msg}${CS_CL}"
   sleep 0.1
 done
 }
@@ -612,9 +555,6 @@ msg_info() {
 [[ -n "${MSG_INFO_SHOWN["$msg"]+x}" ]] && return
 MSG_INFO_SHOWN["$msg"]=1
 
-  # Log to file
-  log_msg "[INFO] $msg"
-
 stop_spinner
 SPINNER_MSG="$msg"
 
@@ -658,10 +598,7 @@ msg_ok() {
 stop_spinner
 clear_line
 echo -e "$CM ${GN}${msg}${CL}"
-  log_msg "[OK] $msg"
-  local sanitized_msg
-  sanitized_msg=$(printf '%s' "$msg" | sed 's/\x1b\[[0-9;]*m//g; s/[^a-zA-Z0-9_]/_/g')
-  unset 'MSG_INFO_SHOWN['"$sanitized_msg"']' 2>/dev/null || true
+  unset MSG_INFO_SHOWN["$msg"]
 }
 
 # ------------------------------------------------------------------------------
@@ -676,7 +613,6 @@ msg_error() {
 stop_spinner
 local msg="$1"
 echo -e "${BFR:-}${CROSS:-✖️} ${RD}${msg}${CL}" >&2
-  log_msg "[ERROR] $msg"
 }
 
 # ------------------------------------------------------------------------------
@@ -691,7 +627,6 @@ msg_warn() {
 stop_spinner
 local msg="$1"
 echo -e "${BFR:-}${INFO:-ℹ️} ${YWB}${msg}${CL}" >&2
-  log_msg "[WARN] $msg"
 }
 
 # ------------------------------------------------------------------------------
@@ -709,7 +644,6 @@ msg_custom() {
 [[ -z "$msg" ]] && return
 stop_spinner
 echo -e "${BFR:-} ${symbol} ${color}${msg}${CL:-\e[0m}"
-  log_msg "$msg"
 }
 
 # ------------------------------------------------------------------------------
@@ -874,562 +808,6 @@ is_verbose_mode() {
|
|||||||
[[ "$verbose" != "no" || ! -t 2 ]]
|
[[ "$verbose" != "no" || ! -t 2 ]]
|
||||||
}
|
}
|
||||||
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# is_unattended()
|
|
||||||
#
|
|
||||||
# - Detects if script is running in unattended/non-interactive mode
|
|
||||||
# - Checks MODE variable first (primary method)
|
|
||||||
# - Falls back to legacy flags (PHS_SILENT, var_unattended)
|
|
||||||
# - Returns 0 (true) if unattended, 1 (false) otherwise
|
|
||||||
# - Used by prompt functions to auto-apply defaults
|
|
||||||
#
|
|
||||||
# Modes that are unattended:
|
|
||||||
# - default (1) : Use script defaults, no prompts
|
|
||||||
# - mydefaults (3) : Use user's default.vars, no prompts
|
|
||||||
# - appdefaults (4) : Use app-specific defaults, no prompts
|
|
||||||
#
|
|
||||||
# Modes that are interactive:
|
|
||||||
# - advanced (2) : Full wizard with all options
|
|
||||||
#
|
|
||||||
# Note: Even in advanced mode, install scripts run unattended because
|
|
||||||
# all values are already collected during the wizard phase.
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
is_unattended() {
|
|
||||||
# Primary: Check MODE variable (case-insensitive)
|
|
||||||
local mode="${MODE:-${mode:-}}"
|
|
||||||
mode="${mode,,}" # lowercase
|
|
||||||
|
|
||||||
case "$mode" in
|
|
||||||
default | 1)
|
|
||||||
return 0
|
|
||||||
;;
|
|
||||||
mydefaults | userdefaults | 3)
|
|
||||||
return 0
|
|
||||||
;;
|
|
||||||
appdefaults | 4)
|
|
||||||
return 0
|
|
||||||
;;
|
|
||||||
advanced | 2)
|
|
||||||
# Advanced mode is interactive ONLY during wizard
|
|
||||||
# Inside container (install scripts), it should be unattended
|
|
||||||
# Check if we're inside a container (no pveversion command)
|
|
||||||
if ! command -v pveversion &>/dev/null; then
|
|
||||||
# We're inside the container - all values already collected
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
# On host during wizard - interactive
|
|
||||||
return 1
|
|
||||||
;;
|
|
||||||
esac
|
|
||||||
|
|
||||||
# Legacy fallbacks for compatibility
|
|
||||||
[[ "${PHS_SILENT:-0}" == "1" ]] && return 0
|
|
||||||
[[ "${var_unattended:-}" =~ ^(yes|true|1)$ ]] && return 0
|
|
||||||
[[ "${UNATTENDED:-}" =~ ^(yes|true|1)$ ]] && return 0
|
|
||||||
|
|
||||||
# No TTY available = unattended
|
|
||||||
[[ ! -t 0 ]] && return 0
|
|
||||||
|
|
||||||
# Default: interactive
|
|
||||||
return 1
|
|
||||||
}
|
|
||||||
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# show_missing_values_warning()
|
|
||||||
#
|
|
||||||
# - Displays a summary of required values that used fallback defaults
|
|
||||||
# - Should be called at the end of install scripts
|
|
||||||
# - Only shows warning if MISSING_REQUIRED_VALUES array has entries
|
|
||||||
# - Provides clear guidance on what needs manual configuration
|
|
||||||
#
|
|
||||||
# Global:
|
|
||||||
# MISSING_REQUIRED_VALUES - Array of variable names that need configuration
|
|
||||||
#
|
|
||||||
# Example:
|
|
||||||
# # At end of install script:
|
|
||||||
# show_missing_values_warning
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
show_missing_values_warning() {
|
|
||||||
if [[ ${#MISSING_REQUIRED_VALUES[@]} -gt 0 ]]; then
|
|
||||||
echo ""
|
|
||||||
echo -e "${YW}╔════════════════════════════════════════════════════════════╗${CL}"
|
|
||||||
echo -e "${YW}║ ⚠️ MANUAL CONFIGURATION REQUIRED ║${CL}"
|
|
||||||
echo -e "${YW}╠════════════════════════════════════════════════════════════╣${CL}"
|
|
||||||
echo -e "${YW}║ The following values were not provided and need to be ║${CL}"
|
|
||||||
echo -e "${YW}║ configured manually for the service to work properly: ║${CL}"
|
|
||||||
echo -e "${YW}╟────────────────────────────────────────────────────────────╢${CL}"
|
|
||||||
for val in "${MISSING_REQUIRED_VALUES[@]}"; do
|
|
||||||
printf "${YW}║${CL} • %-56s ${YW}║${CL}\n" "$val"
|
|
||||||
done
|
|
||||||
echo -e "${YW}╟────────────────────────────────────────────────────────────╢${CL}"
|
|
||||||
echo -e "${YW}║ Check the service configuration files or environment ║${CL}"
|
|
||||||
echo -e "${YW}║ variables and update the placeholder values. ║${CL}"
|
|
||||||
echo -e "${YW}╚════════════════════════════════════════════════════════════╝${CL}"
|
|
||||||
echo ""
|
|
||||||
return 1
|
|
||||||
fi
|
|
||||||
return 0
|
|
||||||
}
|
|
||||||
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# prompt_confirm()
|
|
||||||
#
|
|
||||||
# - Prompts user for yes/no confirmation with timeout and unattended support
|
|
||||||
# - In unattended mode: immediately returns default value
|
|
||||||
# - In interactive mode: waits for user input with configurable timeout
|
|
||||||
# - After timeout: auto-applies default value
|
|
||||||
#
|
|
||||||
# Arguments:
|
|
||||||
# $1 - Prompt message (required)
|
|
||||||
# $2 - Default value: "y" or "n" (optional, default: "n")
|
|
||||||
# $3 - Timeout in seconds (optional, default: 60)
|
|
||||||
#
|
|
||||||
# Returns:
|
|
||||||
# 0 - User confirmed (yes)
|
|
||||||
# 1 - User declined (no) or timeout with default "n"
|
|
||||||
#
|
|
||||||
# Example:
|
|
||||||
# if prompt_confirm "Proceed with installation?" "y" 30; then
|
|
||||||
# echo "Installing..."
|
|
||||||
# fi
|
|
||||||
#
|
|
||||||
# # Unattended: prompt_confirm will use default without waiting
|
|
||||||
# var_unattended=yes
|
|
||||||
# prompt_confirm "Delete files?" "n" && echo "Deleting" || echo "Skipped"
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
prompt_confirm() {
|
|
||||||
local message="${1:-Confirm?}"
|
|
||||||
local default="${2:-n}"
|
|
||||||
local timeout="${3:-60}"
|
|
||||||
local response
|
|
||||||
|
|
||||||
# Normalize default to lowercase
|
|
||||||
default="${default,,}"
|
|
||||||
[[ "$default" != "y" ]] && default="n"
|
|
||||||
|
|
||||||
# Build prompt hint
|
|
||||||
local hint
|
|
||||||
if [[ "$default" == "y" ]]; then
|
|
||||||
hint="[Y/n]"
|
|
||||||
else
|
|
||||||
hint="[y/N]"
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Unattended mode: apply default immediately
|
|
||||||
if is_unattended; then
|
|
||||||
if [[ "$default" == "y" ]]; then
|
|
||||||
return 0
|
|
||||||
else
|
|
||||||
return 1
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Check if running in a TTY
|
|
||||||
if [[ ! -t 0 ]]; then
|
|
||||||
# Not a TTY, use default
|
|
||||||
if [[ "$default" == "y" ]]; then
|
|
||||||
return 0
|
|
||||||
else
|
|
||||||
return 1
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Interactive prompt with timeout
|
|
||||||
echo -en "${YW}${message} ${hint} (auto-${default} in ${timeout}s): ${CL}"
|
|
||||||
|
|
||||||
if read -t "$timeout" -r response; then
|
|
||||||
# User provided input
|
|
||||||
response="${response,,}" # lowercase
|
|
||||||
case "$response" in
|
|
||||||
y | yes)
|
|
||||||
return 0
|
|
||||||
;;
|
|
||||||
n | no)
|
|
||||||
return 1
|
|
||||||
;;
|
|
||||||
"")
|
|
||||||
# Empty response, use default
|
|
||||||
if [[ "$default" == "y" ]]; then
|
|
||||||
return 0
|
|
||||||
else
|
|
||||||
return 1
|
|
||||||
fi
|
|
||||||
;;
|
|
||||||
*)
|
|
||||||
# Invalid input, use default
|
|
||||||
echo -e "${YW}Invalid response, using default: ${default}${CL}"
|
|
||||||
if [[ "$default" == "y" ]]; then
|
|
||||||
return 0
|
|
||||||
else
|
|
||||||
return 1
|
|
||||||
fi
|
|
||||||
;;
|
|
||||||
esac
|
|
||||||
else
|
|
||||||
# Timeout occurred
|
|
||||||
echo "" # Newline after timeout
|
|
||||||
echo -e "${YW}Timeout - auto-selecting: ${default}${CL}"
|
|
||||||
if [[ "$default" == "y" ]]; then
|
|
||||||
return 0
|
|
||||||
else
|
|
||||||
return 1
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
}
|
|
||||||
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# prompt_input()
|
|
||||||
#
|
|
||||||
# - Prompts user for text input with timeout and unattended support
|
|
||||||
# - In unattended mode: immediately returns default value
|
|
||||||
# - In interactive mode: waits for user input with configurable timeout
|
|
||||||
# - After timeout: auto-applies default value
|
|
||||||
#
|
|
||||||
# Arguments:
|
|
||||||
# $1 - Prompt message (required)
|
|
||||||
# $2 - Default value (optional, default: "")
|
|
||||||
# $3 - Timeout in seconds (optional, default: 60)
|
|
||||||
#
|
|
||||||
# Output:
|
|
||||||
# Prints the user input or default value to stdout
|
|
||||||
#
|
|
||||||
# Example:
|
|
||||||
# username=$(prompt_input "Enter username:" "admin" 30)
|
|
||||||
# echo "Using username: $username"
|
|
||||||
#
|
|
||||||
# # With validation
|
|
||||||
# while true; do
|
|
||||||
# port=$(prompt_input "Enter port:" "8080" 30)
|
|
||||||
# [[ "$port" =~ ^[0-9]+$ ]] && break
|
|
||||||
# echo "Invalid port number"
|
|
||||||
# done
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
prompt_input() {
|
|
||||||
local message="${1:-Enter value:}"
|
|
||||||
local default="${2:-}"
|
|
||||||
local timeout="${3:-60}"
|
|
||||||
local response
|
|
||||||
|
|
||||||
# Build display default hint
|
|
||||||
local hint=""
|
|
||||||
[[ -n "$default" ]] && hint=" (default: ${default})"
|
|
||||||
|
|
||||||
# Unattended mode: return default immediately
|
|
||||||
if is_unattended; then
|
|
||||||
echo "$default"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Check if running in a TTY
|
|
||||||
if [[ ! -t 0 ]]; then
|
|
||||||
# Not a TTY, use default
|
|
||||||
echo "$default"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Interactive prompt with timeout
|
|
||||||
echo -en "${YW}${message}${hint} (auto-default in ${timeout}s): ${CL}" >&2
|
|
||||||
|
|
||||||
if read -t "$timeout" -r response; then
|
|
||||||
# User provided input (or pressed Enter for empty)
|
|
||||||
if [[ -n "$response" ]]; then
|
|
||||||
echo "$response"
|
|
||||||
else
|
|
||||||
echo "$default"
|
|
||||||
fi
|
|
||||||
else
|
|
||||||
# Timeout occurred
|
|
||||||
echo "" >&2 # Newline after timeout
|
|
||||||
echo -e "${YW}Timeout - using default: ${default}${CL}" >&2
|
|
||||||
echo "$default"
|
|
||||||
fi
|
|
||||||
}
|
|
||||||
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# prompt_input_required()
|
|
||||||
#
|
|
||||||
# - Prompts user for REQUIRED text input with fallback support
|
|
||||||
# - In unattended mode: Uses fallback value if no env var set (with warning)
|
|
||||||
# - In interactive mode: loops until user provides non-empty input
|
|
||||||
# - Tracks missing required values for end-of-script summary
|
|
||||||
#
|
|
||||||
# Arguments:
|
|
||||||
# $1 - Prompt message (required)
|
|
||||||
# $2 - Fallback/example value for unattended mode (optional)
|
|
||||||
# $3 - Timeout in seconds (optional, default: 120)
|
|
||||||
# $4 - Environment variable name hint for error messages (optional)
|
|
||||||
#
|
|
||||||
# Output:
|
|
||||||
# Prints the user input or fallback value to stdout
|
|
||||||
#
|
|
||||||
# Returns:
|
|
||||||
# 0 - Success (value provided or fallback used)
|
|
||||||
# 1 - Failed (interactive timeout without input)
|
|
||||||
#
|
|
||||||
# Global:
|
|
||||||
# MISSING_REQUIRED_VALUES - Array tracking fields that used fallbacks
|
|
||||||
#
|
|
||||||
# Example:
|
|
||||||
# # With fallback - script continues even in unattended mode
|
|
||||||
# token=$(prompt_input_required "Enter API Token:" "YOUR_TOKEN_HERE" 60 "var_api_token")
|
|
||||||
#
|
|
||||||
# # Check at end of script if any values need manual configuration
|
|
||||||
# if [[ ${#MISSING_REQUIRED_VALUES[@]} -gt 0 ]]; then
|
|
||||||
# msg_warn "Please configure: ${MISSING_REQUIRED_VALUES[*]}"
|
|
||||||
# fi
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# Global array to track missing required values
|
|
||||||
declare -g -a MISSING_REQUIRED_VALUES=()
|
|
||||||
|
|
||||||
prompt_input_required() {
|
|
||||||
local message="${1:-Enter required value:}"
|
|
||||||
local fallback="${2:-CHANGE_ME}"
|
|
||||||
local timeout="${3:-120}"
|
|
||||||
local env_var_hint="${4:-}"
|
|
||||||
local response=""
|
|
||||||
|
|
||||||
# Check if value is already set via environment variable (if hint provided)
|
|
||||||
if [[ -n "$env_var_hint" ]]; then
|
|
||||||
local env_value="${!env_var_hint:-}"
|
|
||||||
if [[ -n "$env_value" ]]; then
|
|
||||||
echo "$env_value"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Unattended mode: use fallback with warning
|
|
||||||
if is_unattended; then
|
|
||||||
if [[ -n "$env_var_hint" ]]; then
|
|
||||||
echo -e "${YW}⚠ Required value '${env_var_hint}' not set - using fallback: ${fallback}${CL}" >&2
|
|
||||||
MISSING_REQUIRED_VALUES+=("$env_var_hint")
|
|
||||||
else
|
|
||||||
echo -e "${YW}⚠ Required value not provided - using fallback: ${fallback}${CL}" >&2
|
|
||||||
MISSING_REQUIRED_VALUES+=("(unnamed)")
|
|
||||||
fi
|
|
||||||
echo "$fallback"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Check if running in a TTY
|
|
||||||
if [[ ! -t 0 ]]; then
|
|
||||||
echo -e "${YW}⚠ Not interactive - using fallback: ${fallback}${CL}" >&2
|
|
||||||
MISSING_REQUIRED_VALUES+=("${env_var_hint:-unnamed}")
|
|
||||||
echo "$fallback"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Interactive prompt - loop until non-empty input or use fallback on timeout
|
|
||||||
local attempts=0
|
|
||||||
while [[ -z "$response" ]]; do
|
|
||||||
attempts=$((attempts + 1))
|
|
||||||
|
|
||||||
if [[ $attempts -gt 3 ]]; then
|
|
||||||
echo -e "${YW}Too many empty inputs - using fallback: ${fallback}${CL}" >&2
|
|
||||||
MISSING_REQUIRED_VALUES+=("${env_var_hint:-manual_input}")
|
|
||||||
echo "$fallback"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
echo -en "${YW}${message} (required, timeout ${timeout}s): ${CL}" >&2
|
|
||||||
|
|
||||||
if read -t "$timeout" -r response; then
|
|
||||||
if [[ -z "$response" ]]; then
|
|
||||||
echo -e "${YW}This field is required. Please enter a value. (attempt ${attempts}/3)${CL}" >&2
|
|
||||||
fi
|
|
||||||
else
|
|
||||||
# Timeout occurred - use fallback
|
|
||||||
echo "" >&2
|
|
||||||
echo -e "${YW}Timeout - using fallback value: ${fallback}${CL}" >&2
|
|
||||||
MISSING_REQUIRED_VALUES+=("${env_var_hint:-timeout}")
|
|
||||||
echo "$fallback"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
done
|
|
||||||
|
|
||||||
echo "$response"
|
|
||||||
}
|
|
||||||
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# prompt_select()
|
|
||||||
#
|
|
||||||
# - Prompts user to select from a list of options with timeout support
|
|
||||||
# - In unattended mode: immediately returns default selection
|
|
||||||
# - In interactive mode: displays numbered menu and waits for choice
|
|
||||||
# - After timeout: auto-applies default selection
|
|
||||||
#
|
|
||||||
# Arguments:
|
|
||||||
# $1 - Prompt message (required)
|
|
||||||
# $2 - Default option number, 1-based (optional, default: 1)
|
|
||||||
# $3 - Timeout in seconds (optional, default: 60)
|
|
||||||
# $4+ - Options to display (required, at least 2)
|
|
||||||
#
|
|
||||||
# Output:
|
|
||||||
# Prints the selected option value to stdout
|
|
||||||
#
|
|
||||||
# Returns:
|
|
||||||
# 0 - Success
|
|
||||||
# 1 - No options provided or invalid state
|
|
||||||
#
|
|
||||||
# Example:
|
|
||||||
# choice=$(prompt_select "Select database:" 1 30 "PostgreSQL" "MySQL" "SQLite")
|
|
||||||
# echo "Selected: $choice"
|
|
||||||
#
|
|
||||||
# # With array
|
|
||||||
# options=("Option A" "Option B" "Option C")
|
|
||||||
# selected=$(prompt_select "Choose:" 2 60 "${options[@]}")
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
prompt_select() {
|
|
||||||
local message="${1:-Select option:}"
|
|
||||||
local default="${2:-1}"
|
|
||||||
local timeout="${3:-60}"
|
|
||||||
shift 3
|
|
||||||
|
|
||||||
local options=("$@")
|
|
||||||
local num_options=${#options[@]}
|
|
||||||
|
|
||||||
# Validate options
|
|
||||||
if [[ $num_options -eq 0 ]]; then
|
|
||||||
echo "" >&2
|
|
||||||
return 1
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Validate default
|
|
||||||
if [[ ! "$default" =~ ^[0-9]+$ ]] || [[ "$default" -lt 1 ]] || [[ "$default" -gt "$num_options" ]]; then
|
|
||||||
default=1
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Unattended mode: return default immediately
|
|
||||||
if is_unattended; then
|
|
||||||
echo "${options[$((default - 1))]}"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Check if running in a TTY
|
|
||||||
if [[ ! -t 0 ]]; then
|
|
||||||
echo "${options[$((default - 1))]}"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Display menu
|
|
||||||
echo -e "${YW}${message}${CL}" >&2
|
|
||||||
local i
|
|
||||||
for i in "${!options[@]}"; do
|
|
||||||
local num=$((i + 1))
|
|
||||||
if [[ $num -eq $default ]]; then
|
|
||||||
echo -e " ${GN}${num})${CL} ${options[$i]} ${YW}(default)${CL}" >&2
|
|
||||||
else
|
|
||||||
echo -e " ${GN}${num})${CL} ${options[$i]}" >&2
|
|
||||||
fi
|
|
||||||
done
|
|
||||||
|
|
||||||
# Interactive prompt with timeout
|
|
||||||
echo -en "${YW}Select [1-${num_options}] (auto-select ${default} in ${timeout}s): ${CL}" >&2
|
|
||||||
|
|
||||||
local response
|
|
||||||
if read -t "$timeout" -r response; then
|
|
||||||
if [[ -z "$response" ]]; then
|
|
||||||
# Empty response, use default
|
|
||||||
echo "${options[$((default - 1))]}"
|
|
||||||
elif [[ "$response" =~ ^[0-9]+$ ]] && [[ "$response" -ge 1 ]] && [[ "$response" -le "$num_options" ]]; then
|
|
||||||
# Valid selection
|
|
||||||
echo "${options[$((response - 1))]}"
|
|
||||||
else
|
|
||||||
# Invalid input, use default
|
|
||||||
echo -e "${YW}Invalid selection, using default: ${options[$((default - 1))]}${CL}" >&2
|
|
||||||
echo "${options[$((default - 1))]}"
|
|
||||||
fi
|
|
||||||
else
|
|
||||||
# Timeout occurred
|
|
||||||
echo "" >&2 # Newline after timeout
|
|
||||||
echo -e "${YW}Timeout - auto-selecting: ${options[$((default - 1))]}${CL}" >&2
|
|
||||||
echo "${options[$((default - 1))]}"
|
|
||||||
fi
|
|
||||||
}
|
|
||||||
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
# prompt_password()
|
|
||||||
#
|
|
||||||
# - Prompts user for password input with hidden characters
|
|
||||||
# - In unattended mode: returns default or generates random password
|
|
||||||
# - Supports auto-generation of secure passwords
|
|
||||||
# - After timeout: generates random password if allowed
|
|
||||||
#
|
|
||||||
# Arguments:
|
|
||||||
# $1 - Prompt message (required)
|
|
||||||
# $2 - Default value or "generate" for auto-generation (optional)
|
|
||||||
# $3 - Timeout in seconds (optional, default: 60)
|
|
||||||
# $4 - Minimum length for validation (optional, default: 0 = no minimum)
|
|
||||||
#
|
|
||||||
# Output:
|
|
||||||
# Prints the password to stdout
|
|
||||||
#
|
|
||||||
# Example:
|
|
||||||
# password=$(prompt_password "Enter password:" "generate" 30 8)
|
|
||||||
# echo "Password set"
|
|
||||||
#
|
|
||||||
# # Require user input (no default)
|
|
||||||
# db_pass=$(prompt_password "Database password:" "" 60 12)
|
|
||||||
# ------------------------------------------------------------------------------
|
|
||||||
prompt_password() {
  local message="${1:-Enter password:}"
  local default="${2:-}"
  local timeout="${3:-60}"
  local min_length="${4:-0}"
  local response

  # Generate random password if requested
  local generated=""
  if [[ "$default" == "generate" ]]; then
    generated=$(openssl rand -base64 16 2>/dev/null | tr -dc 'a-zA-Z0-9' | head -c 16)
    [[ -z "$generated" ]] && generated=$(head /dev/urandom | tr -dc 'a-zA-Z0-9' | head -c 16)
    default="$generated"
  fi

  # Unattended mode: return default immediately
  if is_unattended; then
    echo "$default"
    return 0
  fi

  # Check if running in a TTY
  if [[ ! -t 0 ]]; then
    echo "$default"
    return 0
  fi

  # Build hint
  local hint=""
  if [[ -n "$generated" ]]; then
    hint=" (Enter for auto-generated)"
  elif [[ -n "$default" ]]; then
    hint=" (Enter for default)"
  fi
  [[ "$min_length" -gt 0 ]] && hint="${hint} [min ${min_length} chars]"

  # Interactive prompt with timeout (silent input)
  echo -en "${YW}${message}${hint} (timeout ${timeout}s): ${CL}" >&2

  if read -t "$timeout" -rs response; then
    echo "" >&2 # Newline after hidden input
    if [[ -n "$response" ]]; then
      # Validate minimum length
      if [[ "$min_length" -gt 0 ]] && [[ ${#response} -lt "$min_length" ]]; then
        echo -e "${YW}Password too short (min ${min_length}), using default${CL}" >&2
        echo "$default"
      else
        echo "$response"
      fi
    else
      echo "$default"
    fi
  else
    # Timeout occurred
    echo "" >&2 # Newline after timeout
    echo -e "${YW}Timeout - using generated password${CL}" >&2
    echo "$default"
  fi
}
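The openssl-to-/dev/urandom fallback used for auto-generation can be exercised on its own. The following is an illustrative standalone sketch (the helper name `gen_password` is not part of this library); it uses `-base64 32` rather than `-base64 16` so that stripping the non-alphanumeric base64 characters always leaves well over the 16 characters kept:

```shell
#!/usr/bin/env bash
# Illustrative helper, not part of this library: reproduces the
# openssl -> /dev/urandom fallback chain used by prompt_password.
gen_password() {
  local pw
  # 32 random bytes -> 44 base64 chars; stripping '+', '/', '=' still
  # leaves comfortably more than the 16 characters we keep.
  pw=$(openssl rand -base64 32 2>/dev/null | tr -dc 'a-zA-Z0-9' | head -c 16)
  # Fallback when openssl is unavailable.
  [[ -z "$pw" ]] && pw=$(head -c 256 /dev/urandom | tr -dc 'a-zA-Z0-9' | head -c 16)
  echo "$pw"
}

pw=$(gen_password)
echo "length=${#pw}" # prints length=16
```

Because `tr -dc 'a-zA-Z0-9'` also deletes the trailing newline, no extra trimming is needed before `head -c 16`.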
# ==============================================================================
# SECTION 6: CLEANUP & MAINTENANCE
# ==============================================================================
@@ -1518,13 +896,15 @@ check_or_create_swap() {
   msg_error "No active swap detected"

-  read -p "Do you want to create a swap file? [y/N]: " create_swap
-  create_swap="${create_swap,,}" # to lowercase
-  if [[ "$create_swap" != "y" && "$create_swap" != "yes" ]]; then
+  if ! prompt_confirm "Do you want to create a swap file?" "n" 60; then
     msg_info "Skipping swap file creation"
     return 1
   fi

   local swap_size_mb
-  read -p "Enter swap size in MB (e.g., 2048 for 2GB): " swap_size_mb
+  swap_size_mb=$(prompt_input "Enter swap size in MB (e.g., 2048 for 2GB):" "2048" 60)
   if ! [[ "$swap_size_mb" =~ ^[0-9]+$ ]]; then
     msg_error "Invalid size input. Aborting."
     return 1
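The hunk above replaces raw `read -p` calls with the library's `prompt_confirm` and `prompt_input` helpers, which add a default answer and a timeout. Their implementations are not shown in this excerpt; the following is a minimal sketch of what a `prompt_confirm` consistent with the call site might look like, under the assumption that it takes a message, a default answer, and a timeout in seconds:

```shell
#!/usr/bin/env bash
# Hypothetical sketch only - the real prompt_confirm lives elsewhere in
# this library. Args: $1 message, $2 default ("y"/"n"), $3 timeout (s).
prompt_confirm() {
  local message="${1:-Continue?}"
  local default="${2:-n}"
  local timeout="${3:-60}"
  local response

  if [[ ! -t 0 ]]; then
    # Non-interactive: fall back to the default answer.
    response="$default"
  else
    # On timeout, read fails and we also fall back to the default.
    read -t "$timeout" -rp "${message} [y/N]: " response || response="$default"
    response="${response:-$default}"
  fi

  case "${response,,}" in
    y | yes) return 0 ;;
    *) return 1 ;;
  esac
}

# Usage mirroring the diff (stdin closed to force the default "n"):
if ! prompt_confirm "Do you want to create a swap file?" "n" 1 </dev/null; then
  echo "Skipping swap file creation" # prints Skipping swap file creation
fi
```

The benefit over bare `read -p` is that an unattended run (no TTY, or a timeout) degrades to a known default instead of hanging, which is exactly what the refactored `check_or_create_swap` relies on.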
Some files were not shown because too many files have changed in this diff.