Compare commits

...

24 Commits

Author SHA1 Message Date
Tobias
e1bc1e4fdb add: photon note 2026-03-18 11:21:17 +01:00
Tobias
2fc9bd3e30 Restart nginx after starting photon service 2026-03-18 10:09:35 +01:00
Tobias
2fb5503959 reitti: fix: v4 2026-03-18 10:02:24 +01:00
Tobias
062c3c07d3 reitti: fix: v4 2026-03-18 10:01:10 +01:00
community-scripts-pr-app[bot]
ee7829496f Update .app files (#13037)
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2026-03-18 08:44:39 +01:00
community-scripts-pr-app[bot]
b67cbc1585 Update CHANGELOG.md (#13036)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-18 07:44:05 +00:00
push-app-to-main[bot]
bf17edcc89 split-pro (#12975)
* Add split-pro (ct)

* Update ct/split-pro.sh

Co-authored-by: Slaviša Arežina <58952836+tremor021@users.noreply.github.com>

* Enhance setup completion feedback in split-pro.sh

Add success message and access URL after setup completion.

* Refactor dependency installation command

* Apply suggestion from @tremor021

---------

Co-authored-by: push-app-to-main[bot] <203845782+push-app-to-main[bot]@users.noreply.github.com>
Co-authored-by: CanbiZ (MickLesk) <47820557+MickLesk@users.noreply.github.com>
Co-authored-by: Slaviša Arežina <58952836+tremor021@users.noreply.github.com>
Co-authored-by: Tobias <96661824+CrazyWolf13@users.noreply.github.com>
2026-03-18 08:43:40 +01:00
community-scripts-pr-app[bot]
8aa4e5b8cb Update CHANGELOG.md (#13035)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-18 07:33:23 +00:00
community-scripts-pr-app[bot]
7c770e2ee7 Update CHANGELOG.md (#13034)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-18 07:33:03 +00:00
CanbiZ (MickLesk)
6d15a22d94 Paperless-NGX: increase default RAM to 3GB (#13018) 2026-03-18 08:32:58 +01:00
CanbiZ (MickLesk)
3db4ac1050 Plex: restart service after update to apply new version (#13017) 2026-03-18 08:32:34 +01:00
community-scripts-pr-app[bot]
16dfad5c77 Update CHANGELOG.md (#13033)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-18 07:32:08 +00:00
CanbiZ (MickLesk)
5197d759d7 pve-scripts-local: Increase default disk size from 4GB to 10GB (#13009) 2026-03-18 08:31:41 +01:00
CanbiZ (MickLesk)
c073286d53 sparkyfitness switch to pnpm 2026-03-18 08:18:04 +01:00
community-scripts-pr-app[bot]
cbaba2f133 Update CHANGELOG.md (#13032)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-18 07:01:01 +00:00
CanbiZ (MickLesk)
48afb6c017 tools.func: Implement check_for_gh_tag function (#12998)
* tools.func Implement check_for_gh_tag function

Adds a function to check for new GitHub tags for repositories without releases. (needed for termix / guacd-server)

* Update documentation for check_for_gh_tag function
2026-03-18 08:00:33 +01:00
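The check described in this commit boils down to comparing a version cached in `~/.<app>` against the latest upstream tag. A minimal sketch of that gate (the function name `needs_update` is illustrative, not from the repo):

```shell
# Hypothetical helper: succeed when an update is needed, i.e. when no
# version is cached yet or the cached version differs from the latest tag.
needs_update() {
  local version_file="$1" latest="$2"
  local current=""
  # Read the cached version if the script has run before
  [[ -f "$version_file" ]] && current="$(<"$version_file")"
  # Update needed when nothing is cached or the tag changed
  [[ "$current" != "$latest" ]]
}
```

After a successful deploy, the new tag is written back to the version file so the next run becomes a no-op.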
community-scripts-pr-app[bot]
37f2e0242d Update CHANGELOG.md (#13031)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-18 06:44:35 +00:00
CanbiZ (MickLesk)
7c62147a00 tools.func: Implement fetch_and_deploy_gh_tag function (#13000)
* tools.func: Implement fetch_and_deploy_gh_tag function

Adds function to fetch and deploy GitHub tag-based source tarballs.

* Refactor fetch_and_deploy_gh_tag function and comments

Updated the function to fetch and deploy GitHub tags, enhancing its description and usage instructions.

* cleanup
2026-03-18 07:44:13 +01:00
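GitHub tag tarballs (`.../archive/refs/tags/<tag>.tar.gz`) unpack into a single `<repo>-<tag>` top-level directory, so the deploy step has to hoist that directory's contents into the target. A self-contained sketch of that step (`deploy_tarball` is an illustrative name, not a repo function):

```shell
# Extract a tag tarball and copy the contents of its single top-level
# directory into the target directory, hidden files included.
deploy_tarball() {
  local tarball="$1" target="$2"
  local tmpdir unpack_dir
  tmpdir=$(mktemp -d) || return 1
  tar --no-same-owner -xzf "$tarball" -C "$tmpdir" || { rm -rf "$tmpdir"; return 1; }
  # The only top-level entry is the "<repo>-<tag>" directory
  unpack_dir=$(find "$tmpdir" -mindepth 1 -maxdepth 1 -type d | head -n1)
  mkdir -p "$target"
  shopt -s dotglob nullglob # also copy dotfiles such as .env.example
  cp -r "$unpack_dir"/* "$target/"
  shopt -u dotglob nullglob
  rm -rf "$tmpdir"
}
```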
community-scripts-pr-app[bot]
7d11f9acd6 Update CHANGELOG.md (#13029)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-17 23:00:07 +00:00
community-scripts-pr-app[bot]
55a877a3e2 Update CHANGELOG.md (#13028)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-17 22:59:55 +00:00
CanbiZ (MickLesk)
27c9d7fd07 fix(gluetun): add OpenVPN process user and cleanup stale config (#13016)
- Add OPENVPN_PROCESS_USER=root, PUID=0, PGID=0 to default .env
  to prevent gluetun from injecting 'user' directive into target.ovpn
  which causes 'Unrecognized option' errors on Debian
- Add ExecStartPre to remove stale /etc/openvpn/target.ovpn before start
- Fixes #12988
2026-03-17 23:59:40 +01:00
CanbiZ (MickLesk)
c907e10334 Frigate: check OpenVino model files exist before configuring detector and use curl_with_retry instead of default wget (#13019)
* fix(frigate): check OpenVino model files exist before configuring detector

When the OpenVino model build fails (e.g. TensorFlow import error),
the model files are not created but the config still references them
if the CPU supports avx/sse4_2, causing Frigate to crash on start
with FileNotFoundError.

Now also checks that ssdlite_mobilenet_v2.xml and coco_91cl_bkgr.txt
actually exist before configuring the OpenVino detector, falling back
to CPU model otherwise.

Fixes #12808

* fix(frigate): use curl_with_retry for all downloads

Replace all wget and bare curl calls with curl_with_retry from
tools.func for robust downloads with retry logic, exponential
backoff, and DNS pre-checks. This prevents install failures from
transient network issues during model and dependency downloads.
2026-03-17 23:59:28 +01:00
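The retry-with-exponential-backoff pattern the commit message describes can be sketched generically; the real `curl_with_retry` lives in `misc/tools.func`, and this wrapper (name and interface are illustrative) only shows the shape:

```shell
# Retry a command with exponential backoff before giving up.
retry_with_backoff() {
  # usage: retry_with_backoff <max_attempts> <command...>
  local max_attempts="$1"; shift
  local delay=1 attempt=1
  while ((attempt <= max_attempts)); do
    "$@" && return 0     # success: stop retrying
    sleep "$delay"
    delay=$((delay * 2)) # backoff: 1s, 2s, 4s, ...
    attempt=$((attempt + 1))
  done
  return 1               # all attempts failed
}
```

A download would then look like `retry_with_backoff 4 curl -fsSL "$url" -o "$dest"`.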
community-scripts-pr-app[bot]
804c462dd3 Update CHANGELOG.md (#13020)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-03-17 20:20:06 +00:00
Slaviša Arežina
9df9a2831e Update (#13008) 2026-03-17 21:19:36 +01:00
14 changed files with 347 additions and 91 deletions


@@ -423,8 +423,45 @@ Exercise vigilance regarding copycat or coat-tailing sites that seek to exploit
 </details>
+## 2026-03-18
+### 🆕 New Scripts
+- split-pro ([#12975](https://github.com/community-scripts/ProxmoxVE/pull/12975))
+### 🚀 Updated Scripts
+- #### 🐞 Bug Fixes
+  - Paperless-NGX: increase default RAM to 3GB [@MickLesk](https://github.com/MickLesk) ([#13018](https://github.com/community-scripts/ProxmoxVE/pull/13018))
+  - Plex: restart service after update to apply new version [@MickLesk](https://github.com/MickLesk) ([#13017](https://github.com/community-scripts/ProxmoxVE/pull/13017))
+- #### 🔧 Refactor
+  - pve-scripts-local: Increase default disk size from 4GB to 10GB [@MickLesk](https://github.com/MickLesk) ([#13009](https://github.com/community-scripts/ProxmoxVE/pull/13009))
+### 💾 Core
+- #### ✨ New Features
+  - tools.func: Implement check_for_gh_tag function [@MickLesk](https://github.com/MickLesk) ([#12998](https://github.com/community-scripts/ProxmoxVE/pull/12998))
+  - tools.func: Implement fetch_and_deploy_gh_tag function [@MickLesk](https://github.com/MickLesk) ([#13000](https://github.com/community-scripts/ProxmoxVE/pull/13000))
 ## 2026-03-17
+### 🚀 Updated Scripts
+- #### 🐞 Bug Fixes
+  - Gluetun: add OpenVPN process user and cleanup stale config [@MickLesk](https://github.com/MickLesk) ([#13016](https://github.com/community-scripts/ProxmoxVE/pull/13016))
+  - Frigate: check OpenVino model files exist before configuring detector and use curl_with_retry instead of default wget [@MickLesk](https://github.com/MickLesk) ([#13019](https://github.com/community-scripts/ProxmoxVE/pull/13019))
+### 💾 Core
+- #### 🔧 Refactor
+  - tools.func: Update `create_self_signed_cert()` [@tremor021](https://github.com/tremor021) ([#13008](https://github.com/community-scripts/ProxmoxVE/pull/13008))
 ## 2026-03-16
 ### 🆕 New Scripts

ct/headers/split-pro Normal file

@@ -0,0 +1,6 @@
_____ ___ __ ____
/ ___/____ / (_) /_ / __ \_________
\__ \/ __ \/ / / __/_____/ /_/ / ___/ __ \
___/ / /_/ / / / /_/_____/ ____/ / / /_/ /
/____/ .___/_/_/\__/ /_/ /_/ \____/
/_/


@@ -8,7 +8,7 @@ source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxV
 APP="Paperless-ngx"
 var_tags="${var_tags:-document;management}"
 var_cpu="${var_cpu:-2}"
-var_ram="${var_ram:-2048}"
+var_ram="${var_ram:-3072}"
 var_disk="${var_disk:-12}"
 var_os="${var_os:-debian}"
 var_version="${var_version:-13}"


@@ -79,6 +79,11 @@ function update_script() {
 $STD apt update
 $STD apt install -y plexmediaserver
 msg_ok "Updated Plex Media Server"
+msg_info "Restarting Plex Media Server"
+systemctl restart plexmediaserver
+msg_ok "Restarted Plex Media Server"
 msg_ok "Updated successfully!"
 exit
 }


@@ -9,7 +9,7 @@ APP="PVE-Scripts-Local"
 var_tags="${var_tags:-pve-scripts-local}"
 var_cpu="${var_cpu:-2}"
 var_ram="${var_ram:-4096}"
-var_disk="${var_disk:-4}"
+var_disk="${var_disk:-10}"
 var_os="${var_os:-debian}"
 var_version="${var_version:-13}"
 var_unprivileged="${var_unprivileged:-1}"


@@ -89,17 +89,49 @@ EOF
 msg_ok "Started Service"
 msg_ok "Updated successfully!"
 fi
 if check_for_gh_release "photon" "komoot/photon"; then
+  if [[ -f "$HOME/.photon" ]] && [[ "$(cat "$HOME/.photon")" == 0.7 ]]; then
+    CURRENT_VERSION="$(<"$HOME/.photon")"
+    echo
+    echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+    echo "Photon v1 upgrade detected (breaking change)"
+    echo
+    echo "Your current version: $CURRENT_VERSION"
+    echo
+    echo "Photon v1 requires a manual migration before updating."
+    echo
+    echo "You need to:"
+    echo "  1. Remove existing geocoding data (not actual reitti data):"
+    echo "     rm -rf /opt/photon_data"
+    echo
+    echo "  2. Follow the initial setup guide again:"
+    echo "     https://github.com/community-scripts/ProxmoxVE/discussions/8737"
+    echo
+    echo "  3. Re-download and import Photon data for v1"
+    echo
+    read -rp "Do you want to continue anyway? (y/N): " CONTINUE
+    echo
+    if [[ ! "$CONTINUE" =~ ^[Yy]$ ]]; then
+      msg_info "Migration required. Update cancelled."
+      exit 0
+    fi
+    msg_warn "Continuing without migration may break Photon in the future!"
+  fi
   msg_info "Stopping Service"
   systemctl stop photon
   msg_ok "Stopped Service"
   rm -f /opt/photon/photon.jar
-  USE_ORIGINAL_FILENAME="true" fetch_and_deploy_gh_release "photon" "komoot/photon" "singlefile" "latest" "/opt/photon" "photon-0*.jar"
+  USE_ORIGINAL_FILENAME="true" fetch_and_deploy_gh_release "photon" "komoot/photon" "singlefile" "latest" "/opt/photon" "photon-*.jar"
   mv /opt/photon/photon-*.jar /opt/photon/photon.jar
   msg_info "Starting Service"
   systemctl start photon
+  systemctl restart nginx
   msg_ok "Started Service"
   msg_ok "Updated successfully!"
 fi
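The breaking-change gate in this hunk matches the cached Photon version string exactly (`== 0.7`). A more general guard can compare version strings numerically via `sort -V`; the helper name below is illustrative, not part of the script:

```shell
# Succeed when $1 is an older version than $2, using version-sort ordering.
version_lt() {
  [ "$1" != "$2" ] && [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}
```

Usage: `version_lt "$CURRENT_VERSION" "1.0.0" && echo "breaking upgrade ahead"`.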


@@ -51,7 +51,7 @@ function update_script() {
 msg_info "Updating Sparky Fitness Backend"
 cd /opt/sparkyfitness/SparkyFitnessServer
-$STD npm install --legacy-peer-deps
+$STD pnpm install
 msg_ok "Updated Sparky Fitness Backend"
 msg_info "Updating Sparky Fitness Frontend (Patience)"

ct/split-pro.sh Normal file

@@ -0,0 +1,68 @@
#!/usr/bin/env bash
source <(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/misc/build.func)
# Copyright (c) 2021-2026 community-scripts ORG
# Author: johanngrobe
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/oss-apps/split-pro
APP="Split-Pro"
var_tags="${var_tags:-finance;expense-sharing}"
var_cpu="${var_cpu:-2}"
var_ram="${var_ram:-4096}"
var_disk="${var_disk:-6}"
var_os="${var_os:-debian}"
var_version="${var_version:-13}"
var_unprivileged="${var_unprivileged:-1}"
variables
color
catch_errors
function update_script() {
  header_info
  check_container_storage
  check_container_resources
  if [[ ! -d /opt/split-pro ]]; then
    msg_error "No Split Pro Installation Found!"
    exit
  fi
  if check_for_gh_release "split-pro" "oss-apps/split-pro"; then
    msg_info "Stopping Service"
    systemctl stop split-pro
    msg_ok "Stopped Service"
    msg_info "Backing up Data"
    cp /opt/split-pro/.env /opt/split-pro.env
    msg_ok "Backed up Data"
    CLEAN_INSTALL=1 fetch_and_deploy_gh_release "split-pro" "oss-apps/split-pro" "tarball"
    msg_info "Building Application"
    cd /opt/split-pro
    $STD pnpm install --frozen-lockfile
    $STD pnpm build
    cp /opt/split-pro.env /opt/split-pro/.env
    rm -f /opt/split-pro.env
    ln -sf /opt/split-pro_data/uploads /opt/split-pro/uploads
    $STD pnpm exec prisma migrate deploy
    msg_ok "Built Application"
    msg_info "Starting Service"
    systemctl start split-pro
    msg_ok "Started Service"
    msg_ok "Updated successfully!"
  fi
  exit
}
start
build_container
description
msg_ok "Completed successfully!\n"
echo -e "${CREATING}${GN}${APP} setup has been successfully initialized!${CL}"
echo -e "${INFO}${YW} Access it using the following URL:${CL}"
echo -e "${TAB}${GATEWAY}${BGN}http://${IP}:3000${CL}"


@@ -146,7 +146,7 @@ ldconfig
 msg_ok "Built libUSB"
 msg_info "Bootstrapping pip"
-wget -q https://bootstrap.pypa.io/get-pip.py -O /tmp/get-pip.py
+curl_with_retry "https://bootstrap.pypa.io/get-pip.py" "/tmp/get-pip.py"
 sed -i 's/args.append("setuptools")/args.append("setuptools==77.0.3")/' /tmp/get-pip.py
 $STD python3 /tmp/get-pip.py "pip"
 rm -f /tmp/get-pip.py
@@ -169,13 +169,13 @@ NODE_VERSION="20" setup_nodejs
 msg_info "Downloading Inference Models"
 mkdir -p /models /openvino-model
-wget -q -O /edgetpu_model.tflite https://github.com/google-coral/test_data/raw/release-frogfish/ssdlite_mobiledet_coco_qat_postprocess_edgetpu.tflite
+curl_with_retry "https://github.com/google-coral/test_data/raw/release-frogfish/ssdlite_mobiledet_coco_qat_postprocess_edgetpu.tflite" "/edgetpu_model.tflite"
-wget -q -O /models/cpu_model.tflite https://github.com/google-coral/test_data/raw/release-frogfish/ssdlite_mobiledet_coco_qat_postprocess.tflite
+curl_with_retry "https://github.com/google-coral/test_data/raw/release-frogfish/ssdlite_mobiledet_coco_qat_postprocess.tflite" "/models/cpu_model.tflite"
 cp /opt/frigate/labelmap.txt /labelmap.txt
 msg_ok "Downloaded Inference Models"
 msg_info "Downloading Audio Model"
-wget -q -O /tmp/yamnet.tar.gz https://www.kaggle.com/api/v1/models/google/yamnet/tfLite/classification-tflite/1/download
+curl_with_retry "https://www.kaggle.com/api/v1/models/google/yamnet/tfLite/classification-tflite/1/download" "/tmp/yamnet.tar.gz"
 $STD tar xzf /tmp/yamnet.tar.gz -C /
 mv /1.tflite /cpu_audio_model.tflite
 cp /opt/frigate/audio-labelmap.txt /audio-labelmap.txt
@@ -205,7 +205,7 @@ msg_ok "Installed OpenVino"
 msg_info "Building OpenVino Model"
 cd /models
-wget -q http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz
+curl_with_retry "http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz" "ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz"
 $STD tar -zxf ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz --no-same-owner
 if python3 /opt/frigate/docker/main/build_ov_model.py &>/dev/null; then
 mkdir -p /openvino-model
@@ -219,7 +219,7 @@ if python3 /opt/frigate/docker/main/build_ov_model.py &>/dev/null; then
 if [[ -n "$OV_LABELS" ]]; then
 ln -sf "$OV_LABELS" /openvino-model/coco_91cl_bkgr.txt
 else
-wget -q "https://raw.githubusercontent.com/openvinotoolkit/open_model_zoo/master/data/dataset_classes/coco_91cl_bkgr.txt" -O /openvino-model/coco_91cl_bkgr.txt
+curl_with_retry "https://raw.githubusercontent.com/openvinotoolkit/open_model_zoo/master/data/dataset_classes/coco_91cl_bkgr.txt" "/openvino-model/coco_91cl_bkgr.txt"
 fi
 fi
 sed -i 's/truck/car/g' /openvino-model/coco_91cl_bkgr.txt
@@ -246,7 +246,7 @@ msg_info "Configuring Frigate"
 mkdir -p /config /media/frigate
 cp -r /opt/frigate/config/. /config
-curl -fsSL "https://github.com/intel-iot-devkit/sample-videos/raw/master/person-bicycle-car-detection.mp4" -o "/media/frigate/person-bicycle-car-detection.mp4"
+curl_with_retry "https://github.com/intel-iot-devkit/sample-videos/raw/master/person-bicycle-car-detection.mp4" "/media/frigate/person-bicycle-car-detection.mp4"
 echo "tmpfs /tmp/cache tmpfs defaults 0 0" >>/etc/fstab
@@ -289,7 +289,7 @@ detect:
 enabled: false
 EOF
-if grep -q -o -m1 -E 'avx[^ ]*|sse4_2' /proc/cpuinfo; then
+if grep -q -o -m1 -E 'avx[^ ]*|sse4_2' /proc/cpuinfo && [[ -f /openvino-model/ssdlite_mobilenet_v2.xml ]] && [[ -f /openvino-model/coco_91cl_bkgr.txt ]]; then
 cat <<EOF >>/config/config.yml
 ffmpeg:
 hwaccel_args: auto


@@ -46,6 +46,9 @@ VPN_TYPE=openvpn
 OPENVPN_CUSTOM_CONFIG=/opt/gluetun-data/custom.ovpn
 OPENVPN_USER=
 OPENVPN_PASSWORD=
+OPENVPN_PROCESS_USER=root
+PUID=0
+PGID=0
 HTTP_CONTROL_SERVER_ADDRESS=:8000
 HTTPPROXY=off
 SHADOWSOCKS=off
@@ -76,6 +79,7 @@ User=root
 WorkingDirectory=/opt/gluetun-data
 EnvironmentFile=/opt/gluetun-data/.env
 UnsetEnvironment=USER
+ExecStartPre=/bin/sh -c 'rm -f /etc/openvpn/target.ovpn'
 ExecStart=/usr/local/bin/gluetun
 Restart=on-failure
 RestartSec=5


@@ -44,7 +44,7 @@ msg_ok "Configured RabbitMQ"
 USE_ORIGINAL_FILENAME="true" fetch_and_deploy_gh_release "reitti" "dedicatedcode/reitti" "singlefile" "latest" "/opt/reitti" "reitti-app.jar"
 mv /opt/reitti/reitti-*.jar /opt/reitti/reitti.jar
-USE_ORIGINAL_FILENAME="true" fetch_and_deploy_gh_release "photon" "komoot/photon" "singlefile" "latest" "/opt/photon" "photon-0*.jar"
+USE_ORIGINAL_FILENAME="true" fetch_and_deploy_gh_release "photon" "komoot/photon" "singlefile" "latest" "/opt/photon" "photon-*.jar"
 mv /opt/photon/photon-*.jar /opt/photon/photon.jar
 msg_info "Installing Nginx Tile Cache"


@@ -47,7 +47,7 @@ msg_ok "Configured Sparky Fitness"
 msg_info "Building Backend"
 cd /opt/sparkyfitness/SparkyFitnessServer
-$STD npm install --legacy-peer-deps
+$STD pnpm install
 msg_ok "Built Backend"
 msg_info "Building Frontend (Patience)"


@@ -0,0 +1,74 @@
#!/usr/bin/env bash
# Copyright (c) 2021-2026 community-scripts ORG
# Author: johanngrobe
# License: MIT | https://github.com/community-scripts/ProxmoxVE/raw/main/LICENSE
# Source: https://github.com/oss-apps/split-pro
source /dev/stdin <<<"$FUNCTIONS_FILE_PATH"
color
verb_ip6
catch_errors
setting_up_container
network_check
update_os
NODE_VERSION="22" NODE_MODULE="pnpm" setup_nodejs
PG_VERSION="17" PG_MODULES="cron" setup_postgresql
msg_info "Installing Dependencies"
$STD apt install -y openssl
msg_ok "Installed Dependencies"
PG_DB_NAME="splitpro" PG_DB_USER="splitpro" PG_DB_EXTENSIONS="pg_cron" setup_postgresql_db
fetch_and_deploy_gh_release "split-pro" "oss-apps/split-pro" "tarball"
msg_info "Installing Dependencies"
cd /opt/split-pro
$STD pnpm install --frozen-lockfile
msg_ok "Installed Dependencies"
msg_info "Building Split Pro"
cd /opt/split-pro
mkdir -p /opt/split-pro_data/uploads
ln -sf /opt/split-pro_data/uploads /opt/split-pro/uploads
NEXTAUTH_SECRET=$(openssl rand -base64 32)
cp .env.example .env
sed -i "s|^DATABASE_URL=.*|DATABASE_URL=\"postgresql://${PG_DB_USER}:${PG_DB_PASS}@localhost:5432/${PG_DB_NAME}\"|" .env
sed -i "s|^NEXTAUTH_SECRET=.*|NEXTAUTH_SECRET=\"${NEXTAUTH_SECRET}\"|" .env
sed -i "s|^NEXTAUTH_URL=.*|NEXTAUTH_URL=\"http://${LOCAL_IP}:3000\"|" .env
sed -i "s|^NEXTAUTH_URL_INTERNAL=.*|NEXTAUTH_URL_INTERNAL=\"http://localhost:3000\"|" .env
sed -i "/^POSTGRES_CONTAINER_NAME=/d" .env
sed -i "/^POSTGRES_USER=/d" .env
sed -i "/^POSTGRES_PASSWORD=/d" .env
sed -i "/^POSTGRES_DB=/d" .env
sed -i "/^POSTGRES_PORT=/d" .env
$STD pnpm build
$STD pnpm exec prisma migrate deploy
msg_ok "Built Split Pro"
msg_info "Creating Service"
cat <<EOF >/etc/systemd/system/split-pro.service
[Unit]
Description=Split Pro
After=network.target postgresql.service
Requires=postgresql.service
[Service]
Type=simple
User=root
WorkingDirectory=/opt/split-pro
EnvironmentFile=/opt/split-pro/.env
ExecStart=/usr/bin/pnpm start
Restart=on-failure
RestartSec=5
[Install]
WantedBy=multi-user.target
EOF
systemctl enable -q --now split-pro
msg_ok "Created Service"
motd_ssh
customize
cleanup_lxc


@@ -2077,103 +2077,131 @@ verify_gpg_fingerprint() {
 }
 # ------------------------------------------------------------------------------
-# Get latest GitHub tag for a repository.
+# Fetches and deploys a GitHub tag-based source tarball.
 #
 # Description:
-# - Queries the GitHub API for tags (not releases)
-# - Useful for repos that only create tags, not full releases
-# - Supports optional prefix filter and version-only extraction
-# - Returns the latest tag name (printed to stdout)
+# - Downloads the source tarball for a given tag from GitHub
+# - Extracts to the target directory
+# - Writes the version to ~/.<app>
 #
 # Usage:
-#   MONGO_VERSION=$(get_latest_gh_tag "mongodb/mongo-tools")
-#   LATEST=$(get_latest_gh_tag "owner/repo" "v")       # only tags starting with "v"
-#   LATEST=$(get_latest_gh_tag "owner/repo" "" "true") # strip leading "v"
+#   fetch_and_deploy_gh_tag "guacd" "apache/guacamole-server"
+#   fetch_and_deploy_gh_tag "guacd" "apache/guacamole-server" "latest" "/opt/guacamole-server"
 #
 # Arguments:
-#   $1 - GitHub repo (owner/repo)
-#   $2 - Tag prefix filter (optional, e.g. "v" or "100.")
-#   $3 - Strip prefix from result (optional, "true" to strip $2 prefix)
-#
-# Returns:
-#   0 on success (tag printed to stdout), 1 on failure
+#   $1 - App name (used for version file ~/.<app>)
+#   $2 - GitHub repo (owner/repo)
+#   $3 - Tag version (default: "latest" → auto-detect via get_latest_gh_tag)
+#   $4 - Target directory (default: /opt/$app)
 #
 # Notes:
-# - Skips tags containing "rc", "alpha", "beta", "dev", "test"
-# - Sorts by version number (sort -V) to find the latest
-# - Respects GITHUB_TOKEN for rate limiting
+# - Supports CLEAN_INSTALL=1 to wipe target before extracting
+# - For repos that only publish tags, not GitHub Releases
 # ------------------------------------------------------------------------------
-get_latest_gh_tag() {
-  local repo="$1"
-  local prefix="${2:-}"
-  local strip_prefix="${3:-false}"
-  local header_args=()
-  [[ -n "${GITHUB_TOKEN:-}" ]] && header_args=(-H "Authorization: Bearer $GITHUB_TOKEN")
-  local http_code=""
-  http_code=$(curl -sSL --max-time 20 -w "%{http_code}" -o /tmp/gh_tags.json \
-    -H 'Accept: application/vnd.github+json' \
-    -H 'X-GitHub-Api-Version: 2022-11-28' \
-    "${header_args[@]}" \
-    "https://api.github.com/repos/${repo}/tags?per_page=100" 2>/dev/null) || true
-  if [[ "$http_code" == "401" ]]; then
-    msg_error "GitHub API authentication failed (HTTP 401)."
-    if [[ -n "${GITHUB_TOKEN:-}" ]]; then
-      msg_error "Your GITHUB_TOKEN appears to be invalid or expired."
-    else
-      msg_error "The repository may require authentication. Try: export GITHUB_TOKEN=\"ghp_your_token\""
-    fi
-    rm -f /tmp/gh_tags.json
-    return 1
-  fi
-  if [[ "$http_code" == "403" ]]; then
-    msg_error "GitHub API rate limit exceeded (HTTP 403)."
-    msg_error "To increase the limit, export a GitHub token before running the script:"
-    msg_error "  export GITHUB_TOKEN=\"ghp_your_token_here\""
-    rm -f /tmp/gh_tags.json
-    return 1
-  fi
-  if [[ "$http_code" == "000" || -z "$http_code" ]]; then
-    msg_error "GitHub API connection failed (no response)."
-    msg_error "Check your network/DNS: curl -sSL https://api.github.com/rate_limit"
-    rm -f /tmp/gh_tags.json
-    return 1
-  fi
-  if [[ "$http_code" != "200" ]] || [[ ! -s /tmp/gh_tags.json ]]; then
-    msg_error "Unable to fetch tags for ${repo} (HTTP ${http_code})"
-    rm -f /tmp/gh_tags.json
-    return 1
-  fi
-  local tags_json
-  tags_json=$(</tmp/gh_tags.json)
-  rm -f /tmp/gh_tags.json
-  # Extract tag names, filter by prefix, exclude pre-release patterns, sort by version
-  local latest=""
-  latest=$(echo "$tags_json" | grep -oP '"name":\s*"\K[^"]+' |
-    { [[ -n "$prefix" ]] && grep "^${prefix}" || cat; } |
-    grep -viE '(rc|alpha|beta|dev|test|preview|snapshot)' |
-    sort -V | tail -n1)
-  if [[ -z "$latest" ]]; then
-    msg_warn "No matching tags found for ${repo}${prefix:+ (prefix: $prefix)}"
-    return 1
-  fi
-  if [[ "$strip_prefix" == "true" && -n "$prefix" ]]; then
-    latest="${latest#"$prefix"}"
-  fi
-  echo "$latest"
+fetch_and_deploy_gh_tag() {
+  local app="$1"
+  local repo="$2"
+  local version="${3:-latest}"
+  local target="${4:-/opt/$app}"
+  local app_lc=""
+  app_lc="$(echo "${app,,}" | tr -d ' ')"
+  local version_file="$HOME/.${app_lc}"
+  if [[ "$version" == "latest" ]]; then
+    version=$(get_latest_gh_tag "$repo") || {
+      msg_error "Failed to determine latest tag for ${repo}"
+      return 1
+    }
+  fi
+  local current_version=""
+  [[ -f "$version_file" ]] && current_version=$(<"$version_file")
+  if [[ "$current_version" == "$version" ]]; then
+    msg_ok "$app is already up-to-date ($version)"
+    return 0
+  fi
+  local tmpdir
+  tmpdir=$(mktemp -d) || return 1
+  local tarball_url="https://github.com/${repo}/archive/refs/tags/${version}.tar.gz"
+  local filename="${app_lc}-${version}.tar.gz"
+  msg_info "Fetching GitHub tag: ${app} (${version})"
+  download_file "$tarball_url" "$tmpdir/$filename" || {
+    msg_error "Download failed: $tarball_url"
+    rm -rf "$tmpdir"
+    return 1
+  }
+  mkdir -p "$target"
+  if [[ "${CLEAN_INSTALL:-0}" == "1" ]]; then
+    rm -rf "${target:?}/"*
+  fi
+  tar --no-same-owner -xzf "$tmpdir/$filename" -C "$tmpdir" || {
+    msg_error "Failed to extract tarball"
+    rm -rf "$tmpdir"
+    return 1
+  }
+  local unpack_dir
+  unpack_dir=$(find "$tmpdir" -mindepth 1 -maxdepth 1 -type d | head -n1)
+  shopt -s dotglob nullglob
+  cp -r "$unpack_dir"/* "$target/"
+  shopt -u dotglob nullglob
+  rm -rf "$tmpdir"
+  echo "$version" >"$version_file"
+  msg_ok "Deployed ${app} ${version} to ${target}"
   return 0
 }
+# ------------------------------------------------------------------------------
+# Checks for new GitHub tag (for repos without releases).
+#
+# Description:
+# - Uses get_latest_gh_tag to fetch the latest tag
+# - Compares it to a local cached version (~/.<app>)
+# - If newer, sets global CHECK_UPDATE_RELEASE and returns 0
+#
+# Usage:
+#   if check_for_gh_tag "guacd" "apache/guacamole-server"; then
+#     fetch_and_deploy_gh_tag "guacd" "apache/guacamole-server" "/opt/guacamole-server"
+#   fi
+#
+# Notes:
+# - For repos that only publish tags, not GitHub Releases
+# - Same interface as check_for_gh_release
+# ------------------------------------------------------------------------------
+check_for_gh_tag() {
+  local app="$1"
+  local repo="$2"
+  local prefix="${3:-}"
+  local app_lc=""
+  app_lc="$(echo "${app,,}" | tr -d ' ')"
+  local current_file="$HOME/.${app_lc}"
+  msg_info "Checking for update: ${app}"
+  local latest=""
+  latest=$(get_latest_gh_tag "$repo" "$prefix") || return 1
+  local current=""
+  [[ -f "$current_file" ]] && current="$(<"$current_file")"
+  if [[ -z "$current" || "$current" != "$latest" ]]; then
+    CHECK_UPDATE_RELEASE="$latest"
+    msg_ok "Update available: ${app} ${current:-not installed} → ${latest}"
+    return 0
+  fi
+  msg_ok "No update available: ${app} (${latest})"
+  return 1
+}
 # ==============================================================================
 # INSTALL FUNCTIONS
 # ==============================================================================
@@ -2519,6 +2547,8 @@ check_for_codeberg_release()
 # ------------------------------------------------------------------------------
 create_self_signed_cert() {
   local APP_NAME="${1:-${APPLICATION}}"
+  local HOSTNAME="$(hostname -f)"
+  local IP="$(hostname -I | awk '{print $1}')"
   local APP_NAME_LC=$(echo "${APP_NAME,,}" | tr -d ' ')
   local CERT_DIR="/etc/ssl/${APP_NAME_LC}"
   local CERT_KEY="${CERT_DIR}/${APP_NAME_LC}.key"
@@ -2536,8 +2566,8 @@ create_self_signed_cert() {
   mkdir -p "$CERT_DIR"
   $STD openssl req -new -newkey rsa:2048 -days 365 -nodes -x509 \
-    -subj "/CN=${APP_NAME}" \
-    -addext "subjectAltName=DNS:${APP_NAME}" \
+    -subj "/CN=${HOSTNAME}" \
+    -addext "subjectAltName=DNS:${HOSTNAME},DNS:localhost,IP:${IP},IP:127.0.0.1" \
     -keyout "$CERT_KEY" \
     -out "$CERT_CRT" || {
       msg_error "Failed to create self-signed certificate"
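To confirm a certificate produced with the new `-addext` flags carries the expected SAN entries, it can be inspected with `openssl x509 -ext subjectAltName` (OpenSSL 1.1.1+). The paths and the example FQDN below are illustrative:

```shell
# Generate a throwaway cert the same way the updated function does,
# then print its subjectAltName extension.
FQDN="demo.internal"
openssl req -new -newkey rsa:2048 -days 365 -nodes -x509 \
  -subj "/CN=${FQDN}" \
  -addext "subjectAltName=DNS:${FQDN},DNS:localhost,IP:127.0.0.1" \
  -keyout /tmp/demo.key -out /tmp/demo.crt 2>/dev/null
# Inspect the SAN extension of the generated certificate
openssl x509 -in /tmp/demo.crt -noout -ext subjectAltName
```

Browsers and TLS clients validate against the SAN list, not the CN, which is why the switch away from a bare `/CN=${APP_NAME}` matters.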