I'm new to TrueNAS and have gotten great help from this forum with various issues, so I figured I'd share my solution for anyone interested.
Mostly for backup and secure remote access to my files (without exposing my NAS to the outside world), I wanted my document files synced to Proton Drive, but one way only: from my NAS to Proton. Running rclone over the entire dataset where I keep my personal files (about 2 TB) is very time consuming (hours), so I came up with this approach:
- keep a local DB of all my files
- on a schedule (I run it every morning at 6 am), take a snapshot of the local file list
- compare that snapshot to the database records by name and modified time
- compile new/modified and deleted file lists
- upload to Proton Drive only the new/modified files and remove deleted files
- record the snapshot as the current state of the files in the database
- keep a log of the process
This entire process completes in seconds, or in minutes when the new/modified list is large, instead of the hours rclone spends comparing local files to the remote.
So, I have the script below saved as a .sh file in my home folder with the required execute permission, and I schedule it via System -> Advanced -> Cron Jobs.
The prerequisite is that you must run rclone config and set up a new remote pointing to your Proton Drive - it walks you through it step by step.
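For reference, the cron job fields look roughly like this (the script name proton_sync.sh and its path are just placeholders for wherever you saved yours):
Command: bash /mnt/YOUR_DATAPOOL/home/YOUR_USERNAME/proton_sync.sh
Schedule: 0 6 * * * (daily at 6:00 am)
Run As User: a user that can read the source dataset and your rclone config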
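If you haven't done that yet, the gist is below (the remote name proton_drive is whatever you pick during setup; the script assumes that name):
rclone config # answer "n" for a new remote, pick the Proton Drive backend and follow the prompts
rclone lsd proton_drive: # quick sanity check that the remote is reachable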
I hope it helps someone. The script is below:
#!/bin/bash
USER_HOME="/mnt/home/YOUR_USERNAME"
SOURCE_DIR="/mnt/YOUR_DATAPOOL/MyFiles" #NAS location of your files
DB_DIR="/mnt/YOUR_DATAPOOL/home/YOUR_USERNAME/proton_file_tracker"
DB_FILE="$DB_DIR/file_tracker.db"
LOG_FILE="/mnt/YOUR_DATAPOOL/rclone_pd.log" #LOCATION OF THE LOG FILE
# Ensure the directory exists for storing DB and temp files
mkdir -p "$DB_DIR"
# Create the database if it doesn't exist
if [ ! -f "$DB_FILE" ]; then
sqlite3 "$DB_FILE" <<SQL
CREATE TABLE files (path TEXT PRIMARY KEY, mtime INTEGER, size INTEGER);
CREATE TABLE metadata (key TEXT PRIMARY KEY, value INTEGER);
INSERT INTO metadata (key, value) VALUES ('last_run_time', 0);
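-- (last_run_time is informational only; change detection relies on the files_last_run table)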
SQL
fi
# Snapshot the current file list as path|mtime|size ('|' matches sqlite3's default list-mode separator)
find "$SOURCE_DIR" -type f -printf "%p|%T@|%s\n" > "$DB_DIR/current_files.txt"
# Track current timestamp
CURRENT_TIMESTAMP=$(date +%s)
# Prepare the SQL queries as shell variables so $SOURCE_DIR expands; both strip the
# absolute prefix so the output lists hold paths relative to SOURCE_DIR, which is what rclone expects
RELATIVE_SQL_CHANGED="SELECT substr(f.path, length('$SOURCE_DIR') + 1) AS path FROM files f LEFT JOIN files_last_run flr ON f.path = flr.path WHERE flr.path IS NULL OR f.mtime > COALESCE(flr.mtime, 0);"
RELATIVE_SQL_DELETED="SELECT substr(flr.path, length('$SOURCE_DIR') + 1) AS path FROM files_last_run flr WHERE flr.path NOT IN (SELECT path FROM files);"
# Performance tuning and database operations
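# Note: the heredoc below mixes SQL statements with sqlite3 dot-commands (.import/.output),
# which the sqlite3 shell processes in order as it reads the input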
sqlite3 "$DB_FILE" <<SQL > /dev/null 2>&1
PRAGMA synchronous=OFF;
PRAGMA journal_mode=WAL;
BEGIN TRANSACTION;
-- Backup the current files table as files_last_run
DROP TABLE IF EXISTS files_last_run;
ALTER TABLE files RENAME TO files_last_run;
-- Recreate files table and create indexes
CREATE TABLE files (path TEXT PRIMARY KEY, mtime INTEGER, size INTEGER);
CREATE INDEX IF NOT EXISTS idx_files_path ON files(path);
CREATE INDEX IF NOT EXISTS idx_files_last_run_path ON files_last_run(path);
-- Import new file data
.import $DB_DIR/current_files.txt files
-- Identify changes
.output $DB_DIR/changed_files.txt
$RELATIVE_SQL_CHANGED
-- Identify deletions
.output $DB_DIR/deleted_files.txt
$RELATIVE_SQL_DELETED
-- Update last run time after all changes have been identified
INSERT OR REPLACE INTO metadata (key, value) VALUES ('last_run_time', $CURRENT_TIMESTAMP);
COMMIT;
SQL
# Error handling for rclone operations
# Sync changed files
if [ -s "$DB_DIR/changed_files.txt" ]; then
if ! rclone sync "$SOURCE_DIR" "proton_drive:Home-NAS/Docs" --files-from "$DB_DIR/changed_files.txt" --update --local-no-check-updated --protondrive-replace-existing-draft=true --log-file="$LOG_FILE" --log-format "date:'2006-01-02 15:04:05',level,msg" --log-level INFO; then
echo "Error syncing files. Details in $LOG_FILE" >> "$LOG_FILE"
exit 1
fi
fi
# Handle deletions
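# Note: --include-from lines are rclone filter patterns, so file names containing
# characters like * ? [ ] { } may need escaping to be matched literally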
if [ -s "$DB_DIR/deleted_files.txt" ]; then
if ! rclone delete "proton_drive:Home-NAS/Docs" --include-from "$DB_DIR/deleted_files.txt" --protondrive-replace-existing-draft=true --log-file="$LOG_FILE" --log-format "date:'2006-01-02 15:04:05',level,msg" --log-level INFO; then
echo "Error deleting files. Details in $LOG_FILE" >> "$LOG_FILE"
exit 1
fi
fi
# Clean up temporary files
rm -f "$DB_DIR/changed_files.txt" "$DB_DIR/deleted_files.txt" "$DB_DIR/current_files.txt"
echo "############################################################" >> "$LOG_FILE"
echo "Sync completed successfully." >> "$LOG_FILE"
echo "############################################################" >> "$LOG_FILE"