Netgate Discussion Forum

script to back up pfsense config to github

Off-Topic & Non-Support Discussion
  • michmoor LAYER 8 Rebel Alliance
    last edited by michmoor 24 days ago

    Just wanted to share my working script for backing up pfSense and then committing the config to my GitHub repo. Feel free to modify it to your liking.

    Grabbing the pfSense configuration via bash is documented in the official Netgate documentation; I just added some flair and the GitHub component.

    #!/bin/bash
    
    # === Configuration ===
    BACKUP_DIR="pfsense_backup_cfgs"
    GIT_REPO_DIR="/home/michael/gafw"
    DATE_STR=$(date +%Y_%m_%d_%H_%M)
    FILENAME="gafw1_config-router-${DATE_STR}.xml"
    FULL_BACKUP_PATH="${BACKUP_DIR}/${FILENAME}"
    LOG_FILE="$HOME/pfsense_backup.log"  # Change to /var/log/pfsense_backup.log if running as root
    
    # === Logging Function ===
    mkdir -p "$(dirname "$LOG_FILE")"
    touch "$LOG_FILE"
    log() {
        echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
    }
    
    # === Begin Backup ===
    log "🚀 Starting pfSense backup process"
    
    # Ensure backup directory exists
    mkdir -p "$BACKUP_DIR"
    
    # Step 1: Login and get CSRF token
    log "🔐 Logging into pfSense..."
    curl -s -L -k --cookie-jar cookies.txt \
         https://192.168.50.254:10443/ \
         | grep "name='__csrf_magic'" \
         | sed 's/.*value="\(.*\)".*/\1/' > csrf.txt
    
    curl -s -L -k --cookie cookies.txt --cookie-jar cookies.txt \
         --data-urlencode "login=Login" \
         --data-urlencode "usernamefld=admin" \
         --data-urlencode "passwordfld=aergageageg44543456356463blurgggg" \
         --data-urlencode "__csrf_magic=$(cat csrf.txt)" \
         https://192.168.50.254:10443/ > /dev/null
    
    # Get new CSRF token for backup
    curl -s -L -k --cookie cookies.txt --cookie-jar cookies.txt \
         https://192.168.50.254:10443/diag_backup.php \
         | grep "name='__csrf_magic'" \
         | sed 's/.*value="\(.*\)".*/\1/' > csrf.txt
    
    # Download backup
    log "💾 Downloading configuration backup..."
    curl -s -L -k --cookie cookies.txt --cookie-jar cookies.txt \
         --data-urlencode "download=download" \
         --data-urlencode "donotbackuprrd=yes" \
         --data-urlencode "__csrf_magic=$(head -n 1 csrf.txt)" \
         https://192.168.50.254:10443/diag_backup.php > "$FULL_BACKUP_PATH"
    
    # Check that the backup file was created and is not empty
    if [ ! -s "$FULL_BACKUP_PATH" ]; then
        log "❌ ERROR: Backup file was not created (or is empty) at $FULL_BACKUP_PATH"
        exit 1
    fi
    log "✅ Backup created: $FULL_BACKUP_PATH"
    
    # Step 2: Cleanup old backups
    log "🧹 Cleaning up backups older than 30 days..."
    find "$BACKUP_DIR" -name "*.xml" -type f -mtime +30 -exec rm -v {} \; | tee -a "$LOG_FILE"
    
    # Step 3: GitHub push
    log "📤 Copying backup to Git repo and committing..."
    cp "$FULL_BACKUP_PATH" "$GIT_REPO_DIR"
    
    cd "$GIT_REPO_DIR" || {
        log "❌ ERROR: Could not change to git repo directory: $GIT_REPO_DIR"
        exit 1
    }
    
    git add "$FILENAME"
    if git commit -m "Automated backup on ${DATE_STR}"; then
        log "✅ Commit successful"
    else
        log "ℹ️  Nothing to commit"
    fi
    
    if git push origin main; then
        log "🚀 Backup pushed to GitHub successfully"
    else
        log "❌ ERROR: Failed to push to GitHub"
        exit 1
    fi
    
    log "🏁 Backup process complete"
    

    Things looking good on my end.


    Firewall: NetGate,Palo Alto-VM,Juniper SRX
    Routing: Juniper, Arista, Cisco
    Switching: Juniper, Arista, Cisco
    Wireless: Unifi, Aruba IAP
    JNCIP,CCNP Enterprise

    • LukasInCloud @michmoor
      last edited by 20 days ago

      @michmoor This is an excellent working script: neat, clear, and already equipped with emoji logs (which I love 😍). But it could be cleaned up a bit and made more reliable, secure, and readable. I suggest moving the IP address, port, username, and password into clearly named variables at the top of the script. Better yet, store sensitive credentials like passwords in a separate config file (~/.pfsense_backup.conf) or use environment variables to avoid hardcoding passwords into the script.
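
      For example, here's a minimal sketch of that idea (the PFSENSE_* variable names are placeholders, not from the original script):

      #!/bin/bash
      # Sketch only: load credentials from ~/.pfsense_backup.conf (chmod 600)
      # or from environment variables, instead of hardcoding them in the script.
      CONF_FILE="$HOME/.pfsense_backup.conf"
      if [ -f "$CONF_FILE" ]; then
          . "$CONF_FILE"   # expected to set PFSENSE_HOST, PFSENSE_PORT, PFSENSE_USER, PFSENSE_PASS
      fi

      # Fail early if anything required is still missing
      : "${PFSENSE_HOST:?PFSENSE_HOST is not set}"
      : "${PFSENSE_USER:?PFSENSE_USER is not set}"
      : "${PFSENSE_PASS:?PFSENSE_PASS is not set}"

      BASE_URL="https://${PFSENSE_HOST}:${PFSENSE_PORT:-443}"

      # The curl calls can then use "$BASE_URL/", "usernamefld=$PFSENSE_USER" and
      # "passwordfld=$PFSENSE_PASS" instead of the hardcoded values.

      That also keeps the script itself safe to commit to the same repo as the backups.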

      • elvisimprsntr
        last edited by elvisimprsntr 19 days ago

        I created something similar a long time ago to back up to a USB thumb drive (if installed) and to a TrueNAS server. It uses a public/private key pair to eliminate the need for credentials (the one-time key setup is sketched after the script).

        #!/bin/sh
        VERSION=`cat /etc/version`
        DATE=`date +%Y%m%d`
        FILE="config_`hostname -s`_"$DATE"_"$VERSION".xml"
        NAS="nas-1"
        PATH="/mnt/data/Software/pfsense"
        
        # mkdir /media/usb
        
        /sbin/mount_msdosfs /dev/da0s1 /media/usb
        if [ "$?" -eq "0" ]; then
        	echo "USB found"
        	/bin/cp /cf/conf/config.xml /media/usb/$FILE
        	echo "Backup $FILE created"
        	/usr/bin/find /media/usb/ -name "config_*.xml" -mtime +365 -exec rm {} \;
        	/sbin/umount /media/usb	
        else
        	echo "USB not found"
        fi	
        
        /sbin/ping -c 3 $NAS > /dev/null 2>&1
        if [ $? -eq 0 ]; then
        	echo "$NAS found"
        	/usr/bin/scp /cf/conf/config.xml root@$NAS:$DEST/$FILE
        	/usr/bin/scp /root/pkg_check.php root@$NAS:$DEST/
        	/usr/bin/scp /root/att_cidr.sh root@$NAS:$DEST/
        	/usr/bin/scp /root/backup.sh root@$NAS:$DEST/
        	echo "Backup $FILE copied to $NAS"
        else
        	echo "$NAS not found"
        fi
        # install cron package and add cron job
        # 0 4 * * Sun /bin/sh /root/backup.sh > /dev/null
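
        For the key-pair piece, the one-time setup is roughly this (run as root on the firewall; nas-1 is the host from the script, and the exact steps may differ on TrueNAS):

        # Sketch only: generate a key on pfSense and append the public key to the
        # NAS root account's authorized_keys so scp works without a password.
        ssh-keygen -t ed25519 -N "" -f /root/.ssh/id_ed25519
        cat /root/.ssh/id_ed25519.pub | \
            ssh root@nas-1 "mkdir -p .ssh && cat >> .ssh/authorized_keys"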
        
        • LukasInCloud @elvisimprsntr
          last edited by 19 days ago

          @elvisimprsntr Honestly, that’s a solid, old-school, reliable shell script. But it has a few weak points. It doesn’t stop on errors, so if something fails, the rest of the script will still run without warning. It also uses plain echo statements without logging to a file, which makes it harder to track issues later. The USB device path is hardcoded, which isn’t very flexible if the device name changes. There’s no error handling for the scp commands either; if copying to the NAS fails, you won’t know unless you’re watching the console. It might also be a good idea to move the NAS address and backup path into a config file for easier updates. And while copying /cf/conf/config.xml locally is fine, it bypasses the web interface’s backup page, so you don’t get its options (such as skipping RRD data or encrypting the backup). Overall, it’s a good base but could be improved with safer practices and a bit more flexibility.
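
          For instance, a rough sketch of the stop-on-error, logging, and scp checks (the log path and file name are placeholders, not taken from the script above):

          #!/bin/sh
          # Sketch only: abort on unset variables and failed commands,
          # log to a file, and check the scp exit status explicitly.
          set -eu

          LOG="/root/backup.log"
          log() {
          	echo "`date '+%Y-%m-%d %H:%M:%S'` $1" | tee -a "$LOG"
          }

          NAS="nas-1"
          DEST="/mnt/data/Software/pfsense"
          FILE="config_example.xml"

          if /usr/bin/scp /cf/conf/config.xml root@$NAS:$DEST/$FILE; then
          	log "Backup $FILE copied to $NAS"
          else
          	log "ERROR: scp of $FILE to $NAS failed"
          	exit 1
          fi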
