Back Up Configuration
-
Go easy on me - I am still finding my feet with pfSense.
Got a nice shiny new box about a month ago and happily set it up in a fairly basic configuration, with Snort and Squid working fine. I am now looking at the finer points: I would like automatic backups saved to a FreeNAS box I also have. So I have done my due diligence and read around the topic, but I am struggling.
I have created a user with effective privileges for only "WebCfg - Diagnostics: Backup/restore page"
I have the following script, saved on the FreeNAS machine (for testing purposes I have set its permissions to 777):
#!/bin/sh
wget -qO- --keep-session-cookies --save-cookies cookies.txt \
  --no-check-certificate https://192.168.1.1/diag_backup.php \
  | grep "name='__csrf_magic'" | sed 's/.*value="\(.*\)".*/\1/' > csrf.txt
wget -qO- --keep-session-cookies --load-cookies cookies.txt \
  --save-cookies cookies.txt --no-check-certificate \
  --post-data "login=Login&usernamefld=MYUSER&passwordfld=MYPASSWORD&__csrf_magic=$(cat csrf.txt)" \
  https://192.168.1.1/diag_backup.php | grep "name='__csrf_magic'" \
  | sed 's/.*value="\(.*\)".*/\1/' > csrf2.txt
wget --keep-session-cookies --load-cookies cookies.txt --no-check-certificate \
  --post-data "Submit=download&__csrf_magic=$(cat csrf2.txt)" \
  https://192.168.1.1/diag_backup.php -O config-router-`date +%Y%m%d%H%M%S`.xml
In the FreeNAS command line I run "./Backup.sh".
The following files are generated: cookies.txt, csrf.txt, csrf2.txt and a config-router.xml (with the date and time in the name).
Looks good - however, the config .xml file is 0 KB, the csrf.txt and csrf2.txt files are blank (not sure if they are meant to be?) and the cookies file just says:
# HTTP cookie file.
# Generated by Wget on 2016-01-05 12:13:49.
# Edit at your own risk.
And I get the following in the command line on the FreeNAS box
grep: : No such file or directory
--2016-01-05 12:13:50--  http://%20--post-data/
Resolving  --post-data ( --post-data)... failed: hostname nor servname provided, or not known.
wget: unable to resolve host address ‘ --post-data’
--2016-01-05 12:13:50--  http://submit=download&__csrf_magic=/
Resolving submit=download&__csrf_magic= (submit=download&__csrf_magic=)... failed: hostname nor servname provided, or not known.
wget: unable to resolve host address ‘submit=download&__csrf_magic=’
https://192.168.1.1/diag_backup.php: Scheme missing.
I got the code from the docs page; I am running 2.2.6 so took the lines for that version. I changed "Submit=download&donotbackuprrd=yes&__csrf_magic=$(cat csrf2.txt)" to "Submit=download&__csrf_magic=$(cat csrf2.txt)" as I want to back up the RRD data, but it didn't work before I made that change either.
Any pointers on where I've gone wrong?
-
Why not just upload your config.xml from the /conf folder via ftp from the PFS? You can then cron a copy job to your NAS, if you can't enable ftp on the NAS directly.
eg:
#!/bin/sh
cd /conf
cat << END | ftp 'ftp://username:password@ftp.site.com/backupfolder/'
put ./config.xml
END

Lot less coding, too!
-
So - I realised my (first) mistake:
I had included all the "\" line-continuation characters that the documentation page had at the end of each line…
Removing those, the code now looks like:
#!/bin/sh
wget -qO- --keep-session-cookies --save-cookies cookies.txt --no-check-certificate https://192.168.1.1/diag_backup.php | grep "name='__csrf_magic'" | sed 's/.*value="\(.*\)".*/\1/' > csrf.txt
wget -qO- --keep-session-cookies --load-cookies cookies.txt --save-cookies cookies.txt --no-check-certificate --post-data "login=Login&usernamefld=MYUSER&passwordfld=MYPASSWORD&__csrf_magic=$(cat csrf.txt)" https://192.168.1.1/diag_backup.php | grep "name='__csrf_magic'" | sed 's/.*value="\(.*\)".*/\1/' > csrf2.txt
wget --keep-session-cookies --load-cookies cookies.txt --no-check-certificate --post-data "Submit=download&donotbackuprrd=yes&__csrf_magic=$(cat csrf2.txt)" https://192.168.1.1/diag_backup.php -O config-router-`date +%Y%m%d%H%M%S`.xml
Gives this in the FreeNAS terminal…
--2016-01-05 15:03:37--  https://192.168.1.1/diag_backup.php
Connecting to 192.168.1.1:443... connected.
WARNING: cannot verify 192.168.1.1's certificate, issued by ‘/C=US/ST=State/L=Locality/O=pfSense webConfigurator Self-Signed Certificate/emailAddress=admin@pfSense.localdomain/CN=pfSense’: Unable to locally verify the issuer's authority.
WARNING: certificate common name ‘pfSense’ doesn't match requested host name ‘192.168.1.1’.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘config-router-20160105150337.xml’

config-router-20160 [ <=> ] 6.59K --.-KB/s  in 0.004s

2016-01-05 15:03:37 (1.74 MB/s) - ‘config-router-20160105150337.xml’ saved [6751]
It also yields (slightly) more useful files:
The csrf.txt and csrf2.txt files both have information in them (implying they're working now?)
and the cookies file has more information in it as well.
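In case it helps to see what those csrf files should contain: the grep/sed pair pulls the value out of the hidden __csrf_magic form field on the page. A standalone demo of that extraction against a sample input line (the token value here is made up for illustration):

```shell
#!/bin/sh
# Demo of the token extraction used in the backup script, run against a
# sample of the hidden form field (the token value is invented).
printf '%s\n' "<input type='hidden' name='__csrf_magic' value=\"sid:abc123,1452000000\" />" \
    | grep "name='__csrf_magic'" | sed 's/.*value="\(.*\)".*/\1/'
# prints sid:abc123,1452000000
```

If the csrf files are empty, that pipeline found no matching field, which usually means the previous wget didn't get the expected page back.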
However, I still don't think the config file itself is correct - it's now 7 KB, which is good, but it looks nothing like the config I can download manually; instead it contains the following:
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <title>Login</title> …and so on
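For anyone hitting the same symptom: a quick way to tell whether wget saved a real config or just the webGUI login page is to look for the <pfsense> root element. A sketch (the filename is a placeholder):

```shell
#!/bin/sh
# Sanity check (a sketch): a real pfSense config.xml has a <pfsense> root
# element, while a failed login returns the webGUI's HTML login page.
FILE="config-router-test.xml"   # placeholder filename for illustration

if grep -q '<pfsense>' "$FILE"; then
    echo "OK: $FILE looks like a pfSense config"
else
    echo "WARNING: $FILE looks like an HTML page (login probably failed)" >&2
fi
```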
Thank you for your suggestion - I would prefer to get this script working rather than use FTP, and I am also treating it as a learning exercise.
-
Okay - I've managed to get it to work! But only when I use "admin" and the associated password.
So I assumed it was an issue with logging in. I saw that the password for my user had a "&" in it and wondered if that caused issues, so I changed the password. No joy.
So, looking at the permissions difference between admin and the user: obviously admin has "WebCfg - All" and the user only has "WebCfg - Diagnostics: Backup/restore page".
I changed the user to "WebCfg - All" and still couldn't get it to work; I then saw that admin had "User - System - Shell account access", so I added that to the user as well. No joy. So now I am stumped.
-
Okay - so I have now fixed this and achieved what I wanted.
Here is the final code:
#!/bin/sh
wget -qO- --keep-session-cookies --save-cookies cookies.txt --no-check-certificate https://192.168.1.1/diag_backup.php | grep "name='__csrf_magic'" | sed 's/.*value="\(.*\)".*/\1/' > csrf.txt
wget -qO- --keep-session-cookies --load-cookies cookies.txt --save-cookies cookies.txt --no-check-certificate --post-data "login=Login&usernamefld=MYUSER&passwordfld=MYPASSWORD&__csrf_magic=$(cat csrf.txt)" https://192.168.1.1/diag_backup.php | grep "name='__csrf_magic'" | sed 's/.*value="\(.*\)".*/\1/' > csrf2.txt
wget --keep-session-cookies --load-cookies cookies.txt --no-check-certificate --post-data "Submit=download&__csrf_magic=$(cat csrf2.txt)" https://192.168.1.1/diag_backup.php -O config-router-`date +%Y%m%d%H%M%S`.xml
rm cookies.txt
rm csrf.txt
rm csrf2.txt
ls -td *.xml | awk 'NR>30' | xargs rm
I got it to work by removing all special characters from the password. I wasn't sure which one was causing the issues, as my admin password that worked also has a couple, but removing them all worked.
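For what it's worth, the likely root cause is that characters such as "&" and "=" act as field delimiters inside --post-data, so a password containing them has to be URL-encoded rather than removed. A minimal POSIX-sh encoder, offered as a sketch (not part of the docs page script):

```shell
#!/bin/sh
# URL-encode a string so it can be embedded safely in wget's --post-data
# (a sketch; percent-encodes everything except unreserved characters).
urlencode() {
    s="$1"
    out=""
    while [ -n "$s" ]; do
        c="${s%"${s#?}"}"            # first character of $s
        case "$c" in
            [A-Za-z0-9._~-]) out="$out$c" ;;
            *) out="$out$(printf '%%%02X' "'$c")" ;;
        esac
        s="${s#?}"                   # drop first character
    done
    printf '%s' "$out"
}

# e.g. --post-data "...passwordfld=$(urlencode "$PASSWORD")&..."
urlencode 'p@ss&word'    # prints p%40ss%26word
```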
I have put the user back to only have access to "Diag/Backup-Restore" page, and not the other login permission.
I also added the last four lines:
They remove the files created by the script, and the final line deletes the oldest configs once there are more than 30 (I have just the .sh file and the .xml configs in their own directory). I will run a daily cron job on the FreeNAS box to run this script, thereby keeping the last 30 days of configs saved.
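As a usage sketch, the daily run can be scheduled through the FreeNAS GUI (Tasks → Cron Jobs) or with a crontab entry along these lines (the path is a placeholder; adjust to wherever the script actually lives):

```shell
# m  h  dom mon dow  command   -- run the backup daily at 02:30 (placeholder path)
30   2  *   *   *    cd /mnt/tank/pfsense-backups && ./Backup.sh >/dev/null 2>&1
```

Note the cd first: the script writes cookies.txt, the csrf files and the .xml configs relative to the current directory, and the `ls -td *.xml` rotation only works if cron runs it from that directory.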
It's working in testing - but if anyone has any pointers on what I can improve (or may have overlooked), please don't hesitate to educate me. Also, seeing as I've started a thread and basically answered my own question, if this needs to be deleted, so be it. But I've left it here for anyone else in the future.
Cheers