*SOLVED* script to back up config.xml not working from Crontab



  • Hello,

    I have a script that I use to back up the config.xml, and it works when fired manually, but for some reason it is not running from crontab. Here's the script code:

    
    #!/bin/sh
    cd /root
    rm -Rf *.xml
    wget -qO/dev/null --keep-session-cookies --save-cookies cookies.txt --post-data 'login=Login&usernamefld=admin&passwordfld=password' http://10.0.0.1/diag_backup.php
    wget --keep-session-cookies --load-cookies cookies.txt --post-data 'Submit=download' http://10.0.0.1/diag_backup.php -O config-router-`date +%Y%m%d%H%M`.xml
    ftp -n -v ftp.myserver.com <<EOF
    quote user username
    quote pass password
    binary
    cd workingdir
    lcd /root
    prompt
    mput *.xml
    quit
    EOF
    exit
    
    

    This works great manually, but I want the config.xml backed up and uploaded automagically every hour. I installed the cron package and added the following entry:

    60      *       *       *       *       root    /usr/bin/nice -n20 /root/backup.sh
    

    Any ideas why this wouldn't work as expected? Thanks in advance for any advice or info!



    You could be missing just a

    chmod +x /root/backup.sh

    Sometimes you need to put the full path to all binaries you call in the script.
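
    For example, a sketch only, since binary locations vary by system (you can confirm them with "which wget" and "which ftp"):

    # Call each binary by its absolute path instead of relying on cron's minimal PATH.
    /usr/local/bin/wget -q http://10.0.0.1/diag_backup.php -O /dev/null
    /usr/bin/ftp -n -v ftp.myserver.com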



  • @marcelloc:

    You could be missing just a

    chmod +x /root/backup.sh

    Sometimes you need to put the full path to all binaries you call in the script.

    Thanks for the assist. I executed the above. Just need to wait an hour to see if it works!



  • No dice. I'm having a hard time understanding why this script works manually (./backup.sh), but will not work from the crontab…



  • Did you act on:
    @marcelloc:

    Sometimes you need to put the full path to all binaries you call in the script.

    So, for example, in your script you may need to change a sh command to /bin/sh. Cron jobs don't run in the same context as interactive logins; in particular, the default PATH for shell commands may not be the same in a cron job as in an interactive login.
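
    One common workaround, sketched below, is to set PATH explicitly at the top of the script, so commands resolve the same way no matter what environment cron provides:

    #!/bin/sh
    # Give the cron job the same search path an interactive shell would have.
    PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin
    export PATH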



Yeah, I tried changing to /path/to/wget and still get nothing. Looking at rc.update_bogons.sh, I see that the script calls several binaries without using the full path. Still a mystery…



Try redirecting the script's output to a log file in your cron entry.
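
    For example, something like this in the cron entry; the 2>&1 part folds stderr into the same file, which is where most of the error messages go:

    # Append both stdout and stderr of every run to one log file.
    0       *       *       *       *       root    /root/backup.sh >> /root/backup.log 2>&1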



  • @marcelloc:

    Try redirecting the script's output to a log file in your cron entry.

    Done. Changed cron to run this script every minute so I hopefully get an idea about what's going on. In the meantime, here's the output from a manual run of the script. Maybe someone with experience can spot something there…

    --2012-01-07 10:45:39--  http://10.0.0.1/diag_backup.php
    Connecting to 10.0.0.1:80... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: 1106307 (1.1M) [application/octet-stream]
    Saving to: `config-router-201201071045.xml'
    
    100%[=========================================>] 1,106,307   --.-K/s   in 0.01s
    
    2012-01-07 10:45:39 (75.7 MB/s) - `config-router-201201071045.xml' saved [1106307/1106307]
    
    Connected to ftp.myserver.com.
    220 Freakin' Awesome FTP Service
    331 Password required for user.
    230 User logged in.
    200 Type set to I.
    250 CWD command successful.
    250 CWD command successful.
    250 CWD command successful.
    Local directory now: /root
    Interactive mode off.
    local: config-router-201201071045.xml remote: config-router-201201071045.xml
    500 'EPSV': command not understood
    227 Entering Passive Mode (204,14,91,28,13,235).
    125 Data connection already open; Transfer starting.
    100% |**********************************************************************************|  1080 KB  121.29 KB/s    00:00 ETA
    226 Transfer complete.
    1106307 bytes sent in 00:09 (117.05 KB/s)
    221
    


  • So far I have nothing. I made some of the suggested changes, but still I get no output, no clues, no obvious mistake. The thing still works manually though… Here's the modified script:

    #!/bin/sh
    
    echo "Changing directory." | logger
    
    cd /root
    
    echo "Removing old config files." | logger
    
    rm -Rf *.xml
    
    echo "Fetching config file from firewall." | logger
    
    /usr/sbin/wget -qO/dev/null --keep-session-cookies --save-cookies cookies.txt --post-data 'login=Login&usernamefld=user&passwordfld=password' http://10.0.0.1/diag_backup.php >> /root/backup.log
    
    /usr/sbin/wget --keep-session-cookies --load-cookies cookies.txt --post-data 'Submit=download' http://10.0.0.1/diag_backup.php -O config-router-`date +%Y%m%d%H%M`.xml >> /root/backup.log
    
    echo "Initiating FTP connection." | logger
    
    /usr/bin/ftp -n -v ftp.myserver.com <<EOF
    echo "Using programmed credentials." | logger
    
    quote user user
    
    quote pass password
    
    echo "Switch to BIN mode for FTP transfer." | logger
    
    binary
    
    echo "Changing remote directory." | logger
    
    cd backups
    
    cd freebsd
    
    cd config
    
    echo "Changing local working directory." | logger
    
    lcd /root
    
    prompt
    
    echo "Uploading current config.xml." | logger
    
    mput *.xml
    
    echo "Disconnecting FTP session." | logger
    
    quit
    EOF
    
    echo "Exiting script." | logger
    
    exit
    
    

    Oddly enough, I get "?invalid command" wherever I have piped to logger, which may explain why I haven't seen anything in the system logs. Also, /root/backup.log is empty, so whatever wget is doing is not being logged. It seems to me this should work, but I am definitely not a programmer. I'm using rc.update_bogons.sh as a template and am piping to logger just as the bogons script does, but when I run bogons it sits quietly doing its thing.
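
    (Two likely mechanics here, for whoever hits this later: everything between <<EOF and the closing EOF is fed to ftp as FTP commands, so the echo ... | logger lines inside the here-document are what ftp is rejecting with "?invalid command"; and wget writes its progress messages to stderr, so >> /root/backup.log captures nothing without 2>&1. A sketch of the FTP section with the logging moved outside the here-document:)

    #!/bin/sh
    # Shell commands like echo ... | logger must stay outside the here-document;
    # ftp only understands FTP commands between <<EOF and EOF.
    echo "Initiating FTP connection." | logger
    /usr/bin/ftp -n -v ftp.myserver.com <<EOF
    quote user user
    quote pass password
    binary
    cd backups
    cd freebsd
    cd config
    lcd /root
    prompt
    mput *.xml
    quit
    EOF
    echo "FTP session finished." | logger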



  • Solved. Looks like it was a missing full path to the FTP binary plus a botched cron entry. Thanks to those who assisted with this!
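
    For reference, a minimal sketch of a corrected entry, assuming the same system-crontab format as above; the minute field accepts 0-59, so the 60 in the original entry was likely the botch:

    # Run hourly at minute 0; call the script by its full path.
    0       *       *       *       *       root    /usr/bin/nice -n20 /root/backup.sh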

