Bash script works when called manually but not via cron?

Hi, I've got a Bash backup script that I'm trying to run nightly on a directory via a cron job. If I ssh in and run the script manually, it works flawlessly. If I set it up to run from cron, everything is totally messed up and I don't even know where to begin.

Basically the path structure is

/home/username/domainname.com/_mybash.sh

where I'm trying to backup domainname.com

If I ssh in, cd domainname.com, and then run sh _mybash.sh from the command line, the script runs fine. If I set up a cron job pointing at /home/username/domainname.com/_mybash.sh, the script runs but attempts to back up everything under username... I don't understand.

This is the script:

#!/bin/bash

#/************************ EDIT VARIABLES ************************/
projectName=mydomain
backupDir=/home/username/_backups
#/************************ //EDIT VARIABLES **********************/

fileName=$projectName-$(date +"%Y-%m-%d-%H%M")
host=$(grep DB_HOST "wp-config.php" |cut -d "'" -f 4)
username=$(grep DB_USER "wp-config.php" | cut -d "'" -f 4)
password=$(grep DB_PASSWORD "wp-config.php" | cut -d "'" -f 4)
dbName=$(grep DB_NAME "wp-config.php" |cut -d "'" -f 4)

# Color Output
green='\033[1;32m'
nc='\033[0m' # No Color

# Initial setup
TODAY=$(date)
echo "----------------------------------------------------
$(tput bold)Date:$(tput sgr0) $TODAY
$(tput bold)Host:$(tput sgr0) mydomain.com automated backup"

echo "----------------------------------------------------"
echo "Dumping MySQL..."
mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > $fileName.sql.gz
echo "Done!"

echo "----------------------------------------------------"
echo "Archiving Files..."
tar -zcf $fileName.tar.gz * .htaccess
echo "Done!"
echo "----------------------------------------------------"
echo "Cleaning..."
rm -f $fileName.sql.gz
echo "Done!"

echo "----------------------------------------------------"
mkdir -p $backupDir;
echo "Moving file to backup dir..."
mv $fileName.tar.gz $backupDir
echo "----------------------------------------------------"
echo "Removing old backups..."
find $backupDir -type f -mtime +30 -exec rm {} \;
echo -e "${green}Backup of $projectName Complete!${nc}"

It's probably not finding the paths of certain things.
What user do you run the cron job as?
Put set -x at the top of the script to debug. Did you try placing the cd command at the top of the script?
Don't expect the escape sequences to produce color: under cron, standard input and standard output are not terminals.
Use full paths to executables.
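
For example, a minimal debugging header for the top of the script (the log path is just a placeholder):

#!/bin/bash
exec >> /tmp/backup-debug.log 2>&1   # send all output to a log file when run from cron
set -x                               # then trace every command as it runs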

Usually the reason is the PATH value. Your login environment's PATH is different from the one cron uses.

In the terminal session save the env:

cd /home/username/domainname.com
env > myenv   # note: any values containing spaces would need quoting before this file can be sourced
# or save only PATH (written as an assignment so it can be sourced later)
echo "PATH=$PATH" > myenv

Then add to the script:

#!/bin/bash

.  ./myenv   # set env as you have after login

# print out env - compare the cron and login sessions when debugging
env

...
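
To actually compare the two environments, one approach (the file names here are just examples) is to dump each and diff them:

# from your login shell:
env | sort > /tmp/env.login
# from a temporary cron entry:
* * * * * env | sort > /tmp/env.cron
# then compare:
diff /tmp/env.login /tmp/env.cron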

Then the crontab line. Change directory to where you have the script and wp-config.php. Here is an example which saves output, stderr included, to a log file:

0 3 * * * (cd /home/username/domainname.com;./_mybash.sh ) > /home/username/domainname.com/cron.log 2>&1
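
Many cron implementations (Vixie cron and its derivatives, for instance) also let you set variables at the top of the crontab itself, which avoids sourcing a saved environment file:

PATH=/usr/local/bin:/usr/bin:/bin
0 3 * * * cd /home/username/domainname.com && ./_mybash.sh > cron.log 2>&1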

yes, CD!

Here is the latest, stripped-down version. I removed all the tput and formatting stuff since cron doesn't make use of it. A few questions: what permissions would make the most sense from a security standpoint, 755 or 700? How can I make this more secure? How could I organize the archive so there is a www and a database folder that the files are placed in before archival (see the sketch after the script below)? Since this was originally made to be run manually at the command prompt, a lot of the output is now sort of irrelevant under cron... are there relevant things I could add to provide more useful info in my cron email?

Thanks

#!/bin/bash

cd ~/mysub.example.com

#/************************ EDIT VARIABLES ************************/
projectName=mysub_demo
backupDir=/home/username/_backups
#/************************ //EDIT VARIABLES **********************/

fileName=$projectName-$(date +"%Y-%m-%d-%H%M")
host=$(grep DB_HOST "wp-config.php" |cut -d "'" -f 4)
username=$(grep DB_USER "wp-config.php" | cut -d "'" -f 4)
password=$(grep DB_PASSWORD "wp-config.php" | cut -d "'" -f 4)
dbName=$(grep DB_NAME "wp-config.php" |cut -d "'" -f 4)

# Initial setup
TODAY=$(date)
echo "----------------------------------------------------
Date: $TODAY
Host: mysub.example.com automated backup"

# Backup DB
echo "----------------------------------------------------"
echo "Dumping MySQL..."
mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > $fileName.sql.gz
echo "Done!"

# Backup files
echo "----------------------------------------------------"
echo "Archiving Files..."
tar -zcf $fileName.tar.gz * .htaccess
echo "Done!"
echo "----------------------------------------------------"
echo "Cleaning..."
rm -f $fileName.sql.gz
echo "Done!"

# Move to backup directory
echo "----------------------------------------------------"
mkdir -p $backupDir;
echo "Moving file to backup dir..."
mv $fileName.tar.gz $backupDir

# Keep last 30 Backups
echo "----------------------------------------------------"
echo "Removing old backups..."
find $backupDir -type f -mtime +30 -exec rm {} ;
echo "Backup of Complete!" 

The line in your script:

find $backupDir -type f -mtime +30 -exec rm {} ;

should be changed to:

find $backupDir -type f -mtime +30 -exec rm {} \;

or preferably:

find $backupDir -type f -mtime +30 -exec rm {} +

A find -exec primary must be terminated by a semicolon or a plus sign. The unescaped semicolon in your script is eaten by the shell (it is a command separator there), so find never sees it. Using + instead of \; is more efficient when there is more than one file to remove, because rm is invoked once with many arguments rather than once per file.
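
To illustrate, with hypothetical file names, this is roughly what find ends up executing in each case:

# with \; - one rm per matched file:
rm /home/username/_backups/a.tar.gz
rm /home/username/_backups/b.tar.gz
# with +  - one rm with all matched files as arguments:
rm /home/username/_backups/a.tar.gz /home/username/_backups/b.tar.gz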


Thanks for pointing that out; I thought I had changed that. How could I go about catching errors so they are output in the email? Also, is the .sh extension necessary on my bash/cron script? Here is the updated script without all the output.

#!/bin/bash

cd ~/mysub.example.com

#/************************ EDIT VARIABLES ************************/
projectName="mysub_demo"
backupDir="/home/username/_backups"
#/************************ //EDIT VARIABLES **********************/

fileName=$projectName-$(date +"%Y-%m-%d-%H%M")
host=$(grep DB_HOST "wp-config.php" |cut -d "'" -f 4)
username=$(grep DB_USER "wp-config.php" | cut -d "'" -f 4)
password=$(grep DB_PASSWORD "wp-config.php" | cut -d "'" -f 4)
dbName=$(grep DB_NAME "wp-config.php" |cut -d "'" -f 4)

# Backup DB
mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > $fileName.sql.gz

# Backup files
tar -zcf $fileName.tar.gz * .htaccess
rm -f $fileName.sql.gz

# Move to backup directory
mkdir -p $backupDir;
mv $fileName.tar.gz $backupDir

# Keep last 30 Backups
find $backupDir -type f -mtime +30 -exec rm {} +

Do I need to wrap each step in some kind of if/else to do error checking? Something like the following?

if [ "$?" = "0" ]; then
    #some command
else
    echo "Error. Couldn't do some command!" 1>&2
    exit 1
fi
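
One common alternative to testing $? after the fact is to test the command itself (a sketch, not your final script; note that in a plain pipeline bash reports the exit status of the last command, so set -o pipefail is needed for a mysqldump failure to be noticed):

set -o pipefail   # a pipeline now fails if any stage fails

if ! mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > "$fileName.sql.gz"; then
    echo "Error: mysqldump of $dbName failed" >&2
    exit 1
fi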


Sorry for the additional post, but this is a work in progress and keeps changing. This is the latest... trying to do a bit of checking and handling here... I really don't know what I'm doing though. Could use some advice.

#!/bin/bash

cd ~/mysub.example.com || exit

#/************************ EDIT VARIABLES ************************/
projectName="mysub_demo"
backupDir="/home/username/_backups"
#/************************ //EDIT VARIABLES **********************/

fileName=$projectName-$(date +"%Y-%m-%d-%H%M")
host=$(grep DB_HOST "wp-config.php" |cut -d "'" -f 4)
username=$(grep DB_USER "wp-config.php" | cut -d "'" -f 4)
password=$(grep DB_PASSWORD "wp-config.php" | cut -d "'" -f 4)
dbName=$(grep DB_NAME "wp-config.php" |cut -d "'" -f 4)

# Backup DB
mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > "$fileName.sql.gz"

# Backup files
tar -zcf "$fileName.tar.gz" * .htaccess &&
    rm -f "$fileName.sql.gz"

# Move to backup directory
mkdir -p "$backupDir";
mv "$fileName.tar.gz" "$backupDir"

# Keep last 30 Backups
find "$backupDir" -type f -mtime +30 -exec rm {} +

I would start (as already pointed out) by using full pathnames to all of your binaries.

Example. Original:

host=$(grep DB_HOST "wp-config.php" |cut -d "'" -f 4)

Change to:

host=$(/bin/grep DB_HOST "wp-config.php" | /bin/cut -d "'" -f 4)

Repeat for the other commands. Also, the host variable above reads the wp-config.php file (your WordPress configuration file) relative to the current directory. Your first cd command is:

cd ~/mysub.example.com

That relies on ~ expanding to $HOME rather than spelling out an absolute path. Cron does set HOME, but it may not match your login environment, so play it safe and use the full path: cd /home/username/mysub.example.com.

To find the exact location of a binary, in your CLI just run

which <binary>

to find out where it is. For example:

which tar
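
On a typical Linux system the output looks something like the following (the exact paths vary from system to system, which is why it's worth checking rather than guessing):

$ which tar
/bin/tar
$ which grep
/bin/grep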