My problem is this: when I run a script from the command line it works, but it fails when I run it from crontab.
Basically I want to send a file to HDFS.
I thought it was because crontab does not know the path to hdfs, so I put in the full path, but it still does not work. Here is the error from the piece that fails:
put: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "aocsv155bed0p.foo.acme.net/10.12.11.133"; destination host is: "aocsv155bna1p.foo.acme.net":8020;
Thanks for the links, sadique.
The script works when I launch it from the command line, and I don't have this credentials problem there. It is the same user (me) in both cases, but when I run it from crontab I get this error!
for file in $FILE_LIST
do
    # Quote the paths so filenames with spaces don't break the command
    /usr/bin/hdfs dfs -put -f "$source_directory/$file" "$hdfs_target_directory"
    res=$?
    if [ "$res" -ne 0 ]; then
        echo "the hdfs put command failed with: $res"
    else
        echo "the hdfs put command succeeded with: $res"
    fi
done
I believe cron jobs start with a reduced environment. I would guess that you either have "logged in" to something prior to running the script, or there is a setting in your .profile or .bashrc file that allows your script to be run from the terminal. This probably won't work, but try prefixing your command line (within the crontab) with /usr/bin/env (example):
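For instance, a crontab entry along these lines (a sketch only; the schedule, script path, and log path are assumptions, not taken from your setup):

```shell
# Hypothetical crontab line: run the upload script at 02:00 daily via
# /usr/bin/env, logging stdout and stderr so cron failures are visible.
0 2 * * * /usr/bin/env bash /home/youruser/send_to_hdfs.sh >> /tmp/send_to_hdfs.log 2>&1
```

Redirecting output to a log file is worth doing regardless, since cron otherwise mails or discards the error messages you need for debugging.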
#!/bin/bash
# Reference: http://stackoverflow.com/questions/59895/can-a-bash-script-tell-what-directory-its-stored-in
SOURCE="${BASH_SOURCE[0]}"
BIN_DIR="$( dirname "$SOURCE" )"
while [ -h "$SOURCE" ]
do
SOURCE="$(readlink "$SOURCE")"
[[ $SOURCE != /* ]] && SOURCE="$BIN_DIR/$SOURCE"
BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
done
BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
LIB_DIR=$BIN_DIR/../lib
# Autodetect JAVA_HOME if not defined
. $LIB_DIR/bigtop-utils/bigtop-detect-javahome
export HADOOP_LIBEXEC_DIR=$LIB_DIR/hadoop/libexec
exec $LIB_DIR/hadoop-hdfs/bin/hdfs "$@"
---------- Post updated at 08:23 AM ---------- Previous update was at 08:12 AM ----------
Even when I add /usr/bin/env it doesn't work.
To use a Hadoop command you first need a Kerberos ticket, obtained with the kinit command. This ticket is created automatically for me on the command line, but crontab ignores it.
My question is: how do I get crontab to pick up this ticket?
I don't understand what you meant by the above (in red). Do you run kinit on the command line?
As I said earlier, your interactive environment and your cron environment are different. Your active Kerberos tickets will not be available in the cron environment.
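One way to see the difference for yourself (a debugging sketch; the file paths are arbitrary) is to dump cron's environment to a file and compare it with your interactive shell's:

```shell
# Add a temporary crontab entry to capture what cron actually sees:
#   * * * * * env > /tmp/cron_env.txt
# Then, from your interactive shell:
env > /tmp/shell_env.txt
diff /tmp/cron_env.txt /tmp/shell_env.txt
```

Variables such as PATH and KRB5CCNAME (the Kerberos ticket cache location) will typically differ, which is exactly why commands that work interactively fail under cron.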
I think you need to save the Kerberos ticket in such a way that your script can pick it up, using kinit (which is not present in your script). I think your best bet is to google for how to do this.
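One common pattern (a sketch only, not tested against your cluster; the keytab path, principal name, and ticket-cache location are all assumptions) is to create a keytab file for your user once, then have the cron script run kinit non-interactively from it before calling hdfs:

```shell
#!/bin/bash
# Use a dedicated ticket cache so the cron job does not depend on
# whatever cache your interactive login happened to create.
export KRB5CCNAME=/tmp/krb5cc_cron_$(id -u)

# Obtain a fresh ticket from the keytab (create the keytab once with
# ktutil and keep it readable only by you: chmod 600).
/usr/bin/kinit -kt /home/youruser/youruser.keytab youruser@FOO.ACME.NET || exit 1

# With a valid ticket in KRB5CCNAME, the hdfs put should now authenticate:
/usr/bin/hdfs dfs -put -f "$source_directory/$file" "$hdfs_target_directory"
```

The `|| exit 1` makes the script stop early if kinit fails, so you see the authentication error instead of a confusing GSS failure from hdfs.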