SFTP fails from crontab but works from terminal

Dear community,
A strange issue is driving me crazy.

I have a simple script to transfer a file to a remote system:

#!/bin/bash
echo "put /tmp/server.log" > /tmp/server1_transfer.sftp
sftp -b /tmp/server1_transfer.sftp user@10.99.1.2:

SSH key authentication is set up between the client and the server, so when I run the script from a root terminal, the file is transferred correctly.

Now I have added this script to the root crontab so it runs automatically, but the transfer fails!! :wall:

It seems the SSH key doesn't work from root's cron, because every time the script runs, the system sends me the following mail:

Received: (from root@localhost)
        by xyz (8.13.8/8.13.8/Submit) id q7GGv1qv022647;
        Thu, 16 Aug 2012 18:57:01 +0200
Date: Thu, 16 Aug 2012 18:57:01 +0200
Message-Id: <201208161657.q7GGv1qv022647@xyz>
From: root@xyz (Cron Daemon)
To: root@xyz
Subject: Cron <root@xyz1> /tmp/test.sh
Content-Type: text/plain; charset=ANSI_X3.4-1968
Auto-Submitted: auto-generated
X-Cron-Env: <SHELL=/bin/sh>
X-Cron-Env: <HOME=/root>
X-Cron-Env: <PATH=/usr/bin:/bin>
X-Cron-Env: <LOGNAME=root>
X-Cron-Env: <USER=root>

Permission denied, please try again.
Permission denied, please try again.
Permission denied (publickey).

Please, could someone explain why this works from the terminal but not from crontab? :frowning:

Many thanks
Lucas

Make sure it's using the correct key. Point sftp at the key with -i if necessary.

The copy command fails because cron is not a regular shell. I'm on my phone, but the scp/ssh man pages explain how to pass a config file or key file option on the command line.

Thanks to you both for the tips, but I don't believe I can pass an SSH key path to sftp. This is what I see in the sftp manual:

     sftp [-1Cv] [-B buffer_size] [-b batchfile] [-F ssh_config] [-o ssh_option] [-P sftp_server_path]
          [-R num_requests] [-S program] [-s subsystem | sftp_server] host
     sftp [[user@]host[:file [file]]]
     sftp [[user@]host[:dir[/]]]
     sftp -b batchfile [user@]host

What's your system?

# cat /etc/redhat-release
Red Hat Enterprise Linux Server release 5.7 (Tikanga)

You have openssh then, which supports -i. See man sftp.

Nope :frowning:

sftp -b /tmp/server1_transfer.sftp -i /root/.ssh/authorized_keys user@10.99.1.2:
sftp: illegal option -- i
usage: sftp [-1Cv] [-B buffer_size] [-b batchfile] [-F ssh_config]
            [-o ssh_option] [-P sftp_server_path] [-R num_requests]
            [-S program] [-s subsystem | sftp_server] host
       sftp [[user@]host[:file [file]]]
       sftp [[user@]host[:dir[/]]]
       sftp -b batchfile [user@]host

Note the current working directory when the script is run manually.
Try making the first line of your script a cd to that directory.
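A minimal sketch of that suggestion (the directory here is only an example; substitute whatever `pwd` shows in the terminal where the script works):

```shell
#!/bin/sh
# Sketch: fix the working directory explicitly instead of inheriting
# whatever cron happens to use ($HOME here is only an illustration).
cd "${HOME:-/root}" || exit 1
echo "put /tmp/server.log" > /tmp/server1_transfer.sftp
```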

Same result: :frowning:
Permission denied (publickey,password).
Couldn't read packet: Connection reset by peer

---------- Post updated at 01:18 PM ---------- Previous update was at 12:57 PM ----------

Well, I can successfully use the rexec command from crontab, providing a username and password.

Now I need to transfer the file. I have tried several methods:
SFTP: doesn't work from crontab.
SCP (OpenSSH): doesn't work, since the destination server is HP OpenVMS and for some reason it won't accept the SCP protocol.
Plain FTP: doesn't work; I receive the following error message:

Connected to 10.99.1.2.
220 FTP Server (Version 5.6) Ready.
502 AUTH is unimplemented.
502 AUTH is unimplemented.
KERBEROS_V4 rejected as an authentication type

At this point, is there any other way to copy this file to the remote OpenVMS server?

sftp accepts ssh options through the -o option:

sftp -o IdentityFile=$HOME/.ssh/id_rsa
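Folding that into the original script might look like this (a sketch only; the key path is an assumption, and BatchMode/ConnectTimeout are just defensive options for unattended runs):

```shell
#!/bin/sh
# Sketch: name the private key explicitly so no ssh-agent is needed
# under cron. The key path is assumed; adjust to your actual key.
BATCH=/tmp/server1_transfer.sftp
echo "put /tmp/server.log" > "$BATCH"
# BatchMode=yes fails fast instead of prompting for a password;
# ConnectTimeout keeps the cron job from hanging on a dead server.
sftp -b "$BATCH" \
     -o IdentityFile=/root/.ssh/id_rsa \
     -o BatchMode=yes \
     -o ConnectTimeout=10 \
     user@10.99.1.2: || echo "transfer failed" >&2
```

Note that the earlier `-i /root/.ssh/authorized_keys` attempt also pointed at the wrong file: authorized_keys is the server-side list of public keys, while IdentityFile (like -i) must name the client's private key.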

Does your key have a passphrase? Maybe you are running ssh-agent: your interactive shell has the SSH_AUTH_SOCK environment variable set, but the cron job does not.
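The environment difference is easy to demonstrate; `env -i` below simulates cron's near-empty environment (a generic sketch, nothing thread-specific):

```shell
#!/bin/sh
# Sketch: cron starts jobs with a stripped environment, so the
# SSH_AUTH_SOCK variable that ssh-agent exports into a login shell
# is absent, and a passphrase-protected key cannot be unlocked.
cron_view=$(env -i /bin/sh -c 'echo "${SSH_AUTH_SOCK:-unset}"')
echo "SSH_AUTH_SOCK as a cron job would see it: $cron_view"
```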

ftp -vu host # The 'u' disables auto-authentication
passive # I believe vms requires passive mode
Then try the 'user' command

Well, thanks all for hints,
I resolved the issue using NCFTP (see the ncftp(1) manual page).

It also lets me set a timeout for when one of the destination servers is down (e.g. for maintenance), which keeps the script from hanging through a long timeout:

/usr/local/bin/ncftpput -t $ftp_timeout -r 1 -u $username -p $password $server1 / /tmp/server.log 2> /dev/null

-t = timeout in seconds
-r = number of retry attempts

You do realize this makes your password visible in plaintext to the entire system for the entire duration of the connection, yes!?

Yes, I already realized that; that's why I created a special limited account for this operation (file transfer and remote command execution).

I tried a lot of combinations between Linux and OpenVMS, but OpenVMS is the worst OS I have ever seen, and I always failed! :mad:

Putting it in a limited account doesn't change who can see the password... It's visible to anyone on the system, limited or not, via ps.
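One way to get the password off the command line, assuming this build of the NcFTP tools supports the -f option (the host, credentials, and paths below are placeholders, not values from the thread):

```shell
#!/bin/sh
# Sketch: ncftpput -f reads host/user/pass from a file, so the password
# never appears in the process list. All values are placeholders.
CFG=/tmp/ncftp_login.cfg      # in practice, keep this under a root-only directory
umask 077                     # file is created readable by the owner only
cat > "$CFG" <<'EOF'
host 10.99.1.2
user transferuser
pass secret
EOF
# /usr/local/bin/ncftpput -f "$CFG" -t 30 -r 1 / /tmp/server.log
```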