i need to send the output of an ls command to a file, but also capture any errors from that command and send them to a log file and to the screen.
if it's only possible to output them to a log file and not the screen then that's fine.
this is what i've tried so far, but it won't populate log.txt. i've purposefully added a directory (scripts) that doesn't exist to force the ls command to fail.
ls -lR > scripts/release_file.txt 2>&1 | tee -a log.txt
i suspect it's the first redirection that's causing the problem, as the command usually works when i'm not sending the ls results to another file.
the cat /dev/null > log.txt will empty the log file; then the ls is run, and its results (including any errors) are sent to log.txt and also output to the screen.
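Put together, that sequence might look like this (a minimal sketch; the ls arguments are placeholders):

```shell
cat /dev/null > log.txt        # truncate the log file
ls -lR 2>&1 | tee -a log.txt   # stdout and stderr both reach log.txt and the screen
```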
$ ls data notthere
ls: notthere: No such file or directory
data
$ ls data not_there 2>&1 >stdout | tee stderr
ls: not_there: No such file or directory
$ cat stdout
data
$ cat stderr
ls: not_there: No such file or directory
$
In bash, you can turn on the option "pipefail": $? will then return the exit status of the last command in the pipeline that failed (otherwise, it returns the exit status of the last command OF the pipeline, whether or not an earlier one failed). Use "set -o pipefail" to turn this feature on. Observe:
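A minimal demonstration, with false standing in for a failing command in the pipeline:

```shell
false | true
echo $?              # 0: only the last command of the pipeline is consulted

set -o pipefail
false | true
echo $?              # 1: the exit status of the command that failed
```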
yeah, i've already considered the extra check to see if the file was created. i was hoping there was a more direct solution using a method i've never seen before. a super advanced system admin/unix expert method.
In fact there is, and i am glad to be able to explain, once in a lifetime, "super advanced system admin expert methods" to the audience: you can redirect output streams with the "exec" builtin:
exec 2>output.stderr
will redirect the standard error of every subsequent command to output.stderr.
This way you do not have to worry about truncating the file at all:
exec 2>output.stderr # truncates the file and redirects stderr to it
job1
job2
job3
exec 2>&- # closes stderr (note: this does not restore the original stderr)
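Since closing the descriptor does not bring the original stderr back, a common refinement (a sketch, using fd 3 as a scratch descriptor) is to save it first:

```shell
exec 3>&2              # save the original stderr on fd 3
exec 2>output.stderr   # truncate the file and redirect stderr to it
echo "something went wrong" >&2
exec 2>&3 3>&-         # restore the original stderr and close fd 3
```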
"super advanced system admin expert methods" means writing answers without first understanding the question.
Seriously, at the *beginning* of the script, you could do something like this:
exec >standard.out 2>error.out
Every time there's an error, you stop the script and output the error.out file. It doesn't deal with automatically printing the normal output, though. For that, you could go to another terminal and run:
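The command was left out here; a guess at what was intended is a follow of the redirected output file:

```shell
tail -f standard.out   # follow the script's redirected stdout as it grows
```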
yes, you're right that method doesn't suit my purpose. i think i'll do a simple check to see if the file got created.
ls -lR ${BUILD_CODE} 2>&1 >${REL_FILES}/${RELNO}.txt | tee -a ${LOGFILE}
if [ ! -e ${REL_FILES}/${RELNO}.txt ]
then
echo "ERROR - release file creation failed"
exit 1
fi
this is all done and dusted now, but i'd just like to say to Arunprasad: you were correct, it's -s i need to use, because the file gets created regardless of any errors. the indication of failure is the size of the file.