File descriptors, redirecting output, and stdout

Hello all. I've been lurking here for a year or two and finally decided to post.

I need some assistance with file descriptors, stdout, and redirecting output. I've searched through a number of very helpful threads here (unfortunately I can't link to any of them yet due to my low post count...), and I've done a fair amount of experimenting, but I haven't been able to come up with a solution. (I really tried to do my own legwork before asking for help.)

I'm writing a ksh script that opens up two file descriptors, one for the "regular" log file (exec 3> myfile) and one specifically for an error log file (exec 4> my_error_file).

I want the output directed to specific files, and I want that to be mandatory and beyond user control. However, I would also like the script to write to the terminal as it executes.

I can redirect to my file descriptors just fine; I can't figure out how to simultaneously redirect to the terminal, though. I have unsuccessfully tried to use tee and I've tried redirecting my output in various ways, but I'm missing something fundamental. I would be very appreciative if anybody could offer any advice!

Here are the relevant code snippets and some pseudocode of what I'm doing and what I've tried thus far:

#!/usr/bin/ksh
LOG_FILE=/home/mydir/regular_log_file.log
LOG_ERROR_FILE=/home/mydir/error_log_file.log
 
exec 3> $LOG_FILE # send this to regular_log_file.log. Non-error information messages get written here
exec 4> $LOG_ERROR_FILE # send this to error_log_file.log. If it blows up, it gets written here
 
# usage() function
usage(){
print "There are lots of print commands in here that show how to use this script."
print "I want these to display to both the screen and the relevant log file."
}
# The usage text reaches the log file just fine: I call the function as 'usage 1>&4',
# so I don't have to specify where the output goes on each individual print.
 
# send_email() function
# send email if the script fails
send_email(){ 
cat $LOG_ERROR_FILE | mailx ... ... ...
}

... and later in the script ...

# attempt to touch tarfile to see whether we can write
# the TARFILE_LOCATION and TARFILE variables get set properly in code I didn't include; trust me
touch ${TARFILE_LOCATION}/${TARFILE} > /dev/null 2>&1
if [[ $? -ne 0 ]]; then
    print -u4 " "
    print -u4 "ERROR: Unable to write tarfile to $TARFILE_LOCATION."
    print -u4 "       Script cannot recover from this error. Exiting..."
    send_email
    exit 1
fi

So, as you can see, all of my error output gets processed through "print -u4" commands, removing the need for excessive amounts of > and >> redirects. The script looks tidy, I don't have to carefully troubleshoot 700 lines to make sure that I didn't accidentally > when I meant to >>, and life is good.

Debugging is a pain, though. I have to go dig up the log files each and every time in order to see how the script behaved. I would like to be able to continue to call "print -u4 text" but also be able to see that output in stdout, so when I (or other people) use the script, it appears to actually do something.

I know the script is doing something, but a lot of people will use this, see a blank line and "hung" cursor, and panic...

Any help would be much appreciated! Thank you!

Instead of redirecting to 3 and 4 and using 'print -u4', you could've redirected stdout and stderr themselves with 'exec 1>' and 'exec 2>' respectively, and not modified your print commands at all.
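For illustration, reusing the variables from your script, that approach would look something like this (a sketch, not a drop-in replacement):

exec 1> $LOG_FILE        # every plain 'print' now lands in the regular log
exec 2> $LOG_ERROR_FILE  # everything written to stderr lands in the error log

print "informational message"        # goes to regular_log_file.log
print -u2 "ERROR: something broke"   # goes to error_log_file.log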

Of course, since you seem to wish to preserve stdout/stderr now, it'd be better to keep stdout/err as is.

A redirection can't cause something to print twice. If you want to print twice, you have to print twice. You can make a function to do so, however, simplifying things for you.

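# normal messages: always written to the regular log on fd 3, and echoed
# to the terminal as well when DEBUG is set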
function output
{
        print "$@" >&3
        [ -n "$DEBUG" ] && print "$@"
}

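# error messages: always written to the error log on fd 4, and echoed
# to stderr as well when DEBUG is set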
function debug
{
        print "$@" >&4
        [ -n "$DEBUG" ] && print "$@" >&2
}

If you want to see the output appear on the screen, run it like DEBUG=1 ./myscript

or export the DEBUG variable in your shell or profile.
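Something like this in your profile (or wherever you keep your environment settings) would turn it on for every run:

export DEBUG=1   # any non-empty value makes the script echo to the terminal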

Corona688,

Thank you for the advice.

Multiple users in different parts of my organization will end up using this script. I decided to write to 3 and 4 because I did not want other users controlling where the log files got written, such as by calling the script as:

 
myscript.ksh > /some/other/location/output.log 2>&1

I'm paranoid; I hope that is forgivable. :)

... I think I understand what you're saying, though.

Even if someone tries to co-opt my output with a redirect like the one shown above, the script's own 'exec 2>' runs after the shell has set up the caller's redirection, so fd 2 ends up pointing where I said, overriding the other user's redirect.
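If I understand it right, even something as simple as this near the top of the script is enough:

exec 2> $LOG_ERROR_FILE  # runs inside the script, so fd 2 now points at my error log
                         # no matter what the caller redirected it to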

I tried it and it seems to work that way. Thanks! I never thought about doing it this way.