Could you please let us know what you are trying to do here? It doesn't look like you are simply trying to traverse directories and collect files. If you explain your requirement in detail, we can try to help.
You can keep the ls -tcrd options, but -l has to go. And parameter expansion in the shell is MUCH faster than invoking a separate awk for every file processed. Maybe something more like:
ls -tcrd | while IFS= read -r eachlfile
do
    echo "=========================="
    printf '%s\n' "${eachlfile##*__}"
    echo "=========================="
    cat "$eachlfile"
done
would do what you want. This should work with ksh or bash on any of those systems. On a Solaris 10 or earlier SunOS system, you'd have to use /usr/xpg4/bin/sh instead of /bin/sh (if you insist on using a shell named sh). The printf is safer than echo: the output from echo can vary depending on what characters are in the pathnames being printed, while printf will give you the pathname itself. Of course, with no indication of what the pathnames you're processing really look like, the code above is completely untested.
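To see what the ${eachlfile##*__} expansion in that loop is doing, here is a small demo with a made-up filename (the name is purely hypothetical); "##" strips the longest leading match of the pattern, so everything up to and including the last "__" is removed:

```shell
# Hypothetical filename just for the demo:
f='backup__2024-01-15__db.sql'

# ${f##*__} removes the longest prefix matching "*__",
# leaving only the part after the last "__":
printf '%s\n' "${f##*__}"    # prints: db.sql
```

This happens entirely inside the shell, which is why it beats forking an awk process for every file.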
I made a mistake in my last post... The first line:
ls -tcrd | while IFS= read -r eachlfile
should have been:
ls -tcrd ${BACKUPDIR}/${BACKUPNAME}*/content_${BACKUPNAME}* |
while IFS= read -r eachlfile
I should have also noted that you can't cat a directory, and, with or without ls -d, you can end up with directory names in your ls output if any directories match the pattern ${BACKUPDIR}/${BACKUPNAME}*/content_${BACKUPNAME}*. If that is a problem, you might want to consider changing the line:
cat "$eachlfile"
to:
if [ -f "$eachlfile" ]
then cat "$eachlfile"
else printf '*** Not a regular file. ***\n'
fi
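As a quick illustration of why the [ -f ... ] guard matters, here is a self-contained sketch (using a scratch directory rather than your real backup tree) showing that a directory matching the glob gets flagged instead of handed to cat:

```shell
# Build a scratch area containing one regular file and one directory.
tmp=$(mktemp -d)
printf 'data\n' > "$tmp/regular"    # a regular file
mkdir "$tmp/subdir"                 # a directory that the glob also matches

# Same guard as above: cat regular files, flag everything else.
for eachlfile in "$tmp"/*
do
    if [ -f "$eachlfile" ]
    then cat "$eachlfile"
    else printf '*** Not a regular file. ***\n'
    fi
done
rm -rf "$tmp"
```

Run as-is, this prints "data" for the regular file and the warning line for the directory, instead of a "cat: ... Is a directory" error.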