Lsof command

Hi,

I am trying to find the files in a specific directory that are currently in open state. I need this information to archive the old files that are not active in the directory.

I get the following output when I try the command

[root@myhadoop tmp]$ lsof +d '/var/tmp/'|awk '{print $9}'|sort -u
NAME
/var/tmp
/var/tmp/abc.txt
/var/tmp/xyz.txt
/var/tmp/123.txt

I do not want to see the first two lines, which are the NAME header and the directory being searched itself.

I can use sed to get rid of the first two lines, but I was wondering whether there is a better way to handle this.

Thanks
-Vicky

I'm afraid not. lsof doesn't offer what you request by itself, so you'd be dependent on massaging the output afterwards. But instead of appending another pipe into sed, grep, or a second awk, you could enhance the awk you already have. Be aware, though, that the directory itself doesn't have to be open even while files inside it are (try exec 3>/path/to/somewhere/completely/else and then lsof +d on that path), so be very sure you really want to exclude it.
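As a sketch of that approach: the filtering can move into the existing awk, skipping the header row with NR > 1 and dropping the directory entry by comparing the NAME column ($9) against the directory path. The printf here just feeds canned lsof-style sample lines (invented for the demo) so the filter can be shown standalone; in practice you would pipe lsof +d /var/tmp/ into the same awk instead.

```shell
# Canned sample mimicking `lsof +d /var/tmp/` output (9 columns).
# In real use, replace the printf with:  lsof +d /var/tmp/ | awk ...
printf '%s\n' \
  'COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME' \
  'bash 123 root cwd DIR 8,1 4096 2 /var/tmp' \
  'tail 456 root 3r REG 8,1 100 3 /var/tmp/abc.txt' \
  'tail 789 root 4r REG 8,1 100 4 /var/tmp/xyz.txt' |
awk -v dir=/var/tmp 'NR > 1 && $9 != dir {print $9}' | sort -u
# NR > 1      -> skips the COMMAND/PID/... header line
# $9 != dir   -> skips the searched directory itself
```

This collapses the whole thing into one awk invocation instead of adding another sed or grep stage. Note the caveat above still applies: dropping $9 == dir silently hides any process that has the directory itself open.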