How to find files with their full path

Hi,
I have written this shell script:
fl=`ls -1lrt $mylist | grep '\.xml$' | awk '{print $9}'`
echo $fl                        # (1)
for i in $fl
do
    for dir in $mylist
    do
        if [ -f $dir/$i ]
        then
            echo $dir/$i >> tmp # (2)
        fi
    done
done

The variable mylist contains some directory names. Statement (1) gives the sorted list of file names from multiple directories, and statement (2) lists the same files with their full paths. But I get duplicate lines in the tmp file whenever two files with the same name exist in different directories. I have no way to remove the duplicate entries, because it is unknown which entry should be removed. The script works fine as long as no two files share the same name.
I wrote this code to find the list of files from multiple directories, sorted by their creation time irrespective of directory.
So please tell me, is there any other way to do the same thing?

As far as I understand, you want to remove duplicate files which have different paths.

Look at this thread.

Removing duplicate files from list with different path

vino

No, I cannot simply remove the duplicate entries from the list, because I don't know which entry to remove; some further manipulation is done based on these entries. OK, I am attaching the output here:
For statement (1):
test.xml test1.xml sample.xml tst2.xml test3.xml mvtest.xml mvtest.xml neha.xml
For statement (2):
/backup/surjya/mdbase/test.xml
/backup/surjya/mdbase/test1.xml
/backup/surjya/test/sample.xml
/backup/surjya/test/tst2.xml
/backup/surjya/test/test3.xml
/backup/surjya/mvfile/mvtest.xml
/backup/surjya/test/mvtest.xml
/backup/surjya/mvfile/mvtest.xml
/backup/surjya/test/mvtest.xml
/backup/surjya/mvfile/neha.xml

The above output shows 10 entries where it should have only 8. At this stage we don't know which entries should be removed.
So please tell me, is there any other way to find the sorted file list with full paths across the directories?

If you want to remove the duplicate entries, look at Jim's solution in Removing duplicates

vino

Hi,
I still have the same problem. Is it possible to find the list of files in the order of their creation time across the directories?
We can use "ls -rt" to find the list of files from the working directory. To get the list in that order across several directories I used the command:
ls -rt $dir1 $dir2 $dir3 | grep '\.xml'
But here I face a problem. It appears to give the list of files in the right order across the directories, but suppose I very recently created a ".xml" file in the $dir2 directory: the whole listing for $dir2 then moves to the end of the output. Please solve this problem.
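(For later readers: that behaviour is easy to reproduce. When ls is given several directory operands it prints each directory as its own section, sorting only within the section, and the sections themselves are ordered by the directories' own mtimes; so one recent file drags its whole directory's section to the end. A minimal sketch with throwaway directories and fixed timestamps:)

```shell
# 'ls -rt dir1 dir2' sorts within each directory section, never globally.
demo=$(mktemp -d)
mkdir -p "$demo/d1" "$demo/d2"
touch "$demo/d1/a.xml" "$demo/d1/c.xml" "$demo/d2/b.xml" "$demo/d2/d.xml"
touch -t 202001010100 "$demo/d1/a.xml"   # oldest
touch -t 202001010200 "$demo/d2/b.xml"
touch -t 202001010300 "$demo/d1/c.xml"
touch -t 202001010400 "$demo/d2/d.xml"   # newest
touch -t 202001010300 "$demo/d1"         # the directories' mtimes decide
touch -t 202001010400 "$demo/d2"         # the order of the sections

ls -rt "$demo/d1" "$demo/d2"   # prints all of d1, then all of d2, so
                               # b.xml appears after the newer c.xml
```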

I found this while searching for fast filename expansion ideas.

I know this is a very old message, but I thought it should be put to a close.
Here is one way to achieve the result.

ls -rt $dir1/*.xml $dir2/*.xml $dir3/*.xml

This will sort the files from all directories matching "*.xml", with the oldest first.
If your directory list is held in a variable, you may try the following,
assuming the input variable "IN_VAR" contains the list of dirs.

DIR_LIST=`echo $IN_VAR | sed 's, ,/*.xml ,g; s,$,/*.xml,'`

echo "$DIR_LIST"

ls -rt $DIR_LIST
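(A further note for later readers: if GNU find is available, a sketch that avoids globbing and parsing ls output altogether is to print each file's epoch mtime next to its path, sort numerically, and strip the timestamp. This assumes GNU find's -printf; the directory names below are illustrative only.)

```shell
# List *.xml files from several directories by mtime, oldest first,
# with full paths.  Requires GNU find for -printf.
demo=$(mktemp -d)
mkdir -p "$demo/d1" "$demo/d2"
touch "$demo/d1/old.xml" "$demo/d2/new.xml"
touch -t 202001010100 "$demo/d1/old.xml"   # older file
touch -t 202001010200 "$demo/d2/new.xml"   # newer file

find "$demo/d1" "$demo/d2" -name '*.xml' -printf '%T@ %p\n' |
    sort -n |          # numeric sort on the epoch timestamp
    cut -d' ' -f2-     # keep only the path
```

This also stays correct for file names the shell would mangle in an unquoted $DIR_LIST, as long as they contain no newlines.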