I have a script that collects files from a remote server to a local path. I want to log every collected file, but something is going wrong, and I suspect the problem is hiding in the following two points:
- In the ftp function I haven't managed to check whether a file already exists in my local directory so it can be skipped (there's no need to collect it again, and I don't want to overwrite the already-collected copy).
- The "Move file1xxxx 20130428" message accumulates in the output log "20130428.log": if "file1xxxx" was already organized into directory 20130428 but is still on the remote server, the next ftp run collects and organizes it again (I'd welcome any clue on avoiding this, though it matters less since the file size hasn't changed), and it is appended to 20130428.log as a duplicate. Unfortunately my sort/uniq attempt doesn't actually remove the duplicates from that log. Thanks for the follow-up :wall:
for file in * ; do
    dir=$( echo "$file" | cut -c1-8 )
    # only announce the directory when it is actually created
    if [ ! -d "$dir" ]; then
        mkdir -p "$dir"
        echo "$dir was created"
    fi
    # move regular files only, and don't overwrite a copy already in $dir
    if [ -f "$file" ] && [ ! -e "$dir/$file" ]; then
        mv "$file" "$dir"
        echo "Move $file $dir" >> "/user/$dir.log"
    fi
    # sort -u alone only prints; -o rewrites the log in place, de-duplicated
    sort -u -o "/user/$dir.log" "/user/$dir.log"
done
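For the first point, since the ftp function isn't shown, here is only a hedged sketch of the idea: before fetching, test whether the file already exists locally (either still in the working directory or already organized into its date directory) and skip it. The names `fetch_file`, `maybe_fetch`, and `LOCAL_DIR` are hypothetical placeholders, not part of the original script; `fetch_file` stands in for whatever ftp get command the real script runs.

```shell
#!/bin/sh
# Hypothetical local directory the files land in; defaults to the current one.
LOCAL_DIR=${LOCAL_DIR:-.}

fetch_file() {
    # Placeholder: the real script would run its ftp "get $1" here.
    echo "fetching $1"
}

maybe_fetch() {
    name=$1
    dir=$(echo "$name" | cut -c1-8)   # same date-prefix rule as the mover loop
    # Skip if already organized into its date directory, or still in LOCAL_DIR.
    if [ -e "$LOCAL_DIR/$dir/$name" ] || [ -e "$LOCAL_DIR/$name" ]; then
        echo "skip $name"
    else
        fetch_file "$name"
    fi
}
```

Driving `maybe_fetch` from the remote file listing means an already-collected file is never downloaded again, so the duplicate "Move ..." log lines never get written in the first place, rather than being cleaned up afterwards.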