Skip duplicated files in the log

I have a script that gets files from a remote server to a local path. I want to log the output for every collected file, but something went wrong, and I suspect my error is hiding in the following:

  1. In the ftp function I did not manage to check whether a file already exists in my local directory and skip it (there is no need to collect it again, and I do not want to overwrite the copy already collected). Roughly what I am after is sketched after this list.
  2. The "Move file1xxxx 20130428" message in the output log "20130428.log" accumulates: if "file1xxxx" was already organized into directory 20130428 but is still on the remote server, the next ftp run collects and organizes it again (any clue to avoid that is welcome, though it matters little since the file size has not changed), and the same line is appended to 20130428.log as a duplicate. Unfortunately I cannot simply sort/uniq that log to drop the duplicates. Thanks for the follow up. :wall:
for file in * ; do
  dir=$( echo $file | cut -c1-8 )
  [ -d $dir ] || mkdir -p $dir
  echo "$dir was created"
  [ -f $file ] && mv $file $dir
  echo "Move $file $dir" >> /user/$dir.log
  sort -u /user/$dir.log
done

Here is that answer again. :slight_smile:

for file in * ; do
  dir=$( echo $file | cut -c1-8 )
  [ -d $dir ] || mkdir -p $dir
  echo "$dir was created"
  [ -f $file ] && mv $file $dir

  log=/user/$dir.log
  echo "Move $file $dir" >> $log
  sort -u $log > /tmp/temp.x    # sort and drop duplicate lines
  mv /tmp/temp.x $log           # replace the log with the de-duplicated copy
done
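
If you also want to avoid moving (and logging) a file that was already organized in an earlier run (your point 2), you could guard the mv. Untested sketch along the same lines:

for file in * ; do
  [ -f "$file" ] || continue              # only regular files
  dir=$( echo "$file" | cut -c1-8 )
  [ -d "$dir" ] || mkdir -p "$dir"
  log=/user/$dir.log

  if [ -f "$dir/$file" ] ; then
    echo "Skip $file (already in $dir)"   # organized in an earlier run
    continue
  fi

  if mv "$file" "$dir" ; then
    echo "Move $file $dir" >> "$log"
    sort -u "$log" > /tmp/temp.x          # keep the log free of duplicate lines
    mv /tmp/temp.x "$log"
  fi
done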

Ensure that files are really files and more than 8 characters long!
Quote variables in command arguments!

for file in ?????????* ; do
  [ -f "$file" ] || continue
  dir=$( echo "$file" | cut -c1-8 )
  [ -d "$dir" ] || mkdir -p "$dir"
...