My script failed and I can't fix it?

You are very welcome, patience pays off, and thanks for explaining how you resolved it.

You're welcome as well. Now I have another issue: I want to log the output of every collected file, but it looks like something went wrong. I feel my error is hidden in what follows:

  1. In the ftp function I did not manage to check whether a file already exists in my local directory so that it can be skipped (no need to collect it again, so the copy already collected is not overwritten).
  2. The "Move file1xxxx 20130428" message in the output log "20130428.log" accumulates: if "file1xxxx" was already organized into directory 20130428 but is still on the remote server when ftp runs, it gets collected and organized again (I would welcome any clue to overcome this too, though it matters little since the file size has not changed), and it gets added to 20130428.log as a duplicate. Unfortunately I can't sort/uniq that log to skip the duplication. Thanks for the follow-up.
for file in * ; do
  dir=$( echo $file | cut -c1-8 )
  [ -d $dir ] || mkdir -p $dir
  echo "$dir was created"
  [ -f $file ] && mv $file $dir
  echo "Move $file $dir" >> /user/$dir.log
  sort -u /user/$dir.log
done

I edited the script some to show how to get rid of the duplication.

for file in * ; do
  dir=$( echo "$file" | cut -c1-8 )
  [ -d "$dir" ] || mkdir -p "$dir"
  echo "$dir was created"
  [ -f "$file" ] && mv "$file" "$dir"

  # Append the move message, then rewrite the log so it
  # stays sorted with duplicates removed.
  log=/user/$dir.log
  echo "Move $file $dir" >> "$log"
  sort -u "$log" > /tmp/temp.x
  mv /tmp/temp.x "$log"
done
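
For your first point, you can put the existence check in the same loop: if the file is already filed under its date directory, discard the fresh copy instead of moving it, and skip the log line. This is just a sketch, leaning on your remark that a re-collected file has the same size, so the copy already filed away is just as good:

for file in * ; do
  [ -f "$file" ] || continue           # skip the date directories themselves
  dir=$( echo "$file" | cut -c1-8 )
  mkdir -p "$dir"
  if [ -e "$dir/$file" ]; then
    rm -f "$file"                      # already collected earlier; drop the duplicate
    continue                           # and do not log it again
  fi
  mv "$file" "$dir"
  echo "Move $file $dir" >> /user/$dir.log
done

With the continue in place, a file that is still sitting on the remote server never produces a second "Move" line in the log.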

Thanks for your edit. I'm going to test it soon, but I'd like to know: for how long will the temp.x file stay on the system before it is deleted? Do you think my script failed because of a RAM issue, or because it was running so fast that the system could not sort the files? Is it possible to tell where my fault is? I really like scripting, but sometimes I make mistakes on easy things. Thank you again. :(
/tmp/temp.x

It will be on the system very briefly. The mv command renames it to the name stored in the $log variable, and then the temp.x file is gone. It would probably be safer to use $HOME/temp.x.
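
If your system has mktemp, an even safer pattern is to let it pick a unique temporary file name and clean it up automatically when the script ends:

tmpfile=$(mktemp) || exit 1
trap 'rm -f "$tmpfile"' EXIT    # remove the temp file on exit
sort -u "$log" > "$tmpfile"
mv "$tmpfile" "$log"

That way two runs of the script can never trample each other's temp file.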

Yes, it is possible to tell. Your sort -u /user/$dir.log command was producing sorted output, but the sorted output was not getting saved; it only went to the screen. That is why I redirected the output to the temp.x file and then used mv to overwrite the $log file, so the sorted output would get saved. It was not a RAM or speed problem.
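
By the way, sort can also write its result back onto the input file with the -o option, which saves the output without any temp file at all; the standard says -o may name the same file as the input:

sort -u -o "$log" "$log"    # sorted, de-duplicated output replaces the log in place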

Anyway, see what happens, and if it's still a problem, repost the script and whatever error message appears.

It was really helpful to look at your post; everything worked successfully. From now on I'm going to follow whatever you post. Thanks a lot, hanson44.

You are very welcome. I'm glad you were patient and it's working.

I want to make a small change to my whole script (the ftp part and the for loop). I tried to run it as a cron job, but as you know cron cannot run it every second. The reason I need that: while I'm copying the files from the remote server, another user's ftp session cuts (moves) the files, so I lose some of them. So I'm thinking of not running my script as a cron job, and instead using one of the following, executed directly in the background, so it connects to that server and gets the files every second before the other user cuts them. Am I right or not?

while true
do
ftp -in x.x.x.x << ENDFTP
user username password
cd $ddir
lcd $sdir
bin
prompt
mget *.text
bye
ENDFTP
done

or

while true
do
  ftp_function    # the ftp function I already have
done
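
Either way, I would add a short pause inside the loop and start the script detached from my terminal. A sketch of what I mean (collect.sh and ftp_function are just placeholder names for my script and my ftp function):

while true
do
  ftp_function    # one of the two ftp forms above
  sleep 1         # one-second pause between connections
done

and then launch it so it keeps running after I log out:

nohup ./collect.sh > /dev/null 2>&1 &
echo $! > collect.pid    # save the PID so the loop can be stopped later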