renaming multiple files

I have to rename 100+ files at a time on the server and was trying to use a script to do it.
I have used UltraEdit to create a file containing the current filename and the new filename, like this:

file234.txt | file956.txt
file687.txt | file385.txt

There is no fixed pattern to the renaming, and I would appreciate it if
somebody could suggest a way to script this for 100+ files.

One idea.

cat file_containing_the_name_pairs | awk -F'|' '{ print $1, $2 }' | while read oldfile newfile
do
          # Remove echo when sure that this does what you want
          echo mv "${oldfile}" "${newfile}"
done

(Please, no stupid comments about "cat"; we don't know what OS or shell this is.)

cat list|while read a
do 
  old=`echo "${a}"|cut -f1 -d"|"` ## old file name, first column
  new=`echo "${a}"|cut -f2 -d"|"` ## new file name, second column
  mv ${old} ${new}
done


You beat me by a few seconds :slight_smile:

@linuxpenguin
In Unix there is no single fixed solution to a problem; I love seeing an alternative that is different and valid.

while read old scrap new;do echo mv $old $new;done < file_list

We don't need awk, cut & cat :wink:

We will do it with awk. :wink:

awk -F"|" '{ print "mv ",$1,$2; }' file | sh

It doesn't matter what OS or shell it is; it's a UUOC (useless use of cat).

tr -d '|' < pair_file | while read old new; do mv "$old" "$new"; done

There's no need for tr:

while IFS="|" read old new
do
  mv "$old" "new"
done < pair_file

Less typing :wink:

More importantly, less time, fewer processes, etc..

Thanks for all your replies. It really made my job a lot easier and simpler.
Thanks again.

@cfajohnson

1) If necessary I will point you at strong advice about not ever changing "IFS".

2) I continue to stand by the use of "cat" to start pipelines in portable and readable scripts. Reading input from a redirection on the last line of the loop is just about comprehensible in example scripts but incomprehensible on a commercial scale.

Please do, and I'll refute it.

There are many times when changing IFS is necessary. There is no reason not to.

If, by commercial scale, you mean very long blocks inside a loop, then that is just bad coding. The body should be encapsulated in a function.
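
Something along these lines, for example (rename_pair and pair_file are only illustrative names):

# Keep the loop body down to a single call; the details live in the function
rename_pair()
{
  # $1 = old name, $2 = new name
  mv "$1" "$2"
}

while IFS="| " read old new
do
  rename_pair "$old" "$new"
done < pair_file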

@cfajohnson

Off topic, but this thread shows what a mess you can get into with IFS and functions when you have missed the basics:

http://www.unix.com/shell-programming-scripting/131162-help-shell-script-ftp.html

There is no inappropriate use of IFS in that thread.

It is a case of someone not knowing the basics of shell scripting.

@cfajohnson
Agreed. The IFS construct works in many modern shells. That was never in dispute; it is just difficult to follow.

I can write to any local standard, but long experience has shown which constructs are difficult to follow by colleagues and passing contractors.

I too can write obfuscated code which works efficiently but is unmaintainable in a commercial environment.

Sometimes when the data volumes are large and the processing is complex (and I don't have time to write a program) I will use every trick to get a shell script to process the data in the least possible time. The script may be unreadable but it will work.

So vote for the "Clear Code Party" in the next General Election.

(Dismounts high horse and prepares to count both of the votes).

It works in all Bourne-type shells. (And you shouldn't use any other kind for scripts.)

I don't see anything difficult to understand in setting IFS any more than in setting any other variable.

There's nothing obfuscated about setting IFS, especially before a read statement, where it is standard practice.
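
For what it's worth, in a POSIX shell an assignment placed directly in front of read applies only to that one command, so the script's IFS is untouched afterwards. A quick check (pair_file as above):

old_ifs=$IFS
while IFS="|" read old new
do
  echo "old=$old new=$new"
done < pair_file
[ "$IFS" = "$old_ifs" ] && echo "IFS is unchanged after the loop"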

That's when you should be using awk.

We are in different time zones, and I am getting tired, and with apologies to the O/P for hijacking the post:

I fully agree that the Bourne shell is a sound base for writing portable shell scripts. I do not agree that interfering with IFS is sensible in all common shells.

I must argue this point. It is obfuscated and it is not standard practice. I have scanned thousands of manufacturers' standard-issue scripts and not found one case.

I choose the language to match the specification. There are occasions when awk will do the job, there are occasions when it is just too slow or awkward. I treat awk as a programming language rather than a shell command.

No, it's not. The POSIX shell (a Bourne-type shell) is the sound base for writing portable shell scripts. The Bourne shell is obsolete.

I don't know what you consider standard, but I have found it to be common practice everywhere.

In fact, the bash man page says,

 IFS   The Internal Field Separator that is used ... to split lines into
       words with the read builtin command.

awk is slow if you use it to process strings; it is much faster than the shell when processing files (anything but a fairly short file, that is).
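
For example (a rough sketch using the pair file from this thread): calling awk once per line from a shell loop pays the process start-up cost on every line, while a single awk run over the whole file pays it once:

# Slow: a new awk process is started for every line of the pair file
while read line
do
  echo "$line" | awk -F"|" '{ print $2 }'
done < pair_file

# Fast: one awk process reads the whole pair file
awk -F"|" '{ print $2 }' pair_file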