Need help combining a large number of text files

Hi,
I have more than 1000 data files (.txt) like this:

first file format:
178.83 554.545
179.21 80.392

second file:
178.83 990.909
179.21 90.196

etc.

I want to combine them to the following format:

178.83,554.545,990.909,...
179.21,80.392,90.196,...

such that the first column (which is identical across all the files) is preserved, and the data field from each file is appended onto the same line, separated by commas (or spaces/tabs).

I am very new to shell scripting and hope to receive some kind advice on my question.

Cheers!
Daniel

Do you want to combine only 2 files, or all 1000 files at the same time?

Do all 1000 of your files have the same number of lines?

Thanks for the replies, everyone.

Yes, all my files have exactly the same number of lines (2048 lines).

Each file consists of 2 columns: a 'field' and a 'data' value, if you like, as per my earlier post.

What I want to achieve is a big final 'table' file which shows:

field data(from file 1) data(from file 2) data(from file 3).... data(from file 1000)

Hope this clarifies things. Thanks again for the help!

The join command would probably be the best fit for that purpose.
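
One wrinkle is that join only accepts two input files at a time, so with 1000 files you would have to fold them in one by one. Something along these lines might work (just a rough sketch: 'outfile' is a placeholder name, and it assumes every file is sorted the same way on its first column):

set -- *.txt                  # all the data files
cp "$1" outfile               # start from the first file
shift
for f in "$@"
do
    # join the accumulated result with the next file on column 1
    join outfile "$f" > outfile.tmp && mv outfile.tmp outfile
done

Since outfile does not end in .txt, it is not swept up by the *.txt glob on a second run.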

Try:

awk '{ A[$1] = A[$1] " " $2 } END { for (i in A) print i A[i] }' *.txt > outfile
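
One thing to be aware of: for (i in A) hands back the keys in no particular order, so the lines of outfile may come out shuffled. If you want to keep the line order of the input files, a variant like this should work (same idea, it just remembers the order in which the keys first appear):

awk '!seen[$1]++ { order[++n] = $1 }       # note each key the first time it appears
     { A[$1] = A[$1] " " $2 }              # append this file's value to the row
     END { for (j = 1; j <= n; j++) print order[j] A[order[j]] }' *.txt > outfile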

Hello frans, we can't use the join command directly to solve this problem, because mr_monocyte wants to deal with 1000 files, not just the two files that join accepts at a time. It is still a good idea when there are only a few files, though. mr_monocyte, you can use the following script to do what you need:

#!/bin/bash
# usage: combine.sh file1 file2 ... fileN

for i in "$@"                                   # loop over the files given on the command line
do
    k=0
    while read -r line                          # read the current file line by line
    do
        if [[ $i == "$1" ]]; then               # first file: remember the whole line
            arr[$k]=$line
            let k+=1
        else                                    # later files: match on the first field
            val=$(echo "$line" | cut -d ' ' -f 1)
            if [[ $val == $(echo "${arr[$k]}" | cut -d ' ' -f 1) ]]; then
                # append this file's data value to the row for this field
                result[$k]=${result[$k]},$(echo "$line" | cut -d ' ' -f 2)
                let k+=1
            fi
        fi
    done < "$i"
done

for ((l = 0; l < ${#result[@]}; l++))           # walk through the assembled rows
do
    row=$(echo "${arr[$l]}" | cut -d ' ' -f 1),$(echo "${arr[$l]}" | cut -d ' ' -f 2)${result[$l]}
    echo "$row"                                 # print the resulting output
    # if you want the output in a file instead, redirect it:
    # echo "$row" >> result
done
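
If you save it as, say, combine.sh (the name is just a placeholder), it takes the file names as command-line arguments, so you would run it as something like:

bash combine.sh file1.txt file2.txt file3.txt
bash combine.sh *.txt     # or simply all the .txt files in the directory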

For those who enjoy putting the filesystem to good use :wink: (k=key, v=value):

$ cat 1
178.83 554.545
179.21 80.392

$ cat 2
178.83 990.909
179.21 90.196

$ mkdir temp

$ cat * | (cd temp; while read k v; do echo $v >> $k; done; for k in *; do echo $k,$(paste -sd, $k); done)
178.83,554.545,990.909
179.21,80.392,90.196
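
One small caveat if you adapt this to the original .txt files: quoting the variables keeps the shell from word-splitting or globbing them, using *.txt keeps cat away from the temp directory itself, and the scratch directory can be removed afterwards. Roughly:

$ cat *.txt | (cd temp; while read -r k v; do echo "$v" >> "$k"; done; for k in *; do echo "$k,$(paste -sd, "$k")"; done)
$ rm -r temp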

Cheers,
Alister