How to read a particular line of a file from specific columns?

Hi friends,

I want to create an inventory. For that I need to read a specific row (say the 2nd row, i.e. the first entry after the header) from the first 3 columns of each file, and write that data along with the name of the file it came from. The files have more columns, but I only need the first 3 columns of the first data row.

I have attached a sample file. Please help if you can.

My input:

ST_DATE    LATITUDE    LONGITUDE
05 Aug 1988 07:10:00 AM    9.983    68.033
05 Aug 1988 07:10:00 AM    9.983    68.033
05 Aug 1988 07:10:00 AM    9.983    68.033

The output I need looks like this:

filename    date    time    lat    long
myfile.txt    05/08/88    07:10:00    9.983    68.033
awk 'FNR == 2 {print $1" "$2" "$3 " "$4 " "$5 " " $6}'
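The awk above prints the raw fields, but the desired output also reformats "05 Aug 1988" into "05/08/88". A minimal sketch of that conversion, using a month-name lookup table built in a BEGIN block (the file `sample.txt` and its contents here are just an illustration of the layout above):

```shell
# Create a small sample in the same layout as the input shown above
printf 'ST_DATE LATITUDE LONGITUDE\n05 Aug 1988 07:10:00 AM 9.983 68.033\n' > sample.txt

# Row 2 fields: $1=day $2=month $3=year $4=time $5=AM/PM $6=lat $7=long
awk 'BEGIN { split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
             for (i in m) mon[m[i]] = sprintf("%02d", i) }   # Aug -> 08
     FNR == 2 { yy = substr($3, 3)                           # 1988 -> 88
                print FILENAME, $1 "/" mon[$2] "/" yy, $4, $6, $7 }' sample.txt
# prints: sample.txt 05/08/88 07:10:00 9.983 68.033
```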

I can read the line, but how can I write the name of the file it was read from, and then append the data from the next file?

Use internal awk variable FILENAME

OK, you mean

{print FILENAME, $1......}

Right. But without using cat, how can I append to a new file?

You redirect it to a new file:

awk '{ commands }' file > newfile

If we do that, only the last file's data will end up in the file; each earlier file's data will be overwritten.
I am currently doing it with

cat *.txt > new.xyz

after generating n text files from the n input files.

Hope this is what you want:

awk 'FNR == 2 {print FILENAME,$1,$2,$3,$4,$5,$6}' *.txt >> new.xyz

Thank you so much, pamu. Why is it

>>

and not

>

? Also, I am using this to sort the files in ascending order, but *.txt is not coming out in order:

for file in `ls *.txt |sort -t"-" -k2n,2`; do

Because > overwrites the file (or creates a new one), while >> appends data to the end of the file.
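A quick illustration with a throwaway file:

```shell
echo one   > out.log   # > creates the file; it contains: one
echo two   > out.log   # > overwrites; it now contains only: two
echo three >> out.log  # >> appends; it now contains: two, three
cat out.log
# prints:
# two
# three
```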

First, do all of your files have - in their names?
Please provide the output of ls *.txt.

Here is the list of files in one of the folders I am trying this on:

gs_10.asc    ts_104a.asc  ts_16.asc   ts_2a.asc  ts_43.asc  ts_58.asc  ts_71.asc  ts_86.asc   ts_95a.asc  tse_6.asc  tss_6.asc
gs_2.asc     ts_104.asc   ts_17a.asc  ts_2.asc   ts_44.asc  ts_59.asc  ts_72.asc  ts_87.asc   ts_95.asc   tse_7.asc  tss_7.asc
gs_3.asc     ts_105a.asc  ts_17.asc   ts_30.asc  ts_45.asc  ts_5a.asc  ts_73.asc  ts_88.asc   ts_96a.asc  tse_8.asc  tss_8.asc
gs_4.asc     ts_105.asc   ts_18.asc   ts_31.asc  ts_46.asc  ts_5.asc   ts_74.asc  ts_89a.asc  ts_96.asc   tse_9.asc  tss_9.asc
gs_5.asc     ts_10a.asc   ts_19.asc   ts_32.asc  ts_47.asc  ts_60.asc  ts_75.asc  ts_89.asc   ts_97a.asc  tsn_1.asc  tsw_1.asc
gs_6.asc     ts_10.asc    ts_1a.asc   ts_33.asc  ts_48.asc  ts_61.asc  ts_76.asc  ts_8a.asc   ts_97.asc   tsn_2.asc  tsw_2.asc
gs_7.asc     ts_11a.asc   ts_1.asc    ts_34.asc  ts_49.asc  ts_62.asc  ts_77.asc  ts_8.asc    ts_98a.asc  tsn_3.asc  tsw_3.asc
gs_8.asc     ts_11.asc    ts_20.asc   ts_35.asc  ts_4a.asc  ts_63.asc  ts_78.asc  ts_90a.asc  ts_98.asc   tsn_4.asc  tsw_4.asc
gs_9.asc     ts_12a.asc   ts_21.asc   ts_36.asc  ts_4.asc   ts_64.asc  ts_79.asc  ts_90.asc   ts_99a.asc  tsn_5.asc  tsw_5.asc
ts_100a.asc  ts_12.asc    ts_22.asc   ts_37.asc  ts_50.asc  ts_65.asc  ts_7a.asc  ts_91a.asc  ts_99.asc   tsn_6.asc  tsw_6.asc
ts_100.asc   ts_13a.asc   ts_23.asc   ts_38.asc  ts_51.asc  ts_66.asc  ts_7.asc   ts_91.asc   ts_9a.asc   tsn_7.asc  tsw_7.asc
ts_101a.asc  ts_13.asc    ts_24.asc   ts_39.asc  ts_52.asc  ts_67.asc  ts_80.asc  ts_92a.asc  ts_9.asc    tsn_8.asc  tsw_8.asc
ts_101.asc   ts_14a.asc   ts_25.asc   ts_3a.asc  ts_53.asc  ts_68.asc  ts_81.asc  ts_92.asc   tse_1.asc   tsn_9.asc  tsw_9.asc
ts_102a.asc  ts_14.asc    ts_26.asc   ts_3.asc   ts_54.asc  ts_69.asc  ts_82.asc  ts_93a.asc  tse_2.asc   tss_2.asc
ts_102.asc   ts_15a.asc   ts_27.asc   ts_40.asc  ts_55.asc  ts_6a.asc  ts_83.asc  ts_93.asc   tse_3.asc   tss_3.asc
ts_103a.asc  ts_15.asc    ts_28.asc   ts_41.asc  ts_56.asc  ts_6.asc   ts_84.asc  ts_94a.asc  tse_4.asc   tss_4.asc
ts_103.asc   ts_16a.asc   ts_29.asc   ts_42.asc  ts_57.asc  ts_70.asc  ts_85.asc  ts_94.asc   tse_5.asc   tss_5.asc

Check your command above: you should use _ as the separator there, since your file names contain underscores, not hyphens.
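A sketch of the loop with the separator corrected from - to _ (using a few throwaway ts_*.asc files in the sample layout; with -t"_", the -k2n,2 key reads the leading digits after the underscore, so ts_2.asc sorts before ts_10.asc):

```shell
# Fabricate three sample files with a header and one data row each
for f in ts_10.asc ts_2.asc ts_1.asc; do
    printf 'HEADER\n05 Aug 1988 07:10:00 AM 9.983 68.033\n' > "$f"
done

# Sort on the numeric part after the underscore, then collect row 2 of each file
for file in $(ls ts_*.asc | sort -t"_" -k2n,2); do
    awk 'FNR == 2 { print FILENAME, $1, $2, $3, $4, $5, $6 }' "$file"
done > new.xyz

cat new.xyz
# prints the files in numeric order: ts_1.asc, ts_2.asc, ts_10.asc
```

Note that the numeric key only compares what follows the first underscore, so mixed prefixes like gs_ and tse_ will be interleaved by number rather than grouped by prefix.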


Thank you so much! By the way, pamu, is it possible to completely convert a binary file to ASCII in the shell?

I am not aware of a way to do this.

This thread may help you.