Parsing a file with awk

I'm looking to parse a file, remove the duplicates, and keep only a few fields and unique rows.

input.txt:

    Loc (TC) ID   : ssfad_fs
    Serial         : PIC002340098    
    
-----------------------------------------------------------------------------------------------------------------------
                                                                          Total
                                           Pic                                      Comp
ID   Name                        Ver  LOC       Time                      (Size)   (Size)         Complete Time
----------------------------------- ---- ----- ------------------------ ---------- ---------- ------------------------
FAD00 TINT_PAP_1516048511           0 .X... Mon Jan 15 20:35:14 2018          0          0                       NA
      sfad_jpg                      0 ..... Mon Jan 15 20:24:02 2018          0          0 Wed Jan 17 20:24:01 2018
      sfad_jpg                      1 ..... Mon Jan 15 16:24:02 2018          0          0 Wed Jan 17 16:24:02 2018
      sfad_jpg                      2 ..... Mon Jan 15 12:24:03 2018          0          0 Wed Jan 17 12:24:02 2018
      sfad_jpg                      3 ..... Mon Jan 15 08:24:02 2018          0          0 Wed Jan 17 08:24:02 2018
      sfad_jpg                      4 ..... Mon Jan 15 04:24:02 2018          0          0 Wed Jan 17 04:24:02 2018
      sfad_jpg                      5 ..... Mon Jan 15 00:24:03 2018          0          0 Wed Jan 17 00:24:02 2018
      sfad_jpg                      6 ..... Sun Jan 14 20:24:03 2018          0          0 Tue Jan 16 20:24:02 2018
      sfad_jpg                      7 ..... Sun Jan 14 16:24:03 2018          0          0 Tue Jan 16 16:24:02 2018
      sfad_jpg                      8 ..... Sun Jan 14 12:24:02 2018          0          0 Tue Jan 16 12:24:02 2018
      sfad_jpg                      9 ..... Sun Jan 14 08:24:03 2018          0          0 Tue Jan 16 08:24:02 2018
      sfad_jpg                     10 ..... Sun Jan 14 04:24:02 2018          0          0 Tue Jan 16 04:24:01 2018
      sfad_jpg                     11 ..... Sun Jan 14 00:24:02 2018          0          0 Tue Jan 16 00:24:02 2018
      sfad_jpg                     12 ..... Tue Oct 10 21:00:47 2017      65550      49153                       NA
BAC76 TINT_PAP_1516048511           0 .X... Mon Jan 15 20:35:14 2018          0          0                       NA
      sfad_jpg                      0 ..... Mon Jan 15 20:24:02 2018          0          0 Wed Jan 17 20:24:01 2018
      sfad_jpg                      1 ..... Mon Jan 15 16:24:02 2018          0          0 Wed Jan 17 16:24:02 2018
      sfad_jpg                      2 ..... Mon Jan 15 12:24:03 2018          0          0 Wed Jan 17 12:24:02 2018
      sfad_jpg                      3 ..... Mon Jan 15 08:24:02 2018          0          0 Wed Jan 17 08:24:02 2018
      sfad_jpg                      4 ..... Mon Jan 15 04:24:02 2018          0          0 Wed Jan 17 04:24:02 2018
      sfad_jpg                      5 ..... Mon Jan 15 00:24:03 2018          0          0 Wed Jan 17 00:24:02 2018
      sfad_jpg                      6 ..... Sun Jan 14 20:24:03 2018          0          0 Tue Jan 16 20:24:02 2018
      sfad_jpg                      7 ..... Sun Jan 14 16:24:03 2018          0          0 Tue Jan 16 16:24:02 2018
      sfad_jpg                      8 ..... Sun Jan 14 12:24:02 2018          0          0 Tue Jan 16 12:24:02 2018
      sfad_jpg                      9 ..... Sun Jan 14 08:24:03 2018          0          0 Tue Jan 16 08:24:02 2018
      sfad_jpg                     10 ..... Sun Jan 14 04:24:02 2018          0          0 Tue Jan 16 04:24:01 2018
      sfad_jpg                     11 ..... Sun Jan 14 00:24:02 2018          0          0 Tue Jan 16 00:24:02 2018
      sfad_jpg                     12 ..... Tue Oct 10 21:00:47 2017      65550      49153                       NA
      sfad_sf_test                  0 ..... Fri Aug 11 19:14:27 2017      65550      49153                       NA

Expected output:

Loc ID    Img Name             Ver.  Time                      Complete Time
--------  -------------------  ----  ------------------------  ------------------------
ssfad_fs  TINT_PAP_1516048511  0     Mon Jan 15 20:35:13 2018  NA
ssfad_fs  sfad_jpg             1     Mon Jan 15 16:24:02 2018  Wed Jan 17 16:24:01 2018
ssfad_fs  sfad_jpg             2     Mon Jan 15 12:24:02 2018  Wed Jan 17 12:24:02 2018
ssfad_fs  sfad_jpg             3     Mon Jan 15 08:24:02 2018  Wed Jan 17 08:24:01 2018
ssfad_fs  sfad_jpg             4     Mon Jan 15 04:24:02 2018  Wed Jan 17 04:24:01 2018

I am using the following awk command:

cat input.txt | head -n-12 |cut -c6- |awk -v LOC="ssfad_fs" 'NR>9;$0=SG $0; {if ($1 ~ /^[\w-]+$/ && $2 ~ /0-100/);{$4=$10=$11="";print}}' |sort -unk 2,2

But when I use the following code I don't get the desired result.

cat input.txt | head -n-12 |cut -c6- | awk -v SG="ssfad_fs" 'NR>9;$0=SG $0 {$4=$10=$11=""}; {print $0}'

ssfad_fs sfad_jpg 9  Sun Jan 14 08:24:02 2018   Tue Jan 16 08:24:02 2018
 sfad_jpg                     10 ..... Sun Jan 14 04:24:01 2018          0          0 Tue Jan 16 04:24:01 2018
ssfad_fs sfad_jpg 10  Sun Jan 14 04:24:01 2018   Tue Jan 16 04:24:01 2018
 sfad_jpg                     11 ..... Sun Jan 14 00:24:02 2018          0          0 Tue Jan 16 00:24:01 2018
ssfad_fs sfad_jpg 11  Sun Jan 14 00:24:02 2018   Tue Jan 16 00:24:01 2018
 sfad_jpg                     12 ..... Tue Oct 10 21:00:46 2017      65550      49153                       NA
ssfad_fs sfad_jpg 12  Tue Oct 10 21:00:46 2017   NA
 ssfad_fs_test                      0 ..... Fri Aug 11 19:14:26 2017      65550      49153                       NA
ssfad_fs ssfad_fs_test 0  Fri Aug 11 19:14:26 2017   NA

Other options I tried don't come any closer to what I am expecting. Your help/guidance would be appreciated.

awk 'NR>9;BEGIN {printf ("%s%n%s%s","Loc ID", "Img Name" "Ver#", "Time", "Complete Time")}{if ($1 ~ /^[\w-]+$/ && $2 ~ /0-100/);{$3=$9=$10="";print}

What defines a "duplicate" in your input file?
Why are some of the seconds in your expected output decremented by one, others not?

Actually, I don't need the ID field (it has hex values like FAD00, BAC76). I only want the Loc ID, which I can provide as a variable input as well (awk -v LOC="ssfad_fs"). I also don't need the LOC, Total Size, and Comp Size fields in the output.

Regarding the seconds being decremented: when I generate the file there is sometimes a one-second difference. Actually, the input file I provided and the output are from different files; that is why there is a mismatch. Sorry for the confusion. Here is the output from the same input file.

Loc ID	 Img Name               Ver.   Time                    Complete Time
ssfad_fs TINT_PAP_1516048511     0    Mon Jan 15 20:35:13 2018 NA
ssfad_fs sfad_jpg                1    Mon Jan 15 16:24:02 2018 Wed Jan 17 16:24:02 2018
ssfad_fs sfad_jpg                2    Mon Jan 15 12:24:03 2018 Wed Jan 17 12:24:02 2018
ssfad_fs sfad_jpg                3    Mon Jan 15 08:24:02 2018 Wed Jan 17 08:24:02 2018
ssfad_fs sfad_jpg                4    Mon Jan 15 04:24:02 2018 Wed Jan 17 04:24:02 2018
ssfad_fs sfad_jpg                5    Mon Jan 15 00:24:03 2018 Wed Jan 17 00:24:02 2018
ssfad_fs sfad_jpg                6    Sun Jan 14 20:24:03 2018 Tue Jan 16 20:24:02 2018
ssfad_fs sfad_jpg                7    Sun Jan 14 16:24:03 2018 Tue Jan 16 16:24:02 2018
ssfad_fs sfad_jpg                8    Sun Jan 14 12:24:02 2018 Tue Jan 16 12:24:02 2018
ssfad_fs sfad_jpg                9    Sun Jan 14 08:24:03 2018 Tue Jan 16 08:24:02 2018
ssfad_fs sfad_jpg               10    Sun Jan 14 04:24:02 2018 Tue Jan 16 04:24:01 2018
ssfad_fs sfad_jpg               11    Sun Jan 14 00:24:02 2018 Tue Jan 16 00:24:02 2018
ssfad_fs sfad_jpg               12    Tue Oct 10 21:00:47 2017 NA

You still didn't tell people in here which lines to print and which to suppress.

After removing the ID, LOC, Total Size, and Comp Size fields from the input, I want unique lines only.

How far would

cut -c7-38,45-69,92- file | tail -n+9 | sort -uk2,2n
TINT_PAP_1516048511           0 Mon Jan 15 20:35:14 2018                       NA
sfad_jpg                      1 Mon Jan 15 16:24:02 2018 Wed Jan 17 16:24:02 2018
sfad_jpg                      2 Mon Jan 15 12:24:03 2018 Wed Jan 17 12:24:02 2018
sfad_jpg                      3 Mon Jan 15 08:24:02 2018 Wed Jan 17 08:24:02 2018
sfad_jpg                      4 Mon Jan 15 04:24:02 2018 Wed Jan 17 04:24:02 2018
sfad_jpg                      5 Mon Jan 15 00:24:03 2018 Wed Jan 17 00:24:02 2018
sfad_jpg                      6 Sun Jan 14 20:24:03 2018 Tue Jan 16 20:24:02 2018
sfad_jpg                      7 Sun Jan 14 16:24:03 2018 Tue Jan 16 16:24:02 2018
sfad_jpg                      8 Sun Jan 14 12:24:02 2018 Tue Jan 16 12:24:02 2018
sfad_jpg                      9 Sun Jan 14 08:24:03 2018 Tue Jan 16 08:24:02 2018
sfad_jpg                     10 Sun Jan 14 04:24:02 2018 Tue Jan 16 04:24:01 2018
sfad_jpg                     11 Sun Jan 14 00:24:02 2018 Tue Jan 16 00:24:02 2018
sfad_jpg                     12 Tue Oct 10 21:00:47 2017                       NA

get you?

Hi RUDIC, thanks for your response, but it would be better if we could prepend the "Loc (TC) ID", which is "ssfad_fs" in this case. I would prefer to pass it as a variable so that I can use it in the script.

Try

cut -c7-38,45-69,92- file | tail -n+9 | sort -uk2,2n | sed "s/^/$LOC /"

EDIT: Or, since we are using sed anyhow, why not

sed "1,8d; s/.\{22\}\(.\{24\}\)$/\1/; s/^\(.\{37\}\).\{6\}/\1/; s/^.\{6\}/$LOC /;" file | sort -uk3,3n

or, with EREs,

sed -r "1,8d; s/.{22}(.{24})$/\1/; s/^(.{37}).{6}/\1/; s/^.{6}/$LOC /;" file | sort -uk3,3n

or, even simpler,

sed -r "1,8d; s/^.{6}(.{32}).{6}(.{24}).{22}(.{24})/$LOC \1\2\3/;" file | sort -uk3,3n

Also, I would like to add headers, as follows:

Loc ID    Img Name             Ver.  Time                      Complete Time
--------  -------------------  ----  ------------------------  ------------------------
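
Tying the last two requests together (passing the Loc ID as a variable and printing a header), here is a minimal awk sketch that is not from the thread: it prints the header, strips the leading hex ID column, drops the LOC, Total Size and Comp Size columns, and keeps only the first line per version number, i.e. the same deduplication rule that sort -uk2,2n applied above. The file name input.txt, the LOC value and the column widths are assumptions based on the samples in this thread.

    LOC="ssfad_fs"        # assumed Loc (TC) ID, as shown in the sample header

    awk -v LOC="$LOC" '
    BEGIN {
        fmt = "%-9s %-20s %4s   %-24s   %-24s\n"
        printf fmt, "Loc ID", "Img Name", "Ver.", "Time", "Complete Time"
        printf fmt, "------", "--------", "----", "----", "-------------"
    }
    # Lines that carry the leading hex ID column (FAD00, BAC76, ...) have the
    # version number in $3; strip the ID so every data line starts with the name.
    $2 !~ /^[0-9]+$/ && $3 ~ /^[0-9]+$/ { sub(/^[^ ]+ +/, "") }
    # Data lines: image name, numeric version, then a flag column of dots/X.
    $2 ~ /^[0-9]+$/ && $3 ~ /^[.X]+$/ {
        if (seen[$2]++) next                              # first line per version only
        time = $4 " " $5 " " $6 " " $7 " " $8
        comp = (NF >= 15) ? $11 " " $12 " " $13 " " $14 " " $15 : $11
        printf fmt, LOC, $1, $2, time, comp
    }
    ' input.txt

Because it selects data lines by pattern (a numeric version field followed by the dot/X flag column) instead of by line count, it should not need the head -n -12 / tail -n +9 steps; the trailing lines that head -n -12 was removing are not shown in the sample, so that part is an assumption.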