AWK Key Sep Output to JSON

Hi, and thanks in advance for any help, examples, or pity. :slight_smile:

I have a program that is outputting data in the following format.

Attr_1: Login
Attr_2: 1
Attr_13: [User]="Joe" [eventType]="Information"
Attr_2421: Success
Event Recv Time 1584823379
Attr_1: Login
Attr_2: 1
Attr_13: [User]="Bob" [eventType]="Information"
Attr_2421: Success
Event Recv Time 1584843379

The data streams out continuously, record after record. Each record runs from a line matching ^Attr_1: through the next line matching ^Event Recv Time [0-9]+$
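Just to illustrate the framing, the record boundary can be checked in isolation. This sketch (sample data abbreviated from the stream above, assuming any POSIX awk on the PATH) simply counts complete records by matching the closing line:

```shell
# Count records by matching the closing "Event Recv Time <epoch>" line.
printf '%s\n' \
  'Attr_1: Login' \
  'Attr_2421: Success' \
  'Event Recv Time 1584823379' \
  'Attr_1: Login' \
  'Attr_2421: Success' \
  'Event Recv Time 1584843379' |
awk '/^Event Recv Time [0-9]+$/ { n++ } END { print n, "records" }'
# prints: 2 records
```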

I want to convert this output to proper formatted JSON output, but I am struggling badly.

I also have a map file that is formatted as such.

Attr_1: type
Attr_2: statusCode
Attr_13: rawData
Attr_451: otherField
Attr_2421: statusMessage

So before formatting to JSON, I will need to look up the field name for each attribute in the map file and use that name as the JSON key.
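The usual awk idiom for this kind of lookup is to read the map file first and fill an array keyed on the attribute name. A minimal sketch (the file names and scratch directory here are made up for the demo):

```shell
cd "$(mktemp -d)"          # scratch directory for the demo files

cat > map.txt <<'EOF'
Attr_1: type
Attr_2: statusCode
EOF

cat > data.txt <<'EOF'
Attr_1: Login
Attr_2: 1
EOF

# NR==FNR is only true while awk is reading the first file (the map);
# by the time the second file is read, the name[] array is ready for lookups.
awk -F': ' '
  NR == FNR { name[$1] = $2; next }
  { print name[$1] " = " $2 }
' map.txt data.txt
# prints:
# type = Login
# statusCode = 1
```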

Example output would look something like this.

[
   {
     "type": "Login",
     "statusCode": "1",
     "rawData": "[User]=\"Joe\"  [eventType]=\"Information\"",
     "statusMessage": "Success"
   },
   {
     "type": "Login",
     "statusCode": "1",
     "rawData": "[User]=\"Bob\"  [eventType]=\"Information\"",
     "statusMessage": "Success"
   }
]

Notice that the quotes in the rawData field need to be escaped.
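For what it's worth, the quote escaping on its own is a one-line gsub. An isolated check (sample value made up; the replacement is written with an explicit doubled backslash so awk's replacement-string processing is unambiguous across implementations):

```shell
# Replace each " with \" — "\\\\\"" reaches gsub as \\" which emits \ then "
printf '%s\n' '[User]="Joe" [eventType]="Information"' |
awk '{ gsub(/"/, "\\\\\""); print }'
# prints: [User]=\"Joe\" [eventType]=\"Information\"
```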

I am trying to convert it with AWK for performance reasons; I have millions of lines to convert. Plus, AWK supports streaming (at least the build I am using does), so I can pipe the converted output to an http listener without waiting for all of the lines to parse.

I am open to some other toolset for the conversion if someone has something better/faster.

I would appreciate any help. I have so many example attempts in AWK, but I prefer not to post them because I am embarrassed... I know they are horrible... :frowning:

Alright @HappyGuy, welcome!

Perhaps you could give this a try and adjust it to your need:

BEGIN {
  FS=": "
  printf "["
}

NR==FNR {
  key[$1]=$2
  next
}

/^Event/ {
  printf "%s\n   {\n%s\n   }",(c++?",":""),s
  s=""
  next
}

{
  gsub(/"/,"\\\"",$2)
  s=s (s?",\n":"") sprintf("     \"%s\": \"%s\"", key[$1],$2)
}

END {
  print "\n]"
}
Run it as: awk -f script.awk mapfile file
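As a sanity check, the script can be run against a one-record sample from the question and the result fed to a JSON parser (python3 is used here only for validation; the file names and scratch directory are arbitrary):

```shell
cd "$(mktemp -d)"          # scratch directory for the demo files

cat > script.awk <<'EOF'
BEGIN {
  FS=": "
  printf "["
}
NR==FNR {
  key[$1]=$2
  next
}
/^Event/ {
  printf "%s\n   {\n%s\n   }",(c++?",":""),s
  s=""
  next
}
{
  gsub(/"/,"\\\"",$2)
  s=s (s?",\n":"") sprintf("     \"%s\": \"%s\"", key[$1],$2)
}
END {
  print "\n]"
}
EOF

cat > mapfile <<'EOF'
Attr_1: type
Attr_2: statusCode
Attr_13: rawData
Attr_2421: statusMessage
EOF

cat > file <<'EOF'
Attr_1: Login
Attr_2: 1
Attr_13: [User]="Joe" [eventType]="Information"
Attr_2421: Success
Event Recv Time 1584823379
EOF

# If the output were not valid JSON (e.g. a bad escape), json.load would throw.
awk -f script.awk mapfile file |
python3 -c 'import json,sys; d=json.load(sys.stdin); print(len(d), d[0]["type"])'
# prints: 1 Login
```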

A somewhat similar approach to @Scrutinizer's. Run awk -f happy.awk mapfile myFile, where happy.awk is:

BEGIN {
  qq="\""
  FS=": +"
}
FNR==NR { f2[$1]=$2;next}
FNR==1 { print "[" }   # no "next" here: the first data line is also the first attribute
!/Event Recv Time/{
   f1=($1 in f2)?f2[$1]:"unknown"
   gsub(qq, "\\" qq, $2)
   s=((s)?s"," ORS: "") "\t\t" qq f1 qq ":" " " qq $2 qq
   next
}
{
   print ((cnt++)?",":"") "\t{" ORS s ORS "\t}"   # comma goes before every record except the first
   s=""
}
END { print "]" }

@Scrutinizer, thank you so much for your example! I know people sometimes get frustrated with "do this for me" requests, but in this case I was completely a fish out of water. Your example got me where I needed to be, and with a bit of tweaking I was able to make your code do what I needed!

@vgersh99, for some reason the version of your example I first tried was missing commas between records.

Here is the final version of what I am using.

BEGIN {
  FS=": "
  printf "["
}

/^[[:space:]]*$/ {   # skip blank lines (\s is a gawk extension; the bracket class is portable)
  next
}

/IPV4/ {
  $2=$3;
}

NR==FNR {
  key[$1]=$2;
  next
}

/^Event/ {
  printf "%s{%s}",(c++?",":""),s
  s=""
  next
}

{
  gsub(/[^[:print:]]/,"",$2)   # strip non-printable bytes
  gsub(/\\/,"\\\\&",$2)        # double each backslash -- must run before the quote escaping
  gsub(/"/,"\\\"",$2)
  s=s (s?",":"") sprintf("\"%s\": \"%s\"", key[$1],$2)
}

END {
  print "\n]"
}

I had a few surprises after testing your example: the stream contained some random blank lines, so I added a rule to skip them.

Then I hit another snag where certain fields actually had IP addresses in them, but they were prefixed with IPV4:, so I added logic to shift the fields when that prefix is found.
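The field shift can be seen in isolation: with FS=": ", the IPV4 prefix lands in $2 and the address in $3, so assigning $2=$3 drops the prefix (the attribute number below is made up):

```shell
printf '%s\n' 'Attr_77: IPV4: 10.1.2.3' |
awk 'BEGIN { FS=": " } /IPV4/ { $2=$3 } { print $1 " -> " $2 }'
# prints: Attr_77 -> 10.1.2.3
```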

Another issue I found was there was some non-ASCII data in some of the fields. It took me forever to trace it down, but once I did, I used gsub(/[^[:print:]]/,"",$2) to remove it.

Lastly, I found some stray \ characters in one of the fields, and of course those need to be escaped too, so I added another gsub to double each backslash before the quote escaping runs.
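One awk subtlety worth knowing here: in a gsub replacement string, the sequence \\ collapses to a single literal backslash, so a replacement of "\\\\" emits just one backslash and does not double anything. A reliable way to double backslashes is the & trick, where & stands for the matched text. A quick check:

```shell
# /\\/ matches one backslash; the replacement reaches gsub as \\& ,
# i.e. a literal backslash followed by the match -- so \ becomes \\
printf '%s\n' 'a\b' |
awk '{ gsub(/\\/, "\\\\&"); print }'
# prints: a\\b
```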

So that said, is there anything you would do differently? I am trying to take this opportunity to learn as much as I can! :slight_smile:
