Perl script: need to read data from one file and generate multiple files based on the data

We have data that looks like the below in a log file.
I want to generate files based on the string between the two hash (#) symbols, as shown below.

Source:

#ext1#test1.tale2 drop
#ext1#test11.tale21 drop
#ext1#test123.tale21 drop
#ext2#test1.tale21 drop
#ext2#test12.tale21 drop
#ext3#test11.tale21 drop
#ext3#test123.tale21 drop
#ext4#test1.tale21 drop
#ext4#test124.tale21 drop

Desired output :

ext1.log

test1.tale2 drop
test11.tale21 drop
test123.tale21 drop

ext2.log

test1.tale21 drop
test12.tale21 drop

ext3.log

test11.tale21 drop
test123.tale21 drop

ext4.log

test1.tale21 drop
test124.tale21 drop

If not necessarily perl:

while IFS='#' read -r _ file data
do
   echo "$data" >> "$file".log
done < Source
# use_sanjeev_g.file.pl
use strict;
use warnings;

my $last_name = '';
my $writeout;
while (<>) {
    # Input lines look like "#ext1#test1.tale2 drop"; capture the name and the rest.
    if (/^#(\w+)#(.+)$/) {
        if ($last_name ne $1) {
            # The name changed: close the previous log and start "<name>.log".
            close $writeout if $writeout;
            open($writeout, '>', "$1.log") or die "Cannot open $1.log: $!";
        }
        $last_name = $1;
        print $writeout "$2\n" if $writeout;
    }
}
close $writeout or die;
perl use_sanjeev_g.file.pl sanjeev_g.file
$ ls ext[0-9]*.log | while read f; do printf "file: $f\n--------------\n";cat $f; echo; done

Output:

file: ext1.log
--------------
test1.tale2 drop
test11.tale21 drop
test123.tale21 drop

file: ext2.log
--------------
test1.tale21 drop
test12.tale21 drop

file: ext3.log
--------------
test11.tale21 drop
test123.tale21 drop

file: ext4.log
--------------
test1.tale21 drop
test124.tale21 drop

Thanks Aia, it is working fine.
For the condition below, where we want to add/append a few lines to all of the files that were generated, could you please help with the logic?

I want to add the data below to all of the output files.

Example :

EXTRACT e1_bill1
SETENV (ORACLE_SID = "TV19DBM")
SETENV (ORACLE_HOME = /opt/oracle/product/11.2.0.4/db_1)
SETENV (NLS_LANG = "AMERICAN_AMERICA.US7ASCII")
 
--USERID GGSCHEMA, PASSWORD , &
USERID 4444444A@xxxxxx, PASSWORD "xxxxxx"
 
TRANLOGOPTIONS DBLOGREADER
TRANLOGOPTIONS excludeuser GGSCHEMA
TRANLOGOPTIONS excludeuser esg_dba
nocompressupdates -- Include the full record
nocompressdeletes -- Include the full record
-- To avoid need to delete log manually[but may need to enable it in Dataguard Environemnt]
--TRANLOGOPTIONS LOGRETENTION DISABLED
-- Cache Manger parameter to limit memory utilization by extract
CACHEMGR CACHESIZE 1GB
-- Must be used together and directs Extract to fetch data from the source table if it cannot fetch
FETCHOPTIONS USELATESTVERSION, NOUSESNAPSHOT
--Warns for a long running transaction
WARNLONGTRANS 1h, CHECKINTERVAL 10m

It is best to create a different thread if the request/inquiry is different from the original post.

# Read the files named on the command line (or standard input).
while (<>) {
    print $_;
}

# Read the lines that follow the __DATA__ token in this same script.
while (<DATA>) {
    print $_;
}

__DATA__
EXTRACT e1_bill1
SETENV (ORACLE_SID = "TV19DBM")
SETENV (ORACLE_HOME = /opt/oracle/product/11.2.0.4/db_1)
SETENV (NLS_LANG = "AMERICAN_AMERICA.US7ASCII")

--USERID GGSCHEMA, PASSWORD , &
USERID 4444444A@xxxxxx, PASSWORD "xxxxxx"

TRANLOGOPTIONS DBLOGREADER
TRANLOGOPTIONS excludeuser GGSCHEMA
TRANLOGOPTIONS excludeuser esg_dba
nocompressupdates -- Include the full record
nocompressdeletes -- Include the full record
-- To avoid need to delete log manually[but may need to enable it in Dataguard Environemnt]
--TRANLOGOPTIONS LOGRETENTION DISABLED
-- Cache Manger parameter to limit memory utilization by extract
CACHEMGR CACHESIZE 1GB
-- Must be used together and directs Extract to fetch data from the source table if it cannot fetch
FETCHOPTIONS USELATESTVERSION, NOUSESNAPSHOT
--Warns for a long running transaction
WARNLONGTRANS 1h, CHECKINTERVAL 10m
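
Putting those two pieces together, here is a minimal, untested sketch of one way to do it: keep the common text under __DATA__ at the end of the script (only the first two header lines are shown below for brevity; the full EXTRACT/SETENV block above would go there), read it once into an array, and print it at the top of every log file as it is opened.

use strict;
use warnings;

# Slurp the common header from __DATA__ once, before processing the input.
my @header = <DATA>;

my $last_name = '';
my $writeout;
while (<>) {
    if (/^#(\w+)#(.+)$/) {
        if ($last_name ne $1) {
            close $writeout if $writeout;
            open($writeout, '>', "$1.log") or die "Cannot open $1.log: $!";
            # Every newly opened log starts with the common header.
            print $writeout @header;
        }
        $last_name = $1;
        print $writeout "$2\n" if $writeout;
    }
}
close $writeout or die if $writeout;

__DATA__
EXTRACT e1_bill1
SETENV (ORACLE_SID = "TV19DBM")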

Hi Aia,

Thanks for the reply.
I would appreciate your help with the issue below.
I am using the code below. I don't want the logs to be appended to when I run the Perl script a second time; I just want to back them up with today's date and generate new logs. What do I need to change in the script below?

1) If a log already exists, it should be moved aside with today's date as an extension, and a new one generated.
2) Only the first log file comes out correctly; for the remaining log files, the contents of input.txt are not added to the output.

Script :

my $last_name = '';
my $writeout;

while (<>) {
    if (/^#(\w+)#(.+)$/) {
        if ($last_name ne $1) {
            close $writeout if $writeout;
            open($writeout, '>>', "$1.log");
        }
        $last_name = $1;
        if ($. == 1) {
            print $writeout "Extract $1 \n";
            my $filename = "input.txt";
            open(my $ip, "<", $filename) || die("Can't open file input.txt");
            while (<$ip>) {
                next if (/^$/);
                print $writeout "$_";
            }
            close $ip;
        }
        print $writeout "$2\n" if $writeout;
    }
}
close $writeout or die;

input files :

more sanj.txt

#ext1#test1.tale2 drop
#ext1#test11.tale21 drop
#ext1#test123.tale21 drop
#ext2#test1.tale21 drop
#ext2#test12.tale21 drop
#ext3#test11.tale21 drop
#ext3#test123.tale21 drop
#ext4#test1.tale21 drop
#ext4#test124.tale21 drop
#ext1#test1.tale2 drop

more input.txt

1.1.1.1
2.2.2.2

ls ext[0-9]*.log | while read f; do printf "file: $f\n--------------\n";cat $f; echo; done


file: ext1.log
--------------
Extract ext1
1.1.1.1
2.2.2.2
test1.tale2 drop
test11.tale21 drop
test123.tale21 drop
test1.tale2 drop

file: ext2.log
--------------
test1.tale21 drop
test12.tale21 drop

file: ext3.log
--------------
test11.tale21 drop
test123.tale21 drop

file: ext4.log
--------------
test1.tale21 drop
test124.tale21 drop

Thanks,
G sanjeev Kumar
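
A possible direction for both points, sketched below but untested against the original data and not part of any script in this thread: use rename and strftime from the core POSIX module to move an existing <name>.log to <name>.log.YYYYMMDD before the first write of a run, and write the "Extract" line plus the contents of input.txt every time a new log file is opened, instead of only when $. == 1.

use strict;
use warnings;
use POSIX qw(strftime);

my $today = strftime('%Y%m%d', localtime);

# Read the common lines from input.txt once, skipping blank lines.
open(my $ip, '<', 'input.txt') or die "Can't open file input.txt: $!";
my @header = grep { !/^$/ } <$ip>;
close $ip;

my %seen;           # log names already created during this run
my $last_name = '';
my $writeout;

while (<>) {
    if (/^#(\w+)#(.+)$/) {
        my ($name, $data) = ($1, $2);
        if ($last_name ne $name) {
            close $writeout if $writeout;
            if (!$seen{$name}++) {
                # First time this name appears in this run: back up any log
                # left over from an earlier run, then start a fresh file.
                rename "$name.log", "$name.log.$today" if -e "$name.log";
                open($writeout, '>', "$name.log") or die "Cannot open $name.log: $!";
                print $writeout "Extract $name\n";
                print $writeout @header;
            }
            else {
                # The name was already handled earlier in this run (e.g. the
                # trailing ext1 line in sanj.txt): append to the existing file.
                open($writeout, '>>', "$name.log") or die "Cannot open $name.log: $!";
            }
            $last_name = $name;
        }
        print $writeout "$data\n" if $writeout;
    }
}
close $writeout if $writeout;

Note that if the script is run twice on the same day, the dated backup is overwritten; adding hours and minutes to the strftime format (for example '%Y%m%d%H%M') would keep every backup.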