Conversion of Perl Script to Shell Script

Hi Guys
I have a Perl script that fetches the exclude list from a Unix client, and I am trying to convert it to a shell script, but I am running into issues. Please help me...

#!/usr/bin/perl

use strict; 
use warnings; 
use Getopt::Std; 


# To turn on debugging (i.e. more information) specify -d on the command line 
our $opt_d = 0; 

# To get ONLY the version information specify -v on the command line 
our $opt_v = 0; 

getopts('dv'); 

our $debug = $opt_d; 

my $uname = `uname -n`; 
chomp $uname; 

# Designate where to write the in/exclude files (must be a fully qualified path) 
our $PWD = $ENV{PWD}; 
our $output_dir = "$PWD/EI_${uname}"; 
if ( ! -d $output_dir ) { mkdir $output_dir } 


# Location of bp.... commands 
our $nbadmin = "/usr/openv/netbackup/bin/admincmd"; 

# Generate a list of policies 
our @policy_list = `$nbadmin/bppllist`; 

# Used to capture the output of the bpgetconfig commands. Only needed for debug purposes 
our @status = (); 


foreach my $policy (@policy_list) {
    chomp $policy;

    # Get the individual policy information
    my @policy = `$nbadmin/bppllist $policy -l`;

    # Extract the info line
    my @info = grep /^INFO /, @policy;

    # If the policy type is not standard, ignore it
    if ( (split /\s+/, $info[0])[1] != 0 ) { next }

    # If the policy is inactive, ignore it
    if ( (split /\s+/, $info[0])[11] != 0 ) { next }

    # Pull out the clients for this policy, then keep only the client names
    my @clients = grep /^CLIENT /, @policy;
    @clients = map { (split /\s+/, $_)[1] } @clients;

    # Pull out the schedules for this policy, then keep only the schedule names
    my @schedules = grep /^SCHED /, @policy;
    @schedules = map { (split /\s+/, $_)[1] } @schedules;

    # Now for each client
    foreach my $client (@clients) {

        if ($debug != 0) { print STDERR "$client $policy\n"; }

        # Be sure the client is at least pingable, otherwise the bpgetconfig command will take a long time to fail
        system("ping -c 1 -W 5 $client > /dev/null 2>&1");
        if ($? != 0) { print STDERR "$client not pingable\n"; next }

        open VERSION, ">$output_dir/version.$client" or die "Couldn't open $output_dir/version.$client for output: $!\n";
        print VERSION "============= Version Check of $client ===================\n";
        print VERSION `bpgetconfig -t -A -g $client 2>&1`;
        print VERSION "============= End Version Check of $client ===================\n";
        close VERSION;

        if ( ! $opt_v ) {
            # Get the basic include and/or exclude files, if any (i.e. /usr/openv/netbackup/exclude_list or include_list)
            @status = `$nbadmin/bpgetconfig -e \"$output_dir/exclude.$client.basic\" \"$client\" 2>&1`;
            if ($? != 0 && $debug != 0) { print STDERR "$client bpgetconfig -exclude no policy failed with $?\n @status" }
            @status = `$nbadmin/bpgetconfig -i \"$output_dir/include.$client.basic\" \"$client\" 2>&1`;
            if ($? != 0 && $debug != 0) { print STDERR "$client bpgetconfig -include no policy failed with $?\n @status" }

            # Get the per-policy include and/or exclude files, if any (i.e. /usr/openv/netbackup/exclude_list.policy or include_list.policy)
            @status = `$nbadmin/bpgetconfig -e \"$output_dir/exclude.$policy.$client\" \"$client\" \"$policy\" 2>&1`;
            if ($? != 0 && $debug != 0) { print STDERR "$client bpgetconfig -exclude policy only failed with $?\n @status" }
            @status = `$nbadmin/bpgetconfig -i \"$output_dir/include.$policy.$client\" \"$client\" \"$policy\" 2>&1`;
            if ($? != 0 && $debug != 0) { print STDERR "$client bpgetconfig -include policy only failed with $?\n @status" }

            # Now for each schedule in the policy (i.e. /usr/openv/netbackup/exclude_list.policy.schedule)
            foreach my $schedule (@schedules) {
                if ($debug != 0) { print STDERR "$client $policy $schedule\n"; }
                @status = `$nbadmin/bpgetconfig -e \"$output_dir/exclude.$policy.$client.$schedule\" \"$client\" \"$policy\" \"$schedule\" 2>&1`;
                if ($? != 0 && $debug != 0) { print STDERR "$client bpgetconfig -exclude with policy and schedule failed with $?\n @status" }
                @status = `$nbadmin/bpgetconfig -i \"$output_dir/include.$policy.$client.$schedule\" \"$client\" \"$policy\" \"$schedule\" 2>&1`;
                if ($? != 0 && $debug != 0) { print STDERR "$client bpgetconfig -include with policy and schedule failed with $?\n @status" }
            }
        }
    }

    # If any of the bpgetconfig calls work, the output will be in the file named after the -e or -i
}

exit;

====================== End of code ========================

Are you asking someone to rewrite your entire Perl script in Bash? That is more than most people are willing to do. A better question would be to ask how to convert one specific function from Perl to Bash. No one is going to want to write a script from scratch for someone they don't know.

It is a lot of work and time to do this right. Perl is a better language for serious apps, and so is Python; both offer more safety than shell scripts. What you want is, in essence, a downgrade. Why?

I'd like to run this script to see what it does, but I can't, because it needs more data files.

You can't just "convert" from one language to another. It's a program, not a jpeg. Show the input you have and the output you want and we'll show you how to do it.


I disagree. This is too sweeping a generalization, and I have seen far too many Perl programs which are nothing but line after line of things in system() and backticks -- i.e. shells.


Hello Corona688

I mentioned security, and what I had in mind at that moment was protection from buffer overruns, SQL injections, and the like. The script presented here has the markings of an 'out-facing' app used by... who knows? I just don't think bash would be the best choice out there in the wild wild west. :stuck_out_tongue:

System calls are common...bash does it better sometimes (thinking of grep vs Perl grep for example).
Cheers.

Buffer overruns aren't much of an issue in a scripting language, and this script accepts no input except -d and -v.

Even though mostly what this script does, is call bash?

Every time you do system(), that's bash.

Every time you use a set of ``, that's bash.

If you really want to avoid bash, you are going to have a hard time using Perl to control any external processes, because Perl is really bad at it -- so bad that the standard function defaults to handing the job off to a shell. In which case, if you can, why not just write it in bash?

This script makes no system calls.

I like the shell; don't get me wrong. But just to keep it interesting...

uname is shell, chomp is perl:

my $uname = `uname -n`; 
chomp $uname; 

The below is not shell. It's a Symantec product about backing up servers.

`$nbadmin/bpgetconfig -i...`

I wonder how well that would interface with the shell?

...and then there are the arrays and the string juggling, which is where Perl shines.

A bash programming guru will disagree vehemently with this premise.

There have been very few situations, if any, where I wasn't able to use bash or awk. Perl is a language I caution people to be very careful with, given the simple fact that in most cases it requires the compilation, download, and configuration of modules, which can be detrimental to existing applications on a system.

If Perl is your favorite language, fine. Stick with it, but make sure you know it well enough that you don't have to rely on modules, or force your users to get them, in order for it to work.

Matter of fact, just do a search in this forum alone and you will see very few people care to use Perl. It's usually awk/bash over Perl.

I've turned this thread into a perl-vs-bash bash, haven't I?
Didn't mean to.

Keeping in mind the original poster, whom we haven't heard from since: if someone wants to drop a bash script in his lap, go for it. It isn't going to improve on what he already has, IMHO. Rather than leaving us arguing about 'what's wrong with Perl?', he should be telling us what is wrong with this particular script.

On my own behalf, I have apparently given the impression that I don't like bash. Quite the contrary, bash is what I know best and I use it for everything I do these days. The last time I used perl was to put together a gui app. Looking back I think Python would have been a better choice.

uname is not shell! Perl is using a shell here, but it doesn't need to. You could use fork() and exec() to run it more efficiently, but then you would need to set up pipes and such to connect it, and worry about closing them afterward -- all the things BASH does without you even thinking about it. Perl has none of this. It shoves it all into BASH and forgets.

Or you could throw out the perl and not worry about any of it. If you're controlling external processes, you should be using shell, because perl is really bad at it.

( chomp() is not really needed here either. The shell makes do fine without it, but Perl takes things a little too literally. )
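That chomp-free behavior is easy to show; here is a minimal sketch, where the printf merely stands in for any command (such as uname -n) whose output ends with newlines:

```bash
#!/bin/bash
# Command substitution strips trailing newlines automatically,
# so there is nothing left to chomp afterward.
host=$(printf 'myhost\n\n')   # stand-in for a command like `uname -n`
printf '[%s]\n' "$host"       # prints: [myhost]
```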

Perl is terrific at string juggling, but modern shells have arrays. Even better, they have and have always had lists...
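Both forms are cheap to demonstrate; a tiny sketch (the array syntax assumes bash, while the plain word list works in any Bourne-style shell):

```bash
#!/bin/bash
# The word list sh has always had:
for w in red green blue; do
    printf '%s ' "$w"
done
printf '\n'

# The indexed arrays modern shells add:
colors=(red green blue)
printf '%s\n' "${colors[1]}"   # prints: green
```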


It is already mostly "bash" code, in a membrane-thin wrapping of perl.

I don't think he's coming back at this point, we wanted him to do too much work.


Hi ongoto...

Let me know when PERL, Python, or nearly any other language for that matter, can do this:

I have made the *NIX shell my language of choice now and have all but abandoned Python; I have little experience with PERL, however.

EDIT:
BTW, I am looking into FFT stuff for another possible project using the shell...


Good. We all agree. :smiley:

The shell is the coder's Swiss Army knife. But to use it exclusively out of prejudice is cheating yourself. OO languages do have a lot to offer if you need to pass around groups of data all in one go, and that sort of thing. The shell now has associative arrays, but they don't stay ordered. Try looping through them. Try passing an array to a function; there go your indexes. But that doesn't make the shell bad. You're gonna hit a wall in the others too.
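The two limitations above can be sketched like this (bash 4+ assumed for associative arrays; the variable and function names are made up for illustration):

```bash
#!/bin/bash
declare -A ver=( [alpha]=1 [beta]=2 [gamma]=3 )

# Iteration order over an associative array is unspecified:
# it need not match insertion order.
for k in "${!ver[@]}"; do
    printf '%s=%s\n' "$k" "${ver[$k]}"
done

# Expanding an array into a function call flattens it to bare values;
# the keys/indexes do not survive the trip.
count_words() { printf 'got %d plain words\n' "$#"; }
count_words "${ver[@]}"        # prints: got 3 plain words
```

(For what it's worth, bash 4.3's `declare -n` namerefs are one way around the function problem.)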

In Perl you always end up making calls to the shell because it saves work. Why write a block of Perl code to do 'uname -r'? On the other hand, Perl lets you do things like "do something unless condition", or "if condition, etc.", all in one statement. If Perl can save you some work, then why not use it? Same with Python and the others mentioned. What would we shell coders do without awk and sed and grep?

Ya got tools that are metric, some are standard SAE, and then there are the 'star' bits and the universal sockets. But you can't cut down a tree with a Phillips!

Cheers mates :slight_smile:

Awesome.
AE6UB said that. :slight_smile:

Reminds me of my Visual Basic 4 days, before I knew how to create my own DLLs; I made other EXEs which I then called via the console.

Still proud of my first ActiveX-DLL-based application - T4C-Desktop (TPP, The 4th Coming, an MMORPG).
The 'desktop' was the container app to be started, using a custom 'ocx' for the menu lists, which contained entries according to the DLLs downloaded.
Each DLL had some 'fixed' strings to return, such as its group, so it could be loaded into the appropriate ocx list item, which would only show if there was an item to be displayed.
That way, I could add new groups without preparing the container app for them, just by declaring so within an ActiveX DLL.
They also contained a 'resize' function, triggered by the resize of the container app, so the 'hwnd' - which was such an element too, and just written to a picturebox - could be replaced/updated.

But in the end, in some way, it doesn't matter whether one opens a subshell from within a script or calls the console/shell; it is the same thing, though not for the same purpose.

Hi sea

Yeah. I tried one time to do a command-line app in Python. I had to have an answer for every possible event: mouse, keyboard, etc. I didn't want to code all that, so I finally settled on just running a normal shell and calling Python as needed. Ten times easier and faster. When the shell by itself is just too slow, it's nice to be able to call in some outside help.

You do not "call" a shell.

You must fork a new process, exec it, let it run, wait for it to finish, reap it, and then return to whatever else you were doing. That's what system() does -- just to create the shell. And then the shell, once running, must do it all again to run whatever external command you asked for.

So you are running a thing which runs a thing and must wait for it to quit before you can wait for it to quit.

...which it does, having run the single, individual command you fed into backticks or system().

Imagine running a full instance of Perl every time you needed to call uname in BASH, even though uname has nothing to do with Perl. That's the degree of pointless waste that happens when you write external-command-intensive programs in Perl. It can be almost cripplingly wasteful.

Just using the shell the way it was meant to be used in the first place avoids quite a lot of that round trip. Or you could write code in Perl to the same effect, except that Perl is pretty bad at that...
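A toy illustration of that round trip, assuming nothing beyond a system with the external `expr` binary installed: both loops compute the same sum, but the first pays a fork for every iteration while the second never leaves the running shell.

```bash
#!/bin/bash
sum1=0
for i in 1 2 3 4 5; do
    sum1=$(expr "$sum1" + "$i")   # forks to run the external expr binary
done

sum2=0
for i in 1 2 3 4 5; do
    sum2=$(( sum2 + i ))          # shell builtin arithmetic, no new process
done

echo "$sum1 $sum2"                # prints: 15 15
```

Run each loop a few thousand times under `time` and the difference stops being academic.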

That's a nice feature, isn't it?

true && echo "this will print"
false && echo "this won't"
true || echo "neither will this"
false || echo "But this will"

sh had it first -- by decades.

You have backpedaled miles from your argument that shell code is somehow "unsafe" while perl code running plethoras of individual short-lived shells is somehow not.

Anyway. Would you agree that running awk thousands of individual times to process tiny amounts of data is a suboptimal use of it?
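To make that concrete, a small sketch (the doubling task and the temp file are invented for illustration): both pipelines produce identical output, but the first starts one awk per input line while the second starts one awk total.

```bash
#!/bin/bash
tmp=$(mktemp)
printf '%s\n' 3 1 2 > "$tmp"

# Anti-pattern: one awk process per line of input.
while read -r n; do
    awk -v x="$n" 'BEGIN { print x * 2 }'
done < "$tmp"

# Better: one awk process over the whole stream.
awk '{ print $1 * 2 }' "$tmp"    # both print 6, 2, 4, one per line

rm -f "$tmp"
```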

Perl is good at some things. I enjoy it for some things. Code like the OP is not an effective use of it.


(The "do something if condition" is a DEC Basic+ idiom that perl adopted)

Not to join the language wars, but claiming such-n-such a language is best is naive at best. It really depends on the task at hand: time to implement, effort to implement, maintainability, efficiency, security, history, policy, etc. etc.

There is no best language.

I've seen scripts and programs that impressed the hell out of me with their cleverness and have taught me something new. I've also seen programming horrors that should never ever have seen the light of day. And everything in between.

The only advice I can give is: know your tools.

Mind you, this is something I have been involved with since my first assignment when I was given a choice to use COBOL, FORTRAN, or APL. Ended up with Basic+... :slight_smile:


Looks like the OP is long gone, but here is a good starting point for a bash version of that script. Again, without the original data files, and given my limited knowledge of how the split() and map() calls behave, the policy-reading part may need tweaks. I also leave the command-option (-v, -d) processing to the reader.

nbadmin="/usr/openv/netbackup/bin/admincmd"
output_dir="$PWD/EI_$(uname -n)"
policy_cmd="$nbadmin/bppllist"

[ -d "$output_dir" ] || mkdir "$output_dir"

# Capture a command's combined output in $status and its exit code in $RC.
# Returns success when the command FAILED, so callers can write
# "if run_cmd ...; then report-the-error; fi".
run_cmd() {
   status="$( "$@" 2>&1 )"
   RC=$?
   [ $RC -ne 0 ]
}

$policy_cmd | while read -r policy
do
  policy_info=$("$nbadmin/bppllist" "$policy" -l)
  info=( $(grep "^INFO " <<<"$policy_info") )

  # not standard - ignore
  [ "${info[1]}" != "0" ] && continue

  # inactive - ignore
  [ "${info[11]}" != "0" ] && continue

  grep "^CLIENT " <<<"$policy_info" | while read -r ignore client ignore
  do
      if ! ping -c 1 -W 5 "$client" > /dev/null 2>&1
      then
          echo "$client not pingable" >&2
          continue
      fi
      ( echo "============= Version Check of $client ==================="
        bpgetconfig -t -A -g "$client" 2>&1
        echo "============= End Version Check of $client ===================" ) > "$output_dir/version.$client"

      if run_cmd "$nbadmin/bpgetconfig" -e "$output_dir/exclude.$client.basic" "$client"
      then
         printf '%s bpgetconfig -exclude no policy failed with %s\n%s\n' "$client" "$RC" "$status" >&2
      fi

      if run_cmd "$nbadmin/bpgetconfig" -i "$output_dir/include.$client.basic" "$client"
      then
         printf '%s bpgetconfig -include no policy failed with %s\n%s\n' "$client" "$RC" "$status" >&2
      fi

      if run_cmd "$nbadmin/bpgetconfig" -e "$output_dir/exclude.$policy.$client" "$client" "$policy"
      then
         printf '%s bpgetconfig -exclude policy only failed with %s\n%s\n' "$client" "$RC" "$status" >&2
      fi

      if run_cmd "$nbadmin/bpgetconfig" -i "$output_dir/include.$policy.$client" "$client" "$policy"
      then
         printf '%s bpgetconfig -include policy only failed with %s\n%s\n' "$client" "$RC" "$status" >&2
      fi

      grep "^SCHED " <<<"$policy_info" | while read -r ignore schedule ignore
      do
         if run_cmd "$nbadmin/bpgetconfig" -e "$output_dir/exclude.$policy.$client.$schedule" "$client" "$policy" "$schedule"
         then
             printf '%s bpgetconfig -exclude with policy and schedule failed with %s\n%s\n' "$client" "$RC" "$status" >&2
         fi

         if run_cmd "$nbadmin/bpgetconfig" -i "$output_dir/include.$policy.$client.$schedule" "$client" "$policy" "$schedule"
         then
             printf '%s bpgetconfig -include with policy and schedule failed with %s\n%s\n' "$client" "$RC" "$status" >&2
         fi
      done

  done
done

@ Corona688

The devils' advocate here. :slight_smile:

You're right. Calling 'uname -r' is not a call to the shell; /bin/uname is a binary that has nothing to do with the shell. I should have said 'system calls' instead.

Some say that, because Perl uses system calls, you might as well use Bash. That presumes that Linux binaries are Bash builtins. If you call uname or ping, or any other binary, from Bash, is that any different than calling one from Perl? Not at all. Bash scripts use system binaries the same way Perl does.

Bash just received a security upgrade. That speaks to that.
You don't have to fork a shell to run Linux binaries. Most returns go to stdout or stderr, so why would you need a shell? Does awk require a shell fork? The standards guys would know more about that than I would.

I'm not backpedaling at all. Perl and other high-level languages have libraries: huge volumes of reusable code to handle most anything. On big projects they can save you hours, if not days, of work. For the most part they are tried, proven, and secure, given that there are no absolutes.

Again, I use the shell most of the time because I rarely do anything big in a coding sense. But if I did, I wouldn't use the shell; speed alone is a factor. You mentioned awk: ten times faster than the shell.