KSH scripting

Hi Guys,

I am trying to learn to script.
First, I have two servers, A and B.
A has IP 192.168.82.22; B has IP 192.168.82.44.
Both have a login user admin with password admin.
Server A will generate a file every day, named gg.log, under /app/gg/ (e.g. /app/gg/20171002.log). I wish to write a script to copy the file, zip it, and send it to server B under the folder ITlog, for the day before.

Can anyone tell me how I should write the script?

hello leecopper,

I have a few questions to pose in response first:-

  • Is this homework/assignment? There are specific forums for these.
  • What have you tried so far?
  • What output/errors do you get?
  • What OS and version are you using?
  • What are your preferred tools? (csh, ksh, bash, Perl even?)
  • What logical process have you considered? (to help steer us to follow what you are trying to achieve)
  • Do you have authentication set up to copy the file manually, or some sort of disk share that they both see?
  • Does this need to be scheduled, and if so, what tools do you have available other than cron?

Most importantly, what have you tried so far?

There are probably many ways to achieve most tasks, so giving us an idea of your style and thinking will help us guide you to the answer most suitable for you, and one you can adjust to suit your needs in future.

We're all here to learn and getting the relevant information will help us all. I'm sure we can get this sorted.

Kind regards,
Robin

Hi Robin,
Please see my reply.
Is this homework/assignment? There are specific forums for these.
This is my assignment.
What have you tried so far?
So far I have searched on Google and tried to get some info.
What output/errors do you get?
I have not tried anything yet.
What OS and version are you using?
AIX 7.1
What are your preferred tools? (csh, ksh, bash, Perl even?)
ksh
What logical process have you considered? (to help steer us to follow what you are trying to achieve)
I am trying to copy a file generated by the app and FTP it to another workstation to analyse the data.
Do you have authentication set up to copy the file manually, or some sort of disk share that they both see?
Not too sure.
Does this need to be scheduled, and if so, what tools do you have available other than cron?
crontab will be used.
Most importantly, What have you tried so far?
Just getting some concepts together so I can run this on our development server.

Okay, so if you were to do this manually, step by step, what would you do? Would it be something like:-

  • Change to the directory of the source file
  • Start FTP/SFTP to the target server
  • Provide login credentials where requested
  • Within FTP/SFTP, change to the target directory
  • Issue a put to transmit the file
  • Quit FTP/SFTP
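Purely as an illustration (the paths and prompts below are made up for the example), a manual session doing those steps might look like this:

$ cd /app/gg
$ ftp 192.168.82.44
Name (192.168.82.44:admin): admin
Password:
ftp> cd ITlog
ftp> put gg.log
ftp> quit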

It could also be done the other way round, i.e. connecting from where you want the file to end up back to the source server, and issuing a get for the file.
Are either of these a suitable structure? If so:-

  • Do you have a fixed filename to send/receive?
  • Do you have credentials?
  • Will you be using FTP, SFTP or something else?
  • How far can you get with your code before you get stuck?

Kind regards,
Robin

Hi Robin,

so far I have referred to some existing similar scripts, made some modifications, and tried to use the script below:

#!/bin/bash
#assuming the file1 and file2 will be generate everyday and it was already in the directory.
# This script is to transfer the file1 and file2 to admin workstation.
 
 
HOST=192.168.82.22
USER=admin
PASS=admin
 
date
cd /app/gg/IT
 
gzip data_`date '+%d%m%y_%H'`00.txt
gzip log_`date '+%d%m%y_%H'`00.txt
 
File1=$(date +"data.gz")
File2=$(date +"log.gz")
data="/app/gg/IT/data"
log="/app/gg/IT/log"
data_use=$(date +"data_%Y*")
log_use=$(date +"log_%Y*")
 
ftp -n -v $HOST <<EOF
user "USER" "$PASS"
prompt
ascii on
mkdir "$File1"
cd "$File1"
mkdir "$File2"
cd "$File2"
lcd $data
mput $data_use
lcd $log
mput $log_use
quit
EOF

I have not yet had a chance to test whether it works or not.

Regards
Leecopper

Okay, that is a decent start. I would suggest moving the credentials out of the script: if you can read the script, you can see the credentials, and therefore access data that perhaps should be secure. Additionally, if you have several scripts with these details in them and you need to change them (e.g. you decide that having the password the same as the account name is not actually very secure), then you have to change it in every one of your scripts, else you will get failures.

Without the -n flag, FTP will look for a file called .netrc in the home directory of the person running it. The format is defined in the netrc man page. You don't need to include all the fields.

Your script does some directory creation before putting all the files that match the string given; I'm not sure that is what you want. Perhaps this would be neater:-
/home/rbatte1/.netrc

machine 192.168.82.22 login admin password admin
machine serverb login admin password M0r3_53cure!

The second line uses the short DNS name, which is more descriptive so may be better long term.
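One caveat worth noting: most ftp clients will refuse to use a .netrc that contains passwords if the file is readable by anyone else, so restrict it to your own eyes:

chmod 600 ~/.netrc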

/home/rbatte1/my_ftp_script_example

#!/bin/ksh

host=serverb         # .... or the IP address if you must <<sigh>>
my_run_date=`date +%Y%m%d`

echo "Changing to source directory /app/gg/${my_run_data}.log"
cd  /app/gg/${my_run_data}.log || (echo "Failed to change directory to source area" ; exit 1)

echo "Checking file gg.log exists"
[ -f gg.log ] || (echo "There is no file to compress/send" ; exit 2 )

echo "Compressing gg.log to gg.${my_run_data}.log.Z"
compress -c gg.log gg.${my_run_data}.log.Z || (echo "Failed to compress the gg.log file" ; exit 3)

echo "Sending file"
ftp "${host}" << EOFTP 
  cd ITlog
  put gg.${my_run_data}.log.Z
EOFTP

echo "File sent.  Removing the temporary compressed file"
rm gg.${my_run_data}.log.Z

echo "Finished"
exit 0
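
Once you are happy it works when run by hand, and since you said crontab will be used, an entry along these lines (the 01:00 timing is just an example) would run it nightly and keep a log of its output:

0 1 * * * /home/rbatte1/my_ftp_script_example >> /home/rbatte1/my_ftp_script_example.out 2>&1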

Your description was a little vague, so I've guessed at what you mean for the various directories.

Does this get you going?
Robin

Hi Robin,
There is a change to the scenario. Let's say I have a development server A running AIX 7.1 with IP 192.168.82.22 and a Windows workstation with IP 192.168.82.44, and the FTP connection is open.
On server A, under /app/gg/it, there will be a file generated every hour and another file generated daily. What I need to do is find the files, let's say from May 2017 until yesterday, gzip them, and then FTP them over to the workstation, assuming the FTP is OK. How should my script look?
Here I have created a script, but I don't know whether it is right or wrong, as I have not had a chance to test it on the real development environment.

#!/bin/bash
cd /app/gg/it
find /app/gg/it -name "Data_to_check_*" -type f -mtime +0 -exec gzip {} \;
find /app/gg/it -name "log_to_check_*" -type f -mtime +0 -exec gzip {} \;

HOST=192.168.82.44
USER=admin
PASS=abc123
DIR1=$(date +"%Y_Data")
DIR2=$(date +"%Y_log")
DADIR="/app/gg/it/"
LODIR="/app/gg/it/"
DAFILE=$(date +"Data_%Y%m%d*.gz")
LOFILE=$(date +"log_%Y%m%d*.gz")

echo "backup file to admin workstation"
ftp -n -v $HOST <<EOF
user "$USER" "$PASS"
prompt
ascii on
mkdir "$DIR1"
cd "$DIR1"
mkdir "$DIR2"
cd "$DIR2"
lcd $DADIR
mput $DAFILE
lcd $LODIR
mput $LOFILE
quit
EOF

Okay, glad I'm going in the right direction.

  • If you were to use the command line to identify these files, what would you do?
  • With a list of files, could you write a command to gzip them, either individually or into a single archive?

If you can answer these, then you are well on your way to coding it. How far do you get?
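
As a hint, and purely as a sketch (the file names here are placeholders): a common trick when find has no date-range options is to create two reference files with touch and select anything modified between them:

# reference timestamps: 00:00 on 1 May 2017, and 00:00 today (so yesterday is the last day included)
touch -t 201705010000 /tmp/from_may2017
touch -t $(date +%Y%m%d)0000 /tmp/upto_yesterday

# gzip every matching file whose modification time falls inside that window
find /app/gg/it -type f -name "Data_*" -newer /tmp/from_may2017 ! -newer /tmp/upto_yesterday -exec gzip {} \;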

Robin