How to concatenate the files based upon the file name?

Hi Experts,

I am trying to merge multiple files into one file based upon the file name.

 
Testreport_Server1.txt       ============\
MonitoringReport_Server1.txt ============/  CentralReport_Server1

Here, the two files whose names contain Server1 should be merged into one file.

How can I do this using the shell? I was thinking of something like the commands below, but they are not working:

 
cat $DR_HOME/OS/Server1*.txt >> Central_Report_Server1
cat $DR_HOME/OS/*Server1.txt >> Central_Report_Server1

Thanks Krish,

I tried that as well, but it is not working.
It gives me the error "No such file or directory".

I am using bash.

Are you able to list the files ending with Server1.txt using the command below?

ls $DR_HOME/OS/*Server1.txt 

Is the variable DR_HOME set to the correct value?
Is OS a subdirectory?
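To rule those out, a quick sanity check along these lines can help. (The DR_HOME value and filename below are made-up examples using a scratch directory; substitute your real path.)

```shell
# Demo in a throwaway directory; replace DR_HOME with your real path.
DR_HOME=$(mktemp -d)
mkdir -p "$DR_HOME/OS"
touch "$DR_HOME/OS/Testreport_Server1.txt"

echo "DR_HOME is: '$DR_HOME'"           # empty quotes would mean the variable is unset
[ -d "$DR_HOME/OS" ] && echo "OS subdirectory exists"
ls "$DR_HOME"/OS/*Server1.txt           # lists the matching file(s)
```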

Hello,

Could you please let us know the value of

$DR_HOME

here?

Thanks,
R. Singh

DR_HOME='/home/user/Test/'

Inside this I have an OS folder, and then files whose names contain Server1.

Hello,

Could you please use the following script to get your desired output.
Let's say appending_file.ksh is the script name, and the files are present in /home/user/bin/singh_testing/awk_programming/testing_appending_file_date.

Our script is in /home/user/bin/singh_testing/awk_programming.

 
$ cat appending_file.ksh
# Work in the directory that holds the files to be appended.
cd /home/archsupp/bin/singh_testing/awk_programming/testing_appending_file_date

# Empty the output file "test" if it already exists.
if [[ -e test ]]
then
        > test
fi

# Build the list of input files (oldest first), excluding the output file.
value=`ls -tr | grep -v "test"`
set -A array_value $value          # ksh array assignment

for i in ${array_value[@]}
do
        cat "$i" >> test
        echo "**************************" >> test
done
echo "Script has been completed now."
$

The output will be stored in the test file at /home/archsupp/bin/singh_testing/awk_programming/testing_appending_file_date.

$ cat test
this is first test file.
As per requirement.
**************************
This is the second file.
As per the R. Singh
lets see the appending process here.
**************************
$
 

So you can change the path value in the script and try the same; it should work properly.
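For readers adapting the script above: a shell glob avoids parsing ls output entirely. A minimal sketch of the same append loop, using a scratch directory and made-up filenames:

```shell
# Hypothetical setup: two report files in a temporary directory.
dir=$(mktemp -d)
printf 'this is first test file.\n' > "$dir/Testreport_Server1.txt"
printf 'This is the second file.\n' > "$dir/MonitoringReport_Server1.txt"

out="$dir/test"
: > "$out"                          # create/empty the output file
for f in "$dir"/*Server1.txt; do    # glob instead of `ls -ltr | awk`
    cat "$f" >> "$out"
    echo "**************************" >> "$out"
done
echo "Script has been completed now."
```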

Thanks,
R. Singh

Could you try the ls command posted above to check if the files get listed?

  • Have you checked the syslog facility?
    /etc/syslog.conf has a facility to specify a central server that updates the log automatically on most Unix systems. Which OS are you using?

Hi Kris,

I have tried the ls command as well, but it gives me the same issue.

No files are getting listed, and I get the "No such file or directory" error.

Hello Sharsour,

Could you please try my script? You need to change the path and try the same; it should work properly.

Thanks,
R. Singh

Thanks Singh,

I will run your script and get back.
Is there a single-line command to achieve this requirement?
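For what it's worth, the whole merge can indeed be a single cat with a glob, assuming the filenames end in Server1.txt. (The directory and output name below are illustrative, built in a scratch directory.)

```shell
# Scratch-directory demo of the one-line merge.
d=$(mktemp -d)
echo "report A" > "$d/Testreport_Server1.txt"
echo "report B" > "$d/MonitoringReport_Server1.txt"

cat "$d"/*Server1.txt > "$d/CentralReport_Server1"   # the single line
wc -l < "$d/CentralReport_Server1"                   # both lines merged
```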

Clearly it is an issue with the variable DR_HOME not being set. It might have been set in a different environment.

Ensure you use export to set the variable. Check if the scope of the variable allows you to access it when you issue the cat or ls command.
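The scope point can be seen directly: an unexported variable is visible in the current shell but invisible to child processes such as a script you run. (The path value below is just an example.)

```shell
unset DR_HOME
DR_HOME=/home/user/Test                # set, but not exported
sh -c 'echo "child sees: [$DR_HOME]"'  # child prints empty brackets

export DR_HOME                         # now inherited by child processes
sh -c 'echo "child sees: [$DR_HOME]"'  # child prints the path
```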

So now I have hardcoded the full path without the variable:

ls \home\username\Test\*Server1.txt

So do you mean this command should list all the files in the Test directory that contain Server1 in their name? But it is only showing "No such file or directory".

When I run

ls \home\username\Test\*.txt

it lists all the files with the .txt extension.

You have the slashes reversed. Try this.

ls /home/username/Test/*.txt

Sorry for the typo.

This command is working for me:

ls /home/username/Test/*.txt

But when I run your command, I get an error:

ls /home/username/Test/Server1*.txt

Let me reiterate my requirement: it should list all the files that have Server1 in their name, but it is giving the error.

You have the wildcard in the wrong place. You had mentioned that the files end with "Server1.txt".

So, try this

ls /home/username/Test/*Server1.txt

which is what I had suggested earlier in this thread (post #4).
