How to send data to previous program (pipe)?

Hi,
Suppose I have a command:

$ cmd1 | cmd2

I need to send a message from cmd2 to cmd1 when I receive a certain message from cmd1. How can I do this?
I think that I have to know cmd1's PID and then, in cmd2, send a message to that PID. How?

The second process, cmd2, will be able to read the standard output from cmd1. You should not need to know the PID.

It's all a bit theoretical at the moment. Can you explain what you really want to achieve?

Something like ls -l | more is a simple example of this. The output from ls -l (excluding errors) is passed to more, which displays it, pausing with a prompt when it believes it has written a screenful. The shell arranges for more to read its input from the output of ls rather than from the keyboard or whatever else standard input is defined as.

I'm sure we can explain this with a better example of what you are trying to do overall.

Kind regards,
Robin

Hi Robin,
JackK knows cmd2 can read data from cmd1. He wants to know how cmd1 can also read data from cmd2.

Hi JackK,
Send email to yourself, send a message to a message queue, write to a regular file, write to a FIFO file, send a signal, write some text into a shared memory segment... There are hundreds of ways to do this.
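For instance, a FIFO is about the simplest of these. A rough sketch (the path and the message text are made up purely for illustration):

# one-time setup: a named pipe both processes agree on (made-up path)
mkfifo /tmp/backchannel

# in cmd1: block until cmd2 sends something back, then act on it
read -r msg < /tmp/backchannel
echo "cmd1 received: $msg"

# in cmd2: when the trigger text is seen, write a message into the FIFO
echo "please change the prompt" > /tmp/backchannel

But which one is appropriate depends entirely on the answers to the questions below.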

What shell are you using?

What operating system are you using?

What is the format of this message you want to send?

What are these two processes doing besides sending data to each other, with a pipe being used to send messages between the two processes in one direction?

Is this a homework assignment? Homework and coursework questions can only be posted in the Homework & Coursework forum under special homework rules.

Please review the rules, which you agreed to when you registered, if you have not already done so.

If you did not post homework, please explain the company you work for and the nature of the problem you are working on.

If you did post homework in the main forums, please review the guidelines for posting homework and repost.


Hi Don Cragun,
This is not homework. I need this because there is no automatic way to have the PDB database name visible in the SQL*Plus sqlprompt. I saw a solution that uses a dedicated SQL function to change the current container, but this is not what I want to use.

I created a parse-function which I use like this:

$ sqlplus user@DB | parsefunc

This function looks for certain patterns in the input data and applies some text highlighting. I'd like to add an option to send a command to sqlplus:

set sqlprompt "_user'@'_connect_identifier::<PDBname> SQL> "

when the user executes a command similar to

This is to make the work in our team easier.

We are using RHEL. Shell is Bash. Company is IBM.

Maybe you should connect the two commands not via a (one-directional) pipe, but use something like "expect" to drive the SQL shell: Manpage of EXPECT

You would have to learn some Tcl, but the language is easy.

Hi rovf,
I think using Expect for this is unnecessary and hard. I have no idea at the moment how I could achieve this with Expect.
I am sure there is an easier way to fulfill my needs. There should be some way to get the PID of sqlplus (cmd1) inside parsefunc (cmd2) and send a string (command) to sqlplus's input using its PID, shouldn't there?

Not within a shell redirection, i.e. a pipe, if you want to retain the stdin from the terminal. You would need to rebuild both commands to enable them to read from and/or write to additional file descriptors, and create the channels (pipes?) for that.
How about saving the above pipe's result into a file and feeding that into cmd1 in another (new!) process?

You can achieve this using shell coprocesses.

There are examples on this forum and online.
I found one worked example by searching Google for sqlplus coprocess ksh.

Regards
Peasant.


There's one big pitfall with this: Which program is reading when? It will be very easy to end up in a deadlock with both programs waiting for the other.

So this is not nearly as simple as it appears, which is why Expect is so valuable when it's really needed.


I didn't know about coprocesses before. In my case, I think writing the parser with coprocesses is unnecessary and too complicated (for such a small requirement).

Is it possible to do a redirection to a process by its PID? For example:

$ echo "something" > PID#

Could you provide an example, please? It would be good to send a signal from cmd2 to cmd1, but how can I tell cmd1 to interpret the signal in the proper way (changing the sqlprompt)?

---------- Post updated at 01:57 PM ---------- Previous update was at 01:46 PM ----------

I found something. When I have sqlplus running and in another session I do:

# echo "TEst;" > /proc/<PIDofSQLPLUS>/fd/0

then in sqlplus I receive:

But the message is not interpreted by sqlplus at all (it is only printed), unfortunately.

---------- Post updated at 04:26 PM ---------- Previous update was at 01:57 PM ----------

According to my last update I found this (linux - sending command to process using /proc - Stack Overflow):

The examples you provided cannot be used for inter-process communication.
The only way I can think of now is shell coprocesses, using the shell alone as a programming tool.

As others mentioned, other languages/tools offer easier ways to achieve your goal, using various techniques which are not that 'transparent' to use in shell or are a great overhead to code.

You should think about switching to the database side completely (stored procedures, PL/SQL) to get the desired processing with minimal code investment and (probably) the best performance.
But this is just a wild guess, since the whole process is not known to me.

A rewrite and remodel is sometimes the best way to fix your program when the features required just cannot keep up.

Hope that helps
Regards
Peasant.

If you use the Korn shell you could use the so-called "Coprocess facility". Notice, though, that only the Korn shell supports these; the solution will not be portable to any other Bourne-shell descendant (and most probably not even to the many ksh clones which are only "mostly compatible"). Basically, you are starting an asynchronously running background job which communicates with the main job via (bidirectional) pipes.

You start the coprocess with

command |&

and then process the incoming/outgoing messages with read -p and print -p. By default, file descriptors 4 and 5 are used for this, but they can be redirected via exec if you have several coprocesses at the same time.
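To make the mechanics concrete, here is a minimal sketch (bc just stands in for whatever line-oriented interactive command you actually want to drive):

#!/bin/ksh
# Minimal ksh coprocess sketch; bc stands in for any line-oriented interactive command.

bc |&                     # start bc as a coprocess

print -p '2 + 3'          # write a line to the coprocess's stdin
read -p answer            # read one line back from the coprocess's stdout
print "bc answered: $answer"

print -p 'quit'           # ask the coprocess to terminate
wait                      # reap the background job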

I hope this helps.

bakunin

Once again, what you're asking for isn't exactly trivial.

You can communicate with the previous process with a named pipe, but this will be a bit complicated to set up and prone to deadlocks afterwards (who's waiting for whom?). With all those caveats, here is how you do it:

#!/bin/bash

exec 1>&3 # Save stdout into FD 3 if we need it
mkfifo /tmp/$$

echo "fifo is /tmp/$$"

(       # Process A.  stdout writes to pipe.
        exec < /tmp/$$ # Redirect stdin from named pipe

        for X in 1 2 3
        do
                read ASDF
                echo "string: $ASDF"
        done
) | (
        # Process B.  stdin reads from pipe.
        exec > /tmp/$$  # Redirect stdout into named pipe
        rm -f /tmp/$$

        for X in 1 2 3
        do
                echo $X
                read REPLY
                echo "Reply was '$REPLY'" >&3 # Write to old stdout
        done
)

The sticky part is opening the named pipe. That has to be done in different subshells because exec < namedpipe and exec > namedpipe must happen simultaneously: opening a FIFO blocks until it has both a reader and a writer, so if two processes aren't trying to open it at the same time, the open will force your program to wait. This, for example, will not work:

exec 5<namedpipe # Will wait forever since the line below it will not run
exec 6>namedpipe

$$ is just a convenient random-ish number.
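As an aside, on Linux (at least) you can dodge the blocking open in a single process by opening the FIFO read-write. POSIX leaves that behaviour unspecified, so treat this as a Linux-only sketch:

mkfifo /tmp/myfifo          # made-up path, just for illustration
exec 5<>/tmp/myfifo         # a read-write open returns immediately on Linux
echo hello >&5              # write into the FIFO...
read -u 5 line              # ...and read it straight back on the same descriptor
echo "$line"
exec 5>&-                   # close the descriptor when done
rm /tmp/myfifo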

Here is one example of using the presence of a file to act as a semaphore to allow a second process in a pipeline to notify the first process in that same pipeline that it has seen something written by that first process (and it does this twice). If more information needs to be transferred between these two processes, the 2nd process could write any amount of data it wants into the file instead of just creating the file. For this example, we have two shell scripts named cmd1 and cmd2.
The code to be placed in cmd1:

#!/bin/bash
FILE=file
FOUND=0

for ((LOOP=1; LOOP < 20; LOOP++))
do	if [ -f "$FILE" ]
	then	echo "$FILE found.  Incrementing FOUND."
		rm "$FILE"
		FOUND=$((FOUND + 1))
	fi
	echo "LOOP is $LOOP"
	echo "FOUND is $FOUND"
	sleep 5
done | tee /dev/tty

Note that the tee in this script allows you to see everything that is being written into the pipeline.

The code to be placed in cmd2:

#!/bin/bash
FILE=file
sleep 1
echo "Starting $0; looking for 'LOOP is 5'"

while read -r line
do	if [ "$line" = 'LOOP is 5' ]
	then	echo "'LOOP is 5' found; creating $FILE; looking for 'LOOP is 15'"
		> "$FILE"
	fi
	if [ "$line" = 'LOOP is 15' ]
	then	echo "'LOOP is 15' found; creating $FILE again"
		> "$FILE"
	fi
done
echo "$0 hit EOF; exiting"

When you run these two scripts in a pipeline using:

./cmd1 | ./cmd2

you will see output on your screen similar to:

LOOP is 1
FOUND is 0
Starting ./cmd2; looking for 'LOOP is 5'
LOOP is 2
FOUND is 0
LOOP is 3
FOUND is 0
LOOP is 4
FOUND is 0
LOOP is 5
FOUND is 0
'LOOP is 5' found; creating file; looking for 'LOOP is 15'
file found.  Incrementing FOUND.
LOOP is 6
FOUND is 1
LOOP is 7
FOUND is 1
LOOP is 8
FOUND is 1
LOOP is 9
FOUND is 1
LOOP is 10
FOUND is 1
LOOP is 11
FOUND is 1
LOOP is 12
FOUND is 1
LOOP is 13
FOUND is 1
LOOP is 14
FOUND is 1
LOOP is 15
FOUND is 1
'LOOP is 15' found; creating file again
file found.  Incrementing FOUND.
LOOP is 16
FOUND is 2
LOOP is 17
FOUND is 2
LOOP is 18
FOUND is 2
LOOP is 19
FOUND is 2
./cmd2 hit EOF; exiting

In your cmd1 script, you can run whatever commands you want to run when the semaphore file is found to alter the sqlplus prompt.

I have an error:

---------- Post updated at 03:52 PM ---------- Previous update was at 03:12 PM ----------

Hello, Don Cragun.
Your "example of using the presence of a file to act as a semaphore..." is ok. However, I need to interact with sqlplus in a normal way (from its inner prompt) also => interactively.
Do you have any idea how may I start sqlplus to instruct it to read an input from some file when the file is not empty and allow the user to do normal commands from prompt like below?

SQL*Plus: Release 11.2.0.4.0 Production on Wed Feb 14 13:45:42 2018

Copyright (c) 1982, 2013, Oracle.  All rights reserved.

Connected to an idle instance.
SQL>

In your example we can create cmd1 to behave exactly as we wish. This is not the case when sqlplus takes the place of cmd1.

Oracle's utl_file package allows file I/O. You have to use PL/SQL as part of your script. Note that the DBA has to grant access to the directory by creating a directory object for you - this is part of Oracle security. In older Oracle systems (Oracle 9) there is usually a default object called TMP which points to the /tmp directory.

I've been away from this thread for a while.

Would it be possible to keep this sort of command, still cmd1 | cmd2, but have the sqlplus in cmd2? Then cmd1 would need to be reading a file or pipe that cmd2 writes to.

An alternative might be something like cmd1 < <(cmd2), this time with the sqlplus in cmd1. This is a bash way rather than ksh.

Can you tell us more about the output that the sqlplus will create and how/why it must then take in new input depending on what it throws out? It might be better to set it up as a set of separate calls (rough sketch after the list below), i.e.

  • script calls sqlplus call to generate some output into work-file1 and then exit
  • script reads work-file1 and calls sqlplus with appropriate input, writing work-file2
  • script reads work-file2 and calls sqlplus with appropriate input, writing work-file3

.... and so on until you have the desired output.
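Something along these lines, purely as a rough sketch (the script names, the work-file names, and the parsing step are all made up):

#!/bin/bash
# Illustrative only: drive sqlplus in separate steps via work files.

sqlplus -s user@DB @step1.sql > work-file1               # first call writes its output to a work file

pdb=$(grep -o 'PDB[[:alnum:]_]*' work-file1 | head -1)   # made-up parsing of that output

sqlplus -s user@DB > work-file2 <<EOF                    # next call driven by what was found
set sqlprompt "_user'@'_connect_identifier::${pdb} SQL> "
-- ... whatever else this step needs ...
EOF

# ... and so on with work-file2, work-file3, until you have what you need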

Does anyone care to shoot me down to say that these are all bad approaches? I'd be happy to illustrate how not to do it if that's useful to the thread.

I'm just a bit lost about why the sqlplus needs to have input based on its output. Have I missed the point?

Robin

I apologize, that should have been:

exec 3<&1

Both examples given by Don Cragun and Corona688 are working for me, and I understand the idea behind them.
However, both of those examples rely on being able to create cmd1 and cmd2 from scratch, so that the stdin/stdout redirection (Corona688's example) or the test for the file's existence (Don Cragun's example) can be done.
This is, however, not possible when I have sqlplus instead of cmd1. Am I right? If yes, then my problem is still unsolved.

You have said that in your case, cmd1 is an interactive sqlplus script that you control. You can make that sqlplus script recognize that something in its environment has changed and use recognition of that change to do whatever it is that you want to do. If, on the other hand, your cmd1 is not an interactive sqlplus script, or if you can't make that script recognize that something has changed in the outside world and make adjustments based on that change, then you can't do what you say you want to do, and there is nothing we can do to help you make that happen (other than the things we have already suggested that you have said you don't want to use).