Multiple co-processor file descriptors

I have a script that creates a KSH co-process for Oracle sqlplus and I am presently interacting with it via print -p and read -p.

I also need to interact with another Oracle database that isn't permitted to have any direct connection to the first. At present I simply disconnect from the first database, connect to the second, and reconnect to the first when I am finished, again using print -p and read -p.

I would like to enhance this a bit using a second co-process. I know how to create the second sqlplus co-process by first redirecting the first co-process's stdout and stdin using exec fd>&p and exec fd<&p; then I can create the second co-process and interact with both.

My goal is to write a db_connect function that can determine that a co-process has already been started, so that a new co-process with new file descriptors needs to be created. Along with this, I would like to dynamically determine which co-process to communicate with using generalized functions.

Any thoughts? Can the open file descriptors be tested? The version of KSH is KSH-88 I believe.

Thomas

The only thing I can think of is to create a global variable and increment it for each co-process. The only test on an fd that I know of is
[ -t $fd ]
which checks whether the fd is a tty.
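Beyond [ -t ], an fd can be probed for plain openness by attempting a no-op dup in a throwaway subshell; a sketch in POSIX sh (the helper name fd_is_open and the fd numbers are made up for the demo):

```shell
#!/bin/sh
# Hypothetical helper: test whether fd $1 is open by duplicating it
# onto stdout of a subshell; the dup fails if the fd is closed.
fd_is_open() {
    ( : >&"$1" ) 2>/dev/null
}

exec 5>/dev/null                      # open fd 5 for the demo
fd_is_open 5 && echo "fd 5 is open"
fd_is_open 9 || echo "fd 9 is closed"
```

Combined with the global counter, db_connect could probe the next candidate fd before handing it out.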

Thanks for the suggestion with regards to using a global variable. I am not opposed to doing this. I hope that it doesn't get lost on the poor DBAs here who will have to maintain these scripts when I eventually have to move on.

I started working on a proof of concept and hit an unexpected snag using eval. My thought is that I would pass a handle (just a string really) to my database functions and build variables based on the handle. When I try to redirect stdout and stdin to some open file descriptors, I get an error.

#!/usr/bin/ksh

sqlplus -s /nolog |&

x=TESTVAR

eval export ${x}_stdout=3
eval export ${x}_stdin=4

eval print $TESTVAR_stdout
eval print $TESTVAR_stdin

eval exec ${TESTVAR_stdout}>&p
eval exec ${TESTVAR_stdin}<&p

eval print -u $TESTVAR_stdout "PROMPT Hello"
eval read -u $TESTVAR_stdin LINE

print $LINE

eval exec ${TESTVAR_stdout}>&p <==== results in "t[13]: 3: not found"

The previous lines work fine, so I am scratching my head over why TESTVAR_stdout isn't expanding properly.

What is more perplexing: if I manually execute the lines with numeric literals, it works. If I manually execute the lines using the variable and no "eval", KSH indicates that I have running jobs for stdout and closes my session for stdin.

If I can get past this hurdle, I can implement the concept easily.

Thomas
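For what it's worth, the "3: not found" error can be explained by parse order; a sketch in POSIX sh, with a throwaway file standing in for the co-process:

```shell
#!/bin/sh
# Why the unquoted form fails: the shell splits off redirection
# operators when the line is first parsed, before ${fd} expands, so
#     eval exec ${fd}>&p
# is parsed as   eval exec ${fd}    with the >&p redirection applied
# to eval itself. eval then runs `exec 3`, i.e. it searches PATH for
# a command named "3" -- hence "3: not found".
fd=3

( eval exec ${fd}>/dev/null ) 2>/dev/null
echo "unquoted exit status: $?"        # nonzero: no command named "3"

# Quoted, eval receives the whole string and parses the redirection:
( eval "exec ${fd}>/dev/null" )
echo "quoted exit status: $?"          # 0: fd 3 opened as intended
```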

I may have figured out my problem so I am posting an update if anyone is interested.

#!/usr/bin/ksh

sqlplus -s /nolog |&

x=TESTVAR

eval export ${x}_stdout=3
eval export ${x}_stdin=4

eval print $TESTVAR_stdout
eval print $TESTVAR_stdin

eval "exec ${TESTVAR_stdout}>&p"
eval "exec ${TESTVAR_stdin}<&p"

eval "print -u${TESTVAR_stdout} \"PROMPT Hello\""
eval "read -u${TESTVAR_stdin} LINE"

print "From sqlplus: $LINE"

My "eval" expressions needed to be inside quotation marks.

Thomas

Sorry, I am not trying to bump up my post, but I wanted to retract my previous post, as the solution only works when I hard-code the variable names. I realized that I wasn't actually referencing my dynamic variable "x". I am still scratching my head...

Thomas

Sorry again, I am not trying to bump anything up, but I have narrowed this thread's scope to the core "eval" issue that is eluding me; if I can get past it, my problem is solved.

Essentially, my "x" variable should translate into another variable containing my file descriptor. If I can get "x" to expand into TESTVAR2_stdout, and then expand that so the fd can be assigned to a local function variable, I am golden.

#!/usr/bin/ksh

sqlplus -s /nolog |&

x=TESTVAR2

eval export "${x}_stdout=3"
eval export "${x}_stdin=4"

# Print using named variables
print $TESTVAR2_stdout
print $TESTVAR2_stdin

# Same thing attempting expansion of my "x" variable;
# prints the literal string TESTVAR2_stdout, not the fd
eval "print ${x}_stdout"
# results in print $TESTVAR2_stdout
print $(expr $"${x}_stdout")

I get down to something that looks like $TESTVAR2_stdout but I cannot get the shell to treat it as a variable.

Thomas

#!/bin/ksh

x=TESTVAR2

eval export "${x}_stdout=3"

print $TESTVAR2_stdout

# eval receives "print $TESTVAR2_stdout" and prints 3
eval "print \$${x}_stdout"

vger99 has the syntax. But the direction you are taking seems to be wrong. An fd is specific to a process, so exporting a variable containing an fd is senseless. In your first post, when you said "function", I assumed you meant "function". A function is a ksh concept, and it can have locally scoped variables. A global variable is global to the script, not to the environment.

First of all, vger99, thank you. This will solve the problem that I was struggling with. I was probably wrapping myself around my own axles pretty tight, because I did double up the "$" symbols; however, I was escaping the second rather than the first. Thanks again.
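For the record, the two halves of the idiom side by side, in portable sh with echo standing in for ksh's print (the variable names are just for illustration):

```shell
#!/bin/sh
x=TESTVAR2

# Write through the computed name: one round of expansion is enough.
eval "${x}_stdout=3"

# Read through the computed name: escape the first $ so that eval,
# not the initial parse, performs the final expansion.
eval "fd=\$${x}_stdout"

echo "$fd"        # -> 3
```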

Secondly, Perderabo, I don't really prefer the direction I am taking either, but considering that I am trying to fit this into my existing library of functions, I think it may be acceptable. I am also considering an array (three arrays perhaps, because of KSH-88) that will help me find the fds.

I'll explain the concept further to clear it up; perhaps you have a better suggestion. Judging from the examples that I have seen with your name attached, I am sure you can come up with something more elegant.

a. I have existing functions to communicate to my co-process, lets call them func_write and func_read.
b. Within these functions, I currently communicate to my co-process with print -p and read -p.
c. I want to eliminate the need to disconnect from my current database so I can communicate with a new database for some sporadic lookups.
d. I thought that I could create two co-processes (hence requiring the I/O redirection) and define file descriptors for each co-process.
e. Therefore, to keep my current functions generic, I thought that I would supply a database descriptor (handle) to each function (e.g. func_write "something" database_handle_1)
f. Within the function, I would translate "database_handle_1" into a variable that was previously exported so that I can retrieve the correct fd and then use print -u${database_handle_1_stdout}

Arrays would give me the same thing, but referencing dynamic variables seemed logical when I started thinking through how to implement this.
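Item f can be sketched in portable sh; a plain file stands in for the co-process pipe, and func_write, DB1, and the temp path are all made-up names:

```shell
#!/bin/sh
# Hypothetical generic writer: translate a handle into the fd stored
# in the global variable "<handle>_stdout", then write to that fd.
func_write() {
    _msg=$1 _handle=$2
    eval "_fd=\$${_handle}_stdout"       # handle -> fd lookup
    printf '%s\n' "$_msg" >&"$_fd"
}

tmp=/tmp/db1.$$
DB1_stdout=5                             # would be set by db_connect
exec 5>"$tmp"                            # stands in for: exec 5>&p

func_write "SHOW USER" DB1
exec 5>&-                                # close before reading back
cat "$tmp"                               # -> SHOW USER
rm -f "$tmp"
```

func_read would do the mirror-image lookup on "<handle>_stdin" and use read -u.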

Thanks again,

Thomas

In item "f", why do you suddenly start talking about exported variables? export makes a variable available to child processes, and those children will have their own fds. Take a look at this:

#! /usr/bin/ksh

X=2

function one
{
        typeset X
        X=7
        echo in one X = $X
        return
}

function two
{
        X=9
        echo in two X = $X
        return
}
echo before one X = $X
one
echo before two X = $X
two
echo final X = $X
exit 0

Note that there is no "export". However, there is a global variable called X. The function called "two" uses this global variable. The function called "one", while it uses a variable called X, uses a local X, not the global X.

When I run it, I get:
before one X = 2
in one X = 7
before two X = 2
in two X = 9
final X = 9

Thanks for pointing that out. I understand variable scope with regard to functions, and exporting the variables is overkill; you are correct. I really want global variables; my thought process was clouded by my "eval" issues.

I have a (now) working proof of concept but I would still like to hear your thoughts on improving the concept.

#!/usr/bin/ksh

function connect
{
    typeset x=$1
    typeset o=$2
    typeset i=$3

    printf "\nFunction connect()\n"

    sqlplus -s /nolog |&

    eval "${x}_stdout=${o}"
    eval "${x}_stdin=${i}"

    STDOUT=$(eval "print \$${x}_stdout")
    STDIN=$(eval "print \$${x}_stdin")

    eval "exec ${STDOUT}>&p"
    eval "exec ${STDIN}<&p"

    if [ ${x} = "TEST1" ]
    then
        eval "print -u${STDOUT} \"connect USER_A/PWD_A@DB1\""
    else
        eval "print -u${STDOUT} \"connect USER_B/PWD_B@DB2\""
    fi
    eval "read -u${STDIN} LINE"
    print "From sqlplus: $LINE"
}

function test_comm
{
    typeset x=$1

    printf "\nFunction test_comm()\n"

    STDOUT=$(eval "print \$${x}_stdout")
    STDIN=$(eval "print \$${x}_stdin")

    eval "print -u${STDOUT} \"SHOW USER\""
    eval "read -u${STDIN} LINE"
    print "From sqlplus: $LINE"
}

ps -ef | grep sqlplus | grep -v grep
connect TEST1 3 4
connect TEST2 5 6

ps -ef | grep sqlplus | grep -v grep
test_comm TEST1
test_comm TEST2
test_comm TEST1
test_comm TEST2

Now here are the results:

Function connect()
From sqlplus: Connected.

Function connect()
From sqlplus: Connected.

me 8118 8106 0 13:51:00 pts/6 0:00 sqlplus -s /nolog
me 8113 8106 0 13:51:00 pts/6 0:00 sqlplus -s /nolog

Function test_comm()
From sqlplus: USER is "USER_A"

Function test_comm()
From sqlplus: USER is "USER_B"

Function test_comm()
From sqlplus: USER is "USER_A"

Function test_comm()
From sqlplus: USER is "USER_B"

The results are what I want and I will use some other kind of global variable to track and define fds so "o" and "i" in connect() won't really exist when implemented.
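One way to retire the hard-coded "o" and "i" arguments is a tiny allocator kept in a global counter (a sketch; the names NEXT_FD and alloc_fd are invented, and nothing here verifies that the handed-out fds are actually free):

```shell
#!/bin/sh
# Hypothetical fd allocator: hand out fd numbers 3, 4, 5, ... in
# order, storing each result in the variable named by $1.
NEXT_FD=3
alloc_fd() {
    eval "$1=\$NEXT_FD"
    NEXT_FD=$((NEXT_FD + 1))
}

alloc_fd OUT_FD
alloc_fd IN_FD
echo "$OUT_FD $IN_FD"       # -> 3 4
```

connect() could then call alloc_fd twice itself, so callers only ever pass the handle.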

Thanks again,

Thomas