I have a script that creates a KSH co-process for Oracle sqlplus and I am presently interacting with it via print -p and read -p.
I also need to interact with another Oracle database that isn't permitted to have any direct connection to the first. Presently, I simply disconnect from the first database, connect to the second database, and reconnect to the first database when I am finished, again using print -p and read -p.
I would like to enhance this a bit using a second co-process. I know how to create the second sqlplus co-process by first redirecting stdout and stdin for the first co-process using exec fd>&p and exec fd<&p. Then I can create the second co-process and interact with both.
My desire is to write a db_connect function that is capable of determining that a co-process has already been started and a new co-process with new file descriptors needs to be started. Along with this, I would like to dynamically determine which co-process to communicate to using generalized functions.
Any thoughts? Can the open file descriptors be tested? The version of KSH is KSH-88 I believe.
The only thing I can think of is to create a global variable and increment it for each co-process. The test on a fd that I know of is
[ -t $fd ]
which checks for being a tty.
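You can, however, probe whether an arbitrary fd is open by trying to duplicate it inside a subshell; the dup fails if the descriptor is closed. A small sketch (the function name is my own invention):

```shell
#!/usr/bin/ksh
# Returns 0 if the fd in $1 is open for writing, non-zero otherwise.
# The dup is done in a subshell so the caller's fds are untouched.
fd_is_open()
{
    ( eval "exec 9>&$1" ) 2>/dev/null
}

exec 3>/dev/null                     # open fd 3 for the demonstration
fd_is_open 3 && print "fd 3 is open"
fd_is_open 7 || print "fd 7 is closed"
```

For descriptors opened for reading, test with `exec 9<&$1` instead.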
Thanks for the suggestion with regards to using a global variable. I am not opposed to doing this. I hope that it doesn't get lost on the poor DBAs here who will have to maintain these scripts when I eventually have to move on.
I started working on a proof of concept and hit an unexpected snag using eval. My thought is that I would pass a handle (just a string really) to my database functions and build variables based on the handle. When I try to redirect stdout and stdin to some open file descriptors, I get an error.
eval exec ${TESTVAR_stdout}>&p <==== results in "t[13]: 3: not found"
The previous lines work fine, I am scratching my head on why TESTVAR_stdout isn't expanding properly.
What is more perplexing is that if I manually execute the lines with numeric literals, it works. If I manually execute the lines using variables and no "eval", KSH indicates that I have running jobs for stdout and closes my session for stdin.
If I can get past this hurdle, I can implement the concept easily.
Sorry, I am not trying to bump up my post, but I wanted to retract my previous post, as the solution only works when I use the variable names literally. I realized that I wasn't referencing my dynamic variable "x". I am still scratching my head...
Sorry, again, I am not trying to bump anything up. I have a core issue with "eval" that is eluding me, and I am trying to narrow this thread's scope to focus on it; if I can get past this, my problem is solved.
Essentially my "x" variable should be able to translate into another variable containing my file descriptor. If I can get "x" to expand into TESTVAR2_stdout and then expand this so that "4" can be assigned to a local function variable, I am golden.
#!/usr/bin/ksh
sqlplus -s /nolog |&
x=TESTVAR2
eval export "${x}_stdout=3"
eval export "${x}_stdin=4"
# Print using named variables
print $TESTVAR2_stdout
print $TESTVAR2_stdin
# Same thing attempting expansion of my "x" variable;
# prints the name TESTVAR2_stdout, but not the value 4
eval "print ${x}_stdout"
# prints the literal string $TESTVAR2_stdout
print $(expr $"${x}_stdout")
I get down to something that looks like $TESTVAR2_stdout but I cannot get the shell to treat it as a variable.
vger99 has the syntax. But the direction you are taking seems to be wrong. An fd is specific to a process. Exporting a variable containing an fd is senseless. In your first post, when you said "function", I assumed you meant "function". A function is a ksh concept, and it can have locally scoped variables. A global variable is global to the script, not the environment.
First of all, vger99, thank you. This will solve the problem that I was struggling with. I was probably wrapping myself around my own axle pretty tightly, because I did double up the "$" symbols; however, I was escaping the second rather than the first. Thanks again.
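For anyone who finds this thread later, the working escaping looks like this; the first "$" is the one that must be escaped so that eval's second pass sees $TESTVAR2_stdout (the values 3 and 4 are illustrative, not real fds):

```shell
#!/usr/bin/ksh
x=TESTVAR2
eval "${x}_stdout=3"           # creates TESTVAR2_stdout=3
eval "${x}_stdin=4"            # creates TESTVAR2_stdin=4

eval "print \$${x}_stdout"     # eval sees: print $TESTVAR2_stdout  -> 3
eval "fd=\$${x}_stdin"         # assigns 4 to a plain variable
print "$fd"                    # -> 4
```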
Secondly, Perderabo, I don't really prefer the direction I am taking either but, considering the fact that I am trying to fit this into my existing library of functions, I think that this may be acceptable. I am also considering an array (three arrays perhaps because of KSH-88) that will help me find the fds.
I'll explain the concept further to clear it up; perhaps you have a better suggestion. Judging from the examples that I have seen with your name attached, I am sure you can come up with something more elegant.
a. I have existing functions to communicate with my co-process; let's call them func_write and func_read.
b. Within these functions, I currently communicate to my co-process with print -p and read -p.
c. I want to eliminate the need to disconnect from my current database so I can communicate with a new database for some sporadic lookups.
d. I thought that I could create two co-processes (hence requiring the I/O redirection) and define file descriptors for each co-process.
e. Therefore, to keep my current functions generic, I thought that I would supply a database descriptor (handle) to each function (e.g. func_write "something" database_handle_1)
f. Within the function, I would translate "database_handle_1" into a variable that was previously exported so that I can retrieve the correct fd and then use print -u${database_handle_1_stdout}
Arrays will give me the same thing, but referencing dynamic variables seemed logical when I started thinking through how to implement this.
In item "f", why do you suddenly start talking about exported variables? export makes a variable available to child processes. Those children will have their own fds. Take a look at this:
#! /usr/bin/ksh
X=2
function one
{
typeset X
X=7
echo in one X = $X
return
}
function two
{
X=9
echo in two X = $X
return
}
echo before one X = $X
one
echo before two X = $X
two
echo final X = $X
exit 0
Note that there is no "export". However, there is a global variable called X. The function called "two" uses this global variable. The function called "one" also uses a variable called X, but it is a local X, not the global X.
When I run it, I get:
before one X = 2
in one X = 7
before two X = 2
in two X = 9
final X = 9
Thanks for pointing that out. I understand the scope of the variables with regards to the functions and exporting the variables is overkill; you are correct. I really want global variables; my thought process was clouded by my "eval" issues.
I have a (now) working proof of concept but I would still like to hear your thoughts on improving the concept.
me 8118 8106 0 13:51:00 pts/6 0:00 sqlplus -s /nolog
me 8113 8106 0 13:51:00 pts/6 0:00 sqlplus -s /nolog
Function test_comm()
From sqlplus: USER is "USER_A"
Function test_comm()
From sqlplus: USER is "USER_B"
Function test_comm()
From sqlplus: USER is "USER_A"
Function test_comm()
From sqlplus: USER is "USER_B"
The results are what I want, and I will use some other kind of global variable to track and define fds, so "o" and "i" in connect() won't really exist when implemented.
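For completeness, the connect routine I am converging on looks roughly like this; a global counter hands out fd pairs, `cat` stands in for sqlplus so the sketch is self-contained, and the handle variables are globals, not exports:

```shell
#!/usr/bin/ksh
DB_NEXT_FD=3                       # next free descriptor; single digits only

function db_connect                # usage: db_connect HANDLE
{
    typeset o i
    cat |&                         # sqlplus -s /nolog |& in real life
    o=$DB_NEXT_FD
    i=$((DB_NEXT_FD + 1))
    eval "exec $o>&p"              # save the write side
    eval "exec $i<&p"              # save the read side
    eval "${1}_stdout=$o ; ${1}_stdin=$i"
    DB_NEXT_FD=$((DB_NEXT_FD + 2))
}

db_connect DB_A                    # gets fds 3/4
db_connect DB_B                    # gets fds 5/6
eval "print -u\$DB_A_stdout hello-A"
eval "read -u\$DB_A_stdin line"
print "$line"
```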