Solaris redirection

Hi

I am using Solaris 10. When running a grep command with multiple files, the output is in the same order as the input. As soon as I pipe the output to another command, standard error seems to take precedence over standard output and gets sent to the pipe first.

i.e. grep -c 'findme' file1 nofile file2

would give output in this order:
file1:0
grep: can't open nofile
file2:0

However, if I do grep -c 'findme' file1 nofile file2 | cat

I get
grep: can't open nofile
file1:0
file2:0

Using grep -c 'findme' file1 nofile file2 2>&1 | cat

I am trying to make the output order stay the same as the input order. The above works in Linux, but on Solaris stderr is still output first. Can anyone help me with this?

thanks

You first need to define what you mean by "output order".

Stdout and stderr are two distinct streams of data. In general there is no concept of order between the two.

That's by design. stdout is fully buffered: a lot of data is collected in a large buffer and, when it is full, a single write occurs. This is for efficiency, since the write system call is expensive. stderr, on the other hand, is unbuffered (or at most line buffered): as soon as output is available it is displayed. The idea is that stderr should be rarely used, but when it is used it must be displayed immediately.

The only portable way to solve your problem is to write a script that loops over the input files and invokes grep once for each file.
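A minimal sketch of that loop, using the filenames from the question. The /dev/null argument is a common trick to force grep into multi-file mode so it keeps the "filename:count" prefix even with one real file; the sed filter then drops the /dev/null count line. The setup lines just make the sketch self-contained.

```shell
#!/bin/sh
# Demo setup (mirrors the question): two empty, readable files,
# and "nofile" deliberately missing.
cd "$(mktemp -d)" || exit 1
: > file1
: > file2

# One grep invocation per file: each process writes and exits before
# the next starts, so the lines reach the pipe in input order.
# /dev/null forces the "filename:count" prefix; sed drops its count line.
for f in file1 nofile file2
do
    grep -c 'findme' "$f" /dev/null 2>&1
done | sed '/^\/dev\/null:/d'
```

The exact wording of the error line differs between greps (Solaris grep says "can't open nofile"), but the ordering no longer depends on how any single grep buffers its output.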

---edit---
Well I guess it's not the only way... you could use expect for instance. There's always another way to skin a cat.

grep -c 'findme' file1 nofile file2  2>&1 | cat

should (and does for me) preserve order on Solaris too. What shell are you using, on what Solaris release?

The shell is irrelevant but I will try ksh, sh, and bash...

$ uname -a
SunOS geo-support1 5.10 Generic_127111-10 sun4u sparc SUNW,Sun-Blade-1000
$ ksh
$ grep -c 'findme' file1 nofile file1 2>&1
file1:0
grep: can't open nofile
file1:0
$ grep -c 'findme' file1 nofile file1 2>&1 | cat
grep: can't open nofile
file1:0
file1:0
$
$
$ ^D
$ sh
$ grep -c 'findme' file1 nofile file1 2>&1 | cat
grep: can't open nofile
file1:0
file1:0
$ $
$ bash
bash-3.00$ grep -c 'findme' file1 nofile file1 2>&1 | cat
grep: can't open nofile
file1:0
file1:0
bash-3.00$

The buffering I talked about is internal to grep. Switching shells has no influence. If it works for you, you must be using a different grep. The behavior of your grep is not guaranteed by any standard and it is not wise to write scripts that depend on it.

Thanks for the responses.

SunOS 5.10 - Bash - It does seem to have obsolete utils though, such as a less that does not seem to know the difference between a real newline and a line wrapped on the console window. It looks as though I will need to use a loop. Perderabo's explanation makes sense; it does seem strange, though, that 2>&1 would work on some systems and not others.

Didn't see the above response; as you say, it must be the version of grep, which incidentally does correctly recognise a newline.

You are right, the shell is not to blame here. I was fooled by Solaris 11 Express's default PATH having the GNU tools first.
On Solaris 10, to get the expected behavior, you can use:

/usr/sfw/bin/ggrep -c 'findme' file1 nofile file1 2>&1 | cat

The root cause is that stdout is line buffered when the output is a terminal but block buffered when it goes to a pipe or a file, while stderr is never fully buffered.
GNU grep also block-buffers stdout in the latter case, but it flushes its standard output stream after each processed file, which is why the order is preserved.
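For completeness, GNU grep also has a --line-buffered option, which forces a flush after every output line even when stdout goes to a pipe (at some performance cost). A small self-contained sketch; plain grep here is assumed to be a GNU grep and stands in for the /usr/sfw/bin/ggrep mentioned above:

```shell
#!/bin/sh
# Assumes a GNU grep (stand-in for /usr/sfw/bin/ggrep on Solaris 10).
# --line-buffered flushes stdout after every line, so the counts and
# the error interleave in input order even through a pipe.
cd "$(mktemp -d)" || exit 1
: > file1
: > file2            # "nofile" is deliberately missing
grep --line-buffered -c 'findme' file1 nofile file2 2>&1 | cat
```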

---------- Post updated at 14:38 ---------- Previous update was at 11:20 ----------

This is unrelated. You might want to start a new thread and elaborate a little bit about these newlines issues.

Thanks, jlliagre, for your elaboration; this has given me some useful insight into grep. I opened a thread on the less issue a while ago and concluded that it was just due to an obsolete version of less/more/cat, whereas grep/head/tail all properly recognise newlines.