Getting 'Killed' msg

Can anyone explain?
I start my Unix session on AIX, run tcsh, move to a particular directory, let's say: cd /test/bin, and next I run a command like:
grep "test string" /test/bin/*
to look for the string in any files in the directory.
I am getting a response of Killed.

Why is that happening?

Thanks!

No idea,
but you could run the command with truss, strace or something similar.
Does it happen only when grepping in a certain folder?

The 'funny' thing is that I am using Reflection X, and if I connect from a coworker's computer the command executes fine, yet if I make the same connection and run the same command from mine I get 'Killed'. I cannot figure out how that could be the case.
Thanks.

You may be exceeding memory or CPU time limits for your account, which may be different for his account.
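You can compare the limits yourself. From sh or ksh, ulimit prints the current limits (output varies by system):
ulimit -a
From csh or tcsh, the equivalent builtin is:
limit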

Yes it only happens in a particular folder.

Try 'which grep'. You should be using the correct grep executable, not some junk file named grep that happens to have execute permissions.

I assume grep is matching in a binary file; the data is printed and some special characters are causing the kill.
Try: grep .... | strings
It should work if my assumption is true.
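For example, with the original command that would be something like:
grep "test string" /test/bin/* | strings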

I am leaning toward the memory limit. In fact, when I start .... at the $ prompt the command works; then if I run tcsh and get into the C shell, the same command returns Killed.
How can I confirm my hypothesis?
Can I see the memory that's available in each shell?
Thanks!

tcsh may be lowering your allowable memory. Try running tcsh and then type sh to get a new copy of sh. This new sh will be an offspring of tcsh and will inherit any new memory limitations from it. Try the command with this copy of sh.
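For example, a session might look like this (prompts are just illustrative):
$ tcsh
> sh
$ ulimit -a
If the limits this inner sh reports are lower than those in your login shell, tcsh is the one tightening them.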

How do you invoke tcsh? What shell do you start with? If you start with sh and then just do either "tcsh" or "exec tcsh" to switch to tcsh, a file called $HOME/.cshrc will be run, and it could contain limit commands (tcsh's equivalent of ulimit) that lower your memory parameters. If you log in as a user whose shell is tcsh, then you also need to review .login for limit commands.
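For instance, a .cshrc could contain lines like these (hypothetical values):
limit datasize 64m
limit stacksize 8m
Any command launched from that tcsh inherits the reduced limits.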

Did you try this? It's easy to do and it will give you additional information.

An even simpler test is to use grep -l .... That is, use the dash-ell switch. It tells grep not to print anything it finds, but only to list the files that match.

Another thing to do would be to loop through the files and see which file actually causes the problem. Maybe something like this:

for f in /test/bin/*; do echo "$f"; grep "test string" "$f"; done

I start from the Korn shell; at that level the grep command
grep -l "test" /TEST/bin/*
works fine. If I then get into tcsh, the same grep command
grep -l "test" /TEST/bin/*
gives me the Killed response.

There is no ulimit in my .cshrc; maybe it is in the /etc/security/limits file (to which I do not have permission).

I can execute this command from a ksh shell spawned from inside the C shell with no problem:
for j in /TEST/bin/*; do grep -l "test" "$j"; done
This just walks through the files one at a time and executes grep on each.
So there are no 'funny' characters in any of them, I suppose.
Is there any way I can actually see what memory I have left before executing the commands in Unix?
Thanks!

How many files?

So you can run grep in sh, ksh, and csh, but not tcsh?

Try renaming your .cshrc out of the way so it's not used, and see what happens.

See if you have a .tcshrc file.

Thanks.

from tcsh, try:
/bin/echo /TEST/bin/*

I'm thinking that your tcsh may have a bug where it manages to kill itself if it builds too large a command line.

It would help to know how much data we are looking at, I think. Maybe tcsh can't handle over 2 GB? Just a thought; this is an old problem.

When I execute this
/bin/echo /TEST/bin/*
I still get .... Killed.
If I exit tcsh and, back at the $ prompt in the Korn shell, type:
/bin/echo /TEST/bin/* | wc
these are the counts I get:
1 786 22273
Maybe this is it; is ~22K too big for the tcsh command line?
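You can ask the system for its exec argument-size limit with getconf:
getconf ARG_MAX
~22K is usually well under ARG_MAX, and exceeding that limit would normally produce 'Arg list too long' rather than Killed, which points back at tcsh itself or at a memory limit.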

Can you also advise me of a way to only return and process files in a particular directory, i.e. /TEST/bin, with a command in this format?
find /TEST/bin -name "*" -type f | xargs grep -l "search string"
I am trying to use find to avoid the 'Arg list too long' error which would get generated by ls.
I tried putting -prune in the command with no success; the complexity might reside in the fact that I want to execute the find command regardless of where I am (my pwd value) at the time of execution.
Let me know if you have suggestions.

Thanks.

Since the shell is your problem, switch to a better shell like bash or ksh. Mark my words: if you stay with tcsh/csh you will be fighting problem after problem. Maybe:
ls /TEST/bin | xargs grep -l "test"
or
find /TEST/bin ! -name bin -type d -prune -o -type f -print | xargs grep -l "test"
will work with your shell. It's hard to tell you what commands to use when your shell is known to be broken.
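If your find supports it (GNU find does; an older AIX find may not), -maxdepth is a simpler way to stay in a single directory:
find /TEST/bin -maxdepth 1 -type f | xargs grep -l "test"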

Sounds like xargs is your ticket. But if you're stubborn, try this:

Split up the parameter space into n different invocations. For instance:

/bin/echo /TEST/bin/[A-Ma-m]*
/bin/echo /TEST/bin/[N-Zn-z]*
/bin/echo /TEST/bin/[^A-Za-z]*
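If you go the xargs route instead, its -n option caps how many file names are handed to each grep invocation, so the splitting happens automatically:
ls /TEST/bin | xargs -n 100 grep -l "test"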