Stress testing PHP files at the Unix/Linux command line

Hi,

Your help would be very much appreciated. I am looking for a Unix command or tool for stress/load testing PHP files at the command prompt.
I tried torture.pl, but it stops working beyond 20 concurrent threads/users.

As this is very urgent for me, please suggest your ideas as soon as possible.

thanks

If you have GNU xargs, you can use it to run many processes in parallel with the -P argument.

echo "a b c d e" | xargs --max-args=1 -P 10 echo

You could use this to run many PHP tasks in parallel.
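For example, just using sleep as a stand-in for a PHP request so the parallelism is easy to see:

# 20 one-second "requests", at most 10 at a time; finishes in roughly 2 seconds
yes 1 | head -20 | xargs --max-args=1 -P 10 sleep

With -P 1 the same pipeline would take about 20 seconds, so you can see the effect of running the jobs in parallel directly.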

Hi, can you explain in more detail?

Say I have two PHP files (a.php and b.php) in a directory called "test".

My requirement is to stress test these two PHP files by simulating 1-100 concurrent users.

You could do something like:

for ((N=0; N<10000; N++))
do
        echo a.php b.php
done | xargs -P 100 --max-args=1 php

This should run 100 instances of PHP simultaneously to process 20,000 total requests.

The "-P 100" tells xargs to run up to 100 parallel processes. The --max-args=1 tells it to accept only one argument per process(instead of trying to run several with one single php instance.)

Is there any way of getting response time or throughput measurements?
Basically, I want to see the response time of these files as the number of concurrent users increases.

Thanks for your prompt responses.

I have modified the code to get the response time as below:
time ( for ((N=0; N<10; N++)); do echo test_performance.php test_performance2.php; done | xargs -P 10 --max-args=1 php )

The result is:
real 0m42.417s
user 0m29.537s
sys 0m0.399s

Which time do I need to consider as the response time?

Real time. That is the wall-clock time your users would actually experience; user and sys only measure CPU time spent running userspace code and kernel code, and the difference between real and user+sys is time spent just plain waiting.
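From that run you can also get a rough throughput figure: 10 iterations x 2 scripts = 20 requests in 42.417 seconds of real time, or about 0.47 requests per second.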

You could pipe xargs' output into wc -c to count the amount of data in bytes.
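For example, reusing your test loop (the script names come from your earlier post):

for ((N=0; N<10; N++)); do echo test_performance.php test_performance2.php; done | xargs -P 10 --max-args=1 php | wc -c

And if you want to see how the response time grows with the number of concurrent users, a rough sketch would be to repeat the timed run at a few different -P values (the levels below are just an arbitrary choice):

# time the same batch of requests at increasing concurrency levels
for USERS in 1 10 25 50 100
do
        echo "concurrency: $USERS"
        time ( for ((N=0; N<100; N++)); do echo test_performance.php test_performance2.php; done | xargs -P "$USERS" --max-args=1 php )
done

Comparing the "real" line across the runs shows how the response time scales as concurrency increases.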