Linux cluster

Hi, I am not sure if the term "cluster" fits my situation or not. I have 5 Linux computers, each with a different host-name, and my users have to remember all those names, log in to each machine, and figure out which ones have free CPUs so they can run their jobs there.

Does anyone know of software that connects all 5 computers under a single host-name (so my users only need to remember one name for remote login) and, ideally, assigns the jobs my users run to the computers that have free CPUs?

Thanks

Depending on the type of jobs, a load-balancer might be the right thing (e.g. Crossroads). For C/C++ compiling, there's distcc.
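
Until you have a real balancer or scheduler in place, a quick-and-dirty way to see which box has the most headroom is to compare load averages over ssh. Just a sketch in Python; the hostnames are placeholders for your own machines and it assumes passwordless ssh keys:

    #!/usr/bin/env python
    # Ask each host for its 1-minute load average and report the least loaded one.
    import subprocess

    HOSTS = ["node1", "node2", "node3", "node4", "node5"]  # placeholder names

    def load_of(host):
        # /proc/loadavg starts with the 1-, 5- and 15-minute load averages
        out = subprocess.check_output(["ssh", host, "cat /proc/loadavg"])
        return float(out.decode().split()[0])

    loads = dict((h, load_of(h)) for h in HOSTS)
    best = min(loads, key=loads.get)
    print("least loaded host: %s (load %.2f)" % (best, loads[best]))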

You may also want to check out Beowulf.

SGE, the Sun Grid Engine. There is a free version.
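
To give an idea of what that buys you: once SGE is running, users submit jobs to the queue master instead of picking a machine themselves, and the scheduler sends each job to a host with free slots. A minimal sketch, assuming SGE's client tools are installed; the job name and program are made up:

    #!/usr/bin/env python
    # Submit a command to SGE with qsub; the scheduler chooses the execution host.
    import subprocess

    subprocess.check_call([
        "qsub",
        "-N", "my_job",    # job name (placeholder)
        "-cwd",            # run the job in the current working directory
        "-b", "y",         # the last argument is a command, not a job script
        "./my_program",    # the user's program (placeholder)
    ])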

You do indeed have a cluster of Linux servers. What you are trying to do is get your cluster to perform certain tasks in parallel. There are three different kinds of tasks: (1) system administration, (2) compilation and building, and (3) job execution. So far people have suggested solutions for (2) -- distcc and make -j can be used here -- and for (3) -- SGE, but also OpenMPI. For (1), administration, five computers is small enough to manage by hand, i.e. use the command line and scripts to keep the hosts in sync. But you can also use parallel remote-execution tools such as pdsh or mussh, or cluster tools such as flamethrower and rdist, to make your life easier.
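
As an illustration of the remote-execution side, here is roughly what pdsh or mussh do, sketched in Python with plain ssh; the hostnames are placeholders and it assumes passwordless ssh keys are set up:

    #!/usr/bin/env python
    # Run one command on every host in parallel over ssh and print each host's output.
    import subprocess

    HOSTS = ["node1", "node2", "node3", "node4", "node5"]  # placeholder names
    COMMAND = "uptime"  # any command you want run on all hosts

    procs = dict((h, subprocess.Popen(["ssh", h, COMMAND],
                                      stdout=subprocess.PIPE,
                                      universal_newlines=True))
                 for h in HOSTS)

    for host, proc in procs.items():
        out, _ = proc.communicate()
        print("%s: %s" % (host, out.strip()))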

To get all of the above from scratch, I highly recommend ROCKS, which installs a standardized Linux framework on all the hosts, complete with monitoring and a fully integrated parallel execution environment based on whichever "Rolls" you select (i.e. SGE or Torque; OpenMPI or MPICH; Intel CC or GNU CC; etc.).
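
If you pick one of the MPI Rolls, a parallel job ends up looking something like the toy example below. This assumes mpi4py is available on the nodes (that part is my assumption, not something ROCKS guarantees); launch it with something like "mpirun -np 4 python hello_mpi.py":

    #!/usr/bin/env python
    # Toy MPI program: each process reports its rank and the host it landed on.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    print("hello from rank %d of %d on %s"
          % (comm.Get_rank(), comm.Get_size(), MPI.Get_processor_name()))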