Rocks cluster 6.1 and MPICH2 problem?

Hey friends,
I am trying to run a simple MPI hello world program with MPICH2 on a Rocks cluster. Here is the C source code.

 
#include <mpi.h>
#include <stdio.h>

int main( int argc, char **argv )
{
    /* Initialize the MPI environment (NULL arguments are allowed since MPI-2) */
    MPI_Init( NULL, NULL );

    /* Total number of processes and this process's rank */
    int world_size;
    MPI_Comm_size( MPI_COMM_WORLD, &world_size );
    int world_rank;
    MPI_Comm_rank( MPI_COMM_WORLD, &world_rank );

    /* Name of the node this process is running on */
    char processor_name[MPI_MAX_PROCESSOR_NAME];
    int name_len;
    MPI_Get_processor_name( processor_name, &name_len );

    printf( "Hello world from processor %s, rank %d out of %d processors\n",
            processor_name, world_rank, world_size );

    MPI_Finalize();
    return 0;
}

And I compile it like this.

/opt/mpich2/gnu/bin/mpicc ./hello.c -o hello

I have the following entries in the machine file.

compute-0-0
compute-0-1

Now here is how I run the hello program

/opt/mpich2/gnu/bin/mpirun -np 2 -machinefile machines ./hello

which gives me the following error.

 
[user1@cluster ~]$ /opt/mpich2/gnu/bin/mpirun -np 2 -machinefile machines ./mpi_hello_world
Could not chdir to home directory /export/home/user1: No such file or directory
Could not chdir to home directory /export/home/user1: No such file or directory
[proxy:0:0@compute-0-0.local] launch_procs (./pm/pmiserv/pmip_cb.c:687): unable to change wdir to /export/home/user1 (No such file or directory)
[proxy:0:0@compute-0-0.local] HYD_pmcd_pmip_control_cmd_cb (./pm/pmiserv/pmip_cb.c:935): launch_procs returned error
[proxy:0:0@compute-0-0.local] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
[proxy:0:0@compute-0-0.local] [mpiexec@cluster.hpc.org] control_cb (./pm/pmiserv/pmiserv_cb.c:215): assert (!closed) failed
[mpiexec@cluster.hpc.org] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
[mpiexec@cluster.hpc.org] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:181): error waiting for event
[mpiexec@cluster.hpc.org] main (./ui/mpich/mpiexec.c:405): process manager error waiting for completion
[user1@cluster ~]$ 

Please help me.

Hi,
does anyone want to answer this question?
I have this question too! :frowning:

Not having access to MPI myself and just looking at your posted errors: you need to address the directory issue first, before you start rummaging around anywhere else ...

Could not chdir to home directory /export/home/user1: No such file or directory
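
For example, something like this would show whether that directory actually exists on the frontend and on each compute node (compute-0-0 and compute-0-1 are taken from your machine file, and I'm assuming passwordless ssh between the nodes works, which Rocks normally sets up):

# does the user's home directory exist on the frontend?
ls -ld /export/home/user1
# and on each compute node listed in the machine file?
ssh compute-0-0 'ls -ld /export/home/user1'
ssh compute-0-1 'ls -ld /export/home/user1'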

Thanks, but
we use Rocks cluster, which has MPICH2 pre-installed in /opt/mpich2/.
Do I need to make any other modifications to the libraries?
And do you have any other recommendations for us?

AFAIK the error has nothing to do with where the app is installed, but rather with where it expects to dump/find some files ...

I suggest you read the MPICH2 documentation further to see what you need to do to fix your issue ... the fix might be as simple as creating (or fixing permissions on) a directory, or as complicated as a full recompile and reinstall, but you need to confirm that in the documentation ...
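
If it really is just a missing directory, the simple case might be something along these lines, run as root on the frontend (the group name here is a guess, and if I remember right Rocks also has a "rocks sync users" command that pushes user and home-directory info out to the compute nodes, so check your Rocks 6.1 docs for the exact steps):

# create the missing home directory and hand it to the user
mkdir -p /export/home/user1
chown user1:user1 /export/home/user1
# Rocks-specific: resync user/home info to the compute nodes
# (verify the exact command against the Rocks 6.1 documentation)
rocks sync users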
