Whenever I try to call mpi_reduce with mpi_in_place as the send buffer, it crashes. A trawl of Google reveals this to have been a problem on Mac OS for OMPI 1.3.3, but I'm on CentOS with OMPI 1.6.3 (with gfortran 4.4.6).
The following program crashes:
PROGRAM reduce
   USE mpi
   IMPLICIT NONE

   REAL, DIMENSION(2, 3) :: buffer, gbuffer
   INTEGER :: ierr, me_world
   INTEGER :: buf_shape(2), counts

   CALL mpi_init(ierr)
   CALL mpi_comm_rank(mpi_comm_world, me_world, ierr)

   buffer = 1.
   IF (me_world .EQ. 0) PRINT*, "buffer: ", buffer

   buf_shape = SHAPE(buffer)
   counts = buf_shape(1)*buf_shape(2)

   CALL mpi_reduce(MPI_IN_PLACE, buffer, counts, mpi_real, mpi_sum, 0, mpi_comm_world, ierr)
   IF (me_world .EQ. 0) PRINT*, "buffer: ", buffer

   CALL mpi_finalize(ierr)
END PROGRAM reduce
The MPI error is:
MPI_ERR_ARG: invalid argument of some other kind
which is not very helpful.
Am I missing something as to how mpi_reduce should be called? Does this work with other compilers/MPI implementations?
I clearly need to reread the MPI_IN_PLACE documentation, as I thought that collective communications had to be called by all processes with exactly the same arguments. – Plebeian
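For reference, the MPI standard specifies that in MPI_Reduce the MPI_IN_PLACE argument is only valid as the send buffer on the root process; non-root ranks must pass their actual data as the send buffer, and their receive buffer argument is ignored. Below is a minimal sketch of how the program above could be restructured along those lines (the program name and the use of the already-declared gbuffer as the ignored receive buffer on non-root ranks are illustrative choices, not from the original post):

PROGRAM reduce_in_place
   USE mpi
   IMPLICIT NONE

   REAL, DIMENSION(2, 3) :: buffer, gbuffer
   INTEGER :: ierr, me_world
   INTEGER :: buf_shape(2), counts

   CALL mpi_init(ierr)
   CALL mpi_comm_rank(mpi_comm_world, me_world, ierr)

   buffer = 1.
   buf_shape = SHAPE(buffer)
   counts = buf_shape(1)*buf_shape(2)

   IF (me_world .EQ. 0) THEN
      ! Root only: MPI_IN_PLACE as the send buffer; the reduced result overwrites buffer
      CALL mpi_reduce(MPI_IN_PLACE, buffer, counts, mpi_real, mpi_sum, 0, mpi_comm_world, ierr)
      PRINT*, "buffer: ", buffer
   ELSE
      ! Non-root ranks: pass the data as the send buffer; the receive buffer
      ! (gbuffer here) is not significant away from the root
      CALL mpi_reduce(buffer, gbuffer, counts, mpi_real, mpi_sum, 0, mpi_comm_world, ierr)
   END IF

   CALL mpi_finalize(ierr)
END PROGRAM reduce_in_place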