MPI support for ABINIT 9.4.1 on a Cray XC40 cluster

Hi,

We are installing ABINIT 9.4.1 on a Cray XC40 cluster (beskow.pdc.kth.se) and have encountered issues with the MPI support.

Configuration, compilation, and linking work fine for a non-MPI build. The problem arises when we activate MPI. The settings that I have tried, building on a compute node, are:


#!/bin/bash

#SBATCH -J abinit
#SBATCH -A
#SBATCH -t 02:00:00
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=32

#wget https://www.abinit.org/sites/default/files/packages/abinit-9.4.1.tar.gz
#tar xf abinit-9.4.1.tar.gz
#cd abinit-9.4.1

# Load the build environment PrgEnv-intel

module load cdt/19.06
module load intel/18.0.0.128
module swap PrgEnv-cray PrgEnv-intel
module load cray-fftw/3.3.8.3
module load cray-netcdf/4.6.3.0
module load cray-hdf5/1.10.6.1
module load libxc/4.0.3
module load anaconda/2019.03/py37

export CRAYPE_LINK_TYPE=dynamic
export CC="cc -D_Float128=__float128"
export CXX=CC
export FC=ftn

mkdir build
cd build
srun -n 1 ../configure \
  --with-libxc=/pdc/vol/libxc/4.0.3/ \
  --with-netcdf=/opt/cray/pe/netcdf/4.6.3.0/INTEL/19.0/ \
  --with-netcdf-fortran=/opt/cray/pe/netcdf/4.6.3.0/INTEL/19.0/
srun -n 1 make -j 32 > build_log.txt 2> build_error.txt
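As a side note, when configure fails a compiler check, the exact failed test is recorded in config.log inside the build directory. A small sketch for pulling out the MPI-related lines (the build/ path matches the directory created above):

```shell
# Show the last MPI-related checks that configure logged.
# The build/ path matches the build directory created in the script above.
if [ -f build/config.log ]; then
    grep -in "mpi" build/config.log | tail -n 20
else
    echo "build/config.log not found - run configure first"
fi
```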

for which the configure script reports the error message:


checking whether the MPI C compiler is set... no
checking whether the MPI C++ compiler is set... no
checking whether the MPI Fortran compiler is set... no

Adding the argument '--with_mpi_prefix=/opt/cray/pe/craype/2.6.1/bin/' to configure

gave the error message

configure: error: unrecognized option: `--with_mpi_prefix=/opt/cray/pe/craype/2.6.1/bin/

As far as I can see, the ABINIT configure script uses the variables CC, CXX, and FC. I have set these to the standard compiler wrappers for the Cray system.

Any suggestions on what additional environment variable(s) or flag(s) could be used for the configure script are most welcome.

Best regards,
Johan Hellsvik

Hi,

configure: error: unrecognized option: `--with_mpi_prefix=/opt/cray/pe/craype/2.6.1/bin/

can you try with

--with-mpi=/opt/cray/pe/craype/2.6.1/

or

--with-mpi="yes"
export FC="mpifort"
export CC="mpicc"
export CXX="mpicxx"

jmb

Hi,

Thank you for the advice. Unfortunately, the MPI compilers are still not detected.

--with-mpi=/opt/cray/pe/craype/2.6.1/

gave the error message

configure: error: invalid MPI settings
Please adjust --with-mpi and/or CC and re-run configure

--with-mpi="yes"
export FC="mpifort"
export CC="mpicc"
export CXX="mpicxx"

gave the error message

configure: error: could not run Fortran compiler "mpifort"

--with-mpi="yes"
export FC="mpiifort" #(added an "i")
export CC="mpiicc"  #(added an "i")
export CXX="mpicxx"

gave the error message

checking whether the MPI C compiler is set... no
checking whether the MPI C++ compiler is set... no
checking whether the MPI Fortran compiler is set... no
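For reference, a wrapper can also be tested directly, outside configure; a minimal sketch (hello_mpi.f90 is a throwaway file written only for this check):

```shell
# Write a trivial MPI Fortran program and try the candidate wrapper on it,
# to separate wrapper problems from configure problems.
cat > hello_mpi.f90 <<'EOF'
program hello
  use mpi
  implicit none
  integer :: ierr, rank
  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  print *, 'rank', rank
  call MPI_Finalize(ierr)
end program hello
EOF
if command -v mpiifort >/dev/null 2>&1; then
    mpiifort hello_mpi.f90 -o hello_mpi && echo "compile OK"
else
    echo "mpiifort not on PATH"
fi
```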

Best regards,
Johan Hellsvik

Hi,

Sorry for the slow reaction...

It is necessary to start from the beginning, i.e. to know the environment.

From your batch script, can you run the following commands to set up the environment:

module load cdt/19.06
module load intel/18.0.0.128
module swap PrgEnv-cray PrgEnv-intel
module load cray-fftw/3.3.8.3
module load cray-netcdf/4.6.3.0
module load cray-hdf5/1.10.6.1
module load libxc/4.0.3
module load anaconda/2019.03/py37

Now, can you send the outputs of:

which mpifort
mpifort -show
which mpiifort
mpiifort -show
which mpif90
mpif90 -show
which ifort
echo $FC
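Those checks can also be wrapped in a short loop that reports, in one pass, which of the candidate wrapper names resolve on PATH (a sketch; the list simply collects the names mentioned in this thread):

```shell
# Probe the candidate compiler wrappers mentioned above and report,
# for each one, whether it resolves on PATH and to where.
for w in mpifort mpiifort mpif90 ifort ftn cc; do
    if command -v "$w" >/dev/null 2>&1; then
        echo "$w: $(command -v "$w")"
    else
        echo "$w: not found"
    fi
done
```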

best,