I've always been a Debian man, but for various reasons I need to be able to compile various scientific packages on an HPC running ROCKS. ROCKS 5.4.3 is based on CentOS 5.6, and it turns out that Debian is wonderfully easy, accommodating and robust in comparison. Well, since it's not my HPC, CentOS is what I'm stuck with.
Here's how to build nwchem on a ROCKS 5.4.3 (Viper) cluster based on CentOS 5.6 and its ancient kernel.
(Linux 2.6.18-238.19.1.el5 #1 SMP Fri Jul 15 07:31:24 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux )
There are three different approaches:
CASE 1. Using LD_LIBRARY_PATH
This method requires no root access, provided openmpi is already installed.
Check whether the rocks-openmpi package from the bio roll is installed - it should be in /opt/openmpi. Otherwise use yum to install the base-roll openmpi package, which will end up in /usr/lib64/openmpi/1.4-gcc/lib -- you'll need root or sudo to do anything with yum.
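A quick way to check which layout you have (these are just the two locations used in this post):
ls -d /opt/openmpi 2>/dev/null                # bio-roll rocks-openmpi
ls -d /usr/lib64/openmpi/1.4-gcc 2>/dev/null  # yum-installed base-roll openmpi
rpm -qa | grep -i openmpi                     # what's actually installed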
For compilation, do
export LIBRARY_PATH=$LIBRARY_PATH:/opt/openmpi/lib
or
export LIBRARY_PATH=$LIBRARY_PATH:/usr/lib64/openmpi/1.4-gcc/lib/
depending on whether there is an openmpi directory in /opt or not.
You can also put the export line in your buildconf.sh (see below).
For execution:
put the following in either your ~/.bashrc (per user) or /etc/profile (global):
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib
CASE 2. /opt/openmpi is present; using symlinked libs
mpicc and mpif77 are probably already symlinked, but if not:
sudo ln -s /opt/openmpi/bin/mpicc /usr/bin/mpicc
sudo ln -s /opt/openmpi/bin/mpif77 /usr/bin/mpif77
The following allows for building and running:
sudo ln -s /opt/openmpi/lib/libmpi.so /usr/lib/libmpi.so
sudo ln -s /opt/openmpi/lib/libopen-rte.so /usr/lib/libopen-rte.so
sudo ln -s /opt/openmpi/lib/libopen-pal.so /usr/lib/libopen-pal.so
sudo ln -s /opt/openmpi/lib/libmpi_f77.so /usr/lib/libmpi_f77.so
sudo ln -s /opt/openmpi/lib/libmpi.so /usr/lib64/libmpi.so.0
sudo ln -s /opt/openmpi/lib/libopen-rte.so /usr/lib64/libopen-rte.so.0
sudo ln -s /opt/openmpi/lib/libopen-pal.so /usr/lib64/libopen-pal.so.0
sudo ln -s /opt/openmpi/lib/libmpi_f77.so /usr/lib64/libmpi_f77.so.0
The /usr/lib64 symlinks are necessary for execution; without them you'll get
./nwchem: error while loading shared libraries: libmpi.so.0: cannot open shared object file: No such file or directory
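If you'd rather not type out every ln line, a short loop does exactly the same thing (a sketch assuming the /opt/openmpi layout above):
for lib in libmpi libopen-rte libopen-pal libmpi_f77; do
    sudo ln -s /opt/openmpi/lib/${lib}.so /usr/lib/${lib}.so      # for building
    sudo ln -s /opt/openmpi/lib/${lib}.so /usr/lib64/${lib}.so.0  # for running
done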
CASE 3. /opt/openmpi is NOT present; using symlinked libs
sudo yum install openmpi openmpi-devel
And then put in all the symlinks...dunno why this isn't done on install, but there you go.
sudo ln -s /usr/lib64/openmpi/1.4-gcc/bin/mpicc /usr/bin/mpicc
sudo ln -s /usr/lib64/openmpi/1.4-gcc/bin/mpif77 /usr/bin/mpif77
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi.so /usr/lib/libmpi.so
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-rte.so /usr/lib/libopen-rte.so
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-pal.so /usr/lib/libopen-pal.so
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi_f77.so /usr/lib/libmpi_f77.so
With the above symlinks, compilation will work just fine.
However, in order to actually run nwchem you need
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi.so /usr/lib64/libmpi.so.0
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-rte.so /usr/lib64/libopen-rte.so.0
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-pal.so /usr/lib64/libopen-pal.so.0
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi_f77.so /usr/lib64/libmpi_f77.so.0
or you'll get
./nwchem: error while loading shared libraries: libmpi.so.0: cannot open shared object file: No such file or directory
Finally, make sure we can find our mpirun:
sudo ln -s /usr/lib64/openmpi/1.4-gcc/bin/mpirun /usr/bin/mpirun
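As a quick sanity check that the wrappers and launcher are now found (the version reported should be Open MPI 1.4):
which mpicc mpif77 mpirun
mpirun --version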
ALL CASES
Continue here:
We'll be working in /export/home/me/tmp
wget http://www.nwchem-sw.org/images/Nwchem-6.0.tar.gz
tar -xvf Nwchem-6.0.tar.gz
cd nwchem-6.0
Create a file called buildconf.sh and stuff it with the following:
export LARGE_FILES=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=/export/home/me/tmp/nwchem-6.0
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES=all
export USE_MPI=y
export USE_MPIF=y
export MPI_LOC=/usr/lib64/openmpi/1.4-gcc/lib
export MPI_INCLUDE=/usr/lib64/openmpi/1.4-gcc/include
export LIBMPI="-lmpi -lopen-rte -lopen-pal -ldl -lmpi_f77 -lpthread"
cd $NWCHEM_TOP/src
make clean
make nwchem_config
make FC=gfortran
NOTE: the above buildconf.sh works for the case where you installed openmpi yourself via yum (CASE 3, or CASE 1 without /opt/openmpi). If openmpi got installed with ROCKS on setup and is present in /opt/openmpi (CASE 1 or 2), change the following:
export MPI_LOC=/opt/openmpi/lib
export MPI_INCLUDE=/opt/openmpi/include
Launch the build
sh buildconf.sh
You'll end up with a binary called nwchem in nwchem-6.0/bin/LINUX64 -- you can add its directory to your PATH in your ~/.bashrc.
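For example (adjust the path to wherever your build tree lives):
export PATH=$PATH:/export/home/me/tmp/nwchem-6.0/bin/LINUX64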
CASE 1
For execution you will need to make sure nwchem can find the openmpi libs --
echo $LD_LIBRARY_PATH
will tell you whether the path is already included. Otherwise, put the following in either your ~/.bashrc (per user) or /etc/profile (global):
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib
(use /usr/lib64/openmpi/1.4-gcc/lib instead if that's where your openmpi lives)
Running
If you move nwchem out of the compilation directory (to, say, /usr/local/nwchem-6.0) you may also want to define e.g.
export NWCHEM_TOP=/usr/local/nwchem-6.0
export NWCHEM_TARGET=LINUX64
export NWCHEM_BASIS_LIBRARY=${NWCHEM_TOP}/libraries/
Again, this goes into your .bashrc or /etc/profile, depending on scope.
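As a rough sketch of such a relocation -- assuming the basis library files sit under src/basis/libraries in the build tree (worth double-checking in your copy) -- it could look like:
sudo mkdir -p /usr/local/nwchem-6.0/bin /usr/local/nwchem-6.0/libraries
sudo cp /export/home/me/tmp/nwchem-6.0/bin/LINUX64/nwchem /usr/local/nwchem-6.0/bin/
sudo cp -r /export/home/me/tmp/nwchem-6.0/src/basis/libraries/* /usr/local/nwchem-6.0/libraries/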
To use multiple cores, do
mpirun -n 4 nwchem jobname.nw
where 4 is the number of cores.
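To check that the parallel build behaves, a minimal test input (a hypothetical h2o.nw -- adapt as you like) along these lines should finish in seconds:
start h2o
title "parallel build test"
geometry units angstroms
  O   0.000   0.000   0.000
  H   0.757   0.586   0.000
  H  -0.757   0.586   0.000
end
basis
  * library sto-3g
end
task scf energy
Run it with mpirun -n 4 nwchem h2o.nw and check that all four processes show up in top.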
Errors and troubleshooting:
If you get errors about missing libraries or mpicc-related errors, make sure that you've symlinked everything you need into the /usr/lib folder or set LIBRARY_PATH (see above). You could probably edit /etc/ld.so.conf too, but it will get messy with time.
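To see which shared libraries the binary actually resolves (anything reported as "not found" is the culprit), run this from the directory holding the binary:
ldd ./nwchem | grep -i mpi
And if you do go down the ld.so.conf road, the drop-in directory is a little tidier than editing the file itself (adjust the path to wherever your openmpi lives):
echo "/opt/openmpi/lib" | sudo tee /etc/ld.so.conf.d/openmpi.conf
sudo ldconfig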
I also tried building using mpich2-1.2 as well as 1.4, but got error messages about undefined references left and right.