19 May 2013

424. NWChem 6.3 on Debian Wheezy

Update 23 May 2013: With a new patch, the execution times are pretty much the same as for 6.1.1. I've updated the instructions below to incorporate this patch (http://www.nwchem-sw.org/images/Iswtch.patch.gz)

Update 21 May 2013: The execution times can be improved considerably by setting
ARMCI_NETWORK=SOCKETS

They are still ca 30% longer than for 6.1.1, though, due to slower SCF convergence. See http://www.nwchem-sw.org/index.php/Special:AWCforum/st/id834/Nwchem_6.3_running_2-5_times_slo....html
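If you have already built 6.3 without this setting, a rebuild is needed; a minimal sketch, assuming the environment variables from the compile section below are still set in your shell:

export ARMCI_NETWORK=SOCKETS
cd $NWCHEM_TOP/src
make clean
make nwchem_config
make FC=gfortran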

Update 20 May 2013: I did a bit of basic benchmarking. NWChem 6.3 is incredibly slow (ca 190s vs 40s for the 8 core, 3.6 GHz benchmark in http://verahill.blogspot.com.au/2013/05/414-frequency-vs-cores-crude.html). It is parallelising properly from what I can see (i.e. it is not running 8 serial jobs). I've repeated the calculation with an unpatched version of nwchem 6.3, and it is just as slow. I'll post updates here if I figure this one out.

Original post:
NWChem 6.3 is just out. Here's how to build it for CPU computations.

To build on CentOS 5.6, see http://verahill.blogspot.com.au/2013/05/421-nwchem-63-on-rocks-543centos-56.html


Math library:
Use either openblas (for Intel or AMD) or ACML (for AMD).

My GabEdit/Python NWChem patch
This is NOT the patch referred to in the 23 May update, and it is optional. It enables python support and makes the output more verbose, so that GabEdit can be used as an alternative to ECCE. In other words, apply it only if you want python support and the ability to open output files in GabEdit.

First create a patch file, e.g. diff.patch, containing the following:

diff -rupN src.original/config/makefile.h src/config/makefile.h
--- src.original/config/makefile.h 2013-04-15 12:41:45.016853322 +1000
+++ src/config/makefile.h 2013-04-15 12:38:44.933319544 +1000
@@ -2039,7 +2039,7 @@ endif
 
      ifeq ($(BUILDING_PYTHON),python)
 #   EXTRA_LIBS += -ltk -ltcl -L/usr/X11R6/lib -lX11 -ldl
-     EXTRA_LIBS +=    -lnwcutil  -lpthread -lutil -ldl
+     EXTRA_LIBS +=    -lnwcutil  -lpthread -lutil -ldl -lssl -lz
   LDOPTIONS = -Wl,--export-dynamic 
      endif
 ifeq ($(NWCHEM_TARGET),CATAMOUNT)
diff -rupN src.original/ddscf/movecs_pr_anal.F src/ddscf/movecs_pr_anal.F
--- src.original/ddscf/movecs_pr_anal.F 2013-04-15 12:41:45.036852381 +1000
+++ src/ddscf/movecs_pr_anal.F 2013-04-15 12:23:28.100409225 +1000
@@ -195,7 +195,7 @@ c
  22         format(1x,2('  Bfn.  Coefficient  Atom+Function  ',5x))
             write(LuOut,23)
  23         format(1x,2(' ----- ------------  ---------------',5x))
-            do klo = 0, min(n-1,9), 2
+            do klo = 0, min(n-1,199), 2
                khi = min(klo+1,n-1)
                write(LuOut,2) (
      $              int_mb(k_list+k)+1, 
diff -rupN src.original/ddscf/rohf.F src/ddscf/rohf.F
--- src.original/ddscf/rohf.F 2013-04-15 12:41:45.036852381 +1000
+++ src/ddscf/rohf.F 2013-04-15 12:23:28.100409225 +1000
@@ -153,7 +153,7 @@ c
             ilo = 1
             ihi = nmo
          endif
-         call movecs_print_anal(basis, ilo, ihi, 0.15d0, g_movecs, 
+         call movecs_print_anal(basis, ilo, ihi, 0.01d0, g_movecs, 
      $        'ROHF Final Molecular Orbital Analysis', 
      $        .true., dbl_mb(k_eval), oadapt, int_mb(k_irs),
      $        .true., dbl_mb(k_occ))
diff -rupN src.original/ddscf/scf_vec_guess.F src/ddscf/scf_vec_guess.F
--- src.original/ddscf/scf_vec_guess.F 2013-04-15 12:41:45.036852381 +1000
+++ src/ddscf/scf_vec_guess.F 2013-04-15 12:23:28.100409225 +1000
@@ -511,19 +511,19 @@ c
          nprint = min(nclosed+nopen+30,nmo)
          if (scftype.eq.'RHF' .or. scftype.eq.'ROHF') then
             call movecs_print_anal(basis, 1,
-     &           nprint, 0.15d0, g_movecs, 
+     &           nprint, 0.01d0, g_movecs, 
      &           'ROHF Initial Molecular Orbital Analysis', 
      &           .true., dbl_mb(k_eval), oadapt, int_mb(k_irs),
      &           .true., dbl_mb(k_occ))
          else
             nprint = min(nalpha+20,nmo)
             call movecs_print_anal(basis, max(1,nbeta-20),
-     &           nprint, 0.15d0, g_movecs, 
+     &           nprint, 0.01d0, g_movecs, 
      &           'UHF Initial Alpha Molecular Orbital Analysis', 
      &           .true., dbl_mb(k_eval), oadapt, int_mb(k_irs),
      &           .true., dbl_mb(k_occ))
             call movecs_print_anal(basis, max(1,nbeta-20),
-     &           nprint, 0.15d0, g_movecs(2), 
+     &           nprint, 0.01d0, g_movecs(2), 
      &           'UHF Initial Beta Molecular Orbital Analysis', 
      &           .true., dbl_mb(k_eval+nbf), oadapt, int_mb(k_irs+nmo),
      &           .true., dbl_mb(k_occ+nbf))
diff -rupN src.original/ddscf/uhf.F src/ddscf/uhf.F
--- src.original/ddscf/uhf.F 2013-04-15 12:41:45.036852381 +1000
+++ src/ddscf/uhf.F 2013-04-15 12:23:28.096409414 +1000
@@ -144,11 +144,11 @@ C
          enddo
          ihi = max(ihi-1,1)
  9611    continue
-         call movecs_print_anal(basis, ilo, ihi, 0.15d0, g_movecs, 
+         call movecs_print_anal(basis, ilo, ihi, 0.01d0, g_movecs, 
      $        'UHF Final Alpha Molecular Orbital Analysis', 
      $        .true., dbl_mb(k_eval), oadapt, int_mb(k_irs),
      $        .true., dbl_mb(k_occ))
-         call movecs_print_anal(basis, ilo, ihi, 0.15d0, g_movecs(2), 
+         call movecs_print_anal(basis, ilo, ihi, 0.01d0, g_movecs(2), 
      $        'UHF Final Beta Molecular Orbital Analysis', 
      $        .true., dbl_mb(k_eval+nbf), oadapt, int_mb(k_irs+nmo),
      $        .true., dbl_mb(k_occ+nbf))
diff -rupN src.original/mcscf/mcscf.F src/mcscf/mcscf.F
--- src.original/mcscf/mcscf.F 2013-04-15 12:41:45.000854073 +1000
+++ src/mcscf/mcscf.F 2013-04-15 12:23:23.748613695 +1000
@@ -719,7 +719,7 @@ c
       if (util_print('final vectors analysis', print_default))
      $     call movecs_print_anal(basis, 
      $     max(1,nclosed-10), min(nbf,nclosed+nact+10),
-     $     0.15d0, g_movecs, 'Analysis of MCSCF natural orbitals',
+     $     0.01d0, g_movecs, 'Analysis of MCSCF natural orbitals',
      $     .true., dbl_mb(k_evals), .true., int_mb(k_sym), 
      $     .true., dbl_mb(k_occ))
 c     
diff -rupN src.original/nwdft/scf_dft/dft_mxspin_ovlp.F src/nwdft/scf_dft/dft_mxspin_ovlp.F
--- src.original/nwdft/scf_dft/dft_mxspin_ovlp.F 2013-04-15 12:41:45.604825677 +1000
+++ src/nwdft/scf_dft/dft_mxspin_ovlp.F 2013-04-15 12:23:28.228403211 +1000
@@ -184,14 +184,14 @@ c
       call ga_sync()
 c
       call movecs_print_anal(basis,int_mb(k_non),int_mb(k_non)
-     & ,0.15d0,g_alpha,'Alpha Orbitals without Beta Partners',
+     & ,0.01d0,g_alpha,'Alpha Orbitals without Beta Partners',
      &   .false., 0.0 ,.false., 0 , .false., 0 )
 c
       if (nct.GE.2) then
       do i = 2,nct
       ind = int_mb(k_non+i-1)
       call movecs_print_anal(basis,ind,ind
-     & ,0.15d0,g_alpha,' ',
+     & ,0.01d0,g_alpha,' ',
      &   .false., 0.0 ,.false., 0 , .false., 0 )
       enddo
       endif
@@ -350,7 +350,7 @@ c      endif
 c      endif
 c 9990 format(/,18x,'THERE ARE',i3,1x,'UN-PARTNERED ALPHA ORBITALS')
 c
-       call movecs_print_anal(basis, 1, nalp, 0.15d0, g_ualpha,
+       call movecs_print_anal(basis, 1, nalp, 0.01d0, g_ualpha,
      & 'Alpha Orb. w/o Beta Partners (after maxim. alpha/beta overlap)',
      &   .false., 0.0 ,.false., 0 , .false., 0 )
 c
diff -rupN src.original/nwdft/scf_dft/dft_scf.F src/nwdft/scf_dft/dft_scf.F
--- src.original/nwdft/scf_dft/dft_scf.F 2013-04-15 12:41:45.608825490 +1000
+++ src/nwdft/scf_dft/dft_scf.F 2013-04-15 12:23:28.228403211 +1000
@@ -1774,7 +1774,7 @@ c
             else
                blob='DFT Final Beta Molecular Orbital Analysis' 
             endif
-            call movecs_print_anal(ao_bas_han, ilo, ihi, 0.15d0, 
+            call movecs_print_anal(ao_bas_han, ilo, ihi, 0.01d0, 
      &           g_movecs(ispin), 
      &           blob, 
      &           .true., dbl_mb(k_eval(ispin)), oadapt, 
diff -rupN src.original/nwdft/scf_dft_cg/dft_cg_solve.F src/nwdft/scf_dft_cg/dft_cg_solve.F
--- src.original/nwdft/scf_dft_cg/dft_cg_solve.F 2013-04-15 12:41:45.612825303 +1000
+++ src/nwdft/scf_dft_cg/dft_cg_solve.F 2013-04-15 12:23:28.220403588 +1000
@@ -183,7 +183,7 @@ c
             blob = 'DFT Final Beta Molecular Orbital Analysis'
           endif
           call movecs_fix_phase(g_movecs(ispin))
-          call movecs_print_anal(basis, ilo, ihi, 0.15d0,
+          call movecs_print_anal(basis, ilo, ihi, 0.01d0,
      &         g_movecs(ispin),blob,
      &         .true., dbl_mb(k_eval+(ispin-1)*nbf),
      &         oadapt, int_mb(k_irs+(ispin-1)*nbf),


Compile NWChem
This example uses the ACML libs. See e.g. this post for openblas settings.

sudo apt-get install build-essential gfortran python2.7-dev libopenmpi-dev openmpi-bin
sudo mkdir /opt/nwchem
sudo chown $USER:$USER /opt/nwchem
cd /opt/nwchem/
wget http://www.nwchem-sw.org/download.php?f=Nwchem-6.3-src.2013-05-17.tar.gz
mv download.php\?f\=Nwchem-6.3-src.2013-05-17.tar.gz Nwchem-6.3-src.2013-05-17.tar.gz
tar xvf Nwchem-6.3-src.2013-05-17.tar.gz
cd nwchem-6.3-src.2013-05-17/
patch -p0 < diff.patch
patching file src/config/makefile.h
patching file src/ddscf/movecs_pr_anal.F
patching file src/ddscf/rohf.F
patching file src/ddscf/scf_vec_guess.F
patching file src/ddscf/uhf.F
patching file src/mcscf/mcscf.F
patching file src/nwdft/scf_dft/dft_mxspin_ovlp.F
patching file src/nwdft/scf_dft/dft_scf.F
patching file src/nwdft/scf_dft_cg/dft_cg_solve.F
cd src/
wget http://www.nwchem-sw.org/images/Iswtch.patch.gz
gzip -d Iswtch.patch
patch -p0 < Iswtch.patch
cd ../
export LARGE_FILES=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=`pwd`
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES="all python"
export PYTHONVERSION=2.7
export PYTHONHOME=/usr
export BLASOPT="-L/opt/acml/acml5.3.1/gfortran64_int64/lib -lacml"
export USE_MPI=y
export USE_MPIF=y
export USE_MPIF4=y
export MPI_LOC=/usr/lib/openmpi/lib
export MPI_INCLUDE=/usr/lib/openmpi/include
export LIBRARY_PATH="$LIBRARY_PATH:/usr/lib/openmpi/lib:/opt/acml/acml5.3.1/gfortran64_int64/lib"
export LIBMPI="-lmpi -lopen-rte -lopen-pal -ldl -lmpi_f77 -lpthread"
export ARMCI_NETWORK=SOCKETS
cd $NWCHEM_TOP/src
make clean
make nwchem_config
make FC=gfortran 1> make.log 2> make.err
cd $NWCHEM_TOP/contrib
export FC=gfortran
./getmem.nwchem
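Since stdout and stderr are redirected to make.log and make.err, a quick way to sanity-check the build afterwards (a sketch; the binary location follows from NWCHEM_TARGET=LINUX64):

tail make.log
ls -lh $NWCHEM_TOP/bin/LINUX64/nwchem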


Settings
Create /opt/nwchem/default.nwchemrc
nwchem_basis_library /opt/nwchem/nwchem-6.3-src.2013-05-17/src/basis/libraries/
ffield amber
amber_1 /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/amber_s/
amber_2 /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/amber_x/
amber_3 /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/amber_q/
amber_4 /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/amber_u/
amber_5 /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/custom/
spce /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/solvents/spce.rst
charmm_s /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/charmm_s/
charmm_x /opt/nwchem/nwchem-6.3-src.2013-05-17/src/data/charmm_x/

Symlink to this file in each user's home:
ln -s /opt/nwchem/default.nwchemrc ~/.nwchemrc
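If the machine has several users, a hypothetical one-liner (run as root; it assumes all home directories live under /home) saves a little typing:

sudo sh -c 'for d in /home/*; do ln -s /opt/nwchem/default.nwchemrc "$d/.nwchemrc"; done'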

16 May 2012

151. Building nwchem 6.1 on debian wheezy/testing with openblas

Building without external libs is covered in a later post.

Openblas:

sudo apt-get install build-essential gfortran gpp
sudo mkdir /opt/openblas
sudo chown ${USER} /opt/openblas
cd ~/tmp
wget http://nodeload.github.com/xianyi/OpenBLAS/tarball/v0.1.1
tar xvf v0.1.1
cd xianyi-OpenBLAS-e6e87a2/
wget http://www.netlib.org/lapack/lapack-3.4.1.tgz
make all BINARY=64 CC=/usr/bin/gcc FC=/usr/bin/gfortran USE_THREAD=0 INTERFACE64=1 1> make.log 2>make.err

make PREFIX=/opt/openblas install
cp lib*.*  /opt/openblas/lib

add
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openblas/lib
to your ~/.bashrc

[for later use with nwchem and ecce, add /opt/openblas/lib to /etc/ld.so.conf and do sudo ldconfig]
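A sketch of those two steps:

echo '/opt/openblas/lib' | sudo tee -a /etc/ld.so.conf
sudo ldconfig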



Nwchem:
sudo apt-get install libopenmpi-dev python-dev

sudo mkdir /opt/nwchem
sudo chown ${USER} /opt/nwchem
cd /opt/nwchem
wget http://www.nwchem-sw.org/images/Nwchem-6.1-2012-Feb-10.tar.gz
tar xvf Nwchem-6.1-2012-Feb-10.tar.gz
cd nwchem-6.1/

Edit line 1914 in src/config/makefile.h and add -lz -lssl for python support
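If you prefer to script the edit, a hedged sed one-liner -- it blindly appends to line 1914, so first confirm that the line number still matches your makefile.h:

sed -i '1914s/$/ -lz -lssl/' src/config/makefile.h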
Then continue

export LARGE_FILES=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=`pwd`
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES="all python"
export PYTHONVERSION=2.7
export PYTHONHOME=/usr
export BLASOPT="-L/opt/openblas/lib -lopenblas"
export USE_MPI=y
export USE_MPIF=y
export USE_MPIF4=y
export MPI_LOC=/usr/lib/openmpi/lib
export MPI_INCLUDE=/usr/lib/openmpi/include
export LIBRARY_PATH=$LIBRARY_PATH:/usr/lib/openmpi/lib
export LIBMPI="-lmpi -lopen-rte -lopen-pal -ldl -lmpi_f77 -lpthread"
cd $NWCHEM_TOP/src
make clean
make nwchem_config
make FC=gfortran
export FC=gfortran
cd ../contrib
./getmem.nwchem

Building takes ages

Edit your ~/.bashrc and add
export NWCHEM_EXECUTABLE=/opt/nwchem/nwchem-6.1/bin/LINUX64/nwchem
export NWCHEM_BASIS_LIBRARY=/opt/nwchem/nwchem-6.1/src/basis/libraries/
export PATH=$PATH:/opt/nwchem/nwchem-6.1/bin/LINUX64

To make ecce play nice you also need to edit /etc/ld.so.conf and add
/opt/openblas/lib
then do 
sudo ldconfig
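To confirm that the linker cache picked up the openblas libs:

ldconfig -p | grep openblas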


22 April 2012

123. Adding python support to nwchem under debian

I've posted the general build instructions for nwchem 6.0 with mpi support here: http://verahill.blogspot.com.au/2012/03/nwchem-60-with-openmpi-support-on.html

However, those instructions don't include python support.

0. Download and extract nwchem, and install blas etc. as shown in http://verahill.blogspot.com.au/2012/03/nwchem-60-with-openmpi-support-on.html

1. Edit nwchem-6.0/src/config/makefile.h
Change line 1962 from
EXTRA_LIBS +=    -lnwcutil  -lpthread -lutil -ldl
to
EXTRA_LIBS +=    -lnwcutil  -lpthread -lutil -ldl -lz -lssl
2. Install python headers
sudo apt-get install python2.7-dev

3. Execute the following commands (one by one or by putting them in a shell script)
export LARGE_FILES=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=`pwd`
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES="all python"
export PYTHONHOME=/usr
export PYTHONVERSION=2.7
export USE_MPI=y
export USE_MPIF=y
export MPI_LOC=/usr/lib/openmpi/lib
export MPI_INCLUDE=/usr/lib/openmpi/include
export LIBRARY_PATH=$LIBRARY_PATH:/usr/lib/openmpi/lib
export LIBMPI="-lmpi -lopen-rte -lopen-pal -ldl -lmpi_f77 -lpthread"
cd $NWCHEM_TOP/src
make clean
make nwchem_config
make FC=gfortran

It should work fine, and after a long build you'll have python-enabled binaries.

4. Testing
You can test whether there's python support by creating test.nw with

python
for n in range(1,6):
        print n, n*2,n**2
end
task python
and running it with 
mpirun -n 1 nwchem test.nw

which gives



  NWChem Input Module
                                -------------------


                               NWChem Python program
                               ---------------------
for n in range(1,6):
        print n, n*2,n**2
1 2 1
2 4 4
3 6 9
4 8 16
5 10 25
1 2 1
2 4 4
3 6 9
4 8 16
5 10 25
 Task  times  cpu:        0.0s     wall:        0.0s

Done.


Note:
For ROCKS/CentOS it was not necessary to edit src/config/makefile.h.


The relevant parts in the build configuration are
export NWCHEM_MODULES="all python"
export PYTHONHOME=/opt/rocks
export PYTHONVERSION=2.4

Other than that, just follow http://verahill.blogspot.com.au/2012/03/building-nwchem-60-on-rocks-543centos.html



Error:
gfortran: error: /usr/include/python2.7/lib/python2.7/config/libpython2.7.a: No such file or directory
make: *** [all] Error 1
locate libpython2.7.a
/usr/lib/libpython2.7.a
/usr/lib/python2.7/config/libpython2.7.a

Reason:
export PYTHONHOME=/usr/include/python2.7

Solution:
export PYTHONHOME=/usr
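If in doubt about the prefix, python2.7-config (shipped with python2.7-dev) reports it -- on this setup it should print /usr:

python2.7-config --prefix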


Error:
In function `PyZlib_compress':
(.text+0x1540): undefined reference to `deflateEnd'

Solution:
http://www.emsl.pnl.gov/docs/nwchem/nwchem-support/2012/02/0065.Re:_NWCHEM_undocumented_compilation_flag

Edit nwchem-6.0/src/config/makefile.h
For LINUX64 look at lines 1960-1964

1960      ifeq ($(BUILDING_PYTHON),python)
1961 #   EXTRA_LIBS += -ltk -ltcl -L/usr/X11R6/lib -lX11 -ldl
1962      EXTRA_LIBS +=    -lnwcutil  -lpthread -lutil -ldl
1963   LDOPTIONS = -Wl,--export-dynamic
1964      endif

Change line 1962 to
EXTRA_LIBS +=    -lnwcutil  -lpthread -lutil -ldl -lz -lssl


20 March 2012

114. Nwchem 6.0 with openmpi support on debian testing

I still haven't managed to compile a working version of Nwchem 6.1 on Debian 64 bit, regardless of whether I'm using mpich or openmpi. The number of posts relating to compiling nwchem is steadily growing, but I'd rather have posts which are almost, but not quite, identical if that makes it unambiguous for the average user how to build and use nwchem.

Anyway, since I'm using openmpi on my rocks cluster(s), I figure I might as well start using openmpi on debian too. In addition, the only way you can get nwchem 6.0 to work with mpich2 on debian seems to be by using the old v1.2 package which causes problems of its own (see apt-pinning).

Note: See here for information about python support: http://verahill.blogspot.com.au/2012/04/adding-python-support-to-nwchem-under.html

Long story short -- nwchem with openmpi:
mkdir ~/tmp
cd ~/tmp
sudo apt-get install openmpi-bin libopenmpi-dev
wget http://www.nwchem-sw.org/images/Nwchem-6.0.tar.gz
tar -xvf Nwchem-6.0.tar.gz
cd nwchem-6.0/

export LARGE_FILES=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=/home/me/tmp/nwchem-6.0
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES=all
export USE_MPI=y
export USE_MPIF=y
export MPI_LOC=/usr/lib/openmpi/lib
export MPI_INCLUDE=/usr/lib/openmpi/include
export LIBRARY_PATH=$LIBRARY_PATH:/usr/lib/openmpi/lib
export LIBMPI="-lmpi -lopen-rte -lopen-pal -ldl -lmpi_f77 -lpthread"
cd $NWCHEM_TOP/src
make clean
make nwchem_config
make FC=gfortran

This will take a good 20-30 minutes.


Your binary will be in nwchem-6.0/bin/LINUX64/

Finally, see whether openmpi is already in your LD_LIBRARY_PATH

echo $LD_LIBRARY_PATH
/lib/openmm:/usr/lib/nvidia-cuda-toolkit:/usr/lib/nvidia
If not, edit ~/.bashrc and add
export LD_LIBRARY_PATH=/usr/lib/openmpi/lib:$LD_LIBRARY_PATH
export PATH=$PATH:/home/me/tmp/nwchem-6.0/bin/LINUX64


13 March 2012

105. Nwchem 6.1 with openmpi on ROCKS 5.4.3/CentOS 5.6


EDIT 18 May 2012: 
Compiling nwchem 6.1 with internal libs on debian: http://verahill.blogspot.com.au/2012/05/compiling-nwchem-61-with-internal-libs.html
Compiling nwchem 6.1 with openblas on debian: http://verahill.blogspot.com.au/2012/05/building-nwchem-61-on-debian.html


I can build and use nwchem on ROCKS 5.4.3 -- see instructions below.

EDIT: the gfortran version is GNU Fortran (GCC) 4.1.2 20080704 (Red Hat 4.1.2-50).
On debian, which yields a segfaulting binary, the version is GNU Fortran (Debian 4.6.3-1) 4.6.3.

I'm still having no luck building binaries which don't segfault on execution on debian, though. The openmpi versions are the same for both ROCKS and debian: 1.4.3.

--START HERE --

ROCKS 5.4.3/CentOS
The build is essentially the same as for nwchem-6.0 (http://verahill.blogspot.com.au/2012/03/building-nwchem-60-on-rocks-543centos.html) - the single difference is that you need to define USE_MPIF4 or you get errors

To build:

wget http://www.nwchem-sw.org/images/Nwchem-6.1-2012-Feb-10.tar.gz
tar -xvf Nwchem-6.1-2012-Feb-10.tar.gz
cd nwchem-6.1/
export LARGE_FILES=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=/export/home/me/tmp/nwchem-6.1
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES=all
export USE_MPI=y
export USE_MPIF=y
export USE_MPIF4=y
export MPI_LOC=/opt/openmpi
export MPI_INCLUDE=/opt/openmpi/include
export LIBRARY_PATH=$LIBRARY_PATH:/opt/openmpi/lib
export LIBMPI="-lmpi -lopen-rte -lopen-pal -ldl -lmpi_f77 -lpthread"
cd $NWCHEM_TOP/src
make clean
make  nwchem_config
make  FC=gfortran

Building takes a little while.

Running:
To make the reference to your openmpi libs permanent (and to make life easier), put the following in your ~/.bashrc or /etc/profile:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib

export NWCHEM_EXECUTABLE=/export/home/me/tmp/nwchem-6.1/bin/LINUX64/nwchem
export NWCHEM_BASIS_LIBRARY=/export/home/me/tmp/nwchem-6.1/src/basis/libraries/
export PATH=$PATH:/export/home/me/tmp/nwchem-6.1/bin/LINUX64



To run on multiple procs do
mpirun -n 3 nwchem input.nw
where 3 is the number of cores
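To use every core on the node without hard-coding the count, one option that works even on old CentOS (where nproc may be missing) is to read /proc/cpuinfo:

mpirun -n $(grep -c ^processor /proc/cpuinfo) nwchem input.nw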

104. Building gromacs with fftw3 and openmpi on ROCKS 5.4.3/CentOS

This guide was heavily modified on 13/03/2012 to remove the need for sudo/root privileges.

Not all flavours of linux are equal. I've always been a Debian man, but have recently become a user of a ROCKS based HPC cluster on a different continent. To make sure that I don't screw things up I'm currently trying to work out how to reliably compile common computational packages under ROCKS 5.4.3, which is CentOS based.

If you installed the bio roll from the beginning you'll have openmpi in /opt/openmpi (the rocks_openmpi.x86_64 package), and fftw in /opt/rocks/lib and /opt/rocks/include (the fftw.x86_64 package).

If you only installed the basic rolls, you won't have either. Now, you can either download the bio roll and install from there, or you can install the regular openmpi package and compile fftw yourself. In fact, you'll need to do the latter if you want double-precision gromacs anyway.

My goal is to avoid having to use sudo or root at all. I've rewritten this guide a couple of times, so there may be weird annoying errors that I've missed.

Installing openmpi:
If you don't have openmpi in /opt, then you can install it from the base roll
sudo yum install openmpi


fftw3:
You can skip this step IF
1. you have fftw files in /opt/rocks/lib and /opt/rocks/include
AND
2. you only want single precision

Otherwise:

wget http://www.fftw.org/fftw-3.3.1.tar.gz
tar -xvf fftw-3.3.1.tar.gz
cd fftw-3.3.1


Then use --prefix to tell make where to install the files:

Single precision fftw3 libraries:
make distclean
./configure --enable-float --enable-mpi --enable-threads --with-pic --prefix=/export/home/me/.fftwsingle
make
make install

Double-precision fftw3 libraries:
make distclean
./configure --disable-float --enable-mpi --enable-threads --with-pic --prefix=/export/home/me/.fftwdouble
make 
make install

gromacs:

First download and extract:

cd ~/tmp
wget ftp://ftp.gromacs.org/pub/gromacs/gromacs-4.5.5.tar.gz

tar -xvf gromacs-4.5.5.tar.gz
cd gromacs-4.5.5/

Before building you need to define where the openmpi libs are i.e.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib
OR
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib64/openmpi/1.4-gcc/lib

We now have three permutations of possible builds:
1. We use the single precision fftw libs in /opt/rocks/lib and /opt/rocks/include
export LDFLAGS=-L/opt/rocks/lib
export CPPFLAGS=-I/opt/rocks/include
./configure --enable-mpi --enable-float --with-fft=fftw3 --program-suffix=_spmpi --prefix=/export/home/me/gromacs
make
make install

2. We use the single precision fftw libs in /export/home/me/.fftwsingle

export LDFLAGS=-L/export/home/me/.fftwsingle/lib
export CPPFLAGS=-I/export/home/me/.fftwsingle/include
./configure --enable-mpi --enable-float --with-fft=fftw3 --program-suffix=_spmpi --prefix=/export/home/me/gromacs
make
make install

3. We use the double precision fftw libs in /export/home/me/.fftwdouble

export LDFLAGS=-L/export/home/me/.fftwdouble/lib
export CPPFLAGS=-I/export/home/me/.fftwdouble/include
./configure --enable-mpi --disable-float --with-fft=fftw3 --program-suffix=_ddmpi --prefix=/export/home/me/gromacs
make
make install


Running

You will now have single and double-precision binaries, e.g.
grompp_spmpi and grompp_ddmpi

Make sure that you define/have defined LD_LIBRARY_PATH in /etc/profile or ~/.bashrc and included the paths to your mpi libs and your fftw libs, e.g.:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib:/export/home/me/.fftwsingle:/export/home/me/.fftwdouble

Actually, it doesn't seem necessary to include the fftw path.

You may also want to include your gromacs bins in your path:
export PATH=$PATH:/export/home/me/gromacs/bin

Dynamic load-balancing seems to be disabled by default, so to use multiple cores run using e.g.
mpirun -n 4 mdrun_spmpi -s inp.tpr -o out.trr etc.

DONE


Troubleshooting

Error:
checking size of off_t... configure: error: in `/export/home/me/tmp/gromacs-4.5.5':
configure: error: cannot compute sizeof (off_t)
See `config.log' for more details
config.log:
./conftest: error while loading shared libraries: libmpi.so.0: cannot open shared object file: No such file or directory
Solution:
Set LD_LIBRARY_PATH to your openmpi libs e.g.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib

Error:
/usr/local/lib/libfftw3f.a: could not read symbols: Bad value
collect2: ld returned 1 exit status
make[3]: *** [libmd.la] Error 1
Solution:
Compile fftw3 using the --with-pic switch:
./configure --enable-float --enable-mpi --enable-threads --with-pic 




103. Building nwchem 6.0 on Rocks 5.4.3/CentOS

I've always been a Debian man, but for various reasons I need to be able to compile various scientific packages on an HPC running ROCKS. ROCKS 5.4.3 is based on CentOS 5.6, and it turns out that debian is wonderfully easy, accommodating and robust in comparison. Well, since it's not my HPC, CentOS is what I'm stuck with.

Here's how to build nwchem on a rocks 5.4.3 (viper) cluster based on CentOS 5.6 and its ancient kernel.
(Linux  2.6.18-238.19.1.el5 #1 SMP Fri Jul 15 07:31:24 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux )

There are three different approaches:




CASE 1.
 Using LD_LIBRARY_PATH
This method requires no root access.
Check to see whether you've installed the rocks-openmpi package from the bio roll - it should be in /opt/openmpi. Otherwise use yum to install the base-roll openmpi package, which will end up in /usr/lib64/openmpi/1.4-gcc/lib -- you'll need root or sudo to do anything with yum.

For compilation, do
export LIBRARY_PATH=$LIBRARY_PATH:/opt/openmpi/lib
or
export LIBRARY_PATH=$LIBRARY_PATH:/usr/lib64/openmpi/1.4-gcc/lib/
depending on whether there is an openmpi directory in /opt or not.
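A small sketch that picks whichever location exists:

if [ -d /opt/openmpi/lib ]; then
    export LIBRARY_PATH=$LIBRARY_PATH:/opt/openmpi/lib
else
    export LIBRARY_PATH=$LIBRARY_PATH:/usr/lib64/openmpi/1.4-gcc/lib/
fi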

You can also put the export line in your buildconf.sh (see below).
For execution:
In either your ~/.bashrc (per user) or /etc/profile (global) put
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib






CASE 2. /opt/openmpi is present; using symlinked libs

mpicc and mpif77 are probably already symlinked, but if not:

sudo ln -s /opt/openmpi/bin/mpicc /usr/bin/mpicc
sudo ln -s /opt/openmpi/bin/mpif77 /usr/bin/mpif77


The following allows for building and running:
sudo ln -s /opt/openmpi/lib/libmpi.so /usr/lib/libmpi.so
sudo ln -s /opt/openmpi/lib/libopen-rte.so /usr/lib/libopen-rte.so
sudo ln -s /opt/openmpi/lib/libopen-pal.so /usr/lib/libopen-pal.so
sudo ln -s /opt/openmpi/lib/libmpi_f77.so /usr/lib/libmpi_f77.so
sudo ln -s /opt/openmpi/lib/libmpi.so /usr/lib64/libmpi.so.0
sudo ln -s /opt/openmpi/lib/libopen-rte.so /usr/lib64/libopen-rte.so.0
sudo ln -s /opt/openmpi/lib/libopen-pal.so /usr/lib64/libopen-pal.so.0
sudo ln -s /opt/openmpi/lib/libmpi_f77.so /usr/lib64/libmpi_f77.so.0


the /usr/lib64 symlinks are necessary for execution, or you'll get
./nwchem: error while loading shared libraries: libmpi.so.0: cannot open shared object file: No such file or directory



CASE 3. /opt/openmpi is NOT present; using symlinked libs

yum install openmpi openmpi-devel
And then put in all the symlinks...dunno why this isn't done on install, but there you go.

sudo ln -s /usr/local/lib64/openmpi/1.4-gcc/bin/mpicc  /usr/bin/mpicc
sudo ln -s /usr/local/lib64/openmpi/1.4-gcc/bin/mpif77 /usr/bin/mpif77
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi.so /usr/lib/libmpi.so
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-rte.so /usr/lib/libopen-rte.so
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-pal.so /usr/lib/libopen-pal.so
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi_f77.so /usr/lib/libmpi_f77.so

Using the above symlinks compilation will work just fine.
However, in order to actually run nwchem you need
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi.so /usr/lib64/libmpi.so.0
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-rte.so /usr/lib64/libopen-rte.so.0
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libopen-pal.so /usr/lib64/libopen-pal.so.0
sudo ln -s /usr/lib64/openmpi/1.4-gcc/lib/libmpi_f77.so /usr/lib64/libmpi_f77.so.0

or you'll get
./nwchem: error while loading shared libraries: libmpi.so.0: cannot open shared object file: No such file or directory
Finally, make sure we can find our mpirun:
sudo ln -s /usr/lib64/openmpi/1.4-gcc/bin/mpirun /usr/bin/mpirun


ALL CASES
Continue here:
We'll be working in /export/home/me/tmp
wget http://www.nwchem-sw.org/images/Nwchem-6.0.tar.gz
tar -xvf Nwchem-6.0.tar.gz
cd nwchem-6.0

create a file called buildconf.sh and stuff it with the following:
export LARGE_FILES=TRUE
export TCGRSH=/usr/bin/ssh
export NWCHEM_TOP=/export/home/me/tmp/nwchem-6.0
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES=all
export USE_MPI=y
export USE_MPIF=y
export MPI_LOC=/usr/lib64/openmpi/1.4-gcc/lib
export MPI_INCLUDE=/usr/lib64/openmpi/1.4-gcc/include
export LIBMPI="-lmpi -lopen-rte -lopen-pal -ldl -lmpi_f77 -lpthread"
cd $NWCHEM_TOP/src
make clean
make nwchem_config
make FC=gfortran
NOTE: the above buildconf.sh works for the case where you installed openmpi yourself via yum (CASE 3). If it got installed with ROCKS on setup and is present in /opt/openmpi (CASE 1 or 2), change the following:

export MPI_LOC=/opt/openmpi/lib
export MPI_INCLUDE=/opt/openmpi/include
Launch the build

sh buildconf.sh

You'll end up with a binary called nwchem in nwchem-6.0/bin/LINUX64 -- you can put a PATH to it in your ~/.bashrc


CASE 3
For execution you will need to make sure nwchem can find the openmpi libs --
echo $LD_LIBRARY_PATH
will tell you whether the path is included by default.
Otherwise, in either your ~/.bashrc (per user) or /etc/profile (global) put
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib64/openmpi/1.4-gcc/lib


Running
If you move nwchem out of the compilation directory (to say /usr/local/nwchem) you may also want to define e.g.

export NWCHEM_TOP=/usr/local/nwchem-6.0
export NWCHEM_TARGET=LINUX64
export NWCHEM_BASIS_LIBRARY=${NWCHEM_TOP}/libraries/

Again, this goes into your .bashrc or /etc/profile, depending on scope.

To use multiple cores, do
mpirun -n 4 nwchem jobname.nw
where the number of cores is 4.


Errors and troubleshooting:
If you get errors about libraries missing or mpicc-related errors, you should make sure that you've symlinked everything you need into the /usr/lib folder or set the LIBRARY_PATH (see above). You could probably edit /etc/ld.so.conf too, but it will get messy with time.
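ldd is the quickest way to see which libraries fail to resolve -- adjust the binary path to wherever your build ended up:

ldd /export/home/me/tmp/nwchem-6.0/bin/LINUX64/nwchem | grep 'not found'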

I also tried building using mpich2-1.2 as well as 1.4, but got error messages about undefined references left and right.