The configuration seems fine, and the HDF5 messages are not related to the MPI execution.
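
If you nevertheless want to double-check how your HDF5 build was configured, you can inspect its build summary. A minimal sketch, assuming the h5cc wrapper was installed into ~/opt/bin next to the library:

    # Print HDF5's build-time settings and look for the thread-safety flag.
    ~/opt/bin/h5cc -showconfig | grep -i threadsafety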

Can you clarify a bit what you mean by “see a mess”? How do you run the MPI application?
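
For example, do you launch it directly through mpirun, along these lines (just a sketch; spinmc and the parameter file name are placeholders):

    # Run an ALPS tutorial application on 24 MPI processes;
    # the --mpi flag tells the ALPS scheduler to run in parallel.
    mpirun -np 24 spinmc --mpi parm.in.xml

Or do you set MPI=24 through pyalps? The exact command would help.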


Michele

--
ETH Zurich
Michele Dolfi
Institute for Theoretical Physics
HIT G 32.4
Wolfgang-Pauli-Str. 27
8093 Zurich
Switzerland


+41 44 633 78 56 phone
+41 44 633 11 15 fax

On Oct 29, 2016, at 9:42 PM, Tadeusz Wasiutyński <tadeusz.wasiutynski@gmail.com> wrote:

Hello,
I ran into a problem on my CentOS 7 installation, on a workstation with 24 threads. After running:

~/opt/bin/cmake \
  -D Boost_ROOT_DIR:PATH=~/alps-2.2.b4-src-with-boost/boost/ \
  -D MPI_C_LIBRARIES=/usr/lib64/openmpi/libmpi.so \
  -D MPI_C_INCLUDE_PATH=/usr/lib64/openmpi/include/ \
  -D MPI_CXX_INCLUDE_PATH=/usr/lib64/openmpi/include \
  -D LPSolve_LIBRARY=/usr/lib64/liblpsolve55.so \
  -D LPSolve_INCLUDE_DIR=/usr/include/lpsolve/ \
  -D HDF5_LIBRARIES=~/opt/lib/libhdf5.so \
  -D HDF5_INCLUDE_DIR=~/opt/include/ \
  -D SZIP_LIBRARIES=/usr/lib64/lib/libsz.so \
  -D SZIP_INCLUDE_DIRS=/usr/lib64/include/ \
  ~/alps-2.2.b4-src-with-boost/alps/

I receive:

-- Compiler version: c++ (GCC) 4.8.5 20150623 (Red Hat 4.8.5-4)
-- Build type: Release
-- Python interpreter /usr/bin/python
-- Python interpreter ok : version 2.7.5
-- PYTHON_INCLUDE_DIRS =  /usr/include/python2.7
-- PYTHON_NUMPY_INCLUDE_DIR = /usr/lib64/python2.7/site-packages/numpy/core/include
-- PYTHON_SITE_PKG = /usr/lib/python2.7/site-packages
-- PYTHON_LIBRARY = /usr/lib64/python2.7/config/libpython2.7.so
-- PYTHON_EXTRA_LIBS =-lpthread -ldl  -lutil
-- PYTHON_LINK_FOR_SHARED =  -Xlinker -export-dynamic
-- ALPS version: 2.2.b4
-- Looking for Boost Source
-- Found Boost Source: /home/twasiutynsk/alps-2.2.b4-src-with-boost/boost
-- Boost Version: 1_58_0
-- Adding Boost dir: /home/twasiutynsk/alps-2.2.b4-src-with-boost/boost 
-- MPI compiler was /usr/lib64/openmpi/bin/mpicxx
-- Falling back to CMake provided LAPACK/BLAS detection.
-- A library with BLAS API found.
-- A library with BLAS API found.
-- A library with LAPACK API found.
-- SQLite Library: not found
-- Could NOT find SZIP (missing:  SZIP_LIBRARIES SZIP_INCLUDE_DIRS) 
-- HDF5 without THREADSAFE mode. ALPS will ensure thread safety by HDF5 running sequentially.
-- Python interpreter /usr/bin/python
-- Python interpreter ok : version 2.7.5
-- PYTHON_INCLUDE_DIRS =  /usr/include/python2.7
-- PYTHON_NUMPY_INCLUDE_DIR = /usr/lib64/python2.7/site-packages/numpy/core/include
-- PYTHON_SITE_PKG = /usr/lib/python2.7/site-packages
-- PYTHON_LIBRARY = /usr/lib64/python2.7/config/libpython2.7.so
-- PYTHON_EXTRA_LIBS =-lpthread -ldl  -lutil
-- PYTHON_LINK_FOR_SHARED =  -Xlinker -export-dynamic
-- Numpy include in /usr/lib64/python2.7/site-packages/numpy/core/include
-- ALPS XML dir is /opt/alps/lib/xml
-- HDF5 without THREADSAFE mode. ALPS will ensure thread safety by HDF5 running sequentially.
-- MPS: enabling NU1 symmetry.
-- HDF5 without THREADSAFE mode. ALPS will ensure thread safety by HDF5 running sequentially.
-- tebd will not be built
-- HDF5 without THREADSAFE mode. ALPS will ensure thread safety by HDF5 running sequentially.
-- Configuring done
-- Generating done
-- Build files have been written to: /home/twasiutynsk/build

In ccmake I see MPI is ON. make and make tests (all tests passed) go smoothly. In the tutorial runs, however, I see a mess with MPI=24, while everything goes OK with MPI=1.
Is something wrong with my HDF5?

On Docker Hub I found dolfim/alps, but I could not run it, probably because of some path problems. Has anyone done this successfully?
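
In case it matters, the usual way to expose host files to such a container would be something like the following (just a sketch; the /work mount point and the assumption that the image provides a shell are mine):

    # Mount the current directory into the container and work from there,
    # so files referenced by relative paths are visible inside it.
    docker run --rm -it -v "$PWD":/work -w /work dolfim/alps bash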
Regards

-- 
Tadeusz Wasiutyński


