Hi,
I have been using ALPS on my home computer under Cygwin, but I recently installed it on a system that is hooked up to some computer clusters at my university. I got ALPS to install except for HDF5 support. I tried the HDF5 library already on the server and also installed one myself from the HDF website, but neither one worked: ./configure reported that it couldn't find H5File() (or something like that) in one of the libraries. How important is it to have HDF5? Any suggestions on what to try to get ALPS to install with HDF5?
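In case it helps diagnose this, here is roughly the kind of link test I believe configure is running (this is my guess; the HDF5_DIR path and library names are assumptions for my setup, not something from the ALPS docs):

```shell
# Write a minimal HDF5 C++ test program like the one configure probably tries
# to compile and link. H5::H5File is the class the configure error mentions.
cat > h5check.cpp <<'EOF'
#include <H5Cpp.h>          // HDF5 C++ API header
int main() {
    // Creating an H5File is roughly what a configure link test would exercise
    H5::H5File file("check.h5", H5F_ACC_TRUNC);
    return 0;
}
EOF
# Print (rather than run) the compile line, since HDF5_DIR varies per machine:
echo 'Try: g++ h5check.cpp -I$HDF5_DIR/include -L$HDF5_DIR/lib -lhdf5_cpp -lhdf5 -o h5check'
```

If that compile line fails by hand with the same missing-H5File error, then the C++ wrapper library (libhdf5_cpp) is probably what's missing or mismatched, rather than the core HDF5 library itself.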
My second question concerns parallel processing and ALPS. I used the --with-pthread flag when I installed, and it is installed with MPI support, but is ALPS really set up well for parallel processing? I ask because I saw that the --with-pthread flag was described as experimental.
Also, I have several different clusters available to me for use. Any suggestions as to which one would work best? Here are their characteristics:

# Tunnel Arch (48 nodes, 96 procs)
* "data mining cluster" for jobs requiring large memory
* 1.4 and 1.8 GHz Opteron processors
* 4 Gbytes memory per node
* Gigabit Ethernet interconnect
# Marching Men (164 nodes, 328 procs)
* "cycle farm" for serial and smaller parallel jobs which do not require a high-speed interconnect
* 1.4 and 1.8 GHz Opteron processors
* 2 Gbytes memory per node
* Gigabit Ethernet interconnect
# Delicate Arch (256 nodes, 512 procs)
* "parallel cluster" for highly parallel jobs requiring a high-speed interconnect
* 1.4 GHz Opteron processors
* 2 Gbytes memory per node
* Both Myrinet and Gigabit Ethernet interconnects
# Sanddune Arch (156 nodes, 312 procs, 624 cores)
* "parallel cluster" for highly parallel jobs requiring a high-speed interconnect
* 2.4 GHz dual-core Opteron processors
* 8 Gbytes memory per node (2 Gbytes per processor core)
* Both Infiniband and Gigabit Ethernet interconnects
Thank you for the help,
Justin Peel