Dear ALPS developers,
I am running worm simulations for many different parameter sets on a cluster. The problem I am having is a filesystem bandwidth limitation, probably mostly because other users share the system. What I have found is that the generated output data is quite big: one simulation with approximately 20 tasks (worm simulation, 1,000,000 sweeps, varying t in the Hubbard model over [0.03, 0.25]) produced 72 MB of data. Since I have many different simulations, this quickly adds up to more than 10 GB. I am not using the --write-xml option.

My question is: why is the output so big? From what I understand, the only things that need to be stored are the observables with their errors (plus checkpoint data). Is there any way to either minimize the generated output (so I don't contribute to the bandwidth limitations of the cluster), or copy only the data that I need (namely the averages) to my laptop? I would like to work on the data without being connected to the cluster, and copying many GB of files seems wasteful (and is really slow on this system).
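To make the second option concrete, something along these lines is what I had in mind: a small script run on the cluster that copies only the averages and error bars into one summary file I can download. This is only a rough sketch; the /simulation/results/<name>/mean/value and .../mean/error paths and the *.task*.out.h5 file naming are my assumptions about the HDF5 layout, so please correct me if they are wrong:

import glob
import h5py

# Collect only mean values and error bars from the ALPS result files
# into a single small summary.h5 that is cheap to transfer.
with h5py.File("summary.h5", "w") as out:
    for fname in glob.glob("*.task*.out.h5"):       # per-task result files (assumed naming)
        with h5py.File(fname, "r") as src:
            results = src.get("/simulation/results")  # assumed location of observables
            if results is None:
                continue
            grp = out.create_group(fname)
            for obs in results:
                mean = results[obs].get("mean")
                if mean is None:
                    continue
                g = grp.create_group(obs)
                g["mean"] = mean["value"][()]         # average of the observable
                if "error" in mean:
                    g["error"] = mean["error"][()]    # its error bar

Is something like this the recommended way to do it, or does pyalps already provide a tool for extracting just the averages?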
Regards,
Žiga Osolin