Dear Volodymyr,
The Hilbert space size for 8 up and 8 down particles on 32 sites is about 10^14 basis states. To store the matrix you will thus need about 10^29 bytes of memory, or about 10^14 petabytes, 14 orders of magnitude beyond the memory of the biggest supercomputer in the world. The scaling is exponential.
Matthias
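These estimates can be reproduced with a minimal Python sketch (the 8 bytes per matrix element assume dense real double-precision storage, which is an assumption of this illustration):

    from math import comb

    # 8 up and 8 down fermions on 32 sites: each spin sector contributes C(32, 8)
    dim = comb(32, 8) ** 2
    print(f"Hilbert space dimension: {dim:.2e}")        # ~1.1e+14

    # Dense Hamiltonian: dim^2 elements, 8 bytes each (real double precision)
    mem_bytes = dim ** 2 * 8
    print(f"Dense matrix: {mem_bytes:.2e} bytes = {mem_bytes / 1e15:.2e} PB")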
On Nov 25, 2011, at 2:14 PM, Volodymyr Derzhko wrote:
Dear Mr. Troyer,
Thank you for your answer. I'm going to set N_total to about half the number of lattice sites, so in this case about 16. But I also want to look a bit more closely at the one-particle problem and compare the ED results with analytical ones. Do you think there is a solution to that? A more powerful computer? Waiting longer?
Still, it is a bit strange, because for 8 and 16 sites the diagonalization took 6 and 30 seconds, respectively. I suppose that is less time than the program would need to build and diagonalize a matrix in a 2^16- or 2^32-dimensional space.
Thank you for help!
Volodymyr Derzhko
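For the one-particle comparison the Hamiltonian is only a 32 x 32 matrix, so ED is trivial and can be checked against an analytic result directly. A minimal sketch, assuming a 1D periodic chain with nearest-neighbour hopping t (the thread does not specify the lattice or model, so this is only an illustration):

    import numpy as np

    L, t = 32, 1.0

    # One-particle tight-binding Hamiltonian on a periodic chain
    H = np.zeros((L, L))
    for i in range(L):
        H[i, (i + 1) % L] = H[(i + 1) % L, i] = -t

    ed_levels = np.sort(np.linalg.eigvalsh(H))

    # Analytic dispersion: E(k) = -2 t cos(k), with k = 2 pi n / L
    k = 2 * np.pi * np.arange(L) / L
    analytic = np.sort(-2 * t * np.cos(k))

    print(np.max(np.abs(ed_levels - analytic)))   # agrees to machine precision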
It's a limitation because the code is not optimized for very few particles: it essentially builds up the full Hilbert space of spinful fermions on 32 sites (2^64 states) and filters it down to your few states. What are the target particle numbers that you are aiming for?
Matthias
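A common alternative in ED codes, sketched here only for contrast with the filtering approach described above (and not necessarily what this code does), is to enumerate the fixed-particle-number sector directly with Gosper's hack, so that only C(L, N) states are ever generated instead of 2^L:

    def fixed_n_states(L, N):
        """Yield all L-bit integers with exactly N bits set (N >= 1)."""
        v = (1 << N) - 1                    # smallest pattern: N ones at the bottom
        while v < (1 << L):
            yield v
            t = (v | (v - 1)) + 1           # Gosper's hack: next pattern with N bits set
            v = t | ((((t & -t) // (v & -v)) >> 1) - 1)

    print(sum(1 for _ in fixed_n_states(8, 4)))   # 70 == C(8, 4)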