Hello,
No, currently it is not possible. Could you tell me a bit more about why you need such functionality?
When I run my simulations, I like to have a rough idea of where they are going. So before switching to the ALPS libraries for scheduling, I defined a threshold for how many sweeps were done on a given task before switching to a different one. That way, I could run a job for maybe 24 hours and see where the results were headed. This is especially useful when the cluster queue is loaded and it might take some time before a resubmitted job starts. I could set the number of sweeps to a very large number and see whether the result had converged or not. If it had, I would deactivate the task via an entry in its file. That way, jobs that require less work don't take computing time away from those that need more.
I am guessing the same can be done in ALPS by defining a rather low number of sweeps and then changing the XML files if needed, right?
Does the parapack scheduler have this functionality?
The easiest approach is to split the simulation into several input files, each containing four instances, and then run them in alternation for some fixed time.
Matthias
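[A sketch of the splitting Matthias suggests, for reference. The chunking helper and the example parameter sets are hypothetical; the pyalps calls shown in the comments (writeInputFiles, runApplication) follow the standard ALPS Python tutorials, but check them against your ALPS version.]

```python
# Split one simulation's task list into several input files, each holding
# a fixed number of tasks (four here, as suggested above), so the groups
# can be run in alternation for some fixed wall-clock time.

def chunk(parms, size):
    """Split a list of task parameter dicts into groups of `size`."""
    return [parms[i:i + size] for i in range(0, len(parms), size)]

# Hypothetical example: ten tasks differing only in temperature.
parms = [{'LATTICE': 'square lattice', 'MODEL': 'Ising', 'T': t}
         for t in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0)]

groups = chunk(parms, 4)   # three groups: 4 + 4 + 2 tasks

# One would then write a separate input file per group and run each for a
# fixed time, e.g. with the ALPS Python layer:
#     for i, g in enumerate(groups):
#         input_file = pyalps.writeInputFiles('parm_%d' % i, g)
#         pyalps.runApplication('spinmc', input_file, Tmin=5)
```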
That is of course true. When running alpspython on an input file, can I specify the number of input files the job should be split into?
Best, Peter