import z2pack error

Wei Li verali at udel.edu
Mon Feb 13 23:40:19 CET 2017


Hi Dominik,

Thanks, it runs well now.

For the given example (Bi) calculation, I need to set the search_shells parameter. What value do you recommend?
I increased it to 30 from the default value of 12, but it still cannot satisfy the B1 condition:

 Unable to satisfy B1 with any of the first  30 shells
 Your cell might be very long, or you may have an irregular MP grid
 Try increasing the parameter search_shells in the win file (default=12)

Or do I need to check other parameters?

Here is my input:

num_wann = 10
num_bands = 10
spinors = true
search_shells = 30
num_iter = 0
exclude_bands = 11-15

Regards
Wei Li


> On Feb 11, 2017, at 4:00 PM, Dominik Gresch <greschd at phys.ethz.ch> wrote:
> 
> Dear Wei Li,
> 
> If everything works correctly, Z2Pack should automatically call VASP (via the given command) and read the .mmn file after VASP has generated it. This is because Z2Pack needs to iteratively change the k-point path with which VASP is called and for which the overlap file is calculated.
> 
> In your case, it seems that you are submitting the VASP run to some scheduler -- this means the command returns immediately, before the .mmn file has been generated. As a result, Z2Pack tries to read the file, but it does not exist yet. You can solve this problem by instead submitting the Python process running Z2Pack, and then running VASP within that same job.
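> 
> As a sketch (assuming a Grid Engine scheduler and the paths mentioned elsewhere in this thread -- adapt them to your cluster), the job script you submit would itself run the Z2Pack driver:
> 
> ```shell
> #!/bin/bash
> #$ -pe mpi 40
> # Submit THIS script with qsub. The Python process running Z2Pack then
> # lives inside the job, so it is still there to read wannier90.mmn
> # after each VASP run has finished.
> source py35/bin/activate
> python run.py
> ```
> 
> Inside run.py, the command option would then invoke VASP directly (e.g. command="mpirun /home/work/mtg/users/bin/vasp_ncl"), which blocks until the .mmn file has been written.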
> 
> Best regards,
> 
> Dominik
> 
> On 11.02.2017 20:40, Wei Li wrote:
>> Dear Dominik,
>> 
>> From what I understand, after calling the first-principles code (in my case, VASP), the next step is to
>> read the .mmn file at the top level of the build directory. So I set the mmn path as below:
>> 
>> system = z2pack.fp.System(
>>     input_files=["input/CHGCAR", "input/INCAR", "input/POSCAR", "input/POTCAR", "input/wannier90.win"],
>>     kpt_fct=z2pack.fp.kpoint.vasp,
>>     kpt_path="KPOINTS",
>>     command="qsub /home/work/mtg/users/verali/Bi_Z2/Z2/paral.qs",
>>     mmn_path="wannier90.mmn"
>> )
>> 
>> But it does not work. 
>> 
>> Could you advise how to read the mmn file?
>> 
>> Regards
>> Wei Li
>> 
>> 
>>> On Feb 10, 2017, at 2:59 PM, Wei Li <verali at udel.edu> wrote:
>>> 
>>> Dear Dominik,
>>> 
>>> Thanks for your reply. I solved this problem.
>>> 
>>> I generated the build directory with the following contents:
>>> 
>>> CHG     CONTCAR  EIGENVAL  INCAR    OSZICAR  paral.qs.o889660  POSCAR  REPORT       wannier90.eig  wannier90.win   WAVECAR
>>> CHGCAR  DOSCAR   IBZKPT    KPOINTS  OUTCAR   PCDAT             POTCAR  vasprun.xml  wannier90.mmn  wannier90.wout  XDATCAR
>>> 
>>> There is no error.
>>> 
>>> Which script do I need to read the .mmn file and plot the result? So far, the result file is empty.
>>> 
>>> Regards
>>> Vera 
>>> 
>>> 
>>>> On Feb 8, 2017, at 11:16 AM, Dominik Gresch <greschd at phys.ethz.ch> wrote:
>>>> 
>>>> Dear Wei Li,
>>>> 
>>>> It seems you are running multiple instances of the Python interpreter, due to the 'mpirun' command. There seems to be a bit of a race condition when they all try to create the 'results' folder.
>>>> 
>>>> In general, you shouldn't run Python itself with mpirun, only the first-principles command (command="mpirun YOUR_VASP_COMMAND" in the z2pack.fp.System options).
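>>>> 
>>>> As a configuration sketch (reusing the file names and VASP path quoted elsewhere in this thread; not runnable on its own, since it needs the VASP inputs in place):
>>>> 
>>>> ```python
>>>> import z2pack
>>>> 
>>>> # Only the first-principles command uses mpirun; the script itself
>>>> # is started as a plain `python run.py`.
>>>> system = z2pack.fp.System(
>>>>     input_files=["input/CHGCAR", "input/INCAR", "input/POSCAR", "input/POTCAR", "input/wannier90.win"],
>>>>     kpt_fct=z2pack.fp.kpoint.vasp,
>>>>     kpt_path="KPOINTS",
>>>>     command="mpirun /home/work/mtg/users/bin/vasp_ncl",
>>>>     mmn_path="wannier90.mmn"
>>>> )
>>>> ```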
>>>> Best regards,
>>>> 
>>>> Dominik Gresch
>>>> 
>>>> On 08.02.2017 17:08, Wei Li wrote:
>>>>> Dear Dominik Gresch,
>>>>> 
>>>>> Thanks a lot.
>>>>> 
>>>>> I activated py35 and it works.
>>>>> 
>>>>> Do I need to change any parameters if I want to run the example given here:
>>>>> https://github.com/Z2PackDev/Z2Pack/tree/master/2.0.x/examples/fp/Bi_vasp
>>>>> 
>>>>> I ran it directly and got the attached output. I did not find any error in the output file or log file, but the process was terminated.
>>>>> 
>>>>> 
>>>>> Regards
>>>>> Wei Li
>>>>> 
>>>>> 
>>>>>> On Feb 8, 2017, at 4:53 AM, Dominik Gresch <greschd at phys.ethz.ch> wrote:
>>>>>> 
>>>>>> Dear Wei Li,
>>>>>> 
>>>>>> The python version on which you installed Z2Pack (3.5.2) is probably not the same as you are using to run the script. Since it seems you are using a virtualenv (py35) when running Python from the command line, you should make sure the same virtualenv is also active in your job (i.e., "source py35/bin/activate" in your script).
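>>>>>> 
>>>>>> For example (hypothetical path -- use wherever your py35 virtualenv actually lives), the job script would start Python like this:
>>>>>> 
>>>>>> ```shell
>>>>>> # Activate the environment in which Z2Pack was installed, so that
>>>>>> # `python` below is the interpreter that can import z2pack.
>>>>>> source py35/bin/activate
>>>>>> python run.py
>>>>>> ```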
>>>>>> 
>>>>>> Best regards,
>>>>>> 
>>>>>> Dominik Gresch
>>>>>> On 08.02.2017 10:31, Wei Li wrote:
>>>>>>> Dear Z2pack team,
>>>>>>> 
>>>>>>> I have compiled Wannier90 1.2 on top of VASP, and then downloaded and installed Z2Pack as shown below:
>>>>>>> 
>>>>>>> (py35) [verali at farber input]$ python
>>>>>>> Python 3.5.2 |Anaconda 4.2.0 (64-bit)| (default, Jul  2 2016, 17:53:06) 
>>>>>>> [GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux
>>>>>>> Type "help", "copyright", "credits" or "license" for more information.
>>>>>>> >>> import z2pack
>>>>>>> >>> z2pack.__version__
>>>>>>> '2.0.3'
>>>>>>> >>> 
>>>>>>> 
>>>>>>> I think this means I have successfully installed z2pack, right?
>>>>>>> 
>>>>>>> But when I ran the run.py script, I got this error:
>>>>>>> 
>>>>>>>   File "run.py", line 14, in <module>
>>>>>>> Traceback (most recent call last):
>>>>>>>   File "run.py", line 14, in <module>
>>>>>>>     import z2pack
>>>>>>> ImportError: No module named z2pack
>>>>>>> Traceback (most recent call last):
>>>>>>>   File "run.py", line 14, in <module>
>>>>>>> 
>>>>>>> 
>>>>>>> Below is my job file:
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> vpkg_require intel/2015.3.187
>>>>>>> vpkg_require openmpi/1.8.5
>>>>>>> 
>>>>>>> #$ -pe mpi 40 
>>>>>>> #$ -l m_mem_free=3G
>>>>>>> #$ -l h_cpu=09:00:00
>>>>>>> ##$ -l exclusive=1
>>>>>>> 
>>>>>>> MY_EXE="/home/work/mtg/users/bin/vasp_ncl"
>>>>>>> 
>>>>>>> MY_EXE_ARGS=()
>>>>>>> 
>>>>>>> #
>>>>>>> # Uncomment to enable lots of additional information as OpenMPI executes
>>>>>>> # your job:
>>>>>>> #
>>>>>>> #SHOW_MPI_DEBUGGING=YES
>>>>>>> 
>>>>>>> ##
>>>>>>> ## You should NOT need to change anything after this comment.
>>>>>>> ##
>>>>>>> OPENMPI_FLAGS="--display-map --mca btl ^openib --mca oob_tcp_if_exclude lo,ib0 --mca btl_tcp_if_exclude lo,ib0"
>>>>>>> if [ "${WANT_CPU_AFFINITY:-NO}" = "YES" ]; then
>>>>>>>   OPENMPI_FLAGS="${OPENMPI_FLAGS} --bind-to core"
>>>>>>> fi
>>>>>>> if [ "${WANT_NPROC:-0}" -gt 0 ]; then
>>>>>>>   OPENMPI_FLAGS="${OPENMPI_FLAGS} --np ${WANT_NPROC} --map-by node"
>>>>>>> fi
>>>>>>> if [ "${SHOW_MPI_DEBUGGING:-NO}" = "YES" ]; then
>>>>>>>   OPENMPI_FLAGS="${OPENMPI_FLAGS} --debug-devel --debug-daemons --display-devel-map --display-devel-allocation --mca mca_verbose 1 --mca coll_base_verbose 1 --mca ras_base_verbose 1 --mca ras_gridengine_debug 1 --mca ras_gridengine_verbose 1 --mca btl_base_verbose 1 --mca mtl_base_verbose 1 --mca plm_base_verbose 1 --mca pls_rsh_debug 1"
>>>>>>>   if [ "${WANT_CPU_AFFINITY:-NO}" = "YES" ]; then
>>>>>>>     OPENMPI_FLAGS="${OPENMPI_FLAGS} --report-bindings"
>>>>>>>   fi
>>>>>>> fi
>>>>>>> 
>>>>>>> echo "GridEngine parameters:"
>>>>>>> echo "  mpirun        = "`which mpirun`
>>>>>>> echo "  nhosts        = $NHOSTS"
>>>>>>> echo "  nproc         = $NSLOTS"
>>>>>>> echo "  executable    = $MY_EXE"
>>>>>>> echo "  MPI flags     = $OPENMPI_FLAGS"
>>>>>>> echo "-- begin OPENMPI run -- on "
>>>>>>> mpirun ${OPENMPI_FLAGS} python run.py
>>>>>>> echo "-- end OPENMPI run -- on "
>>>>>>> 
>>>>>>> Please advise how I can solve this problem.
>>>>>>> 
>>>>>>> Thanks in advance.
>>>>>>> 
>>>>>>> Wei Li
>>>>>>> Ph.D student 
>>>>>>> University of Delaware
>>>>>>> 
>>>>>> 
>>>>> 
>>>> 
>>> 
>> 
> 
