Test script using mpi #49
Comments
Can you repeat what your exact error is, and which MultiNest version you are using?
I have MultiNest v3.6. I got many memory management errors:

```
*** glibc detected *** /usr/bin/python2.7: free(): invalid next size (normal): 0x00000000030fd660 ***
ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0 [vsyscall]
```
What do you think about this problem? I also got this message:

```
APPLICATION TERMINATED WITH THE EXIT STRING: Aborted (signal 6)
```
Please make sure you have installed mpi4py.
I am using mpiexec. With 1 core the script works fine, but with more than one core it terminates. I run other programs with mpiexec that are unrelated to MultiNest and they work fine, so I don't think it is a problem with the MPI installation. Just for clarification, here is my ".oar" file to submit the job:

```shell
#!/bin/bash
```
It is still not clear to me whether the correct MultiNest library is loaded (MPI version vs. non-MPI version); see the first few lines of https://github.com/JohannesBuchner/PyMultiNest/blob/master/pymultinest/run.py. The meaning of the error depends on which library is loaded. You can find out which library is loaded with `lsof -p`. I suspect you are not using the MPI version of the MultiNest library.
The problem is that the program terminates very quickly, so I can't get any information from it.
I reinstalled MultiNest and PyMultiNest on a 16 GB MacBook Pro. Without MPI it was perfect as before, but with `mpirun -np 2 python demo.py` I got the following errors, which are exactly the same as in #45, but no clear solution is written there:

```
*** set a breakpoint in malloc_error_break to debug
mpirun noticed that process rank 1 with PID 88238 on node dhcp2-139 exited on signal 6 (Abort trap: 6).
```
It is still not clear to me whether the correct MultiNest library is loaded (MPI version vs. non-MPI version); see the first few lines of https://github.com/JohannesBuchner/PyMultiNest/blob/master/pymultinest/run.py. You can find out which library is loaded with `lsof -p`; you can also use ldd or strace. I suspect you are not using the MPI version of the MultiNest library.
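As a concrete sketch of those checks (the PID `12345` and the library paths below are placeholders, not values from this thread):

```shell
# While the run is alive, list the shared libraries mapped into the
# python process (replace 12345 with the actual PID, e.g. from pgrep):
lsof -p 12345 | grep -i multinest

# Or inspect the library itself before running: an MPI-enabled build of
# libmultinest.so links against an MPI library such as libmpi.
ldd /usr/local/lib/libmultinest.so | grep -i mpi

# On a Mac, otool -L plays the role of ldd:
# otool -L /usr/local/lib/libmultinest.dylib | grep -i mpi
```

If the `ldd` output shows no MPI library at all, the non-MPI build is being loaded and parallel runs will crash exactly as described above.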
@suvenduat Are you using mpi4py? You probably need to. My skeleton pymultinest runs go something like this:
Then supply `init_MPI=False` to `pymultinest.run()`. Invoke as: `mpiexec -np NUMPROCS python code.py`. At one point I had different behaviour with the supposedly synonymous mpiexec and mpirun. As @JohannesBuchner says, make sure you're using the MPI version of libmultinest.so; can you confirm? Let me know if you need any more. It was painful to figure out, but I now routinely have pymultinest running across two or more 48-core nodes.
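The commenter's original snippet was lost from this thread; a minimal sketch of such a skeleton might look like the following. The two-parameter Gaussian likelihood and the `chains/gauss_` output prefix are invented for illustration, not taken from the thread.

```python
def prior(cube, ndim, nparams):
    # Map the unit hypercube onto the parameter range [-5, 5].
    for i in range(ndim):
        cube[i] = cube[i] * 10.0 - 5.0


def loglike(cube, ndim, nparams):
    # Toy Gaussian log-likelihood centred on the origin.
    return -0.5 * sum(cube[i] ** 2 for i in range(ndim))


def run_sampler():
    import pymultinest  # pulls in mpi4py when it is installed
    # init_MPI=False lets mpi4py own MPI_Init instead of MultiNest,
    # avoiding a double initialisation when launched under mpiexec.
    pymultinest.run(loglike, prior, 2,
                    outputfiles_basename='chains/gauss_',
                    init_MPI=False,
                    resume=False, verbose=True)
```

Call `run_sampler()` at the bottom of the script and launch it with `mpiexec -np NUMPROCS python code.py`; every rank runs the same script and MultiNest distributes the likelihood evaluations.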
@jtlz2 Great, pymultinest is working perfectly using mpiexec on my Mac after taking your suggestion (it's clearly explained) and @JohannesBuchner's suggestion.
I should check which mpiexec this is and the reason for it! Any idea would be helpful.
That line should not be needed, because it is already inside pymultinest.
I think you should always set init_MPI=False automatically.
Running successfully now on the cluster. Thank you.
If you can leave any advice for the next person trying to get pymultinest to run with MPI it would be highly appreciated. |
You must install mpi4py to run using MPI, and supply `init_MPI=False` to `pymultinest.run()`.
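In command form, that recipe is roughly the following (the file name `my_fit.py` and the process count are placeholders):

```shell
pip install mpi4py               # MPI bindings that pymultinest detects at import time
mpiexec -np 4 python my_fit.py   # my_fit.py passes init_MPI=False to pymultinest.run()
```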
Great, thank you. In the next version I will add an MPI section to the manual, and change the code to set init_MPI to False by default.
Hi Johannes,
I am having problems running pymultinest using MPI. Without MPI, everything is perfect.
I got the same error as in issue #6, using the same demo script.
I think MPI is correctly installed on the server, but I could not work out what modification I should make to run it properly.
Could you please help me solve this problem? Thank you.
Regards,
Suvendu Rakshit