Cyclone case with MPI parallelization crashing with caught signal 11 segmentation faults

I am using MFIX to model a simple cyclone (files attached).

Whenever I run it in parallel using MPI (Slurm script attached), I get a segmentation fault after a few time steps.

This is strange because the error does not appear at the start of the simulation but only some time later (OUT file attached).

out_file.txt (247.0 KB)
geometry_0001.stl (2.1 MB)
vtk_out.f (158.2 KB)
fassani_cyclone.mfx (21.0 KB)
job_submit_file.txt (1.1 KB)

I am not able to reproduce the segfault.

  1. What is your parallel decomposition (NODESI, NODESJ, NODESK)? (See the snippet after this list.)
  2. What MPI version are you using, and did you build the solver with the same version used when submitting the job?
  3. Can you please try to build the solver without SMP support (DMP only) and with debug flags? Choose “Debug” as the build type; this may give better details if it crashes.
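
For reference, the decomposition is set with the NODESI/NODESJ/NODESK keywords in the project file. A minimal sketch follows; the keyword names are standard MFIX, but the values are only an example, assuming a 2 x 2 x 2 split across 8 MPI ranks:

```
! Example domain decomposition in the .mfx project file.
! The product nodesi*nodesj*nodesk must match the number of MPI ranks.
nodesi = 2    ! partitions in the x-direction
nodesj = 2    ! partitions in the y-direction
nodesk = 2    ! partitions in the z-direction
```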

Your STL file is not watertight and it would be better to clean it up, but I cannot tell if this is the issue because it doesn't fail on my system.
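
If it helps, here is a minimal sketch for checking (and roughly patching) the STL, assuming the trimesh Python package is installed. This is just a quick diagnostic, not part of MFIX:

```python
# Quick watertightness check for the attached STL (assumes `pip install trimesh`).
import trimesh

mesh = trimesh.load("geometry_0001.stl")
print("watertight:", mesh.is_watertight)
print("faces:", len(mesh.faces), "euler number:", mesh.euler_number)

# Try a simple hole fill and re-check; for anything more serious,
# repair the geometry in a dedicated mesh tool instead.
trimesh.repair.fill_holes(mesh)
print("watertight after fill_holes:", mesh.is_watertight)
mesh.export("geometry_0001_filled.stl")
```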

  1. The nodes are 2 x 2 x 2 (NODESI x NODESJ x NODESK)
  2. The MPI version is 4.1.2, both for building the solver and for job submission
  3. I am trying the pure DMP solver now and will let you know by this afternoon

Dear Jeff,

I tried the pure DMP solver and got the same error. In this case as well, the [I J K] decomposition is [2 2 2].

I have attached the OUT file for your reference.
crashed_out_file.txt (290.6 KB)

Also, I ran it with pure MPI (no SMP threads), and this was at the top of the error traceback:

4 /home/ub07846/cyclone_HPC/Pressure_fixed/case_2a/mfixsolver.cpython-310-x86_64-linux-gnu.so(__pic_wall_bcs_MOD_apply_wall_bc_pic+0x5fb) [0x7f9e3bde293b]
5 /home/ub07846/cyclone_HPC/Pressure_fixed/case_2a/mfixsolver.cpython-310-x86_64-linux-gnu.so(__pic_time_march_mod_MOD_pic_time_step+0x93) [0x7f9e3bd9b1e3]

It seems something is wrong in pic_wall_bcs (apply_wall_bc_pic) and pic_time_march (pic_time_step)?