Errors when running the tutorial case "FB_SQP" with a DMP solver in MFIX-23.1

Dear mfix team,

The command window reported “Error: solver crash! & Error STOP” when I tried to run the tutorial case “FB_SQP” with the DMP solver I built in MFIX-23.1. I cannot find a solution.

The simulation parameters in this tutorial case were not adjusted, and this case ran well using the default solver.

To build a DMP solver successfully, I changed a line in CMakeLists.txt as suggested by @cgw (Errors when building a DMP solver in MFIX-22.4 on CentOS 7.2). I am wondering if this could cause such an error in MFIX-23.1.

I don’t think changing that compiler flag is a problem - it just turns off a certain optimization, and turning it off should be safe.

Please attach your project files (Main menu → Submit bug report) and I’ll have a look.

Thanks, @cgw, the project files are now attached.
fb_sqp_2023-04-25T211047.061194.zip (28.4 KB)

gfortran:            GNU Fortran (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)
openmpi:             mpirun (Open MPI) 1.10.7

Both your Fortran compiler and your OpenMPI installation are antiques.
Can you upgrade? The current OpenMPI version is 4.1.5.

Note that gfortran and OpenMPI are in conda-forge, so you can upgrade very easily.

$ conda activate mfix-23.1
$ conda install -c conda-forge gfortran openmpi

or if you use Mamba:

$ mamba activate mfix-23.1
$ mamba install gfortran openmpi

These will install in $CONDA_PREFIX so they won’t affect the ‘system’ utilities and the installation will not require root privileges.

Thank you for your response, @cgw! I followed your instructions and was able to upgrade gfortran and OpenMPI easily. However, a similar error still occurred; according to the bug-report files (attached below), it seems the old version of OpenMPI was not replaced by the updated one. I am not an expert on this issue at all; could you please give me some additional advice?
fb_sqp_2023-04-25T221245.083289.zip (28.1 KB)

Did you install them into the mfix-23.1 environment? Did you restart MFiX after doing so?

What do you get if you type (with mfix-23.1 activated):

$ which mpirun
$ mpirun --version
$ conda list mpi

Please see below:

How about mfixversioninfo? What does that say?

MFiX: 23.1
Python: 3.10.10 | packaged by conda-forge | (main, Mar 24 2023, 20:08:06) [GCC 11.3.0]
gfortran: GNU Fortran (conda-forge gcc 12.2.0-19) 12.2.0
openmpi: mpirun (Open MPI) 1.10.7
Qt wrapper: PyQt5
Qt: 5.15.6
qtpy: 2.3.1
numpy: 1.24.3
nodeworks: Unavailable
flask: 2.2.3
psutil: 5.9.5
VTK: 9.2.6
OpenGL backend: 2
System info: Linux-3.10.0-327.el7.x86_64-x86_64-with-glibc2.17
Install location: /home/LGQ/anaconda3/envs/mfix-23.1/share/mfix
Default solver: /home/LGQ/anaconda3/envs/mfix-23.1/bin/mfixsolver
Solver source: /home/LGQ/anaconda3/envs/mfix-23.1/share/mfix/src
The new project files were attached before.

This doesn’t make sense to me.

When you run mpirun --version from the command line you get

mpirun (OpenMPI) 4.1.5

but the output of mfixversioninfo says:

mpirun (Open MPI) 1.10.7

The mfixversioninfo script is just running:

        status, output = subprocess.getstatusoutput(
            'mpirun --version')
        if status==0:
            OPENMPI_VERSION = output.splitlines()[0]

It’s running the same mpirun --version command that you typed in, but getting a different result. I do not understand this. Are you sure this is running in the same environment?
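
One quick way to rule out a stale binary or a cached PATH entry (a sketch, assuming a bash shell with the mfix-23.1 environment active):

$ echo $CONDA_PREFIX
$ which -a mpirun       # list every mpirun on the PATH, not just the first
$ hash -r               # clear bash's cached command locations
$ mpirun --version

If which -a shows more than one mpirun, an older installation may be shadowing the conda one.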

Sorry for the confusion; the command window reads:
(base) [LGQ@node15 Desktop]$ conda activate mfix-23.1
(mfix-23.1) [LGQ@node15 Desktop]$ which mpirun
~/anaconda3/envs/mfix-23.1/bin/mpirun
(mfix-23.1) [LGQ@node15 Desktop]$ mpirun --version
mpirun (Open MPI) 4.1.5

Report bugs to http://www.open-mpi.org/community/help/
(mfix-23.1) [LGQ@node15 Desktop]$ conda list mpi

# packages in environment at /home/LGQ/anaconda3/envs/mfix-23.1:
#
# Name                    Version                   Build  Channel
mpi                       1.0                     openmpi  conda-forge
openmpi                   4.1.5              h414af15_101  conda-forge
(mfix-23.1) [LGQ@node15 Desktop]$ mfixversioninfo
MFiX: 23.1
Python: 3.10.10 | packaged by conda-forge | (main, Mar 24 2023, 20:08:06) [GCC 11.3.0]
gfortran: GNU Fortran (conda-forge gcc 12.2.0-19) 12.2.0
openmpi: mpirun (Open MPI) 4.1.5
Qt wrapper: PyQt5
Qt: 5.15.6
qtpy: 2.3.1
numpy: 1.24.3
nodeworks: Unavailable
flask: 2.2.3
psutil: 5.9.5
VTK: 9.2.6
OpenGL backend: 2
System info: Linux-3.10.0-327.el7.x86_64-x86_64-with-glibc2.17
Install location: /home/LGQ/anaconda3/envs/mfix-23.1/share/mfix
Default solver: /home/LGQ/anaconda3/envs/mfix-23.1/bin/mfixsolver
Solver source: /home/LGQ/anaconda3/envs/mfix-23.1/share/mfix/src

I misunderstood your meaning previously.

That all looks correct. The case still won’t run?

No, the case cannot run.


Is it possible that the command “MESA_GL_VERSION_OVERRIDE=3.2 mfix” causes the error?

No, the MESA_GL_VERSION_OVERRIDE setting is unrelated.

But don’t do the module load mpi - that overrides the MPI we installed with an older version on your cluster. Or else get the cluster admins to update the MPI module to something more current!
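
If the module is being loaded automatically (for example from your ~/.bashrc), something along these lines should put the conda-installed MPI first - a sketch, assuming your cluster uses Environment Modules and the module is really named mpi:

$ module unload mpi
$ conda activate mfix-23.1
$ which mpirun mpifort      # both should resolve under $CONDA_PREFIX/bin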

Thanks for your suggestion, but “module load mpi” is required for building a DMP solver, is that right?

Without “module load mpi”, the build output reads:

The idea is to use the mpi we installed from conda-forge instead of the (old) mpi implementation which is in that module.

Alternatively, your systems folks may be able to update the mpi module on your cluster.

Can you please:

  1. Go to the main menu, Settings, and enable “Developer mode”
  2. Go back to the build popup, check “Verbose”
  3. Try the build again, and copy/paste ALL the build output (not screenshot) so I can see the whole thing.

Thanks!

Thanks for your time, @cgw. I followed your instructions; the attached files contain all of the build output (without and with the command “module load mpi”).
With_module-load-mpi.txt (548.6 KB)
Without_module-load-mpi.txt (2.2 KB)

With the command “module load mpi”, the DMP solver builds without any errors. But when I try to run the simulation with the built solver, the MFiX GUI reports:

Also, I didn’t get any errors with previous versions (mfix-22.4.1 and mfix-22.4.3), whether for a case I created myself or the tutorial case.

In the “with module load mpi” file we find:

-- Found MPI_C: /usr/lib64/openmpi/lib/libmpi.so (found version "3.0")
-- Found MPI_Fortran: /usr/lib64/openmpi/lib/libmpi_usempi.so (found version "3.0")
-- Found MPI: TRUE (found version "3.0")

This is using the MPI Fortran that is on your cluster, which looks to be version 3.0.
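
If you want to see exactly what that module does to your environment, Environment Modules can print it (assuming the module name is mpi):

$ module show mpi           # lists the PATH/LD_LIBRARY_PATH entries the module prepends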

If you do not load the module, then we should use the MPI and Fortran that you installed in the Conda env:

In the “without module load mpi” file we see:

-- The Fortran compiler identification is GNU 4.8.5
-- Found MPI_C: /home/LGQ/anaconda3/envs/mfix-23.1/lib/libmpi.so (found version "3.1")
-- Could NOT find MPI_Fortran (missing: MPI_Fortran_WORKS)
CMake Error at /home/LGQ/anaconda3/envs/mfix-23.1/share/cmake-3.26/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
  Could NOT find MPI (missing: MPI_Fortran_FOUND) (found version "3.1")

This is strange for several reasons. It should be finding the gfortran 12.2.0 that you installed into the conda env, but it’s reporting 4.8.5.
And then it’s saying that it found MPI Fortran 3.1 but that it “doesn’t work” (I’m not sure what tests CMake runs to decide whether the compiler works - CMake is big and complicated and tries too hard).
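
Quite possibly the system gfortran 4.8.5 cannot read the Fortran module files that the conda OpenMPI was compiled with (gfortran 12), which would explain the MPI_Fortran_WORKS failure. If so, forcing CMake to use the conda compilers is worth a try - a sketch, where the FC/CC override is my assumption about the cleanest way to steer CMake, using the usual build_mfixsolver DMP invocation:

$ conda activate mfix-23.1
$ FC=$CONDA_PREFIX/bin/gfortran CC=$CONDA_PREFIX/bin/gcc build_mfixsolver --batch --dmp

CMake honors the FC and CC environment variables when selecting compilers, so this should keep it from falling back to the system gfortran 4.8.5.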

Can you activate the mfix-23.1 environment and type conda list and show me the output (text please, not screenshot). Thanks!

– Charles