The case can build the SMP solver, but an error occurs when building the DMP solver

I encountered an error when compiling the silane_pyrolysis_pic_3d case to build the DMP solver, although building the SMP solver works fine, using the latest version 23.3.2 on CentOS 7.9. I would also like to know why the computing speed with SMP using 10 threads is no faster than serial computation.
error_in_build_DMP.txt (3.1 KB)

SMP is not guaranteed to be faster than serial. I suggest you install perf and run perf top during both a serial and SMP run to see what’s going on. This may help point out areas where further optimization/parallelization is possible.
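A rough sketch of that timing comparison, for reference. The solver name and case-file flags below are placeholders for your actual run command, and `OMP_NUM_THREADS` is the usual environment variable controlling the OpenMP thread count:

```shell
#!/bin/sh
# Time the same case with a given OpenMP thread count.
# Usage: run_timed <threads> <command...>
run_timed() {
    threads=$1; shift
    start=$(date +%s)
    OMP_NUM_THREADS=$threads "$@"
    status=$?
    end=$(date +%s)
    echo "threads=$threads elapsed=$((end - start))s"
    return $status
}

# Placeholders -- substitute your real solver invocation:
# run_timed 1  ./mfixsolver -f your_case.mfx
# run_timed 10 ./mfixsolver -f your_case.mfx
# While a run is active, sample hotspots from another shell:
# perf top -p "$(pgrep -n mfixsolver)"
```

If the 10-thread run is no faster, `perf top` will show whether the time is going into parallelized loops at all, or into serial sections that OpenMP cannot speed up.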

Regarding the compilation failure - I do not have CentOS 7 available for testing, but I can try to set it up as a virtual machine when I get a chance. Can you send me the output of conda list and mfixversioninfo, please?

Hi, cgw.
Thanks for your reply. The relevant output is in the attached file.
conda list and mfixversioninfo.txt (31.2 KB)

Can you please repeat the conda list with the mfix-23.3.2 environment activated? Thanks.

Hi @MFiX_LYP

I installed CentOS 7.9 and mfix-23.3.2 in a VirtualBox instance, and building the DMP solver works fine.

Here’s the start of my build log:

Building generic solver
Running cmake command:
cmake -DENABLE_PYMFIX=ON -DENABLE_MPI=ON -G "Unix Makefiles" -DCMAKE_INSTALL_PREFIX=/home/cgw/foo -DUDF_DIR=/home/cgw/foo /home/cgw/mambaforge/envs/mfix-23.3.2/share/mfix/src 

-- Found Python3: /home/cgw/mambaforge/envs/mfix-23.3.2/bin/python3.10 (found version "3.10.13") found components: Interpreter 
-- Setting build type to 'RelWithDebInfo' as none was specified.
-- MFIX build settings summary: 
--    Build type        = RelWithDebInfo
--    CMake version     = 3.27.9
--    Fortran compiler  = 
--    Fortran flags     = 
--    ENABLE_MPI        = ON
--    ENABLE_OpenMP     = OFF
--    ENABLE_CTEST      = OFF
--    ENABLE_COVERAGE   = OFF
-- The Fortran compiler identification is GNU 13.2.0
-- The C compiler identification is GNU 13.2.0
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /home/cgw/mambaforge/envs/mfix-23.3.2/bin/gfortran - skipped
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /home/cgw/mambaforge/envs/mfix-23.3.2/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Performing Test ffpe_trap
-- Performing Test ffpe_trap - Success
-- Performing Test ffpe_summary
-- Performing Test ffpe_summary - Success
-- Found MPI_C: /home/cgw/mambaforge/envs/mfix-23.3.2/lib/libmpi.so (found version "3.1") 
-- Found MPI_Fortran: /home/cgw/mambaforge/envs/mfix-23.3.2/lib/libmpi_usempif08.so (found version "3.1") 
-- Found MPI: TRUE (found version "3.1")  
-- Reading version info from /home/cgw/mambaforge/envs/mfix-23.3.2/share/mfix/src/model/../VERSION
-- Python3_EXECUTABLE /home/cgw/mambaforge/envs/mfix-23.3.2/bin/python3.10
-- Configuring done (3.7s)
-- Generating done (0.2s)
-- Build files have been written to: /home/cgw/foo/build
Build command:

cmake --build . --target install -j 4

Note that on my system, the cmake command is:

cmake -DENABLE_PYMFIX=ON
      -DENABLE_MPI=ON
      -G "Unix Makefiles" 
      -DCMAKE_INSTALL_PREFIX=...
      -DUDF_DIR=...

while your command is:

cmake -DMPI_Fortran_COMPILER=mpifort 
      -DENABLE_PYMFIX=ON 
      -DENABLE_MPI=ON -G "Unix Makefiles"
      -DCMAKE_INSTALL_PREFIX=...
      -DUDF_DIR=...

Note the extra -DMPI_Fortran_COMPILER=mpifort in your build. I suspect that’s the issue. Where is this coming from? What command are you using — build_mfixsolver or something else?

– Charles

OK, I can add -DMPI_Fortran_COMPILER=mpifort to my build command without triggering an error, so that’s not it.
But I found that if I have mpifort installed from the yum repos, the build fails. Note that CentOS 7 is ancient, and most of its utilities are very old versions. We supply a complete build environment (courtesy of conda-forge) with the MFiX package; those utilities are more up-to-date and more likely to work than whatever is on your system in /usr/bin. (We do our testing with the compilers from conda-forge.)

Although the compiler in the mfix conda environment should take precedence over the broken utilities in /usr/bin, this doesn’t seem to be happening. I will try to find out why, although supporting CentOS 7 is not a high priority for us.
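For anyone diagnosing this kind of precedence problem, here is a minimal sketch (assuming only standard POSIX shell tools) that lists every copy of a program on the PATH in lookup order; the first hit is normally what a plain invocation — and CMake’s compiler detection — resolves to:

```shell
#!/bin/sh
# List every copy of a program found on PATH, in lookup order.
# The first line printed is what a bare "mpifort" invocation resolves to.
probe() {
    prog=$1
    old_ifs=$IFS
    IFS=:
    for dir in $PATH; do
        [ -x "$dir/$prog" ] && echo "$dir/$prog"
    done
    IFS=$old_ifs
}

probe mpifort   # with mfix-23.3.2 active, the conda env's bin should print first
```

If /usr/bin/mpifort (from yum’s openmpi-devel) prints before the copy in the conda environment, the environment is not taking precedence as intended.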

For now, you should be able to:

$ sudo yum erase gcc-gfortran openmpi openmpi-devel

to remove the packages which are causing trouble. Please let me know if this workaround works for you.

Oh, cgw. This is a public machine, so I’m not allowed to do this. Can I specify MPI on the command line, the way the gcc and Fortran compilers can be specified? The conda list with the mfix-23.3.2 environment activated is in the attached file.
conda list with mfix-23.3.2.txt (19.3 KB)

I’m working on a solution. Stay tuned.

If you are building from a command-line, please try this:

(mfix-23.3.2)$ build_mfixsolver -j --dmp -DCMAKE_Fortran_COMPILER=mpifort

Note that CMake is very fussy: the capitalization of CMAKE_Fortran_COMPILER has to be exactly like that or it won’t work.

If you want to make this work from the GUI:

(mfix-23.3.2) $ cd $CONDA_PREFIX/lib/python3.10/site-packages/mfixgui/widgets/

and edit the file build_popup.py (back it up first!)

Change this section, starting at line 569:

   569	        if self.ui.checkbox_dmp.isChecked():
   570	            args += ["--dmp"]
   571	            if compiler:
   572	                args += ["-DMPI_Fortran_COMPILER=" + compiler]
   573	        elif compiler:
   574	            args += ["-DCMAKE_Fortran_COMPILER=" + compiler]

to this:

   569	        if self.ui.checkbox_dmp.isChecked():
   570	            args += ["--dmp"]
   571	
   572	        if compiler:
   573	            args += ["-DCMAKE_Fortran_COMPILER=" + compiler]

That is, delete lines 571 and 572, and change the elif on line 573 to an if.

Restart MFiX and the DMP build should work from the GUI.

We will make this the default behavior in the next MFiX release. Thanks for the bug report.

– Charles

I tried this, but it makes no difference. I see the Build command in the GUI is this:
[screenshot of the Build command]
Is this command wrong?
And I get the same error when building from the command line.

I think it should be -DCMAKE_Fortran_COMPILER not -DMPI_Fortran_COMPILER

Try running
build_mfixsolver -j --dmp -DCMAKE_Fortran_COMPILER=mpifort
from the command line (with the MFiX environment activated)

I tried. The result is this:

(mfix-23.3.2) [test@admin test_silane_parallel]$ build_mfixsolver -j --dmp -DCMAKE_Fortran_COMPILER=mpifort
Building custom solver for test_silane_parallel.mfx
Running cmake command:
cmake -DCMAKE_Fortran_COMPILER=mpifort -DENABLE_PYMFIX=ON -DENABLE_MPI=ON -G "Unix Makefiles" -DCMAKE_INSTALL_PREFIX=/data/home/test/Public/LYP/test/test_silane_parallel -DUDF_DIR=/data/home/test/Public/LYP/test/test_silane_parallel /data/software/anaconda3/envs/mfix-23.3.2/share/mfix/src 

-- Found Python3: /data/software/anaconda3/envs/mfix-23.3.2/bin/python3.10 (found version "3.10.13") found components: Interpreter 
-- Setting build type to 'RelWithDebInfo' as none was specified.
-- MFIX build settings summary: 
--    Build type        = RelWithDebInfo
--    CMake version     = 3.27.8
--    Fortran compiler  = mpifort
--    Fortran flags     = 
--    ENABLE_MPI        = ON
--    ENABLE_OpenMP     = OFF
--    ENABLE_CTEST      = OFF
--    ENABLE_COVERAGE   = OFF
-- The Fortran compiler identification is GNU 13.2.0
-- The C compiler identification is GNU 13.2.0
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /data/software/anaconda3/envs/mfix-23.3.2/bin/mpifort - skipped
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /data/software/anaconda3/envs/mfix-23.3.2/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Performing Test ffpe_trap
-- Performing Test ffpe_trap - Success
-- Performing Test ffpe_summary
-- Performing Test ffpe_summary - Success
-- Could NOT find MPI_C (missing: MPI_C_WORKS) 
-- Found MPI_Fortran: /data/software/anaconda3/envs/mfix-23.3.2/bin/mpifort (found version "3.1") 
CMake Error at /data/software/anaconda3/envs/mfix-23.3.2/share/cmake-3.27/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
  Could NOT find MPI (missing: MPI_C_FOUND) (found version "3.1")
Call Stack (most recent call first):
  /data/software/anaconda3/envs/mfix-23.3.2/share/cmake-3.27/Modules/FindPackageHandleStandardArgs.cmake:600 (_FPHSA_FAILURE_MESSAGE)
  /data/software/anaconda3/envs/mfix-23.3.2/share/cmake-3.27/Modules/FindMPI.cmake:1837 (find_package_handle_standard_args)
  model/CMakeLists.txt:664 (find_package)


-- Configuring incomplete, errors occurred!
==========================================================================
                     BUILD FAILED
==========================================================================
(mfix-23.3.2) [test@admin test_silane_parallel]$