Multi-processor simulation using HPC

Hi, I am trying to simulate the gasification process using MFiX, running the simulation on a High-Performance Computing (HPC) cluster. I changed the number of nodes (NODESI, NODESJ, and NODESK) in the *.mfx file. But when I try to run the simulation on the HPC, it says the number of nodes is not consistent. How can I specify the number of processors on the command line? Also, I am using the solver that was built with the GUI version, because the build process failed when I tried to use the command line on the HPC.

Please attach your project files (Main menu → Submit bug report)

biomass_2023-04-25T103341.950284.zip (25.0 MB)
Here are the project files.

I’m confused by a few things here.

First of all, when I bring up the build popup, it looks like you have set the build type to Custom but have not specified any compiler flags. Why did you do that?

It also looks like you did not enable DMP, which is needed if you are going to run on a cluster.

Enable the DMP checkbox, and set the build type to "Release with debug info" (unless you need to do a debug build).

After doing this and rebuilding, you can specify the number of processors (NODESI/NODESJ/NODESK) in the run popup.

I was able to run on 8 cores (2x2x2) without any issue.
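
If you prefer to launch from the command line instead of the run popup, the equivalent invocation is shown below (a sketch, assuming the DMP solver sits in the project directory; NODESI*NODESJ*NODESK must equal the -np count):

mpirun -np 8 ./mfixsolver -f Biomass.mfx NODESI=2 NODESJ=2 NODESK=2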


Actually, I am a newbie to MFiX and am trying to learn through experience.
I tried the setup you suggested. It worked on my Linux OS. However, when I tried to run it on my university's HPC, it gave me an error after the solver build reached 99%.

On the HPC I did the following steps:

  1. Copied Biomass.mfx and its other files into my HPC home directory.
  2. module load python/anaconda/3.8.6
  3. source activate mfix-22.4.2
  4. module load gcc
  5. module load openmpi
  6. module load cmake

Solver build:
build_mfixsolver --dmp FC="mpifort"

# The build process reaches 99%, then the following error occurs and it says "BUILD FAILED":

/tools/anacondapython-3.8.6/envs/mfix-22.4.2/share/mfix/src/model/xpow.c:1:0: error: bad value (haswell) for -march= switch
/* xpow.c - replacement pow() function that detects integer and dyadic exponents
^
gmake[2]: *** [model/CMakeFiles/mfixcore.dir/xpow.c.o] Error 1
gmake[1]: *** [model/CMakeFiles/mfixcore.dir/all] Error 2
gmake: *** [all] Error 2

                 BUILD FAILED

==========================================================

I am not sure what the issue is here. Any help would be appreciated.

Your GCC is too old. Either get a newer GCC, or use the GCC/GFortran that comes with conda-forge:

$ conda activate mfix-22.4.2
$ conda install -c conda-forge gcc   # this will get you a 12.2.0 version
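
A quick sanity check after installing, to confirm the conda compiler is the one the build will pick up (the expected results are an assumption, not output from this system):

$ which gcc        # should print a path inside the conda environment
$ gcc --version    # should report 12.x, which accepts -march=haswell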

Alternatively, if you are not able to update the compiler:

$ conda activate mfix-22.4.2
$ cd $CONDA_PREFIX/share/mfix/src/model
$ nano CMakeLists.txt  # or use editor of your choice

find the section around line 591:

if (APPLE)
  set (MARCH "")
else()
  set (MARCH "-march=haswell")
endif()

change this to:

if (APPLE)
  set (MARCH "")
else()
  set (MARCH "")
endif()

Note that this will remove certain compiler optimizations and result in a slower solver. Upgrade the compiler if possible.
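
For what it's worth, the same edit can be applied non-interactively; this one-liner is just a sketch of the manual change described above:

$ sed -i 's/-march=haswell//' $CONDA_PREFIX/share/mfix/src/model/CMakeLists.txt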

Thank you so much. The HPC I am using belongs to the university, so I do not have the authority to install updates. But I tried the alternative approach of updating the CMakeLists.txt file, and the solver build was successful.

Now, when I try to run the solver using SMP:
env OMP_NUM_THREADS=8 ./mfixsolver -f Biomass.mfx

I got the following error saying "/usr/bin/python: No module named mfixgui".

Do I need the GUI to be installed to run the solver on the cluster?
Also, the HPC admin at my university was not able to install the GUI on the HPC.

This is the complete error message that I got.

(mfix-22.4.2) azb0224@easley01:~ > mpirun -np 16 ./mfixsolver -f Biomass.mfx NODESI=2 NODESJ=4 NODESK=2
/usr/bin/python: No module named mfixgui

Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.


mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[14684,1],0]
Exit code: 1

Then you haven’t activated the MFiX environment on the worker nodes.

Two options: modify the submit script so that MFiX is activated, OR
use the "batch solver".

To build the batch solver: in Main menu → Settings, enable "Developer mode".

Then in the build popup you will have an option to turn off interactive support. This will build a batch-mode solver which will run on the worker nodes without the MFiX environment. HOWEVER if you do this you will be unable to pause/resume or monitor the run.
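
For the first option, a minimal Slurm submit script might look like the sketch below (the module names, environment name, and mpirun line are taken from earlier in this thread; partition/account details are omitted):

#!/bin/bash
#SBATCH --ntasks=16
module load python/anaconda/3.8.6 gcc openmpi
source activate mfix-22.4.2   # activate the MFiX environment on the worker nodes
mpirun -np 16 ./mfixsolver -f Biomass.mfx NODESI=2 NODESJ=4 NODESK=2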

If you look at the first line of the error output above, the mfix environment is already activated (the prompt shows (mfix-22.4.2)).

(mfix-22.4.2) azb0224@easley01:~ > mpirun -np 1 ./mfixsolver -f Biomass.mfx NODESI=1 NODESJ=1 NODESK=1
/usr/bin/python: No module named mfixgui

Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.

I tried to run it in the terminal, without job submission. It shows the same error.

I do not understand this. If mfix is active, then there should be a Python interpreter as part of the environment. Please set up the mfix environment and then send me the output of env.

I am not sure if this is what you are asking. I obtained this output with printenv.
Please refer to the following output:

LDFLAGS=-L/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/lib -L/tools/hpclib/mpfr-4.1.0/lib -L/tools/hpclib/gmp-6.2.0/lib
MANPATH=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/share/man:/cm/shared/apps/slurm/current/share/man:/usr/share/lmod/lmod/share/man:/usr/local/man:/usr/local/share/man:/usr/share/man/overrides:/usr/share/man:/cm/local/apps/environment-modules/current/share/man
XDG_SESSION_ID=65474
ModuleTable003=cm9wVD17fSxbInN0YWNrRGVwdGgiXT0xLFsic3RhdHVzIl09ImFjdGl2ZSIsWyJ1c2VyTmFtZSJdPSJnbXAvNi4yLjAiLH0sbXBjPXtbImZuIl09Ii9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvY29yZS9tcGMvMS4yLjAiLFsiZnVsbE5hbWUiXT0ibXBjLzEuMi4wIixbImxvYWRPcmRlciJdPTUscHJvcFQ9e30sWyJzdGFja0RlcHRoIl09MSxbInN0YXR1cyJdPSJhY3RpdmUiLFsidXNlck5hbWUiXT0ibXBjLzEuMi4wIix9LG1wZnI9e1siZm4iXT0iL2NtL3NoYXJlZC9tb2R1bGVmaWxlcy9jb3JlL21wZnIvNC4xLjAiLFsiZnVsbE5hbWUiXT0ibXBmci80LjEuMCIsWyJsb2FkT3JkZXIiXT00LHByb3BUPXt9LFsic3RhY2tEZXB0aCJdPTEsWyJzdGF0dXMiXT0iYWN0aXZlIixbInVz
HOSTNAME=c20-login01
OMPI_MCA_btl_openib_allow_ib=1
PROJ_DATA=/tools/anacondapython-3.8.6/envs/mfix-22.4.2/share/proj
__LMOD_REF_COUNT_MODULEPATH=/cm/shared/modulefiles/system:1;/cm/shared/modulefiles/compilers:1;/cm/shared/modulefiles/core:1;/cm/shared/modulefiles/languages:1;/cm/shared/modulefiles/software:1;/cm/shared/modulefiles/libraries/gcc/9.3.0:1;/cm/shared/modulefiles/parallel/gcc/9.3.0:1;/cm/shared/modulefiles/software/.gcc/9.3.0:1;/cm/shared/modulefiles/compilers/.toolchain/gcc/9.3.0:1
TERM=xterm
SHELL=/bin/bash
LMOD_ROOT=/usr/share/lmod
HISTSIZE=1000
CPPFLAGS=-I/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include -I/tools/hpclib/mpc-1.2.0/include -I/tools/hpclib/mpfr-4.1.0/include -I/tools/hpclib/gmp-6.2.0/include
LMOD_SYSTEM_DEFAULT_MODULES=DefaultModules
MODULEPATH_ROOT=/cm/shared/modulefiles
SSH_CLIENT=10.99.16.58 55179 22
CONDA_SHLVL=1
LIBRARY_PATH=/tools/python-3.9.2/lib:/tools/hpclib/mpfr-4.1.0/lib:/tools/hpclib/gmp-6.2.0/lib:/cm/shared/apps/slurm/current/lib64/slurm:/cm/shared/apps/slurm/current/lib64
LMOD_PACKAGE_PATH=/cm/shared/apps/lmod
CONDA_PROMPT_MODIFIER=(mfix-22.4.2)
__LMOD_REF_COUNT_C_INCLUDE_PATH=/tools/hpclib/mpfr-4.1.0/include:1;/tools/hpclib/gmp-6.2.0/include:1
GSETTINGS_SCHEMA_DIR_CONDA_BACKUP=
LMOD_PKG=/usr/share/lmod/lmod
LMOD_VERSION=8.2.7
SSH_TTY=/dev/pts/29
__LMOD_REF_COUNT_LOADEDMODULES=DefaultModules:1;slurm/current:1;gmp/6.2.0:1;mpfr/4.1.0:1;mpc/1.2.0:1;gcc/9.3.0:1;openmpi/4.0.3:1;cmake/3.18.6:1;python/3.9.2:1
__LMOD_REF_COUNT_LDFLAGS=-L/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/lib:1;-L/tools/hpclib/mpfr-4.1.0/lib:1;-L/tools/hpclib/gmp-6.2.0/lib:1
QT_GRAPHICSSYSTEM_CHECKED=1
USER=azb0224
LD_LIBRARY_PATH=/tools/python-3.9.2/lib:/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/lib:/tools/hpclib/mpc-1.2.0/lib:/tools/hpclib/mpfr-4.1.0/lib:/tools/hpclib/gmp-6.2.0/lib:/tools/gcc-9.3.0/lib64:/tools/gcc-9.3.0/lib:/cm/shared/apps/slurm/current/lib64/slurm:/cm/shared/apps/slurm/current/lib64
LMOD_sys=Linux
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:.tar=01;31:.tgz=01;31:.arc=01;31:.arj=01;31:.taz=01;31:.lha=01;31:.lz4=01;31:.lzh=01;31:.lzma=01;31:.tlz=01;31:.txz=01;31:.tzo=01;31:.t7z=01;31:.zip=01;31:.z=01;31:.Z=01;31:.dz=01;31:.gz=01;31:.lrz=01;31:.lz=01;31:.lzo=01;31:.xz=01;31:.bz2=01;31:.bz=01;31:.tbz=01;31:.tbz2=01;31:.tz=01;31:.deb=01;31:.rpm=01;31:.jar=01;31:.war=01;31:.ear=01;31:.sar=01;31:.rar=01;31:.alz=01;31:.ace=01;31:.zoo=01;31:.cpio=01;31:.7z=01;31:.rz=01;31:.cab=01;31:.jpg=01;35:.jpeg=01;35:.gif=01;35:.bmp=01;35:.pbm=01;35:.pgm=01;35:.ppm=01;35:.tga=01;35:.xbm=01;35:.xpm=01;35:.tif=01;35:.tiff=01;35:.png=01;35:.svg=01;35:.svgz=01;35:.mng=01;35:.pcx=01;35:.mov=01;35:.mpg=01;35:.mpeg=01;35:.m2v=01;35:.mkv=01;35:.webm=01;35:.ogm=01;35:.mp4=01;35:.m4v=01;35:.mp4v=01;35:.vob=01;35:.qt=01;35:.nuv=01;35:.wmv=01;35:.asf=01;35:.rm=01;35:.rmvb=01;35:.flc=01;35:.avi=01;35:.fli=01;35:.flv=01;35:.gl=01;35:.dl=01;35:.xcf=01;35:.xwd=01;35:.yuv=01;35:.cgm=01;35:.emf=01;35:.axv=01;35:.anx=01;35:.ogv=01;35:.ogx=01;35:.aac=01;36:.au=01;36:.flac=01;36:.mid=01;36:.midi=01;36:.mka=01;36:.mp3=01;36:.mpc=01;36:.ogg=01;36:.ra=01;36:.wav=01;36:.axa=01;36:.oga=01;36:.spx=01;36:*.xspf=01;36:
CONDA_EXE=/tools/anacondapython-3.8.6/bin/conda
ModuleTable004=ZXJOYW1lIl09Im1wZnIvNC4xLjAiLH0sb3Blbm1waT17WyJmbiJdPSIvY20vc2hhcmVkL21vZHVsZWZpbGVzL3BhcmFsbGVsL2djYy85LjMuMC9vcGVubXBpLzQuMC4zLmx1YSIsWyJmdWxsTmFtZSJdPSJvcGVubXBpLzQuMC4zIixbImxvYWRPcmRlciJdPTcscHJvcFQ9e30sWyJzdGFja0RlcHRoIl09MCxbInN0YXR1cyJdPSJhY3RpdmUiLFsidXNlck5hbWUiXT0ib3Blbm1waSIsfSxweXRob249e1siZm4iXT0iL2NtL3NoYXJlZC9tb2R1bGVmaWxlcy9sYW5ndWFnZXMvcHl0aG9uLzMuOS4yIixbImZ1bGxOYW1lIl09InB5dGhvbi8zLjkuMiIsWyJsb2FkT3JkZXIiXT05LHByb3BUPXt9LFsic3RhY2tEZXB0aCJdPTAsWyJzdGF0dXMiXT0iYWN0aXZlIixbInVzZXJOYW1lIl09
CPATH=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include:/cm/shared/apps/slurm/current/include
__LMOD_REF_COUNT__LMFILES_=/cm/shared/modulefiles/system/DefaultModules.lua:1;/cm/shared/modulefiles/system/slurm/current:1;/cm/shared/modulefiles/core/gmp/6.2.0:1;/cm/shared/modulefiles/core/mpfr/4.1.0:1;/cm/shared/modulefiles/core/mpc/1.2.0:1;/cm/shared/modulefiles/compilers/gcc/9.3.0.lua:1;/cm/shared/modulefiles/parallel/gcc/9.3.0/openmpi/4.0.3.lua:1;/cm/shared/modulefiles/compilers/.toolchain/gcc/9.3.0/cmake/3.18.6.lua:1;/cm/shared/modulefiles/languages/python/3.9.2:1
CXXFLAGS=-I/tools/hpclib/mpfr-4.1.0/include -I/tools/hpclib/gmp-6.2.0/include
_CE_CONDA=
LMOD_PREPEND_BLOCK=normal
ModuleTable001=X01vZHVsZVRhYmxlXz17WyJNVHZlcnNpb24iXT0zLFsiY19yZWJ1aWxkVGltZSJdPTg2NDAwLFsiY19zaG9ydFRpbWUiXT1mYWxzZSxkZXB0aFQ9e30sZmFtaWx5PXt9LG1UPXtEZWZhdWx0TW9kdWxlcz17WyJmbiJdPSIvY20vc2hhcmVkL21vZHVsZWZpbGVzL3N5c3RlbS9EZWZhdWx0TW9kdWxlcy5sdWEiLFsiZnVsbE5hbWUiXT0iRGVmYXVsdE1vZHVsZXMiLFsibG9hZE9yZGVyIl09MSxwcm9wVD17fSxbInN0YWNrRGVwdGgiXT0wLFsic3RhdHVzIl09ImFjdGl2ZSIsWyJ1c2VyTmFtZSJdPSJEZWZhdWx0TW9kdWxlcyIsfSxjbWFrZT17WyJmbiJdPSIvY20vc2hhcmVkL21vZHVsZWZpbGVzL2NvbXBpbGVycy8udG9vbGNoYWluL2djYy85LjMuMC9jbWFrZS8zLjE4LjYubHVh
MAIL=/var/spool/mail/azb0224
PATH=/tools/python-3.9.2/bin:/tools/cmake-3.18.6/gcc/9.3.0/bin:/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/bin:/tools/gcc-9.3.0/bin:/usr/bin:/tools/anacondapython-3.8.6/envs/mfix-22.4.2/bin:/tools/anacondapython-3.8.6/condabin:/cm/shared/apps/slurm/current/sbin:/cm/shared/apps/slurm/current/bin:/usr/local/bin:/usr/local/sbin:/usr/sbin:/opt/dell/srvadmin/bin:/home/azb0224/.local/bin:/home/azb0224/bin
PROJ_NETWORK=ON
GSETTINGS_SCHEMA_DIR=/tools/anacondapython-3.8.6/envs/mfix-22.4.2/share/glib-2.0/schemas
C_INCLUDE_PATH=/tools/hpclib/mpfr-4.1.0/include:/tools/hpclib/gmp-6.2.0/include
LD_RUN_PATH=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/lib
CONDA_PREFIX=/tools/anacondapython-3.8.6/envs/mfix-22.4.2
LMOD_SETTARG_CMD=:
__LMOD_REF_COUNT_LD_RUN_PATH=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/lib:1
PWD=/home/azb0224
_LMFILES_=/cm/shared/modulefiles/system/DefaultModules.lua:/cm/shared/modulefiles/system/slurm/current:/cm/shared/modulefiles/core/gmp/6.2.0:/cm/shared/modulefiles/core/mpfr/4.1.0:/cm/shared/modulefiles/core/mpc/1.2.0:/cm/shared/modulefiles/compilers/gcc/9.3.0.lua:/cm/shared/modulefiles/parallel/gcc/9.3.0/openmpi/4.0.3.lua:/cm/shared/modulefiles/compilers/.toolchain/gcc/9.3.0/cmake/3.18.6.lua:/cm/shared/modulefiles/languages/python/3.9.2
__LMOD_REF_COUNT_PYTHONPATH=/tools/python-3.9.2:1
LANG=en_US.UTF-8
MODULEPATH=/cm/shared/modulefiles/system:/cm/shared/modulefiles/compilers:/cm/shared/modulefiles/core:/cm/shared/modulefiles/languages:/cm/shared/modulefiles/software:/cm/shared/modulefiles/libraries/gcc/9.3.0:/cm/shared/modulefiles/parallel/gcc/9.3.0:/cm/shared/modulefiles/software/.gcc/9.3.0:/cm/shared/modulefiles/compilers/.toolchain/gcc/9.3.0
ModuleTable_Sz=6
LOADEDMODULES=DefaultModules:slurm/current:gmp/6.2.0:mpfr/4.1.0:mpc/1.2.0:gcc/9.3.0:openmpi/4.0.3:cmake/3.18.6:python/3.9.2
ENABLE_LMOD=1
ModuleTable005=InB5dGhvbiIsfSxzbHVybT17WyJmbiJdPSIvY20vc2hhcmVkL21vZHVsZWZpbGVzL3N5c3RlbS9zbHVybS9jdXJyZW50IixbImZ1bGxOYW1lIl09InNsdXJtL2N1cnJlbnQiLFsibG9hZE9yZGVyIl09Mixwcm9wVD17fSxbInN0YWNrRGVwdGgiXT0wLFsic3RhdHVzIl09ImFjdGl2ZSIsWyJ1c2VyTmFtZSJdPSJzbHVybSIsfSx9LG1wYXRoQT17Ii9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvc3lzdGVtIiwiL2NtL3NoYXJlZC9tb2R1bGVmaWxlcy9jb21waWxlcnMiLCIvY20vc2hhcmVkL21vZHVsZWZpbGVzL2NvcmUiLCIvY20vc2hhcmVkL21vZHVsZWZpbGVzL2xhbmd1YWdlcyIsIi9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvc29mdHdhcmUiLCIvY20vc2hhcmVkL21vZHVsZWZpbGVz
LMOD_CMD=/usr/share/lmod/lmod/libexec/lmod
LMOD_AVAIL_STYLE=en_grouped
__LMOD_REF_COUNT_CXXFLAGS=-I/tools/hpclib/mpfr-4.1.0/include:1;-I/tools/hpclib/gmp-6.2.0/include:1
__LMOD_REF_COUNT_CFLAGS=-I/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include:1;-I/tools/hpclib/mpc-1.2.0/include:1;-I/tools/hpclib/mpfr-4.1.0/include:1;-I/tools/hpclib/gmp-6.2.0/include:1
_CE_M=
CXX=mpic++
HISTCONTROL=ignoredups
SHLVL=1
HOME=/home/azb0224
__LMOD_REF_COUNT_PATH=/tools/python-3.9.2/bin:1;/tools/cmake-3.18.6/gcc/9.3.0/bin:1;/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/bin:1;/tools/gcc-9.3.0/bin:1;/usr/bin:1;/tools/anacondapython-3.8.6/envs/mfix-22.4.2/bin:1;/tools/anacondapython-3.8.6/condabin:1;/cm/shared/apps/slurm/current/sbin:1;/cm/shared/apps/slurm/current/bin:1;/usr/local/bin:1;/usr/local/sbin:1;/usr/sbin:1;/opt/dell/srvadmin/bin:1;/home/azb0224/.local/bin:1;/home/azb0224/bin:1
__LMOD_REF_COUNT_CPATH=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include:1;/cm/shared/apps/slurm/current/include:1
CFLAGS=-I/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include -I/tools/hpclib/mpc-1.2.0/include -I/tools/hpclib/mpfr-4.1.0/include -I/tools/hpclib/gmp-6.2.0/include
ModuleTable002=IixbImZ1bGxOYW1lIl09ImNtYWtlLzMuMTguNiIsWyJsb2FkT3JkZXIiXT04LHByb3BUPXt9LFsic3RhY2tEZXB0aCJdPTAsWyJzdGF0dXMiXT0iYWN0aXZlIixbInVzZXJOYW1lIl09ImNtYWtlIix9LGdjYz17WyJmbiJdPSIvY20vc2hhcmVkL21vZHVsZWZpbGVzL2NvbXBpbGVycy9nY2MvOS4zLjAubHVhIixbImZ1bGxOYW1lIl09ImdjYy85LjMuMCIsWyJsb2FkT3JkZXIiXT02LHByb3BUPXt9LFsic3RhY2tEZXB0aCJdPTAsWyJzdGF0dXMiXT0iYWN0aXZlIixbInVzZXJOYW1lIl09ImdjYy85LjMuMCIsfSxnbXA9e1siZm4iXT0iL2NtL3NoYXJlZC9tb2R1bGVmaWxlcy9jb3JlL2dtcC82LjIuMCIsWyJmdWxsTmFtZSJdPSJnbXAvNi4yLjAiLFsibG9hZE9yZGVyIl09Myxw
__LMOD_REF_COUNT_INCLUDE=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include:1;/tools/hpclib/mpc-1.2.0/include:1;/tools/hpclib/mpfr-4.1.0/include:1;/tools/hpclib/gmp-6.2.0/include:1
FC=mpif90
BASH_ENV=/usr/share/lmod/lmod/init/bash
CONDA_PYTHON_EXE=/tools/anacondapython-3.8.6/bin/python
PYTHONPATH=/tools/python-3.9.2
LMOD_arch=x86_64
LOGNAME=azb0224
SSH_CONNECTION=10.99.16.58 55179 10.143.14.10 22
MPI_HOME=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current
__LMOD_REF_COUNT_LIBRARY_PATH=/tools/python-3.9.2/lib:1;/tools/hpclib/mpfr-4.1.0/lib:1;/tools/hpclib/gmp-6.2.0/lib:1;/cm/shared/apps/slurm/current/lib64/slurm:1;/cm/shared/apps/slurm/current/lib64:1
MODULESHOME=/usr/share/lmod/lmod
PKG_CONFIG_PATH=/tools/hpclib/mpfr-4.1.0/lib/pkgconfig:/tools/hpclib/gmp-6.2.0/lib/pkgconfig
CONDA_DEFAULT_ENV=mfix-22.4.2
__LMOD_REF_COUNT_LD_LIBRARY_PATH=/tools/python-3.9.2/lib:1;/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/lib:1;/tools/hpclib/mpc-1.2.0/lib:1;/tools/hpclib/mpfr-4.1.0/lib:1;/tools/hpclib/gmp-6.2.0/lib:1;/tools/gcc-9.3.0/lib64:1;/tools/gcc-9.3.0/lib:1;/cm/shared/apps/slurm/current/lib64/slurm:1;/cm/shared/apps/slurm/current/lib64:1
LMOD_SETTARG_FULL_SUPPORT=no
LESSOPEN=||/usr/bin/lesspipe.sh %s
__Init_Default_Modules=1
LMOD_FULL_SETTARG_SUPPORT=no
__LMOD_REF_COUNT_PKG_CONFIG_PATH=/tools/hpclib/mpfr-4.1.0/lib/pkgconfig:1;/tools/hpclib/gmp-6.2.0/lib/pkgconfig:1
MPI_RUN=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/bin/mpirun
CC=mpicc
XDG_RUNTIME_DIR=/run/user/535193
__LMOD_REF_COUNT_CPPFLAGS=-I/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include:1;-I/tools/hpclib/mpc-1.2.0/include:1;-I/tools/hpclib/mpfr-4.1.0/include:1;-I/tools/hpclib/gmp-6.2.0/include:1
ModuleTable006=L2xpYnJhcmllcy9nY2MvOS4zLjAiLCIvY20vc2hhcmVkL21vZHVsZWZpbGVzL3BhcmFsbGVsL2djYy85LjMuMCIsIi9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvc29mdHdhcmUvLmdjYy85LjMuMCIsIi9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvY29tcGlsZXJzLy50b29sY2hhaW4vZ2NjLzkuMy4wIix9LFsic3lzdGVtQmFzZU1QQVRIIl09Ii9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvc3lzdGVtOi9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvY29tcGlsZXJzOi9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvY29yZTovY20vc2hhcmVkL21vZHVsZWZpbGVzL2xhbmd1YWdlczovY20vc2hhcmVkL21vZHVsZWZpbGVzL3NvZnR3YXJlIix9
INCLUDE=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/include:/tools/hpclib/mpc-1.2.0/include:/tools/hpclib/mpfr-4.1.0/include:/tools/hpclib/gmp-6.2.0/include
__LMOD_REF_COUNT_MANPATH=/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/share/man:1;/cm/shared/apps/slurm/current/share/man:1;/usr/share/lmod/lmod/share/man:1;/usr/local/man:1;/usr/local/share/man:1;/usr/share/man/overrides:1;/usr/share/man:1;/cm/local/apps/environment-modules/current/share/man:1
LMOD_DIR=/usr/share/lmod/lmod/libexec
LMOD_COLORIZE=yes
BASH_FUNC_module()=() { eval $($LMOD_CMD bash "$@") && eval $(${LMOD_SETTARG_CMD:-:} -s sh)
}
BASH_FUNC_ml()=() { eval $($LMOD_DIR/ml_cmd "$@")
}
_=/usr/bin/printenv

Thanks, that is helpful.

PATH=/tools/python-3.9.2/bin:/tools/cmake-3.18.6/gcc/9.3.0/bin:/tools/openmpi-4.0.3/gcc/9.3.0/slurm/current/bin:/tools/gcc-9.3.0/bin:/usr/bin:/tools/anacondapython-3.8.6/envs/mfix-22.4.2/bin:/tools/anacondapython-3.8.6/condabin:/cm/shared/apps/slurm/current/sbin:/cm/shared/apps/slurm/current/bin:/usr/local/bin:/usr/local/sbin:/usr/sbin:/opt/dell/srvadmin/bin:/home/azb0224/.local/bin:/home/azb0224/bin

Whatever is putting /tools/python-3.9.2/bin at the front of your PATH is the problem. You need to be using the Python environment that Anaconda installed (along with all of its libraries). This is installed in $CONDA_PREFIX which in your case is:

 CONDA_PREFIX=/tools/anacondapython-3.8.6/envs/mfix-22.4.2

The command conda activate mfix-22.4.2 will insert $CONDA_PREFIX/bin at the head of $PATH. I can see that it is in $PATH, but not at the beginning: after the Conda environment was set up, some other local script put /tools/python.. at the head of $PATH, and that is keeping the correct Python environment from being found. I don't know where that is coming from, but if you change the order of initializations so that the conda activate is done LAST, you might have better results.
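
For example, in your login script or submit script, an ordering like this sketch (module names taken from your steps above) should leave $CONDA_PREFIX/bin at the head of $PATH:

module load python/anaconda/3.8.6 gcc openmpi cmake
source activate mfix-22.4.2   # do this LAST so its bin directory stays first in PATH
which python                  # should now print $CONDA_PREFIX/bin/python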

Hello sir,

I am trying to set up the solver following your procedure. However, I cannot find the DMP checkbox on my side. I am using the latest version of MFiX. Could you give me some more advice on how to set up the solver to allow parallel computation?

DMP is only supported on Linux. Are you using Windows?

Hello sir,

Yes, I am indeed using Windows. Is there any way to do a similar setup on Windows?

SMP is supported on Windows but not DMP. You can try SMP on Windows, but if you really need DMP, then you need to run on Linux.

May I ask how to use SMP on Windows? Sorry, I could not figure it out on my own. Could you provide some more instructions? Thanks.

Check the "SMP" box in the build popup.
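
After rebuilding with SMP enabled, the thread count is controlled with OMP_NUM_THREADS, as shown earlier in this thread for Linux; on the Windows command prompt that would be something like this sketch (run from the project directory):

set OMP_NUM_THREADS=8
mfixsolver -f Biomass.mfx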