.. _running_mfix_on_joule:


=====================
Running MFiX on Joule
=====================

This section is for MFiX users on NETL's HPC system Joule. MFiX is installed as
an environment module, so installing it yourself is not necessary. The following
sections document which compiler modules to use on Joule to run MFiX and build
the solver.

To build the solver from the GUI with a particular compiler on Joule, you need
the environment module for that compiler loaded before starting MFiX. If that
compiler module is not loaded, exit MFiX to return to your shell, load the
appropriate module, and start MFiX again.

To avoid confusion, use ``module list`` and ``module unload`` to ensure that
only one version (e.g. 8.2, 6.5, 6.4) of a given compiler (e.g. ``gnu``,
``intel``) is loaded at any time.

.. _launch_gui_on_joule:

---------------
Running the GUI
---------------

1)  Load the MFiX |version| module:

    .. code-block:: bash
       :substitutions:

       > module load mfix/|version|

2)  Run the GUI:

    .. code-block:: bash

        > vglrun mfix

3)  To build from the GUI, see: :ref:`build-custom`.

4)  To build from the command line, run:

    .. code-block:: bash

        > build_mfixsolver

With just the |mfix_module_inline| environment module loaded, you can build the
MFiX solver in serial with GCC 4.8 (the CentOS system compiler). To build the
solver with other configurations, see below.


------------------------
Building Solver with GCC
------------------------

At the time of this writing, Joule has modules for GCC 6.4, 6.5, 8.2, 8.4, and 9.3. GCC 8.2 or newer is recommended.

    .. code-block:: bash

        > module load gnu/8.2.0                          # if only using serial
        > module load gnu/8.2.0 openmpi/4.0.1_gnu8.2     # if using DMP

        > module load gnu/6.5.0                          # if only using serial
        > module load gnu/6.5.0 openmpi/3.1.3_gnu6.5     # if using DMP


When building from the command line:

    .. code-block:: bash

        > build_mfixsolver -s none           # for serial, no interactive support
        > build_mfixsolver -s none --dmp     # for DMP, no interactive support
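Once built, a DMP solver is typically launched with ``mpirun`` from the project
directory. The project file name and rank count below are examples only:

    .. code-block:: bash

        > mpirun -np 4 ./mfixsolver -f my_project.mfx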

When building from the GUI:

  - For DMP support, check the DMP checkbox (on the Build dialog)
  - Default interactive support is python (recommended)


-------------------------------------------
Building Solver with Intel Fortran Compiler
-------------------------------------------

Load module:

    .. code-block:: bash

        > module load intel/2019.2.053                      # if only using serial
        > module load intel/2019.2.053 intelmpi/2019.2.144  # if using DMP


When building from the command line:

    .. code-block:: bash

        > build_mfixsolver -s none        # for serial
        > build_mfixsolver -s none --dmp  # for DMP

Building an interactive solver with the Intel compiler is not supported.

When building from the GUI:

  - Check "Enable developer mode" under Settings, then select Interactive support: "None" (on the Build dialog)
  - For DMP support, check the DMP checkbox (on the Build dialog)


-----------------------------------
Building Solver from Source Tarball
-----------------------------------

To build the solver from the MFiX source tarball or from a Git repo, you do not
need to load the |mfix_module_inline| module. Instead, you only need to load the CMake
module:

    .. code-block:: bash

        > module load cmake

Also load the GCC or Intel module for the compiler you want to use. For
complete instructions on building the solver from source, see :ref:`developers`.
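As a minimal sketch, an out-of-source CMake build might look like the following
(the source directory name, MPI option, and job count are assumptions; the
developers guide is the authoritative reference):

    .. code-block:: bash

        > module load cmake gnu/8.2.0 openmpi/4.0.1_gnu8.2   # compiler and MPI modules, as above
        > cd mfix-source-directory
        > mkdir build && cd build
        > cmake .. -DENABLE_MPI=1 -DMPI_Fortran_COMPILER=mpifort
        > make -j4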