Regression Testing

The MFIX-Exa regression test suite is built on top of the AMReX regression testing framework, which is developed and maintained by the AMReX team. The framework provides tools for building applications, running benchmark problems, and comparing results against reference datasets. The AMReX regression testing system is open source and available at https://github.com/AMReX-Codes/regression_testing.

This page describes how to configure and run the regression tests, how to customize the test configuration files, and provides a summary of the available benchmark problems.

Overview

Regression tests are used to ensure that changes to MFIX-Exa do not introduce unintended differences in numerical results. These tests:

  • build MFIX-Exa and AMReX in a controlled configuration,

  • run a suite of benchmark problems,

  • compare the results against reference data,

  • report any differences.

The regression suite is not part of the CI pipeline. CI runs the CTest-based MFIX-Exa Test Suite, while regression tests must be run manually by developers.

Regression Test Configuration Files

MFIX-Exa provides two separate regression test configuration files, located in the tests/ directory:

tests/MFIX-tests-CPU.ini
tests/MFIX-tests-GPU.ini

Use the CPU configuration file for CPU-only regression testing and the GPU configuration file for CUDA-enabled regression testing. At this time, MFIX-Exa does not provide regression test configurations for the HIP (AMD) or SYCL (Intel) backends.

For GPU (CUDA) regression tests, the benchmark datasets are generated using a CPU-only build by setting -DAMReX_GPU_BACKEND=NONE and -DMFIX_GPU_BACKEND=NONE. This ensures that CPU and GPU runs are compared against a common reference dataset. Because GPU runs may exhibit small numerical differences relative to CPU-generated benchmarks, the regression tolerances for GPU tests are typically less strict.
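
One way this is expressed is through the cmakeSetupOpts entries described under Advanced Edits below; the following is a minimal sketch of the benchmark-generation settings, assuming the options are passed verbatim to CMake:

[AMReX]
cmakeSetupOpts = -DAMReX_GPU_BACKEND=NONE

[source]
cmakeSetupOpts = -DMFIX_GPU_BACKEND=NONE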

Structure of the Configuration Files

Each .ini file contains several sections:

  • [main] – global settings for the regression framework

  • [AMReX] – AMReX source location and CMake options

  • [source] – MFIX-Exa source location and CMake options

  • [BENCHXX-*] – individual benchmark problem definitions

Only a small number of entries typically need to be modified by users.
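
A skeletal file illustrating this layout is sketched below; all values are placeholders, and the branch entries and per-test keys are illustrative assumptions rather than the shipped contents:

[main]
testTopDir = /path/to/regression/output
webTopDir  = /path/to/regression/web

[AMReX]
dir = /path/to/amrex
branch = development

[source]
dir = /path/to/mfix-exa
branch = develop

[BENCH01-Size0001]
# per-test keys (input file, process counts, ...) go here
numprocs = 4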

User-Editable Fields

Required Edits

Users must update the following fields in each ini file (MFIX-tests-CPU.ini and MFIX-tests-GPU.ini):

  • testTopDir – directory where regression outputs will be written.

  • webTopDir – directory for HTML reports (optional).

  • [AMReX]/dir – path to the AMReX source tree, typically the version located in the subprojects/amrex directory.

  • [source]/dir – path to the MFIX-Exa source tree.

Example:

[main]
testTopDir = /path/to/regression/output
webTopDir  = /path/to/regression/web

[AMReX]
dir = /path/to/mfix-exa/subprojects/amrex

[source]
dir = /path/to/mfix-exa

When maintaining both CPU and GPU regression suites on the same system, the testTopDir and webTopDir entries in each ini file should point to different directories. This ensures that the benchmark datasets and test results for CPU and GPU runs remain separate. Mixing these directories can lead to incorrect comparisons, overwritten benchmark files, or misleading regression reports.
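
For example, with hypothetical paths, MFIX-tests-CPU.ini might contain:

testTopDir = /scratch/regtests/mfix-cpu
webTopDir  = /scratch/regtests/mfix-cpu-web

while MFIX-tests-GPU.ini points elsewhere:

testTopDir = /scratch/regtests/mfix-gpu
webTopDir  = /scratch/regtests/mfix-gpu-web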

Optional Edits

These entries may be modified depending on the system (an example follows the list):

  • MPIcommand – command used to launch MPI jobs

  • MPIhost – hostname for MPI execution (optional)

  • numMakeJobs – number of parallel build jobs

  • branch – AMReX or MFIX-Exa branch to check out
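
On a typical workstation the MPI-related entries might look like the following sketch; the @nprocs@ and @command@ tokens are the substitution placeholders used by the AMReX framework, and the specific values shown are assumptions:

[main]
MPIcommand = mpiexec -n @nprocs@ @command@
numMakeJobs = 8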

Advanced Edits

These entries are rarely modified (a sketch follows the list):

  • cmakeSetupOpts in [AMReX] or [source]

  • runtime_params inside individual benchmark sections

  • numprocs or numthreads for scaling studies
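
As an illustration, a benchmark section might pin its own process and thread counts and append extra runtime parameters; the section name and the runtime_params string below are hypothetical:

[BENCH05-Size0001]
numprocs = 16
numthreads = 4
runtime_params = mfix.max_step=20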

Running the Regression Tests

The MFIX-Exa regression suite is driven by the AMReX regression testing framework. Before creating or running any regression tests, ensure that the directories specified in the ini configuration file exist:

  • testTopDir – directory where benchmark and test output will be written

  • webTopDir – directory where HTML reports will be generated

Both directories must be created manually before running the regression framework.
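
For example, matching the paths used in the configuration example above:

mkdir -p /path/to/regression/output /path/to/regression/web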

In the examples below, regression_testing refers to a clone of the AMReX regression testing repository, available at https://github.com/AMReX-Codes/regression_testing.
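
If you do not already have a local clone, one can be obtained with:

git clone https://github.com/AMReX-Codes/regression_testing.git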

Creating Benchmark Datasets

Benchmark datasets serve as the reference results against which future runs are compared. They must be created before running the regression suite for the first time, or whenever intentional changes to MFIX-Exa require updated reference results.

Create benchmark datasets for all tests listed in the ini file:

./regression_testing/regtest.py \
    --make_benchmarks "creating initial benchmark datasets" \
    MFIX-tests.ini
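
In this and the following examples, MFIX-tests.ini stands for whichever configuration file applies (MFIX-tests-CPU.ini or MFIX-tests-GPU.ini).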

Create benchmark datasets for a subset of tests:

./regression_testing/regtest.py \
    --make_benchmarks "creating initial benchmark datasets" \
    --tests 'LIST OF BENCHMARK CASES' \
    MFIX-tests.ini
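
The names passed to --tests must match the benchmark section names in the ini file, for example --tests 'BENCH01-Size0001 BENCH02-Size0001'.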

Running Tests

To run the regression suite without updating AMReX or MFIX-Exa, use the --no_update all flag. This prevents the framework from checking out the branches listed in the ini file and is recommended when testing local modifications.

Run specific tests with a descriptive note:

./regression_testing/regtest.py \
    --no_update all \
    --note "test after making change XYZ" \
    --tests 'BENCH01-Size0001 BENCH02-Size0001' \
    MFIX-tests.ini

Updating Benchmark Results

If a code modification is known to change numerical results, the benchmark datasets must be updated. There are two ways to do this:

  1. Recreate benchmarks using --make_benchmarks (recommended when changes are intentional and validated)

  2. Copy benchmark results from a failed test run (useful when the regression suite has already produced new output)

Update benchmarks using the copy_benchmarks option:

./regression_testing/regtest.py \
    --no_update all \
    --copy_benchmarks "Updating after submodules" \
    MFIX-tests.ini

This replaces the existing benchmark plotfiles with those generated during the most recent test run.

Interpreting Results

After the tests complete, the framework generates:

  • a summary of passed/failed tests,

  • detailed logs for each benchmark,

  • optional HTML reports (if webTopDir is set),

  • particle comparison results (if compareParticles = 1; see the sketch below).

A test is considered failed if:

  • MFIX-Exa output differs from the benchmark data,

  • the simulation crashes or produces invalid output,

  • particle trajectories differ beyond tolerance.
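
Particle comparison is controlled per benchmark through the compareParticles flag mentioned above; a minimal sketch, with a hypothetical section name:

[BENCH08-Size0001]
compareParticles = 1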

Benchmark Summary

The MFIX-Exa regression suite includes a broad set of benchmark problems covering fluid flow, particle dynamics, and coupled multiphase physics. A high-level summary is provided below.

Benchmark      Category                    Description
BENCH01        HCS                         Homogeneous Cooling System (HCS) with multiple advection and redistribution options; includes restart and replicate tests.
BENCH02        Settling                    Single-particle settling in fluid; includes no-fluid and wall-bounded variants.
BENCH03        Fluidized bed               Small fluidized bed with gas–solid interaction.
BENCH04        Square riser                Gas–solid riser flow in a square duct.
BENCH05        Cylindrical fluidized bed   Multiple grid sizes and physics options; includes chemistry and covered-grid tests.
BENCH06        Cylindrical riser           Gas–solid riser with multiple MPI configurations.
BENCH08        PIC                         Particle-in-cell tests including restart variants.
BENCH09        Refinement                  AMR refinement test for particle and fluid fields.
DEM01–DEM06    DEM                         Particle-only verification tests (free fall, bouncing, compression, rolling, oblique collisions, terminal velocity).
TracerAdv      Tracer transport            Tracer advection with optional Godunov/MOL advection schemes.

Relationship to CI Tests

The regression suite is separate from the MFIX-Exa CI pipeline:

  • CI runs the CTest-based MFIX-Exa Test Suite.

  • Regression tests use the AMReX regression framework.

  • CI tests focus on build correctness and basic functionality.

  • Regression tests focus on numerical correctness and reproducibility.

Both systems complement each other and should be used regularly during development.

Developer Expectations

  • CPU and GPU regression tests should be run when modifying multiphase physics, AMReX interfaces, or GPU kernels.

  • Any regression failure must be investigated and resolved before merging.

  • Benchmark data should only be updated when numerical changes are intentional and documented.