Converting particle velocity to field data for paraview visualization

Particle Visualization | Post-processing | Paraview

So I am running an MP-PIC simulation for the NETL CFB Riser problem.
When the simulation is done, there is particle velocity data (Lagrangian) in the form of .VTP files and Eulerian data (i.e., gas velocity, pressure) in the .VTU files.

In ParaView I can open the .RES file to visualize the gas velocity data, and I can export it to Excel to plot it, for example, gas velocity vs. the height of the pipe.

For the particle data, I open the .PVD file to visualize it in ParaView. However, since this is particle data, it is rendered in ParaView as spherical glyphs.

How do you export or extract the particle velocity value at each [X,Y,Z] location?
Basically, how do I convert that particle velocity data into a field that can be visualized and post-processed?

Here is my .MFX file and .STL files

geometry_0001.stl (5.6 MB)
own_STL_file.mfx (18.3 KB)

You have particle data available in the .vtp files (x, y, z, diameter and velocity). You can visualize it in ParaView by attaching a vector glyph to each particle. Note that this is Lagrangian data, i.e. you have the velocity at the (x, y, z) location, but that location changes in time since it follows the particles.
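As a rough illustration of what such a .vtp file holds, here is a minimal Python sketch (standard library only) that parses a tiny hand-written ASCII VTP fragment. The array names `Velocity` and `Diameter` are assumptions — check the actual names in your files — and real MFiX output is typically binary/appended, in which case reading it through ParaView's Python shell or a library such as meshio is easier:

```python
import xml.etree.ElementTree as ET

# Tiny hand-written ASCII .vtp fragment with 2 particles.
# Array names ("Velocity", "Diameter") are assumptions; check your files.
VTP = """<VTKFile type="PolyData" version="0.1">
  <PolyData>
    <Piece NumberOfPoints="2" NumberOfVerts="2">
      <Points>
        <DataArray type="Float32" NumberOfComponents="3" format="ascii">
          0.0 0.0 0.1   0.0 0.0 0.2
        </DataArray>
      </Points>
      <PointData>
        <DataArray type="Float32" Name="Velocity" NumberOfComponents="3" format="ascii">
          0.1 0.0 1.5   -0.1 0.0 2.0
        </DataArray>
        <DataArray type="Float32" Name="Diameter" format="ascii">
          0.0005 0.0005
        </DataArray>
      </PointData>
    </Piece>
  </PolyData>
</VTKFile>"""

def read_array(da, ncomp):
    """Split a whitespace-separated DataArray body into tuples of ncomp floats."""
    vals = [float(v) for v in da.text.split()]
    return [tuple(vals[i:i + ncomp]) for i in range(0, len(vals), ncomp)]

root = ET.fromstring(VTP)
piece = root.find("./PolyData/Piece")
points = read_array(piece.find("./Points/DataArray"), 3)
vel = read_array(piece.find("./PointData/DataArray[@Name='Velocity']"), 3)

for (x, y, z), (u, v, w) in zip(points, vel):
    print(f"particle at ({x}, {y}, {z}) has velocity ({u}, {v}, {w})")
```

This only demonstrates the VTK XML PolyData layout: particle positions live under `<Points>` and per-particle arrays under `<PointData>`.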

Yes, that is correct, it is Lagrangian data.
However, can we somehow map the particle velocity to the Eulerian grid and export it to excel at each time-step?

Basically, I want to know the particle velocity in the radial direction [across the diameter], say at the middle of the riser, at each time-step.

Can I export the above to CSV or Excel? If so, how would I do that?

I don’t know how to do that in ParaView. MFiX computes an averaged solids velocity in each computational cell. This is not exposed for export, but you can do the following to save the data in the .vtu files:

  1. Say your vtk region for cell data is region number 1. Add
vtk_vel_s(1,1) = .True.

in the .mfx file.

  2. Copy the file model/cartesian_grid/vtk_out.f into the project directory. Locate the line:
CALL WRITE_VECTOR_IN_VTU_BIN('Solids_Velocity_'//ADJUSTL(SUBM),U_S(:,M),V_S(:,M),W_S(:,M),PASS)

Go up a few lines and locate the loop:

            DO M = 1,MMAX

This should be line 193 with MFiX 22.1. Change this line to

            DO M = 1,MMAX + 1
  3. Save the file and build the custom solver. Run your simulation with this custom solver. You should now see the solids velocity in the vtu files (cell data). You can now extract a profile and export it from ParaView.
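Once the solids velocity shows up as cell data, one way to get the radial profile outside ParaView is to export the point/cell data to CSV (File > Save Data, or Spreadsheet view export) and bin it in Python. A minimal sketch, with a hypothetical inline CSV — the column names (`Points:0`, `Velocity:2`, etc.) are assumptions that you should match to the headers ParaView actually writes:

```python
import csv
import io
import math
from collections import defaultdict

# Hypothetical CSV as exported from ParaView's "Save Data"; the column
# names below are assumptions, match them to your exported file.
CSV_TEXT = """Points:0,Points:1,Points:2,Velocity:0,Velocity:1,Velocity:2
0.00,0.00,5.01,0.1,0.0,2.0
0.06,0.00,5.02,0.2,0.0,1.8
-0.09,0.01,4.99,0.0,0.0,1.2
0.00,0.00,9.00,0.3,0.0,2.5
"""

def radial_profile(rows, z_mid, dz, r_max, nbins):
    """Average the axial velocity in radial bins, using only points inside
    the slab z_mid +/- dz/2 (riser axis assumed at x = y = 0)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    width = r_max / nbins
    for row in rows:
        x, y, z = (float(row[k]) for k in ("Points:0", "Points:1", "Points:2"))
        if abs(z - z_mid) > dz / 2:
            continue  # outside the slab of interest
        r = math.hypot(x, y)
        b = min(int(r / width), nbins - 1)
        sums[b] += float(row["Velocity:2"])
        counts[b] += 1
    return {b: sums[b] / counts[b] for b in counts}

rows = list(csv.DictReader(io.StringIO(CSV_TEXT)))
profile = radial_profile(rows, z_mid=5.0, dz=0.1, r_max=0.15, nbins=3)
for b, w_avg in sorted(profile.items()):
    print(f"bin {b}: mean axial velocity {w_avg:.3f}")
```

The same binning works whether the CSV came from the Lagrangian .vtp points or from the new cell-centered solids velocity; writing the result back out with `csv.writer` gives a file Excel can open directly.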

In my case, the only input files I give to the HPC are the .MFX file and the .STL file, and the simulation output does not contain a file named ‘model/cartesian_grid/vtk_out.f’.

Any ideas where I could find this file?

From the GUI, go to the “Editor tab” at the bottom. You can then navigate the source code. There is a search box where you can type “vtk_out” and it will make it easier to locate the file. Select the file and copy it to the project directory (top left green highlighted “copy” button). Once the file is copied you can edit it as shown in previous post. Save the file and build the solver (wrench icon).

Got it! Thanks for the step by step instructions!

I can build the solver in the GUI and it works on my local computer.

However, when I queue the job on the HPC, do I need to include the vtk_out.f file to get the solids velocity as a field?
If yes, then I can include it in the HPC folder.

Also, if I do include the vtk_out.f file in the HPC folder, do I need to rebuild the solver on the HPC? If so, how?

I ran the above file on the HPC and included the vtk_out.f file in the directory; however, the resulting .vtu cell data does not have the solids velocity as a field when visualized in ParaView or MFiX.

The vtk_out.f file works perfectly when I simulate on my local computer with the custom build, as instructed by Dr. Dietiker.

Thus, I was wondering if you could kindly help me obtain the solids velocity field data using the vtk_out.f method on HPC runs.

Did you also build the solver on the HPC from the project directory and run with this custom solver?

Thanks for the comment Dr. Dietiker.

That is what I was trying to figure out: our university HPC already has MFiX installed with a prebuilt solver, and I did not understand how to rebuild the solver on the HPC itself.

Here is the slurm-file along with the solver details for the HPC I am using. It would be an immense help if you can guide me in rebuilding the custom solver on the HPC:

job_submission_file.txt (1023 Bytes)

If your HPC support team built /share/apps/mfix/bin/mfixsolver_dmp_smp, ask them how they did it (from the GUI, from command line, with what options and source code?).

My suggestion is to build from the same directory as $SLURM_SUBMIT_DIR. This should contain vtk_out.f.

I would also build the dmp solver with the GUI. Click the wrench icon, select “Distributed Memory Parallel (DMP)” option and click “Build solver”. At the end of the build process, you will have a mfixsolver file in the project directory.

You can also build from the command line with

build_mfixsolver --dmp

See more options at:

https://mfix.netl.doe.gov/doc/mfix/22.1/html/solver/build-custom.html

Then you should have the executable in $SLURM_SUBMIT_DIR. If you really want to copy the files to a sub directory, use

#Copy input files and solver to simulation directory
cp ../own_STL_file.mfx ../geometry_0001.stl ../geometry.stl ../vtk_out.f ../mfixsolver .

Then when you run the simulation, you point to the custom solver in the current directory:

srun ./mfixsolver -f own_STL_file.mfx
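Putting the pieces above together, a job script along these lines should work; it is only a sketch — the job name, task count, and walltime are placeholders to adapt to your cluster, and if compilers are not available on compute nodes, run the `build_mfixsolver` step once interactively before submitting instead:

```shell
#!/bin/bash
#SBATCH --job-name=cfb_riser     # placeholder job name
#SBATCH --ntasks=8               # match your DMP decomposition
#SBATCH --time=24:00:00          # placeholder walltime

# Build the custom solver in the submit directory, which must contain
# own_STL_file.mfx, geometry_0001.stl and the edited vtk_out.f.
cd "$SLURM_SUBMIT_DIR"
build_mfixsolver --dmp

# Run with the freshly built local solver, not the system-wide one.
srun ./mfixsolver -f own_STL_file.mfx
```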