Exploring Lagrangian Multiphase Flow: A Guide to MPPICFoam and MPPICDyMFoam
In the world of OpenFOAM, modeling how particles interact with fluids is a classic challenge. While the standard DPMFoam (Discrete Particle Method) is great for dilute systems, it often struggles when particles get crowded. That’s where MPPICFoam and its dynamic-mesh sibling, MPPICDyMFoam, step in.
These solvers use the Multiphase Particle-In-Cell (MP-PIC) method, a powerful hybrid approach designed to handle dense particle flows without the massive computational cost of tracking every single physical collision.
How the MP-PIC Method Works
The core philosophy of these solvers is to treat the fluid as a continuum (Eulerian) and the particles as discrete “parcels” (Lagrangian). However, unlike standard DEM (Discrete Element Method) which calculates collisions between every individual sphere, MP-PIC uses a continuum model for particle stress.
- Parcels, Not Particles: To save memory, the solver groups thousands of physical particles into a single “parcel.” Every parcel in your simulation represents a cloud of particles with identical properties (velocity, size, density).
- The Particle Stress Gradient: Instead of detecting a “hit” between two parcels, the solver calculates an inter-particle stress \(\tau\) (often called the particle pressure) from the local volume fraction of solids \(\theta\).
- The Governing Equation: The velocity of a parcel \(u_p\) is influenced by gravity, drag, and, crucially, the gradient of this particle stress:$$\frac{du_p}{dt} = \text{drag}\,(u_f - u_p) + g - \frac{1}{\theta \rho_p} \nabla \tau$$This “pressure” term effectively pushes parcels away from regions where they are too tightly packed, simulating the effect of collisions.
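The update above can be sketched numerically. This is a minimal explicit-Euler illustration of the parcel momentum equation, not OpenFOAM source; the drag coefficient `D` and every number below are assumptions chosen for demonstration only.

```python
import numpy as np

def update_parcel_velocity(u_p, u_f, D, g, grad_tau, theta, rho_p, dt):
    """One explicit Euler step of du_p/dt = D*(u_f - u_p) + g - grad(tau)/(theta*rho_p)."""
    accel = D * (u_f - u_p) + g - grad_tau / (theta * rho_p)
    return u_p + dt * accel

# Example: a parcel slower than the gas, sitting in a moderately packed region.
u_p      = np.array([0.0, 0.0, 0.0])    # parcel velocity [m/s]
u_f      = np.array([1.0, 0.0, 0.0])    # interpolated fluid velocity [m/s]
g        = np.array([0.0, 0.0, -9.81])  # gravity [m/s^2]
grad_tau = np.array([50.0, 0.0, 0.0])   # particle stress gradient [Pa/m]

u_new = update_parcel_velocity(u_p, u_f, D=100.0, g=g,
                               grad_tau=grad_tau, theta=0.3,
                               rho_p=2500.0, dt=1e-3)
```

Note how the stress-gradient term slightly opposes the drag acceleration here: the parcel is being pushed away from the more densely packed region even as the gas drags it forward.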
Key Properties and Features
The MP-PIC approach offers several distinct advantages for industrial-scale simulations:
- Packing Limits: You can define a maximum packing fraction (e.g. `alphaPacked` in the Harris-Crighton particle stress model, usually around 0.6 to 0.65). When the solid volume fraction nears this limit, the particle stress increases sharply, preventing parcels from overlapping unrealistically.
- Statistical Scaling: Because you are moving parcels rather than trillions of particles, you can simulate massive systems like fluidized beds or cyclones on standard workstations.
- Two-Way Coupling: The fluid feels the particles, and the particles feel the fluid. The momentum exchange is handled via drag models (like Gidaspow, Wen-Yu, or Ergun) that automatically adjust based on how crowded the area is.
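As a concrete example, the packing and particle stress settings typically live in `constant/kinematicCloudProperties`. The fragment below is a sketch modelled on the MPPICFoam tutorial cases; the coefficient values are illustrative and the exact keywords vary between OpenFOAM versions, so check the tutorials shipped with your release:

```
subModels
{
    packingModel    explicit;    // resolves the particle stress gradient

    explicitCoeffs
    {
        particleStressModel
        {
            type        HarrisCrighton;
            alphaPacked 0.6;     // close-packing limit
            pSolid      10.0;    // stress magnitude scale
            beta        2.0;     // volume-fraction exponent
            eps         1.0e-7;  // keeps the denominator finite near packing
        }
        correctionLimitingMethod
        {
            type        absolute;
            e           0.9;     // restitution-like limiter coefficient
        }
    }
}
```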
MPPICFoam vs. MPPICDyMFoam
The distinction between these two is straightforward but vital for your setup:
| Solver | Best Used For… | Key Feature |
| --- | --- | --- |
| MPPICFoam | Stationary geometries. | Fixed mesh; ideal for fluidized beds, pneumatic conveying, and silos. |
| MPPICDyMFoam | Moving parts or adaptive meshes. | Supports Dynamic Mesh (DyM). Use this for valves opening/closing, rotating stirrers, or moving pistons interacting with particles. |
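For MPPICDyMFoam, the mesh motion itself is configured in `constant/dynamicMeshDict`. The sketch below assumes a solid-body rotation of a hypothetical `stirrer` cell zone; keyword names differ between OpenFOAM versions (newer releases use `dynamicMotionSolverFvMesh` with a `motionSolver` entry), so treat this as a template rather than a drop-in file:

```
dynamicFvMesh   solidBodyMotionFvMesh;

solidBodyMotionFvMeshCoeffs
{
    cellZone    stirrer;          // hypothetical cell zone enclosing the impeller

    solidBodyMotionFunction rotatingMotion;
    rotatingMotionCoeffs
    {
        origin  (0 0 0);
        axis    (0 0 1);
        omega   10;               // angular velocity [rad/s]
    }
}
```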
Interaction with CFD Analysis
When you run an MP-PIC analysis, the interaction between the parcels and the mesh is a delicate dance. Here is what happens during every time step:
1. Fluid Solve: The Eulerian equations calculate the pressure and velocity fields of the gas/liquid.
2. Interpolation: The fluid velocity at the exact location of each Lagrangian parcel is interpolated from the surrounding cells.
3. Parcel Transport: The solver calculates the new position and velocity of the parcels using the stress gradient and drag forces.
4. Source Term Mapping: The momentum lost or gained by the parcels is mapped back onto the Eulerian mesh as a “source term,” which will affect the fluid flow in the next increment.
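The loop above can be sketched in a deliberately simplified 1-D form. The nearest-cell “interpolation”, the linear drag law, and all names below are assumptions for illustration, not OpenFOAM’s internals; the point is the momentum bookkeeping between parcels and fluid.

```python
import numpy as np

def couple_step(u_fluid, x_cells, parcels, D, dt, rho_f, cell_vol):
    """One time step: interpolate, transport parcels, map momentum sources back."""
    source = np.zeros_like(u_fluid)                   # momentum source per cell [kg*m/s]
    for p in parcels:
        i = int(np.argmin(np.abs(x_cells - p["x"])))  # owner cell (nearest centre)
        u_f = u_fluid[i]                              # "interpolated" fluid velocity
        du = D * (u_f - p["u"]) * dt                  # drag-driven velocity change
        p["u"] += du                                  # parcel transport
        p["x"] += p["u"] * dt
        source[i] -= p["m"] * du                      # fluid loses what parcels gain
    u_fluid = u_fluid + source / (rho_f * cell_vol)   # apply source for next increment
    return u_fluid, source

# Two cells of gas, one parcel lagging behind the flow.
u_fluid = np.array([1.0, 1.0])
x_cells = np.array([0.25, 0.75])
parcels = [{"x": 0.3, "u": 0.0, "m": 0.001}]
u_fluid, source = couple_step(u_fluid, x_cells, parcels,
                              D=10.0, dt=0.01, rho_f=1.2, cell_vol=0.5)
```

Whatever momentum the parcels pick up through drag is removed from the carrier phase, which is exactly what makes the coupling two-way.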
Pro-Tip for Convergence
Because the particle stress gradient \(\nabla \tau\) is calculated on the mesh, your results can be sensitive to grid size. If your cells are too small (smaller than your particles), the continuum assumption breaks down. Ideally, your mesh cells should be 3–5 times larger than your particle diameter for the MP-PIC math to remain stable and physically accurate.
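The rule of thumb above is easy to turn into a quick pre-meshing check. The helper name and the example diameter are illustrative assumptions:

```python
def recommended_cell_size(d_particle, factor_min=3.0, factor_max=5.0):
    """Return (min, max) recommended cell edge length for the 3-5x MP-PIC rule."""
    return factor_min * d_particle, factor_max * d_particle

# Example: 500 micron particles call for cells between 1.5 mm and 2.5 mm.
lo, hi = recommended_cell_size(500e-6)
```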
Whether you’re modeling a chemical reactor or a dust suppression system, these solvers provide that “sweet spot” between accuracy and speed, letting you simulate dense flows that would otherwise crash a standard Lagrangian solver.
CloudHPC is an HPC provider for running engineering simulations on the cloud. CloudHPC offers from 1 to 224 vCPUs per process across several HPC infrastructure configurations, both multi-thread and multi-core. The current software range includes several CAE, CFD, FEA, and FEM packages, among them OpenFOAM, FDS, Blender, and several others.
New users benefit from a FREE trial of 300 vCPU-hours to test the platform and all its features and verify whether it suits their needs.