DOI: 10.1145/3545008.3545016 · ICPP '22 Conference Proceedings · Research article

ParallelDualSPHysics: supporting efficient parallel fluid simulations through MPI-enabled SPH method

Published: 13 January 2023

Abstract

Smoothed Particle Hydrodynamics (SPH) is a classical mesh-free particle method that has been successfully applied in the field of Computational Fluid Dynamics (CFD). Its advantages over traditional mesh-based methods have made it popular for simulating problems involving large deformation and free-surface flow. However, the high computational cost of the SPH method has hindered its wider adoption. Considerable research effort has been devoted to accelerating SPH using GPUs and multithreading, yet developing efficient parallel SPH algorithms on modern high-performance computing (HPC) systems remains significantly challenging, especially for real-world engineering problems involving hundreds of millions of particles. In this paper, we propose an MPI-enabled parallel SPH algorithm and present ParallelDualSPHysics, an open-source software package supporting efficient parallel fluid simulations. Based on an efficient domain decomposition scheme, the essential data structures and algorithms of DualSPHysics were refactored to build the parallel version. To operate on particles evenly distributed across a distributed-memory HPC system, parallel particle-interaction and particle-update modules were introduced, enabling the SPH solver to synchronize computations among multiple processes using MPI. In addition, redesigned pre-processing and post-processing capabilities allow ParallelDualSPHysics to be applied in a wide range of areas. Real-life test cases with up to 120 million particles were simulated and analyzed on a modern HPC system. The results show that the parallel efficiency of ParallelDualSPHysics exceeds 90% on up to 1024 CPU cores, indicating that ParallelDualSPHysics has strong potential for large-scale engineering applications.
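The paper's implementation is not reproduced on this page, but the particle-interaction module described above implies a ghost-particle (halo) exchange between neighboring subdomains before each force computation. Below is a minimal, hypothetical C++/MPI sketch of that pattern under a 1-D slab decomposition; the Particle struct, the smoothing length h, and the helpers boundary_layer and exchange are illustrative assumptions, not code from ParallelDualSPHysics or DualSPHysics.

```cpp
// Hypothetical sketch: ghost-particle (halo) exchange for an MPI-parallel
// SPH step, assuming a 1-D slab decomposition of the domain along x.
// Build (typical): mpicxx -O2 halo_exchange.cpp -o halo_exchange
#include <mpi.h>
#include <vector>

struct Particle {        // assumed minimal particle state
    double x, y, z;      // position
    double vx, vy, vz;   // velocity
    double rho;          // density
};

// Collect local particles within one smoothing length h of a subdomain
// face; these must be mirrored to the neighboring rank as ghosts.
static std::vector<Particle> boundary_layer(const std::vector<Particle>& local,
                                            double face_x, double h, bool left)
{
    std::vector<Particle> out;
    for (const Particle& p : local)
        if (left ? (p.x < face_x + h) : (p.x > face_x - h))
            out.push_back(p);
    return out;
}

// Swap ghost particles with one neighbor. Counts are exchanged first so
// the receiver can allocate; MPI_BYTE keeps the sketch short (a committed
// MPI datatype for Particle would be more robust in production).
static std::vector<Particle> exchange(const std::vector<Particle>& send_buf,
                                      int neighbor, MPI_Comm comm)
{
    int send_n = static_cast<int>(send_buf.size());
    int recv_n = 0;
    MPI_Sendrecv(&send_n, 1, MPI_INT, neighbor, 0,
                 &recv_n, 1, MPI_INT, neighbor, 0,
                 comm, MPI_STATUS_IGNORE);

    std::vector<Particle> recv_buf(recv_n);
    MPI_Sendrecv(send_buf.data(), send_n * (int)sizeof(Particle), MPI_BYTE,
                 neighbor, 1,
                 recv_buf.data(), recv_n * (int)sizeof(Particle), MPI_BYTE,
                 neighbor, 1, comm, MPI_STATUS_IGNORE);
    return recv_buf;
}

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);
    int rank = 0, size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const double h = 0.05;              // smoothing length (assumed value)
    const double slab = 1.0 / size;     // unit domain split into equal slabs
    const double x_lo = rank * slab;
    const double x_hi = x_lo + slab;

    std::vector<Particle> local;        // filled by the solver in reality

    // Before each force computation, mirror the boundary layers to the
    // neighbors; received ghosts then join the local neighbor search.
    std::vector<Particle> ghosts;
    if (rank > 0) {
        auto g = exchange(boundary_layer(local, x_lo, h, /*left=*/true),
                          rank - 1, MPI_COMM_WORLD);
        ghosts.insert(ghosts.end(), g.begin(), g.end());
    }
    if (rank + 1 < size) {
        auto g = exchange(boundary_layer(local, x_hi, h, /*left=*/false),
                          rank + 1, MPI_COMM_WORLD);
        ghosts.insert(ghosts.end(), g.begin(), g.end());
    }

    MPI_Finalize();
    return 0;
}
```

In a production solver the received ghosts would be merged into the local neighbor-search structure (DualSPHysics uses a cell-linked list) before the interaction kernel runs, and a 3-D decomposition with dynamic load balancing would replace the 1-D slabs used here for brevity.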


Published In

ICPP '22: Proceedings of the 51st International Conference on Parallel Processing
August 2022
976 pages
ISBN: 9781450397339
DOI: 10.1145/3545008

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Computational Fluid Dynamics
  2. Parallel Computing
  3. ParallelDualSPHysics
  4. Smoothed Particle Hydrodynamics

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • State Key Laboratory of High Performance Computing
  • National Natural Science Foundation of China

Conference

ICPP '22
ICPP '22: 51st International Conference on Parallel Processing
August 29 - September 1, 2022
Bordeaux, France

Acceptance Rates

Overall Acceptance Rate 91 of 313 submissions, 29%

