The Use of HPC for Undertaking Probabilistic Explosion Assessments


HPC – The General Perception
When we hear the term HPC, high performance computing, we perhaps instinctively think of big simulations – in the case of CFD, individual simulations with billion-plus-cell meshes running massively in parallel across several thousand compute nodes of a supercomputer. In many areas of engineering simulation, however, there is not yet a routine requirement for flow cases with such massive computational meshes. So what can HPC offer in those cases?

This article describes how HPC can be exploited to deliver probabilistic analyses in which a large number of simpler cases (hundreds or thousands) are simulated, so that the sensitivity of the CFD/FEA predictions to a wide range of input parameters can be understood. Successful implementation relies on three aspects:

• retaining a fit-for-purpose LPC (low performance computing) approach when creating individual simulation cases, so that they will each run quickly on the HPC host 

• automating the pre-processing workflow to systematically create the large number of underlying simulation cases (a minimal scripting sketch follows this list)

• automating the post-processing workflow to compile the simulation predictions into a form that supports further interpretation (a post-processing sketch follows the next paragraph).
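To make the pre-processing automation concrete, the sketch below shows one way it might be scripted in Python: sample a set of input parameters, stamp each combination into a solver input template, and submit every case as an independent job. The parameter names, the template file (case_template.in), the run script (run_solver.sh) and the use of a SLURM scheduler are all assumptions for illustration; the actual tooling will depend on the solver and the HPC host.

    import itertools
    import subprocess
    from pathlib import Path

    # Hypothetical study variables for a congested-module explosion
    # assessment; the real parameter set and levels would come from
    # the project risk basis.
    leak_rates = [0.1, 1.0, 10.0]             # release rate, kg/s
    wind_dirs = [0, 90, 180, 270]             # wind direction, degrees
    ignition_points = ["low", "mid", "high"]  # ignition location

    template = Path("case_template.in").read_text()

    for i, (q, wd, ign) in enumerate(
            itertools.product(leak_rates, wind_dirs, ignition_points)):
        case_dir = Path(f"case_{i:04d}")
        case_dir.mkdir(exist_ok=True)
        # Stamp the sampled values into the solver input template.
        (case_dir / "case.in").write_text(
            template.format(leak_rate=q, wind_dir=wd, ignition=ign))
        # Submit each fit-for-purpose case as an independent job, so the
        # ensemble fills the cluster rather than one massive simulation.
        subprocess.run(["sbatch", "--chdir", str(case_dir), "run_solver.sh"],
                       check=True)

Because each case is deliberately kept small (the LPC principle above), the scheduler can pack hundreds of them onto the cluster at once, which is where the HPC benefit actually comes from.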

 

This article discusses the CFD modelling of hydrocarbon explosions within congested spaces using a probabilistic framework, which is typically required for the determination of structural design loads in the oil and gas sector.
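On the post-processing side, one common way to condense the compiled predictions is an overpressure exceedance curve, from which a design load can be read off at a target annual frequency. The sketch below, again a minimal illustration rather than the method used in the article's study, assumes each CFD case has a peak overpressure and an assigned annual frequency; the numbers and the 1e-4 per year target are invented.

    import numpy as np

    def exceedance_curve(overpressures, frequencies):
        """Sort cases from highest to lowest peak overpressure and
        accumulate their frequencies, so f[i] is the total annual
        frequency with which pressure p[i] is reached or exceeded."""
        order = np.argsort(overpressures)[::-1]
        p = np.asarray(overpressures, dtype=float)[order]
        f = np.cumsum(np.asarray(frequencies, dtype=float)[order])
        return p, f

    # Invented example results: one peak overpressure (barg) per CFD
    # case, each weighted by the annual frequency of that scenario.
    p_peak = [0.12, 0.45, 0.80, 1.60, 2.30]
    f_case = [1e-3, 4e-4, 1e-4, 3e-5, 8e-6]

    p, f = exceedance_curve(p_peak, f_case)
    # f is ascending (it accumulates from the rarest, most severe
    # cases), so it serves directly as the interpolation axis.
    design_p = np.interp(1e-4, f, p)
    print(f"Design overpressure at 1e-4/yr: {design_p:.2f} barg")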

Document Details

Reference: BM_Apr_16_5
Authors: Howell, S.
Language: English
Type: Magazine Article
Date: 2016-04-01
Organisations: Abercus
Region: Global
