3. Wide area message passing
Connect applications running on different
platforms by establishing communication paths.
Each path can be hand-tuned for better
performance.
Lighter-weight than MPI, and useful for coupling
and parallelizing codes over long distances.
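The idea of a hand-tuned communication path can be sketched with plain TCP sockets. This is an illustrative sketch only, not the MPWide API: the buffer size, length-prefixed framing, and function names below are assumptions chosen for the example.

```python
import socket
import threading

def make_tuned_socket(bufsize=1 << 20):
    """Create a TCP socket with hand-tuned send/receive buffer sizes."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bufsize)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bufsize)
    return s

def recv_exact(conn, n):
    """Receive exactly n bytes (TCP may deliver data in pieces)."""
    data = bytearray()
    while len(data) < n:
        chunk = conn.recv(n - len(data))
        if not chunk:
            raise ConnectionError("peer closed connection")
        data.extend(chunk)
    return bytes(data)

def sendrecv(conn, payload):
    """Send a length-prefixed payload, then receive the peer's in return."""
    conn.sendall(len(payload).to_bytes(8, "big") + payload)
    size = int.from_bytes(recv_exact(conn, 8), "big")
    return recv_exact(conn, size)

# Loopback demo: one end plays the "remote" model, the other the local one.
server = make_tuned_socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def remote_end():
    conn, _ = server.accept()
    with conn:
        received = sendrecv(conn, b"boundary state from remote model")
        assert received == b"boundary state from local model"

t = threading.Thread(target=remote_end)
t.start()

client = make_tuned_socket()
client.connect(("127.0.0.1", port))
reply = sendrecv(client, b"boundary state from local model")
t.join()
client.close()
server.close()
print(reply.decode())
```

Both ends send before they receive, so the exchange completes without deadlock for small messages; a wide-area library additionally tunes buffer sizes and may stripe each exchange over multiple parallel streams.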
6. ● Couples computational models.
● Connects these over wide area networks
using MPWide.
● Handles models’ time and space scales as
per the Multiscale Modeling and Simulation
framework*.
● Supports Java, C, C++ and Fortran.
● Used by 10+ production applications.
9. ● Aim: Accurately model cerebrovascular
bloodflow with acceptable performance.
● Approach: integrate a
person-specific circulation model
with a high-res local vasculature
model.
10. ● Future applications:
○ Comparison of rheology models.
○ Validation against medical data (ongoing).
○ Identification of predictive indicators of aneurysm rupture.
● And eventually predict the outcome of
cerebrovascular surgery.
14. ● 1D: We couple the 1D Python Navier-Stokes
(PyNS) solver to HemeLB to construct a
multiscale model.
● 3D: We use MPWide to efficiently exchange
data between a desktop in London and a
supercomputer in Edinburgh.
16. More?
Groen et al., Interface Focus 3(2), 2013.
Groen et al., Journal of Computational Science 4(5), 2013.
Bernabeu et al., Interface Focus 3(2), 2013.
http://www.slideshare.net/DerekGroen/multiscale-modelling-of-brainbloodflow
18. Aim: To develop quantitative coarse-grained
models of clay-polymer nanocomposites.
We will use these models to:
● Predict the thermodynamically favourable state of the
composites.
● Predict their elasticity.
19. We require:
● Accurate potentials.
● Realistic structures.
● Task farming of many MD simulations.
30. Scientific Challenges
Just scratching the surface here:
● Which couplings can deliver useful
information?
● What information should we exchange?
● How do we validate and error-check coupled
models?
○ ...what if they are multi-physics as well?
32. More?
“Survey of Multiscale and Multiphysics Applications and Communities”
Derek Groen, Stefan Zasada and Peter Coveney
IEEE Computing in Science & Engineering (in press), 2013.
Preprint at: http://arxiv.org/abs/1208.6444
33. Acknowledgements
Slides made by Derek Groen
Thanks go out to:
James Suter, Rupert Nash, James
Hetherington, Peter Coveney, Hywel Carver,
Stefan Zasada, Steven Rieder, Simon
Portegies Zwart, Chris Kurowski, Alfons
Hoekstra, Werner Dubitzky... and many others!