Engineering Simulations of
@ PyCon APAC 2014, May 17
❖ I am: an engineer using computers to solve problems in mechanics.
❖ I want to solve conservation laws. I want to solve for various physical
processes. I should build a framework.
❖ Examples of the simulations:
❖ Stress Waves
❖ Supersonic Jet in Cross Flow
❖ We want multiple solvers for physical processes governed by the equations:
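In a commonly used index form, first-order hyperbolic conservation laws of this kind can be written as follows (a generic statement; the symbols here are not taken from the slides):

```latex
\frac{\partial u_m}{\partial t}
  + \sum_{k=1}^{3} \frac{\partial f^{(k)}_m(u_1, \ldots, u_M)}{\partial x_k} = 0,
\qquad m = 1, \ldots, M
```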
❖ We use the space-time Conservation Element and Solution Element (CESE) method.
❖ We want an extensible system, modularized for various physical processes.
❖ Then plan for interoperation.
❖ We want parallel code (scaling from 1,000 cores upward). We want hybrid
parallelism for heterogeneous architectures.
❖ We want the code to be organized. We don’t want spaghetti code patched
together from paper to paper. We want to code systematically.
It’s an Engineering Problem
❖ “Develop a software framework for the CESE method.”
❖ Balance the following points:
❖ Extensibility and modularity.
❖ Parallel computing.
❖ All goals are achieved, with rough edges.
❖ Python is slow, but provides a good infrastructure for working with
low-level number-crunching code.
❖ ndarray (N-dimensional array) of NumPy (http://www.numpy.org/).
❖ Cython (http://cython.org/) for interfacing between Python and C.
❖ The versatile scientific Python ecosystem simplifies writing supporting code.
❖ Practices that emerged from other fields flow into the sciences:
❖ Version control, unit tests, documenting tools, etc.
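A minimal sketch of why ndarray matters for this hybrid approach: its data lives in a single C-compatible buffer whose raw pointer can be handed to C kernels. The array name and sizes below are hypothetical.

```python
import numpy as np

# Hypothetical solution array: one row per mesh cell, one column per variable.
soln = np.zeros((1000, 5), dtype='float64')

# NumPy guarantees one C-contiguous buffer here, so the raw pointer can be
# passed to a C number-crunching kernel (e.g. through Cython or ctypes).
assert soln.flags['C_CONTIGUOUS']
ptr = soln.ctypes.data  # integer address of the underlying C buffer
```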
Scripting Is a Must
❖ There are too many parameters and analyses required.
❖ No interface is sufficient for the simulations: SOLVCON
needs to be a library.
❖ Controlling a library by using low-level languages like
C and C++ is too painful to be realistic.
Design for Productivity
❖ The outstanding capability to interface with C makes Python an
ideal manager for low-level, high-performance C code.
❖ C++ works as well, but SOLVCON uses only C to keep the interfacing simple.
❖ The hybrid approach allows us to architect the system with
high-level Python and still have the high performance of C.
❖ Ideas can be quickly prototyped in Python. After the results
get validated, the fast C version can then be developed.
❖ The conservation laws we are solving assume a continuum.
❖ In computer memory nothing is continuous.
❖ There must be a way to discretize the space for the computation.
❖ We chose to employ unstructured meshes of mixed
elements to model complex geometry.
❖ In computer memory they consist of several look-up tables.
(Element types: hexahedron, tetrahedron, prism, pyramid.)
Look-Up Tables for Meshes
❖ SOLVCON categorizes the look-up tables into three kinds: (i)
coordinate, (ii) meta-data, and (iii) connectivity.
❖ They are either 1-D or 2-D arrays.
❖ These arrays contain “ghost cells” for boundary-condition treatment.
❖ The ghost cells are outside the computing space.
❖ The CESE method needs only one layer of ghost cells.
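A sketch of the three table kinds and one possible ghost-cell layout. The sizes, array names, and the convention of storing the ghost layer in front are illustrative, not SOLVCON's real API.

```python
import numpy as np

ncell, nghost = 8, 3   # interior cells plus one layer of ghost cells
ndim = 3

# (i) coordinate table: cell centroids, a 2-D array of floats.
clcnd = np.zeros((nghost + ncell, ndim), dtype='float64')
# (ii) meta-data table: cell type id, a 1-D array of integers.
cltpn = np.zeros(nghost + ncell, dtype='int32')
# (iii) connectivity table: nodes of each cell; -1 pads mixed element types.
clnds = np.full((nghost + ncell, 9), -1, dtype='int32')

# With the ghost layer stored in front, the interior is a simple slice and
# negative offsets from it address the ghost cells outside the domain.
interior = clcnd[nghost:]
```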
❖ Millions or billions of
unstructured mesh elements
can be automatically
decomposed into multiple
blocks for parallel computing.
❖ A graph is built to
represent the connectivity
among all the mesh elements.
❖ The partitioned graph is
used to decompose the
original mesh into thousands of blocks.
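The graph construction can be sketched as follows. The face-to-cell table and its name `fccls` are hypothetical; a real run would hand the resulting neighbor lists to a graph partitioner such as METIS.

```python
import numpy as np

# Hypothetical face-to-cell table: each row lists the two cells sharing a
# face; -1 marks a boundary face with no neighboring cell.
fccls = np.array([[0, 1], [1, 2], [2, 3], [3, -1]], dtype='int32')

# Build the cell-adjacency graph (as neighbor lists) from shared faces.
graph = {}
for c0, c1 in fccls:
    if c0 >= 0 and c1 >= 0:
        graph.setdefault(int(c0), []).append(int(c1))
        graph.setdefault(int(c1), []).append(int(c0))
```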
Two-Level Nested Loops
❖ The time-marching nature of
the CESE method dictates a
structure of two-level nested loops.
❖ The outer loop is for time-marching.
❖ Multiple inner loops
“sweep” over the discretized space.
❖ All facilities can be built around
this execution flow.
❖ solvcon.case.MeshCase: the outer temporal loop.
❖ solvcon.solver.MeshSolver: home of the inner spatial loops.
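A toy sketch of that division of labor. The class names only echo MeshCase and MeshSolver; the real APIs differ, and the real inner loops run C kernels.

```python
class ToySolver:
    """Inner spatial loops: sweep over the discretized space."""
    def __init__(self, ncell):
        self.soln = [0.0] * ncell
    def march(self, dt):
        # Each "sweep" updates every cell once.
        self.soln = [val + dt for val in self.soln]

class ToyCase:
    """Outer temporal loop: advance the solution step by step."""
    def __init__(self, solver, dt, nstep):
        self.solver, self.dt, self.nstep = solver, dt, nstep
    def run(self):
        for istep in range(self.nstep):   # outer loop: time-marching
            self.solver.march(self.dt)    # inner loops live in the solver

case = ToyCase(ToySolver(ncell=4), dt=0.1, nstep=10)
case.run()
```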
❖ What’s in-situ visualization/analysis (ISV)?
❖ Perform result visualization/analysis while time-marching.
❖ Why in-situ?
❖ [50 million-element mesh] * [10,000 time steps] * [6 single-precision solution
fields (CFD)] means 12 TB of data, for a single simulation case.
❖ Segregating result analysis from simulation code is difficult.
❖ The architecture of most research codes isn’t good enough for the segregation.
❖ Most codes simply produce the data and use other software to post-process it.
❖ A common practice is to reduce the resolution.
❖ Outputting every 100 time steps reduces the output data to 120 GB.
❖ But it defeats the purpose of high-resolution simulations.
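The arithmetic behind those numbers, as a quick check:

```python
nelm = 50_000_000   # mesh elements
nstep = 10_000      # time steps
nfield = 6          # single-precision solution fields
nbyte = 4           # bytes per single-precision value

total = nelm * nstep * nfield * nbyte
print(total / 1e12)        # 12.0 -> 12 TB for one simulation case

every100 = total / 100     # keep only every 100th time step
print(every100 / 1e9)      # 120.0 -> 120 GB, still a lot of data
```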
❖ SOLVCON provides
companions of MeshCase and
MeshSolver: MeshHook and MeshAnchor.
❖ MeshCase and MeshSolver
are for equation solutions.
❖ MeshHook and MeshAnchor
are for solution analysis.
❖ And there’s still a lot to do.
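The companion-object idea can be sketched like this. `ToyHook` and its `postmarch` method are illustrative only, not the real MeshHook/MeshAnchor interfaces.

```python
class ToyHook:
    """Analysis code that the time loop invokes in situ."""
    def postmarch(self, istep, soln):
        # Called after every time step, while the data is still in memory,
        # so no 12 TB of intermediate files are ever written.
        self.latest_max = max(soln)

hook = ToyHook()
soln = [0.0, 0.0]
for istep in range(3):               # the time-marching loop
    soln = [v + 1.0 for v in soln]   # stand-in for the real solver sweep
    hook.postmarch(istep, soln)      # in-situ analysis callback
```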
❖ SOLVCON documents should contain what’s in
conventional research papers.
❖ What we want is more than conventional papers.
❖ Bi-directional linking between the code and the documents.
❖ Some parts of the documents should be runnable. It’s
related to validation.
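One way to make part of a document runnable is a doctest, which Sphinx can execute during the documentation build through its doctest extension. The function here is a made-up example:

```python
import doctest

def cell_volume(edge):
    """Volume of a (hypothetical) cubic cell.

    >>> cell_volume(2.0)
    8.0
    """
    return edge ** 3

# Run every example embedded in this module's docstrings.
results = doctest.testmod()
```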
❖ Sphinx (http://sphinx-doc.org/) is the default tool for
documenting Python projects.
❖ ReadTheDocs (https://readthedocs.org/) is a service to
host Sphinx documents.
❖ SOLVCON authors and organizes its documents with Sphinx.
❖ The documents are published at http://www.solvcon.net/
(ReadTheDocs is behind the scenes).
Authoring, Editing, and Reviewing
❖ We use Mercurial for version control and BitBucket
(https://bitbucket.org/solvcon/solvcon/) to host the
project and its source code.!
❖ Reviewing the documents goes through the pull-request
flow of code review.
❖ We treat the reStructuredText source files of the
documents like the program source code.
❖ Code without verification can’t be taken as working code,
no matter how carefully we wrote it.
❖ With the testing system, refactoring and quick prototyping
become feasible.
❖ Many research codes don’t have a regression testing
system, so while coding the developers can’t be sure
how their changes impact the system.
❖ Two kinds of tests: unit tests and answer tests. Both need
to be automated.
❖ Unit tests are short and quick, and check for simple
output of elementary API behaviors.
❖ SOLVCON uses the standard Python unittest module.
❖ nose (https://nose.readthedocs.org/) is used as the
automatic test runner.
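A minimal unit test in that standard-library style; the `sweep` function under test is a stand-in, not a SOLVCON API:

```python
import unittest

def sweep(values, dt):
    """Toy kernel: advance every cell value by one time increment."""
    return [v + dt for v in values]

class TestSweep(unittest.TestCase):
    """Short and quick: checks one elementary behavior."""
    def test_advances_every_cell(self):
        self.assertEqual(sweep([0.0, 1.0], 0.5), [0.5, 1.5])
```

A runner such as nose (or `python -m unittest`) discovers and executes such cases automatically.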
❖ Answer tests check for the correctness of simulations.
Usually an answer test is a meaningful simulation, and
can serve as an example of how to use SOLVCON to
simulate other problems.
❖ Jenkins (http://jenkins-ci.org/) is installed at
http://ci.solvcon.net/ to automatically run the answer tests for
every commit of SOLVCON in a testing farm.
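The shape of an answer test, sketched with a toy stand-in for the solver and an illustrative golden value:

```python
import numpy as np

def run_simulation(nstep, dt):
    """Toy 'simulation': the real thing would run the CESE sweeps."""
    soln = np.zeros(4)
    for _ in range(nstep):
        soln += dt
    return soln

# Compare against a previously validated golden answer within a tolerance;
# the values and tolerance here are illustrative only.
golden = np.full(4, 1.0)
soln = run_simulation(nstep=10, dt=0.1)
assert np.allclose(soln, golden, rtol=1e-12)
```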
❖ The simulations were done with an older version of
SOLVCON; it takes a lot of effort to upgrade (ongoing).
❖ Unfamiliarity of the collaborators with the authoring
system (Sphinx): http://www.solvcon.net/en/latest/bulk/theory.html
❖ Unification of code and documents needs extra effort.
❖ Feeding back to the main framework.
❖ Server farm for continuous integration.
❖ New physical processes:
❖ solvcon.parcel.bulk: acoustic wave solver based on the bulk modulus.
❖ solvcon.parcel.vewave: visco-elastic wave solver.
❖ Document the CESE method.
❖ Documenting with IPython in addition to Sphinx?
❖ Get more people to play together!