Understanding Distributed Calculi in Haskell
Pawel Szulc
@rabbitonweb
jobs@pyrofex.net
Haskell
distributed-process
also known as Cloud Haskell
distributed-process
data Process a
ProcessId
expect :: forall a. Serializable a => Process a
send :: Serializable a => ProcessId -> a -> Process ()
[Diagram: a master process and two spawned processes, ping and pong. master sends Init(masterPid, pongPid) to ping; ping sends Ping(pingPid) to pong; pong replies with Pong; ping then sends Done to master, leaving only master running.]
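A minimal sketch (not the talk's actual code) of how that protocol might look with distributed-process: the message types mirror the labels in the diagram, while the helper names ping and pong and the omitted master/spawning setup are my assumptions.

```haskell
{-# LANGUAGE DeriveGeneric #-}
import Control.Distributed.Process
import Data.Binary (Binary)
import Data.Typeable (Typeable)
import GHC.Generics (Generic)

-- Messages from the diagram; Binary + Typeable make them Serializable.
data Init = Init ProcessId ProcessId deriving (Typeable, Generic)
data Ping = Ping ProcessId           deriving (Typeable, Generic)
data Pong = Pong                     deriving (Typeable, Generic)
data Done = Done                     deriving (Typeable, Generic)
instance Binary Init
instance Binary Ping
instance Binary Pong
instance Binary Done

-- ping: wait for Init, ping the pong process, await its Pong,
-- then tell the master we are Done.
ping :: Process ()
ping = do
  Init masterPid pongPid <- expect
  self <- getSelfPid
  send pongPid (Ping self)
  Pong <- expect
  send masterPid Done

-- pong: wait for a Ping and reply with Pong.
pong :: Process ()
pong = do
  Ping sender <- expect
  send sender Pong
```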
Calculus!
λ-calculus
Syntax
L, M, N ::= x        variable
            λx.M     abstraction
            M N      application
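As an aside (not on the slides), the grammar maps one-to-one onto a small Haskell data type; representing names as Strings is an assumption made purely for illustration.

```haskell
type Name = String

-- L, M, N ::= x | λx.M | M N
data Term
  = Var Name        -- x       (variable)
  | Lam Name Term   -- λx.M    (abstraction)
  | App Term Term   -- M N     (application)
  deriving Show
```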
Few definitions
λx.M
x is a bound variable
Few definitions
λx.λy.(x y x) ≡ λz.λw.(z w z)
alpha equivalence
Few definitions
λx.M ≡ λy.M {y/x} alpha conversion
Few definitions
λx.(x y)
y is a free variable
Beta reduction
● Applying an argument to a function.
● Consider it as a single computation step.
(λx.N) M ➝ N{M/x}
Beta reduction - example
(λx.λy.x) z w ➝
(λy.z) w ➝
z
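For intuition (my addition, not a slide): the same two reduction steps happen when GHC evaluates the corresponding Haskell lambda, which is just const applied to two arbitrary arguments.

```haskell
-- (λx.λy.x) z w ➝ (λy.z) w ➝ z
example :: String
example = (\x -> \y -> x) "z" "w"   -- evaluates to "z"
```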
Is that it?
L, M, N ::= x
λx.M
M N
(λx.N) M ➝ N{M/x}
Encodings
● support for multi-argument functions
  λ(x,y).M ≡ λx.λy.M
  ○ Moses Schönfinkel
  ○ Haskell Curry
● boolean values
  True = λt.λf.t
  False = λt.λf.f
  If = λl.λm.λn. l m n
  And = λb.λc. b c False
● Church numerals
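A hedged Haskell rendering of the Church encodings above; the type synonyms and helper names (ChurchBool, churchIf, toInt, …) are mine, not the talk's.

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church booleans: a boolean is a two-way choice.
type ChurchBool = forall a. a -> a -> a

true, false :: ChurchBool
true  t _ = t              -- λt.λf.t
false _ f = f              -- λt.λf.f

churchIf :: ChurchBool -> a -> a -> a
churchIf l m n = l m n     -- If = λl.λm.λn. l m n

churchAnd :: ChurchBool -> ChurchBool -> ChurchBool
churchAnd b c = b c false  -- And = λb.λc. b c False

-- Church numerals: the numeral n applies a function n times.
type ChurchNum = forall a. (a -> a) -> a -> a

zero, one :: ChurchNum
zero _ x = x
one  f x = f x

toInt :: ChurchNum -> Int
toInt n = n (+1) 0         -- e.g. toInt one == 1
```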
Is there a calculus for distributed computing?
“The inevitability of the lambda-calculus arises
from the fact that the only way to observe a
functional computation is to watch which output
values it yields when presented with different
input values” [1]
“Unfortunately, the world of concurrent
computation is not so orderly. Different notions of
what can be observed may be appropriate in
different circumstances, giving rise to different
definitions of when two concurrent systems have
‘the same behavior’ ” [1]
CCS & CSP
CCS - Calculus of Communicating Systems [3]
CSP - Communicating Sequential Processes [4]
π-calculus
What is π-calculus?
“π-calculus is a model of computation for
concurrent systems.” [2]
“In lambda-calculus everything is a function (...)
In the pi-calculus every expression denotes a
process - a free-standing computational activity,
running in parallel with other processes.” [1]
“Two processes can interact by exchanging a
message on a channel” [1]
“It lets you represent processes, parallel
composition of processes, synchronous
communication between processes through
channels, creation of fresh channels, replication of
processes, and nondeterminism.” [2]
Syntax
P, Q, R ::= 0          inert process
            x(y).P     input prefix
            x̄⟨y⟩.P     output prefix
            P | Q      parallel composition
            (νx)P      restriction
            !P         replication
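A sketch of this grammar as a Haskell data type (my addition, not from the slides; names as Strings are an assumption).

```haskell
type Name = String

-- The π-calculus grammar, one constructor per production.
data Pi
  = Nil                  -- 0
  | Input  Name Name Pi  -- x(y).P
  | Output Name Name Pi  -- x̄⟨y⟩.P
  | Par Pi Pi            -- P | Q
  | New Name Pi          -- (νx)P
  | Repl Pi              -- !P
  deriving (Show, Eq)
```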
Input and output example
Model a program that runs a single process P, which sends the message hello on channel x and then receives a message msg on the same channel x.

P ::= x̄⟨hello⟩.x(msg).0

[Diagram: process P, its channel x, and the messages hello and msg flowing over x.]

x̄⟨hello⟩ — this is a synchronous call: it waits until somebody receives it.
x(msg) — waits for a message msg on channel x before continuing.
How to pronounce ν?
lower-case: ν
sound: Nee
greek name: Νι
(νx)P restriction binds a variable
P ::= ȳ⟨x⟩.0
Q ::= (νx)P, i.e. (νx)(ȳ⟨x⟩.0)
(νx)ȳ⟨x⟩.0 ≡ (νz)ȳ⟨z⟩.0
Input and output example revisited
P ::= x̄⟨hello⟩
Q ::= x(msg)
R ::= (νx)(P | Q)

[Diagram: inside R, the processes P and Q share the restricted channel x; P's hello is received by Q as msg, and both reduce to 0.]
Structural Congruence
“Two processes are structurally congruent, if they
are identical up to structure.” [5]
P | Q ≡ Q | P commutativity of parallel composition
(P | Q) | R ≡ P | (Q | R) associativity of parallel composition
((νx)P) | Q ≡ (νx)(P | Q) “scope extrusion”
!P ≡ P | !P replication
(νx)(νy)P ≡ (νy)(νx)P restriction
Reduction rules ⟶
Think of it as an operational semantics.
P ⟶ P’ represents a single computation step
x̄⟨y⟩.P | x(z).Q → P | [y/z]Q        communication
P | R → Q | R if P → Q               reduction under |
(νx)P → (νx)Q if P → Q               reduction under ν
P → Q if P ≡ P' → Q' ≡ Q             structural congruence
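A rough sketch (mine, not the talk's) of the communication rule over the Pi data type from the earlier sketch; the substitution is naive (no capture avoidance), which is enough for the closed examples that follow.

```haskell
-- [y/z]Q: replace free occurrences of the name z by y in Q.
subst :: Name -> Name -> Pi -> Pi
subst y z = go
  where
    rn n = if n == z then y else n
    go Nil             = Nil
    go (Input a b p)
      | b == z         = Input (rn a) b p        -- b rebinds z: stop here
      | otherwise      = Input (rn a) b (go p)
    go (Output a b p)  = Output (rn a) (rn b) (go p)
    go (Par p q)       = Par (go p) (go q)
    go (New a p)
      | a == z         = New a p                 -- z is shadowed by (νa)
      | otherwise      = New a (go p)
    go (Repl p)        = Repl (go p)

-- One communication step between two parallel components:
-- x̄⟨y⟩.P | x(z).Q → P | [y/z]Q
step :: Pi -> Maybe Pi
step (Par (Output x y p) (Input x' z q))
  | x == x' = Just (Par p (subst y z q))
step _ = Nothing
```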
Some examples(1) - PingPong
PING ::= x̄⟨ping⟩.x(pong)
PONG ::= x(ping).x̄⟨pong⟩
P ::= PING | PONG
P ::= x̄⟨ping⟩.x(pong) | x(ping).x̄⟨pong⟩
x̄⟨y⟩.P | x(z).Q → P | [y/z]Q        communication
P ::= x(pong) | x̄⟨pong⟩
x̄⟨y⟩.P | x(z).Q → P | [y/z]Q        communication
P ::= 0 | 0
P | 0 ≡ P        missing equivalence?
P | 0 ≡ P        wikipedia equivalence
P ::= 0
Some examples(0) - Sending channels
P ::= (νy)x̄⟨y⟩.ȳ⟨w⟩.ȳ⟨z⟩
Q ::= x(y).y(h).y(h)
R ::= (νx)(P | Q)
R ::= (νx)((νy)x̄⟨y⟩.ȳ⟨w⟩.ȳ⟨z⟩ | x(y).y(h).y(h))
x̄⟨y⟩.P | x(z).Q → P | [y/z]Q        communication (after scope extrusion of (νy))
R ::= (νx)(νy)(ȳ⟨w⟩.ȳ⟨z⟩ | y(h).y(h))
x̄⟨y⟩.P | x(z).Q → P | [y/z]Q        communication
R ::= (νx)(νy)(ȳ⟨z⟩ | y(h))
x̄⟨y⟩.P | x(z).Q → P | [y/z]Q        communication
R ::= (νx)(νy)(0 | 0)
R ::= 0
Some examples(2) - PingPong
PING ::= x̄⟨ping⟩.x(pong)
PONG ::= x(ping).x̄⟨pong⟩
P ::= PING | PONG | PONG
P ::= x̄⟨ping⟩.x(pong) | x(ping).x̄⟨pong⟩ | x(ping).x̄⟨pong⟩
P ::= x(pong) | x(ping).x̄⟨pong⟩ | x̄⟨pong⟩
P ::= 0 | x(ping).x̄⟨pong⟩ | 0
Some examples(3) - PingPong
PING ::= x̄⟨ping⟩.x(pong)
PONG ::= x(ping).x̄⟨pong⟩
P ::= !PING | !PONG
!P ≡ P | !P        replication
P ::= PING | !PING | PONG | !PONG
P ::= 0 | !PING | 0 | !PONG
P ::= 0 | PING | !PING | 0 | PONG | !PONG
P ::= 0 | 0 | !PING | 0 | 0 | !PONG
Some examples(4) - Race Conditions
P ::= x̄⟨y⟩ | x̄⟨z⟩ | x(w).w(v)
P ::= 0 | x̄⟨z⟩ | y(v)        or        P ::= x̄⟨y⟩ | 0 | z(v)
P ::= x̄⟨z⟩ | y(v)            or        P ::= x̄⟨y⟩ | z(v)
Two processes P and Q are bisimilar if ...
“Two processes P and Q are bisimilar if every
action of one can be matched by a corresponding
action of the other to reach bisimilar state” [1]
Benefits
● reason about computation
● detect deadlocks
● detect non-determinism
● reason about structure
● good for describing protocols
● a formal framework for providing semantics for a high-level language
distributed-process
Revisited
Typed channels
data SendPort a
data ReceivePort a
newChan :: Serializable a => Process (SendPort a, ReceivePort a)
sendChan :: Serializable a => SendPort a -> a -> Process ()
receiveChan :: Serializable a => ReceivePort a -> Process a
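A minimal sketch (not from the slides) of the typed-channel API in use: one process creates a channel, a locally spawned process receives on the ReceivePort, and the parent sends on the SendPort.

```haskell
import Control.Distributed.Process

-- Create a typed channel, receive on it in a locally spawned
-- process, and send on it from the parent process.
pingOverChannel :: Process ()
pingOverChannel = do
  (sendPort, recvPort) <- newChan      -- inferred as a channel of String
  _ <- spawnLocal $ do
         msg <- receiveChan recvPort
         say ("got: " ++ msg)
  sendChan sendPort "ping"
```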
Example
MASTER ::= (νx)(c̄⟨x⟩.c̄⟨x⟩.c(done) | PING | PONG)
PING ::= c(x).x̄⟨ping⟩.x(pong).c̄⟨done⟩
PONG ::= c(x).x(ping).x̄⟨pong⟩
Implementation issue...
“This seems quite natural.(...).But there’s a big problem here.
ReceivePorts are not Serializable, which prevents us passing
the ReceivePort r1 to the spawned process. GHC will reject the
program with a type error.” [8]
“Why are ReceivePorts not Serializable? If you think about it a
bit, this makes a lot of sense. If a process were allowed to send
a ReceivePort somewhere else, the implementation would have
to deal with two things: routing messages to the correct
destination when a ReceivePort has been forwarded (possibly
multiple times), and routing messages to multiple destinations,
because sending a ReceivePort would create a new copy.” [8]
“This would introduce a vast amount of complexity to the
implementation, and it is not at all clear that it is a good feature
to allow. So the remote framework explicitly disallows it,
which fortunately can be done using Haskell’s type system.”
[8]
Issues
● synchronous by nature
● receiving on a send channel - implementation dilemmas
● notion of creating a named channel
async π-calculus
Chemical Abstract Machine
“The chemical abstract machine”, G. Berry, G. Boudol
Sync π syntax
P, Q, R ::= 0          inert process
            x(y).P     input prefix
            x̄⟨y⟩.P     output prefix
            P | Q      parallel composition
            (νx)P      restriction
            !P         replication
Async π syntax
P, Q, R ::= 0          inert process
            x(y).P     input prefix
            x̄⟨y⟩       output (no continuation)
            P | Q      parallel composition
            (νx)P      restriction
            !P         replication
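In the Haskell AST sketch from earlier, the only change for the asynchronous calculus is that output loses its continuation (again my sketch, with names as Strings).

```haskell
type Name = String

-- Async π: identical to the earlier Pi sketch except for output.
data AsyncPi
  = ANil                      -- 0
  | AInput Name Name AsyncPi  -- x(y).P
  | AOutput Name Name         -- x̄⟨y⟩, no ".P" to run afterwards
  | APar AsyncPi AsyncPi      -- P | Q
  | ANew Name AsyncPi         -- (νx)P
  | ARepl AsyncPi             -- !P
  deriving (Show, Eq)
```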
Benefits
● almost all the expressive power of the classical π-calculus
● models asynchronous computation
Issues
● Is not a closed theory
● Heating / cooling rules seem like overkill
● Not as powerful as the synchronous version
  ○ “Comparing the expressive power of the synchronous and asynchronous pi-calculus”, Catuscia Palamidessi
ρ-calculus
“The π-calculus is not a closed theory, but rather
a theory dependent upon some theory of names.
(...) names may be tcp/ip ports or urls or object
references, etc. But, foundationally, one might ask
if there is a closed theory of processes, i.e. one in
which the theory of names arises from and is
wholly determined by the theory of processes.” [7]
Quoting
“Here we present a theory of an asynchronous
message-passing calculus built on a notion of
quoting. Names are quoted processes, and as such
represent the code of a process. (...) Name-passing,
then becomes a way of passing the code of a
process as a message.” [7]
Syntax
P, Q ::= 0 inert process
x(y).P input
⌝x⌜ drop
x⦉P⦊ lift
P | Q parallel
x, y ::= ⌜P⌝ quote
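A sketch (my own, not from the talk) of this grammar in Haskell; the striking part is that names and processes are mutually recursive, because a name is nothing but a quoted process.

```haskell
-- ρ-calculus processes and names, mutually recursive.
data Rho
  = RNil                    -- 0
  | RInput RName RName Rho  -- x(y).P
  | RDrop RName             -- ⌝x⌜, run the process the name quotes
  | RLift RName Rho         -- x⦉P⦊, make ⌜P⌝ available as output on x
  | RPar Rho Rho            -- P | Q
  deriving (Show, Eq)

newtype RName = Quote Rho   -- x, y ::= ⌜P⌝
  deriving (Show, Eq)

-- The simplest possible name: the quote of the inert process, ⌜0⌝.
nilName :: RName
nilName = Quote RNil
```

Note that the derived (==) here is purely syntactic; the name equivalence discussed below is coarser, since it also identifies quote-drop round trips and structurally congruent processes.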
x⦉P⦊
“Process P will be packaged up as its code, ⌜P⌝,
and ultimately made available as an output at the
port x” [7]
“The lift operator turns out to play a role
analogous to (νx)”
⌝x⌜
“The ⌝x⌜ operator (...) eventually extracts the
process from a name. We say ‘eventually’ because
this extraction only happens when quoted process
is substituted into this expression.” [7]
“A consequence of this behaviour is that the ⌝x⌜ is
inert, except under an input prefix” [7]
x⟨y⟩ - syntactic sugar (output)
x⟨y⟩ ≜ x⦉⌝y⌜⦊
How to even create a single name?
Name game!
x ::= ⌜0⌝
y ::= ⌜⌜0⌝⟨⌜0⌝⟩⌝          // ⌜x⟨x⟩⌝, a quoted output
z ::= ⌜⌜0⌝(⌜0⌝).0⌝        // ⌜x(x).0⌝, a quoted input
q ::= ⌜0 | 0⌝
p ::= ⌜0 | 0 | 0⌝
Are those different names?
x ::= ⌜0⌝
q ::= ⌜0 | 0 ⌝
p ::= ⌜0 | 0 | 0 ⌝
“This question leads to several intriguing and
apparently fundamental questions. Firstly, if
names have structure, what is a reasonable notion
of equality on names? How much computation, and
of what kind, should go into ascertaining equality
on names?” [7]
Structural congruence
P | 0 ≡ P ≡ 0 | P
P | Q ≡ Q | P
(P | Q) | R ≡ P | (Q | R)
alpha-equivalence
x(z).w⦉y⟨z⟩⦊ ≡ x(v).w⦉y⟨v⟩⦊        (renaming the bound name z to v)
Name equivalence
⌜⌝x⌜⌝ ≡ x                    quote-drop
P ≡ Q ➝ ⌜P⌝ ≡ ⌜Q⌝            struct-equiv
Wait, what?
⌜⌝x⌜⌝ ≡ x
P ≡ Q ➝ ⌜P⌝ ≡ ⌜Q⌝
P | 0 ≡ P ≡ 0 | P
P | Q ≡ Q | P
(P | Q) | R ≡ P | (Q | R)
Name equivalence is defined via structural congruence, and structural congruence (applied to quoted processes) feeds back into name equivalence.
“If you made them and they made you...”
It all works out...
Operational Semantics
x₀ ≡ x₁ then x₀⦉Q⦊ | x₁(y).P ➝ P{⌜Q⌝/y}
P ➝ P' then P | Q ➝ P' | Q
P ≡ P' and P' ➝ Q' and Q' ≡ Q then P ➝ Q
x⟨y⟩ - syntactic sugar
x⟨y⟩ ≜ x⦉⌝y⌜⦊
x⟨y⟩ - syntactic sugar proof!
x⟨z⟩ | x(y).P ≜
x⦉⌝z⌜⦊ | x(y).P ➝
P {⌜⌝z⌜⌝/y} ≡
P {z/y}
What we have not covered?
● RHO-lang
● Bisimulation
References
[1] “Foundational Calculi for Programming Languages”, Benjamin C. Pierce
[2] “FAQ on π-Calculus”, Jeannette M. Wing
[3] “A Calculus of Communicating Systems”, Robin Milner
[4] “Communicating Sequential Processes”, C.A.R. Hoare
[5] https://en.wikipedia.org/wiki/%CE%A0-calculus
[6] “The Polyadic π-Calculus: a Tutorial”, Robin Milner
[7] “A Reflective Higher-Order Calculus”, L.G. Meredith, Matthias Radestock
[8] “Parallel and Concurrent Programming in Haskell: Techniques for Multicore and Multithreaded Programming”, Simon Marlow
THE END
Pawel Szulc
@rabbitonweb
jobs@pyrofex.net