Transcript

  • 1. On the Verification of Cryptographic Protocols
    Federico Cerutti
    Dipartimento di Ingegneria dell'Informazione, Università di Brescia
    Via Branze 38, I-25123 Brescia, Italy
    January 11, 2011
    Talk at Prof. Giuzzi's Course: "Algebra for coding theory and cryptography"
    © 2010 Federico Cerutti <federico.cerutti@ing.unibs.it>
  • 2. From Cryptography to Protocols
    Cryptography hides information: send or receive messages in encrypted form.
    Cryptographic functions can be put to use in more creative ways to design security protocols.
    [Needham and Schroeder, 1978]'s security goals:
    - Establishment of authenticated interactive communication (greeting messages over the phone)
    - One-way authentication (useful when both parties cannot be available at the same time)
    - Signed communication (non-repudiation: the recipient can prove to a third party that the message originated from the claimed sender)
  • 3. Needham-Schroeder Protocol (NS) [Needham and Schroeder, 1978]
    Notation:
    - Principals (e.g. A, B)
    - Encryption keys: shared keys (e.g. Kab), public keys (e.g. Ka), and the corresponding private keys (e.g. Ka^-1)
    - Statements or formulae (e.g. Na)
    Assumption: both parties know S's public key.
    Message 1: A → S: A, B
    Message 2: S → A: {Kb, B}Ks^-1
    Message 3: A → B: {Na, A}Kb
    Message 4: B → S: B, A
    Message 5: S → B: {Ka, A}Ks^-1
    Message 6: B → A: {Na, Nb}Ka
    Message 7: A → B: {Nb}Kb
  • 4. Important issues in designing authentication protocols
    The adversary attacking a network protocol can actively interfere (interposition, eavesdropping, alteration, dropping, duplication, rerouting, rearranging messages).
    Encryption is a basic tool for the construction of protocols, but it needs to be used properly in order to achieve the security goal.
    The security goal can (and should) be defined independently of the underlying cryptographic techniques (symmetric/asymmetric cryptography, ...) used in the solution.
    Recognize the need for techniques for the verification and analysis of security protocols.
    "Protocols such as those developed here are prone to extremely subtle errors that are unlikely to be detected in normal operation. The need for techniques to verify the correctness of such protocols is great, and we encourage those interested in such problems to consider this area." [Needham and Schroeder, 1978]
  • 5. Dolev, Yao
    Burrows, Abadi, Needham
    Delgrande, Grote, Hunter
  • 6. The first attempt: the Dolev-Yao (DY) model [Dolev and Yao, 1981]
    Very constrained: it does not allow one to describe many interesting protocols.
    But:
    - It is the first paper providing a formal model
    - The adversarial model is still quite general
    - It shows that by restricting the class of target protocols one can obtain interesting algorithmic results
    Important requirements:
    - A precise language for the description of protocols
    - A formal execution model that describes how the protocol is run, possibly in the presence of an adversary; this includes a description of the adversary's capabilities
    - A formal language for specifying the desired security goals
  • 7. DY Assumptions
    A perfect public key system, where:
    - the one-way functions used are unbreakable
    - the public directory is secure and cannot be tampered with
    - everyone has access to all public keys
    - the private keys are known only by the legitimate user
    In a two-party protocol, only the two users who wish to communicate are involved in the transmission process.
    Only "active" eavesdroppers, i.e. someone who first taps the communication line to obtain messages and then tries everything he can in order to discover the plaintext. A saboteur:
    - can obtain any message passing through the network
    - is a legitimate user of the network (he can initiate a conversation with any other user)
    - will have the opportunity to be a receiver to any user
  • 8. DY sketched
    Let Σ be a finite alphabet of operator symbols and Σ* be the set of all finite strings over Σ, including the empty word λ.
    Σ = {d} ∪ {E_X, D_X, i_X, d_X | X is a user name}
    Reduction rules:  E_X D_X → λ,  D_X E_X → λ,  d_X i_X → λ,  d i_X → λ
    A ping-pong protocol P(S, R) between the users S and R is a sequence of strings α1, α2, ..., αl such that αi ∈ Σ_S* if i is odd and αi ∈ Σ_R* otherwise. [Dolev et al., 1982]
    Δ = (Σ_Z ∪ {αi(X, Y) | 1 ≤ i ≤ l, X and Y are different users})* is the set of operator words available to the adversary Z.
    Let α1(S, R) be the first word of P(S, R) and Z an adversary. P is insecure if there exists a string γ ∈ Δ such that γα1 = λ.
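    A small worked illustration (mine, not on the original slide) of how the reduction rules cancel a word: take α1(A, B) = E_B i_A E_B, i.e. (reading right to left) encrypt for B, append A's name, encrypt for B again. The legitimate receiver B, who holds D_B, can apply the word D_B d_A D_B, and the concatenation collapses by the reduction rules:
        D_B d_A D_B E_B i_A E_B  →  D_B d_A i_A E_B  →  D_B E_B  →  λ
    This is just B decrypting the message; the protocol is insecure only if an adversary Z can reach λ using a word γ built solely from the operators and protocol words in Δ.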
  • 9. DY Executed
    Example ([Stamer, 2005])
    1. A → B: {{m}Kb, A}Kb,  α1(A, B) = E_B i_A E_B
    2. B → A: {{m}Ka, B}Ka,  α2(A, B) = E_A i_B E_A D_B d_A D_B
    The adversary Z can build a word γ ∈ Δ, from its own operators (D_Z, d_Z, d, E_Z, i_Z) and instances of α2 obtained in parallel sessions, such that γα1 = λ; hence the protocol is insecure (the concrete attack is spelled out on the next slide).
    Theorem ([Dolev et al., 1982])
    For "ping-pong" protocols there exists a security-checking algorithm whose inputs are the generic cancellation rules and the protocol. The time complexity is O(n³), where n is the length of the input.
  • 10. DY explained [Stamer, 2005]
    1.1 A → B:   {{m}Kb, A}Kb
    1.2 B → Z/A: {{m}Ka, B}Ka
    2.1 Z → A:   {{{m}Ka, B}Ka, Z}Ka
    2.2 A → Z:   {{{m}Ka, B}Kz, A}Kz
    3.1 Z → A:   {{m}Ka, Z}Ka
    3.2 A → Z:   {{m}Kz, Z}Kz
    At the end, Z knows m.
  • 11. DY Conclusions
    - First paper that provides a formal model
    - Protocol restrictions are mostly on honest participants and the protection goal
    - The adversarial model is still quite general
    Theoretical results:
    - Simple security characterization for a subclass of protocols
    - Security problem decidable in polynomial time
    But:
    - Not general enough
    - Difficult to extend
    - It does not consider the evolution of the message exchange
  • 12. Dolev, Yao
    Burrows, Abadi, Needham
    Delgrande, Grote, Hunter
  • 13. A Logic of Authentication (BAN) [Burrows et al., 1990]
    A more general approach.
    Goal of authentication (informally and imprecisely): after authentication, two principals (people, computers, services) should be entitled to believe that they are communicating with each other and not with intruders.
    BAN helps in answering the following questions:
    - Does this protocol work? Can it be made to work?
    - Exactly what does this protocol achieve?
    - Does this protocol need more assumptions than another protocol?
    - Does this protocol do anything unnecessary?
    BAN assumes totally correct implementations of protocols and cryptosystems.
    BAN focuses on the beliefs of trustworthy parties involved in the protocols and on the logical evolution of these beliefs as a consequence of communication.
  • 14. What is Logic
    A way of formalizing knowledge through sentences...
    - The sentences have to be expressed according to the syntax of the representation language.
    - We need the semantics of the language, which defines the truth of each sentence with respect to each possible world (or model).
    ... and of reasoning over this knowledge:
    - Entailment relation: α |= β means that the sentence α entails the sentence β, i.e. in every model in which α is true, β is also true.
  • 15. Logic is ancient
    All men are mortal, Socrates is a man, therefore Socrates is mortal.
    Modus ponens (MP): from man(X) and man(X) → mortal(X), infer mortal(X).
    man(Socrates), man(X) → mortal(X) |= mortal(Socrates)
    In every model (or every possible world) where Socrates actually is a man, Socrates actually is mortal.
    Truth table of the implication:
    man(X)   mortal(X)   man(X) → mortal(X)
      T         T               T
      T         F               F
      F         T               T
      F         F               T
  • 16. Formal theory [Mendelson, 2010]
    Definition (Conditions for L being a formal theory)
    1. A countable set of symbols is given as the symbols of L; every finite sequence of its symbols is called an expression of L.
    2. There is a subset of the set of expressions of L called the set of well-formed formulas (wfs) of L. There is usually an effective procedure to determine whether a given expression is a wf.
    3. There is a set of wfs called the set of axioms of L. If one can effectively decide whether a given wf is an axiom, then L is called an axiomatic theory.
    4. There is a finite set R1, ..., Rn of relations among wfs, called rules of inference. For each Ri, there is a unique positive integer j such that, for every set of j wfs and each wf B, one can effectively decide whether the given j wfs are in the relation Ri to B; if so, B is said to follow from, or to be a direct consequence of, the given wfs by virtue of Ri.
  • 17. Formal theory [Mendelson, 2010]
    Definition
    A proof in a formal theory L is a sequence B1, ..., Bj of wfs such that, for each i, either Bi is an axiom of L or Bi is a direct consequence of some of the preceding wfs in the sequence by virtue of one of the rules of inference of L.
    Definition
    A theorem of L is a wf B of L such that B is the last wf of some proof in L. Such a proof is called a proof of B in L.
  • 18. Formal theory [Mendelson, 2010]
    Definition
    A wf C is said to be a consequence in L of a set Γ of wfs if and only if there is a sequence B1, ..., Bk of wfs such that C is Bk and, for each i, either Bi is an axiom, or Bi is in Γ, or Bi is a direct consequence, by some rule of inference, of some of the preceding wfs in the sequence. Such a sequence is called a proof (or a deduction) of C from Γ. The members of Γ are called the hypotheses or premises of the proof. We use Γ ⊢ C as an abbreviation for "C is a consequence of Γ".
    Simple properties of the notion of consequence:
    1. If Γ ⊆ Δ and Γ ⊢ C, then Δ ⊢ C;
    2. Γ ⊢ C if and only if there is a finite subset Δ of Γ such that Δ ⊢ C;
    3. If Δ ⊢ C, and for each B in Δ, Γ ⊢ B, then Γ ⊢ C.
  • 19. Formal axiomatic theory L1 [Mendelson, 2010, Kleene, 1952]
    1. The symbols of L1 are ¬, ⇒, (, ), ∧, ∨, and the letters Ai with positive integers i as subscripts: A1, A2, A3, ... The symbols ¬ and ⇒ are called primitive connectives, and the letters Ai are called statement letters.
    2. a) All statement letters are well-formed formulas (wfs).
       b) If B and C are wfs, then so are (¬B), (B ⇒ C), (B ∧ C), and (B ∨ C).
       c) C is a wf if and only if there is a finite sequence B1, ..., Bn (n ≥ 1) such that Bn = C and, for 1 ≤ i ≤ n, Bi is either a statement letter, or a negation (Bi = (¬Bk), k < i), or a conditional (Bi = (Bj ⇒ Bl), j < i, l < i), or a conjunction, or a disjunction constructed from previous expressions in the sequence.
  • 20. Formal axiomatic theory L1 [Mendelson, 2010, Kleene, 1952]
    1. If B, C and D are wfs of L1, then the following are axioms of L1:
       A1  B ⇒ (C ⇒ B)
       A2  (B ⇒ (C ⇒ D)) ⇒ ((B ⇒ C) ⇒ (B ⇒ D))
       A3  (B ∧ C) ⇒ B
       A4  (B ∧ C) ⇒ C
       A5  B ⇒ (C ⇒ (B ∧ C))
       A6  B ⇒ (B ∨ C)
       A7  C ⇒ (B ∨ C)
       A8  (B ⇒ D) ⇒ ((C ⇒ D) ⇒ ((B ∨ C) ⇒ D))
       A9  (B ⇒ C) ⇒ ((B ⇒ ¬C) ⇒ ¬B)
       A10 ¬¬B ⇒ B
    2. The only rule of inference of L1 is modus ponens: C is a direct consequence of B and (B ⇒ C).
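    To make the proof machinery concrete, here is a standard derivation (not on the original slides) of the theorem B ⇒ B, using only A1, A2 and modus ponens:
    1. B ⇒ ((B ⇒ B) ⇒ B)                                  [A1, with C taken as B ⇒ B]
    2. (B ⇒ ((B ⇒ B) ⇒ B)) ⇒ ((B ⇒ (B ⇒ B)) ⇒ (B ⇒ B))    [A2, with C as B ⇒ B and D as B]
    3. (B ⇒ (B ⇒ B)) ⇒ (B ⇒ B)                            [MP from 1 and 2]
    4. B ⇒ (B ⇒ B)                                        [A1, with C taken as B]
    5. B ⇒ B                                              [MP from 4 and 3]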
  • 21. The BAN formalism
    Logic with principals (P, Q, R, ...), encryption keys (K, ...) and formulas or statements (X, Y, ...). Conjunction is the only propositional connective.
    - P believes X: the principal P may act as though X is true
    - P sees X: someone has sent a message containing X to P, who can read and repeat X
    - P said X: P at some time sent a message including the statement X; it is known that P believed X then
    - P controls X: P is an authority on X and should be trusted on this matter
    - fresh X: X has not been sent in a message at any time before the current run of the protocol
    - P K↔ Q: P and Q may use the shared key K to communicate
    - K→ P: P has public key K (and the corresponding private key K^-1, known only to P)
    - P X⇌ Q: the formula X is a secret known only to P and Q, and possibly to principals trusted by them
    - {X}K: the formula X is encrypted under the key K
    - <X>Y: the formula X is combined with the formula Y; it is intended that Y be a secret and that its presence prove the identity of whoever utters <X>Y
  • 22. Logical Postulates: preliminaries
    Two epochs:
    - present: begins at the start of the particular run of the protocol under consideration
    - past: all messages sent before the present are considered to be in the past and should not be accepted as recent
    All beliefs held in the present are stable for the entirety of the protocol run.
    Beliefs held in the past are not necessarily carried forward into the present.
    A message cannot be understood by a principal who does not know the key.
    The key cannot be deduced from the encrypted message.
    Each encrypted message contains sufficient redundancy to allow a principal who decrypts it to verify that he has used the right key.
  • 23. The postulates
    1. Message-meaning (public-key case):
       from P believes K→ Q and P sees {X}K^-1, infer P believes Q said X
       ...
    2. Nonce-verification:
       from P believes fresh X and P believes Q said X, infer P believes Q believes X
    3. Jurisdiction:
       from P believes Q controls X and P believes Q believes X, infer P believes X
    4. Composition: ...
    5. Freshness of composed messages with nonces: ...
  • 24. Needham-Schroeder Protocol (NS) [Needham and Schroeder, 1978]
    Notation:
    - Principals (e.g. A, B)
    - Encryption keys: shared keys (e.g. Kab), public keys (e.g. Ka), and the corresponding private keys (e.g. Ka^-1)
    - Statements or formulae (e.g. Na)
    Assumption: both parties know S's public key.
    Message 1: A → S: A, B
    Message 2: S → A: {Kb, B}Ks^-1
    Message 3: A → B: {Na, A}Kb
    Message 4: B → S: B, A
    Message 5: S → B: {Ka, A}Ks^-1
    Message 6: B → A: {Na, Nb}Ka
    Message 7: A → B: {Nb}Kb
  • 25. NS idealized
    Original                            Idealized
    Message 1  A → S: A, B              (omitted)
    Message 2  S → A: {Kb, B}Ks^-1      S → A: {Kb→ B}Ks^-1
    Message 3  A → B: {Na, A}Kb         A → B: {Na}Kb
    Message 4  B → S: B, A              (omitted)
    Message 5  S → B: {Ka, A}Ks^-1      S → B: {Ka→ A}Ks^-1
    Message 6  B → A: {Na, Nb}Ka        B → A: {<A Nb⇌ B>Na}Ka
    Message 7  A → B: {Nb}Kb            A → B: {<A Na⇌ B>Nb}Kb
    - Messages 1 and 4 do not contribute to the logical properties.
    - In message 3, Na is not known to B and thus is not being used to prove the identity of A.
    - In messages 6 and 7, Na and Nb are used as secrets.
  • 26. NS analyzed
    Initial assumptions:
    A believes Ka→ A          B believes Kb→ B          A believes Ks→ S
    B believes Ks→ S          S believes Ka→ A          S believes Kb→ B
    S believes Ks→ S
    A believes (∀K: S controls K→ B)          B believes (∀K: S controls K→ A)
    A believes fresh Na          B believes fresh Nb
    A believes A Na⇌ B          B believes A Nb⇌ B
    A believes fresh Kb→ B          B believes fresh Ka→ A
    Results of the analysis:
    A believes Kb→ B          B believes Ka→ A
    A believes B believes A Nb⇌ B          B believes A believes A Na⇌ B
  • 27. BAN Explained [Burrows et al., 1989]
    How A comes to believe Kb→ B, step by step:
    (1) Message meaning:
        from A believes Ks→ S [assumption] and A sees {Kb→ B}Ks^-1 [message 2],
        infer A believes S said Kb→ B
    (2) Nonce verification:
        from A believes fresh Kb→ B [assumption] and A believes S said Kb→ B [from (1)],
        infer A believes S believes Kb→ B
    (3) Jurisdiction:
        from A believes S controls Kb→ B [assumption] and A believes S believes Kb→ B [from (2)],
        infer A believes Kb→ B
  • 28. BAN Conclusions
    - First logic for protocol verification
    - Description of proof systems
    - BAN enables us to exhibit step by step how beliefs are built up to the point of mutual authentication
    - It can guide in identifying mistakes and in suggesting corrections
    But:
    - It lacks a semantics [Abadi and Tuttle, 1991, Mao and Boyd, 1993, Delgrande et al., 2009]
    - It is not in a pure declarative form; rather, the protocol has to be idealized
    - It cannot reveal some kinds of attacks (like reflection attacks)
  • 29. Dolev, Yao
    Burrows, Abadi, Needham
    Delgrande, Grote, Hunter
  • 30. New Idea in Logic Verification of Cryptographic Protocols [Delgrande et al., 2009]
    The standard specification of protocols is both incomplete and imprecise:
    - The goal of the protocol is not explicitly given (incomplete), therefore it is difficult to determine whether a given trace constitutes an attack.
    - It is not clear what A → B means (imprecise): it does not mean that A successfully sends the message to B; instead:
      - A intends to send the message to B
      - A actually sends the message to someone
    Not only beliefs but also intentions.
    Aim: define a logic program in which the answer sets correspond to sequences of exchanged messages between agents.
  • 31. A logic program
    Definitions:
    - It is a picture of why and how you believe a program/policy will work
    - The foundation of program planning and the key tool of program evaluation
    - A series of "if-then" relationships that, if implemented as intended, lead to the desired outcomes
    Example ([Lamperti, 2011])
      speak(al, russian).
      speak(robert, english).
      speak(mary, russian).
      speak(mary, english).
      communicate(X, Y) ← speak(X, L), speak(Y, L), X ≠ Y.
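    For readers who want to run the example, here is the same program in the ':-'/'!=' syntax accepted by ASP solvers such as clingo (and used by the DGH listings later in this talk); this is my own transcription, not part of the original slide:
      speak(al, russian).
      speak(robert, english).
      speak(mary, russian).
      speak(mary, english).
      % X and Y can communicate if they share a language and are distinct
      communicate(X, Y) :- speak(X, L), speak(Y, L), X != Y.
    Its unique answer set contains, besides the speak/2 facts, communicate(al, mary), communicate(mary, al), communicate(robert, mary), and communicate(mary, robert).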
  • 32. The Answer Set {Programming|Prolog} (ASP) in a Nutshell [Gelfond, 2008]
    Declarative programming based on the stable model semantics of logic programming [Gelfond and Lifschitz, 1988], autoepistemic logic [Moore, 1985], and default logic [Reiter, 1980].
    - Literals: p(t), ¬p(t) (t is a term)
    - Rules: l0 or ... or lk ← lk+1, ..., lm, not lm+1, ..., not ln.
      "or" is the epistemic disjunction; "not" is default negation (aka negation as failure); literals possibly preceded by default negation are called extended literals
    - Constraint: ← l0, ..., ln.
    - Fact: l0 or ... or lk.
    - Choice rules: i {p, q} j (at least i of the enclosed atoms are true, but not more than j; a small example follows this slide)
    - Conditions: p(X) : q(X) (e.g. if we have q(a), q(b), then we can instantiate p(a), p(b))
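    As a concrete illustration of choice rules and constraints (my own sketch in clingo-style syntax, where ';' separates the choice elements; the atoms are made up):
      % pick at least 1 and at most 2 of the three options
      1 { pick(a); pick(b); pick(c) } 2.
      % a and b are mutually exclusive
      :- pick(a), pick(b).
    This program has five answer sets: {pick(a)}, {pick(b)}, {pick(c)}, {pick(a), pick(c)}, and {pick(b), pick(c)}.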
  • 33. The Answer Set {Programming|Prolog} (ASP) in a Nutshell [Gelfond, 2008]
    Definition
    A program of Answer Set Prolog is a pair {σ, Π} where σ is a signature and Π is a collection of logic programming rules over σ. Given a logic program Π, the corresponding signature is denoted by σ(Π). If σ(Π) is not explicitly given, it is assumed to consist of the symbols occurring in the program.
    Terms, literals, and rules of Π are called ground if they contain no variables and no symbols for arithmetic functions. A rule r' is called a ground instance of a rule r of Π if it is obtained from r by:
    1. replacing r's non-integer variables by properly typed ground terms of σ(Π)
    2. replacing r's variables for non-negative integers by numbers
    3. replacing the remaining occurrences of numerical terms by their values
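    A tiny grounding example (my own, not from the slides): with the facts q(a). q(b)., the rule p(X) :- q(X). has exactly two ground instances:
      p(a) :- q(a).
      p(b) :- q(b).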
  • 34. ASP partial interpretation [Gelfond, 2008]
    Definition (Partial interpretation of σ)
    A set of ground literals G = {l0, ..., lk} is consistent iff there are no i, j such that li = ¬lj. Consistent sets of ground literals over σ are called partial interpretations of σ.
    Given a partial interpretation S:
    - a ground literal l (with complement ¬l) is true if l ∈ S, false if ¬l ∈ S, unknown otherwise
    - an extended literal not l is true if l ∉ S, false otherwise
    - a conjunction of extended literals U is true if all elements of U are true in S, false if at least one element of U is false in S, unknown otherwise
    - a disjunction of literals D is true if at least one of its members is true in S, false if all members of D are false in S, unknown otherwise
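    A small worked instance of these definitions (my own example): over a signature with atoms p, q, r, take S = {p, ¬q}. Then p is true, q is false, r is unknown, the extended literal "not r" is true, the conjunction "p, not r" is true, and the disjunction "q or r" is unknown (no member is true, but not all members are false).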
  • 35. ASP: negation as failure [Gelfond, 2008]
    Non-monotonic inference rule in logic programming: not p is derived from the failure to derive p.
    Example
      cross_railroad ← ¬train_approaches.
    "If you know that the train is not approaching, then cross the railroad."
      cross_railroad ← not train_approaches.
    "If you cannot establish that the train is approaching, then cross the railroad." Incomplete information!
    Example
      fly(X) ← bird(X), not abnormal(X).
      abnormal(X) ← penguin(X).
    "Each bird flies, except abnormal ones." Defaults!
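    A runnable version of the default (my own sketch; the bird names are made up):
      bird(tweety).
      bird(pingu).
      penguin(pingu).
      fly(X) :- bird(X), not abnormal(X).
      abnormal(X) :- penguin(X).
    The unique answer set contains fly(tweety) but not fly(pingu): abnormal(pingu) is derived, so the default is blocked for pingu.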
  • 36. ASP: epistemic disjunction [Gelfond, 2008]
    In classical logic, A ∨ B forbids that both A and B are false.
    In epistemic logic, A or B means that every possible set of the reasoner's beliefs must satisfy A or satisfy B.
    Example
      node(X) ← arc(X, Y).
      node(Y) ← arc(X, Y).
      color(X, red) or color(X, green) or color(X, blue) ← node(X).
    I know that an arc is between two nodes of a graph, and I know that each node has a colour that is one of the following: red, green, or blue.
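    To see the effect of the disjunction, one can feed the program a single test fact (my own input, not from the slides):
      arc(a, b).
    Under the answer set semantics this yields nine answer sets, one for each way of assigning exactly one of the three colours to a and one to b; minimality guarantees that no node receives two colours within the same answer set.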
  • 37. ASP: the answer set semantics [Gelfond, 2008]
    Definition
    A partial interpretation S of σ(Π) is an answer set for Π if S is a minimal (w.r.t. set inclusion) partial interpretation satisfying the rules of Π.
    An answer set corresponds to a possible set of beliefs which can be built by a rational reasoner on the basis of the rules of Π. In the construction of such a set S, the reasoner is assumed to be guided by the following informal principles:
    - S must satisfy the rules of Π
    - the reasoner should adhere to the rationality principle, which says that one shall not believe anything one is not forced to believe (minimality)
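    A minimal illustration of the rationality principle (my own example): the one-rule program
      p or q.
    has exactly two answer sets, {p} and {q}. The set {p, q} also satisfies the rule, but it is not minimal, so it is not an answer set.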
  • 38. An example in ASP
    Example ([Eiter, 2008])
      person(joey).
      male(X) or female(X) ← person(X).
      bachelor(X) ← male(X), not married(X).
    Two answer sets:
    1. {person(joey), male(joey), bachelor(joey)}
    2. {person(joey), female(joey)}
  • 39. The DGH approach
    Three independent modules:
    1. Protocol module: generic information about message passing and protocols
    2. Intruder module: a specification of the intruder's capabilities
    3. Instance module: the structure of a specific protocol
  • 40. The DGH approach: Protocol module
    General message-passing framework:
    - Principals' holdings, capabilities, ...
    - Keys: asymmetric and symmetric key encryption
    - Nonces or timestamps
    - Actions (e.g. send and receive)
    - Time (discrete): actions occur at some point in time (to keep the search space manageable, a single time step is used for one agent to receive a message, decrypt its contents, compose a reply, encrypt and send it)
    - Notion of appropriateness: matching of messages
    Two auxiliary predicates:
    1. talked: indicates that two agents have successfully communicated
    2. authenticated: part of the goal specification
  • 41. The DGH approach: Protocol module
      % A sends M to B at time T if it wants to and actually holds M
      send(A, B, M, T) :- wants(A, send(A, B, M), T), has(A, M, T).
      % a sent message is received one step later, unless it is intercepted
      receive(A, B, M, T+1) :- send(B, A, M, T), not intercept(M, T+1).
      % after receiving protocol step J, an agent may want to send the matching step J+1
      { wants(A, send(A, B, M), T) } 1 :- receive(A, B, M2, T),
          fit(msg(J, B, A, M2), msg(J+1, A, B, M)).
      talked(A, B, T+1) :- send(A, B, M, T), receive(B, A, M, T+1).
      % A has authenticated B at time T if A sent a fresh nonce readable only by B
      % and later received it back inside a message readable only by A
      authenticated(A, B, T) :- send(A, B, enc(M1, K1), T1),
          fresh(A, nonce(A, Na), T1),
          part_m(nonce(A, Na), M1),
          key_pair(K1, Kinv1), has(A, K1, T1), has(B, Kinv1, T1),
          not has(C, Kinv1, T1) : agent(C) : C != B,
          send(B, A, enc(M2, K2), T2),
          receive(A, B, enc(M2, K2), T),
          part_m(nonce(A, Na), M2),
          key_pair(K2, Kinv2), has(B, K2, T1), has(A, Kinv2, T1),
          not has(C, Kinv2, T1) : agent(C) : C != A,
          T1 < T2, T2 < T.
  • 42. The DGH approach: Intruder module
    This module specifies all aspects of the intruder. The intruder model is flexible and easy to modify.
    Holdings:
    - the public key of every agent
    - a public-private key pair (in order to pretend to be an honest agent)
    Capabilities:
    - intercept messages (and it receives the messages that it intercepts)
    - send messages whenever it wants to
    - fake the sender name of messages it sends
      % the intruder may intercept any sent message ...
      0 { intercept(M, T+1) } 1 :- send(A, B, M, T).
      % ... and then receives the intercepted message itself
      receive(I, A, M, T+1) :- send(A, B, M, T), intercept(M, T+1).
      % a message sent by the intruder is received with exactly one (possibly faked) apparent sender
      1 { receive(A, B, M, T+1) : principal(B) } 1 :- send(I, A, M, T).
  • 43. The DGH approach: Instance module
    Messages (NS simplified case):
    1: A → B: {Na, A}Kb
    2: B → A: {Na, Nb}Ka
    3: A → B: {Nb}Kb
      msg(1, A, B, enc(m(nonce(C, Na), principal(A)), pub_key(B))).
      msg(2, B, A, enc(m(nonce(C, Na), nonce(D, Nb)), pub_key(A))).
      msg(3, A, B, enc(m(nonce(D, Nb)), pub_key(B))).
  • 44. The DGH approach: Instance module
    Goals and attacks
    Goals: both principals should believe that the other one is authenticated, and they actually are authenticated.
      goal(A, B, T) :- authenticated(A, B, T),
          believes(A, authenticated(A, B), T),
          authenticated(B, A, T),
          believes(B, authenticated(B, A), T).
    Attacks: a run of the protocol that causes an agent to believe the goal is true, when in fact the goal is not true.
      attack :- believes(A, authenticated(A, B), T),
          not authenticated(A, B, T).
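    In this encoding, any answer set containing attack is a counterexample: a legal sequence of send/receive actions in which some agent believes it has authenticated its peer although the authenticated predicate does not actually hold. The trace on the next slide is exactly such an answer set.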
  • 45. DGH Executed
      send(a, i, enc(m(nonce(a, 1), principal(a)), pub_key(i)), 0)
      receive(i, a, enc(m(nonce(a, 1), principal(a)), pub_key(i)), 1)
      send(i, b, enc(m(nonce(a, 1), principal(a)), pub_key(b)), 1)
      receive(b, a, enc(m(nonce(a, 1), principal(a)), pub_key(b)), 2)
      send(b, a, enc(m(nonce(a, 1), nonce(b, 1)), pub_key(a)), 2)
      receive(i, b, enc(m(nonce(a, 1), nonce(b, 1)), pub_key(a)), 3)
      send(i, a, enc(m(nonce(a, 1), nonce(b, 1)), pub_key(a)), 3)
      authenticated(a, i, 4)
      believes(a, authenticated(a, i), 4)
      believes(a, completed(a, i), 4)
      receive(a, i, enc(m(nonce(a, 1), nonce(b, 1)), pub_key(a)), 4)
      send(a, i, enc(m(nonce(b, 1)), pub_key(i)), 4)
      authenticated(a, i, 5)
      believes(a, authenticated(a, i), 5)
      believes(a, completed(a, i), 5)
      receive(i, a, enc(m(nonce(b, 1)), pub_key(i)), 5)
      send(i, b, enc(m(nonce(b, 1)), pub_key(b)), 5)
      authenticated(a, i, 6)
      believes(a, authenticated(a, i), 6)
      believes(a, completed(a, i), 6)
      believes(b, authenticated(b, a), 6)
      believes(b, completed(b, a), 6)
      receive(b, a, enc(m(nonce(b, 1)), pub_key(b)), 6)
  • 46. DGH Explained (reflection attack)
    Two parallel protocol runs:
    1.1 A → I:    {Na, A}Ki
    2.1 I(A) → B: {Na, A}Kb
    2.2 B → I(A): {Na, Nb}Ka
    1.2 I → A:    {Na, Nb}Ka
    1.3 A → I:    {Nb}Ki
    2.3 I(A) → B: {Nb}Kb
    I knows Nb.
  • 47. Conclusions
    Cryptography is not enough: protocols use cryptography to reach goals like authentication, and the verification of protocols is still an open challenge.
    DY '80:
    - First attempt
    - Computable algorithm
    BAN Logic '90:
    - Specific logic for authentication
    - Focused on beliefs
    DGH 2009:
    - "General purpose" logic: reuse of results in ASP theory
    - Deals with both beliefs and intentions
    More in [Chen et al., 2008]
  • 48. References I
    [Abadi and Tuttle, 1991] Abadi, M. and Tuttle, M. R. (1991). A semantics for a logic of authentication (extended abstract). In Proceedings of the Tenth Annual ACM Symposium on Principles of Distributed Computing, PODC '91, pages 201-216, New York, NY, USA. ACM.
    [Bornat and Sufrin, 2008] Bornat, R. and Sufrin, B. (2008). Jape. http://jape.comlab.ox.ac.uk:8080/jape/.
    [Burrows et al., 1990] Burrows, M., Abadi, M., and Needham, R. (1990). A logic of authentication. ACM Trans. Comput. Syst., 8:18-36.
    [Burrows et al., 1989] Burrows, M., Abadi, M., and Needham, R. M. (1989). A logic of authentication, Rep. 39. Technical report, Digital Equipment Corporation Systems Research Center, Palo Alto, Calif.
    [Chen et al., 2008] Chen, Q., Zhang, C., and Zhang, S. (2008). Secure Transaction Protocol Analysis: Models and Applications. Springer-Verlag, Berlin, Heidelberg.
    [Delgrande et al., 2009] Delgrande, J., Grote, T., and Hunter, A. (2009). A general approach to the verification of cryptographic protocols using answer set programming. In Erdem, E., Lin, F., and Schaub, T., editors, Logic Programming and Nonmonotonic Reasoning, volume 5753 of Lecture Notes in Computer Science, pages 355-367. Springer Berlin / Heidelberg.
  • 49. References II
    [Dolev et al., 1982] Dolev, D., Even, S., and Karp, R. (1982). On the security of ping-pong protocols. Information and Control, 55(1-3):57-68.
    [Dolev and Yao, 1981] Dolev, D. and Yao, A. C. (1981). On the security of public key protocols. Technical report, Stanford, CA, USA.
    [Dolev and Yao, 1983] Dolev, D. and Yao, A. C. (1983). On the security of public key protocols. IEEE Transactions on Information Theory, 29(2):198-208.
    [Eiter, 2008] Eiter, T. (2008). Answer set programming in a nutshell. http://gradlog.informatik.uni-freiburg.de/gradlog/slides_ak/eiter_asp.pdf.
    [Gelfond, 2008] Gelfond, M. (2008). Chapter 7: Answer sets. In van Harmelen, F., Lifschitz, V., and Porter, B., editors, Handbook of Knowledge Representation, volume 3 of Foundations of Artificial Intelligence, pages 285-316. Elsevier.
    [Gelfond and Lifschitz, 1988] Gelfond, M. and Lifschitz, V. (1988). The stable model semantics for logic programming. In ICLP/SLP, pages 1070-1080.
  • 50. References III
    [Kleene, 1952] Kleene, S. C. (1952). Introduction to Metamathematics.
    [Lamperti, 2011] Lamperti, G. (2011). Programmazione logica. http://www.ing.unibs.it/~lamperti/lp/lezioni/lp-logico.pdf.
    [Leone et al.] Leone, N., Pfeifer, G., and Faber, W. The DLV project. http://www.dbai.tuwien.ac.at/proj/dlv/.
    [Lifschitz, 2008] Lifschitz, V. (2008). What is answer set programming? In Proceedings of the AAAI Conference on Artificial Intelligence, pages 1594-1597. MIT Press.
    [Mao and Boyd, 1993] Mao, W. and Boyd, C. (1993). Towards formal analysis of security protocols. In Computer Security Foundations Workshop VI, 1993. Proceedings.
    [Mendelson, 2010] Mendelson, E. (2010). Introduction to Mathematical Logic, Fifth Edition.
    [Micciancio, 2005] Micciancio, D. (2005). CSE208, Spring 2005, Advanced cryptography: symbolic methods for security analysis. http://cseweb.ucsd.edu/classes/sp05/cse208/index.html.
  • 51. References IV
    [Moore, 1985] Moore, R. C. (1985). Semantical considerations on nonmonotonic logic. Artificial Intelligence, 25:75-94.
    [Needham and Schroeder, 1978] Needham, R. M. and Schroeder, M. D. (1978). Using encryption for authentication in large networks of computers. Communications of the ACM, 21:993-999.
    [Reiter, 1980] Reiter, R. (1980). A logic for default reasoning. Artificial Intelligence, 13(1-2):81-132.
    [Stamer, 2005] Stamer, H. (2005). Verification of cryptographic protocols. Part 1: The Dolev-Yao model and "ping-pong" protocols. http://www.theory.informatik.uni-kassel.de/~stamer/RewritingOverview_slides.pdf.