This document provides an overview of logical approaches to analyzing the security of distributed systems. It discusses cryptographic protocols, web services, and modeling tools. The document is divided into three sections. The first section describes cryptographic protocols and web services. The second section discusses tools for modeling these systems using first-order logic. The third section presents symbolic models for cryptographic protocols and a proposed model for analyzing web services security.
Fundamentals of Computational Fluid Dynamics by H. Lomax, T. Pulliam, and D. Zingg (Rohit Bapat)
This document provides an overview of computational fluid dynamics (CFD) and summarizes its key steps and concepts. It discusses the fundamentals of CFD, including conservation laws, governing equations, finite difference approximations, semi-discrete and finite volume methods, and time-marching algorithms. The document is intended to introduce readers to the basic theory and methods in CFD for modeling fluid flow and transport phenomena.
This dissertation presents research on specialized decision algorithms for string constraints to support program analysis. It identifies a set of string constraints that captures common programming language constructs and permits efficient solving algorithms. It presents algorithms for solving regular matching assignments, concatenation-intersection problems, and general systems of subset constraints over regular languages. It also evaluates various automata data structures and algorithms to inform the design of efficient solving approaches. The goal is to provide a constraint solving interface that allows client analyses to reason about strings similarly to using a SAT solver for binary states. Experimental results show the prototype solver to be several orders of magnitude faster than competing approaches on published benchmarks.
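Reasoning over regular languages, as the solver described above must, typically builds on the product construction, which yields a DFA accepting the intersection of two regular languages. A minimal Python sketch follows; the tuple-based DFA encoding and the two example languages are illustrative, not the dissertation's actual data structures:

```python
from itertools import product

def intersect_dfas(d1, d2):
    """Product construction: a DFA accepting L(d1) ∩ L(d2).
    A DFA is encoded as (states, alphabet, delta, start, accepting)."""
    s1, alpha, t1, q1, f1 = d1
    s2, _, t2, q2, f2 = d2
    states = set(product(s1, s2))
    delta = {((a, b), c): (t1[(a, c)], t2[(b, c)])
             for (a, b) in states for c in alpha}
    accepting = {(a, b) for (a, b) in states if a in f1 and b in f2}
    return states, alpha, delta, (q1, q2), accepting

def accepts(dfa, word):
    """Run the DFA on a word and report acceptance."""
    _, _, delta, q, f = dfa
    for c in word:
        q = delta[(q, c)]
    return q in f

# L1 = strings over {a, b} with an even number of a's
even_a = ({0, 1}, {'a', 'b'},
          {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 0, (1, 'b'): 1}, 0, {0})
# L2 = strings ending in 'b'
ends_b = ({0, 1}, {'a', 'b'},
          {(0, 'a'): 0, (0, 'b'): 1, (1, 'a'): 0, (1, 'b'): 1}, 0, {1})

both = intersect_dfas(even_a, ends_b)
print(accepts(both, "aab"))   # True: two a's, ends in b
print(accepts(both, "ab"))    # False: odd number of a's
```

Emptiness of the product automaton then decides whether two regular constraints are jointly satisfiable.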
This document provides an introduction to integral calculus and demonstrates how to perform integral calculations using the computer algebra system Sage. It covers key integral calculus concepts such as the definition of the integral, Riemann sums, the Fundamental Theorem of Calculus, and techniques for evaluating integrals such as substitution, integration by parts, and trigonometric substitutions. It also discusses applications of integrals to computing areas, volumes, arc lengths, averages, and centers of mass. The document is intended as a preliminary version of an instructional text on integral calculus using Sage.
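The Riemann-sum definition mentioned above is easy to illustrate outside Sage as well. The following plain-Python sketch (not Sage code) approximates the integral of x² over [0, 1], whose exact value is 1/3, with a midpoint sum:

```python
def riemann_sum(f, a, b, n):
    """Midpoint Riemann sum: approximate the integral of f on [a, b]
    using n equal subintervals."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 1000)
print(approx)  # close to 1/3
```

Increasing n drives the sum toward the exact integral, which is the limit the text's definition formalizes.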
Reconstruction of Surfaces from Three-Dimensional Unorganized Point Sets (Robert Mencl)
This document is a dissertation written by Robert Mencl to earn a Doctor of Natural Sciences degree from the University of Dortmund. The dissertation proposes a new algorithm for reconstructing surfaces from unorganized 3D point clouds. The algorithm uses the Euclidean minimum spanning tree to create an environment graph, then incrementally constructs the surface by adding triangles while ensuring the resulting triangles satisfy necessary conditions to approximate the underlying surface. The dissertation provides detailed descriptions of the algorithm's components and theoretical analysis to prove properties like the triangles constructed will have bounded edge lengths and converge to the natural neighbor embedding of the surface.
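The Euclidean minimum spanning tree that seeds the environment graph can be sketched with Prim's algorithm. The following Python version is an illustrative O(n²) implementation for small 3D point clouds, not Mencl's actual construction:

```python
import math

def euclidean_mst(points):
    """Prim's algorithm: Euclidean minimum spanning tree over 3D points.
    Returns a list of edges (i, j) by point index."""
    n = len(points)
    in_tree = {0}
    # best[k] = (distance to nearest tree node, that node's index)
    best = {i: (math.dist(points[0], points[i]), 0) for i in range(1, n)}
    edges = []
    while best:
        i = min(best, key=lambda k: best[k][0])
        _, j = best.pop(i)
        edges.append((j, i))
        in_tree.add(i)
        for k in best:  # relax distances against the newly added node
            dk = math.dist(points[i], points[k])
            if dk < best[k][0]:
                best[k] = (dk, i)
    return edges

pts = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (0, 1, 0)]
print(euclidean_mst(pts))  # three edges spanning the four points
```

The EMST gives a connected skeleton of the point cloud, which the algorithm then extends into the environment graph before triangulation.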
Master's Thesis: A reuse repository with automated synonym support and cluster... (Laust Rud Jacobsen)
Having a code reuse repository available can be a great asset for a programmer. But locating components can be difficult if only static documentation is available, due to vocabulary mismatch. Identifying informal synonyms used in documentation can help alleviate this mismatch. The cost of creating a reuse support system is usually fairly high, as much manual effort goes into its construction.
This project has resulted in a fully functional reuse support system with clustering of search results. By automating the construction of a reuse support system from an existing code reuse repository, and giving the end user a familiar interface, the system constructed in this project makes the desired functionality available. It has an easy-to-use interface thanks to a familiar browser-based front-end. An automated method, latent semantic indexing (LSI), is used to handle synonyms and, to some degree, polysemous words in indexed components.
In the course of this project, the reuse support system was tested using components from two sources; its retrieval performance was measured and found acceptable. Clustering usability was also evaluated, and clusters were found to be generally helpful, although some fine-tuning remains to be done.
This document is a draft of a textbook titled "Applied Calculus" written by Karl Heinz Dovermann, a professor of mathematics at the University of Hawaii. It is dedicated to his wife and sons. The textbook covers topics in calculus including definitions of derivatives, integrals, and applications of calculus through 12 chapters with sections on background concepts, derivatives, applications of derivatives, integration, and prerequisites from precalculus.
This document provides an introduction to queueing theory. It discusses key concepts such as random variables, probability distributions, performance measures, Little's law and the PASTA property. It then examines several common queueing models including the M/M/1, M/M/c, M/Er/1, M/G/1 and G/M/1 queues. For each model it derives the equilibrium distribution and discusses measures like mean queue length and waiting time. The goal is to give an overview of basic queueing theory concepts and common single-server and multi-server queues.
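The M/M/1 measures mentioned above follow from closed-form formulas. A small Python sketch (standard textbook formulas, with illustrative rates) computes them and uses Little's law to relate queue length and waiting time:

```python
def mm1_metrics(lam, mu):
    """Mean performance measures for the M/M/1 queue.
    lam: arrival rate, mu: service rate; stability requires lam < mu."""
    rho = lam / mu        # server utilization
    L = rho / (1 - rho)   # mean number in the system
    W = L / lam           # mean sojourn time (Little's law: L = lam * W)
    Lq = L - rho          # mean queue length, excluding the job in service
    Wq = Lq / lam         # mean waiting time before service
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

m = mm1_metrics(lam=0.8, mu=1.0)
print(m["L"])  # ≈ 4.0 customers on average at 80% utilization
```

Note how quickly the mean queue grows as utilization approaches 1, a point the equilibrium analysis in the text makes precise.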
A Buffer Overflow Study: Attacks and Defenses (2002) (Aiim Charinthip)
This document provides an overview of buffer overflow attacks and defenses. It discusses stack and heap overflows, and how programs can be exploited by overwriting memory buffers. It then summarizes various protection solutions, including Libsafe and the Grsecurity kernel patch, which make the stack and heap non-executable to prevent execution of injected code. The document serves as an introduction to buffer overflows and techniques for mitigating these vulnerabilities.
Hub Location Models in Public Transport Planning (sanazshn)
This document is a dissertation written in German on the topic of hub location models in public transport planning. It begins with an introduction that describes hub location problems, aspects of multi-period planning, and solution procedures. The dissertation then reviews literature on hub location problems and formulations. It presents new mathematical formulations for public transport applications and extensions. Finally, it discusses solution methods like Lagrangian relaxation, Benders decomposition, and heuristic algorithms.
This document provides an introduction to object-oriented programming concepts like classes, objects, inheritance and polymorphism. It also introduces the C++ programming language, starting from the C language basics and expanding on object-oriented features in C++ like classes, objects, constructors and destructors. The document uses a case study of implementing a generic singly linked list in C++ to demonstrate templates, iterators and other OOP concepts.
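Although the document's case study is in C++, the inheritance and polymorphism concepts it introduces can be sketched compactly in Python; this is an illustrative analogy, not the document's own code:

```python
class Shape:
    """Base class: subclasses override area() (polymorphism)."""
    def area(self):
        raise NotImplementedError

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h
    def area(self):
        return self.w * self.h

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.141592653589793 * self.r ** 2

# The same call works on any Shape: dynamic dispatch picks the right area().
shapes = [Rectangle(2, 3), Circle(1)]
print([round(s.area(), 2) for s in shapes])
```

In the C++ setting of the document, the same effect is achieved with a virtual member function overridden in derived classes.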
This document is a table of contents for the book "Stochastic Programming" by Peter Kall and Stein W. Wallace. It provides an overview of the book's contents, which include chapters on basic concepts in stochastic programming, dynamic systems, recourse problems, probabilistic constraints, preprocessing, and network problems. The book aims to introduce the fundamental concepts and solution techniques in stochastic programming.
This document proposes a system that allows a robot to automatically find a path to a predefined goal in uncontrolled environments. The system has three main modules: 1) an artificial vision module that obtains a quantified representation of the robot's vision using local feature detection and visual words; 2) a reinforcement learning module that combines the vision input with sensor data to compute the state (a normalized vector plus sensor readings) and a reward based on distance to the goal; and 3) a behavior control module. The robot is tested on a Sony Aibo, which seeks the goal and changes its behavior based on experience, but does not find the optimal route.
This document is a free online calculus textbook. It was created by David Guichard and others and submitted to an open textbook initiative in California. The textbook is updated occasionally by the authors to correct errors and add new material. It covers topics in analytic geometry, limits, derivatives, integrals, and infinite series and is made freely available under a Creative Commons license.
This document contains notes from a trigonometry class taught by Steven Butler at Brigham Young University in Fall 2002. It is divided into 9 chapters that cover topics such as geometric foundations, the Pythagorean theorem, angle measurement, trigonometry with right triangles, trigonometry with circles, graphing trigonometric functions, inverse trigonometric functions, and working with trigonometric identities. Each chapter contains sections that explain key concepts and include supplemental practice problems.
This document contains notes from a trigonometry course. It includes 10 chapters that cover topics like geometric foundations, the Pythagorean theorem, angle measurement, trigonometric functions, graphing trigonometric functions, inverse trigonometric functions, and working with trigonometric identities. Each chapter also includes supplemental problems for additional practice.
This document is a technical report submitted as part of a Master's degree in Information Security. It examines applying machine learning algorithms to the task of intrusion detection in computer security. Specifically, it analyzes the NBTree and VFI machine learning algorithms on a dataset of network connections and compares their performance at detecting intrusions. The NBTree algorithm achieved high accuracy and recall, indicating it is well-suited for intrusion detection using machine learning. The report also discusses future work and the usefulness of machine learning for computer security problems.
This document provides an introduction and overview of data structures and algorithms. It discusses linked lists, binary search trees, heaps, sets, queues, and the AVL tree data structure. It also covers sorting algorithms like merge sort, quicksort, and insertion sort, as well as numeric algorithms for primality testing, base conversion, finding greatest common divisors, and more. The goal is to provide annotated references and examples of how to implement and use various common data structures and algorithms.
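Two of the numeric algorithms mentioned, greatest common divisors and primality testing, can be sketched in a few lines of Python; these are illustrative textbook versions, not the document's own listings:

```python
def gcd(a, b):
    """Euclid's algorithm for the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

def is_prime(n):
    """Trial division up to sqrt(n): simple deterministic primality test."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

print(gcd(48, 36))   # 12
print(is_prime(97))  # True
```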
This document contains lecture notes for a logic course focused on classical and non-classical logics. It covers topics like propositional and predicate calculus, modal logics, temporal logics, logics of programs, and fixpoint calculus. The main emphasis is on automated deduction techniques for classical logics, including resolution method, sequent calculus, and analytic tableaux. Application areas discussed include formal specification and verification of software, data structures, and databases.
Vector Spaces, Vector Algebras, and Vector Geometries (Richard Smith)
Vector spaces over an arbitrary field are treated. Exterior algebra and linear geometries based on vector spaces are introduced. Scalar product spaces and the Hodge star are included.
This document summarizes a master's thesis that implements a reliable overlay multicast protocol on wireless sensor nodes. The thesis first discusses related work on wireless sensor networks, communication schemes, hardware, and the Contiki operating system. It then presents the design of the Sensor Nodes Overlay Multicast Communication (SNOMC) protocol, including node roles, message types, design models, data structures, and the SNOMC algorithm. The implementation of SNOMC in Contiki is described, along with implementations of UDP and TCP for comparison. An evaluation analyzes the performance of transmitting small and large messages using SNOMC.
This document is the thesis of Arnaud Jean-Baptiste presented at the Université des Sciences et Technologies de Lille for the degree of Doctor of Philosophy in computer science. The thesis proposes a model of handles to control references in dynamically typed languages by enforcing behavioral properties like read-only at the reference level. It presents three experiments with handles - enforcing read-only, supporting various behavioral properties, and adding state to handles. The thesis also discusses implementation details and evaluates the performance overhead of the handle approach.
Business Mathematics, Course Code 1429, BA General, Allama Iqbal Open University (AIOU), Islamabad
This document is an introduction to plasma physics that covers several key topics:
1. It defines plasma as a gas of charged particles and discusses the conditions needed for a plasma state, including Debye shielding and plasma parameters.
2. It describes different models for plasma description including fluid, MHD, and two-fluid models. It also covers continuity, Euler, and state equations.
3. It discusses MHD equilibria and waves, including Alfvén and magnetosonic modes.
4. It examines MHD discontinuities and shocks.
5. It presents the two-fluid description and generalized Ohm's law.
6. It explores waves in dispersive media.
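The Debye shielding mentioned in point 1 is quantified by the Debye length. A small Python sketch computes it from the standard formula; the laboratory-plasma values at the end are illustrative:

```python
import math

# Physical constants (SI units, CODATA values)
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
KB   = 1.380649e-23       # Boltzmann constant, J/K
E    = 1.602176634e-19    # elementary charge, C

def debye_length(n_e, t_e):
    """Electron Debye length in meters:
    lambda_D = sqrt(eps0 * kB * Te / (ne * e^2)).
    n_e: electron density (m^-3), t_e: electron temperature (K)."""
    return math.sqrt(EPS0 * KB * t_e / (n_e * E * E))

# Typical laboratory plasma: n_e = 1e18 m^-3, T_e = 1e4 K
print(debye_length(1e18, 1e4))  # on the order of micrometers (~7e-6 m)
```

Charges separated by more than a few Debye lengths are effectively screened, which is why the plasma state requires system dimensions much larger than lambda_D.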
This document outlines lecture notes on machine learning. It introduces machine learning and discusses different paradigms of learning including assigning parameters, rote learning, knowledge acquisition, concept learning from examples, and neural networks. It covers topics such as concept learning, languages for learning, version space learning, induction of decision trees, covering strategies, searching generalization graphs, inductive logic programming, Bayesian approaches, minimum description length principle, unsupervised learning, and explanation-based learning.
Java Data Structures for Principled Programmers (pnr15z)
This document provides an overview of the 7th edition of a textbook on data structures in Java. It covers object-oriented programming concepts, common data structures like vectors and generics, design fundamentals including complexity analysis and recursion, sorting algorithms, an interface-based design method, and iterators. Each chapter also includes examples and exercises to demonstrate the concepts and techniques.
This document is the master's thesis of Tamás Martinec titled "Real-Time Non-Photorealistic Shadow Rendering". The thesis discusses non-photorealistic rendering (NPR) techniques, real-time shadow rendering algorithms, and presents an example of combining hatching-based NPR with shadow mapping to generate stylized shadows in real-time. The thesis is divided into chapters covering NPR techniques and styles, real-time shadow rendering methods, graphics hardware and shaders, and a demonstration implementing hatching and shadowed hatching shaders.
This document discusses security analysis of distributed systems. It outlines that security analysis involves specifying the participating entities, a security property, and checking if that property is satisfied across all possible executions despite interference from attackers. Some example security properties mentioned are secrecy, authentication, and strong secrecy. Analyzing distributed systems security is challenging due to their non-deterministic and infinitely branching nature.
Leads Clinical Research is a Site Management Organization headquartered in Bangalore, India. They obtain and manage clinical trials for sponsors and CROs. Leads has experienced staff and a network of investigators and volunteers across India to assist with patient recruitment and trials. Their site management services aim to provide consistent, high-quality data and reduce trial timelines through quick startups and reduced site management needs.
The document discusses the benefits of using a miswak, which is a natural toothbrush made from the Salvadora persica tree. It lists 15 occasions when using a miswak is recommended, such as before prayer, eating, or sleep. It also notes that miswak use increases intelligence and is a means of purification mentioned in the Hadith. Finally, it provides instructions for preparing a miswak by soaking and shaving it to create a brush-like tip for cleaning the teeth.
The document discusses the power of words and stories to captivate readers and transport them to other worlds. It describes how readers become engrossed in stories and feel a range of emotions as they turn the pages, eager to find out what will happen next but also not wanting the story to end. Readers are pulled into stories and find it hard to tear themselves away, their minds remaining captivated even when distracted by everyday events. The document invites readers to lose themselves in the sights and sounds of words on the page.
Tales of Ocean Fantasy is a 2.5D cute-style MMORPG with unique gameplay involving fleet development, battles, and contests for islands. Players can explore land and sea maps completing quests as multiple classes using specialized spells, costumes, armor, and weapons. The story revolves around rebuilding Doris Wonderland through quests, dungeons, and battling the dark power over 500 years. Key features include land and sea maps, guild systems, fleet customization, treasure hunting, boss battles, and arena PvP. The game is set to commercially launch in March 2012 following testing.
This document summarizes a presentation on copyrights and trademarks given by David M. Adler. It provides an overview of intellectual property rights including what is covered by copyright and trademark laws. It discusses how to register copyrights, the duration of copyright protection, fair use of copyrighted materials, and how to send an effective DMCA notice to have infringing content removed from websites.
Facebook Marketing Legal & Regulatory Compliance (Adler Law Group)
The past few years have witnessed an explosion of legal and regulatory activity involving social and other new media. The growth of Facebook as a platform for customer engagement, brand extension and other advertising and marketing efforts creates both new opportunities and new legal and business risks. The legislative and regulatory environment is also in flux as Congress and federal agencies rush to provide new rules, regulations and laws aimed at consumer protection, privacy and IP piracy. This session will examine several key areas, including copyright, trademark and related intellectual property concerns; false advertising and marketing restrictions; gaming; data privacy issues presented by social media; and impacts of social media on employees and the workplace. Attendees will learn how to identify legal risks and issues before they become full-scale emergencies and how to develop appropriate policies and guidelines covering social media activity.
Social Media Legal, Regulatory & Compliance: Risks & Issues (Adler Law Group)
Social media sites like Facebook, Twitter, YouTube and LinkedIn provide the opportunity for authentic interaction and engagement with customers. Companies are rapidly adopting these services as strategic marketing tools. New technological developments often create new legal and business risks. Learn how to identify the legal issues, develop policies and procedures to avoid legal risks and how to maintain regulatory compliance.
TechWeek Chicago 2012 was a great success: a packed room of entrepreneurs eager to understand the legal issues facing their businesses. If you were unable to attend, here is my portion of the presentation.
Entertainment Law & Technology: Trends in Media & Advertising (Adler Law Group)
This presentation, brought to you by Chicago's leading Entertainment Law firm, is geared for in-house counsel tasked with managing a broad range of IP, marketing, branding and promotions efforts.
Identifying Intellectual Property Issues in Startups 2014 (Adler Law Group)
Do you work with start-up companies and need a basic understanding of the various intellectual property issues that can arise? This presentation will help you:
*Understand the trademark and copyright problems one may encounter with branding;
*Learn how to protect branding once established;
*Understand trade secrets and the importance of non-disclosure and confidentiality agreements;
*Establish a proactive approach toward intellectual property ownership between co-founders, employees, and vendors;
*Understand business names, domain names, promotional issues, and website content concerns.
Managing Risk: Legal Issues for Affiliate Marketers & Affiliate Marketing Man... (Adler Law Group)
Affiliate marketing is one of the most cost-effective techniques for monetizing web site traffic and driving sales. Unfortunately, it has a reputation for high risk. While the industry is unlikely to ever be risk-free, it is possible to manage risk by: (1) understanding how techniques like behavioral and contextual targeting affect consumers, affiliates and merchants, (2) understanding the legal and regulatory environment, (3) understanding the risks involved with prospective marketing partners, (4) using and maintaining proper contracts that allocate risk and provide appropriate indemnifications, and (5) keeping informed about changes in technology, marketing practices and the regulatory environment. Attendees will learn how to identify these issues and develop policies and procedures to keep informed about current technology, marketing strategies and regulatory compliance.
Topics covered include:
Behavioral/Contextual Advertising
Regulatory/Industry Compliance : FTC Guides & Enforcement Actions
CAN-SPAM compliance
IP Law: Rules governing use of others' Trademarks/Keywords, Right of Publicity/Endorsement Issues.
Identifying, protecting against, and disputing accusations of Click-Fraud
Online Behavioral Advertising (OBA) Legal & Regulatory Compliance (Adler Law Group)
Online behavioral advertising involves the collection of data about individuals' online activities in order to deliver targeted advertisements. While this allows for personalized ads, many users are concerned about privacy and a lack of anonymity online. Both regulators and legislators have responded by introducing laws and guidelines to increase transparency, consent, and security around the collection and use of personal data for behavioral advertising. Industry groups have also developed self-regulatory principles, but enforcement of these is ongoing.
FFEA 2016 - 10 Website Mistakes Even Great Marketers Can Make (Saffire)
This document walks through 11 common website mistakes that marketers make and how to avoid them. It recommends using current programming languages and plug-ins, optimizing for search engines and mobile users, including clear calls to action, prioritizing photos and video over text alone, and collecting analytics to improve content and outreach over time. The overall message is that websites need frequent updates, multichannel content, and data-driven optimization to effectively engage audiences.
The document outlines 5 steps to developing a smart compensation plan: 1) gain executive support by emphasizing compensation's impact on retention and the bottom line, 2) define your compensation strategy by determining goals and market, 3) develop a market-based pay structure using appropriate job evaluation and market data, 4) build pay ranges by identifying differentials, pay grades, and guidelines for movement, and 5) implement a total rewards plan by finalizing all compensation elements, budgets, outliers, and empowering managers. Following these steps can help attract and retain top talent through a compensation plan aligned with business needs.
This document provides 10 tips for brands using WeChat official accounts to build audiences. The tips include making headlines count, segmenting audiences, increasing relevance of content, being more compelling, providing incentives and rewards, using more visual storytelling, linking to other social media, inviting guest editors, turning questions into content, and creating content on location. It emphasizes the importance of high-quality, relevant, visual content that engages audiences and drives action. It also recommends tools like CMS/CRM systems to better segment and target audiences with customized content.
It’s not enough to drink water every day. You have to make sure it’s an adequate amount and that it’s absolutely safe and clean. To be confident about your everyday drinking water, it would be a good idea to buy a water filter in Singapore or wherever you might be in the world.
The document provides an overview of the C preprocessor, which is a macro processor that transforms C code before compilation. It covers preprocessing tasks like handling headers, macros, conditionals, and other directives. It also describes the traditional mode for backward compatibility with older code and implementations.
This document is a master's thesis that examines localization techniques in wireless sensor networks. It provides background on wireless sensor networks and how they emerged from military applications but are now used in various civil applications. The thesis focuses on developing and analyzing new localization algorithms. It presents the results of experiments measuring received signal strength indication (RSSI) from wireless sensor nodes, which indicate significant fluctuations that could limit the reliability of localization schemes. Overall, the thesis evaluates localization methods and develops new algorithms to improve positioning accuracy in wireless sensor networks.
This document provides an overview of developing and deploying a secure portal solution using WebSphere Portal V5 and Tivoli Access Manager V5.1. It discusses the key concepts, high-level architecture, and software components involved. The target audience includes portal administrators, developers, and security administrators. The document covers topics such as security fundamentals, architecture and topology selection for runtime and development environments, design guidelines, and integration considerations. It also includes a working example solution to demonstrate an implementation based on the guidance provided.
This document is a table of contents for a textbook on mathematics for computer science. It lists 10 chapters that cover topics such as proofs, induction, number theory, graph theory, relations, and sums/approximations. Each chapter is divided into multiple sections that delve deeper into the chapter topic. For example, Chapter 1 discusses propositions, axioms, logical deductions and provides examples of proofs; Chapter 2 covers induction and uses it to prove theorems.
This document provides an introduction to security on mainframe systems. It discusses fundamental security concepts like confidentiality, integrity and availability. It also covers security elements such as identification, authentication, authorization, encryption and auditing. Additionally, it examines the System z architecture and how the hardware and operating system provide security features. The document uses a case study about securing an online bookstore to illustrate how these concepts apply in a business context. It is intended to help readers understand mainframe security.
Pattern classification via unsupervised learners (Nick Palmer)
I study classification problems in a standard learning framework, in which an unsupervised learner creates a discriminant function over each class and observations are labeled by the learner returning the highest value associated with that observation. I examine whether this approach gains significant advantage over traditional discriminant techniques.
Probably Approximately Correct learning distributions over class labels under L1 distance or KL-divergence is shown to imply PAC classification in this framework. I determine bounds on the regret associated with the resulting classifier, taking into account the possibility of variable misclassification penalties, and demonstrate the advantage of estimating the a posteriori probability distributions over class labels in the setting of Optical Character Recognition.
Unsupervised learners can be used to learn a class of probabilistic concepts (stochastic rules denoting the probability that an observation has a positive label in a 2-class setting). I demonstrate a situation where unsupervised learners can be used even when it is hard to learn distributions over class labels – in this case the discriminant functions do not estimate the class probability densities.
If that isn't exciting enough, I then use a standard state-merging technique to PAC-learn a class of probabilistic automata. The results show that by learning the distribution over outputs under the weaker L1 distance rather than KL-divergence we are able to learn without knowledge of the expected length of an output. It is also shown that for a restricted class of these automata learning under L1 distance is equivalent to learning under KL-divergence.
This document provides an introduction to queueing theory, covering basic concepts from probability theory used in queueing models like random variables, generating functions, and common probability distributions. It then discusses fundamental queueing models and relations, including Kendall's notation for describing queueing systems and Little's Law relating average queue length and waiting time. Specific queueing models are analyzed like the M/M/1, M/M/c, M/Er/1, M/G/1, and G/M/1 queues.
This document provides an introduction to queueing theory. It discusses key concepts such as random variables, probability distributions, performance measures, Little's law and the PASTA property. It then examines several common queueing models including the M/M/1, M/M/c, M/Er/1, M/G/1 and G/M/1 queues. For each model it derives the equilibrium distribution and discusses measures like mean queue length and waiting time. The goal is to provide the fundamental mathematical techniques for analyzing queueing systems.
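The basic M/M/1 measures mentioned above can be sketched numerically. The snippet below is a minimal illustration of mine, not taken from the document: it assumes a stable single-server queue (arrival rate lam below service rate mu) and combines the standard closed-form results with Little's law.

```python
def mm1_metrics(lam, mu):
    """Equilibrium measures for an M/M/1 queue; assumes lam < mu (stability)."""
    if not lam < mu:
        raise ValueError("queue is unstable: need lam < mu")
    rho = lam / mu        # server utilisation
    L = rho / (1 - rho)   # mean number of customers in the system
    W = L / lam           # mean sojourn time, via Little's law L = lam * W
    Lq = L - rho          # mean number waiting (excluding the one in service)
    Wq = Lq / lam         # mean waiting time before service starts
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Example: 2 arrivals per second, service rate 5 per second.
m = mm1_metrics(lam=2.0, mu=5.0)
```

Note that W here also equals 1/(mu - lam), the familiar M/M/1 sojourn-time formula, which is a quick consistency check on the algebra.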
This document provides lecture notes on applied cryptography and data security. It covers topics such as symmetric and asymmetric cryptosystems, cryptanalysis techniques, stream and block ciphers like DES and AES, public key cryptography including RSA, and the discrete logarithm problem. The notes are intended for education on the fundamentals of cryptography and cybersecurity.
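To make the RSA topic from these notes concrete, here is a toy "textbook" RSA round trip in Python. This is a sketch only: the primes, exponent and message are illustrative choices of mine, and real RSA requires large random primes and padding to be secure.

```python
# Toy "textbook" RSA -- for illustration only; completely insecure
# (tiny, trivially factorable modulus, no padding).
from math import gcd

p, q = 61, 53                # two small primes
n = p * q                    # public modulus, n = 3233
phi = (p - 1) * (q - 1)      # Euler's totient, phi = 3120
e = 17                       # public exponent, must be coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent: modular inverse of e mod phi

m = 65                       # a message encoded as an integer < n
c = pow(m, e, n)             # encrypt: c = m^e mod n
assert pow(c, d, n) == m     # decrypt: c^d mod n recovers m
```

The three-argument `pow` does modular exponentiation, and `pow(e, -1, phi)` (Python 3.8+) computes the modular inverse, so no extra libraries are needed.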
This document is a textbook about the science of computing. It is divided into chapters that cover various topics in computing including: logic circuits, data representation, computational circuits, computer architecture, operating systems, artificial intelligence, and language and computation. The textbook is copyrighted to Carl Burch and is intended to provide an introduction to fundamental concepts in computer science.
This document is a thesis submitted by David Liebman to the State University of New York at New Paltz for the degree of Master of Science in Computer Science. The goal of the thesis is to create a chatbot using natural language processing and deep learning models. The thesis provides background on recurrent neural networks, transformers, and pre-trained language models like GPT-2. It then describes the experimental design and setup for installing chatbot models on devices like the Raspberry Pi. Several chatbot experiments are conducted using GRU, transformer, and GPT-2 models with discussion of the results.
Stochastic Processes and Simulations – A Machine Learning Perspective
Written for machine learning practitioners, software engineers and other analytic professionals interested in expanding their toolset and mastering the art. Discover state-of-the-art techniques explained in simple English, applicable to many modern problems, especially related to spatial processes and pattern recognition. This textbook includes numerous visualization techniques (for instance, data animations using video libraries in R), a true test of independence, simple illustration of dual confidence regions (more intuitive than the classic version), minimum contrast estimation (a simple generic estimation technique encompassing maximum likelihood), model fitting techniques, and much more. The scope of the material extends far beyond stochastic processes.
The document outlines the author's preparation for coding interviews at Google, including a review of important data structures, algorithms, and problem domains. The author plans to thoroughly review arrays, trees, graphs, dynamic programming, recursion, sorting, strings, caching, game theory, computability, bitwise operators, math, concurrency, and system design. They will also practice solving problems involving arrays, strings, trees, graphs, divide-and-conquer, dynamic programming, and more. The single document the author intends to bring summarizes their background, projects, most difficult bugs, and other experiences that may be relevant questions during the interview.
Efficient algorithms for sorting and synchronization
This document is the thesis of Andrew Tridgell submitted for the degree of Doctor of Philosophy at The Australian National University in February 1999. The thesis presents efficient algorithms for internal and external parallel sorting and for remote data update (rsync). The internal sorting algorithm approaches the problem by first using an incorrect but fast algorithm to almost sort the data before performing a cleanup phase. The external sorting algorithm partitions data across disks before performing sorting within each partition. The rsync algorithm operates by exchanging block signatures followed by a simple hash search algorithm to efficiently synchronize remote files. Performance results are presented for each algorithm along with comparisons to other related work.
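The block-signature idea behind rsync described above can be sketched in a few lines. This is my simplified illustration, not Tridgell's implementation: it uses only a strong MD5 signature per fixed-size block, whereas real rsync first screens candidates with a cheap rolling checksum at every byte offset.

```python
import hashlib

BLOCK = 4  # tiny block size, just for illustration

def signatures(old: bytes):
    """Strong (MD5) signature for each fixed-size block of the old file."""
    return {hashlib.md5(old[i:i + BLOCK]).hexdigest(): i
            for i in range(0, len(old), BLOCK)}

def delta(new: bytes, sigs):
    """Scan the new file; emit ('copy', offset) for blocks the receiver
    already holds, and ('literal', bytes) for everything else."""
    ops, i, lit = [], 0, b""
    while i < len(new):
        chunk = new[i:i + BLOCK]
        if chunk and hashlib.md5(chunk).hexdigest() in sigs:
            if lit:
                ops.append(("literal", lit))
                lit = b""
            ops.append(("copy", sigs[hashlib.md5(chunk).hexdigest()]))
            i += BLOCK
        else:
            lit += new[i:i + 1]   # no match: advance one byte
            i += 1
    if lit:
        ops.append(("literal", lit))
    return ops

def patch(old: bytes, ops):
    """Receiver rebuilds the new file from its old copy plus the delta."""
    return b"".join(old[a:a + BLOCK] if kind == "copy" else a
                    for kind, a in ops)

old = b"abcdefgh1234"
new = b"abcdXXXXefgh"
assert patch(old, delta(new, signatures(old))) == new
```

Only the four literal bytes "XXXX" cross the wire in this example; the two matched blocks are sent as copy instructions, which is the source of rsync's bandwidth savings.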
This document describes the implementation of an Android kernel rootkit for unrooted stock Android images. It discusses the structure of the Linux kernel, existing kernel rootkits and related work. The thesis then presents a concept for an Android kernel rootkit and infection method. It details the implementation process, including building a customized kernel, developing the rootkit module and creating an exploit tool and infected app. Problems encountered during implementation are also discussed. The work evaluates the practical usage of the rootkit, potential defenses and areas for future improvement.
(Springer Optimization and Its Applications 37) Eligius M.T. Hendrix, Boglárk...
This document provides an introduction to the book "Introduction to Nonlinear and Global Optimization" by Eligius M.T. Hendrix and Boglárka G.-Tóth. It discusses the aims and scope of the book, which is to provide undergraduate and graduate textbooks focusing on algorithms for solving nonlinear optimization problems and their applications. The introduction also notes that optimization has expanded rapidly in recent decades with new techniques developed and its use diffusing into other disciplines. The book will cover topics like nonlinear optimization, network flow problems, stochastic optimization, and more.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Securing your Kubernetes cluster: a step-by-step guide to success!
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been easier to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests, and test automation can be used to speed up testing.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Observability Concepts EVERY Developer Should Know - DeveloperWeek Europe (Paige Cruz)
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on.
20 Comprehensive Checklist of Designing and Developing a Website (Pixlogix Infotech)
Dive into the world of Website Designing and Developing with Pixlogix! Looking to create a stunning online presence? Look no further! Our comprehensive checklist covers everything you need to know to craft a website that stands out. From user-friendly design to seamless functionality, we've got you covered. Don't miss out on this invaluable resource! Check out our checklist now at Pixlogix and start your journey towards a captivating online presence today.
Generative AI Deep Dive: Advancing from Proof of Concept to Production (Aggregage)
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
How to Get CNIC Information System with Paksim Ga
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Enhancing adoption of Open Source Libraries: A case study on Albumentations.AI (Vladimir Iglovikov, Ph.D.)
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Building RAG with self-deployed Milvus vector database and Snowpark Container... (Zilliz)
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
TrustArc Webinar - 2024 Global Privacy Survey (TrustArc)
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 (Neo4j)
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Chapter 1
Introduction
Anu granted him the totality of knowledge of all.
He saw the Secret, discovered the Hidden,
he brought information of (the time) before the Flood.
(Epic of Gilgamesh)
The best things in life aren’t things.
(3:26 PM Jul 21st via UberTwitter, P. Hilton)
1.1 Information Management
In what is often considered the oldest written story, the main character is first described as a man of knowledge. The mysteries of ancient Greece also considered the possession of secret knowledge a source of enlightenment. More prosaically, priests, astrologers, physicists and so on formed congregations based on their possession of unique knowledge, and the preservation of these congregations depended upon their monopoly on these pieces of useful knowledge, e.g. the computation of the areas allocated to peasants after each flood of the Nile. In ancient societies, being able to retain and control secrets was thus a self-preservation issue for organizations.
These ancient origins of information retention contrast with today's society, which emphasizes the instantaneous diffusion of information via platforms such as twitter.com or facebook.com. CEOs keep their own blogs on their company's strategy1, and when facing a crisis, corporations try to be as open as possible to gain or recover the confidence of citizens, consumers and peers. In today's society, being able to disseminate information as widely as possible is a survival issue for corporations and individuals.
Of course the delineation between the necessity of preserving the secrecy of some
information and the dissemination of other information is not as clear-cut, and both
aspects coexist in almost every society; think e.g. of advertising and
1 See http://www.wired.com/wired/archive/15.04/wired40_ceo.html for more context,
the blog itself being at http://blog.redfin.com.
patents. This is particularly visible in today's complex industrial projects
such as the development of a new plane, as demonstrated by Boeing with the
787 Dreamliner, which relies on contractors spread all over the world,
some of whom are also contractors for its competitor Airbus.
Thus the contrast between ancient and modern societies also plays out at the
individual level: everyone, from the manager of a complex program involving contractors
to the member of the Facebook website, has to manage information, i.e. decide
whether to share it with partners or withhold it. One particular difficulty in
the management of information is the lack of reliability of electronic systems.
Facebook members have difficulties adapting to the latest changes in Facebook
access control policies, while information system specialists fear possible
computer attacks on their information systems.
1.2 Information Management in Computer Systems
Choosing to share or disclose information in a face-to-face meeting is relatively
easy, as it suffices to express it or not. When in a discussion one wants some
information to be passed to some partners but not to others, it is still possible
to skillfully resort to some common knowledge, ambiguities, or any type of non-verbal
communication to disclose the information precisely to the intended persons.
The variety of possibilities offered to humans for direct communication is
beyond the capacity of modern-day computers. Computer system conversations
are message exchanges, and the lack of ambiguity in these is crucial to
their proper functioning. Accounting for the fact that anyone who is willing
to may participate in any conversation occurring over a medium such as the Internet,
even passively and without the other participants being aware of it,
it would seem that computer users only have the choice of disclosing a piece of
information to everyone or to no one, just as groups did thousands of years ago.
The role of cryptography is to provide computer systems with the ability
humans naturally have to alter how information is expressed, so as to guarantee
the identity of the participants who can extract meaningful information from the
messages, or the possible source of a message. Cryptographic protocols are
predefined conversations in which the messages exchanged by the participants
are protected by cryptographic operations. Most of my research work has consisted
in determining whether a cryptographic protocol satisfies the guarantees
it claims to achieve, and more precisely in trying to determine, in a fixed setting,
whether the protocol fails to provide its users with its claimed guarantees.
But as presented above, intelligent information management requires not
only control over some pieces of information but also the proper dissemination
of other pieces of information. For example, the Web Services framework
aims at maximizing the availability of information by making it accessible via
on-line services. Here the notion of information is taken in the broad sense and
denotes data as well as processes. A continuation of my research on cryptographic
protocols has been the extension of some results to the Web Services
framework; it consists in deciding, given the messages the putative Web Services
are willing to exchange with one another, whether there exists an electronic
conversation that satisfies everyone's information management policy. I
have considered this problem from two different angles, depending on whether
one is interested in the how, i.e. the structure of the exchangeable
messages, or in the what, i.e. the conditions under which a participant
agrees to disclose a piece of information to someone else.
1.3 Document Outline
In the rest of this section I describe more precisely the four parts that compose
this document, namely: a) the domain of application of my research, which contains
a short description of cryptographic protocols and Web Services; b) the
first-order logic tools that I rely upon to solve problems in the aforementioned
domain; c) a description of the formal modelling of cryptographic protocols and
Web Services in first-order-logic-based frameworks; and d) a summary of the
results achieved.
Domain. The first part contains the description of the two application domains
of my work. The first one is the analysis of cryptographic protocols, on
which I began to work under the supervision of Laurent Vigneron and
Michaël Rusinowitch during my PhD. I present cryptographic protocols in
Chapter 2 and survey the existing analysis methods. Chapter 3 is an introduction
to Web Services biased towards our purpose, which is the analysis of their
communications under security constraints.
Tools. Both for didactic purposes and to serve as a reference for the later
parts of this document, I begin Chapter 4 with an introduction to the basics
of first-order logic by surveying classical Skolemization, the compactness property,
and resolution. The latter is of special importance to us as it permits
one to prove automatically that a first-order theory is unsatisfiable—one says
that resolution is refutationally complete—and thus, by contradiction, that a
property is a logical consequence of other properties. This chapter ends with
more advanced material on reasoning modulo an equational theory, culminating
in the replacement properties that underlie a large part of my work on the
analysis of cryptographic protocols. The refutational completeness of resolution
is insufficient for the practical purpose of automated deduction as it relies
on non-determinism, and the amount of computation required even for simple
theories is too large even for modern-day computers. Refinements of resolution
aim at reducing this non-determinism to turn the procedure into one suited to
automated deduction, and in some cases permit one to obtain a decision procedure.
We first present in Chapter 5 the classical result of Basin and Ganzinger,
which proves that for first-order theories in which all permitted resolution steps
have been performed, the logical consequence problem is decidable. This result
is based on a refinement of resolution with an ordering in which every
atom without variables is greater than only a bounded number of other atoms.
This presentation is followed by its (unpublished) extension to well-founded
orderings that I have obtained with Mounira Kourjieh when solving cryptographic
protocol analysis problems.
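As a concrete illustration of refutational completeness, the propositional core of the resolution procedure can be sketched in a few lines of Python. The representation of clauses as sets of string literals is my own choice for this sketch, not taken from this document:

```python
from itertools import combinations

def neg(lit):
    """Complement of a propositional literal, e.g. neg("p") == "-p"."""
    return lit[1:] if lit.startswith("-") else "-" + lit

def resolve(c1, c2):
    """All binary resolvents of two clauses (frozensets of literals)."""
    return [(c1 - {l}) | (c2 - {neg(l)}) for l in c1 if neg(l) in c2]

def refute(clauses):
    """Saturate under resolution; True iff the empty clause is derived,
    i.e. the clause set is unsatisfiable."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True       # empty clause: refutation found
                new.add(frozenset(r))
        if new <= clauses:
            return False              # saturated without the empty clause
        clauses |= new

# {p, p -> q, not q} is unsatisfiable, so q follows from {p, p -> q}
assert refute([{"p"}, {"-p", "q"}, {"-q"}])
assert not refute([{"p"}, {"q"}])
```

First-order resolution adds unification on top of this loop, and the ordering refinements discussed above serve precisely to tame the blind pairwise saturation performed here.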
Modelling. Now that the reader is equipped with a “survival toolkit” in first-order
logic, I present the formal models on which the analysis is performed.
Chapter 6 includes an article written in collaboration with M. Rusinowitch on
the compilation of standard cryptographic protocol specifications into active
frames. These are a simplified formal model of protocol participants in which
only the global effects of the participant, not the individual operations, are taken
into account. Also in this chapter I introduce symbolic derivations, in which all
operations must be atomic. In contrast with active frames, which have an intuitive
semantics, and with process calculi, which rely on standard programming
constructions, symbolic derivations are designed to ease reasoning about protocol
participants and the intruder, at the cost of being harder to relate to
standard models of computation.
In contrast with cryptographic protocols, in which entities usually terminate
their participation in the protocol after a few execution steps, Web Services
may exhibit a rich behavior. Trust negotiation in particular usually ends only once a
fixpoint is reached. Thus, in order to take into account the access control part of
Web Service specifications, we need to consider a framework in which loops
are allowed. In collaboration with Philippe Balbiani and Marwa ElHouri I have
proposed one such framework in [21, 22], from which Chapter 7 is extracted.
Results obtained. The last part of this document presents the decidability
and combination results I have obtained since my Ph.D. In a first
chapter I present a synthesis of several results obtained around the decidability
of the insecurity problem of cryptographic protocols when only a finite number of
message exchanges by honest agents is allowed. Instead of focusing on each of
the settings considered, I have tried to show how these different results are connected
with one another. In doing so I have assumed that the reader is already familiar
with the proofs and techniques employed in the articles [61, 67, 62].
Then in Chapter 9 I present the results obtained while I was an invited researcher
in the Cassis project at INRIA Nancy Grand Est. I worked there in collaboration
with M. Rusinowitch, M. Turuani, and two Ph.D. students, Mohammed
Anis Mekki and Tigran Avanesov. We have worked on the application of the
techniques developed primarily for cryptographic protocol analysis to solve basic
orchestration problems, both of which are special reachability problems. With
M.A. Mekki the study was focused on building a complete tool that takes as
input a description of the available services in an Alice&Bob-like notation and
a description of the goal of the orchestration, and produces a deployment-ready
validated orchestrator service. At the time of writing, that service is deployed
as a Tomcat servlet, but all the cryptography is implemented within the body
of the SOAP messages. With T. Avanesov we have considered a multi-intruder
extension of the standard cryptographic protocol analysis setting. When performing
security analysis, this setting permits us to model situations in which
several intruders are willing to collaborate with one another but cannot communicate
directly, and thus have to pass the information they want to exchange
through honest agents. When composing Web Services, we look at a distributed
orchestration problem: several partners are willing to collaborate, but they do
not wish to share all the information they have. The problem then is to decide
whether the participants' security policies are flexible enough to allow them
to collectively implement the goal service. Generally speaking, this problem
is strictly more difficult than standard orchestration (or cryptographic protocol
analysis): in addition to a decision procedure for the case of Dolev-Yao-like
message manipulations, we have obtained an undecidability result when the
equational theory that defines the operations is subterm and convergent.
Finally, in Chapter 10 I present some work on the equivalence of symbolic
derivations. The problem is to determine whether an intruder can observe differences
in the executions of two different protocols. A preliminary result obtained
in collaboration with M. Rusinowitch was published in [75]. In that
paper we provided a more succinct proof of the decidability of this problem
for subterm convergent equational theories, a result originally obtained by
M. Baudet [27]. In this chapter I present a criterion that permits one
to reduce this equivalence problem to the reachability analysis performed when
considering the usual trace properties. I believe that the reduction can easily be
implemented in reachability analysis tools such as CL-AtSe or OFMC, and thus
may be of practical interest.
Epilogue. This document ends with a last chapter on the future research directions
stemming from the results obtained so far. A one-sentence summary
would be more of the same, but differently. While I plan to continue the work
around reachability analysis problems, I also plan to explore several side
directions, namely:
• to work on the potential applications to safety analysis;
• to explore further the relation between reachability analysis and first-order
automated reasoning techniques;
• to obtain a comprehensive framework for service composition that also
takes into account trust negotiation, and as a consequence to relate more
formally the models for protocols and Web Services presented in this doc-
ument;
• to extend the modularity results obtained to address the modular verifi-
cation of aspect-based programs.
Chapter 2
Cryptographic Protocols
The starting point of the work presented in this document is
the security analysis of cryptographic protocols. We describe
in this chapter what these communicating programs are, which
properties they guarantee, and how they are specified. We also
present a short survey of the analyses they may be subject to,
with an emphasis on our domain of research.
2.1 Cryptographic Protocols
We present cryptographic protocols in this section. In Subsection 2.1.1 we
present the setting in which they are specified: the participants, the electronic
communications, and the cryptographic operations. Then in Subsection 2.1.2
we briefly present a short specification of a cryptographic protocol in a Request
for Comments document issued by the Internet Engineering Task Force
(IETF), a standardization body. Though we do not consider exclusively cryptographic
protocols specified in such documents, this serves as the basis for our
first formal model of cryptographic protocols, in which the participants and the
discussion they are intended to have are specified by a narration, presented in
Subsection 2.1.3. We then present some of the standard properties protocols can
guarantee in Subsection 2.1.4. Finally, we explain in Subsection 2.1.5 how the
correspondence between the narrations and their properties can be established.
2.1.1 Secured Communications
A cryptographic protocol defines which messages can be exchanged between
participants. The advantage gained by reducing one’s possible actions to those
described in the protocol is the implicit guarantee that each participant behaving
as prescribed is provided with security guarantees on the data he has exchanged.
This guarantee is obtained via the clever use of cryptographic primitives.
These are algorithms that rely on the asymmetry of information between
individuals, and are classified according to the assumptions on this asymmetry.
The most common types are:
Secret key cryptosystems: this was the only type of cryptography until the
1970s. It relies on a secret piece of information, called a secret key, known
only within a small group. Every member of this group can both encrypt
and decrypt messages with the key, while agents outside of it can do neither.
Instances of secret key cryptosystems are Enigma [214], DES [165],
3DES [169], and the current AES [170]. Given a message M and a secret
key sk(k) we denote:
    encs(M, sk(k)): the encryption of M with the key sk(k)
    decs(M, sk(k)): the decryption of M with the key sk(k)
Public key cryptosystems: the first (tentative) publication [158] on public
key cryptography was met with skepticism, as in the words of a reviewer:
    “Experience shows that it is extremely dangerous to transmit key
    information in the clear.” 1
The first accepted paper on the topic was the presentation by Diffie and
Hellman [104] of a clever usage of exponentiation in modular arithmetic.
The result of their analysis was the possibility to compute a pair of
keys (pk(k), sk(k)) such that the messages encrypted with the key pk(k)
can be decrypted only with the key sk(k), and such that sk(k) cannot
feasibly be computed from pk(k). Thus the key pk(k) can be published
as a phone number would be, and any participant can send information
that only the agent knowing the key sk(k) can decrypt, i.e. understand.
Examples of public-key cryptosystems include RSA [186, 31, 179, 180] and
ElGamal [116]. Given a message M, a public key pk(k) and a secret key
sk(k) we denote:
    encp(M, pk(k)): the encryption of M with the key pk(k)
    decp(M, sk(k)): the decryption of M with the key sk(k)
Signature cryptosystems: the asymmetry of public key cryptosystems can
also be employed to authenticate the creator of a message. The sender
signs the message he wants to send with a secret key sk(k). Anybody
knowing the public key pk(k) can then verify that the signature was composed
with the key sk(k), and thus originates from the possessor of that
key. Given a message M, a public key pk(k) and a secret key sk(k) we
denote:
    sign(M, sk(k)): the signature of M with the key sk(k)
    verif(M′, M, pk(k)): the check that M′ is the signature of M with
    the inverse of the key pk(k)
1 http://www.merkle.com/1974/
Other functions are employed to construct messages, such as the concatenation
⟨M1, M2⟩ of two messages. We also consider the modeling of mathematical
functions such as the bitwise exclusive-or or the modular exponentiation, and
will add the corresponding symbols as necessary.
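In a symbolic model, the relations between these symbols become rewrite rules such as decs(encs(M, sk(k)), sk(k)) → M. A minimal sketch in Python, with terms as nested tuples; this representation is illustrative only, not the one used in later chapters:

```python
# Terms are nested tuples: ("encs", M, K), ("sk", "k"), ("pk", "k"), ...

def decrypt_s(term, key):
    """decs(encs(M, k), k) -> M; otherwise the term stays symbolic."""
    if isinstance(term, tuple) and term[0] == "encs" and term[2] == key:
        return term[1]
    return ("decs", term, key)

def decrypt_p(term, sk):
    """decp(encp(M, pk(k)), sk(k)) -> M."""
    if (isinstance(term, tuple) and term[0] == "encp"
            and sk[0] == "sk" and term[2] == ("pk", sk[1])):
        return term[1]
    return ("decp", term, sk)

def verify(sig, msg, pk):
    """verif(sign(M, sk(k)), M, pk(k)) holds."""
    return (isinstance(sig, tuple) and sig[0] == "sign"
            and sig[1] == msg and pk == ("pk", sig[2][1]))

m = ("pair", "A", "Na")
assert decrypt_s(("encs", m, ("sk", "k")), ("sk", "k")) == m
assert decrypt_p(("encp", m, ("pk", "k")), ("sk", "k")) == m
assert verify(("sign", m, ("sk", "k")), m, ("pk", "k"))
```

The point of the "perfect cryptography" abstraction is visible here: decryption succeeds exactly when the matching key is supplied, and nothing else about the bitstrings is observable.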
2.1.2 RFCs
Cryptographic protocols are published and endorsed by various governmental
or private organizations. These organizations can be formed to support one specific
(set of) protocols, such as the “Liberty Alliance”, or have a more general
interest in one domain, such as the “Oasis Open consortium” or the “World
Wide Web Consortium”, for respectively the transmission and representation
of information in the XML format and the Web. The Internet Engineering Task
Force (IETF) is particularly important as an organization focusing on the basic
protocols employed in computer-to-computer communications, and on the
interoperability of their implementations. Transport Layer Security [102, 103]
(TLS) is specified by a Request for Comments (RFC) document, as are some
protocol proposals in early stages, such as RFC 2945, which describes the SRP
Authentication and Key Exchange System. In the latter case implementation
issues are not discussed, but the principle of the protocol is presented. Often
such documents contain a finite state automaton describing the different states
in which a program implementing the protocol can be, as well as the possible
actions in each state, and/or the intended sequence of messages between participants
in the protocol, as in Figure 2.1.
Client                                  Host

U = <username>                 -->
                               <--      s = <salt from passwd file>

    Upon identifying himself to the host, the client will receive the
    salt stored on the host under his username.

a = random()
A = g^a % N                    -->
                                        v = <stored password verifier>
                                        b = random()
                               <--      B = (v + g^b) % N
p = <raw password>
x = SHA(s | SHA(U | ":" | p))
S = (B - g^x)^(a+u*x) % N               S = (A * v^u)^b % N
K = SHA_Interleave(S)                   K = SHA_Interleave(S)

Figure 2.1: Annotated message sequence chart extracted from RFC 2945
(SRP Authentication and Key Exchange System)
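The algebra behind Figure 2.1 can be checked directly: both sides end up computing g^(b(a+ux)) mod N. A sketch with toy parameters; a real SRP deployment uses a large safe prime and derives x and u exactly as specified in RFC 2945, whereas the tiny constants and the simplified hash below are for illustration only:

```python
import hashlib

N, g = 2267, 2          # toy modulus and generator, far too small for real use

def H(*parts):
    data = "|".join(str(p) for p in parts).encode()
    return int.from_bytes(hashlib.sha1(data).digest()[:4], "big")

# Registration: the host stores the salt s and the verifier v = g^x % N
s, password = 1234, "raw password"
x = H(s, H("username", ":", password))
v = pow(g, x, N)

# One protocol run (a and b are random in practice)
a, b = 5, 7
A = pow(g, a, N)                 # client -> host
B = (v + pow(g, b, N)) % N       # host -> client
u = H(B)                         # scrambling parameter, derived from B

S_client = pow((B - pow(g, x, N)) % N, a + u * x, N)
S_host = pow((A * pow(v, u, N)) % N, b, N)
assert S_client == S_host        # both sides agree on the shared secret
```

The client's computation works because B − g^x ≡ g^b (mod N), so (B − g^x)^(a+ux) ≡ g^(b(a+ux)), which is exactly (A · v^u)^b on the host's side.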
2.1.3 Narrations
Though in the Avispa and Avantssar projects we have worked on the definition of more
complex protocol specification languages, the specification of a protocol by a
single sequence of messages, as in [98, 148, 126, 162], is sufficient for most cryptographic
protocols, even though the internal computations of the agents are not
specified. In its simplest form, a narration is a sequence of message exchanges
followed by the initial knowledge each participant must have to engage in the
protocol (here, the Needham-Schroeder Public Key protocol [166]):
A → B : encp(⟨A, Na⟩, KB)
B → A : encp(⟨Na, Nb⟩, KA)
A → B : encp(Nb, KB)

where

A knows A, B, KA, KB, KA⁻¹
B knows A, B, KA, KB, KB⁻¹
The names A and B in this sequence do not refer to any particular individuals
but to roles in the narration: common names instead of A and B are Client,
Server, Initiator, etc. Actual participants in an instance (also called a session) of
the protocol each play one of the roles defined by the message exchange.
We note that the messages Na and Nb are in the knowledge of neither A
nor B. These are nonces, i.e. random values created at the beginning of each
instance of the protocol.
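The distinction between roles and sessions can be made concrete by instantiating the narration: role names are mapped to agents and keys, and fresh nonces are created per session. A small sketch; the data representation is my own, for illustration:

```python
import itertools

# The NSPK narration as data: (sender role, receiver role, message term)
NARRATION = [
    ("A", "B", ("encp", ("pair", "A", "Na"), "KB")),
    ("B", "A", ("encp", ("pair", "Na", "Nb"), "KA")),
    ("A", "B", ("encp", "Nb", "KB")),
]

_fresh = itertools.count()

def instantiate(narration, binding):
    """Map role names to concrete agents and keys; nonces are fresh per session."""
    session = dict(binding)
    session["Na"] = "n%d" % next(_fresh)
    session["Nb"] = "n%d" % next(_fresh)
    def sub(t):
        return tuple(sub(x) for x in t) if isinstance(t, tuple) else session.get(t, t)
    return [(session[snd], session[rcv], sub(msg)) for snd, rcv, msg in narration]

roles = {"A": "alice", "B": "bob", "KA": "k_alice", "KB": "k_bob"}
s1, s2 = instantiate(NARRATION, roles), instantiate(NARRATION, roles)
assert s1[0][2] != s2[0][2]   # two sessions of the same roles use distinct nonces
```

Two sessions played by the same agents thus differ only in their nonces, which is precisely what makes the interleaving of sessions interesting for analysis.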
Personal work:
We present in Chapter 6 how these narrations can be given an operational
semantics. The languages we have developed in the course of the Avispa
and Avantssar projects did not need such developments, given that the
modeler of a protocol in HLPSL [64] or ASLan V.2 also has to specify
the internal actions of the roles. Though it is often tedious to write such
specifications, the language aims at a greater accuracy of the protocol
model. We note that more recent works such as [163] step back from this choice
and return to simpler models.
2.1.4 Security Properties
Generally speaking [83], one can distinguish two kinds of properties for programs
such as protocols:
• properties, which are defined by a set of possible executions of the protocol;
• hyper-properties, which are defined by a set of sets of possible executions
of the protocol.
Our work principally focuses on the properties of protocols such as:
• Secrecy, i.e. determining whether one of the messages exchanged can be
constructed by an attacker;
• Authentication, i.e. determining whether the principals accept only the
messages originating from the participants listed in the narration.
Example 1. The simplified [147] version of the Needham-Schroeder Public Key
protocol (NSPK) [166] exhibits vulnerabilities with respect to both secrecy and
authentication. Whereas at the end of their respective executions A and B should
be assured of having engaged in a conversation with one another and that the
nonces Na and Nb are kept secret, Lowe [147] found the following attack:
A    → I    : encp(⟨A, Na⟩, KI)
I(A) → B    : encp(⟨A, Na⟩, KB)
B    → I(A) : encp(⟨Na, Nb⟩, KA)
I    → A    : encp(⟨Na, Nb⟩, KA)
A    → I    : encp(Nb, KI)
I(A) → B    : encp(Nb, KB)
In this attack A starts a legitimate instance of the protocol with an intruder, i.e.
a dishonest agent I. This intruder then masquerades as A—the corresponding
events are denoted I(A)—and initiates a session with B. B responds as if he
were talking to A, and successfully ends his part of the protocol. However, in
the course of his protocol instance B has accepted messages issued by I instead
of A, hence an authentication failure. Furthermore, the nonces Na and Nb, which
B believes to be a common secret shared with A, are actually known to
I, hence a secrecy breach.
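The secrecy breach can be replayed mechanically: close the set of messages the intruder observes under pairing projection and decryption with known keys. A minimal Dolev-Yao-style sketch, where the term encoding and rule set are simplified for illustration:

```python
def saturate(knowledge):
    """Close a set of terms under projection and decryption with known inverse keys."""
    kn = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(kn):
            derived = set()
            if isinstance(t, tuple) and t[0] == "pair":
                derived |= {t[1], t[2]}                      # projections
            if isinstance(t, tuple) and t[0] == "enc" and ("inv", t[2]) in kn:
                derived.add(t[1])                            # decryption
            if not derived <= kn:
                kn |= derived
                changed = True
    return kn

# What the intruder I holds and observes during Lowe's attack
observed = {
    ("inv", "KI"), "KA", "KB",                   # I's private key, public keys
    ("enc", ("pair", "A", "Na"), "KI"),          # first message, sent by A to I
    ("enc", ("pair", "Na", "Nb"), "KA"),         # B's reply, forwarded blindly
    ("enc", "Nb", "KI"),                         # A's final message to I
}
kn = saturate(observed)
assert "Na" in kn and "Nb" in kn                 # both "secret" nonces are derived
```

Note that the message encrypted under KA is never opened by I; it suffices that A, deceived about its origin, decrypts and re-encrypts its content under KI.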
Personal work:
Until recently I have worked only on the security analysis of properties
such as secrecy and authentication. However, in a recent line of work
I also consider the problem of security analysis w.r.t. the equivalence
of protocols. This notion is employed to reason about anonymity, e-voting
protocols, the abstraction of a perfect primitive by a concrete one, and so on.
Chapter 10 includes these results, which are related to the refutation of
cryptographic protocols.
2.1.5 Formal methods
We have worked on the formal analysis of cryptographic protocols. This means
that given a specification such as a narration, we build a logical model of the
protocol and its environment consisting of three parts describing respectively:
• the possible actions of agents behaving as prescribed by the roles in the
protocol;
• the possible actions of an attacker in the setting considered;
• the property we want to verify.
The parallel execution of the roles and of the intruder is interpreted by a conjunction.
Two types of logical analysis can then be performed:
Validation: one proves that the property is logically implied by the specifications
of the protocol and of the intruder;
Refutation: one constrains the logical specifications, e.g. by imposing an initial
state, bounding the number of possible instances of the protocol, etc., and
proves that under these restrictions the property is not logically implied
by the specifications of the protocol and of the intruder.
When we fail to refute a protocol, we can only conclude that under the constraints
imposed there is no attack. Of course this does not mean that there is
no attack when weaker constraints, or none, are imposed. Let us review some
of the constraints routinely imposed:
Isolation: no protocol is executed concurrently with the one under scrutiny.
While unrealistic, this assumption, or some weaker version of it, is needed
given that for any protocol P one can construct a protocol P′ [132] such
that, when P′ is executed concurrently with P, the attacker can discover
a secret message exchanged in P. While this result is theoretical, as the
second protocol has to be constructed from the first one, such attacks also
often occur in practice [91].
In [50, 19] the isolation assumption is weakened into assuming, in some
form or another, that no other protocol executed concurrently uses the
same cryptographic data. Concerning symbolic analysis of protocols, one
can find in [163] similar assumptions employed to obtain the soundness
of the composition of transport protocols. Other similar conditions for
sequential or parallel composability can also be found in [10, 88], and
others can be traced back to the non-unifiability condition initially
introduced for the decidability of secrecy in [185].
Soundness: the properties of cryptographic primitives are usually [119, 115,
184] expressed by games in which an intruder, modeled by a probabilistic
Turing machine, cannot in a reasonable amount of time have a significant
gain over a coin toss. For instance, in IND-CPA games the intruder is
given a public key. He then chooses two messages m0 and m1, and is then
presented with the encryption of either m0 or m1. He wins the game if he
can choose m0 and m1 such that he has strictly2 more than a 50% chance
of guessing the right answer.
While there are some attempts [23, 24] to directly interpret the constructions
on messages in terms of probability distributions, the usual lifting
of these properties into a symbolic world is problematic given that they
express what the intruder cannot do, whereas symbolic analysis rests
on a description of what the intruder can do. We present how the translation
from the concrete cryptographic setting to the symbolic world can
be justified in Subsection 2.2.2.
2 The actual condition is even more restrictive, and depends on the length of the key.
Bounds on the instances of the protocol: though in practice the number
of distinct agents that can engage in an unbounded number of sessions of a
cryptographic protocol is a priori unbounded, it has been proved [85] that
if there is a secrecy (resp. authentication) failure in an arbitrary (w.r.t. the
number of sessions and the agents participating in each session) instance
of the protocol, then there is a secrecy (resp. authentication) failure with
the same number of sessions but only 1 (resp. 2) distinct honest agents,
in addition to the intruder, instantiating the roles of the protocol.
Furthermore, Stoller [200, 201] remarked that essentially all “standard”
protocols either have a flaw found when examining a couple of sessions
or are safe. While this cannot be argued for cryptographic protocols in
general [160], this remark led to the refutation-based methods in which
one only tries to find an attack involving a couple of distinct instances
of the protocol. We present in more detail in Section 2.3 the history of
refutation with a bounded number of instances of the protocol.
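The IND-CPA game mentioned under the Soundness constraint can be sketched concretely. The toy "cipher" below is deterministic, so an adversary who queries the encryption of m0 once wins every round; all names and the framing are illustrative, not the exact formalization used in the cryptographic literature:

```python
import secrets

KEY = secrets.randbits(32)

def toy_encrypt(m):
    """A deterministic toy cipher: equal plaintexts give equal ciphertexts."""
    return m ^ KEY

def ind_cpa_game(adversary):
    """Challenger picks a bit b, encrypts m_b; the adversary must guess b."""
    m0, m1, guess = adversary(toy_encrypt)
    b = secrets.randbits(1)
    return guess(toy_encrypt((m0, m1)[b])) == b

def adversary(encrypt):
    c0 = encrypt(0)                     # chosen-plaintext query
    return 0, 1, lambda c: 0 if c == c0 else 1

# Determinism breaks IND-CPA: this adversary always guesses correctly,
# instead of the roughly 50% success a secure scheme would force
assert all(ind_cpa_game(adversary) for _ in range(100))
```

This is why IND-CPA-secure schemes must be randomized, and why the symbolic abstraction of encryption as an opaque function is only sound under the kinds of restrictions discussed in Subsection 2.2.2.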
2.2 Validation of Cryptographic Protocols
2.2.1 Validation in a symbolic model
Validation of cryptographic protocols is usually performed under the assumption
that the protocol is executed in isolation, this assumption being justified by the
work on soundness w.r.t. the concrete cryptographic setting described in
Subsection 2.2.2. Under this isolation hypothesis, validation of a protocol amounts
to proving that for any number of parallel instances of the protocol, each instance
provides the guarantees claimed by the protocol. This problem is usually treated
by translating the descriptions of the intruder and of the honest agents into sets
of (usually Horn) clauses, and by reducing the problem of the existence of an
attack to a satisfiability problem.
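This translation can be illustrated on a toy example: facts att(t) state that the attacker knows the term t, Horn clauses encode deduction and protocol steps, and the existence of an attack reduces to the derivability of a designated atom. The sketch below works on ground (variable-free) clauses only; real tools such as ProVerif handle clauses with variables via resolution-based saturation:

```python
def saturate(facts, rules):
    """Forward chaining on ground Horn clauses given as (premises, conclusion) pairs."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

facts = {"att(k)", "att(enc(s,k))"}                # the attacker's initial knowledge
rules = [
    (["att(enc(s,k))", "att(k)"], "att(s)"),       # ground instance of decryption
    (["att(s)"], "bad"),                           # the secrecy property to refute
]
assert "bad" in saturate(facts, rules)             # an attack is derivable
```

Unsatisfiability of the clause set together with the negated property corresponds here to the derivability of "bad": the protocol fails the secrecy goal.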
This approach is successful in practice, see for example the ProVerif tool
by B. Blanchet [38], and some decision procedures have also been obtained. The
satisfiability of sets of clauses in which each clause either has at most one variable
or at most one function symbol is decidable [84]; a NEXPTIME bound is given in [194,
195]. This problem is DEXPTIME-complete if all the clauses are furthermore
Horn clauses. The class of sets of clauses was later extended to take blind
copy into account [90] while preserving decidability.
It was also extended to take into account the properties of an exclusive
or [196]. While that article also proves that adding an abelian group addition
operation leads to undecidability, the exclusive or was implemented in ProVerif
in [137], and the decidability of some particular cases, including some group
protocols, was proven.
2.2.2 Soundness w.r.t. a concrete model
Validation of a cryptographic protocol is done w.r.t. a given attacker model.
However, there is no assurance that the modeled attacker is as strong as an
attacker who can take advantage of the precise arithmetic relations between the
messages, the keys, and so on. For example, the Pollard ρ method [182] is based
on the computation of collisions (different products having the same result) in
a finite group and significantly speeds up the factorization of some integers. We
thus have a discrepancy between the symbolic analysis of cryptographic primitives,
which is conducted independently from the actual values of the messages
exchanged and the keys, and the analysis in the concrete setting, in which the
attacker has access to the actual values of the messages and the keys, this
additional information opening the possibility of additional attacks on a
protocol.
There has been a lot of work trying to relate concrete settings to symbolic
ones, starting with [177]. As demonstrated by e.g. [50], finding a good setting is a
difficult and error-prone task. However, more recent works such as [19, 138, 139]
have provided sound and usable definitions and cryptographic settings. If one
accepts the restrictions on the usage of cryptographic protocols and of keys
imposed by these settings, there exists a cryptographic library that hides the
concrete values of the keys by imposing the use of pointers instead of real data,
and such that every useful manipulation on messages can be performed by calls
to this library.
2.3 Refutation of Cryptographic Protocols
2.3.1 Advantages over validation
Validation of cryptographic protocols is undecidable even in the simplest settings,
in which perfect cryptography is employed, the protocol is executed in isolation
from other protocols, and either only a finite number of distinct values is
exchanged or some typing system ensures that the complexity of the messages
is bounded. Furthermore, the soundness of a validation procedure is hard to
establish: though one can prove that in a given symbolic model there is no
attack on a protocol, this result does not necessarily translate into the validation
of a concrete version of the protocol, as described in Subsection 2.2.2.
However, when trying to refute a protocol, the translation to the concrete
level is simpler, as it suffices to prove that any action performed by the attacker
in the symbolic model can be translated into an action of an attacker in the
concrete model. Also, the restrictions imposed on protocols to ensure the
decidability of their validation are usually too strong for real-life case studies.
These reasons motivated the refutation of cryptographic protocols under
constraints: instead of trying to prove that a protocol is valid one tries to dis-
cover an attack when additional constraints on the protocol are imposed. In
accordance with the observations by Stoller [200, 201] the most common con-
straint consists in: a) bounding the number of messages the honest participants
can receive; and b) forcing each participant either to accept a message or to abort
his execution of the protocol. These assumptions can be translated in terms
of processes by imposing that the honest participants are modeled by processes
without loops and in which the “else” branch of the conditional is always an
abort. Usually one further imposes that the tests in the conditional must be
(conjunctions of) positive equality tests. Another common restriction consists
in bounding the complexity of the terms representing the messages.
Under these assumptions it is possible to devise decision procedures for the
refutation of cryptographic protocols w.r.t. a model of the attacker. When
conducting such an analysis one first has to provide the reader with a message
and deduction model, and only then can one present a decision procedure w.r.t.
these models. In more detail:
Message model: Messages are modeled by first-order terms, i.e. finite recur-
sive structures defined by the application of function symbols to terms and
by constants. The first task in protocol refutation consists in defining the
properties of these functions. For instance one should model that a bitwise
exclusive-or operation ⊕ is commutative, i.e. for every messages x and y
the equality x ⊕ y = y ⊕ x holds;
Deduction model: Then one has to model how the attacker can use messages
at his disposal to create new ones. This is usually done by assuming
that the intruder can apply (a subset of) the symbols employed to define
the messages to construct new messages. For example an asymmetric
encryption algorithm can be employed by the intruder to construct new
messages, but the sk( ), pk( ) symbols, employed to denote the public and
private keys, cannot be employed by the intruder to construct new keys;
Decision procedure: Finally one searches a decision procedure applicable to
all finite message exchanges where the messages are as defined in the first
point when attacked by an intruder having the deduction power as defined
in the second point.
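As a concrete illustration of the interplay between the message and deduction models, the following sketch decides ground deducibility for a small Dolev-Yao style system with pairing and symmetric encryption. The term representation (tagged tuples) and the restriction to decryption keys that appear directly in the analyzed knowledge are simplifying assumptions of this sketch, not the setting of the procedures cited below.

```python
# Terms: constants are strings; ("pair", a, b) and ("enc", m, k) are compound
# terms built with the pairing and symmetric-encryption symbols.

def analyze(knowledge):
    """Close a set of messages under the destructor rules:
    projection of pairs, and decryption when the key is known."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if not isinstance(t, tuple):
                continue
            if t[0] == "pair":
                new = {t[1], t[2]}
            elif t[0] == "enc" and t[2] in known:
                new = {t[1]}
            else:
                new = set()
            if not new <= known:
                known |= new
                changed = True
    return known

def deducible(target, knowledge):
    """Can the intruder build `target` from `knowledge` by first decomposing
    and then recomposing with the pairing and encryption symbols?"""
    known = analyze(knowledge)
    if target in known:
        return True
    if isinstance(target, tuple) and target[0] in ("pair", "enc"):
        # Constructors are public: the intruder may apply them to known parts.
        return deducible(target[1], knowledge) and deducible(target[2], knowledge)
    return False
```

For instance, from the key `"k"` and the ciphertext `("enc", ("pair", "s1", "s2"), "k")` the intruder can derive `"s1"`, but not an unrelated constant `"s3"`.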
Since we attempt to refute protocols the soundness of the message and de-
duction models is more important than their completeness. Forgetting some
possible equalities or deductions may lead to inconclusive analysis (stating that
no attack is found under the current hypotheses), but having unsound equal-
ities or deductions could lead to false positives, i.e. a valid protocol could be
declared as flawed.
2.3.2 Personal Work on the Refutation of Cryptographic
Protocols
During my PhD I have worked on the refutation of cryptographic protocols
when the number of messages exchanged among the honest agents is bounded.
In collaboration with Laurent Vigneron, I first extended Amadio and Lugiez’s
decision procedure [8] to take into account the case of non-atomic secret keys
and implemented it in daTac [78]. Then we have presented an abstraction of
the parallel sessions of a cryptographic protocol [77, 79] in which it is possible
to validate strong authentication, in contrast with other existing abstractions
(e.g. [41]) in which replay attacks cannot be detected. This abstraction is based
on a saturation of the protocol rules modeled as clauses, and on the extension of
the intruder’s deduction capacities with these so-called “oracle” rules, instead
of simply checking the property in the saturated set of rules. Then, and before
I finished my PhD, I worked with R. Küsters, M. Rusinowitch, and M. Turuani
on the extension of the complexity result obtained in the case of perfect
cryptography [190, 144] to the cases in which an exclusive-or [68, 61], an expo-
nential for Diffie-Hellman [69, 62], commutative asymmetric encryption [60, 62],
or oracle rules [63] were added to the standard set of intruder deduction rules.
I finally presented a lazy constraint solving procedure [56] that extends the one
in [78] to protocols in which an exclusive-or symbol appears. This procedure
was implemented in CL-AtSe [208] by M. Turuani and M. Tüngerthal with some
further optimization on the exclusive-or unification algorithm [207].
This series of results was however unsatisfactory given that there was no
result on the decidability of refutation when e.g. both an exponential and an
exclusive-or appear in the protocol. In collaboration with M. Rusinowitch we
have considered the problem of the combination of decision procedures for refu-
tation, and presented a solution [70, 76] that reduces the refutation of protocols
expressed over the union of two disjoint sets of operators and with ordering re-
strictions to problems of refutation in individual signatures with the same kind
of ordering constraints. We later extended this result to well-moded but non-
disjoint union of signatures in [71, 72]. In [11] the authors build upon the first
combination result to obtain a similar one on the combination of static equiv-
alence decision procedures, while [157, 136] obtain similar conditions for the
combination on non-disjoint signatures, and [47] extends it to take into account
some specific properties of homomorphisms. Finally let me mention that the
well-moded constraint is rather general and intuitive, given that it was defined
to model the properties of exponential w.r.t. the abelian group of its exponents,
but was also employed in [97] to model the relationship between access control
and deductions on messages in PKCS#11.
When Mounira Kourjieh began her PhD under my supervision, we started
to work on a novel research direction. As explained above, the traditional
research on the relation between concrete and symbolic models of cryptographic
primitives is based on the establishment of a set of assumptions on the use of
these primitives and on the management of the keys, and in proving that under
these assumptions one can build a complete symbolic model such that, if there
is no flaw on the symbolic level then there is no flaw on the concrete level. We
remark that:
• the approach may be too restrictive for real-life protocols, as it requires
e.g. that the keys are created and managed by a trusted entity—the
cryptographic library;
• the soundness of validation in the symbolic model is hard to establish
given that one has to account for all the possible actions of the attackers.
This is in contrast with the soundness of refutation for which one only has
to prove that the actions described in the symbolic setting are feasible in
the concrete setting.
For these two reasons we have tried to model the weaknesses of the cryptographic
primitives when no assumption is made on the keys creation and management:
instead of restricting the concrete level to make it fit a symbolic model we
have instead augmented the symbolic model to take into account the known
attacks on the concrete primitives. We have achieved decidability results for
signatures in the multi-user setting [58] and the decidability3 of the refutation
for hash functions for which it is feasible to compute collisions [57]. This work
is presented in more details in Chapter 8.
3 Under the assumption that the combination result of [71] on deduction systems also holds
on extended deduction systems.
Chapter 3
Web Services
As a continuation of my work on cryptographic protocols I have
begun research on Web Services when I arrived in Toulouse
in 2004. While at first they were simply viewed as crypto-
graphic protocols exchanging XML messages, this very active
area turned out to be the source of a variety of research prob-
lems related to the modeling of the access control policy and
of the workflow of Business Processes. Also of interest is the
emerging development of modular methods for the validation of
Web Services. We introduce in this chapter Web Services with
a short historical introduction, followed by a description of the
aspects of concern to my research. I conclude it with a summary
of my research on this topic.
3.1 Web Services
3.1.1 Basic services
The usual characterization1 of a Web Service defines it as an appli-
cation that communicates with remote clients using the HTTP [114] transport
protocol. The principle of having applications executed on a server computer
and used by remote clients is not an original one, as it was already present in Sun's
mid-90s motto “The Network is the Computer”. However the first implementations
were impractical, for several reasons:
• Sun’s proposal was to code all the applications in Java to ensure inter-
operability.
• The Corba2 framework aimed at independence from Java, but suffered
from the choice of a binary encoding of data (which implies the difficulty
1 This historical discussion is based, among other sources, on http://www.ibm.com/
developerworks/webservices/library/ws-arc3/.
2 Common Object Request Broker Architecture.
for different vendors to provide interoperable solutions) and of a dedicated
transport protocol called IIOP [159] that imposes constraints on the pro-
grammer and limits interoperability to platforms understanding it;
These limitations have not prevented Java and Corba from being successful
in closed environments, but they were too strong for the overall adoption of these
solutions for client/server communications.
Given the workforce needed to specify, standardize, and implement inter-
operably a protocol on a variety of platforms, a natural choice for the transport
protocol was to rely on an off-the-shelf widely implemented protocol. HTTP
stood out among other possibilities because a) it is an open protocol, and
b) client interfaces are already provided by existing Web browsers, and c) these
Web browsers also already support scripting languages, and d) its traffic is in
most cases not blocked by firewalls. Furthermore, when employed in combina-
tion with the TLS [102, 103] protocol it provides the basic security guarantees
of server authentication and confidentiality. One usually differentiates between
SOAP and REST Web Services. The former are based on SOAP, an application-
level transport protocol that relies on the POST and GET HTTP verbs. In addition
to these verbs, REST Web Services also use PUT and DELETE, but do
not need the extra abstraction provided by the SOAP protocol.
Another characterization of Web Services (starting from WSDL 2.0 [187]) is
the description of an available service in the Web Service Description Language.
This is a language in which the individual functionalities, called operations, are
advertised together with a description of their input and output messages, as well
as a description of how one can connect to the service. An important point
is that for Web Services described in WSDL, HTTP is not the only possible
transport protocol. Originally WSDL [81] was designed to describe Web Services
communicating using the SOAP [120] protocol, an application-level protocol
originally running on top of HTTP. Bindings of SOAP to other protocols such
as JMS or SMTP have since been defined, and with WSDL 2.0 the application-
level transport protocol is not necessarily SOAP anymore.
Example 2. The Amazon S3 (Simple Storage Service)3 provides users with a
storage space as well as with operations enabling the user to set an access control
policy on her files and to add, view, and remove files from the store. It is available both
in the REST style and in the SOAP style.
Model. In the rest of this document we consider an abstraction of Web Ser-
vices in which the exact transport protocol employed is irrelevant, assuming
that one could describe more precisely the messages whenever one wants to
consider the exact binding employed. As a result, a Web Service is akin to a
role specification in which request/response pairs of messages are defined, but
without necessarily constraints on the order in which the requests are received.
3 API description available at url http://docs.amazonwebservices.com/AmazonS3/latest/
API/.
3.1.2 Software as a Service
WSDL defines which functionalities a service offers as well as how one com-
municates with the service. However, since their inception, Web services have
gradually turned from remotely accessible libraries to full-fledged applications.
The general idea is to transform existing applications, or create new ones, by
writing independent software components and by establishing communication
sequences between these components. The goal is to:
• ease the deployment of new applications and the development of new com-
ponents;
• ease changes in an application by confining each functionality to a single
component;
• rely on the fact that each component is remotely accessible to gain flexi-
bility on the hardware infrastructure, i.e. the actual computers running
the components, for example by relying on a Web server to dispatch a
request to the computer on which the application is deployed.
The separation into atomic components necessitates a way to glue these com-
ponents into applications. This glue is called a business process, and is written
in a language in which, besides the usual assignments, conditionals, and loops
constructs, there exists basic constructs to invoke a remote service. Some of
these languages are scripting languages such as python or Ruby, but we have
chosen to focus on BPEL [128] Business Process Execution Language because
of its natural integration in the WSDL description of a service: services in-
voked are referenced using their WSDL description, and the process itself can
be advertised by publishing a WSDL description of it.
A current trend is also to employ Web Services to outsource the computers on
which a corporation's applications are executed. That is, the services are not hosted
on a computer belonging to the corporation but on computers provided by a
third party, who in return receives payment according to the resources
used by the applications. A merit of this cloud computing approach is the
low initial cost of deploying services as well as the reduced uncertainty
about the running cost/customer ratio, a crucial benefit in today's economic
environment.
Model. When analyzing the security of a Web Service, we simply model Busi-
ness Processes with an ordering on the possible input and output messages. But
when considering the access control policy of services we introduce a process de-
scription language which is a simplified version of BPEL, see Chapter 7.
3.1.3 Security Policies
In general terms, a policy controls the possible invocation of the operations of
a service, such as its Quality of Service, or its business logic. In a framework
such as JBOSS, even the business process can be encoded as a policy over the
acceptable requests. Instead of analyzing policies in general, we focus on two
types of security-related policies:
• the message-level security policy, which expresses how the data transmit-
ted to and from the service has to be cryptographically secured;
• the access control policy, which is expressed at the level of the application
and expresses when an invocation is legitimate.
Message Protection
There are two main ways to secure the communications of a service with its
partners: a) to impose that the transport protocol must be secured, and b) to
impose the usage of cryptographic primitives to protect the sensitive parts of
the transmitted messages.
Given that there exist secure transport protocols such as TLS, one could
wonder why one would need to further protect the messages. The main moti-
vation for this extra protection is the fact that the protection provided by TLS
is a point-to-point one, whereas complex service interactions depend upon end-
to-end security. A simple example is the payment of an item purchased
on the Internet. One does not necessarily trust the e-commerce web site enough to
send it one's credit card information, even though this information has to be transmitted
to the bank to complete the transaction. Thus the client has to send the
e-commerce web site her credit card information cryptographically protected in
such a way that: a) this web site will be able to employ the protected data to
complete the transaction with the bank, but also b) this web site will not be
able to derive the credit card information from the data. Other applications include
digital contract signing, electronic bidding, etc.
Model. Cryptographically protected messages are simply cryptographic pro-
tocol messages. When analyzing access control policies, which rely on the pay-
load of messages rather than on the cryptography employed to secure the mes-
sages, we partially abstract the message layer by simply assuming that the
payload is either signed, encrypted, or both, or none, by a user and that the
transport protocol is either secured or not. See Chapter 7.
Authentication–Assertion–Authorization
Access control consists in determining whether a given entity has the right,
under the actual known circumstances, to perform a given action on a protected
object. Access control rules emit opinions on whether the access should be
granted or denied, and an access control policy gathers these opinions and uses
a policy combination algorithm to grant or deny the access to the resource. A
rule is said to be applicable on a request if it emits a grant or deny opinion.
In the most simple form rules are totally ordered, and the opinion of the first
applicable rule is the resulting opinion of the set of rules, but other combinations
algorithms can be found e.g. in [173].
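The first-applicable combination algorithm just described can be sketched as follows. The rule representation (functions returning "grant", "deny", or None for "not applicable") and the default-deny fallback when no rule applies are assumptions of this illustration, not mandated by [173].

```python
def first_applicable(rules, request):
    """Combine a totally ordered list of rules: the opinion of the first
    applicable rule (one that returns "grant" or "deny") is the policy's
    decision. Default-deny when no rule is applicable (an assumption)."""
    for rule in rules:
        opinion = rule(request)
        if opinion in ("grant", "deny"):
            return opinion
    return "deny"

# Two hypothetical rules: deny guests outright, allow anyone to read.
deny_guests = lambda req: "deny" if req["role"] == "guest" else None
allow_read = lambda req: "grant" if req["action"] == "read" else None
policy = [deny_guests, allow_read]
```

With this ordering a guest's read request is denied (the first rule applies and shadows the second), while a staff member's read request is granted.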
Expressibility. Just as Object Oriented programming simplifies the manage-
ment of objects by organizing them in a hierarchy, a lot of research on access
control is focused on the simplest ways to write rules that are both sound w.r.t.
desired policies and easily writable and understandable. In this line we note
the RBAC (Role Based Access Control ) framework proposed by Ferraiolo and
Kuhn [113] that organizes individuals according to the administrative role they
have (doctor, visitor, etc.) together with a role hierarchy that defines the inher-
itance of the permissions of a junior role r by a senior role r′. Access control decisions
are based uniquely on the role played by the requester, on the action, and on
the object in the request. OrBAC [129] refines this model by introducing a hi-
erarchy of contexts in which a request has to be analyzed as well as a hierarchy
on objects. These models often yield very simple policies but at the expense of
expressibility. For example in pure RBAC it is not possible to express that the
same individual, regardless of her role, shall not perform two different actions in
the same execution context (this is called dynamic separation of duty). On the
other side of the spectrum, ABAC (Attribute-Based Access Control ) provides
no hierarchy, and the decision is based solely on the values of a set of attributes
extracted from the request and from the environment. This implies that every
aspect that can influence an access control decision has to be modeled by a
valued attribute, and thus that this type of access control system, while being
able to express any kind of policy, is hard to deploy and manage. Its versa-
tility nonetheless made it the system of choice for Web Service access control
systems such as XACML [173], especially in the currently developed XACML
3.0 version, with its WS profile [9].
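A minimal sketch of the RBAC decision described above, with permissions inherited by senior roles along the role hierarchy; the role and permission names are hypothetical.

```python
def inherited_roles(role, seniors_to_juniors):
    """All roles whose permissions `role` inherits: the role itself plus the
    reflexive-transitive closure of the seniority relation."""
    seen, stack = set(), [role]
    while stack:
        r = stack.pop()
        if r not in seen:
            seen.add(r)
            stack.extend(seniors_to_juniors.get(r, []))
    return seen

def permitted(role, action, obj, hierarchy, perms):
    """Pure RBAC decision: based solely on the requester's role, the action,
    and the object, via the inherited permission sets."""
    return any((action, obj) in perms.get(r, set())
               for r in inherited_roles(role, hierarchy))

# Hypothetical example: a doctor is senior to generic staff.
hierarchy = {"doctor": ["staff"]}
perms = {"staff": {("read", "schedule")},
         "doctor": {("write", "prescription")}}
```

Here a doctor inherits the staff permission to read the schedule, while staff cannot write prescriptions. Note that nothing in this model can express the dynamic separation-of-duty constraint mentioned above: the decision never consults the execution history.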
Layered model of Access Control. A layered model has emerged over the
years from the industry best practices as well as from the availability of dedicated
systems. Access control in distributed systems is now viewed as consisting in
three interacting components:
Authentication: the first phase is implemented in applications such as Shib-
boleth and consists in the authentication of users. I.e., a user has to
authenticate to one such server using e.g. his login and password or a
more complex authentication protocol, and once the authentication con-
straints imposed on the server are satisfied (e.g. the user has provided a
valid certificate authenticating his signature verification key and has re-
sponded successfully to a challenge-response protocol) the server issues
a token that can be employed by the user to prove his identity to other
services. Alternatively, in the case of SAML Single Sign-On, the server
will authenticate the user to other services.
Assertions: once the user is identified he can negotiate with security services to
obtain assertions that qualify him. For example a user can use his identity
to activate a role and thereby obtain a role membership credential. This
credential can then be employed to gain new ones expressing permissions
associated with this role.
Authorization: Finally, when trying to execute an action on a resource, the
user decorates his request with the necessary credentials, and an autho-
rization decision is taken based on the value and origin of the provided
attributes.
Model. Given that we are less interested in a user-friendly access control
system than in the analysis of the access control policy of a set of Web Services
we have adopted a formal model of attribute-based access control. We have
abstracted away the authentication phase by using secure channels providing
authentication, and are left with the modeling of the assertion collection part
and of the authorization part of access control. We present in Chapter 7 a
comprehensive model of a distributed access control system for Web Services
where the rules are furthermore modeled as Horn clauses.
3.2 Results achieved in the domain of Web Ser-
vices
I have collaborated with Marwa El Houri, a PhD student I supervised, and
Philippe Balbiani on the definition of a formal model for the analysis of Web
Services [110]. Our final proposal consists in modeling each component in a
Web Service infrastructure by a communicating entity, i.e. an agent that has:
• a store that permits modeling a memory, a database, the history of the
service, etc.;
• a trust negotiation policy that indicates which credentials the entity is
ready to share with which other entities on which kind of channel;
• a workflow, which consists in a set of tasks. Tasks are recursively defined,
and an authorization rule controls each invocation of a task.
Depending on which part of an infrastructure (a database system, a human agent, a
trust negotiation engine, or a Business Process Engine) is modeled by an entity,
some of the above parts may be empty.
This model permits us to seamlessly encode Role Based Access Control with
(dynamic) separation or binding of duties constraints as well as advanced fea-
tures such as all surveyed kinds of delegation [110]. We have also enriched it
with cryptographic primitives and secure channels to enable the validation of a
given set of entities w.r.t. untrusted users [110].
In collaboration with Mohammed Anis Mekki—a PhD student I co-supervise
with M. Rusinowitch—and M. Rusinowitch we have considered the choreogra-
phy problem for a set of services. This problem consists in building, given a
finite set of available services, an orchestrator that communicates with these
services to achieve a given goal. I detail this work in Chapter 9. Also presented
in that chapter is the work in collaboration with Tigran Avanesov, M. Rusi-
nowitch and Mathieu Turuani on the choreography problem for services which
consists in, again given a set of available services and a goal, computing se-
quences of communication for each of the available services such that the goal
is satisfied at the end once every participating service has ended its sequence of
communication.
Chapter 4
Fundamentals of
First-Order Logic
We introduce in this chapter the formalism and notions that will
be employed in the rest of this document. This chapter is aimed
at presenting first-order logic with an emphasis on resolution,
and should be read as a basis for a course on first-order logic ori-
ented towards resolution and its applications. This focus means
that significant though unrelated notions are lacking. The in-
terested reader can find in particular complements on sequent
calculus and semantic tableaux in [94].
This chapter ends with the definition of equational theories, a
more advanced concept that we need to analyze cryptographic
protocols. In particular we extend the unification notions intro-
duced together with resolution to unification modulo an equa-
tional theory. We also prove a few important facts on equational
unification.
4.1 Facts, sentences, and truth
4.1.1 Reasoning on facts
Consider the following sentences:
• It is summer or the temperature is cold;
• It is not summer or the weather is rainy.
We rely on the excluded-middle law1 which states that a fact can only be true or
false. As a consequence we can reason on the possible truth value of the fact “It
1 In Scottish courts the result of a criminal prosecution can be either proven (meaning
guilty), not proven, or not guilty. In this case we can have at the same time that the result
of the prosecution is not “proven” and is not “not proven”. Beyond the anecdote, logics with
no excluded-middle law (intuitionistic logic, linear logic, . . . ) have been employed fruitfully
to reason about the existence of a proof of a theorem, a proof of the negation of a theorem,
and the absence of proof for both a theorem and its negation.
is summer”. If it is true then the fact “It is not summer” must be false. Since
the second sentence is true one can deduce that the weather is rainy. But it may
also be the case that the fact “It is summer” is false. Since the first sentence is
true we must then have that the temperature is cold. As a conclusion of these
two sentences, either the temperature is cold or the weather is rainy.
Generally speaking, if A, B1 , . . . , Bn , C1 , . . . , Ck are facts, and the sentences:
• A or B1 or . . . or Bn ;
• not(A) or C1 or . . . or Ck .
are true, then if A is true, not(A) must be false, and thus C1 or . . . or Ck is
true since the second sentence is. Symmetrically if A is false we must have B1
or . . . or Bn because the first sentence is true. This reasoning is sound since if
the assumptions are true then the conclusion must be true.
This reasoning can also be conducted if there is no alternative in one of the
sentences. Assume the following two sentences are true:
• It is day or it is night;
• It is not day.
One ought to conclude that it is night. Another special case is when there is no
alternative in both sentences. For instance assume the following two sentences
are true:
• It is day;
• It is not day.
By following the general scheme given above we deduce that a sentence with
no facts must be true. But common sense also tells us that the assumption
that both sentences are true does not hold: a fact and its negation cannot be
both true. We reconcile these two conclusions by imposing that a sentence
with no facts must always be false, and rely on the soundness of our deduction
mechanism to deduce (by contrapositive reasoning) that if the conclusion is
false then one of the premises must be false. In this case, i.e. when in a set of
sentences at least one must be false whatever truth value is chosen on the facts,
we say that this set is inconsistent.
The case-based reasoning on sentences illustrated above is called resolution.
It was introduced by Robinson [3] as a reasoning mechanism for the whole of
first-order logic, in which one can e.g. axiomatize Zermelo-Fraenkel set theory.
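For propositional sentences the case-based reasoning above is easy to mechanize. In the sketch below a sentence is a frozenset of literals (a negative literal is a `("not", fact)` pair); saturating a clause set under the resolution rule and deriving the empty clause witnesses inconsistency, as in the day/night example. This is a naive illustration, not an efficient prover.

```python
def negate(lit):
    """not(A) for an atom A, and A for not(A)."""
    return lit[1] if isinstance(lit, tuple) else ("not", lit)

def resolvents(c1, c2):
    """All clauses obtained by resolving c1 and c2 on a complementary pair."""
    out = set()
    for lit in c1:
        if negate(lit) in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def inconsistent(clauses):
    """Saturate under resolution; the empty clause signals inconsistency.
    Terminates: only finitely many clauses over finitely many literals."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolvents(c1, c2):
                    if not r:          # empty clause: always false
                        return True
                    if r not in clauses:
                        new.add(r)
        if not new:
            return False
        clauses |= new

day_or_night = frozenset({"day", "night"})
not_day = frozenset({("not", "day")})
not_night = frozenset({("not", "night")})
```

From “it is day or it is night” and “it is not day” resolution derives the unit clause “it is night” but no contradiction; adding “it is not night” makes the set inconsistent.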
Outline of this chapter. We begin this chapter with a section on orders,
and review some definitions and properties. Then we define in Section 4.3 the
language employed to describe sentences. We give a semantics to first-order
logic sentences by defining how the language constructs are interpreted. We
present in Section 4.5 some of the mathematical properties of first-order logic,
namely that it suffices to consider finite sets of universally quantified clauses,
where each clause is a disjunction of facts, and that it suffices to consider the
truth in particular interpretations called Herbrand’s interpretations. Then we
present in Section 4.6 a calculus on finite sets of clauses that recognizes the
finite sets of clauses that are always false. We present in Section 4.7 how to
integrate an equality predicate in this setting.
4.2 Orders
4.2.1 Definitions and first properties
Orderings and pre-orderings. A strict ordering < on a set S is a transitive,
anti-reflexive, and anti-symmetric relation on elements of this set. An ordering
≤ is the union of a strict ordering and of the equality relation. An equivalence is
a transitive, symmetric and reflexive relation. A pre-ordering is the transitive
closure of the union of an equivalence relation with a strict ordering.
A strict ordering < on a set S is said to be total whenever for two elements
e1 , e2 ∈ S we have either e1 = e2 , or e1 < e2 , or e2 < e1 . It is said to be well-
founded whenever there is no infinite strictly decreasing sequence e1 > . . . >
en > . . .. These definitions are extended as usual to orderings and pre-orderings.
We call an element e maximal (respectively strictly maximal) with respect to a
set η of elements if for any element e′ in η we have e ⪰ e′ (respectively e ≻ e′).
Extension to sets and multisets. Any ordering ≻ on a set E can be ex-
tended to an ordering ≻set on finite subsets of E as follows: given two finite
subsets η1 and η2 of E we define η1 ≻set η2 if (i) η1 ≠ η2 , and (ii) for every
e ∈ η2 \ η1 there exists e′ ∈ η1 \ η2 such that e′ ≻ e. Given a set, any smaller set
is obtained by replacing an element by a (possibly empty) set of strictly smaller
elements.
Similarly, any ordering ≻ on a set E can be extended to an ordering ≻mul
on finite multisets over E as follows: let ξ1 and ξ2 be two finite multisets over
E. As usual we denote ξ(e) the number of occurrences of e in the multiset
ξ, and we let > denote the standard “greater-than” relation on the natural
numbers. We define ξ1 ≻mul ξ2 if (i) ξ1 ≠ ξ2 and (ii) whenever ξ2 (e) > ξ1 (e)
then ξ1 (e′ ) > ξ2 (e′ ) for some e′ such that e′ ≻ e.
Given a multiset, any smaller multiset is obtained by replacing an occurrence
of an element by occurrences of smaller elements. We call an element e maximal
(respectively strictly maximal) with respect to a multiset ξ of elements if for
any element e′ in ξ we have e ⪰ e′ (respectively e ≻ e′).
If the ordering is total (resp. well-founded), so is its multiset extension.
It is easy to see that in turn this implies that if the ordering is total (resp.
well-founded), so is its set extension.
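The multiset extension just defined can be checked directly from condition (ii); a sketch, assuming multisets are given as Python lists and the base ordering as a boolean function:

```python
from collections import Counter

def multiset_greater(m1, m2, gt):
    """Multiset extension of the strict ordering `gt`:
    m1 >mul m2 iff m1 != m2 and every element occurring more often in m2
    is dominated by some element occurring more often in m1."""
    c1, c2 = Counter(m1), Counter(m2)
    if c1 == c2:
        return False
    for e in c2:
        if c2[e] > c1[e]:
            if not any(gt(ep, e) and c1[ep] > c2[ep] for ep in c1):
                return False
    return True
```

For instance, over the naturals {5} >mul {4, 4, 4}: the single occurrence of 5 dominates any number of occurrences of smaller elements, matching the intuition that a smaller multiset replaces an occurrence by occurrences of smaller elements.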
4.2.2 Orderings on terms and atoms
Lemma 4.1. Let t be a complete simplification ordering over terms, and
assume that a is compatible with t . Then a is:
1. well-founded;
2. monotone;
3. B a A implies Var(B) ⊆ Var(A).
Proof. We recall that the ordering ≻a is compatible with the complete simpli-
fication ordering ≻t and that ≻a is total on ground atoms.
1. Let us prove that ≻a is well-founded. By contradiction, there otherwise
exists an infinite descending chain of atoms A0 ≻a A1 ≻a · · ·. Since the
ordering ≻t is total on terms, by the compatibility of ≻a with ≻t we deduce
that there is an infinite descending chain of terms t0 ≻t t1 ≻t · · · where ti is
a term occurring in the atom Ai. Thus ≻t is not well-founded, a contradiction
with the assumption that ≻t is a complete simplification ordering.
2. Let A, B be two atoms such that B ≺a A. Suppose that A = I(t1, . . . , tn)
and B = I'(s1, . . . , sm). By the compatibility of ≻a with ≻t, for all
i ∈ {1, . . . , m}, there is j ∈ {1, . . . , n} such that si ⪯t tj, and then, by
monotonicity of ≻t, siσ ⪯t tjσ for any substitution σ. Again by the
compatibility of ≻a with ≻t, we deduce that Bσ ≺a Aσ for any σ, and
hence the monotonicity of ≻a.
3. Let A, B be two atoms such that B ≺a A. The compatibility of ≻a
with ≻t implies that for each term tB occurring in B there exists a term
tA occurring in A such that tB ⪯t tA. Since ≻t has the subterm property,
this implies Var(tB) ⊆ Var(tA). We conclude that Var(B) ⊆ Var(A).
4.3 Syntax
We have adopted a bottom-up presentation of the constructions employed to de-
fine the language of first-order logic. We first define the terms in Subsection 4.3.1.
Then we introduce the predicate symbols in Subsection 4.3.3. At this point we
have defined the atoms (called facts in the introduction of this chapter) that are
the basic elements of first-order logic. A formula is an arrangement of atoms
using the logical connectives defined in Subsection 4.3.4. Quantifiers are then
introduced in Subsection 4.3.5 to make the meaning of formulas precise. Finally
we introduce clauses, which are formulas of a special form and correspond to
the sentences in the introduction.
4.3.1 Terms
Definition 1. (Signature) Let F be a finite or denumerable set. A signature α
is a mapping from F to the set ℕ of natural numbers. The image α(f) of an
element f ∈ F is called its arity.
A signature α employed to define terms is called a functional signature. Its
domain is then called a set of function symbols. Given a functional signature α
the constants are the elements e ∈ F of arity 0.
We denote T (α, X ) the set of terms built on a functional signature α and
a denumerable set of variables X . A term is an expression built in finite time
such that:
• constants and variables are terms;
• If t1 , . . . , tn are terms and α(f ) = n then f (t1 , . . . , tn ) is a term.
Given a term t we denote Var(t) (resp. Const(t)) the set of variables (resp.
constants) occurring in t. A term t is ground if Var(t) = ∅.
Example 3. For instance we can choose a functional signature mapping ev-
ery rational number to 0, the symbol “minus” to 2, the symbol “abs” to 1,
and the symbol f to 1. A term in this signature is an expression t such as
abs(minus(x, f(1/2))).
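Terms of T(α, X) admit a direct machine representation. The sketch below uses an illustrative encoding of our own (not the text's): a variable is a string, a constant is any other non-tuple value, and f(t1, . . . , tn) is the tuple (f, t1, . . . , tn). It computes Var(t) for the term of Example 3:

```python
from fractions import Fraction

def variables(t):
    """Var(t): the set of variables occurring in the term t."""
    if isinstance(t, str):                # a variable
        return {t}
    if isinstance(t, tuple):              # f(t1, ..., tn); t[0] is the symbol f
        _, *args = t
        return set().union(*(variables(a) for a in args))
    return set()                          # a constant, e.g. a rational number

# The term abs(minus(x, f(1/2))) of Example 3:
t = ("abs", ("minus", "x", ("f", Fraction(1, 2))))
```

Here `variables(t)` returns `{"x"}`, so t is not ground; a term with no string leaves would be ground.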
4.3.2 Substitutions
A substitution is a function that replaces the variables occurring in a term by
other terms. It can be thought of as similar to an assignment in imperative
languages, since the effect of an instruction:
x := 1
is to replace the value of the variable x with the term 1. However some care
needs to be taken when considering assignments such as:
x := x + 1
since one needs to distinguish the current value of x, employed to compute the
expression on the right-hand side, and the next value of x that will be the result
of the sum.
We avoid such intricacies by imposing that a variable changed by a substi-
tution does not occur in a term in the image of the same substitution. A simple
way to obtain this is to mandate that a substitution must be an idempotent
function, i.e. that applying it twice yields the same result as applying it only
once.
Another point is that we want the application of a substitution to be effec-
tively applicable in finite time. Accordingly we impose on substitutions to be
functions that change only a finite number of variables. There are two ways to
mandate this:
• The first one is to define substitutions as partial functions from variables
to terms, and to impose that they have a finite domain;
• The second possibility is to say that substitutions are total functions but
with a finite support set, i.e. there exists only a finite set of variables x
such that σ(x) ≠ x.
Definition 2. (Substitutions) A substitution σ : X → T (F, X ) is an idempo-
tent function such that the set {x ∈ X | σ(x) ≠ x} is finite.
A substitution σ is ground if σ(x) ≠ x implies that σ(x) is a ground term.
We extend substitutions homomorphically to terms in T (F, X ) by defining:

    σ(t) = σ(t)                       if t ∈ X
    σ(t) = f(σ(t1), . . . , σ(tn))    if t = f(t1, . . . , tn)
Finally we improve the readability of this document by writing the application
of a substitution σ on a term t in the postfix notation tσ. The application of first
the substitution σ and then the substitution τ on t is thus written tστ instead
of τ (σ(t)). Since substitutions are endomorphisms on the algebra of terms, they
can be composed, and the composition is associative.
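Application and composition of substitutions can be sketched in Python (an illustration under our own encoding: variables are strings, compound terms are tuples (f, t1, . . . , tn), and a substitution is a dict with finite support; the idempotency required by Definition 2 is assumed rather than checked):

```python
def apply_subst(t, sigma):
    """Apply the substitution sigma to the term t (written t·sigma in the text)."""
    if isinstance(t, str):                       # a variable
        return sigma.get(t, t)                   # identity outside the support
    if isinstance(t, tuple):                     # f(t1, ..., tn)
        f, *args = t
        return (f, *(apply_subst(a, sigma) for a in args))
    return t                                     # a constant

def compose(sigma, tau):
    """The substitution rho such that t·rho = (t·sigma)·tau: sigma, then tau."""
    rho = {x: apply_subst(s, tau) for x, s in sigma.items()}
    rho.update({x: s for x, s in tau.items() if x not in sigma})
    return rho
```

With this encoding, `apply_subst(apply_subst(t, sigma), tau)` coincides with `apply_subst(t, compose(sigma, tau))`, matching the postfix convention tστ = τ(σ(t)).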
Positions. It is often convenient to refer to a specific subterm in a term t. This
is achieved by using positions which can be viewed as pointers to the subterms
of t and are finite sequences of integers. They are defined as follows:
• the set of positions of constants and variables contains only one position,
which is denoted ε and is the empty sequence of integers;
• if t1, . . . , tn are terms with respective sets of positions P1, . . . , Pn, then
the set of positions of the term f(t1, . . . , tn) is:

    {ε} ∪ ∪_{i=1}^{n} {i · p | p ∈ Pi}
The set of the positions in a term t is denoted Pos(t).
Let t be a term, and p ∈ Pos(t) be a position. We define recursively the
subterm of t at position p, denoted t|p , and the symbol at position p, denoted
Symb(t, p), as follows:
• t|ε = t and Symb(f (t1 , . . . , tn ), ε) = f ;
• f (t1 , . . . , tn )|i·p = ti|p and Symb(f (t1 , . . . , tn ), i · p) = Symb(ti , p);
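These recursive definitions translate directly to code. In the sketch below (our own illustrative encoding: compound terms are tuples (f, t1, . . . , tn), and a position is a tuple of integers with () playing the role of ε):

```python
def positions(t):
    """Pos(t): the set of positions of the term t."""
    if not isinstance(t, tuple):          # a variable or a constant
        return {()}
    _, *args = t
    return {()} | {(i,) + p
                   for i, ti in enumerate(args, start=1)
                   for p in positions(ti)}

def subterm(t, p):
    """t|p: the subterm of t at position p."""
    if p == ():
        return t
    i, *rest = p
    return subterm(t[i], tuple(rest))     # t = (f, t1, ..., tn), so t[i] is ti

def symbol_at(t, p):
    """Symb(t, p): the symbol at position p of t."""
    s = subterm(t, p)
    return s[0] if isinstance(s, tuple) else s
```

For t = abs(minus(x, c)) the positions are (), (1,), (1, 1) and (1, 2), and the subterm at position (1, 2) is the constant c.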
4.3.3 Predicates
The terms on a signature α are related to one another by relations. While
the usual examples of relations are “. . . is smaller than. . . ” or “. . . is equal
to. . . ”, the principle of relational database systems is to model each aspect of
a problem by a relation called a table.
A signature employed to define predicate symbols is called a relational signa-
ture. Given a relational signature β and a functional signature α, a (β, α)-atom
is an expression p(t1, . . . , tn) where β(p) = n and t1, . . . , tn ∈ T (α, X ).
Example 4. Besides the functional signature of Example 3, let us consider the
following predicate signature:

    β = {inf → 2}

Under this choice the expressions

    inf(abs(minus(x, x')), λ)
    inf(abs(minus(f(x), f(x'))), ε)

are (β, α)-atoms.
Given an atom a = p(t1, . . . , tn) we denote Var(a) (resp. Const(a)) the set
∪_{i=1}^{n} Var(ti) (resp. ∪_{i=1}^{n} Const(ti)).
4.3.4 Logical connectives and formulas
Let α be a functional signature and β be a relational signature. Formulas
express truth relations between (β, α)-atoms. One may for instance write that
two atoms must both be true, or that at least one must be true, etc. We call
the functions that relate the atoms to one another logical connectives. If one
denotes true with the symbol ⊤ and false with the symbol ⊥, these connectives
can a priori be any function f : {⊥, ⊤}^n → {⊥, ⊤} where n is the number
of connected atoms. However, defining one function for each arrangement of
atoms one wishes to express would be tedious. Fortunately it has long been
noted that every such function can be written as a composition of three logical
connectives:
• a ∨ b: is false iff a and b are false;
• a ∧ b: is true iff a and b are true;
• ¬a: is true iff a is false.
For example the logical implication a ⇒ b, which is read “a implies b”, can be
written ¬a ∨ b. Note that this implication does not have the causation meaning
associated with implication in natural languages. It simply means that either
the value of the atom a is false (an implication with a false premise is always
true) or else that the value of the atom b must be true.
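The claim that ∨, ∧ and ¬ suffice can be checked mechanically: the disjunctive normal form construction sketched below (a standard textbook construction, written in Python with helper names of our own) builds any Boolean function from the three connectives, here exercised on the implication ¬a ∨ b and on exclusive-or:

```python
from itertools import product

def dnf(table):
    """Given `table`, a dict mapping tuples of booleans to booleans, return a
    function computing it using only or / and / not: an OR, over the rows
    where the table is true, of ANDs of possibly negated inputs."""
    rows = [v for v, out in table.items() if out]
    return lambda *args: any(
        all(a if lit else not a for a, lit in zip(args, row))
        for row in rows)

bools = [False, True]

# The implication a => b is not a fourth primitive: its table is realised
# with the three connectives, and it agrees with (not a) or b everywhere.
implies = dnf({(a, b): (not a) or b for a, b in product(bools, repeat=2)})
```

The same construction handles any arity, which is the precise sense in which every connective f : {⊥, ⊤}^n → {⊥, ⊤} is a composition of ∨, ∧ and ¬.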
The (β, α)-formulas are the expressions built in finite time such that:
• a (β, α)-atom is a (β, α)-formula;
• if f1 , f2 are (β, α)-formulas then f1 ∨ f2 and f1 ∧ f2 are (β, α)-formulas;
• if f is a (β, α)-formula then ¬f is a (β, α)-formula.
Example 5. Continuing Examples 3 and 4, a formula is an expression like:

    ¬(inf(abs(minus(x, x')), λ)) ∨ inf(abs(minus(f(x), f(x'))), ε)

Given a formula ϕ in which the atoms a1, . . . , an occur, we denote Var(ϕ) (resp.
Const(ϕ)) the set ∪_{i=1}^{n} Var(ai) (resp. ∪_{i=1}^{n} Const(ai)).
4.3.5 Quantifiers
The definition of (β, α)-formulas is still ambiguous. When one writes a(x) ∨ b(x)
it is not clear whether one means that for some value c of x it is true that
a(c) ∨ b(c), or that whatever the value c of x is it is true that a(c) ∨ b(c). In
order to make the meaning of the variables in formulas precise one introduces
existential (for some value of) and universal (for all values of) quantifiers, denoted
respectively ∃ and ∀. Formally,
• A (β, α)-formula is a (β, α)-quantified formula with an empty set of quan-
tified variables;
• If ϕ is a (β, α)-quantified formula with a set of quantified variables Q
and x ∈ Var(ϕ) \ Q then ∃xϕ is a (β, α)-quantified formula with a set of
quantified variables Q ∪ {x};
• If ϕ is a (β, α)-quantified formula with a set of quantified variables Q
and x ∈ Var(ϕ) \ Q then ∀xϕ is a (β, α)-quantified formula with a set of
quantified variables Q ∪ {x}.
A (β, α)-quantified formula in which every variable is quantified is called a
(β, α)-sentence. Note that in the traditional presentation of sentences in first-
order logic the quantifiers may be interleaved with the logical connectives. This
added complexity (in the definition of the semantics, of the quantified variables,
in the handling of variable name clashes, etc.) buys nothing, however: any
(β, α)-sentence in the standard setting is logically equivalent to a formula in
the simpler language described above. An equivalent formula can be effectively
computed by algorithms that rewrite sentences in prenex normal form (see [146,
151, 94], for example).
Example 6. We complete the formula of the preceding example by quantifying
the variables in two different ways, thereby obtaining two different sentences:

    ∀x ∀ε ∃λ ∀x', ¬(inf(abs(minus(x, x')), λ)) ∨ inf(abs(minus(f(x), f(x'))), ε)
    ∀ε ∃λ ∀x ∀x', ¬(inf(abs(minus(x, x')), λ)) ∨ inf(abs(minus(f(x), f(x'))), ε)
The educated reader should by now have noticed that we have given the usual
definitions of continuity and uniform continuity in a normed space. We leave as
an exercise the determination of an arrangement of quantifiers expressing that
the function f is a) bounded, or b) constant.
4.4 Semantics of First-Order Logic
4.4.1 Interpretation
Giving a semantics to a logic means defining when a formula is true. Since the
meaning of quantifiers and logical connectives is fixed, it suffices to define when
an atom is true. This is achieved by interpreting the symbols occurring in a
formula.
Definition 3. (Interpretation) Let α (resp. β) be a functional (resp. relational)
signature, and X be a set of variables. An (α, β)-interpretation I is defined by²:
• A non-empty set DI, called the domain of the interpretation;
• For each predicate symbol p in the domain of β a function I(p) : DI^β(p) →
{⊤, ⊥};
• For each function symbol f in the domain of α a function I(f) : DI^α(f) →
DI.
Given an interpretation I of domain DI, a valuation v is a mapping from the
set of variables to elements in DI. Valuations are extended homomorphically
on terms, atoms, and formulas as expected.
The truth value of a sentence ϕ in an interpretation I of domain DI,
denoted [[ϕ]]I, is determined as follows:
• If ϕ = ∃xψ(x) then [[ϕ]]I = ⊤ if, and only if, there exists a valuation v of
domain {x} such that [[v(ψ(x))]]I = ⊤;
• If ϕ = ∀xψ(x) then [[ϕ]]I = ⊤ if, and only if, for all c ∈ DI we have
[[vc(ψ(x))]]I = ⊤, where vc is the valuation mapping x to c;
• If ϕ = ϕ1 ∧ ϕ2 then [[ϕ]]I = ⊤ if, and only if, [[ϕ1]]I = ⊤ and [[ϕ2]]I = ⊤;
• If ϕ = ϕ1 ∨ ϕ2 then [[ϕ]]I = ⊤ if, and only if, [[ϕ1]]I = ⊤ or [[ϕ2]]I = ⊤;
• If ϕ = ¬ϕ1 then [[ϕ]]I = ⊤ if, and only if, [[ϕ1]]I = ⊥;
• If ϕ = p(t1, . . . , tn) then [[ϕ]]I = I(p)(I(t1), . . . , I(tn));
² We note that the interpretation of a variable is not defined. While usually interpretations
are extended over variables with valuations (functions mapping the variables of the formula to
elements in the domain of the interpretation), we have chosen to instantiate the variables in
formulas by the elements of the domain. Since this instantiation is not defined formally, it
should be thought of as syntactic sugar.
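Over a finite domain the truth-value clauses above translate almost verbatim into a recursive evaluator. The sketch below uses an illustrative encoding of formulas as nested tuples (the encoding and all names are ours, not the text's; atom arguments are restricted to variables and domain elements for brevity) and, following the footnote, instantiates quantified variables by domain elements rather than carrying a valuation:

```python
# Formulas: ("atom", p, t1, ...), ("and", f, g), ("or", f, g), ("not", f),
# ("forall", x, f), ("exists", x, f); variables are strings.
def truth(phi, interp, dom):
    """[[phi]]_I over the finite domain `dom`; `interp` maps predicate names
    to Python predicates on domain elements."""
    tag = phi[0]
    if tag == "atom":
        _, p, *args = phi
        return interp[p](*args)
    if tag == "and":
        return truth(phi[1], interp, dom) and truth(phi[2], interp, dom)
    if tag == "or":
        return truth(phi[1], interp, dom) or truth(phi[2], interp, dom)
    if tag == "not":
        return not truth(phi[1], interp, dom)
    _, x, body = phi                       # a quantified formula
    vals = (truth(subst(body, x, c), interp, dom) for c in dom)
    return any(vals) if tag == "exists" else all(vals)

def subst(phi, x, c):
    """Replace the variable x by the domain element c throughout phi."""
    if phi[0] == "atom":
        _, p, *args = phi
        return ("atom", p, *[c if a == x else a for a in args])
    if phi[0] in ("forall", "exists"):
        return (phi[0], phi[1], subst(phi[2], x, c))
    return (phi[0],) + tuple(subst(f, x, c) for f in phi[1:])
```

With dom = {0, 1, 2} and inf interpreted as ≤, the sentence ∀x∃y inf(x, y) evaluates to true while ∀x∀y inf(x, y) does not, mirroring the quantifier clauses above.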