A process describes the temporal evolution of a system. Capturing the rules that govern its control flow helps to understand the boundaries of its behaviour. The declarative specification of processes is based on the representation of those boundaries by means of constraints rooted in temporal logics. The execution dynamics can vary as long as they do not violate such constraints, which specify the conditions that require or forbid the execution of actions.
This talk revolves around the recent advancements in research concerning the discovery of, and reasoning on, the declarative specifications of processes. The discourse will include a focus on how to automatically extract the constraints from process data, and how to losslessly minimise the size of discovered constraint sets. The conclusion will illustrate open challenges and future research avenues in the field.
Introduction to the declarative specification of processes – Claudio Di Ciccio
This slide deck contains a short introduction to the declarative specification of processes, with examples of how to describe a process with the Declare language.
Resolving Inconsistencies and Redundancies in Declarative Process Models – Claudio Di Ciccio
Presentation of the article entitled “Resolving Inconsistencies and Redundancies in Declarative Process Models”
(https://doi.org/10.1016/j.is.2016.09.005), held at EMISA 2017, Essen, Germany (https://www2.informatik.hu-berlin.de/emisa2017/).
Declarative process models are specifications of workflows based on constraints. Any sequence of activities is allowed, as long as the constraints are not violated. To discover declarative models out of IT systems’ logs, existing techniques verify every possible constraint candidate against the recorded data. Those that hold true are included in the resulting model. A first issue is that some returned constraints can contradict one another, with the result that the model does not accept any execution and turns out to be unusable. A second challenge is the reduction of returned constraints to a minimum set of significant ones, for the sake of readability. Due to their computational complexity, none of those issues had been successfully tackled in the past. Our paper formally frames these problems and formulates an algorithmic solution for both. Its validity and efficiency are demonstrated by extensive experiments on real-world data.
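As a rough illustration of the candidate-checking idea described in the abstract (and not the paper's actual algorithm, which additionally deals with contradictions and redundancy), the following Python sketch verifies hypothetical Declare-style Response(a, b) candidates against a toy event log and keeps only those satisfied by every trace; all activity names are invented for the example.

```python
# Illustrative sketch (not the paper's algorithm): check Declare-style
# constraint candidates against an event log and keep those that hold.
from itertools import permutations

def holds_response(trace, a, b):
    """Response(a, b): every occurrence of a is eventually followed by b."""
    expecting_b = False
    for event in trace:
        if event == a:
            expecting_b = True
        if event == b:
            expecting_b = False
    return not expecting_b

def discover_response_constraints(log, activities):
    """Return all Response(a, b) candidates satisfied by every trace."""
    model = []
    for a, b in permutations(sorted(activities), 2):
        if all(holds_response(trace, a, b) for trace in log):
            model.append(("Response", a, b))
    return model

if __name__ == "__main__":
    log = [["a", "c", "b"], ["a", "b", "b"], ["c", "b"]]
    print(discover_response_constraints(log, {"a", "b", "c"}))
```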
Slides of the presentation held at the Humboldt University of Berlin on 7 December 2016.
Abstract:
The declarative modelling of business processes is based upon the specification of behavioural rules that constrain the enactment of workflows. It is not meant to explicitly specify every possible execution path from beginning to end: carrying out the process is up to the actors, who can vary the execution dynamics as long as they do not violate the constraints imposed by the declarative model. The constraints specify the conditions that require or forbid the execution of activities, considered either singularly or in dependence on the occurrence of other ones. In this talk, the recent advancements in the automated discovery of declarative control flows from event logs are discussed, together with open challenges in the field.
Presented at the 12th International Conference on Business Process Management (BPM 2014), 7-11 September 2014, Eindhoven, The Netherlands.
Abstract: Process discovery is the task of generating models from event logs. Mining processes that operate in an environment of high variability is an ongoing research challenge, because various algorithms tend to produce spaghetti-like models. This is particularly the case when procedural models are generated. A promising direction to tackle this challenge is the usage of declarative process modelling languages like Declare, which summarise complex behaviour in a compact set of behavioural constraints. However, Declare constraints with branching are expensive to compute. In addition, it is often the case that hundreds of branching Declare constraints are valid for the same log, thus making, again, the discovery results unreadable. In this paper, we address these problems from a theoretical angle. More specifically, we define the class of Target-Branched Declare constraints and investigate the formal properties it exhibits. Furthermore, we present a technique for the efficient discovery of compact Target-Branched Declare models. We discuss the merits of our work through an evaluation based on a prototypical implementation using both artificial and real-world event logs.
Ensuring Model Consistency in Declarative Process Discovery – Claudio Di Ciccio
Presentation of the paper entitled “Ensuring Model Consistency in Declarative Process Discovery” (http://dx.doi.org/10.1007/978-3-319-23063-4_9) at the 13th International Conference on Business Process Management (BPM 2015), Innsbruck, Austria.
The main theme is the description of an automated technique to detect inconsistencies within mined declarative process models.
From monolithic systems to microservices. A decomposition framework based on ... – Davide Taibi
Decomposition is one of the most complex tasks during the migration from monolithic systems to microservices, generally performed manually, based on the experience of the software architects.
In this work, we propose a 6-step framework to reduce the subjectivity of the decomposition process.
The framework provides software architects with a set of decomposition options, together with a set of measures to evaluate and compare their quality. The decomposition options are identified based on the independent execution traces of the system, by applying a process-mining tool to the log traces collected at runtime. We validated the process in an industrial project by comparing the proposed decomposition options with the one proposed by the software architect who manually analyzed the system. The application of our framework allowed the company to identify issues in their software that the architect did not spot manually, and to discover more suitable decomposition options that the architect did not consider. The framework could also be very useful in other companies to improve the quality of the decomposition of any monolithic system, identifying different decomposition strategies and reducing the subjectivity of the decomposition process. Moreover, researchers could extend our approach by increasing the tool support and further automating the decomposition.
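As a deliberately simplified sketch of the general idea of deriving decomposition options from runtime traces (not the authors' 6-step framework, whose steps and measures are described in the paper), the following Python snippet groups operations that frequently co-occur in the same execution traces into candidate service boundaries; the operation names and the threshold are invented for the example.

```python
# Heavily simplified sketch: operations that frequently appear in the same
# runtime execution traces are merged into candidate service boundaries.
# This is NOT the 6-step framework proposed in the paper.
from collections import defaultdict

def co_occurrence(traces):
    """Count how often two operations appear in the same execution trace."""
    counts = defaultdict(int)
    for trace in traces:
        ops = sorted(set(trace))
        for i, a in enumerate(ops):
            for b in ops[i + 1:]:
                counts[(a, b)] += 1
    return counts

def candidate_partitions(traces, threshold):
    """Merge operations whose co-occurrence count reaches the threshold."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for (a, b), count in co_occurrence(traces).items():
        if count >= threshold:
            parent[find(a)] = find(b)
    groups = defaultdict(set)
    for trace in traces:
        for op in trace:
            groups[find(op)].add(op)
    return list(groups.values())

if __name__ == "__main__":
    traces = [["login", "getCart", "checkout"],
              ["login", "search", "getItem"],
              ["search", "getItem", "getReviews"]]
    print(candidate_partitions(traces, threshold=2))
```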
Part II...A functional look at Interpolative Scope and its proposed capabilities , Story #1 Specifications in Glass, Conversion Process at the Gate from Wall Synergy, Load Value and Invested Margin, View toward Instancing next... This is Part II of Interpolative Scope dialogue... added drivers and functional mark-up
High Performance Cooperative Distributed Systems in Adtech – C4Media
Video and slides synchronized, mp3 and slide download available at URL http://bit.ly/34ipw9f.
Stan Rosenberg explores a set of core building blocks exhibited by Adtech platforms and applies them towards building a fraud detection platform. After addressing performance, he touches on the key attributes of system reliability and quality in an Adtech system. He concludes with many insights learned from building one of the leading fraud detection platforms from the ground up. Filmed at qconnewyork.com.
Stan Rosenberg currently heads up engineering at Forensiq - a leading fraud detection platform for online advertising. Prior to Forensiq, he built distributed platforms for startups.
Can we measure the components of glass and green in terms of their proposed products and timeline of achievement? We specify the events timeline but will work further to measure these events in V2 and V3. Not only is the Tan Line important but its incremental measures are discovered. In V2 and V3 presented simultaneously we discover the magnitude of the change and the paradigm of the performance requirements. In V4 we discuss the importance of the Paradigm as an advancing plan that when its specialized products are polled the magnitude of the overall project pulls toward the more significant tan line, which is the remote capability for architectural design and building.
The design series focused on simulations and specifically Part III: Confluent Scale. We began by highlighting the vector requirements for tasking in our current model and the need for lens facets for scale differentiation. Then, in Part II, we defined simulations reporting. In Part III we began to move past forms and mix to the affluent modeling of the Ark Science of Glass in the Advancing Age to come.
The Outcome of our Square Pi Digital Investment is a What If? Metric Map, Key to Stakeholder Investments https://www.slideshare.net/slideshow/comments-on-square-pi-key-to-the-investmentpdf/267325383
Building Agile Data Warehouses with Ralph Hughes – Kalido
Ralph Hughes, TDWI faculty member, author and 25-year veteran of DW and BI projects for Fortune 500 companies, shares his thoughts on accelerated enterprise data warehousing. More info & webinar replay can be found here http://blog.kalido.com/building-agile-data-warehouses-ralph-hughes-webinar/
Adopting BIM - An Architect's Perspective (07 May 2014) – Elrond Burrell
Slides from a guest lecture presented at London South Bank University, 07 May 2014.
Part 1 - The Wider Context of BIM including UK Government Policy
Part 2 - Architype's journey into BIM
Part 3 - Using BIM for Passivhaus Design
Part I: Meta Based Interpolations
Meta dimensional template for interpolation of virtual objects and portal arrangements; convert between the ark design to binary using a normalization pattern. Converting a meta based company to glass (hyD case study)
Log-Based Understanding of Business Processes through Temporal Logic Query Ch... – Claudio Di Ciccio
Process mining is a discipline that aims at discovering, monitoring and improving real-life processes by extracting knowledge from event logs. Process discovery and conformance checking are the two main process mining tasks. Process discovery techniques can be used to learn a process model from example traces in an event log, whereas the goal of conformance checking is to compare the observed behavior in the event log with the modeled behavior. In this paper, we propose an approach based on temporal logic query checking, which sits in between process discovery and conformance checking. It can be used to discover those LTL-based business rules that are valid in the log, by checking a (user-defined) class of rules against the log. The proposed approach is not limited to providing a boolean answer about the validity of a business rule in the log; rather, it provides valuable diagnostics in terms of traces in which the rule is satisfied (witnesses) and traces in which the rule is violated (counterexamples). We have implemented our approach as a proof of concept and conducted extensive experiments using both synthetic and real-life logs.
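A minimal sketch of the witnesses/counterexamples diagnostics mentioned above: a rule is evaluated per trace and the log is partitioned accordingly. The Precedence rule used here is just an example instantiation in plain Python, not the paper's temporal logic query language.

```python
# Minimal sketch: evaluate a rule per trace and partition the log into
# witnesses (satisfying traces) and counterexamples (violating traces).

def precedence(a, b):
    """LTLf-style Precedence(a, b): b may occur only after some earlier a."""
    def check(trace):
        seen_a = False
        for event in trace:
            if event == a:
                seen_a = True
            elif event == b and not seen_a:
                return False
        return True
    return check

def query_check(log, rule):
    witnesses = [t for t in log if rule(t)]
    counterexamples = [t for t in log if not rule(t)]
    return witnesses, counterexamples

if __name__ == "__main__":
    log = [["a", "b"], ["b", "a"], ["c"]]
    witnesses, counterexamples = query_check(log, precedence("a", "b"))
    print("witnesses:", witnesses)
    print("counterexamples:", counterexamples)
```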
We have formed a new P(N) process and Theoretical Production to account for the cost of the Glass Mix. In doing so we see that there is a split in the process between the inhouse and remote requirement, and we need to devise Order Fulfillment in such a way that we can account for the completion of the Order. This we accomplish in V2 and V3. By running multiple instances of production we are able to test and to write a test for use at the site to finish the build. This leads to new encryption and remote building mechanics.
We are comparing the Base Build Assembly in Plait Glass Emulation with the Dynamic Array for Product Media to show the more excellent features and financial outcomes. In V2 we will discuss how properties are developed for product media and in V3 we will begin the subject of inter-dimensional considerations leading to the Ark Mode requirements for remote building. We have begun mapping the properties and interdimensional requirement and show that we have exceeded the Scorecard by z dimensions
We are comparing the Base Build Assembly in Plait Glass Emulation with the Dynamic Array for Product Media to show the more excellent features and financial outcomes. In V2 we will discuss how properties are developed for product media and in V3 we will begin the subject of inter-dimensional considerations leading to the Ark Mode requirements for remote building. We have begun mapping the properties and interdimensional requirement and show that we not only have exceeded the Scorecard by z dimensions, but that we also have exceeded time and motion itself.
Supply Side Portfolio V1, V2, V3 Timing, Investments, and Custom Engineering.pdf – Brij Consulting, LLC
Can we measure the components of glass and green in terms of their proposed products and timeline of achievement? We specify the events timeline but will work further to measure these events in V2 and V3. Not only is the Tan Line important but its incremental measures are discovered. In V2 and V3 presented simultaneously we discover the magnitude of the change and the paradigm of the performance requirements.
CAKE: Sharing Slices of Confidential Data on Blockchain – Claudio Di Ciccio
Presented at the CAiSE 2024 Forum, Intelligent Information Systems, June 6th, Limassol, Cyprus.
Synopsis: Cooperative information systems typically involve various entities in a collaborative process within a distributed environment. Blockchain technology offers a mechanism for automating such processes, even when only partial trust exists among participants. The data stored on the blockchain is replicated across all nodes in the network, ensuring accessibility to all participants. While this aspect facilitates traceability, integrity, and persistence, it poses challenges for adopting public blockchains in enterprise settings due to confidentiality issues. In this paper, we present a software tool named Control Access via Key Encryption (CAKE), designed to ensure data confidentiality in scenarios involving public blockchains. After outlining its core components and functionalities, we showcase the application of CAKE in the context of a real-world cyber-security project within the logistics domain.
Paper: https://doi.org/10.1007/978-3-031-61000-4_16
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Look but don’t touch: On the impalpable bond between blockchain and process – Claudio Di Ciccio
Slides of the keynote held at the BPM Blockchain Forum 2023, 13 September 2023, Utrecht, Netherlands.
Synopsis:
Multi-party business processes rely on the collaboration of various players in a decentralized setting. Blockchain technology can facilitate the automation of these processes, even in cases where trust among participants is limited. Transactions are stored in a ledger, a replica of which is retained by every node of the blockchain network. The operations saved thereby are thus publicly accessible, which benefits transparency, reliability, and persistence. Smart contracts can encode the system behaviour agreed upon by the involved parties to define the behaviour of collaborative processes. Rule enforcement, traceability and non-repudiation are thus catered for, too. However, data, objects and services in the outer world are not directly accessible from within a blockchain execution environment. On one hand, access to limited information hinders the adoption of programmable blockchains as an effective aid to process intelligence. On the other hand, transferring every bit of off-chain information on-chain is not only impractical but also undesirable, as this operation could violate typical confidentiality requirements in enterprise settings. In this talk, we discuss and explore approaches aimed at strengthening the bond between process and blockchain execution environments, balancing between knowledge sharing and secrecy preservation.
More Related Content
Similar to Declarative Specification of Processes: Discovery and Reasoning
Measurement of Rule-based LTLf Declarative Process Specifications – Claudio Di Ciccio
Slides of the paper presented at the 4th Int. Conference on Process Mining (ICPM 2022, Bolzano, Italy).
Synopsis:
The classical checking of declarative Linear Temporal Logic on Finite Traces (LTLf) specifications verifies whether conjunctions of sets of formulae are satisfied by collections of finite traces. The data on which the verification is conducted may be corrupted by a number of logging errors or execution deviations at the level of single elements within a trace. The ability to quantitatively assess the extent to which traces satisfy a process specification (and not only if they do so or not at all) is thus key, especially in process mining scenarios. Previous techniques proposed for this aim either require formulae to be extended with quantitative operators or cater to the coarse granularity of whole traces. In this paper, we propose a framework to devise probabilistic measures for declarative process specifications on traces at the level of events, inspired by association rule mining. Thereupon, we describe a technique that measures the degree of satisfaction of these specifications over bags of traces. To assess our approach, we conduct an evaluation with real-world data.
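The following Python fragment gives a rough, event-level flavour of such measures for a Declare-like Response(a, b) constraint, in the spirit of association-rule support and confidence; the exact probabilistic definitions proposed in the paper may differ.

```python
# Rough illustration of event-level measures for a Declare-like Response(a, b)
# constraint, in the spirit of association-rule support and confidence.
# The exact probabilistic definitions in the paper may differ.

def response_measures(log, a, b):
    activations = 0          # events where the antecedent (a) occurs
    fulfilments = 0          # activations eventually followed by b
    total_events = 0
    for trace in log:
        total_events += len(trace)
        for i, event in enumerate(trace):
            if event == a:
                activations += 1
                if b in trace[i + 1:]:
                    fulfilments += 1
    confidence = fulfilments / activations if activations else 1.0
    support = fulfilments / total_events if total_events else 0.0
    return {"support": support, "confidence": confidence}

if __name__ == "__main__":
    log = [["a", "c", "b"], ["a", "c"], ["c", "b"]]
    print(response_measures(log, "a", "b"))
```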
Blockchain and smart contracts: infrastructure and platforms – Claudio Di Ciccio
An introductory presentation on the main concepts of blockchain technologies, with a special focus on smart contracts. The slides supported the talk held at the Cyber 4.0 Seminar on “Blockchain and Smart Contracts: Concepts and applications” on 2021-03-03, virtually hosted by the Sapienza University of Rome for the Cyber 4.0 Competence Centre.
Extracting Event Logs for Process Mining from Data Stored on the Blockchain – Claudio Di Ciccio
Presentation of the paper presented at the 2nd International Workshop on Security and Privacy-enhanced Business Process Management (SPBP’19), 2 September 2019, Vienna, Austria (pre-print available at https://easychair.org/publications/preprint/cW8l).
Abstract: The integration of business process management with blockchains across organisational borders provides a means to establish transparency of execution and auditing capabilities. To enable process analytics, though, non-trivial extraction and transformation tasks are necessary on the raw data stored in the ledger. In this paper, we describe our approach to retrieve process data from an Ethereum blockchain ledger and subsequently convert those data into an event log formatted according to the IEEE Extensible Event Stream (XES) standard. We show a proof-of-concept software artefact and its application on a data set produced by the smart contracts of a process execution engine stored on the public Ethereum blockchain network.
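A hedged sketch of the transformation step (not the actual software artefact): assuming the process data have already been retrieved and decoded from the ledger into plain records, the snippet below serialises them into an XES-style event log with xml.etree.ElementTree. The field names and the tx:hash attribute key are assumptions made for the example.

```python
# Hedged sketch: serialise process data already decoded from the ledger
# (here plain dictionaries) into an XES-style event log.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def to_xes(transactions):
    """transactions: list of dicts with case_id, activity, timestamp, tx_hash."""
    log = ET.Element("log", {"xes.version": "1.0"})
    by_case = {}
    for tx in transactions:
        by_case.setdefault(tx["case_id"], []).append(tx)
    for case_id, events in by_case.items():
        trace = ET.SubElement(log, "trace")
        ET.SubElement(trace, "string", {"key": "concept:name", "value": case_id})
        for tx in sorted(events, key=lambda t: t["timestamp"]):
            event = ET.SubElement(trace, "event")
            ET.SubElement(event, "string", {"key": "concept:name", "value": tx["activity"]})
            ET.SubElement(event, "string", {"key": "tx:hash", "value": tx["tx_hash"]})
            ts = datetime.fromtimestamp(tx["timestamp"], tz=timezone.utc).isoformat()
            ET.SubElement(event, "date", {"key": "time:timestamp", "value": ts})
    return ET.tostring(log, encoding="unicode")

if __name__ == "__main__":
    txs = [{"case_id": "order-1", "activity": "Ship", "timestamp": 1700000100, "tx_hash": "0xbeef"},
           {"case_id": "order-1", "activity": "Create", "timestamp": 1700000000, "tx_hash": "0xcafe"}]
    print(to_xes(txs))
```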
A blockchain can be defined as an immutable distributed ledger on which transactions exchanged between peers are recorded. Transactions are cryptographically signed and are meant to transfer digital commodities between parties. Lately, blockchains have undergone a paradigm shift from mere electronic cash systems to a universal platform endowed with internal programming languages, on top of which decentralised applications can be built. That has been the turning point enabling the execution of inter-organisational business processes on blockchains.
In this talk, the concepts behind and around blockchains will be described, together with the current research and future directions on its usage as an infrastructure for business process management.
Blockchain-based traceability of inter-organisational business processes – Claudio Di Ciccio
Presentation of the paper entitled “Blockchain-based Traceability of Interorganisational Business Processes” (http://dx.doi.org/10.1007/978-3-319-94214-8_4), held at BMSD 2018, Vienna, Austria (http://www.is-bmsd.org/).
Abstract:
The blockchain technology opens up new opportunities for Business Process Management. This is mainly due to its unprecedented capability to let transactions be automatically executed and recorded by Smart Contracts in multi-peer environments, in a decentralised fashion and without central authoritative players to govern the workflow. In this way, blockchains also provide traceability. Traceability of information plays a pivotal role particularly in those supply chains where multiple parties are involved and rigorous criteria must be fulfilled to lead to a successful outcome. In this paper, we investigate how to run a business process in the context of a supply chain on a blockchain infrastructure so as to provide full traceability of its run-time enactment. Our approach retrieves information to trace process instances execution solely from the transactions written on-chain. To do so, hash-codes are reverse-engineered based on the Solidity Smart Contract encoding of the generating process. We show the results of our investigation by means of an implemented software prototype, with a case study on the reportedly challenging context of the pharmaceutical supply chain.
Semantical Vacuity Detection in Declarative Process Mining – Claudio Di Ciccio
Presentation of the paper entitled “Semantical Vacuity Detection in Declarative Process Mining”
(http://dx.doi.org/10.1007/978-3-319-45348-4_10), held at BPM 2016, Rio de Janeiro, Brazil (http://bpm2016.uniriotec.br/).
A large share of the literature on process mining based on declarative process modeling languages, like DECLARE, relies on the notion of constraint activation to distinguish between the case in which a process execution recorded in event data “vacuously” satisfies a constraint, or satisfies the constraint in an “interesting way”. This fine-grained indicator is then used to decide whether a candidate constraint supported by the analyzed event log is indeed relevant or not. Unfortunately, this notion of relevance has never been formally defined, and all the proposals existing in the literature use ad-hoc definitions that are only applicable to a pre-defined set of constraint patterns. This makes existing declarative process mining techniques inapplicable when the target constraint language is extensible and may contain formulae that go beyond pre-defined patterns. In this paper, we tackle this open challenge and show how the notion of constraint activation and vacuous satisfaction can be captured semantically, in the case of constraints expressed in arbitrary temporal logics over finite traces. We then extend the standard automata-based approach so as to incorporate relevance-related information. We finally report on an implementation and experimentation of the approach that confirms the advantages and feasibility of our solution.
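A toy illustration of the distinction at stake: for a Declare-like Response(a, b) constraint, a trace with no occurrence of a satisfies the constraint only vacuously, because it never activates it. The paper's contribution is a semantic, pattern-independent notion; the sketch below is hard-coded for Response only.

```python
# Toy illustration of vacuous vs "interesting" satisfaction for a
# Declare-like Response(a, b) constraint (hard-coded for this pattern).

def response_verdict(trace, a, b):
    activations = [i for i, e in enumerate(trace) if e == a]
    if not activations:
        return "vacuously satisfied"       # constraint never activated
    if all(b in trace[i + 1:] for i in activations):
        return "interestingly satisfied"   # activated and fulfilled
    return "violated"

if __name__ == "__main__":
    for trace in (["c", "d"], ["a", "c", "b"], ["a", "c"]):
        print(trace, "->", response_verdict(trace, "a", "b"))
```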
Detecting Flight Trajectory Anomalies and Predicting Diversions in Freight Tr... – Claudio Di Ciccio
Presentation of the paper entitled “Detecting Flight Trajectory Anomalies and Predicting Diversions in Freight Transportation”
(http://dx.doi.org/10.1016/j.dss.2016.05.004), held at EMISA 2016, Vienna, Austria (https://aic.ai.wu.ac.at/emisa2016/).
Abstract:
Timely identifying flight diversions is a crucial aspect of efficient multi-modal transportation. When an airplane diverts, logistics providers must promptly adapt their transportation plans in order to ensure proper delivery despite such an unexpected event. In practice, the different parties in a logistics chain do not exchange real-time information related to flights. This calls for a means to detect diversions that just requires publicly available data, thus being independent of the communication between different parties. The dependence on public data results in a challenge to detect anomalous behavior without knowing the planned flight trajectory. Our work addresses this challenge by introducing a prediction model that just requires information on an airplane’s position, velocity, and intended destination. This information is used to distinguish between regular and anomalous behavior. When an airplane displays anomalous behavior for an extended period of time, the model predicts a diversion. A quantitative evaluation shows that this approach is able to detect diverting airplanes with excellent precision and recall even without knowing planned trajectories as required by related research. By utilizing the proposed prediction model, logistics companies gain a significant amount of response time for these cases.
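The Python sketch below conveys the intuition in a heavily simplified form (it is not the paper's prediction model): an observation is flagged as anomalous when the reported heading deviates strongly from the bearing towards the intended destination, and a diversion is predicted after a sustained run of anomalies. The thresholds, window length, and toy coordinates are illustrative assumptions.

```python
# Heavily simplified sketch of diversion prediction from position, velocity
# (heading) and intended destination. Not the paper's model.
import math

def bearing(lat1, lon1, lat2, lon2):
    """Approximate initial great-circle bearing (degrees) from point 1 to point 2."""
    d_lon = math.radians(lon2 - lon1)
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    y = math.sin(d_lon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return math.degrees(math.atan2(y, x)) % 360

def predicts_diversion(observations, destination, max_deviation=60.0, window=3):
    """observations: list of (lat, lon, heading); destination: (lat, lon)."""
    run = 0
    for lat, lon, heading in observations:
        expected = bearing(lat, lon, *destination)
        deviation = abs((heading - expected + 180) % 360 - 180)
        run = run + 1 if deviation > max_deviation else 0
        if run >= window:
            return True
    return False

if __name__ == "__main__":
    destination = (48.35, 11.78)   # illustrative destination coordinates
    obs = [(50.0, 8.6, 130), (49.8, 8.9, 40), (49.9, 9.3, 35), (50.1, 9.6, 30)]
    print(predicts_diversion(obs, destination))
```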
What are greenhouse gases and how many gases are there that affect the Earth? – moosaasad1975
What are greenhouse gases, how do they affect the Earth and its environment, what is the future of the environment and the Earth, and how are the weather and the climate affected?
This PDF is about schizophrenia.
For more details, visit the YouTube channel @SELF-EXPLANATORY:
https://www.youtube.com/channel/UCAiarMZDNhe1A3Rnpr_WkzA/videos
Thanks...!
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep... – University of Maribor
Slides from:
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Track: Artificial Intelligence
https://www.etran.rs/2024/en/home-english/
Professional air quality monitoring systems provide immediate, on-site data for analysis, compliance, and decision-making.
Monitor common gases, weather parameters, particulates.
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a... – Ana Luísa Pinho
Functional Magnetic Resonance Imaging (fMRI) provides means to characterize brain activations in response to behavior. However, cognitive neuroscience has been limited to group-level effects referring to the performance of specific tasks. To obtain the functional profile of elementary cognitive mechanisms, the combination of brain responses to many tasks is required. Yet, to date, both structural atlases and parcellation-based activations do not fully account for cognitive function and still present several limitations. Further, they do not adapt overall to individual characteristics. In this talk, I will give an account of deep-behavioral phenotyping strategies, namely data-driven methods in large task-fMRI datasets, to optimize functional brain-data collection and improve inference of effects-of-interest related to mental processes. Key to this approach is the employment of fast multi-functional paradigms rich on features that can be well parametrized and, consequently, facilitate the creation of psycho-physiological constructs to be modelled with imaging data. Particular emphasis will be given to music stimuli when studying high-order cognitive mechanisms, due to their ecological nature and quality to enable complex behavior compounded by discrete entities. I will also discuss how deep-behavioral phenotyping and individualized models applied to neuroimaging data can better account for the subject-specific organization of domain-general cognitive systems in the human brain. Finally, the accumulation of functional brain signatures brings the possibility to clarify relationships among tasks and create a univocal link between brain systems and mental functions through: (1) the development of ontologies proposing an organization of cognitive processes; and (2) brain-network taxonomies describing functional specialization. To this end, tools to improve commensurability in cognitive science are necessary, such as public repositories, ontology-based platforms and automated meta-analysis tools. I will thus discuss some brain-atlasing resources currently under development, and their applicability in cognitive as well as clinical neuroscience.
THE IMPORTANCE OF MARTIAN ATMOSPHERE SAMPLE RETURN – Sérgio Sacani
The return of a sample of near-surface atmosphere from Mars would facilitate answers to several first-order science questions surrounding the formation and evolution of the planet. One of the important aspects of terrestrial planet formation in general is the role that primary atmospheres played in influencing the chemistry and structure of the planets and their antecedents. Studies of the martian atmosphere can be used to investigate the role of a primary atmosphere in its history. Atmosphere samples would also inform our understanding of the near-surface chemistry of the planet, and ultimately the prospects for life. High-precision isotopic analyses of constituent gases are needed to address these questions, requiring that the analyses are made on returned samples rather than in situ.
Introduction:
RNA interference (RNAi) or Post-Transcriptional Gene Silencing (PTGS) is an important biological process for modulating eukaryotic gene expression.
It is a highly conserved process of post-transcriptional gene silencing by which double-stranded RNA (dsRNA) causes sequence-specific degradation of mRNA sequences.
dsRNA-induced gene silencing (RNAi) is reported in a wide range of eukaryotes, from worms and insects to mammals and plants.
This process mediates resistance to both endogenous parasitic and exogenous pathogenic nucleic acids, and regulates the expression of protein-coding genes.
What are small ncRNAs?
micro RNA (miRNA)
short interfering RNA (siRNA)
Properties of small non-coding RNA:
Involved in silencing mRNA transcripts.
Called “small” because they are usually only about 21-24 nucleotides long.
Synthesized by first cutting up longer precursor sequences (like the 61nt one that Lee discovered).
Silence an mRNA by base pairing with some sequence on the mRNA.
Discovery of siRNA?
The first small RNA:
In 1993, Rosalind Lee (Victor Ambros lab) was studying a non-coding gene in C. elegans, lin-4, that was involved in the silencing of another gene, lin-14, at the appropriate time in the development of the worm C. elegans.
Two small transcripts of lin-4 (22nt and 61nt) were found to be complementary to a sequence in the 3' UTR of lin-14.
Because lin-4 encoded no protein, she deduced that it must be these transcripts that are causing the silencing by RNA-RNA interactions.
Types of RNAi (non-coding RNA)
miRNA
Length: 23-25 nt
Trans-acting
Binds with target mRNA with mismatches
Translation inhibition
siRNA
Length: 21 nt
Cis-acting
Binds with target mRNA with a perfectly complementary sequence
piRNA (Piwi-RNA)
Length: 25 to 36 nt
Expressed in germ cells
Regulates transposon activity
MECHANISM OF RNAi:
First the double-stranded RNA teams up with a protein complex named Dicer, which cuts the long RNA into short pieces.
Then another protein complex called RISC (RNA-induced silencing complex) discards one of the two RNA strands.
The RISC-docked, single-stranded RNA then pairs with the homologous mRNA and destroys it.
THE RISC COMPLEX:
RISC is a large (>500 kDa) RNA multi-protein binding complex which triggers mRNA degradation in response to siRNA.
Unwinding of the double-stranded siRNA by an ATP-independent helicase.
The active component of RISC is the Ago protein (an endonuclease), which cleaves the target mRNA.
DICER: endonuclease (RNase III family)
Argonaute: Central Component of the RNA-Induced Silencing Complex (RISC)
One strand of the dsRNA produced by Dicer is retained in the RISC complex in association with Argonaute
ARGONAUTE PROTEIN:
1. PAZ (PIWI/Argonaute/Zwille): recognition of the target mRNA
2. PIWI (P-element induced wimpy testis): breaks the phosphodiester bond of the mRNA (RNase H activity)
miRNA:
Double-stranded RNAs are naturally produced in eukaryotic cells during development, and they have a key role in regulating gene expression.
112. Declarative Specification of Processes
Di Ciccio, C. & Mecella, M. On the Discovery of Declarative Control Flows for Artful Processes. ACM Trans. Manage. Inf. Syst., 2015, 5, 24:1-24:37, doi: 10.1145/2629447
Di Ciccio, C.; Maggi, F. M.; Montali, M. & Mendling, J. Resolving inconsistencies and redundancies in declarative process models. Information Systems, 2017, 64, 425-446, doi: 10.1016/j.is.2016.09.005
Di Ciccio, C.; Maggi, F. M.; Montali, M. & Mendling, J. On the Relevance of a Business Constraint to an Event Log. Information Systems, 2018, 78, 144-161, doi: 10.1016/j.is.2018.01.011