A talk I gave at Yonsei University, Seoul, on July 21st, 2015.
The aim was to present my background contribution to the CORCON (Correctness by Construction) research project.
I would like to thank Prof. Byunghan Kim and Dr. Gyesik Lee for their kind hospitality.
In this article we present a brief history and some applications of semirings, and the structure of compact monothetic C-semirings. The classification of these semirings is based on the known description of discrete cyclic semirings and compact monothetic semirings. Boris Tanana, "Compact Monothetic C-semirings", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5, Issue-2, February 2021. URL: https://www.ijtsrd.com/papers/ijtsrd38612.pdf Paper URL: https://www.ijtsrd.com/mathemetics/algebra/38612/compact-monothetic-csemirings/boris-tanana
Conceptual Spaces for Cognitive Architectures: A Lingua Franca for Different ...Antonio Lieto
We claim that Conceptual Spaces offer a lingua franca that allows one to unify and generalize many aspects of the symbolic, sub-symbolic and diagrammatic approaches (by overcoming some of their typical problems) and to integrate them on a common ground. In doing so we extend and detail some of the arguments explored by Gardenfors [23] in defending the need for a conceptual, intermediate representation level between the symbolic and the sub-symbolic one. Additionally, we argue that Conceptual Spaces could offer a unifying framework for interpreting many kinds of diagrammatic and analogical representations. As a consequence, their adoption could also favor the integration of diagrammatic representation and reasoning in Cognitive Architectures.
Correspondence and Isomorphism Theorems for Intuitionistic fuzzy subgroupsjournal ijrtem
ABSTRACT: The aim of this paper is to study the First Isomorphism Theorem, Second Isomorphism Theorem, Third Isomorphism Theorem, Correspondence Theorem, etc., of intuitionistic fuzzy/vague subgroups of a crisp group.
KEY WORDS: Intuitionistic fuzzy or Vague Subset, Intuitionistic fuzzy Image, Intuitionistic fuzzy Inverse Image, Intuitionistic fuzzy/vague sub (normal) group, Correspondence Theorem, First (Second, Third) Isomorphism Theorem.
Completeness: From Henkin's Proposition to Quantum ComputerVasil Penchev
The paper addresses Leon Henkin's proposition as a "lighthouse" which can elucidate a vast territory of knowledge uniformly: logic, set theory, information theory, and quantum mechanics. Two strategies towards infinity are equally relevant, for infinity is as universal and thus complete as it is open and thus incomplete. Henkin's, Gödel's, Robert Jeroslow's, and Hartley Rogers' propositions are reformulated so that completeness and incompleteness are unified and thus reduced to a joint property of infinity and of all infinite sets. However, only Henkin's proposition, equivalent to an internal position to infinity, is consistent. This can be traced back to set theory and its axioms, where that of choice is a key. Quantum mechanics is forced to introduce infinity implicitly via Hilbert space, on which its formalism is founded. One can demonstrate that some essential properties of quantum information, entanglement, and quantum computers originate directly from infinity once it is involved in quantum mechanics. Thus, these phenomena can be elucidated as both complete and incomplete, after which choice is the border between them. A special kind of invariance to the axiom of choice shared by quantum mechanics is discussed, which involves that border between the completeness and incompleteness of infinity in a consistent way. The so-called paradox of Albert Einstein, Boris Podolsky, and Nathan Rosen is interpreted entirely in the same terms of set theory alone. The quantum computer can demonstrate especially clearly the privilege of the internal position, or "observer", or "user", to infinity implied by Henkin's proposition as the only consistent one as to infinity. An essential area of contemporary knowledge may thus be synthesized from a single viewpoint.
In this talk, logically distributive categories are introduced to provide a sound and complete semantics for multi-sorted, first-order, intuitionistic-based logical theories. The peculiar aspect is that no universe is required to interpret terms, making the semantics truly point-free.
Extending the knowledge level of cognitive architectures with Conceptual Spac...Antonio Lieto
Extending the knowledge level of cognitive architectures with Conceptual Spaces (+ a case study with Dual-PECCS: a hybrid knowledge representation system for common sense reasoning). Talk given at Stockholm, September 2016.
How to Ground A Language for Legal Discourse In a Prototypical Perceptual Sem...L. Thorne McCarty
Slides for my talk at the 15th International Conference on Artificial Intelligence and Law (ICAIL 2015), June 11, 2015.
The full ICAIL 2015 paper is available on ResearchGate at bit.ly/1qCnLJq.
Lecture 2: From Semantics To Semantic-Oriented ApplicationsMarina Santini
From the "Natural Language Processing" LinkedIn group:
John Kontos, Professor of Artificial Intelligence
"I wonder whether translating into formal logic is nothing more than transliteration, which simply isolates the part of the text that can be reasoned upon using the simple inference mechanism of formal logic. The real problem, I think, lies with the part of text that CANNOT be translated on the one hand, and the part that changes its meaning due to advances in civilization on the other. My own proposal is to leave NL text alone and try building inference mechanisms for the UNTRANSLATED text depending on the task requirements.
All the best
John"
Chapter 1 Logic and ProofPropositional Logic SemanticsPropo.docxcravennichole326
Chapter 1: Logic and Proof
Propositional Logic Semantics
Propositional variables: p, q, r, s, ... (stand for simple sentences)
T: any proposition that is always true
F: any proposition that is always false
Compound propositions: formed from propositional variables and logical operators (all binary except negation):
Negation ¬
Conjunction ∧
Disjunction ∨
Implication →
Biconditional ↔
Exclusive Or ⊕
Truth Tables: assign all possible combinations of T, F to the variables, and determine the resulting T, F values of the compound propositions; with n variables there are 2^n rows in the table
Negation changes T to F and vice versa
Conjunction is only T if both conjuncts are T
Disjunction is only F if both disjuncts are F
Implication is only F if the antecedent is T and the consequent is F
Biconditional is only T if both sides have the same truth value
Exclusive Or is only T if the two sides differ in truth value
Two (compound) propositions are equivalent (≡) iff they always have the same truth value (see also below)
English translations:
Conjunction: and, but, although, yet, still, ...
Disjunction: or, unless
Implication: if, if ... then, only if, when, whenever, implies, entails, follows from, is sufficient, is necessary
Biconditional: if and only if, just in case, is necessary and sufficient
A set of propositions is consistent iff there is some assignment of truth values that makes all of them T
A set of propositions is inconsistent iff there is no assignment of truth values that makes all of them T
A tautology is a compound proposition that is always T
A contradiction is a compound proposition that is always F
A contingency is a compound proposition that is sometimes T, sometimes F
A compound proposition is satisfiable iff some assignment of truth values makes it T
A compound proposition is unsatisfiable iff no assignment of truth values makes it T
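These definitions can be checked mechanically. A minimal Python sketch (the helper names `rows` and `classify` are ours, not from the notes) that enumerates all 2^n truth assignments and classifies a formula as a tautology, a contradiction, or a contingency:

```python
from itertools import product

def rows(variables):
    """Yield all 2**n truth assignments over the given variables."""
    for values in product([True, False], repeat=len(variables)):
        yield dict(zip(variables, values))

def classify(formula, variables):
    """Return 'tautology', 'contradiction', or 'contingency'."""
    results = [formula(v) for v in rows(variables)]
    if all(results):
        return "tautology"
    if not any(results):
        return "contradiction"
    return "contingency"

# p ∨ ¬p is always T; p ∧ ¬p is always F; p → q is sometimes either.
print(classify(lambda v: v["p"] or not v["p"], ["p"]))         # tautology
print(classify(lambda v: v["p"] and not v["p"], ["p"]))        # contradiction
print(classify(lambda v: (not v["p"]) or v["q"], ["p", "q"]))  # contingency
```

A formula here is simply a function from an assignment (a dict of variable names to booleans) to a boolean, so satisfiability is just `classify(...) != "contradiction"`.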
Two compound propositions p and q are logically equivalent iff p ↔ q is a tautology
Common equivalences:
DeMorgan’s Laws (Dem)
¬(p ∨ q) ≡ ¬p ∧ ¬q
¬(p ∧ q) ≡ ¬p ∨ ¬q
Identity Laws (Id)
p ∧ T ≡ p
p ∨ F ≡ p
Domination Laws (Dom)
p ∨ T ≡ T
p ∧ F ≡ F
Idempotent Laws (Idem)
p ∨ p ≡ p
p ∧ p ≡ p
Double Negation Law (DN)
¬(¬p) ≡ p
Negation Laws (Neg)
p ∨ ¬p ≡ T
p ∧ ¬p ≡ F
Commutative Laws (Comm)
p ∨ q ≡ q ∨ p
p ∧ q ≡ q ∧ p
Associative Laws (Assoc)
(p ∨ q) ∨ r ≡ p ∨ (q ∨ r)
(p ∧ q) ∧ r ≡ p ∧ (q ∧ r)
Distributive Laws (Dist)
p ∨ (q ∧ r) ≡ (p ∨ q) ∧ (p ∨ r)
p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r)
Absorption Laws (Abs)
p ∨ (p ∧ q) ≡ p
p ∧ (p ∨ q) ≡ p
Conditional Laws (Cond)
p → q ≡ ¬p ∨ q
¬(p → q) ≡ p ∧ ¬q
Biconditional Law (Bicond)
p ↔ q ≡ (p → q) ∧ (q → p)
Quantifier Negation (QNeg)
¬∀x P(x) ≡ ∃x ¬P(x)
¬∃x P(x) ≡ ∀x ¬P(x)
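Each propositional equivalence above can be verified by brute force over all truth assignments. A small Python sketch (the helper name `equivalent` is ours) checking a few of the laws:

```python
from itertools import product

def equivalent(f, g, names):
    """f ≡ g iff f and g agree on all 2**n truth assignments."""
    assigns = (dict(zip(names, vs))
               for vs in product([True, False], repeat=len(names)))
    return all(f(v) == g(v) for v in assigns)

implies = lambda a, b: (not a) or b  # classical material implication

# De Morgan: ¬(p ∨ q) ≡ ¬p ∧ ¬q
print(equivalent(lambda v: not (v["p"] or v["q"]),
                 lambda v: (not v["p"]) and (not v["q"]), ["p", "q"]))  # True
# Conditional law: p → q ≡ ¬p ∨ q
print(equivalent(lambda v: implies(v["p"], v["q"]),
                 lambda v: (not v["p"]) or v["q"], ["p", "q"]))         # True
# Absorption: p ∧ (p ∨ q) ≡ p
print(equivalent(lambda v: v["p"] and (v["p"] or v["q"]),
                 lambda v: v["p"], ["p", "q"]))                         # True
```

The quantifier laws (QNeg) cannot be checked this way, since they range over an arbitrary domain rather than a finite truth table.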
Predicate and Relational Logic (Quantificational Logic, First Order Logic): Semantics
Variables: x, y, z, ...
Predicates/Relations, Propositional Functions: P(x), M(x), Q(x,y), S(x,y,z), ...
Constants: a, b, c, 0, -1, 4, Socrates, ...
Domain (U): set of things the variables range over
Propositional functions are neither T nor F; however, if all the variables are re ...
ON THE CATEGORY OF ORDERED TOPOLOGICAL MODULES OPTIMIZATION AND LAGRANGE’S PR...IJESM JOURNAL
A category is an algebraic structure made up of a collection of objects linked together by morphisms. As a foundation of mathematics, categories were created as a way of relating algebraic structures and systems of topological spaces. In this paper we define a derivative using cones in the category of topological modules and use Lagrange's principle to obtain optimization results in the category.
It was Kepler who first asked whether contra-globally bounded homomorphisms can be classified. Hence unfortunately, we cannot assume that M is differentiable and pointwise generic. Therefore this reduces the results of [9] to a well-known result of Sylvester [32, 21]. Now it would be interesting to apply the techniques of [31] to associative, naturally Euclid elements. Thus a central problem in elliptic calculus is the derivation of countable monoids.
Mathematical argumentation as a precursor of mathematical proofNicolas Balacheff
Along history and across educational traditions, the space given to mathematical proof in compulsory school curricula varies from a quasi-absence to a formal obligation which for some has turned into an obstacle to mathematics learning. The contemporary evolution is to give proof the space it deserves in the learning of mathematics. This is witnessed in different ways by the national curriculum in England (2014), the Common Core State Standards for Mathematics (2010) in the US, and the recent report on the teaching of mathematics (2018) commissioned by the French government; the latter asserts: "The notion of proof is at the heart of mathematical activity, whatever the level (this assertion is valid from kindergarten to university). And, beyond mathematical theory, understanding what a reasoned justification approach based on logic is constitutes an important aspect of citizen training. The seeds of this fundamentally mathematical approach are sown in the early grades." These are a few examples of the current worldwide consensus on the centrality proof should have in compulsory school curricula. However, the institutional statements share a difficulty in expressing this objective. The vocabulary includes words such as argument, justification and proof without clear reasons for such diversity: are these words mere synonyms, or are there differences we should pay attention to? What are the characteristics of the discourse these words may refer to in the mathematics classroom? Eventually, how can the problem of assessing the truth value of a mathematical statement be addressed at the different grades all along compulsory school? I shall explore these questions, starting from questioning the meaning of these words and its consequences. Then, I shall shape the relations between argumentation and proof from an epistemological and didactical perspective.
In the end, the participants will be invited to a discussion on the benefit and relevance of shaping the notion of mathematical argumentation as a precursor of mathematical proof.
The famous Kruskal's tree theorem states that the collection of finite trees labelled over a well quasi-order, and ordered by homeomorphic embedding, forms a well quasi-order. Its intended mathematical meaning is that the collection of finite, connected and acyclic graphs labelled over a well quasi-order is a well quasi-order when ordered by the graph minor relation.
By contrast, the standard proofs show the property for trees in the Computer Science sense, together with an ad-hoc, inductive notion of embedding. The mathematical result follows as a consequence, in a somewhat unsatisfactory way.
In this talk, a variant of the standard proof will be illustrated, explaining how the Computer Science and the graph-theoretical statements are strictly coupled, and thus why the double statement is justified and necessary.
The Graph Minor Theorem: a walk on the wild side of graphsMarco Benini
The Graph Minor Theorem says that the collection of finite graphs
ordered by the minor relation is a well quasi order. This apparently
innocent statement hides a monstrous proof: the original result by
Robertson and Seymour is about 500 pages and twenty articles, in which a
new and deep branch of Graph Theory has been developed.
The theorem is famous and full of consequences both on the theoretical side
of Mathematics and in applications, e.g., to Computer Science. But there
is no concise proof available, although many attempts have been made.
In this talk, arising from one such failed attempt, an analysis of the
Graph Minor Theorem is presented. Why is it so hard?
Assuming the by-now standard Nash-Williams approach to prove it, we will
illustrate a number of methods which allow us to solve or circumvent some
of the difficulties. Finally, we will show that the core of this line of
thought lies in a coherence question which is common to many parts of
Mathematics: elsewhere it has been solved, although we were unable to
adapt those solutions to the present framework. So, there is hope for a
short proof of the Graph Minor Theorem, but it will not be elementary.
Analysing the categorical structure of well quasi-orders, two proofs of Higman's lemma are shown, emphasising the structural content of this property of well quasi-orders in relation to exponentiation.
As a side effect, a variant of the lemma is found, which says that the set of finite sequences of elements of A, ordered by embedding, is well-founded if and only if A is well-founded.
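To make the embedding order on finite sequences concrete, here is a minimal Python sketch (the function name `embeds` is ours): u embeds in v iff some subsequence of v dominates u position by position under a given quasi-order, and a greedy leftmost match suffices to decide this.

```python
def embeds(u, v, leq=lambda a, b: a <= b):
    """Test whether the sequence u embeds into v: there must be a strictly
    increasing choice of positions in v whose elements dominate the elements
    of u, in order, under the quasi-order leq. Greedy leftmost matching is
    correct: any embedding can be shifted left step by step."""
    it = iter(v)  # shared iterator: each match consumes a prefix of v
    return all(any(leq(x, y) for y in it) for x in u)

print(embeds([1, 2], [0, 3, 1, 5]))  # True: 1 ≤ 3 and 2 ≤ 5
print(embeds([2, 2], [3, 1]))        # False: only one dominating position
```

Higman's lemma says that whenever `leq` is a well quasi-order on the alphabet, this embedding relation is a well quasi-order on finite sequences.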
Numerical Analysis and Epistemology of InformationMarco Benini
The slides of my presentation at the workshop "Philosophical Aspects of Computer Science", European Centre for Living Technology, University “Ca’ Foscari”, Venice, March 2015.
L'occhio del biologo: elementi di fotografiaMarco Benini
The slides of the course "L'occhio del biologo", Alta Formazione, Università degli Studi dell'Insubria.
It is a small course on the fundamentals of photography, oriented towards scientific photography in a biological laboratory.
Marie Skłodowska Curie Intra-European FellowshipMarco Benini
A brief report of my experience as a Marie Curie Research Fellow in Leeds, to illustrate to my colleagues what it means to participate in such a program.
I have to acknowledge the kind invitation of the Research Office of the Università degli Studi dell'Insubria and the Rector delegate to research, Prof. Umberto Piarulli.
By analysing the explanation of the classical heapsort algorithm via the method of levels of abstraction, mainly due to Floridi, we give a concrete and precise example of how to deal with algorithmic knowledge. To do so, we introduce a concept already implicit in the method, the 'gradient of explanations'. Analogously to the gradient of abstractions, a gradient of explanations is a sequence of discrete levels of explanation, each one refining the previous and varying in formalisation, thus providing progressive evidence for hidden information. Because of this sequential and coherent uncovering of the information that explains a level of abstraction (the heapsort algorithm in our guiding example), the notion of gradient of explanations allows one to precisely classify purposes in writing software according to the informal criterion of 'depth', and to give a precise meaning to the notion of 'concreteness'.
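For reference, the guiding example itself: a standard Python sketch of heapsort (ours, not taken from the paper), itself a natural subject for explanation at several levels of abstraction.

```python
def heapsort(xs):
    """Standard heapsort: build a max-heap in place, then repeatedly
    swap the root (the maximum) to the end and restore the heap."""
    a = list(xs)

    def sift_down(start, end):
        # Push a[start] down until the subtree rooted there is a max-heap.
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1  # pick the larger child
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    n = len(a)
    for start in range(n // 2 - 1, -1, -1):  # heapify
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):          # extract maxima
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)
    return a

print(heapsort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```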
This talk aims at introducing, through a very simple example, a way to represent data types in the λ-calculus, and thus in functional programming languages, so that the structure of the data types itself becomes a parameter.
This very simple technical trick allows us to reconsider programming as a way to express morphisms between models of a logical theory. As an application, it allows us to realise a way to perform anonymous computations.
From a philosophical point of view, the presented approach shows how it is possible to conceive a real programming system where properties like correctness of programs can be proved, but data cannot be inspected, not even in principle.
CORCON2014: Does programming really need data structures?Marco Benini
This talk tries to suggest how computer programming can be conceptually simplified by using abstract mathematics, in particular categorical semantics, so as to achieve the 'correctness by construction' paradigm while paying no price in terms of efficiency.
Also, it introduces an alternative point of view on what a program is and how to conceive data structures, namely as computable morphisms between models of a logical theory.
A description of the formal model behind Constructive Adpositional Grammars.
Presented at Proof Theory and Constructive Mathematics Seminar, School of Mathematics, University of Leeds (2011).
Intuitionistic First-Order Logic: Categorical semantics via the Curry-Howard ...Marco Benini
A novel approach to giving an interpretation of logic inside category theory. This work has been developed as part of my sabbatical Marie Curie fellowship in Leeds.
Presented at the Logic Seminar, School of Mathematics, University of Leeds (2012).
The increased availability of biomedical data, particularly in the public domain, offers the opportunity to better understand human health and to develop effective therapeutics for a wide range of unmet medical needs. However, data scientists remain stymied by the fact that data remain hard to find and to productively reuse, because data and their metadata i) are wholly inaccessible, ii) are in non-standard or incompatible representations, iii) do not conform to community standards, and iv) have unclear or highly restricted terms and conditions that preclude legitimate reuse. These limitations require a rethink of how data can be made machine- and AI-ready - the key motivation behind the FAIR Guiding Principles. Concurrently, while recent efforts have explored the use of deep learning to fuse disparate data into predictive models for a wide range of biomedical applications, these models often fail even when the correct answer is already known, and fail to explain individual predictions in terms that data scientists can appreciate. These limitations suggest that new methods to produce practical artificial intelligence are still needed.
In this talk, I will discuss our work in (1) building an integrative knowledge infrastructure to prepare FAIR and "AI-ready" data and services along with (2) neurosymbolic AI methods to improve the quality of predictions and to generate plausible explanations. Attention is given to standards, platforms, and methods to wrangle knowledge into simple, but effective semantic and latent representations, and to make these available into standards-compliant and discoverable interfaces that can be used in model building, validation, and explanation. Our work, and those of others in the field, creates a baseline for building trustworthy and easy to deploy AI models in biomedicine.
Bio
Dr. Michel Dumontier is the Distinguished Professor of Data Science at Maastricht University, founder and executive director of the Institute of Data Science, and co-founder of the FAIR (Findable, Accessible, Interoperable and Reusable) data principles. His research explores socio-technological approaches for responsible discovery science, which includes collaborative multi-modal knowledge graphs, privacy-preserving distributed data mining, and AI methods for drug discovery and personalized medicine. His work is supported through the Dutch National Research Agenda, the Netherlands Organisation for Scientific Research, Horizon Europe, the European Open Science Cloud, the US National Institutes of Health, and a Marie-Curie Innovative Training Network. He is the editor-in-chief for the journal Data Science and is internationally recognized for his contributions in bioinformatics, biomedical informatics, and semantic technologies including ontologies and linked data.
Richard's entangled adventures in wonderlandRichard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
This PDF is about schizophrenia.
For more details, visit the YouTube channel @SELF-EXPLANATORY:
https://www.youtube.com/channel/UCAiarMZDNhe1A3Rnpr_WkzA/videos
Thanks...!
Observation of Io’s Resurfacing via Plume Deposition Using Ground-based Adapt...Sérgio Sacani
Since volcanic activity was first discovered on Io from Voyager images in 1979, changes on Io's surface have been monitored from both spacecraft and ground-based telescopes. Here, we present the highest spatial resolution images of Io ever obtained from a ground-based telescope. These images, acquired by the SHARK-VIS instrument on the Large Binocular Telescope, show evidence of a major resurfacing event on Io's trailing hemisphere. When compared to the most recent spacecraft images, the SHARK-VIS images show that a plume deposit from a powerful eruption at Pillan Patera has covered part of the long-lived Pele plume deposit. Although this type of resurfacing event may be common on Io, few have been detected due to the rarity of spacecraft visits and the previously low spatial resolution available from Earth-based telescopes. The SHARK-VIS instrument ushers in a new era of high-resolution imaging of Io's surface using adaptive optics at visible wavelengths.
Nutraceutical market, scope and growth: Herbal drug technologyLokesh Patil
As consumer awareness of health and wellness rises, the nutraceutical market—which includes goods like functional meals, drinks, and dietary supplements that provide health advantages beyond basic nutrition—is growing significantly. As healthcare expenses rise, the population ages, and people increasingly want natural and preventative health solutions, this industry is expanding quickly. Product formulation innovations and the use of cutting-edge technology for customized nutrition further drive market expansion. With its worldwide reach, the nutraceutical industry is expected to keep growing and to provide significant opportunities for research and investment in a number of categories, including vitamins, minerals, probiotics, and herbal supplements.
Cancer cell metabolism: special Reference to Lactate PathwayAADYARAJPANDEY1
Normal Cell Metabolism:
Cellular respiration describes the series of steps that cells use to break down sugar and other chemicals to get the energy we need to function.
Energy is stored in the bonds of glucose and when glucose is broken down, much of that energy is released.
Cells utilize energy in the form of ATP.
The first step of respiration is called glycolysis. In a series of steps, glycolysis breaks glucose into two smaller molecules of a chemical called pyruvate. A small amount of ATP is formed during this process.
Most healthy cells continue the breakdown in a second process, called the Krebs cycle. The Krebs cycle allows cells to “burn” the pyruvates made in glycolysis to get more ATP.
The last step in the breakdown of glucose is called oxidative phosphorylation (Ox-Phos).
It takes place in specialized cell structures called mitochondria. This process produces a large amount of ATP. Importantly, cells need oxygen to complete oxidative phosphorylation.
If a cell completes only glycolysis, only 2 molecules of ATP are made per glucose. However, if the cell completes the entire respiration process (glycolysis - Kreb's - oxidative phosphorylation), about 36 molecules of ATP are created, giving it much more energy to use.
IN CANCER CELL:
Unlike healthy cells that "burn" the entire molecule of sugar to capture a large amount of energy as ATP, cancer cells are wasteful.
Cancer cells only partially break down sugar molecules. They overuse the first step of respiration, glycolysis. They frequently do not complete the second step, oxidative phosphorylation.
This results in only 2 molecules of ATP per each glucose molecule instead of the 36 or so ATPs healthy cells gain. As a result, cancer cells need to use a lot more sugar molecules to get enough energy to survive.
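A back-of-envelope check of the figures above (the constants are the approximate ATP yields quoted in the text):

```python
# Approximate ATP yields per glucose molecule, as quoted above.
ATP_GLYCOLYSIS = 2   # glycolysis only (typical of cancer cells)
ATP_FULL = 36        # full respiration (glycolysis + Krebs + Ox-Phos)

# How many glucose molecules must a glycolysis-only cell consume to
# match the ATP a fully respiring cell extracts from a single glucose?
ratio = ATP_FULL / ATP_GLYCOLYSIS
print(f"A glycolysis-only cell needs ~{ratio:.0f}x more glucose")
```

That factor of roughly 18 is what underlies the "glucose addiction" of tumor cells discussed next.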
Introduction to the WARBURG PHENOMENON:
WARBURG EFFECT: Usually, cancer cells are highly glycolytic (glucose addiction) and take up more glucose from outside than normal cells do.
Otto Heinrich Warburg (8 October 1883 – 1 August 1970) was awarded the Nobel Prize in Physiology or Medicine in 1931 for his "discovery of the nature and mode of action of the respiratory enzyme".
WARBURG EFFECT: the tendency of cancer cells under aerobic (well-oxygenated) conditions to metabolize glucose to lactate (aerobic glycolysis) is known as the Warburg effect. Warburg observed that tumor slices consume glucose and secrete lactate at a higher rate than normal tissues.
1. Point-free semantics of dependent type theories
M Benini, R Bonacina
Università degli Studi dell’Insubria
University of Canterbury,
December 4th, 2017
2. Why
Also known as constructive type theory, or Martin-Löf type theory,
dependent type theory has recently seen a huge rise of interest
because it is the basis for homotopy type theory.
Semantics for dependent type theories are known: they are variations
on the semantics of typed λ-calculi. Usually, they are complex:
they are either based on the advanced theory of orders (specialised
domains), or on category theory;
in the case of categorical models, they use non-elementary
constructions (fibrations, higher-order cells, . . . );
homotopy type theory has an intended semantics based on
∞-groupoids;
the only categorical semantics (Seely) which does not use those
advanced constructions contains a problem (i.e., it does not
work): locally Cartesian closed categories are not enough to
properly model dependent type theory (Hofmann, Dybjer).
( 2 of 17 )
3. Why
Why are complex, higher-order models needed? Are they, really?
These were the initial questions addressed in Roberta’s master
thesis. The answer was that there is no need for such complex
constructions: a categorical model, using no higher-order
constructions, suffices to provide a sound and complete
explanation of dependent type theory.
However, the result was not completely satisfactory because
the notion of inductive theory was sketched but not precisely
defined in all its details;
some passages in the soundness and completeness proofs were
reasonable but not formal;
overall, there was a feeling that the result had to be
polished to reach its maximal generality.
4. What
So, the visit here, at the University of Canterbury, had two purposes:
to precisely define the syntactic notion of inductive theory, and to
pave the way toward its extension to higher inductive types, as
defined in homotopy type theory;
to polish the semantics and fix all the passages in the soundness and
completeness proofs.
In short, we did it!
In the following, I am going to give a glimpse of the semantics.
5. Point-free semantics
Consider the following inductive types:
the 0 type, which is characterised as the type having no terms;
the type N⁻, which is characterised by the inductive step
n : N⁻ → succ(n) : N⁻ but not by the base step 0 : N⁻.
Usually, these two types are considered equivalent because they have
no terms belonging to them: the minimal fixed point of their
constructing rules is the same.
However, they can be distinguished: if x : 0 and y : N⁻ stand for some
objects in the types 0 and N⁻, it is evident that no object other than
x is forced to be in 0, while succ(y), succ(succ(y)), ... are all in N⁻.
We want a semantics in which the meaning of a type depends on the
context in which it is defined, so as to be able to distinguish 0 from N⁻.
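The distinction between 0 and N⁻ can be made concrete in a proof assistant. Here is a minimal Lean 4 sketch (the names Zero' and NMinus are mine, not part of the talk's formalism):

```lean
-- The 0 type: an inductive type with no constructors at all.
inductive Zero' : Type

-- N⁻: an inductive type with the step `succ` but no base case.
inductive NMinus : Type
  | succ : NMinus → NMinus

-- As minimal fixed points, both types are empty: no closed term inhabits them.
def zeroEmpty : Zero' → False := fun z => nomatch z

def nminusEmpty : NMinus → False := fun n =>
  match n with
  | NMinus.succ m => nminusEmpty m

-- Yet under an assumption `y : NMinus`, new terms are forced to exist:
example (y : NMinus) : NMinus := NMinus.succ (NMinus.succ y)
-- while an assumption `x : Zero'` forces no term other than `x` itself.
```

In the empty context the two emptiness proofs make the types indistinguishable; the final example shows how a non-empty context separates them, exactly as the slide argues.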
6. Point-free semantics
The semantics is then point-free: types and their terms do not
identify entities (“points”) in some universe. They explain how
judgements are kept together by the logical inferences. And they do
so by taking local values, which change under different assumptions.
So N⁻ is not equivalent to the 0 type: they are the same thing in the
empty context, while they differ in a context in which we assume both
types contain at least one term.
7. The big picture
The semantics is based on category theory.
An inductive theory, i.e., a series of inductive types defined in
standard dependent type theory, has models in the class of
ML-categories.
These categories allow one to interpret dependent type theory and
inductive types in a natural way. A category which makes valid all the
judgements of an inductive theory is a model, and it holds that
every inductive theory has a model (Soundness);
every judgement which is valid in every model of an inductive theory
is derivable (Completeness);
for every inductive theory there is a model which is contained in
every other model (Classification).
8. The big picture
[Diagram: the base category Mctx, with objects •, Γ, ∆, each carrying a fibre M•, MΓ, M∆ above it.]
Mctx is a partial order with minimum • in which all paths are finite;
each MΓ is a preorder with Γ as its minimum, such that each pair of
elements has a least upper bound;
Mctx and all the MΓ form the ML-category (for Martin-Löf).
9. Inside pyramids
Each MΓ has the following structure:
[Diagram: the pyramid over Γ, in four labelled layers. At the base sits the context Γ; above it, the proper terms a, b, each with a projection arrow π down to Γ; membership arrows ∈ send the proper terms to the proper types A, B in the next layer; further ∈ arrows send the proper types into the universes Ui, and each universe Ui into Ui+1.]
11. Interpretation
⟦x : A, y : B, z : C ctx⟧ is the path • ctx → x : A ctx → x : A, y : B ctx → x : A, y : B, z : C ctx in Mctx.
⟦Γ ⊢ a : A⟧ is, in MΓ, an object a together with a projection arrow π : a → Γ and a membership arrow ∈ : a → A.
⟦Γ ⊢ a ≡ b : A⟧ is, in MΓ, a pair of objects a and b, each with its arrows π and ∈ as above, related by an isomorphism i.
12. Variables
Variables have more than one role in dependent type theory:
they are hypotheses in the context;
they are terms in the language;
they are the only entities which may be substituted.
The first role is captured by interpreting contexts as objects in Mctx,
and their way of being written down as a path of irreducible arrows in
the same category.
The second role requires that there is an object in MΓ devoted to
interpreting x.
The third role imposes a deeper structure on the ‘pyramids’ over Mctx.
15. Variables
[Diagram: an irreducible arrow Γ → Γ, x : A in Mctx; over Γ, a term a with arrows π : a → Γ and ∈ : a → A, and A ∈ Ui; over Γ, x : A, the same a, A and Ui together with the new term x, with x ∼= a once a is substituted for x, making the two pyramids equivalent.]
an irreducible arrow in Mctx induces a new term x in the pyramid
over the codomain;
a term a of type A in Γ is such also in the extended context;
substituting a in x, i.e., making them isomorphic and closing for
type generation, forces the pyramids to be equivalent.
16. Inductive types
An inductive type is the minimal collection of terms closed under the
interpretation of its introduction rules.
Semantically, this means an inductive type is the colimit of the
diagram composed of the terms-in-context which result from the
closure of the transformation associated with the introduction rules.
For example, the dependent sum has the following introduction rule:

Γ ⊢ A : Ui    Γ, x : A ⊢ B : Ui    Γ ⊢ a : A    Γ ⊢ b : B[a/x]
────────────────────────────────────────────────────────── Σ-I
Γ ⊢ (a, b) : Σ x : A. B

The associated semantic transformation θ maps each pair of objects
α and β in MΓ, such that there are arrows ∈ : α → ⟦A⟧Γ and
∈ : β → ⟦B[a/x]⟧Γ, to an object θ(α, β) of MΓ.
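The same introduction principle can be read off an inductive definition in a proof assistant. A Lean 4 sketch (the primed name Sigma' is mine, chosen to avoid the standard library; this illustrates the rule, not the talk's semantic transformation θ):

```lean
universe u v

-- The dependent sum as an inductive type: its single constructor is
-- exactly the Σ-I rule, packaging a : A together with b : B a.
inductive Sigma' (A : Type u) (B : A → Type v) : Type (max u v)
  | mk : (a : A) → B a → Sigma' A B

-- Usage: a pair whose second component's type depends on the first.
def aPair : Sigma' Nat (fun n => Fin (n + 1)) :=
  Sigma'.mk 3 (0 : Fin 4)
```

The constructor mk has one argument per premise of Σ-I (the formation premises become the parameters A and B), matching the idea that the formation rule carves out the space S on which θ acts.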
17. Inductive types
In general, the guiding principle is that the formation rule is used to
identify the space S of terms which is transformed to construct the
terms of the new inductive type; so θ : S → MΓ.
The space S forms a category, and θ becomes a functor. As such, θ
has to be free, in one of the ways this adjective is interpreted in
category theory, to ensure that the construction is inductive.
The idea is that θ must be associated with T, the inductive theory as
a whole, not with each specific type. Then MΓ is the minimal fixed
point of the θ transformation in a category having enough terms to
interpret the context Γ, in which every variable is a distinct term.
18. Inductive types
Difficult to express, but mathematically straightforward, this idea of
inductive theory allows us to capture at once recursive types,
mutually recursive types, and even more esoteric beasts.
The formal framework also had a pleasant consequence: namely, all
the canonical types are inductive. So the dependent sum Σ, the
dependent product Π, the coproduct +, the empty type 0, the unit
type 1, and equality in a type =A can be treated in the very same way
as the natural numbers.
In fact, the non-inductive part of dependent type theory is reduced to
the structural rules (context formation, variables, judgemental
equality) and the rules about universes, which are structural in a
sense, because they are used to distinguish types.
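The claim that the canonical types are inductive can be illustrated in Lean 4, where most of them are given by plain inductive definitions (primed names are mine, to avoid clashing with the standard library; note that Lean itself takes the dependent product Π as primitive rather than inductive, unlike the framework of the talk):

```lean
inductive Empty' : Type                  -- the empty type 0

inductive Unit' : Type                   -- the unit type 1
  | star : Unit'

inductive Sum' (A B : Type) : Type       -- the coproduct +
  | inl : A → Sum' A B
  | inr : B → Sum' A B

inductive DSum (A : Type) (B : A → Type) : Type  -- the dependent sum Σ
  | mk : (a : A) → B a → DSum A B

inductive Eq' {A : Type} : A → A → Prop  -- equality =A
  | refl (a : A) : Eq' a a

inductive Nat' : Type                    -- and N, in the very same way
  | zero : Nat'
  | succ : Nat' → Nat'
```

Each declaration lists exactly the introduction rules of the corresponding canonical type, with the natural numbers receiving no special treatment.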
19. The state of the art
We are in the process of writing all of this down. At the moment, we
have the definition of ML-category, a few of its properties, and the
definition of interpretation written down in full detail.
Inductive types are completely developed; however, only the syntactic
side has been polished. Their semantics needs to be written down and
checked once again.
We still have to polish the definition of syntactic category and the
classification, soundness, and completeness theorems: they have been
developed in all their details, but not yet written down in the proper
mathematical style.
We have a rough sketch of how everything should work for
higher inductive types, but this is far from being a result, even on a
very optimistic view. . . not yet, at least!