This document discusses the limits of computation. It distinguishes intractable problems, which take an impractically long time to solve, from truly unsolvable problems, and describes complexity classes based on how fast the number of operations grows with input size. Hard problems such as the traveling salesman problem are in NP and remain inherently difficult even with faster computers or quantum computing. Reductions relate the difficulty of one problem to another. The halting problem and Gödel's incompleteness theorems establish fundamental limits of computation and of logical systems.
2. Hard vs Impossible
Intractable Problems – would take trillions of years to solve: practically unfeasible.
Unsolvable Problems – provably impossible, as opposed to merely hard – e.g. the mutilated chessboard problem.
5. Problem Types
Optimization Problems: find a min/max; must examine all the data.
Decision Problems: boolean answers – e.g. <, >; can stop as soon as we find what we are looking for.
Polynomial Problems (P): a feasible, tractable number of operations.
Non-Polynomial Problems (NP): n^2.5, 2^n, n! operations. (Note: "NP" formally means nondeterministic polynomial time, not "non-polynomial".)
E.g. encryption-key asymmetry: multiplying two primes takes about n^2 operations, while factoring the product back is believed to be non-polynomial.
6. Easy Problems (P)
E.g. arithmetic, searching, sorting.
Brute force vs algorithmic 'tricks' – e.g. binary search (split up, recurse).
The appropriate algorithm may depend on the type of input.
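As an illustration of such a 'trick', a minimal binary-search sketch (function name and example values are our own):

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent.
    Runs in O(log n) by halving the search range each step."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1      # target can only be in the right half
        else:
            hi = mid - 1      # target can only be in the left half
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # index 3
print(binary_search([2, 3, 5, 7, 11, 13], 4))   # -1
```

Note that the trick only applies because the input is sorted – an example of the algorithm depending on the type of input.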
7. Graph Theory
E.g. six degrees of separation: the maximum number of edges one must traverse to connect any 2 vertices.
The 7 bridges of Königsberg.
8. Easy Graph Problems
Euler Cycle problem.
Q: is there a path that starts and finishes at the same vertex and passes through every edge exactly once?
A: yes, iff the graph is connected and every vertex has an even number of edges.
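The even-degree criterion (plus connectivity) can be checked directly; a minimal sketch, with the Königsberg bridge graph as the classic negative example:

```python
from collections import defaultdict

def has_euler_cycle(edges):
    """Euler-cycle criterion: the graph is connected and every
    vertex has an even number of incident edges."""
    degree = defaultdict(int)
    adj = defaultdict(set)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
        adj[u].add(v)
        adj[v].add(u)
    # connectivity check over vertices that have at least one edge
    vertices = list(degree)
    seen, stack = set(), [vertices[0]]
    while stack:
        x = stack.pop()
        if x in seen:
            continue
        seen.add(x)
        stack.extend(adj[x])
    if len(seen) != len(vertices):
        return False
    return all(d % 2 == 0 for d in degree.values())

# A square (cycle on 4 vertices) has an Euler cycle...
print(has_euler_cycle([(0, 1), (1, 2), (2, 3), (3, 0)]))  # True
# ...but the Königsberg graph does not: all 4 land masses have odd degree.
konigsberg = [('A', 'B'), ('A', 'B'), ('A', 'C'), ('A', 'C'),
              ('A', 'D'), ('B', 'D'), ('C', 'D')]
print(has_euler_cycle(konigsberg))  # False
```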
9. Easy Graph Problems
Euler Path problem.
Q: is there a path that passes through every edge exactly once?
A: yes, iff at most 2 vertices have an odd number of edges and the rest have an even number (in a connected graph).
11. Traveling Salesman Problem
Travel the shortest route that visits each city once (a weighted graph).
Brute force: n! operations.
E.g. for n = 100: for each route we need to add up the distances between the 100 cities and compare the sum to the shortest found so far.
12. Traveling Salesman Problem
100! ≈ 9.33 × 10^157.
Given a computer that can check 1 million routes per second → about 3 × 10^144 years.
There are about 10^80 atoms in the universe – if each were such a computer, this would still take 10^62 centuries. And this is just for n = 100!
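The n! brute force can be sketched for a toy instance (the 4-city distance matrix is invented for illustration; only tiny n is feasible, which is the whole point):

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Try all (n-1)! routes starting at city 0 - O(n!) as on the slide.
    dist is an n x n symmetric distance matrix."""
    n = len(dist)
    best_len, best_route = float('inf'), None
    for perm in permutations(range(1, n)):
        route = (0,) + perm + (0,)           # close the tour
        length = sum(dist[route[i]][route[i + 1]] for i in range(n))
        if length < best_len:
            best_len, best_route = length, route
    return best_len, best_route

# Invented 4-city instance
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(tsp_brute_force(d))
```

At n = 4 this checks only 3! = 6 routes; at n = 100 it would need the 10^157 route checks described above.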
13. Hamiltonian Cycle Problem
Is there a path that passes through every vertex exactly once and starts and finishes at the same vertex?
Brute force: n! operations.
Similar to the Euler Cycle – but unfortunately no shortcut has been discovered (and probably won't be)!
14. Hard Set Problems
Set Partition Problem: split a set of natural numbers into 2 groups that both sum to the same number.
First evaluate solvability: if the total sum is odd, no partition exists.
Brute force: 2^n subsets.
Subset Sum Problem: for a given set of natural numbers and a natural number C (a capacity), is there a subset that sums to C? E.g. filling a room.
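The 2^n brute force for Subset Sum might look like this (a sketch; the example set is our own):

```python
from itertools import combinations

def subset_sum(numbers, capacity):
    """Brute force over all 2^n subsets, as on the slide."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == capacity:
                return combo   # a witness subset
    return None

# Is there a subset of {3, 34, 4, 12, 5, 2} summing to 9?
print(subset_sum([3, 34, 4, 12, 5, 2], 9))
```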
15. The Satisfiability Problem
Deals with logical statements built from ^ (and), v (or), ~ (not), → (implies).
E.g. (p v ~q) → ~(p ^ q) – is this true if p = true and q = false?
Generalize: can we assign boolean values to the variables to make the statement true?
Logic rules – e.g. Modus Ponens: ((p → q) ^ p) → q. If today is Tuesday, then John will go to work. Today is Tuesday. Therefore, John will go to work.
Brute force: truth-table lookup over 2^n assignments.
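The 2^n truth-table lookup can be sketched as follows, encoding the slide's example formula with Python's boolean operators (A → B rewritten as (not A) or B; names are our own):

```python
from itertools import product

def satisfiable(formula, variables):
    """Truth-table lookup: try all 2^n assignments, as on the slide.
    `formula` takes a dict mapping variable names to booleans."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            return assignment   # a satisfying assignment
    return None

# (p v ~q) -> ~(p ^ q), with A -> B encoded as (not A) or B
f = lambda a: (not (a['p'] or not a['q'])) or not (a['p'] and a['q'])
print(satisfiable(f, ['p', 'q']))
```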
16. NP Problems Are Inherently Hard
Intractability is not because we lack the technology.
Faster computers – even at 10,000 times the speed, the TSP(100) problem still takes 3 × 10^140 years.
Parallelization – similarly, performing 10,000 simultaneous operations: same story.
Quantum computing – superposition of search states lets us examine multiple possibilities at once. Searching a list of size n takes about √n operations; but searching through all n! candidate solutions to an NP problem still takes about √(n!) operations – astronomically large (√(100!) ≈ 10^79).
17. Reduction
Transforming a problem – a reduction of problem x to problem y: if we can solve y, then we can definitely solve x.
I.e. x is as hard as, or easier than, y: x <=p y.
Climbing K2 is reducible to climbing Mount Everest.
The Set Partition Problem is reducible to the Subset Sum Problem.
The Hamiltonian Cycle Problem is reducible to the Traveling Salesman Problem.
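The Set Partition → Subset Sum reduction can be sketched concretely: a set splits into two equal-sum halves exactly when some subset sums to half the total (a brute-force subset-sum solver is assumed; example values are our own):

```python
from itertools import combinations

def subset_sum(numbers, capacity):
    """Brute-force Subset Sum solver (the 'y' we know how to solve)."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == capacity:
                return combo
    return None

def set_partition(numbers):
    """Reduction: Set Partition <=p Subset Sum.
    Ask Subset Sum for a subset totalling half the sum; its complement
    is the other half. An odd total rules a partition out immediately."""
    total = sum(numbers)
    if total % 2 == 1:
        return None
    half = subset_sum(numbers, total // 2)
    if half is None:
        return None
    rest = list(numbers)
    for x in half:
        rest.remove(x)
    return list(half), rest

print(set_partition([1, 5, 11, 7]))
```

Any improvement to the Subset Sum solver immediately speeds up Set Partition as well – that is what the reduction buys us.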
19. NP-Complete and P != NP
NP-Complete problems – the hardest NP problems: every NP problem is reducible to them. If we could find a polynomial-time algorithm for any one of them, every NP problem would be in P.
Unsolved problem: we know P is a subset of NP; the open question is whether NP is a subset of P, which would give P = NP. Most researchers believe P != NP.
20. Approximation Algorithms
If a problem is NP-hard, often the best we can do is heuristics – better than waiting 400 trillion centuries.
Traveling Salesman Problem approximation: use a greedy algorithm – e.g. OSPF routing, Nearest Neighbor.
Set Partition Problem approximation: extreme pairs – repeatedly pick the min and max from the source set and place them in one partition, alternating partitions each round.
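A minimal Nearest Neighbor sketch on an invented 4-city instance (greedy is fast, O(n^2), but carries no optimality guarantee):

```python
def tsp_nearest_neighbor(dist, start=0):
    """Greedy TSP heuristic: from each city, move to the nearest
    unvisited city, then return to the start."""
    n = len(dist)
    route = [start]
    unvisited = set(range(n)) - {start}
    while unvisited:
        here = route[-1]
        nxt = min(unvisited, key=lambda c: dist[here][c])
        route.append(nxt)
        unvisited.remove(nxt)
    route.append(start)   # close the tour
    length = sum(dist[route[i]][route[i + 1]] for i in range(n))
    return length, route

# Invented 4-city instance
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(tsp_nearest_neighbor(d))
```

On this tiny instance the greedy tour happens to be optimal; in general Nearest Neighbor can be arbitrarily worse than the best route.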
21. Even Harder Problems
Super-exponential – e.g. 2^(2^n) (for n = 10, that is 2^1024 operations), or (n!)!.
PSPACE (a superset of NP) – problems that demand a polynomial amount of space.
E.g. is there a winning strategy for n × n tic-tac-toe?
Chess, Checkers, Nim, Go.
22. Computing Impossibilities
Q: Can computers recognize art or make moral decisions? A: quantification of aesthetics and ethics is subjective.
The Halting Problem.
The Incompleteness Theorems.
23. Some Definitions
Science – the language we use to describe and predict the physical, measurable universe.
Applied Maths – the language of science.
Logic – the language of reason.
Epistemology – the philosophy of human knowledge and its limits.
24. Logic
Use symbols to reason about the structure of proofs – set up axiom systems to put mathematics on a firm foundation.
Peano Arithmetic – an axiom system for the natural numbers.
Symbolization / arithmetization – conversion between logical symbolic systems and numeric systems.
25. Logical Paradoxes
Input: axioms + assumptions; processing: the laws of logical reasoning; output: logically unacceptable falsehoods, self-contradictions, nonsensical "facts" → jettison the assumptions.
Often counterintuitive (e.g. "space is continuous").
Reductio ad absurdum (proof by contradiction).
26. Logical Paradoxes
Self-referential paradoxes – e.g. the Liar Paradox: "this sentence is false".
Reduction – infer limitations from other problems.
Reality, science, maths, and logic cannot permit contradictions – but our minds and language can: vagueness, jokes.
28. The Halting Problem
Alan Turing, 1936: a proof by contradiction (in the style of the Liar Paradox) that the Halting Problem is undecidable.
A limitation of mechanized processes: it's not a hard problem, it's an impossible problem.
A computer (or brain?) cannot determine, in general, whether a black-box program given input x will terminate (with a return code) or enter an infinite loop. Processing …
https://www.youtube.com/watch?v=92WHN-pAFCs
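Turing's diagonal argument is often sketched in code: assume a halts() decider exists, then build a program that does the opposite of its own prediction (all names here are hypothetical; the point of the proof is that no correct halts() can be written):

```python
def halts(program, argument):
    """Suppose, for contradiction, this correctly returns True
    iff program(argument) terminates."""
    raise NotImplementedError("no such total decider can exist")

def paradox(program):
    """Do the opposite of whatever halts() predicts about
    running `program` on itself."""
    if halts(program, program):
        while True:      # predicted to halt -> loop forever
            pass
    else:
        return           # predicted to loop -> halt immediately

# paradox(paradox) would halt iff halts() says it doesn't halt -
# a contradiction either way, so halts() cannot exist.
```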
29. Oracle Machines & Turtles
Imagine a non-mechanical, mystic Oracle Machine – a Halt Oracle that can solve the Halting Problem. It could settle many other unsolved maths problems.
E.g. the Goldbach Conjecture: every even number greater than 2 is the sum of 2 primes. (We know this holds up to 10^17.)
Write a program that searches for a counterexample – by telling us whether that program halts, the Halt Oracle tells us whether a counterexample exists.
Turing: the Halting Problem for machines equipped with a Halt Oracle is, in turn, undecidable by those machines – oracles all the way down.
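The oracle's counterexample search can be made concrete for a finite range (a sketch with our own names; the real search program would loop over all even numbers without bound, and the oracle would decide whether it ever halts):

```python
def is_prime(n):
    """Trial division - slow but correct."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_witness(n):
    """Return primes (p, q) with p + q = n, or None.
    None for an even n > 2 would be a Goldbach counterexample."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    return None

# Check every even number in a finite range; the oracle's program
# would run this search unboundedly and halt only on a counterexample.
print(all(goldbach_witness(n) is not None for n in range(4, 1000, 2)))
```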
30. Computers vs Minds
Q: Can a human solve the Halting Problem?
A: Kurt Gödel, Sir Roger Penrose, Douglas Hofstadter – consciousness arises from the ability to self-reference. Self-reference in computers brings paradoxical limitations, while in humans it causes consciousness???
31. Incompleteness
Kurt Gödel – the man who broke Logic.
Gödel Sentence: "This logical statement is unprovable" → a contradiction if it could be proved.
1st Incompleteness Theorem – there are logical statements that are true but unprovable.
32. Incompleteness
If Peano Arithmetic is consistent, then the Gödel sentence is unprovable and true → a limitation of basic arithmetic: it cannot determine when its own statements are true / consistent.
There are stronger systems that can prove it – but each of them has a Gödel sentence of its own.
33. Incompleteness
The consistency of the logical systems at the basis of mathematics and science is beyond the bounds of reason!
Note: the majority of mathematical work does not require working from the basic axiomatic proofs.