This document summarizes and compares the views of Thurston and Penrose on the nature of mathematical truth and methods. Both argue that non-logical and non-computational methods like language, metaphor and visualization are useful for understanding mathematics. However, they differ on whether these methods are part of mathematics itself or external to it. The document also discusses the author's disagreement with some of their claims, such as whether non-logical methods must be justified logically, and analyzes their differing views of mathematics as either a fixed or expanding domain.
Zachary Balder
June 9, 2014
Logic, Computation, and Understanding:
The Three Roads to an Expanding Reality
William Thurston, in “On Proof and Progress in Mathematics”, and Roger Penrose, in
“Mathematical Intelligence”, address two major questions: how does one arrive at mathematical
truth, and what methods does mathematics include? Thurston and Penrose argue that there are
other methods besides logic and computation that lead to objective truth. Thurston mentions
alternative tools such as language, metaphor, and visualization that one uses to comprehend and
represent mathematical constructs. Penrose concurs that these methods are useful in ascertaining
truth. But are these alternative methods within the scope of mathematics, or are they external to
it? Penrose makes clear that mathematics is a fixed domain that includes forms of non-
computational thought. Thurston, however, paints a different picture; he depicts mathematics as
a network that extends outward to include new methods.
Thurston’s description of mathematics is more convincing than Penrose’s description.
Thurston acknowledges the unpredictable nature of mathematical progress, and he portrays
mathematical practice as a human process. Mathematics is not, as Penrose’s model suggests, a
concrete realm unaffected by human intellectual pursuits. Mathematics bears a strong
resemblance to Thurston’s convoluted system, constantly sprouting in new areas where
mathematicians are exploring.
I disagree with a few claims Thurston and Penrose make. Whereas non-logical methods
are useful for paving new paths of understanding, they must be justified a posteriori with logic.
Thurston is too soft on this point; he argues that if non-logical methods are deductively sound,
logical validation is unnecessary. Penrose, on the other hand, comes off too strong. Mathematical
understanding has a unique character, he claims, and it transcends computational reduction.
Referencing biology and the environment, Penrose tries to separate understanding from
computation. In the end, his argument is inconsistent. Before elaborating on my criticism, I will
flesh out both arguments, set them next to each other, and see how they interact.
Penrose demonstrates how non-computational methods are useful in understanding
mathematics and reaching objective truth. One non-computational method Penrose highlights is
geometric visualization. Penrose uses a two-dimensional rectangular array, or grid, to verify the equation a × b = b × a for all natural numbers a and b (Penrose 122). A rectangular array with a rows and b columns contains the same number of objects as an array with b rows and a columns. This number is a × b, or equivalently b × a. Counting the number of objects inside either array is a simple computational procedure. However, the thought of constructing a rectangular array to prove the equation a × b = b × a is not. According to Penrose, this process necessitates human intuition and higher-level forms of thought, which transcend mere calculation (Penrose 115).
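Penrose's grid picture can be checked mechanically for any particular pair of numbers. A short Python sketch (the values 4 and 7 are illustrative choices made here, not Penrose's) makes the distinction concrete: the counting is pure computation, while seeing that one picture settles every case is not.

```python
# Counting the cells of a rectangular array one by one, in the
# spirit of Penrose's grid picture. The count itself is a purely
# computational procedure.

def count_grid(rows, cols):
    """Count the objects in a rows-by-cols rectangular array."""
    count = 0
    for _ in range(rows):
        for _ in range(cols):
            count += 1
    return count

# One particular instance of a * b = b * a (values chosen here
# purely for illustration):
a, b = 4, 7
assert count_grid(a, b) == count_grid(b, a) == 28
```

The program verifies single instances only; recognizing that rotating the array by ninety degrees proves the law for all natural numbers at once is the higher-level step the essay attributes to Penrose.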
Thurston agrees with Penrose’s point that one may use alternative methods to achieve
mathematical understanding. Thurston does not, however, explicitly mention any form of the word “computation”. Rather, he discusses logic and its role in mathematics. Thurston and Penrose cannot speak directly to each other because, when taken literally, they are discussing different things. For the purposes of this essay, Penrose's computation and Thurston's logic can be viewed through the same lens: as base-level procedural methods, or rules of inference, used in mathematics. Although Thurston does not explicitly mention computation in his essay, his discussion of logic has a flavor similar to Penrose's treatment of computation.
Thurston describes several routes, besides logic, that lead to mathematical understanding.
One may think of a derivative as the slope of the tangent line to a graph, as the speed of a position function of time, or as the ratio of the infinitesimal change in the value of a function to the infinitesimal change in its argument. One does not have to formulate a logical description of a
derivative to understand one or more of its manifestations. That is, one does not necessarily have
to understand f′(x) = lim_{h→0} [f(x + h) − f(x)] / h to understand the derivative as the slope of the tangent line (Thurston 3). Mathematics is understood in many ways, and this understanding can be
communicated through several different mediums. Informal talks, formal lectures, and academic
papers are all ways through which people convey mathematical understanding (Thurston 6). The
multifarious nature of mathematical understanding, and the multifarious nature of the
communication of mathematical understanding show that mathematics is more than logic.
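The logical definition and the tangent-slope picture can even be set side by side numerically. In this Python sketch (the function f(x) = x² and the step sizes are illustrative choices made here, not Thurston's), the difference quotient from the formal definition approaches the tangent-line slope 2x = 6 at x = 3:

```python
# The derivative's formal definition, f'(x) = lim_{h->0} (f(x+h) - f(x)) / h,
# approximated with shrinking h, compared against the tangent-line
# slope of f(x) = x^2, which is 2x.

def f(x):
    return x * x

def difference_quotient(f, x, h):
    """One term of the limit in the logical definition of f'(x)."""
    return (f(x + h) - f(x)) / h

x = 3.0
tangent_slope = 2 * x  # the slope-of-the-tangent-line representation: 6.0
for h in (1.0, 0.1, 0.001):
    print(h, difference_quotient(f, x, h))  # tends toward 6.0 as h shrinks
```

Neither the printout nor the formula alone exhausts the concept; on Thurston's view, each representation illuminates a different facet of the same derivative.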
One may take Thurston’s argument a step further: without using alternative methods, it is
impossible to gain complete mathematical understanding. For instance, one cannot fully
understand the concept of a derivative without understanding the derivative as the slope of the
tangent line or as the speed of the position function. These are separate from logical definitions
(Thurston 3), and non-logical methods for representing a derivative are necessary for holistic
comprehension. In other words, one must look at the picture from every angle to understand it
completely. With only the logical viewpoint, one gains, at best, a partial understanding.
Penrose agrees with Thurston on this point, and he describes a case in which one must
use non-computational methods to reach objective truth. Suppose one has a procedure that
determines whether a calculation, given a certain input, does not terminate. That is, the procedure
terminates if the calculation does not terminate. Using the procedure as the procedure’s input,
one reaches a contradiction. If the procedure terminates, then the procedure does not terminate; if
the procedure does not terminate, then the procedure terminates (Penrose 130). One cannot use
computation – in this case, the actual procedure – to find the contradiction. Penrose concludes,
“Mathematicians do not simply ascertain mathematical truth by means of knowably sound
calculational procedures” (Penrose 131). Rather, one must use higher intuition to recognize the
impossibility of having a decision procedure for non-terminating calculations.
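The contradiction Penrose invokes is the classic diagonal argument, and it can be dramatized in code. The following Python sketch is a standard rendering with naming chosen here, not Penrose's own presentation: if a total procedure could decide whether the paradoxical program halts on itself, then whichever verdict it gave would be refuted by the program's actual behavior.

```python
# A standard sketch of the diagonal argument (a rendering with
# hypothetical names, not Penrose's own text). Suppose a sound,
# total procedure halts(p) decided whether p(p) terminates. One
# could then write:
#
#     def paradox(p):
#         if halts(p):       # p(p) would terminate...
#             loop_forever()  # ...so do the opposite and never stop
#         else:
#             return          # p(p) would never stop, so stop
#
# paradox(paradox) then terminates if and only if it does not.
# The function below records that whatever halts() claims about
# paradox(paradox), the actual behavior is the opposite.

def actual_behavior(claimed_by_halts):
    """Given halts()'s claimed verdict on paradox(paradox), return
    what paradox(paradox) would actually do."""
    return not claimed_by_halts

for claim in (True, False):
    assert actual_behavior(claim) != claim  # every possible verdict fails
```

Since both possible verdicts are refuted, no such total decision procedure can exist, which is the impossibility the essay says must be grasped by higher intuition rather than by running the procedure itself.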
Penrose and Thurston argue that non-computational and non-logical methods are needed
for reaching mathematical truth and achieving full understanding. Are these methods within the
scope of mathematics? Penrose argues that mathematical thinking, by its nature, contains many
aspects of general intelligence: intuition, common sense, and the appreciation of beauty (Penrose
107). In the first chapter of his book, The Road to Reality: A Complete Guide to the Laws of the
Universe, Penrose describes a platonic world containing all of the mathematical truths (Penrose
11-12). The platonic realm is finite and separate from the mental and physical worlds. To some
degree, the mental world projects onto the platonic realm (Penrose 20). That is, certain human
thought processes lead to objective truth. Penrose characterizes the field of mathematical practice
as a finite world containing those methods that bridge the gap between the mental sphere and the
platonic world. These methods, all of which belong to mathematics, include logic and
computation as well as higher-level thought procedures.
Thurston would agree with Penrose’s point that mathematics includes higher-level
thought procedures. He would, however, disagree with Penrose’s characterization of
mathematics. Thurston defines mathematics as number theory, plane and solid geometry, along
with other subject matter that mathematicians study (Thurston 2). Mathematicians – those
individuals who study mathematics and advance mathematical understanding – are responsible
for setting the mathematical agenda. As human understanding grows more complex,
mathematics, as a field, expands and annexes the new territory mathematicians are exploring.
Penrose illustrates mathematics as a finite realm containing those methods that link the mental
world to the platonic world. This realm is unaffected by human activity. Thurston, on the other
hand, depicts mathematics as an ever-expanding network driven by human intellectual pursuits
that is always incorporating new methods.
Penrose suggests that mathematics always has included, and always will include, every method that leads to objective truth. In the first chapter of The Road to Reality, Penrose claims that Fermat's Last Theorem has always been true, even before a logically sound proof was formulated (Penrose 14). That is, Fermat's Last Theorem has always existed in the platonic realm, and, assuming there are humanly understandable methods for ascertaining it, these methods have always existed in mathematics. The methods were well concealed, so it took mathematicians a while to uncover them. Nevertheless, they have always been a part of mathematics, according to
Penrose.
Thurston would most likely reject Penrose’s claim that mathematics spans all known and
unknown methods for reaching objective truth. In stark contrast to Penrose's fixed realm of
mathematical methods, Thurston's illustration of mathematics resembles a tree that is constantly
growing and branching outwards. It would be absurd to say the sapling is the same as the
full-grown tree, and it is similarly difficult to say that mathematics now is the same as the
mathematics of the future.
Mathematicians explore new frontiers and mathematics incorporates new elements. At present,
mathematics cannot possibly contain all potential mathematics because there is no way to predict
in which direction mathematicians will take the field. In the distant future, mathematicians
may deem Shakespeare an essential component of mathematical understanding. Thurston’s
model of mathematics, unlike Penrose’s, accounts for unpredictable advances in mathematical
understanding.
Thurston’s model is a more accurate representation of mathematics than Penrose’s model
because it treats mathematical practice as a human process. Penrose may argue that mathematics
is bound to a certain path, and mathematicians simply follow the trail. This argument seems
unreasonable to me. Penrose might as well say that the sapling is bound to become the tree it will
become. This is not true, because environment is a decisive factor: the fertility of the soil and
the availability of water and sunlight shape the sapling's growth. If one alters these factors, the
sapling will grow into a different tree or will not grow at all. Tree growth is contingent on
environmental conditions. Likewise, the mathematical agenda is influenced by those
who study mathematics. If the only mathematicians were set theorists, mathematics would look
quite different than if geometric topologists were the only mathematicians. Unlike a computer
program, mathematics is a creative human process that changes as people change.
Mathematicians, ultimately, are the ones who dictate the script. Thurston’s model realizes this
fact while Penrose’s model disregards it.
Though I prefer Thurston’s model of mathematics, I disagree with his view of logical
verification. At one point in his essay, he states that if the toaster works, one does not need to
check the manual. That is, if a proof follows a deductively coherent argument, one does not need
to check the formalized structure of the proof (Thurston 7). While non-logical methods are
necessary for ascertaining mathematical concepts, they cannot support the building by
themselves. They provide the blueprint for the edifice, but one must turn to computation and
logic, the brick and mortar of mathematics, when verifying the proof.
While Thurston dismisses the necessity of using logic to evaluate mathematical
understanding, Penrose claims that higher-level thought transcends computational reduction.
Much of human thought is beyond the scope of computation, he argues, and natural selection
favors higher-level understanding (Penrose 134). This view, that understanding is biologically
endowed and that humans happened to win the evolutionary lottery, is overly anthropocentric. I can
accept that human understanding is the product of anatomy and physiology. I cannot accept the
argument that understanding is some special, indefinable human quality perpetuated by natural
selection. Were understanding perpetuated by natural selection, there would have to be some link
to physiological phenomena. While human biological processes are extremely complex, they are
the product of physical law and, to some degree, computation. It follows that understanding would
be a higher-level product of physical law and computation. While it is beyond human capacity to
make sense of the computational reduction of understanding, that reduction nonetheless exists.
Penrose’s claim that understanding is some intangible quality favored by natural selection
is unreasonable. To make the biological argument, Penrose must link understanding to human
physiology, which is governed by physical law and computation. In doing so, Penrose would
have to concede a link between understanding and computation. He is therefore inconsistent in
suggesting that understanding transcends computational reduction.
Penrose then gives an escapist argument: he attributes the non-computational character of
understanding to the non-computational character of the environment (Penrose 135). The
environment is governed by a seemingly infinite number of computations, so outcomes appear
random. But there is always a clear cause and effect. A tree sapling, when placed in
nutrient-rich soil and watered regularly, will grow more healthily than a sapling lacking nutrients
and water. Observing a grove of trees, I may see a random distribution between healthy and sick
trees. However, when I take all of the minute factors into consideration – the amount of water,
nutrients, and sunlight each tree receives – the randomness does not seem so random. The
appearance of non-computational randomness is due to the human inability to perform every
computation and take every factor into consideration. Penrose is wrong to suggest that
understanding transcends computational reduction when that reduction merely lies beyond the
scope of human comprehension.
Penrose and Thurston argue that non-computational and non-logical methods are
necessary for reaching objective truth and gaining understanding. Both authors agree these
methods are part of mathematics. Penrose and Thurston, however, depict mathematics
differently. Penrose illustrates a fixed realm containing all methods that lead to objective truth.
This realm is unaffected by human activity, and mathematicians discover rather than create.
Thurston, instead, portrays mathematics as a network that expands outward to include new
methods. Thurston’s model is more accurate because it shows that mathematics proceeds in new,
unpredictable directions. Moreover, Thurston’s model treats mathematical practice as a process
shaped by human intellectual pursuits.
Though I prefer Thurston’s argument, I take issue with claims made by both authors.
Thurston downplays the importance of logically verifying mathematical proofs while Penrose
asserts the irreducibility of higher-level understanding to computational procedures. To
Thurston, I say that non-logical methods are useful for constructing new ways of thinking, but
logic must be used a posteriori to support them. To Penrose, I say that just because the
computational reduction of understanding is impossible for humans to comprehend or carry out
does not mean it is not there. I do not mean to blur the line between computation and higher-level
thought; the two are different even though the latter consists of the former. I think it is beautiful
that something as complex as understanding is nothing more than levels upon levels of
computations, and yet so much more.