History of Information: Classical, Medieval, Modern theory
Open problems of Information: the unification of various theories of information; What is useful/meaningful information? What is an adequate logic of information? Continuous versus discrete models of nature; Computation versus thermodynamics; Classical information versus quantum information; Information and the theory of everything; The Church-Turing Hypothesis; P versus NP?
It from Bit: Why the Quantum? It from Bit? A Participatory Universe?: three far-reaching, visionary questions from John Archibald Wheeler
Physics, Math, Information: String Theory, Quantum, Sporadic finite Groups, Leech Lattice, gravity as emergent
Universe digital copy conjecture: representation of universal information
Emergent Transformation Conjecture: the mathematics of emergence
Potential applications: Deep Learning; Capability Transformation using Enterprise Architect
What's in it for us: Information science, getting ready for Industry 4.0
1. It from Bit – Information Theory State of Art
Nguyen The Trung
05.01.2017
1/4/2017 1
2. Content: It from Bit – Information Theory State of Art
1. History of Information: Classical, Medieval, Modern theory
2. Open problems of Information: the unification of various theories of information; What is useful/meaningful information? What is an adequate logic of information? Continuous versus discrete models of nature; Computation versus thermodynamics; Classical information versus quantum information; Information and the theory of everything; The Church-Turing Hypothesis; P versus NP?
3. It from Bit: Why the Quantum? It from Bit? A Participatory Universe?: three far-reaching, visionary questions from John Archibald Wheeler
4. Physics, Math, Information: String Theory, Quantum, Sporadic finite Groups, Leech Lattice, gravity as emergent
5. Universe digital copy conjecture: representation of universal information
6. Emergent Transformation Conjecture: the mathematics of emergence
7. Potential applications: Deep Learning; Capability Transformation using Enterprise Architect
8. What's in it for us: Information science, getting ready for Industry 4.0
3. History of Information

Classical philosophy
‘Information’ was a technical notion associated with a theory of knowledge and ontology that originated in Plato's (427–347 BCE) theory of forms, developed in a number of his dialogues (Phaedo, Phaedrus, Symposium, Timaeus, Republic). Various imperfect individual horses in the physical world could be identified as horses because they participated in the static, atemporal and aspatial idea of ‘horseness’ in the world of ideas or forms.
On various occasions Aristotle mentions the fact that Plato associated ideas with numbers (Vogel 1974, p. 139). Although formal mathematical theories about information only emerged in the 20th century, and one has to be careful not to interpret the Greek notion of a number in any modern sense, the idea that information was essentially a mathematical notion dates back to classical philosophy: the form of an entity was conceived as a structure or pattern that could be described in terms of numbers. Such a form had both an ontological and an epistemological aspect: it explains the essence as well as the understandability of the object. The concept of information was thus, from the very start of philosophical reflection, already associated with epistemology, ontology and mathematics.
Medieval philosophy
Reflection on the notion of informatio is taken up, under the influence of Avicenna, by thinkers like Aquinas (1225–1274 CE) and Duns Scotus (1265/66–1308 CE). When Aquinas discusses the question whether angels can interact with matter, he refers to the Aristotelian doctrine of hylomorphism (i.e., the theory that substance consists of matter (hylē: wood, matter) and form (morphè)). Here Aquinas translates this as the in-formation of matter (informatio materiae) (Summa Theologiae, 1a 110 2, Capurro 2009). Duns Scotus refers to informatio in the technical sense when he discusses Augustine's theory of vision in De Trinitate, XI Cap 2 par 3 (Duns Scotus, 1639, De imagine, Ordinatio, I, d.3, p.3).
The tension that already existed in classical philosophy between Platonic idealism (universalia ante res) and Aristotelian realism (universalia in rebus) is recaptured as the problem of universals: do universal qualities like ‘humanity’ or the idea of a horse exist apart from the individual entities that instantiate them? It is in the context of his rejection of universals that Ockham (c. 1287–1347 CE) introduces his well-known razor: entities should not be multiplied beyond necessity. Throughout their writings Aquinas and Scotus use the Latin terms informatio and informare in a technical sense, although this terminology is not used by Ockham.
Rationalism: The Cartesian notion that ideas are innate and thus a
priori. This view is difficult to reconcile with the concept of
experimental science.
Empiricism: Concepts are constructed in the mind a posteriori on the
basis of ideas associated with sensory impressions.
Locke's reinterpretation of the notion of idea as a ‘structural
placeholder’ for any entity present in the mind is an essential step in
the emergence of the modern concept of information. ... opens the
gate to a reconstruction of knowledge as an extensive property of an
agent: more ideas implies more probable knowledge.
In the second half of the 17th century the formal theory of probability
was developed by researchers like Pascal (1623–1662), Fermat (1601 or
1606–1665) and Christiaan Huygens (1629–1695). For these authors
the world was essentially mechanistic and thus deterministic;
probability was a quality of human knowledge caused by its
imperfection. Here knowledge about the future as a degree of belief
is measured in terms of probability, which in its turn is explained in
terms of the number of configurations a deterministic system in the
world can have.
With this new concept of knowledge empiricists laid the foundation
for the later development of thermodynamics as a reduction of the
secondary quality of heat to the primary qualities of bodies.
According to Kant, … the human mind can evaluate its own capability
to formulate scientific judgements. Kant developed transcendental
philosophy as an investigation of the necessary conditions of human
knowledge.
Source: https://plato.stanford.edu/entries/information/index.html
1/4/2017 3
4. Historical development of the meaning of the
term ‘information’
‘Information’ as the process of being informed.
• This is the oldest meaning one finds in the writings of authors like Cicero (106–43 BCE) and Augustine (354–
430 CE) and it is lost in the modern discourse, although the association of information with processes (i.e.,
computing, flowing or sending a message) still exists. In classical philosophy one could say that when I
recognize a horse as such, then the ‘form’ of a horse is planted in my mind.
‘Information’ as a state of an agent,
• i.e., as the result of the process of being informed. If one teaches a pupil the theorem of Pythagoras then,
after this process is completed, the student can be said to ‘have the information about the theorem of
Pythagoras’. In this sense the term ‘information’ is the result of the same suspect form of substantiation of a
verb (informare > informatio) as many other technical terms in philosophy (substance, consciousness,
subject, object).
‘Information’ as the disposition to inform,
• i.e., as a capacity of an object to inform an agent. When the act of teaching me Pythagoras' theorem leaves
me with information about this theorem, it is only natural to assume that a text in which the theorem is
explained actually ‘contains’ this information. The text has the capacity to inform me when I read it.
Source: https://plato.stanford.edu/entries/information/index.html
5. Building blocks of modern theories of information
• Information is extensive. Our intuition is that longer text potentially
contains more information. Thus when we have two
structures A and B that are mutually independent, then the total
information in the combination should be the sum of both the
information in A and B: I(A and B)=I(A)+I(B).
• Information reduces uncertainty. Information grows with the
reduction of uncertainty it creates. When we are absolutely certain
about a state of affairs we cannot receive new information about it.
This suggests an association between information and probability
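These two intuitions can be checked against the standard logarithmic measure of information, I(x) = log2(1/px); a minimal sketch:

```python
import math

def surprisal(p):
    """Information content (in bits) of an event with probability p."""
    return math.log2(1 / p)

# Extensivity: for independent events, information adds up.
p_a, p_b = 0.5, 0.25
assert math.isclose(surprisal(p_a * p_b), surprisal(p_a) + surprisal(p_b))

# Uncertainty reduction: a certain event (p = 1) carries no information.
print(surprisal(1.0))    # 0.0 bits
print(surprisal(0.125))  # 3.0 bits: rarer messages carry more
```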
Source: https://plato.stanford.edu/entries/information/index.html
6. Developments in philosophy of Information -
Popper: Information as degree of falsifiability
• “Thus it can be said that the amount of empirical information conveyed by a theory, or
its empirical content, increases with its degree of falsifiability” (Popper 1934 [1977], pg. 113,
emphasis in original).
• The logical probability of a statement is complementary to its falsifiability: it increases with
decreasing degree of falsifiability. The logical probability 1 corresponds to the degree 0 of
falsifiability and vice versa. (Popper 1934 [1977], p. 119, emphasis in original)
• It is possible to interpret numerical probability as applying to a subsequence (picked out from
the logical probability relation) for which a system of measurement can be defined, on the
basis of frequency estimates. (Popper 1934 [1977], pg. 119, emphasis in original)
• These issues were later developed in philosophy of science. Confirmation theory studies
induction and the way in which evidence ‘supports’ a certain theory (Huber 2007,
Other Internet Resources). Although the work of Carnap motivated important developments
in both philosophy of science and philosophy of information the connection between the two
disciplines seems to have been lost. There is no mention of information theory or any of the
more foundational work in philosophy of information in Kuipers (2007a), but the two
disciplines certainly have overlapping domains.
Source: https://plato.stanford.edu/entries/information/index.html
7. Shannon: Information defined in terms of probability
“The fundamental problem of communication is that of reproducing at one point either exactly or
approximately a message selected at another point.”
• In two landmark papers Shannon (1948; Shannon & Weaver 1949) characterized the communication entropy
of a system of messages A:
• H(P) = −Σ(i∈A) pi log2 pi
• Here pi is the probability of message i in A. This is exactly the formula for the Gibbs entropy in physics. The use of base-2
logarithms ensures that the code length is measured in bits (binary digits). It is easily seen that the communication entropy
of a system is maximal when all the messages have equal probability and thus are typical.
• The amount of information I in an individual message x is given by:
• I(x) = −log px
• This formula, that can be interpreted as the inverse of the Boltzmann entropy, covers a number of our basic
intuitions about information:
• A message x has a certain probability px between 0 and 1 of occurring.
• If px = 1 then I(x) = 0. If we are certain to get a message it literally contains no ‘news’ at all. The lower the probability of the
message is, the more information it contains. A message like “The sun will rise tomorrow” seems to contain less information
than the message “Jesus was Caesar” exactly because the second statement is much less likely to be defended by anyone
(although it can be found on the web).
• If two messages x and y are unrelated then I(x and y)=I(x) + I(y). Information is extensive. The amount of information in two
combined messages is equal to the sum of the amount of information in the individual messages.
• One aspect of information that Shannon's definition explicitly does not cover is the actual content of the
messages interpreted as propositions. So the statement “Jesus was Caesar” and “The moon is made of green
cheese” may carry the same amount of information while their meaning is totally different.
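Shannon's H(P) is directly computable; a minimal sketch showing that entropy peaks when all messages are equally probable and vanishes under certainty:

```python
import math

def entropy(probs):
    """Shannon entropy H(P) = -sum_i p_i log2 p_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4             # four equally likely messages
skewed = [0.7, 0.1, 0.1, 0.1]    # same alphabet, biased probabilities

print(entropy(uniform))  # 2.0 bits: the maximum for 4 messages
print(entropy(skewed))   # about 1.36 bits: less uncertainty, less entropy
```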
Source: https://plato.stanford.edu/entries/information/index.html
8. Solomonoff, Kolmogorov, Chaitin: Information as the length of a program
• Solomonoff (1997): Carnap's model of probability started with a long sequence of symbols that was a description of the entire
universe. Through his own formal linguistic analysis, he was able to assign a priori probabilities to any possible string of symbols that
might represent the universe. He formulated the notion of a universal distribution: “consider the set of all possible finite strings to be
programs for a universal Turing machine U and define the probability of a string x of symbols in terms of the length of the shortest
program p that outputs x on U.” Algorithmic Information Theory was invented independently somewhat later by
Kolmogorov (1965) and Chaitin (1969); see also Levin (1974):
• The information content or complexity of an object can be measured by the length of its shortest description. For instance the
string “0101010101010101010101010101010101010101010101010101010101010101” has the short description “32
repetitions of ‘01’”, while
• “1100100001100001110111101110110011111010010000100101011110010110” presumably has no simple description other
than writing down the string itself.
• The Kolmogorov complexity of an object, such as a piece of text, is the length of the shortest computer program (in a
predetermined programming language) that produces the object as output.
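Kolmogorov complexity itself is uncomputable, but any general-purpose compressor yields an upper bound on description length. Running zlib over the slide's two example strings illustrates the contrast (a rough sketch, not a formal measure):

```python
import zlib

regular = b"01" * 32  # "32 repetitions of '01'"
irregular = b"1100100001100001110111101110110011111010010000100101011110010110"

# Compressed size approximates (an upper bound on) description length.
print(len(regular), len(zlib.compress(regular, 9)))      # pattern compresses well
print(len(irregular), len(zlib.compress(irregular, 9)))  # near-incompressible
```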
• An infinite binary sequence is said to be random if, for some constant c, for all n, the Kolmogorov complexity of the initial segment of
length n of the sequence is at least n − c.
• Chaitin constant (Chaitin omega number) – Ω. or halting probability is a real number that informally represents the probability that a
randomly constructed program will halt. These numbers are formed from a construction due to Gregory Chaitin. Each halting
probability is a normal and transcendental real number that is not computable, which means that there is no algorithm to compute its
digits. Indeed, each halting probability is Martin-Löf random, meaning there is not even any algorithm which can reliably guess its
digits.
• Algorithmic Information Theory has gained rapid acceptance as a fundamental theory of information. The well-known introduction in
Information Theory by Cover and Thomas (2006) states: “… we consider Kolmogorov complexity (i.e., AIT) to be more fundamental
than Shannon entropy” (pg 3). Several authors have defended that data compression is a general principle that governs human
cognition (Chater & Vitányi 2003; Wolff 2006). Hutter (2005, 2007a,b) argues that Solomonoff's formal and complete theory
essentially solves the induction problem. Hutter (2007a) and Rathmanner & Hutter (2011) enumerate a plethora of classical
philosophical and statistical problems around induction and claim that Solomonoff's theory solves or avoids all these problems.
Source: https://plato.stanford.edu/entries/information/index.html, Wikipedia
9. Open Problems in the Study of Information and Computation
• The unification of various theories of information:
• Information-A, Knowledge, logic, what is conveyed in informative answers
• Information-B, Probabilistic, information-theoretic, measured quantitatively
• Information-C, Algorithmic, code compression, measured quantitatively
• What is useful/meaningful information?
• What is an adequate logic of information?
• Continuous versus discrete models of nature: The central issue is this: the most elegant models of physical systems are based on
functions in continuous spaces. In such models almost all points in space carry an infinite amount of information. Yet, the
cornerstone of thermodynamics is that a finite amount of space has finite entropy. What is the right way to reconcile these two
views?
• Computation versus thermodynamics: What is a computational process from a thermodynamical point of view? What is a
thermodynamical process from a computational point of view? Can a thermodynamic theory of computing serve as a theory of
non-equilibrium dynamics? This problem seems hard because 150 years of research in thermodynamics still leaves us with a
lot of conceptual unclarities at the heart of the theory of thermodynamics itself.
• Classical information versus quantum information: The most fundamental way to store information in nature on an atomic level
involves qubits. The qubit is described by a state vector in a two-level quantum-mechanical system, which is formally equivalent to
a two-dimensional vector space over the complex numbers. The exact relation between classical and quantum information is
unclear.
• Information and the theory of everything: The notion of information seems to play a major role in the analysis of black holes
(Lloyd & Ng 2004; Bekenstein 1994). Erik Verlinde (2010) has proposed a theory in which gravity is analyzed in terms of
information. For the moment these models seem to be purely descriptive without any possibility of empirical verification.
• The Church-Turing Hypothesis: We know that a lot of formal systems are Turing equivalent (Turing machines, recursive functions,
lambda calculus, combinatory logic, cellular automata, to name a few). The question is: does this equivalence define the notion of
computation?
• P versus NP? Can every problem for which the solution can be checked efficiently also be solved efficiently by a computer?
https://plato.stanford.edu/entries/information/supplement.html
10. Information and the theory of everything:
• It from Bit: Why the Quantum? It from Bit? A Participatory Universe?: Three
Far-reaching, Visionary Questions from John Archibald Wheeler
• Erik Verlinde: gravity as emergent
11. John Wheeler On quantum theory and information
• I have been led to think of analogies between the way a computer works
and the way the universe works. The computer is built on yes-no logic. So,
perhaps, is the universe. Did an electron pass through slit A or did it not?
Did it cause counter B to click or counter C to click? These are the iron
posts of observation. Yet one enormous difference separates the computer
and the universe—chance. In principle, the output of a computer is
precisely determined by the input. Chance plays no role. In the universe, by
contrast, chance plays a dominant role. The laws of physics tell us only
what may happen. Actual measurement tells us what is happening (or what
did happen). Despite this difference, it is not unreasonable to imagine that
information sits at the core of physics, just as it sits at the core of a
computer. —1998
http://permalink.lanl.gov/object/tr?what=info:lanl-repo/lareport/LA-UR-02-4969-01
12. Guessing game
• You recall how it goes—one of the after-dinner party sent out of the living room, the
others agreeing on a word, the one fated to be a questioner returning and starting his
questions. “Is it a living object?” “No.” “Is it here on earth?” “Yes.” So the questions go
from respondent to respondent around the room until at length the word emerges:
victory if in twenty tries or less; otherwise, defeat. Then comes the moment when we
are fourth to be sent from the room. We are locked out unbelievably long. On finally
being readmitted, we find a smile on everyone’s face, sign of a joke or a plot. We
innocently start our questions. At first the answers come quickly. Then each question
begins to take longer in the answering—strange, when the answer itself is only a simple
“yes” or “no.” At length, feeling hot on the trail, we ask, “Is the word ‘cloud’?” “Yes,”
comes the reply, and everyone bursts out laughing. When we were out of the room, they
explain, they had agreed not to agree in advance on any word at all. Each one around the
circle could respond “yes” or “no” as he pleased to whatever question we put to him. But
however he replied he had to have a word in mind compatible with his own reply—and
with all the replies that went before. No wonder some of those decisions between “yes”
and “no” proved so hard!
• — J. A. Wheeler [2]
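The information-theoretic reading of the game: each yes/no answer conveys at most one bit, so q questions can distinguish at most 2^q candidate words. A quick check:

```python
import math

def questions_needed(n_words):
    """Minimum number of yes/no questions to single out one of n_words."""
    return math.ceil(math.log2(n_words))

print(2 ** 20)                   # 1048576 words separable in twenty questions
print(questions_needed(100000))  # 17 questions suffice for a 100,000-word lexicon
```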
13. • Wheeler proposes to delay the decision
of whether to keep or to remove
beam splitter 2 until we are sure the
photon has passed splitter 1. In fact, his
thought experiment uses, instead of
beam splitters and mirrors, the
deflection of light caused by the gravity
of an entire galaxy.
• He concludes:
Since we make our decision whether to
measure the interference from the two
paths or to determine which path was
followed a billion or so years after the
photon started its journey, we must
conclude that our very act of
measurement not only revealed the
nature of the photon’s history on its way
to us, but in some sense determined that
history.
• The past history of the universe has no
more validity than is assigned by the
measurements we make–now!
The delayed choice experiment is the
source for Wheeler’s law without law
and it from bit.
Aguirre, A., Foster, B., Merali, Z. (eds.), It From Bit or Bit From It? On Physics and Information,
The Frontiers Collection, Springer (2015)
14. Why the Quantum? It from Bit? A Participatory Universe?:
Three Far-reaching, Visionary Questions from John Archibald
Wheeler and How They Inspired a Quantum Experimentalist
• It turns out that very naturally the referent of quantum physics is not reality per se but, as Niels Bohr said, it is "what can be said
about the world", or in modern words, it is information. Thus, if information is the most fundamental notion in quantum physics, a
very natural understanding of phenomena like quantum decoherence or quantum teleportation emerges. And quantum
entanglement is then nothing else than the property of subsystems of a composite quantum system to carry information jointly,
independent of space and time; and the randomness of individual quantum events is a consequence of the finiteness of
information.
• The quantum is then a reflection of the fact that all we can do is make statements about the world, expressed in a discrete number
of bits. The universe is participatory at least in the sense that the experimentalist, by choosing the measurement apparatus, defines
out of a set of mutually complementary observables which possible property of a system can manifest itself as reality; and the
randomness of individual events stems from the finiteness of information.
• A number of experiments will be reviewed underlining these views. This will include an entangled photon delayed choice
experiment where the decision whether a photon that has passed a double slit did this as a particle or a wave is delayed not only
until a time after its passage through the double slit assembly but even after it has already been registered. Thus, while the
observed facts, i.e. the events registered by the detectors, are not changed, our physical picture changes depending on our choice
what to measure.
• In conclusion it may very well be said that information is the irreducible kernel from which everything else flows. Then the
question why nature appears quantized is simply a consequence of the fact that information itself is quantized by necessity. It
might even be fair to observe that the concept that information is fundamental is very old knowledge of humanity, witness for
example the beginning of gospel according to John: "In the beginning was the Word".
http://www.metanexus.net/archive/ultimate_reality/zeilinger.pdf
15. It from Bit from It from Bit... Nature and Nonlinear Logic
Wm. C. McHarris Departments of Chemistry and Physics/Astronomy Michigan State University East Lansing, MI 48824
• Perhaps the most dramatic illustrations of how little we comprehend nonlinear logic arise in computer science from evolutionary computer
programs. Initiated at MIT and Stanford, this sort of research is now being performed at many major universities, including MSU. Here is a
seemingly straightforward example.
• A popular yardstick among computer scientists is to create an efficient program that will sort, say, a set of 100 random numbers into ascending
order. To simulate evolution in creating such a program, one first uses a pseudorandom number generator to generate programs consisting of
(almost) random sequences of numbers. One can then either use these programs “raw,” or to speed up the process, one can retain only
instructions at least marginally useful for sorting, such as comparison and exchange instructions. Thus, one begins with a population consisting
of, say, 10,000 random programs, each consisting of several hundred instructions. One then runs and tests these randomly generated
programs, which, as expected, do a very poor job at first. Only the “most fit,” meaning any programs that show the slightest inclination for
sorting, are retained. These are used to create the next generation. This can be done in two ways: First, by inserting random minor variations,
corresponding to asexual mutations; second, by “mating” parent programs to create a child program, i.e., by splicing parts of programs
together, hoping that useful instructions from each parent occasionally will be inherited and become concentrated. This process can be
repeated thousands upon thousands of times with fast parallel processors, and eventually very efficient programs can result. (It should be
mentioned in passing that this sort of research is carried out on isolated computers using nonstandard operating systems, for such programs
could become dangerous viruses on the internet.)
• Some of the results were startling. Danny Hillis, who originated many aspects of these programs, sums it up vividly in his book, The Pattern on
the Stone [9]:
• I have used simulated evolution to evolve a program to solve specific sorting problems, so I know that the process works as described. In my experiments, I also
favored the programs that sorted the test sequences quickly, so that faster programs were more likely to survive. This evolutionary process created very fast
sorting programs. For the problems I was interested in, the programs that evolved were actually slightly faster than any of the algorithms described ...[standard
algorithms].— and, in fact, they were faster at sorting numbers than any program I could have written myself. One of the interesting things about the sorting
programs that evolved in my experiment is that I do not understand how they work. [my emphasis] I have carefully examined their instruction sequences, but I
do not understand them; I have no simpler explanation of how the programs work than the instruction sequences themselves. It may be that the programs
are not understandable — that there is no way to break the operation of the program into a hierarchy of understandable parts. If this is true — if evolution can
produce something as simple as a sorting program which is fundamentally incomprehensible — it does not bode well for our prospects of ever understanding
the human brain.
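The evolutionary procedure described above can be sketched as a toy genetic algorithm. This miniature evolves sequences of compare-exchange operations (a sorting-network-style "program") for 6-element lists; it is an illustration of the scheme, not Hillis's actual experiment:

```python
import random

N = 6  # list length; a "program" is a sequence of compare-exchange pairs

def run(program, data):
    """Execute a program: each (i, j) swaps data[i], data[j] if out of order."""
    data = list(data)
    for i, j in program:
        if data[i] > data[j]:
            data[i], data[j] = data[j], data[i]
    return data

def fitness(program, tests):
    """Number of test lists the program sorts correctly."""
    return sum(run(program, t) == sorted(t) for t in tests)

def random_pair():
    return tuple(sorted(random.sample(range(N), 2)))

def random_program(length=20):
    return [random_pair() for _ in range(length)]

def mutate(program):
    """Asexual variation: replace one instruction at random."""
    p = list(program)
    p[random.randrange(len(p))] = random_pair()
    return p

random.seed(0)
tests = [[random.randrange(100) for _ in range(N)] for _ in range(50)]
pop = [random_program() for _ in range(200)]
for generation in range(60):
    pop.sort(key=lambda p: -fitness(p, tests))
    survivors = pop[:50]  # only the "most fit" are retained
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(150)]

best = max(pop, key=lambda p: fitness(p, tests))
print(fitness(best, tests), "/", len(tests), "test lists sorted")
```

As in the quoted experiment, the resulting instruction sequence is selected purely by performance, with no human-readable design behind it.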
http://www.mrtc.mdh.se/~gdc01/work/ARTICLES/2014/9-INFO-SpecialIssue/FQXi2013/13-1908-McHarris-Nonlinear-Logic.pdf
17. A New Kind of Science, Stephen Wolfram, January 15, 2002
My purpose in this book is to initiate another such
transformation, and to introduce a new kind of science that is based on
the much more general types of rules that can be embodied in simple
computer programs.
Five hundred steps in the evolution of the cellular automaton above. The pattern
produced continues to expand on both left and right, but only the part that fits across
the page is shown here. The asymmetry between the left and right-hand sides is a
direct consequence of asymmetry that exists in the particular underlying cellular
automaton rule used.
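The figure itself did not survive text extraction, and the rule number is missing from the caption. As an illustration of the "simple computer programs" Wolfram means, here is a minimal elementary cellular automaton in Python (rule 30 is assumed purely as an example):

```python
def step(cells, rule=30):
    """One step of an elementary CA (Wolfram rule numbering, wraparound edges)."""
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 31
row[15] = 1  # start from a single black cell
for _ in range(15):
    print("".join(".#"[c] for c in row))
    row = step(row)
```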
18. Erik Verlinde interview
• Most people, certainly in physics, think we can describe
gravity perfectly adequately using Einstein’s General
Relativity. But it now seems that we can also start from a
microscopic formulation where there is no gravity to begin
with, but you can derive it. This is called ‘emergence’.
We have other phenomena in Physics like this. Take a
concept like ‘temperature’, for instance. We experience it
every day. We can feel temperature. But, if you really
think about the microscopic molecules, there’s no notion
of temperature there. It’s something that has to do with
the property of all molecules together; it’s like the average
energy per molecule.
• Gravity is similar. It’s really something that only appears
when you put many things together at a microscopic scale
and then you suddenly see that certain equations arise.
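The temperature example can be made concrete with a toy simulation: the per-molecule energies below are synthetic, but the "temperature" only exists as an average over all of them:

```python
import random

k_B = 1.380649e-23  # Boltzmann constant, J/K

random.seed(1)
# Synthetic per-molecule kinetic energies, clustered around (3/2) k_B T for T = 300 K.
energies = [random.gauss(1.5 * k_B * 300, 3e-22) for _ in range(100000)]

# No single molecule has a temperature; the ensemble average defines one.
T_emergent = (2.0 / 3.0) * (sum(energies) / len(energies)) / k_B
print(round(T_emergent), "K")
```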
20. Deriving Newton's law;
no need for dark energy
• The main idea here is: according to the
holographic principle, information governs
physical phenomena; information determines
the position we assign to a particle. All
information can be divided into bits.
Information itself has its own degrees of
freedom, which we can connect to the total
energy of the system via the equipartition
principle. From entropy plus the holographic
principle we obtain Newton's law.
• … The author (also) obtains the Einstein
equations from entropy plus the holographic
principle, without invoking dark matter
or dark energy
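The entropic-gravity argument can be checked numerically. The steps below are a paraphrase of Verlinde's derivation (holographic bit count, equipartition, Unruh temperature); the constants and the Earth example are illustrative assumptions, not from the slide:

```python
import math

G = 6.674e-11      # gravitational constant, SI
c = 2.998e8        # speed of light
hbar = 1.0546e-34  # reduced Planck constant
k_B = 1.381e-23    # Boltzmann constant

M, r = 5.972e24, 6.371e6  # e.g., Earth's mass and radius

# Bits on the holographic screen of radius r:
N = 4 * math.pi * r**2 * c**3 / (G * hbar)
# Equipartition, M c^2 = (1/2) N k_B T, gives the screen temperature:
T = 2 * M * c**2 / (N * k_B)
# Inverting the Unruh relation T = hbar a / (2 pi c k_B) yields the acceleration:
a = 2 * math.pi * c * k_B * T / hbar

print(a)             # ~9.8 m/s^2
print(G * M / r**2)  # the same value: Newton's law emerges
```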
https://www.facebook.com/notes/chi-cao/h%E1%BA%A5p-d%E1%BA%ABn-entropic-h%E1%BA%A5p-d%E1%BA%ABn-c%E1%BA%A3m-%E1%BB%A9ng/1827900584098354
21. The Universe as Quantum Computer
• The inability of classical digital
computers to reproduce quantum
effects efficiently makes it implausible
that the universe is a classical digital
system such as a cellular automaton.
• However, all observed phenomena are
consistent with the model in
which the universe is a quantum
computer, e.g., a quantum cellular
automaton.
• The quantum computational model of
the universe explains previously
unexplained features, most
importantly, the co-existence in the
universe of randomness and order, and
of simplicity and complexity.
23. Monster – the discovery in the fourth class of groups,
one that links to string theory and maybe to emergence
• Discovery of the Monster came out of the search for new finite
simple groups during the 1960s and 70s
Steps to discovering the Monster:
1. Émile Mathieu’s discovery of the group of permutations M24
2. discovery of the Leech Lattice in 24 dimensions
3. J.H. Conway’s discovery of Co1 (Conway’s largest simple group)
4. Bernd Fischer’s discovery of the Monster
people.virginia.edu/~mah7cd/Math552/Monster.ppt
24. Overview of the Monster
• Largest sporadic group of the finite simple groups
• Order = 2^46 · 3^20 · 5^9 · 7^6 · 11^2 · 13^3 · 17 · 19 · 23 · 29 · 31 · 41 · 47 · 59 · 71
= 808017424794512875886459904961710757005754368000000000
• Smallest # of dimensions which the Monster can act nontrivially =
196,883
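A quick sanity check of this order (the exponents are the standard prime factorization, 2^46 · 3^20 · 5^9 · 7^6 · 11^2 · 13^3 · 17 · 19 · 23 · 29 · 31 · 41 · 47 · 59 · 71; Python's big integers make the product exact):

```python
order = (2**46 * 3**20 * 5**9 * 7**6 * 11**2 * 13**3
         * 17 * 19 * 23 * 29 * 31 * 41 * 47 * 59 * 71)
print(order)  # 808017424794512875886459904961710757005754368000000000
```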
people.virginia.edu/~mah7cd/Math552/Monster.ppt
25. Griess’s Construction of the Monster
• Two cross-sections of the Monster: Conway's largest simple group
(requiring 96,308 dimensions) and the Baby Monster
Simple group acting on the Monster splits the space into three
subspaces of the following dimensions:
98,304 + 300 + 98,280 = 196,884
people.virginia.edu/~mah7cd/Math552/Monster.ppt1/4/2017 25
26. Construction (cont'd)
• 98,304 = 2^12 * 24 = space needed for the cross-section
• 300 = 24 + 23 + 22 + …+ 2 + 1 = triangular arrangement of numbers
with 24 in the first row, 23 in the second row, etc.
• 98,280 = 196,560/2 which comes from the Leech Lattice where there
are 196,560 points closest to a given point and they come in 98,280
pairs
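The arithmetic of this decomposition is easy to verify:

```python
a = 2**12 * 24    # space needed for the cross-section
b = 24 * 25 // 2  # 300 = 24 + 23 + ... + 2 + 1 (triangular number)
c = 196560 // 2   # 98,280 pairs of nearest Leech lattice points
print(a, b, c)    # 98304 300 98280
print(a + b + c)  # 196884
```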
people.virginia.edu/~mah7cd/Math552/Monster.ppt
27. Moonshine connections
the Monster’s connection with number theory
1. The j-function:
j(q) = q^-1 + 196,884q + 21,493,760q^2 + 864,299,970q^3 +
20,245,856,256q^4 + …
Character degrees for the Monster:
1 = m1
196,883 = m2
21,296,876 = m3
842,609,326 = m4
18,538,750,076 = m5
Let the coefficients of the j-function be j1, j2, j3, j4, j5
respectively.
people.virginia.edu/~mah7cd/Math552/Monster.ppt
28. j2 = m1 + m2
j3 = m1 + m2 + m3
j4 = m1 + m1 + m2 + m2 + m3 + m4
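These identities can be verified directly. Note the slide's numbering: j1 is the coefficient of q^-1, so j2, j3, j4 are the coefficients of q, q^2, q^3:

```python
m = [1, 196883, 21296876, 842609326]      # character degrees m1..m4
j2, j3, j4 = 196884, 21493760, 864299970  # j-coefficients of q, q^2, q^3

assert j2 == m[0] + m[1]
assert j3 == m[0] + m[1] + m[2]
assert j4 == m[0] + m[0] + m[1] + m[1] + m[2] + m[3]
print("moonshine identities verified")
```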
2. J-function and Modular Theory
all prime numbers that could be used to obtain other j-functions:
2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 41, 47, 59, 71
= prime numbers that factor the order of the Monster
Rationale:
the modular group (which allows one pair of integers to change into another) operates
on the hyperbolic plane;
the surface is a sphere when the number is one of the primes above,
otherwise it would be a torus, double torus, etc.
Moonshine Module: infinite dimensional space having the Monster as its
symmetry group which gives rise to the j-function and mini-j-functions
(Hauptmoduls)
people.virginia.edu/~mah7cd/Math552/Monster.ppt
29. 3. String Theory
number of dimensions for String Theory is either 10 or 26
• a path on which time-distance is always zero in a higher
dimensional (> 4) space-time (Lorentzian space) yields a
perpendicular Euclidean space of 2 dimensions lower
ex. 26-dimensional Lorentzian space yields the 24-dimensional
Euclidean space which contains the Leech Lattice
Leech Lattice contains a point
(0,1,2,3,4,…,23,24,70)
time distance from origin point in Lorentzian space
0 = 0² + 1² + 2² + … + 23² + 24² - 70²
this point lies on a light ray through the origin
Borcherds said a string moving in space-time is only nonzero if
space-time is 26-dimensional
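The null ("zero time-distance") property of the point (0, 1, 2, …, 24, 70) reduces to a single identity, that the sum of the squares 0² through 24² equals 70²; a one-line check:

```python
squares = sum(n * n for n in range(25))
print(squares, 70 ** 2)  # 4900 4900: the point lies on a light ray
```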
people.virginia.edu/~mah7cd/Math552/Monster.ppt
30. 4. another connection with number theory
Some special properties of the number 163
a. e^(π√163) = 262537412640768743.99999999999925, which is
very close to a whole number
b. x² - x + 41 has discriminant 1 - 4·41 = -163, so its roots involve √(-163);
x² - x + 41 gives prime numbers for all values of x
between 1 and 40
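Both number facts can be checked directly (is_prime is a simple trial-division helper added for the check):

```python
def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, int(n**0.5) + 1))

# Euler's polynomial x^2 - x + 41 is prime for x = 1..40:
values = [x * x - x + 41 for x in range(1, 41)]
print(all(is_prime(v) for v in values))  # True

# Its discriminant is b^2 - 4ac = 1 - 164 = -163:
print(1 - 4 * 41)  # -163
```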
The Monster's character table has 194 columns, which give rise to
modular functions; 163 of these are completely independent
“Understanding [the Monster’s] full nature is likely to shed light
on the very fabric of the universe.” Mark Ronan
people.virginia.edu/~mah7cd/Math552/Monster.ppt
32. In 1964 Leech published a paper on sphere packing in eight or more dimensions. It
contained a lattice packing in 24 dimensions.
In 1965 he submitted a supplement to the paper giving a packing in 24 dimensions.
He did not have the group theory skills necessary to prove his conjectures of the symmetry
of the group, so he sought the help of John Conway.
A lattice with no vectors of squared length 2, thus making it the tightest lattice packing of
spheres in 24 dimensions.
people.virginia.edu/~mah7cd/Math552/LeechLattices.ppt
33. The Leech lattice
The points of the Leech lattice are the centers of spheres,
each touching 196,560 others with the following list of
properties:
• Unimodular = it can be generated by the columns of a
certain 24x24 matrix with determinant 1
• Even = the square of the length of any vector in the
Leech lattice is an even integer
• The last condition = the unit balls at the points in the
Leech lattice do not overlap. Each is tangent to 196,560
neighbors; this is known to be the largest number
of non-overlapping 24-dimensional unit balls that can
simultaneously touch a single unit ball (compare with 6
in dimension 2 as the maximum number of pennies that
can touch a central penny).
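For reference, the 196,560 figure decomposes by vector shape; the shape counts below are the standard ones from the literature (not stated on the slide):

```python
from math import comb

shape_44 = comb(24, 2) * 4  # vectors shaped (±4, ±4, 0^22)
shape_28 = 759 * 2**7       # (±2^8, 0^16), even sign changes over 759 octads
shape_31 = 24 * 2**12       # (∓3, ±1^23)

print(shape_44, shape_28, shape_31)    # 1104 97152 98304
print(shape_44 + shape_28 + shape_31)  # 196560
```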
people.virginia.edu/~mah7cd/Math552/LeechLattices.ppt
34. Leech Lattice and representation of bit
• Bits are bits when they can preserve their fidelity through interaction.
• The representation must be able to cancel noise.
• The Leech lattice is (thus far) the best representation for presenting bits with the
highest density, where 196,560 points are equidistant from the origin.
Conjecture of the universe digital copy
35. Universe digital copy conjecture: representation of
universal information
• Let's construct a universe digital copy that represents the universe precisely, and
call it T (for Turing); now look at T:
• T has complexity 1 (it cannot be made shorter)
• T's code (whether in bits, qubits, or wbits) should be the most optimized representation, because it
contains the whole universe; at the same time this coding must remain intact despite whatever
noise.
• Hence T's representation, if formulated in n dimensions, must have a density equal to 1.
• Now we return to the Leech lattice to see that, in mathematical space, the 24-
dimensional Leech lattice is probably the most optimal coding that fits T.
• The conjecture goes: the universe digital copy is represented in a 24-dimensional
mathematical model.
• The next question is what differentiates T from our universe U, and how we can construct T,
knowing that it is possible in 24 dimensions. This is where we need 2 other dimensions: the
job is to emerge T from U, and that is the emergent model
36. Emergent Transformation Conjecture: the math of emergence
• Theorem: Every finite simple group is isomorphic to one of the
following groups:
• A cyclic group of prime order; (cyclic transformation)
• An alternating group of degree at least 5; (addition transformation)
• A simple group of Lie type, including both: (continuous transformation)
• the classical Lie groups, namely the simple groups related to the projective special linear,
unitary, symplectic, or orthogonal transformations over a finite field;
• the exceptional and twisted groups of Lie type (including the Tits group);
• One of the 26 sporadic simple groups. (transformation type unclear)
• The conjecture goes: emergent transformation comes from the
transformations of the sporadic simple groups.
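The first case of the classification theorem can be made concrete with a small check (an illustrative sketch, not from the slides): a cyclic group Z/n is simple exactly when n is prime, because every nonzero element then generates the whole group, leaving no proper nontrivial subgroup.

```python
from math import gcd

# In Z/n, the subgroup generated by g has n // gcd(n, g) elements.
# Z/n is simple iff every g in 1..n-1 generates all n elements,
# which happens exactly when n is prime.
def cyclic_group_is_simple(n):
    return all(n // gcd(n, g) == n for g in range(1, n))

print([n for n in range(2, 20) if cyclic_group_is_simple(n)])
# the primes: [2, 3, 5, 7, 11, 13, 17, 19]
```

The sporadic groups, by contrast, fit no such uniform pattern, which is what motivates the conjecture above.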
38. Deep learning
• There is currently no math model underlying deep learning; it seems
that by feeding unlabeled data into an n-layer neural network, Google
can do magic like translation and voice recognition. (Refer to the cats
problem from Le Viet Quoc.)
• No one knows on what math model the optimal neural network is
selected and weighted.
• It's possible that an emergent math model could provide a shortcut to
constructing these models and therefore shorten the learning process.
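As a minimal sketch of how network weights are "selected and weighted" in practice (an illustration, not from the slides, and deliberately reduced to one neuron): gradient descent on a loss function. A single sigmoid unit learns the AND function; deep networks stack many such units and train them the same way via backpropagation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Training data for the AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1 = w2 = b = 0.0
lr = 0.5

# Stochastic gradient descent on the cross-entropy loss:
# the gradient for a sigmoid unit is simply (prediction - target).
for _ in range(5000):
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        err = p - y
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b -= lr * err

preds = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
print(preds)  # learned AND: [0, 0, 0, 1]
```

The open question on the slide is whether an emergent math model could pick these weights without thousands of such iterations.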
39. Capability transformation
• Sustainable development equals capability
maturing; any other shortcut is expensive.
• Maturing an organization's capability is a
transformation process that involves very
complicated stakeholders and processes
(enterprise architecture, enterprise ontology).
• Transforming the EA/EO is an effective way to
transform the organization.
• Disruption equals emergent transformation.
• By engineering the emergent transformation, an
organization can disrupt its way to the next level (!?)
40. What’s in it for us: Information science, getting ready
for Industry 4.0
41. Information science
• It's possible for Vietnam, with its many IT people, to go deeper into
information science and form a school (of thought) in this field.
• It's theory, but also very close to application.
• The investment is modest.
• The experts are available (overseas).
• Applications are needed in many fields: AI, cyber security, ...
• And there are many open problems to solve: an opportunity to get
onto the world map of science.
42. Industry 4.0
• Emergent transformation is a possible path for Vietnam to disrupt
its way into Industry 4.0 without blind hope.
• AI, deep learning, etc. are the heart of Industry 4.0.
• So research on emergent transformation can be applied at
different levels:
• Strategy: disruptive strategic alignment
• Process: enterprise architecture
• Operation: AI, deep learning.
43. Conclusion
• It's an interesting time to explore information theory.
• The conjectures may turn out to be true, resulting in a synergy of
physics, maths, and information.
• There are potential applications for Vietnam.
• There are opportunities to research and disrupt.
• Let's discuss and drill down.