Incommensurability and Semiotic Representation

How does Correspondence facilitate Representation for Truth and Consequences?

Usage Rights

CC Attribution-ShareAlike License

Document Transcript

• "Out of the mouths of babes; praise is perfected" – Holy Bible

  Correspondence Principle and Incommensurability
  Heart/Soul | Mind/Spirit – Metaphysics – A Correspondence Theory of Truth
  Rhetoric and Communication Design – Correspondence or Incommensurability
  Representation Theory – Truth and Justification – Justice – Coherent or Abhorrent
  Semiotic Communications Theory – Visualization and Representation

  Modality and Representation

  "Whilst semiotics is often encountered in the form of textual analysis, it also involves philosophical theorizing on the role of signs in the construction of reality. Semiotics involves studying representations and the processes involved in representational practices, and to semioticians, 'reality' always involves representation." – Daniel Chandler

  Axioms of Activity Occurrence Theory

  Semiotics, semiotic studies, or semiology is the study of signs and symbols, both individually and grouped into sign systems. It includes the study of how meaning is constructed and understood. This discipline is frequently seen as having important anthropological dimensions. However, some semioticians focus on the logical dimensions of the science. They examine areas belonging also to the natural sciences – such as how organisms make predictions about, and adapt to, their semiotic niche in the world (see semiosis). In general, semiotic theories take signs or sign systems as their object of study: the communication of information in living organisms is covered in biosemiotics or zoosemiosis.

  Inference and Meaning – 'Correspondence' as Axiomatic Semantics

  The coherence of Objective Truth to Empirical Proof is reflective of a trustworthy observer and the relevance of this "truth" to what philosophy terms discourse. When the Infinite "transfigures" its "totality", what will transpire – this question awaits!

  Immanuel Kant
• Is the truth or falsity of a statement determined only by how it relates to the world, and whether it accurately describes (i.e., corresponds with) that world? Truth cannot be "determined" by its correspondence with reality alone – but its representation forms a basis through which truth [as Identity] can be understood.

  Incommensurable Values
  First published Mon Jul 23, 2007 – Stanford Encyclopedia of Philosophy

  Values, such as liberty and equality, are sometimes said to be incommensurable in the sense that their value cannot be reduced to a common measure. The possibility of value incommensurability is thought to raise deep questions about practical reason and rational choice, as well as related questions concerning topics as diverse as akrasia, moral dilemmas, the plausibility of utilitarianism, and the foundations of liberalism. This entry outlines answers in the contemporary literature to these questions, starting with questions about the nature and possibility of value incommensurability.

  Quantum Computing – A 'square' Quantum Analogy – Quantum Logic for Photon Detection

  Incommensurability between values must be distinguished from the kind of incommensurability associated with Paul Feyerabend (1978, 1981, 1993) and Thomas Kuhn (1977, 1983, 1996) in epistemology and the philosophy of science. Feyerabend and Kuhn were concerned with incommensurability between rival theories or paradigms – that is, the inability to express or comprehend one conceptual scheme, such as Aristotelian physics, in terms of another, such as Newtonian physics. In contrast, contemporary inquiry into value incommensurability concerns comparisons among abstract values (such as liberty or equality) or particular bearers of value (such as a certain institution or its effects on liberty or equality). The term "bearer of value" is to be understood broadly.
  Bearers of value can be objects of potential choice (such as a career) or states of affairs that cannot be chosen (such as a beautiful sunset). Such bearers of value are valuable in virtue of the abstract value or values they instantiate or display (so, for example, an institution might be valuable in virtue of the liberty or equality that it engenders or embodies).

  In retrospect – history provides the example of the changes that took place when the 'horseless carriage' replaced man's noble friend, the horse, as the primary means of 'fast' transportation. This same 'incommensurability' exists where nanotechnology will replace conventional thermodynamics.
• Ontology #1 – (the metaphysical study of the nature of being and existence)

  From Bonjour: Is the Coherence Theory Adequate for "Truth"?

  Truth is said to consist in the agreement of knowledge with the object. According to this mere verbal definition, then, my knowledge, in order to be true, must agree with the object. Now, I can only compare the object with my knowledge by this means, namely, by taking knowledge of it. My knowledge, then, is to be verified by itself, which is far from being sufficient for truth. For as the object is external to me, and the knowledge is in me, I can only judge whether my knowledge of the object agrees with my knowledge of the object. Such a circle in explanation was called by the ancients Diallelos. And the logicians were accused of this fallacy by the skeptics, who remarked that this account of truth was as if a man before a judicial tribunal should make a statement, and appeal in support of it to a witness whom no one knows, but who defends his own credibility by saying that the man who had called him as a witness is an honorable man.[23]

  Classical definition of knowledge: (i) Belief (ii) True (iii) Justified

  The "knower" has "apprehension" of being justified. This "apprehension" needs be "coherent" to "others". The purpose of "needful" justification must seemingly be itself a conceptual apprehension, in that the knower comprehends that others may not necessarily see things this way. Does this 'represent' a "coherence theory of justification"? Is such an apprehension of the 'given' necessarily non-conceptual and non-propositional – hence [counter-intuitive]? Is this 'coherence' immune from error? What if the "apprehension" of what is 'given' is not "coherent" at all? Is this the discernment of the difference between the intuitive and the counter-intuitive? Do Truth and 'Correspondence' in a coherent representation become discourse?
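  The classical three-part definition of knowledge just listed can be stated schematically. This is a standard textbook rendering, added here for clarity; the symbols K, B and J are not in the original text:

```latex
% S knows that p  iff  p is true, S believes that p,
% and S is justified in believing that p:
K_S(p) \;\equiv\; p \,\wedge\, B_S(p) \,\wedge\, J_S(p)
```

  On this reading, the questions above ask whether the justification conjunct J_S(p) must itself cohere with what "others" apprehend.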
  Science – Perception – Reality: empirical truth without spirit is deemed abstract. What relevance does the OT requirement for the testimony of "two or three witnesses" have to the (3) synoptic gospels?

  What distinguishes Christianity from all other religions is not its morality – Buddhism promotes moral behavior; nor its longevity – Hinduism is older; but its claim that God became man and redeemed the world by his own sacrifice. This is Christianity's strongest attribute, since it can stand the test of history and historical empiricism. We can prove what
• others only theorize, meditatively conjure, or feel. It is also Christianity's greatest vulnerability, because if one could disprove Jesus and his resurrection, one would disprove Christianity itself. If Buddha never lived, the moral principles of Buddhism would survive. If Krishna was not a manifestation of God, the philosophical ideas of Hinduism would still be entertained. But if Jesus did not live, die, and rise again immortal in his physical body, then the very basis of Christianity is destroyed. As the judicial and Islamic-law expert Sir Norman Anderson remarked, Christianity is, truly, "the witness of history" – its original followers died not for a system of rituals or a list of behaviors, but for the empirically verified and historically preserved fact of the death, burial, and resurrection of Jesus Christ. As the apostle Paul said, "if Christ is not raised, our faith is vain and we are of all people most miserable" (1 Cor. 15:17).

  Emmanuel Levinas calls God 'the absolute other', and in Otherwise than Being he uses the term the 'otherwise than being' (Totality and Infinity, Duquesne University Press, 2001, pp. 34-35).

  You shall know the truth and the truth shall set you free!

  Laws as rules of inference – 'Justification' as Purpose in "meaning"

  Despite its obvious interest and importance, however, it does not seem to me that the semantic conception of truth helps in any way to solve the problem of truth with which we are presently concerned, viz. the problem of how a true empirical belief or statement is related to the world of which it is true. The key point to be noted is that what appears on the right-hand side of an equivalence of form (T), such as is a consequence of a Tarski-type truth definition, is a translation of the sentence whose truth it is intended to explicate; in fact, in the case of a meta-language which contains its object-language as a sub-component, what appears on the right is just the object-language sentence itself.
  Thus such an equivalence seems to tell us only (i) that an object-language sentence is true if and only if its meta-language translation can be correctly asserted, i.e. is true, and (ii) what that translation is, where it may be just the sentence itself. Now (i) seems only to represent a necessary, though clearly not a sufficient, condition of adequacy for a translation; (ii) on the other hand conveys an important relation between the two languages. But it is hard to see that either (i) or (ii) says anything about the nature of truth.

  The coherence theory of truth is integral to all comprehensive systems. A pervasive tenet is the idea that truth is primarily a property of whole systems of propositions, and can be ascribed to individual propositions only derivatively, according to their coherence with the whole.

  Holistic Systems – Squaring the Jacobian and finding its determinant

  Holism (from the Greek holos, meaning all, entire, total) is the idea that all the properties of a given system (biological, chemical, social, economic, mental, linguistic, etc.) cannot be determined or explained by the sum of its component parts alone. Instead, the system as a whole determines in an important way how the parts behave. The general principle of holism was concisely summarized by Aristotle in the Metaphysics: "The whole is more than the sum of its parts". Reductionism is seen as the opposite of holism. Reductionism in science says that a
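  The "equivalence of form (T)" discussed above is Tarski's schema; a standard rendering (paraphrased here, not quoted from the source) is:

```latex
% Tarski's T-schema: each instance relates an object-language sentence s
% to its meta-language translation p.
(\mathrm{T})\qquad
\ulcorner s \urcorner \text{ is true in } L
\;\Longleftrightarrow\; p
```

  The stock instance: 'Snow is white' is true (in English) if and only if snow is white – which is exactly why, as argued above, the schema constrains translation without by itself explaining the nature of truth.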
• complex system can be explained by reduction to its fundamental parts. Essentially, chemistry is reducible to physics, biology is reducible to chemistry and physics, and psychology and sociology are reducible to biology, etc.

  Theories of truth:
  · Consensus theory
  · Correspondence theory
  · Deflationary theory
  · Epistemic theories
  · Indefinability theory
  · Pragmatic theory
  · Redundancy theory
  · Semantic theory

  Stanford Encyclopedia of Philosophy:
  · Truth
  · Coherence theory
  · Correspondence theory
  · Deflationary theory
  · Identity theory
  · Revision theory
  · Tarski's definition
  · Axiomatic theories – Semantic [the Adage] / Wisdom

  The Ontogeny of Environmental Biology – Adaptation and the Tree of Life

  Ontogeny (also ontogenesis or morphogenesis) describes the origin and the development of an organism from the fertilized egg to its mature form. Ontogeny is studied in developmental biology. Ontogeny [Phylogenic] corresponds commensally with Ontology within the framework of a much more comprehensive metaphysics (providing axiomatic semantics) – it also reconciles 'causality' with creation through the 'tree of life' concept, even where such a concept [a divine creator] is rejected. As with all visionary concepts, this one meets with the normal 'framework' problem – incommensurability with lesser theories [lacking dimension] or with those which do not experience CHANGE.

  http://www.geosociety.org/pubs/gsatoday/grgsat/0007-2.htm
  http://www.astrobio.net/articles/images/phylogenic_tree_lg.jpg
  http://www.nai.arc.nasa.gov/news_stories/news_detail.cfm?ID=94
• Phylogenetic Tree of Life – Two Dimensional | The Tree of Life: Cold Start?

  [Figure caption:] A phylogenetic tree of living things, based on RNA data and proposed by Carl Woese, showing the separation of bacteria, archaea, and eukaryotes. Trees constructed with other genes are generally similar, although they may place some early-branching groups very differently, thanks to long branch attraction. The exact relationships of the three domains are still being debated, as is the position of the root of the tree. It has also been suggested that, due to lateral gene transfer, a tree may not be the best representation of the genetic relationships of all organisms. For instance, some genetic evidence suggests that eukaryotes evolved from the union of some bacteria and archaea (one becoming the nucleus and the other the main cell).

  Comparing among many different organisms the sequence of a gene that encodes a ribosomal RNA (rRNA), Woese drew the first comprehensive tree of life. One surprise in the tree was the appearance of bacteria that thrive in high temperatures (hyperthermophiles) near or at the root of the tree. The tree, and supporting research from other fields, led to the speculation that life originated in very hot environments, perhaps in the hydrothermal vent systems found deep underwater around the globe. Woese's original tree, and the resulting speculation that life arose in a hot environment, have become widely accepted among researchers and have taken on the status of textbook explanations of the origin of life.

  The primordial dichotomy in the physiology of Eukaryote/Prokaryote (processing of gene information) cells is the fundamental principle guiding ethical utilization of bioengineering technologies.

  Explore the Tree of Life – 2009 – Was Darwin Wrong about 'The Tree'?

  The correspondence principle, with its commensality | incommensurability paradigm, can be applied to 'ontogeny' where an "idea of value" is discerned.
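  Woese's three-domain split can be pictured as a tiny rooted tree. The following is an illustrative sketch only – the node names and nesting are a simplification of the debated topology described above, not data from the source:

```python
# A minimal sketch of Woese's three-domain tree as nested (name, children)
# tuples, with a recursive walk that lists the leaf taxa.
TREE = ("LUCA", [
    ("Bacteria", []),
    ("Archaea/Eukaryota ancestor", [   # hypothetical internal node
        ("Archaea", []),
        ("Eukaryota", []),
    ]),
])

def leaves(node):
    """Return the leaf taxa beneath a (name, children) node, left to right."""
    name, children = node
    if not children:
        return [name]
    out = []
    for child in children:
        out.extend(leaves(child))
    return out

print(leaves(TREE))  # → ['Bacteria', 'Archaea', 'Eukaryota']
```

  Note that a plain tree like this cannot express lateral gene transfer, which is precisely why (as the caption above says) a tree may not be the best representation of all genetic relationships.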
  This "idea of value" must represent 'integrity' as well, holistically. Global Seed Banks meet this requirement. CORRESPONDENCE forms a commensal relation with science so the two may coexist peacefully in a shared mental framework – noetic [eidetic] noesis. Representation Theory utilizes the correspondence principle in this way.

  The Question of Ontogeny and the Ontological has some interesting ASPECTS: the convergence of philosophy and theology. Semantic ontologies provide 'order' for insight and understanding. Correspondence and Ontogeny are found commensurable in Semantic Ontology.
• Light propagates across 'astronomical' distances – Deep Space Objects in 'Rapid Development'. Visualization utilizes a powerful lens [Hubble Space Telescope] to view a Millennial Moment. 1987 | 2004

  You can now see the prophecy of Hubble scientist George Sonneborn fulfilled. The Biblical statement in Matthew 6:22 says "if your eye be single your body will fill with light". Let us look at the two pictures together, in the beginning and now. Then think to yourself: why has this happened, for what purpose is this happening? There is another eye of God, as you know, which the Hour Glass Nebula is. That is the all-Seeing Eye. This is the cosmic pineal, this is the cosmic third eye, and what happens inside of a person's brain when the pineal lights is what is happening in the cosmos now.

  Joy to the world – the Lord has come; Hosanna in the Highest
• Metaphysics and Epistemology

  Augustine's Confessions: Issues and Commentaries. This text was a breakthrough by which Augustine imposed on philosophy and theology central issues: the self, election as identification, philosophy seen from the point of view of salvation (spiritual exercise), time as history and eschatology, being as creation, biblical text as interpreting the reader, etc. But all those themes have a recent renewed intensity because postmodern thought, and mainly phenomenology (Heidegger, Arendt, Derrida, etc.), have pointed out that Augustine, to some extent, might not have been involved in standard metaphysics. The reading is based on the Latin text (Bibliothèque augustinienne, Paris); some knowledge of Latin may be helpful. Translations: either H. Chadwick (Oxford, 1991) or M. Boulding (New York, 1997). Jean-Luc Marion, Spring 2004.

  Physics imparts the property of 'supersymmetry' to the metaphysics; this is why, without ethics, 'the metaphysics' are erroneous. The concept of a monopole is perfectly harmonious with conception and perception as well – through this, learning and insight are accomplished.

  The question of incommensurability to a 'classical observer' is of the utmost importance, especially when 'interpreting' natural phenomena such as SN 1987A, and this is because of the 'metaphysical' principles which allow such interpretation to be 'represented'. The Micro–Macro debate is never more coherent than when dealing with the 'microscope' of 'Deep Space' – instantiation or instanton.

  Molecular ring currents induced by magnetic monopoles. Transverse resistivity ρxy for the single crystal, thin film, and calcium-doped thin film (image credit: Z Fang et al. 2003 Science 302 92). These researchers, such as David Akers, term it "magnetic-monopole spin resonance" (Space Systems Division, Lockheed Missiles and Space Company, Sunnyvale, California 94086).

  Here is the 'counter-intuitive' viewpoint concerning ONTOLOGY!
http://www.shirky.com/writings/ontology_overrated.html
• K-Theory and Non-Commutative Geometry – NON-STANDARD STUFF

  Ultrafilters – Ultraproducts; but unfortunately the 'corruption of ultra power'

  ON THE FINE STRUCTURE OF SPACETIME – 'this' formalism of quantum mechanics
  ftp://ftp.alainconnes.org/2000.pdf

  Being discovered as "infinitesimal variables" gives a framework where continuous variables can coexist with infinitesimal ones, at the only price of having more subtle algebraic rules where commutativity no longer holds. The new infinitesimals have an "order" (an infinitesimal of order one is a compact operator whose characteristic values μ_n are a big O of 1/n). The novel point is that they have an integral, which in physics terms is given by the coefficient of the logarithmic divergence of the trace. Thus one obtains a new stage for the "calculus", and it is at the core of noncommutative differential geometry.
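  The "order" and "integral" just described can be written out. This is the standard formulation from Connes' noncommutative geometry, added here for reference rather than quoted from the source:

```latex
% An infinitesimal T of order \alpha: its characteristic values decay as
\mu_n(T) = O(n^{-\alpha}) \quad (n \to \infty),
% so "order one" means \mu_n(T) = O(1/n).  Its integral is the coefficient
% of the logarithmic divergence of the trace, the Dixmier trace:
\mathrm{Tr}_\omega(T) \;=\; \lim_{N\to\infty} \frac{1}{\log N} \sum_{n<N} \mu_n(T).
```

  The limit exists directly for suitably behaved operators; in general a limiting procedure ω is needed, which is where the subtler algebraic rules mentioned above enter.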
• Spread-spectrum techniques are methods by which energy generated in a particular bandwidth is deliberately spread in the frequency domain, resulting in a signal with a wider bandwidth. These techniques are used for a variety of reasons, including the establishment of secure communications, increasing resistance to natural interference and jamming, and to prevent detection.

  The most celebrated invention of frequency hopping was that of actress Hedy Lamarr and composer George Antheil, who in 1942 received U.S. Patent 2,292,387 for their "Secret Communications System".

  Code division multiple access (CDMA) is a channel access method utilized by various radio communication technologies. It should not be confused with the mobile phone standards called cdmaOne and CDMA2000 (which are often referred to as simply "CDMA"), which use CDMA as their underlying channel access method.

  THE METAPHYSICS OF REASON – http://diacentro.physics.auth.gr/rtalks/pdfs/nikolaidis01.doc
  The real problem of BRST supersymmetry – Physics requires Metaphysics.
  Incommensurability is the real issue within this micro–macro dichotomy!
  The Question of Metaphysical 'Validity' is not confined to scientific research:

  "Where is the wisdom we have lost in knowledge? Where is the knowledge that we have lost in information?" – T.S. Eliot, The Rock (1934) pt. 1
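  The core of frequency hopping is that both ends derive the same pseudo-random channel schedule from a shared secret, so an eavesdropper without it cannot follow the signal. A minimal sketch, with an invented band plan and seed (not the Lamarr–Antheil piano-roll mechanism):

```python
import random

# Hypothetical 8-channel band plan, 902.0-905.5 MHz in 0.5 MHz steps.
CHANNELS_MHZ = [902.0 + 0.5 * k for k in range(8)]

def hop_sequence(shared_seed, n_hops):
    """Return the first n_hops channel choices derived from a shared seed.

    Transmitter and receiver seed identical PRNGs, so they hop in lockstep.
    """
    rng = random.Random(shared_seed)
    return [rng.choice(CHANNELS_MHZ) for _ in range(n_hops)]

tx = hop_sequence(shared_seed=1942, n_hops=5)
rx = hop_sequence(shared_seed=1942, n_hops=5)
assert tx == rx  # both ends stay synchronized on the same channels
```

  A real system would use a cryptographic generator and handle timing drift; the point here is only the shared-schedule idea that makes the spread signal hard to jam or intercept.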
• Dirac quantization condition

  First, and above all for Dirac, the logic that led to the theory was, although deeply sophisticated, in a sense beautifully simple. Much later, when someone asked him (as many must have done before) "How did you find the Dirac equation?" he is said to have replied: "I found it beautiful." Second, it agreed with precise measurements of the energies of light emitted from atoms, particularly where these differed from ordinary (non-relativistic) quantum mechanics.

  Attempts to find monopoles. A number of attempts have been made to detect magnetic monopoles. One of the simplest is to use a loop of superconducting wire that can look for even tiny magnetic sources, a so-called "superconducting quantum interference detector", or SQUID. Given the predicted density, loops the size of a soup can would expect to see about one monopole event per year. Although there have been tantalizing events recorded, in particular the event recorded by Blas Cabrera on the night of February 14, 1982 (thus sometimes referred to as the "Valentine's Day Monopole"), there has never been reproducible evidence for the existence of magnetic monopoles. The lack of such events places a limit on the number of monopoles of about 1 monopole per 10^29 nucleons.

  The predicted magnitude and generalized rarity of 'magnetic monopoles' speaks of things which may 'approach' the Infinite and are present in things which are 'very peculiar'.

  THE HOUR GLASS NEBULA and the Cosmic Ontogeny of Astronomic Observation. You can see the same pattern in the Hour Glass Nebula of the intersecting circles which we also found in Supernova 1987A. We can see a close-up of the all-Seeing Eye in the Hourglass Nebula. So here we observe the all-seeing eye of God. And we now have the third eye of God on fire – the single eye that Jesus referred to in Matthew 6:22.
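  For reference, the Dirac quantization condition named in the heading above relates electric charge e and magnetic charge g; the standard textbook form (Gaussian units, not quoted from the source) is:

```latex
% The existence of a single magnetic monopole of charge g forces
% electric charge to come in integer multiples:
e\,g \;=\; \frac{n\hbar c}{2}, \qquad n \in \mathbb{Z},
% giving a minimal monopole charge
g_{\min} \;=\; \frac{\hbar c}{2e} \;=\; \frac{e}{2\alpha} \;\approx\; 68.5\,e .
```

  It is this large minimal charge that makes a passing monopole produce the distinctive flux jump a SQUID loop is built to detect.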
  And something spectacular is happening, just as was the prophecy from NASA Hubble scientist George Sonneborn. Let us go to the NASA announcement of developments in Supernova 1987A. It is written – "and man was made in God's Image" – AS IT IS ABOVE, SO IT IS BELOW. As we have shown most recently, the human brain and the cosmic brain are the same. As you can see here, the web configuration in the human cerebral cortex is the same as the web configuration in the universe.
• THE UNIVERSE | THE HUMAN CEREBRAL CORTEX

  In the same way that the universal brain and the human brain are the same, the universal pineal body (single eye) and the human pineal body (single eye) are the same.

  "Scientific Realism or Irenic Instrumentalism?" – Metaphysics for Rational Folks! Quodlibet Online Journal

  Preface to Totality and Infinity © Val Petridis; assisted by Tom Fatsis

  Throughout his works Emmanuel Levinas has used various terminologies to refer to God. In every case God is described as infinite: unknowable, unsayable and unsignifyable. In the "Trace of the Face" Levinas refers to God as the Unknown and absolute other. I will show that these concepts are completely compatible with one another and are based on the same premises. Furthermore, it will be argued that the term God is a phrase or ideatum that refers to that which cannot be known, signified, or contained in any expression of language. Levinas' God will be shown to be everything that can never be known or said. This God is not the personal deity depicted in typical religiosity; Levinas' God is not a divinity that interacts with the human world; rather, this God is that which lies beyond the limits of what humans can ever experience or know. It will be shown that the term God as found in Levinas' work can easily be substituted by any other term that refers to that which is beyond everything that can be known. Thus, I will argue that Levinas uses the terms the infinite, the unknown, the absolute other, and the otherwise than being to refer to the same non-religious God – the something that is absolutely beyond being. The term God refers to that which a secular term like infinity could as easily be used to refer to, without any of its intended connotations being lost in the substitution of one term for the other.
  In fact, I will show that the term God is more problematic than its secular counterparts, as it contains religious presuppositions that are not contained in Levinas' formulation of that which lies beyond being. In this sense, Levinas' terms for God secularize divinity and relegate God to a concept acceptable even to atheists. This is not the God of religion but rather a term used as an ideatum of that which is always beyond what humans will ever know – a God who will never fulfill any of the characteristics usually attributed to God, and even if God did, no one would ever know it.
• Reading the Kristevan Semiotic and Symbolic: Nina Sadur's "Kol'tsa" and Marina Kulakova's "Reka po imeni Master"
  Canadian Slavonic Papers, Sep–Dec 2003, by Sutcliffe, Benjamin M.

  ABSTRACT: The contemporary author and playwright Nina Sadur's short story "Kol'tsa" reflects how Julia Kristeva's dichotomous concepts semiotic and symbolic work with objects and dreams influencing the lives of two teenage girls. The semiotic is chaotic and resists organization, while Kristeva links the symbolic to the rational world. "Kol'tsa" shows that ultimately coherent meaning is doomed. Marina Kulakova's prose poem "Reka po imeni Master" depicts a teacher, initially paired with the symbolic, who is ultimately subsumed by the semiotic silence and shifting seasons of the remote village where she works. Kulakova, primarily a poet, has been published in Russia but remains unknown in the West.

  U istiny svobodnye odezhdy. (The truth wears loose-fitting clothes.) – Marina Kulakova

  Julia Kristeva's opposed terms, semiotic and symbolic, suggest different modes of signification in two works by contemporary Russian authors: Nina Sadur's short story "Kol'tsa" and Marina Kulakova's prose poem "Reka po imeni Master." A Kristevan analysis allows us to make several crucial distinctions concerning varying ways of perceiving the world and how these operate within works devoted to depicting ambiguity. Kristeva's Revolution in Poetic Language describes the semiotic and symbolic as "two modalities of what is, for us, the same signifying process. We shall call the first the semiotic and the second the symbolic. These two modalities are inseparable within the signifying process that constitutes language, and the dialectic between them determines the type of discourse (narrative, metalanguage, theory, poetry, etc.) involved [...]."
  Because the subject is always both semiotic and symbolic, no signifying system he produces can be either 'exclusively' semiotic or 'exclusively' symbolic, and is necessarily marked by an indebtedness to both.

  Definitions of Ontogeny should include:
  · Metamorphosis and Cytology – dynamic Ontogeny
  · Complex Adaptive Systems – Systems Ecology
  · Information Behavior and Biological Adaptation [Eugenesis]
  · Facilitation of Genetic Integrity – Eukaryotic | Prokaryotic
  · The Dynamic Ecology of Successional Communities
  · Potential Theory and its "dynamic" functionality
  · Ethics of the Common Good – Protection Paradigms

  The primordial dichotomy in the physiology of Eukaryote/Prokaryote (processing of gene information) cells is the fundamental principle which must guide the ethical utilization of bioengineering technologies.
• Phylogenic – Morphological Structures; Systemic Functions and Cellular 'Organs':
  · Action Potential
  · Chemical Synapse
  · Cochlear Structures
  · Endocytosis and Exocytosis
  · Eukaryote/Prokaryote (Processing of gene Information)
  · Gastric Secretion
  · Lysosome
  · Neuromuscular Junction
  · Oxygen / Carbon Dioxide
  · Sarcomere
  · Signal Amplification
  · Thyroid Hormone

  Explore the Tree of Life

  The source of this 'Tree of Life' concept is of course the Book known as the BIBLE. It is referenced in the first book, GENESIS, as 'the way of the tree of life' growing in the 'Garden of Eden', and this 'way' is guarded by 'spirit beings' [Angels]. This 'new representation' is based upon thermodynamic considerations and is a step towards the reconciliation of Science with Faith. Representation of 'the tree of life' is of course about commensality, but theories can be incommensurable in Correspondence with the 'historic record'.

  National Institute of Biomedical Imaging and Bioengineering (NIBIB)
  International Union of Pure and Applied Chemistry (IUPAC)
  Converging Technologies for Improving Human Performance: Nanotechnology, Biotechnology, Information Technology, Cognitive Science
• http://www.skepticfiles.org/science/931129ts.htm

  Until Roberts and Sharp announced their finding, at the same meeting in June 1977 held at Cold Spring Harbor, it was thought that the genetic information embedded in DNA was continuous. This understanding arose in large part from work on prokaryote systems, such as E. coli. But in eukaryotes, the genetic information is, in the vast majority, interrupted by nucleotide regions that do not code for proteins. These are called intervening sequences, or introns. The domains that carry protein-coding amino acids are known as exons, because their information is expressed. In these structures the genetic information is split into pieces, hence the name "split genes."

  http://www.nsf.gov/crssprgm/nano/reports/program_converging_tech_sch050127.pdf

  Misogyny – Issues in bioethics which relate to this dichotomy are:
  · cloning
  · frozen embryos – can these be used for 'embryonic stem cell' research?
  · the debate over GMOs
  · xenotransplantation
  · privacy issues in genetic testing

  Research and Development – Bioengineering | Nanotechnology:
  · Converging new technologies in industry and medicine
  · Brain, mind and behavior
  · Human-machine interface
  · Reshaping organizations and business
  · Enhancement of cell functions
  · Legal and ethical issues
  · Computer interfaces with nano-bio-cognitive components
  · International research activities – government programs

  http://www.highbeam.com/doc/1G1-114924599.html

  This article is a brief introduction to the extraordinarily complex phenomenon of life and to its molecular basis. We begin with the amazing diversity of life forms and the equally amazing unity in the molecules underlying life's processes. The challenge of accounting for both the variety and the commonalities among organisms is met by evolutionary theory; despite controversies, all scientific approaches to understanding life build on a shared core that can briefly be stated.
  One of the great insights of the last generation of biologists was the chemical instantiation of these evolutionary theories, whose discovery has driven biology toward the study of the structure and function of biological molecules. After an introduction to some of these key molecules and to the central dogma of molecular biology, we can begin to see the outlines of how such molecules can accomplish the tasks required of simple and then more complex life forms. The introduction concludes with a brief account of some of the new instruments and model systems that are now so rapidly advancing scientific understanding of life.

  We deem it necessary to:
  1. Promote an active policy towards the protection of intellectual property rights (IP) for many of the key issues. IP agreements between funding agencies must contemplate the claims of other institutions (i.e. universities or firms where research is carried out).
  2. Promote the international establishment of standards for products derived from NT. Many physical or chemical properties may already be measured for regular materials or components, but the NT contribution may significantly increase some limit values for new materials (viscosity, chemical stability, among many others).
  3. Promote the remote use of instruments to close the gap between research laboratories, and even to allow the manufacturing sector access to NT.
  4. Establish local committees to study potential risks in practices. Investigate risks and hazards, and write rules or exchange best practices to overcome them (as done previously with genetic constructions or dangerous chemicals). Would general regulations and standards like those of the FDA or EPA be applicable?
  5. Include safety and ethical issues as a separate chapter in research meetings. Once risks are clearly identified, compliance with protocols, safety rules and ethics should be an issue for 1) project proposals at financing agencies and academic and business laboratories, 2) release of products to the market, and 3) appropriate disposal mechanisms.
Many countries have established national committees to evaluate these practices for the introduction and handling of Genetically Modified Organisms, among others.

Bioengineering and Bioethics – Ontogeny and Metaphysics
http://www.non-gmoreport.com/
Bioethics and Biologic Integrity – ALL ORGANIC MOVEMENT – FOOD LABELING

A genetically modified organism (GMO) is an organism whose genetic material has been altered using techniques in genetics generally known as recombinant DNA technology. Recombinant DNA technology is the ability to combine DNA molecules from different sources into one molecule in a test tube. Thus, the abilities or the phenotype of the organism, or the proteins it produces, can be altered through the modification of its genes.

[Controversy – genetically modified food and transgenic plants, such as those grown in Paraguay.] Transnational Corporations and their EU (European Union) trade sanctions have created a global marketplace bureaucracy, making GMOs, in the case of GM foods and transgenic plants, very dangerous indeed due to constraints on sanctioning 'competitive advantages'. Many of the TNCs accused of promoting transgenic plants, and consequently GM foods, are coincidentally also the sponsors of the World Trade Organization, the Bilderberg Meetings, and the 'euphamous' Trilateral Commission (Club of Rome).
    • Ontogeny is defined traditionally in its relationship to Phylogeny: as the "origin" and "development" of cellular structure and function, of biological molecules and bioenergetics, of the genetics of both prokaryotic and eukaryotic organisms, and of the elements of molecular biology. Ontogeny (also ontogenesis or morphogenesis) describes the origin and the development of an organism from the fertilized egg to its mature form. Ontogeny is studied in developmental biology. The idea that ontogeny recapitulates phylogeny, that is, that the development of an organism exactly mirrors the evolutionary development of the species, is discredited today. However, the phenomenon of recapitulation, in which a developing organism will for a time show a trait or attribute similar to that of an ancestral species, only to have it disappear at a later stage, is well documented. For example, embryos of the baleen whale still develop teeth at certain embryonic stages, only for them to disappear later. A more general example is the emergence, in almost all mammalian embryos at early stages of development, of what could develop into pharyngeal gill pouches if it were in a lower vertebrate. (April, 2001). Excerpt [amongst many others – courtesy of Wikipedia]

Morphogenesis and Cytology provide the ontogeny necessary to "appreciate" the living world which has well sustained "the habitants". Succession works within dynamic ecosystems, providing a sound basis to explain the diversity of life in its global "Biomes" [Ecosystems]. Mother Nature is a "fitting" and "appropriate" ontology which is universally accepted. Mankind has "just begun" to understand the principles which sustain "living systems"; Space Exploration is one indicator of this. Only a few years ago there were very few references to Ontogeny; its "semantics of concern" has greatly increased in recent years.
The Ontogeny of Environmental Biology || Adaptation and the Tree of Life

Zoonosis
Zoonosis (pronounced zoo-e-no-sis) is any infectious disease that may be transmitted from other animals, both wild and domestic, to humans or from humans to animals. The word is derived from the Greek words zoon (animal) (pronounced as zoo-on) and nosos (disease). Many serious diseases fall under this category. The plural of zoonosis is zoonoses, from which an alternative singular zoonose is derived by back-formation. The simplest definition of a zoonosis is a disease that can be transmitted from other animals to humans. A slightly more technical definition is a disease that normally exists in other animals, but also infects humans. The emerging interdisciplinary field of conservation medicine, which integrates human and veterinary medicine, and environmental sciences, is largely concerned with zoonosis.

Systems and Cybernetic Convergence
    • Correspondence or incommensurability – Human/Computer
      • Conceptual Role Semantics and Cognitive Design Patterns
      • Complex Adaptive Systems || Systems Ecology Representation
      • Ecosystem Ecology
      • System Definition Model – Dynamic Systems Initiative
      • Generative Metamodel – Model Integrated Computing
      • Solves the Axiomatic Semantics Problem – of Declarative/Imperative
      • Atlantic Zoo – Meta-Modeling | Meta-languages
      • Generative Modeling Tools (GMT) project
      • Set of prototypes – Model Driven Engineering (MDE)
      • Semantic Web || Information Behavior

Entity Relationship Model – relevant relation [Occam's Principle]
Relevance in correspondence with another
Correspondence – in response to – in conjunction with – in spite of
Conformity – Congruence – Agreement – Accordance
Copying – Picturing – Signification – Representation – Reference – Satisfaction
Correspondence with a relevant portion of reality
Question of Reality – "classical observer"
Facts – States of affairs – Situations – Events
Objects and their Aspect Orientation
Sequences of objects – Dimensional Aspects – Sets – Properties – Tropes
Entropic constraints; environments; relationship hierarchy

8. The Correspondence Theory and Its Competitors
Against the traditional competitors—coherentist, pragmatist, verificationist, and other epistemic theories of truth—correspondence theorists raise two main sorts of objections. First, such accounts tend to lead into relativism. Take, e.g., a coherentist account of truth; since it is possible that 'p' coheres with the belief system of S while 'not-p' coheres with the belief system of S*, the coherentist account seems to imply, absurdly, that contradictories, 'p' and 'not-p', could both be true. To avoid embracing contradictions, coherentists often
    • commit themselves (if only covertly) to the objectionable relativistic view that 'p' is true-for-S and 'not-p' is true-for-S*. Second, the competing accounts tend to lead into some form of idealism or anti-realism. E.g., it is possible for the belief that p to cohere with someone's belief system even though it is not a fact that p; also, it is possible for it to be a fact that p even if no one believes that p, or if the belief that p does not cohere with anyone's belief system. Cases of this form are frequently cited as counterexamples to coherentist accounts of truth. Coherentists tend to reject such counterexamples by insisting that they are not possible after all—a reaction that commits them to the anti-realist view that the facts are (largely) determined by what we believe. B. According to the identity theory of truth, true propositions do not correspond to facts, they are (identical with) facts: the true proposition that snow is white = the fact that snow is white. This non-traditional competitor of the correspondence theory threatens to collapse the correspondence relation into identity. In response, a correspondence theorist might point out: First, the identity theory is defensible only for propositions as truth bearers, and only if propositions are construed in a certain way, namely as having objects and properties as constituents rather than ideas or concepts of objects and properties. Hence, even if the identity theory of truth were accepted for propositions (so construed), there would still be ample room (and need) for correspondence accounts of truth with respect to other types of truth bearers. Second, the identity theory rests on the assumption that that-clauses always denote propositions, so that the that-clause in "the fact that snow is white" denotes the proposition that snow is white. The assumption can be questioned. That-clauses can be understood as ambiguous names, sometimes denoting propositions and sometimes denoting facts.
The descriptive phrases "the proposition…" and "the fact…" can be regarded as serving to disambiguate the succeeding ambiguous that-clauses.

Content – Didactic Pedagogy is essential to the comprehension of 'Classical Philosophy'. This advocacy defines 'arts' as well as 'letters': a discourse of creative, if purposeful, communication. The 'aesthetics' of walking and talking are part of its 'charm'. http://www.elea.org/Parmenides/ Parmenides wrote his didactic poem 'On Nature' and the 'speculation' began; the concept of metaphysics is often traced to this writing. Ontology is the being; Greek scholars of the present day (2007 | 5768) place its etymology in the 'garden of Eden', where Adam walked with the Lord and 'named' all of the creatures there, pleasing 'his father'. This naming, hence, is ontology: the being which is immovable. Parmenides did not have access to the book of Genesis, so it may be surmised that On Nature is about Mother Earth, as so many endemic peoples believe and practice. In metaphysics, ontology is the study of Nature – Physis (Greek); it is directly related to 'consciousness' as BEING, from the perspective of a 'classical observer' (who in fact is everyone and no one in particular). Ontologies are Ontology – this is the sense of it.
    • Noesis is a state of knowing: comprehensive understanding of an ontologic presence, being. It is not 'knowledge' as such, for it is written: 'The fear of the Lord is the beginning of knowledge; the fear of the Lord is the beginning of Wisdom.'

The Parmenidean Paths of Inquiry – http://www.elea.org/Parmenides/Parm-comment.htm

An Interpretation
The following is my interpretation of the philosophy of Parmenides of Elea, the Greek father of metaphysics. His only work, On Nature, is written in rather obscure verse, and so his thesis can be viewed from a variety of perspectives, of which mine is only one (although a fairly standard one). Parmenides' most important principle, hereafter called "Parmenides' Principle", was that anything rationally conceivable must exist. Nonbeing is not a thing and can neither be thought of nor spoken about in any meaningful or coherent way. Parmenides forbade talking as if there are possible things that nonetheless do not exist. He illustrated this principle by showing us three possible methods of inquiry, of which only one is valid. The following chart summarizes them.

Parmenides' Principle (from Allan F. Randall):
1. The Way of Objective Truth: Necessarily, all possibilities exist. (Consistent, Coherent)
2. The Unthinkable Way: Necessarily, no possibilities exist. (Consistent, Incoherent)
3. The Way of Subjective Belief: Some possibilities exist, some do not. (Inconsistent, Incoherent)

Bioethics, however, must encompass a much greater metaphysics, one which includes understanding of Systems (Complex and Adaptive) and their discrete principles – especially Living Systems and their Ontogeny. Seekers of truth – for this is the 'LOVE' of wisdom – indeed the continuing history of philosophy itself is one of the 'subjects' of discourse; the ethics of the COMMON GOOD its objective; Virtue is its reward. It is well taken that in 'allegory' Parmenides has provided NATURE for this purpose.
Nature's Ontogeny has developed teleological systems for commensality and biodynamic pathways of procreation. The conclusion of ON NATURE provides insight into this 'impasse' – An Ontogeny for Ontology:

19: "Thus, according to belief, these things were born and now are, and hereafter, having grown from this, they will come to an end. And for each of these did humans establish a distinctive name."
20: "One and unchanging is that for which as a whole the name is: 'to be'."
    • Logic is bound by human cognition in terms of this insight (as intuition) – this subjective/objective dichotomy, if you will – and can shine the light of understanding upon the counterintuitive and non-binding logic of chaos and its 'reprobate' ideologies. Mankind's responsibility for 'things which were born and that now are' is not diminished through the providence of science. Greek Philosophy produced 'Ethics'; Hebrew Theology, Commandments and a Covenant with God.

Nicomachean Ethics (sometimes spelled 'Nichomachean'), or Ta Ethika, is a work by Aristotle on virtue and moral character which plays a prominent role in defining Aristotelian ethics. It consists of ten books based on notes from his lectures at the Lyceum and was either edited by or dedicated to Aristotle's son, Nicomachus. Nicomachean Ethics focuses on the importance of habitually behaving virtuously and developing a virtuous character. Aristotle emphasized the importance of context to ethical behavior and the ability of the virtuous person to recognize the best course of action. Aristotle argued that eudaimonia is the goal of life, and that a person's pursuit of eudaimonia, rightly conceived, will result in virtuous conduct.

Eudaimonia (Greek: εὐδαιμονία) is a classical Greek word commonly translated as 'happiness'. Etymologically, it consists of the words "eu" ("good" or "well being") and "daimōn" ("spirit" or "minor deity", used by extension to mean one's lot or fortune). Although popular usage of the term happiness refers to a state of mind, related to joy or pleasure, eudaimonia rarely has such connotations, and the less subjective "human flourishing" is often preferred as a translation.

The Perseus Project is a digital library project of Tufts University that assembles digital collections of humanities resources. It is hosted by the Department of Classics.
It suffers, unfortunately, from very frequent computer hardware problems, and as such its resources [1] are often unavailable. The project is mirrored in Berlin and Chicago.[2]

The Ethics; Treatise on the Emendation of the Intellect; Selected Letters – by Benedictus de Spinoza (Author): "If Socrates' execution was the key event in Plato's life that shaped his subsequent career in philosophy, Spinoza's excommunication from the Jewish community of Amsterdam..."
    • Jonathan Barnes's Aristotle (2000) provides an excellent and brief introduction to Aristotelian philosophy. In terms of impact on the Ethics, perhaps Aristotle's most significant concept is that of the teleology of nature. According to Aristotle, nature works toward a telos, or end goal. His biological work aims constantly at the question of what purpose different aspects of plants and animals serve. He classifies humans as "rational animals," meaning that our telos is rational. In other words, our function in life is to realize our full potential as rational beings. If we are not fully rational, we are falling short of our true nature. This teleological view gives Aristotle's Ethics a clear sense of direction. Our goal in life is to achieve our true nature, and this true nature consists essentially of rationality. The purpose of a moral education, then, is to teach us how we may become perfectly rational and immune to the temptations of our lower animalistic parts. Ethics is just one of a number of fields that Aristotle classifies as "practical science." Unlike the natural sciences, which examine the world around us, these sciences deal with the practical aspects of human society and how best to arrange this society. The practical sciences are all closely connected, and Aristotle frequently expounds on the connection between the good life for the individual and the kind of state that could make this good life possible. Hence, Aristotle's Politics is an important companion and sequel to his Ethics. While the Nicomachean Ethics is Aristotle's most popular work on ethics, there is a second work called the Eudemian Ethics, which is far less widely read. Most scholars agree that the Eudemian Ethics was written earlier in Aristotle's career and represents a less mature view. Books V, VI, and VII of the Nicomachean Ethics are also found in the Eudemian Ethics.

Aristotle – Louvre | Queen Esther – Jews for Jesus | W3C Standards
    • A Solution to Plato's Problem: The Latent Semantic Analysis Theory of Acquisition, Induction and Representation of Knowledge

Abstract
How do people know as much as they do with as little information as they get? The problem takes many forms; learning vocabulary from text is an especially dramatic and convenient case for research. A new general theory of acquired similarity and knowledge representation, Latent Semantic Analysis (LSA), is presented and used to successfully simulate such learning and several other psycholinguistic phenomena. By inducing global knowledge indirectly from local co-occurrence data in a large body of representative text, LSA acquired knowledge about the full vocabulary of English at a rate comparable to school-children. LSA uses no prior linguistic or perceptual similarity knowledge; it is based solely on a general mathematical learning method that achieves powerful inductive effects by extracting the right number of dimensions (e.g., 300) to represent objects and contexts. Relations to other theories, phenomena, and problems are sketched.

The Latent Semantic Analysis Model
The model we have used for simulation is a purely mathematical analysis technique. However, we want to interpret the model in a broader and more psychological manner. In doing so, we hope to show that the fundamental features of the theory that we will later describe are plausible, to reduce the otherwise magical appearance of its performance, and to suggest a variety of relations to psychological phenomena other than the ones to which we have as yet applied it. We will explicate all of this in a somewhat spiral fashion. First, we will try to explain the underlying inductive mechanism of dimensionality matching upon which the model's power hinges. We will then sketch how the model's mathematical machinery operates and how it has been applied to data and prediction.
Next, we will offer a psychological process interpretation of the model that shows how it maps onto but goes beyond familiar theoretical ideas, empirical principles, findings and conjectures. We will then, finally, return to a more detailed and rigorous presentation of the model and its applications. An Informal Explanation of the Inductive Value of Dimensionality Matching
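The dimension-matching mechanism described above can be made concrete with a toy sketch (illustrative data and parameter choices only, not the authors' corpus or their 300-dimension setting): build a term-document count matrix, truncate its singular value decomposition to k dimensions, and compare terms in the reduced space. Terms that never co-occur directly, such as "ship" and "boat" below, can still end up close together because they occur in similar contexts – the indirect induction the theory relies on.

```python
# Toy Latent Semantic Analysis sketch: truncated SVD of a small
# term-document count matrix. All data here is made up for illustration.
import numpy as np

terms = ["ship", "boat", "ocean", "tree", "wood"]
docs = ["ship ocean", "boat ocean", "tree wood", "tree wood"]

# term-document count matrix A
A = np.array([[d.split().count(t) for d in docs] for t in terms], float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                          # LSA keeps only the top-k latent dimensions
term_vecs = U[:, :k] * s[:k]   # term coordinates in the latent space

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

i = {t: n for n, t in enumerate(terms)}
# "ship" and "boat" never appear in the same document, yet both occur
# with "ocean", so dimension reduction places them close together.
print(cos(term_vecs[i["ship"]], term_vecs[i["boat"]]))  # near 1.0
print(cos(term_vecs[i["ship"]], term_vecs[i["tree"]]))  # near 0.0
```

Choosing k is the crux: too many dimensions and the model only memorizes direct co-occurrence; too few and distinct meanings collapse together.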
    • Truth, Coherence and Correspondence in the Metaphysics of F.H. Bradley
Absolute Idealism and Analytic Philosophy – The Incommensurability Paradigm
Quantum Phenomenology – K-Theory is a product of this transitional discourse

Starting with Descartes' cogito, "I think, therefore I am"--and taking an uncompromisingly rational, rigorously phenomenological approach--I attempt to derive the basic principles of recursion theory (the backbone of all mathematics and logic), and from that the principles of feedback control theory (the backbone of all biology), leading to the basic ideas of quantum mechanics (the backbone of all physics). What is derived is not the full quantum theory, but a basic framework--derived from a priori principles along with common everyday experience--of how the universe of everyday experience should work if it operates according to rational principles. We find, to our surprise, that the resulting system has all the most puzzling features of quantum physics that make physicists scratch their heads. Far from being "bizarre" and "weird", as is usually thought, the strangest paradoxes of quantum theory turn out to be just what one ought to expect of a rational universe. It is the classical, pre-quantum universe of the nineteenth century that has irrational, mystical components. The quantum-mechanics-like theory that is developed is, furthermore, most compatible with the strictest, most uncompromisingly rationalist of the standard interpretations of quantum mechanics, those which add no ad hoc elements to the theory, and which generally trace their history to the relative state formulation of Everett (also called the "many worlds" interpretation). These interpretations take the universe to be quite literally describable as a quantum wave function. As with any project this far-reaching in scope, I confess I have had to make some working assumptions along the way.
I have attempted to isolate these, and clearly label them as points of possible future revision--they are marked in the text with an asterisk (*).

A critique of Allan Randall's poignant insights: Paradox gives rise to enigma – the enigma of existence itself, in that it is 'integral', meaning that entropy cannot 'be broken'. Pedagogy purports Logic and Recursion Theory as a basis for 'Metaphysics', yet Eidetic Intuition provides for a greater comprehension of what is seen and observed without contradicting either logic or recursion. So-called Mathematical Control Theory provides for rigorous computation of stochastic processes, yet the stochastic is seen as infinitely reducible [Bose–Einstein]; modern computer science utilizes recursion and provides 'proof' that recursion is a 'metaphysical' paradigm. No matter what aspect or perspective one appreciates, this 'fundamental harmony' is essential to appreciating classical and modern philosophy as well, commensurable with man's stewardship.
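The feedback-control idea invoked above can be sketched as a minimal proportional controller (a generic textbook loop, assumed here for illustration; it is not Randall's own derivation): a system repeatedly corrects its state by a fraction of the error between state and set-point, the basic loop behind both thermostats and biological homeostasis.

```python
# Minimal proportional feedback loop: the correction applied at each
# step depends only on the current error (set-point minus state).
def regulate(state, setpoint, gain=0.5, steps=20):
    for _ in range(steps):
        error = setpoint - state
        state += gain * error   # feedback: the output is fed back as input
    return state

# With 0 < gain < 1 the error shrinks geometrically by (1 - gain) per step.
final = regulate(state=0.0, setpoint=10.0)
print(final)
```

The point of the sketch is the circularity: the next state is defined recursively in terms of the previous one, which is exactly where recursion theory and control theory meet in the quoted passage.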
    • Implicit Ambiguity Resolution – inter-subjectivity; the soul of reciprocity
      • regime of representation that has prevailed in philosophy since Descartes; correction of semantic drift
      • semiotic representation affecting or effecting || self-affecting reflexivity
      • symbolism denotes intending objects through signs
      • question of the reflexivity of awareness || counter-intuitive
      • defining consciousness as 'auto-affection' becoming autocracy
      • counter-intuitive root of reflexivity || essence of Cartesians

Intractability and Conflict Resolution
Toward Better Concepts of Peace – what is the purpose of discourse anyway?

IN SOME THEORETICAL MODELS, researchers think that distinct sets of cognitive processes interpret a sentence at each posited level of representation, and they claim that distinct mental representations result from those computations. That approach predicts that the qualitatively different processing principles for syntax and semantics arise from the existence of qualitatively different neural processing (Chomsky, 1986; Clifton, Speer, & Abney, 1991; Ferreira & Clifton, 1986; Frazier, 1987; Marslen-Wilson & Tyler, 1987; Rayner, Garrod, & Perfetti, 1992). In contrast, other models have proposed that a semantic representation is assembled directly, without an intermediate syntactic representation (McClelland, St. John, & Taraban, 1989; Mitchell & Holmes, 1985). In that perspective, syntactic information is integrated with lexical or semantic and pragmatic information in a continuous process of mapping on to a meaningful representation of the whole sentence (Altman, Garnham, & Dennis, 1992; MacDonald, Pearlmutter, & Seidenberg, 1994; Spivey-Knowlton & Tanenhaus, 1994; Taraban & McClelland, 1990).
Hypotheses
Despite the different variables taken into consideration (Kotz, Holcomb, & Kounios, 1992), the observations allow us to hypothesize that, at least under certain experimental conditions, the N400 and the P600 are elicited as functions of the representational level of the anomaly, both semantic and syntactic (Osterhout & Holcomb, 1995). From our perspective, the identification of different wave variations should invite us to consider the existence of distinct components that intervene in sentence decoding. They are fairly diversified phenomena, nevertheless co-occurring in the process of sentence comprehension (Osterhout, McLaughlin, & Bersick, 1997). Moreover, in the present research, we analyzed a specific sentence-level context, and we cannot apply our results to other cognitive processes, such as word level. The principal paradigms of analysis point out the difference between word-level processes, which include the recognition of isolated words (words in single-word context), and sentence-level processes, which include the recognition of words in sentence context and the computation of syntactic structure (Balconi, 2001a; Osterhout & Holcomb). The
    • choice of sentence level is motivated by the need to compare more rigorously syntactic and semantic information processing and to explore the dichotomy between structural and semantic representations of words in sentence context. (Ainsworth-Darnell et al., 1998; Osterhout & Nicol, 1999; Van Petten, 1993)

One of the basic concerns of classical mathematical logic has been a rigorous definition of "mathematical proof". A proof may be discovered by luck, genius or accident. But once discovered it is mechanically checkable. Thus the most general type of mechanically checkable procedure provides a definition of the most general kind of proof. An investigation of "mechanical checkability" leads naturally to the notion of a "computable process". Recursion theory is that branch of mathematical logic which studies computability. From its very inception, recursion theory has been so closely associated with theoretical computer science that it is sometimes difficult to tell where one begins and the other ends.

Complexity is seen as 'ultra', to which 'filters' are applied for the sake of these rigorous paradigms. Living Systems, however, manifest a purpose which purports 'commonality', and their complexity is beyond recursive techniques; yet mathematical logic proposes to define it. The classical viewpoint IS commensurable with Quantum Theory and its communications theory if and only if what is being communicated is information propagated by a living system itself, purposefully, to maintain this existence and 'adapt' [taxis] to an 'ever' changing environment – even where that environment is the living organism itself and its ecologic community of adaptation. Thereby the ontogeny of complex adaptive systems becomes commensurable with Quantum Mechanics and its rigorous mathematical recursion and feedback 'taxis'.
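The notion of a "computable process" above can be illustrated with a classic object of recursion theory, the Ackermann function: perfectly mechanical to evaluate (hence computable) yet growing too fast to be primitive recursive. The sketch below is the standard textbook formulation, offered as a generic example rather than one drawn from any source quoted in this document.

```python
# The Ackermann function: a total computable function that is not
# primitive recursive. Mechanically checkable step by step, yet its
# values explode even for tiny inputs.
import sys
from functools import lru_cache

sys.setrecursionlimit(100_000)  # the recursion is deep even for small m, n

@lru_cache(maxsize=None)
def ackermann(m, n):
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

print(ackermann(2, 3))  # 9
print(ackermann(3, 3))  # 61
```

Each call reduces mechanically to earlier calls, which is exactly the sense in which a proof, once found, is checkable by a purely mechanical procedure.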
Eidetic Noesis, however, has serious problems with morality and the bioethics which stewardship must maintain – this excerpt concerns basic engineering goals and objectives:

eal@eecs.berkeley.edu

1 Introduction
Engineers have a major advantage over scientists. For the most part, the systems we analyze are of our own devising. It has not always been so. Not long ago, the principal objective of engineering was to coax physical materials to do our bidding by leveraging their intrinsic physical properties. The discipline was one of “applied science.” Today, a great deal of engineering is about coaxing abstractions that we have invented. The abstractions provided by microprocessors, programming languages, operating systems, and computer networks are only loosely linked to the underlying physics of electronics.
    • The rapid improvements in the capabilities of electronics during the last half of the 20th century are, in part, the reason for this separation. The physical constraints imposed by limited memory, processing speed, and communication bandwidth appeared to evaporate with each new generation of computers. What appeared to one generation as luxuriously inefficient abstractions became the bread and butter of the next generation. The separation of “computer science” from “electrical engineering” is both a consequence and a cause, fueling the separation and reflecting it at the same time. At the same time, the systems science that was incubated in the study of electronic circuits (control systems, communications theory, and signal processing) has also become more abstract. Although these disciplines were created by true “electrical engineers” (“true” means that they were engaged with electrical systems), many of the practitioners today rarely encounter electricity directly. Their techniques are often realized in “embedded” software, ironically building on the abstractions that are only loosely connected to the electronics that their theory originally helped to create. The theories, however, have not adapted as well as one might hope to the world of software. Perhaps these theories remain too wedded to their physical heritage. (quotation)

Science and its twin children – magic and engineering – have never exhibited restraint in the area of abstinence from temptation to experiment, even when such experiments violate all laws and the 'norms' of cultural decency. Medical Practice has become a major source of this malfeasance, where 'malpractice' is merely the retribution of 'insurance' claims, not the criminal culpability for deliberate and recurring acts of subversion. Into this void has stepped 'Bioethics' and its paradigms for the common good and proper caretaking of its communities.
The Mosaic Effect and the Ecologic Patch Phenomenon

ONTOLOGIES FOR CRISIS MANAGEMENT
Session ID: INT-04
Motivation – Crisis Intervention and Information Architecture

What is a crisis? When does a crisis begin, and when does it end? What factors are likely to aggravate or alleviate a crisis? How is a crisis different from an emergency, or a disaster? Who are the players, stakeholders, actors and agents responsible for action during a global emergency and what are their roles? What and whose procedures are to be followed? What protocols are in place to support coordination and communication among the various agents? What infrastructures are in place? How can community “resilience” fit into the picture? Where are the bottlenecks? How can information systems be deployed and used to improve crisis management and support the optimization of resources and relief operations when the need arises? How can
    • transparency and collaboration be balanced with security and privacy measures during a crisis? How feasible is a common shared ontology for emergency management? Will such an ontology scale to international levels, and who will drive this process and manage its evolution? These, and many more, are “ontological challenges” that pertain to the emergency management functions across all levels of government, non-government organisations, industry, and community groups. During a crisis incident, they all need to collaborate and cooperate and share information and resources to respond to and recover from the disaster. Under these conditions, it is critical that they share a common ontology to support their crisis functions and decision-making roles. This session aims to provide an opportunity to allow researchers and practitioners to present their views, and to stimulate experts to further investigate the underlying “ontological challenges” that are at the heart of technical information cooperation during an international crisis. Ontologies are critical to the design and management of complex and sustainable information systems and are central to information flow in crisis management. The need to improve and open up knowledge and research in the area of “ontologies for crisis management” is becoming compelling and relevant to real-world requirements. Ontological challenges relating to crisis management need to be asked, and answered, in order to provide mechanisms to widen adoption, interoperability, usefulness, efficiency, robustness, reliability, availability and accountability of information systems during emergencies. The ontologies need to represent a wide cross-section of dynamic emergency functions and to support dynamically adaptive real-time scenarios, as changes occur quickly and need to be propagated widely. Reasoning and decision making must be transparent and flexible with the support of crisis ontologies.
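What a "common shared ontology" buys the agencies involved can be sketched with a toy triple store (all concept names below are invented for illustration, not drawn from any agency's actual schema): once classes such as Wildfire are related by subClassOf, any participant can answer "is this incident a CrisisEvent?" with a transitive query rather than a bespoke lookup.

```python
# Toy crisis-management ontology as (subject, predicate, object) triples,
# with a transitive subClassOf query. Concept names are illustrative only.
TRIPLES = {
    ("Wildfire", "subClassOf", "Disaster"),
    ("Flood", "subClassOf", "Disaster"),
    ("Disaster", "subClassOf", "CrisisEvent"),
    ("Evacuation", "respondsTo", "Wildfire"),
}

def superclasses(cls):
    """Return every ancestor of cls reachable via subClassOf."""
    found, frontier = set(), {cls}
    while frontier:
        step = {o for (s, p, o) in TRIPLES
                if p == "subClassOf" and s in frontier}
        frontier = step - found
        found |= step
    return found

print(sorted(superclasses("Wildfire")))  # ['CrisisEvent', 'Disaster']
```

Production systems express the same idea in RDF/OWL rather than Python sets, but the interoperability argument in the passage rests on exactly this shared, machine-queryable class hierarchy.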
The boundaries of this emerging research area are vast and still being determined. Ontology research has been increasing over the past years, with the Semantic Web providing new technologies to solve ontological needs across disciplines and domains. Ontology management still faces many challenges when taken at the broader level – in particular at an international level – across many different stakeholder areas. How can ontologies support the many agencies and groups involved in a crisis?

Emergency 2.0
The LA Fire Department is clearly raised in the same spirit, as the people working there quickly absorbed Web 2.0 tools into their activity to make it more effective and caring toward people:
LAFD_ALERT service | Flickr Photo Gallery | YouTube Channel | BlogTalkRadio.com
Along with the blog, the LA Fire Department inserted Twitter into its online panoply of citizen services and established its designation as a reliable tool for emergency response in ordinary or crisis situations. Brian Humphrey and Ron Myers from the LAFD said that the attributes the Web 2.0 tools possess — “desirable, beneficial, justifiable and sustainable” — motivated their choice.
    • Intuition: Discernment Of Conscience There are, in fact, two such vortices at work in the Pacific Ocean, the other one lying just off the coast of Japan (another major producer of plastic waste on the Pacific Rim). According to Greenpeace: An enormous island of trash twice the size of Texas is floating in the Pacific Ocean somewhere between San Francisco and Hawaii. Chris Parry of the California Coastal Commission in San Francisco said the so-called Great Pacific Garbage Patch has been growing at a brisk rate since the 1950s, the San Francisco Chronicle reported Friday. The trash stew is 80 percent plastic and weighs more than 3.5 million tons.
    • http://alipr.com/ Automatic Linguistic Indexing of Pictures - Real Time Content-based image retrieval (CBIR) using Aspect Oriented Context Sensitive Workflows. Content-based image retrieval (CBIR), also known as query by image content (QBIC) and content-based visual information retrieval (CBVIR), is the application of computer vision to the image retrieval problem, that is, the problem of searching for digital images in large databases. (See this survey [1] for a recent scientific overview of the CBIR field.) "Content-based" means that the search will analyze the actual contents of the image. The term 'content' in this context might refer to colors, shapes, textures, or any other information that can be derived from the image itself. Without the ability to examine image content, searches must rely on metadata such as captions or keywords, which may be laborious or expensive to produce. Representative CBIR systems such as ALIPR, QBIC, VisualSEEk, and Photobook support content-based retrieval by color, texture, and shape. Block-Based Neural Network Architecture Passive Aggressive Model for Image Retrieval PLSA-based image auto-annotation: constraining the latent space. In: ACM Multimedia cross-media relevance models Bioinformatics – Semantic Web Tools || Information Architecture and 'the human condition'; The network may not always completely mask the computing infrastructure behind it, but it hides considerable detail and mostly allows access through "Web-y" protocols, languages, and standards such as HTTP, RSS, XML, JavaScript, and REST. 
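A toy sketch of the idea: compare images by a quantized color histogram instead of by captions or keywords. This illustrates only the color component; the named systems (QBIC, VisualSEEk, Photobook) combine color with texture and shape features, and the image representation here (lists of RGB tuples) is an assumption for brevity.

```python
# Content-based retrieval by color: each image is reduced to a
# quantized color histogram, and the query is matched against the
# database by histogram intersection rather than by metadata.

from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize each channel into `bins` levels and count occurrences."""
    step = 256 // bins
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: overlap of the normalized histograms."""
    n1, n2 = sum(h1.values()), sum(h2.values())
    keys = set(h1) | set(h2)
    return sum(min(h1[k] / n1, h2[k] / n2) for k in keys)

def retrieve(query, database):
    """Rank database images by color similarity to the query."""
    qh = color_histogram(query)
    scored = [(histogram_intersection(qh, color_histogram(img)), name)
              for name, img in database.items()]
    return [name for _, name in sorted(scored, reverse=True)]

red_img   = [(250, 10, 10)] * 100   # mostly-red image
blue_img  = [(10, 10, 250)] * 100   # mostly-blue image
query_img = [(240, 20, 20)] * 100   # reddish query

db = {"red": red_img, "blue": blue_img}
print(retrieve(query_img, db))  # ['red', 'blue']
```

Quantization into coarse bins is what makes the match tolerant: the query and the red image fall into the same color cell despite differing pixel values.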
On July 24, 2004 I responded just after that with, "So the user-created bottom-up categorical structure development with an emergent thesaurus would become a Folksonomy?" Eidetic Linguistics Indexing using Semiotic Representation - The Semiotics of Folksonomy It is not that technology provides 'opportunity' but exponentially expanded 'ACCESS' to opportunity; this drives 'business applications' of 'semantic web technology'. Prayer and meditation are still required for 'perfection' of decision making - especially in their discrete application to real-life situations. Use of Semantic Modeling - the power of algorithms applied to content syndication on the Web can be conceptualized:
    •  structured topic correlation  syntactic syndication approaches  automated reasoning support  Web Ontology Language – Web Services Process Model Ontology (WSPMO)  rich semantics-based mechanism for expressing subscriptions and published content  Polyhedron representation - octrees; fractal Geometry (topology) - Correspondence Theory  Stanford SIMPLIcity system - Automatic Linguistic Indexing of Pictures  Quaternions - minding your p's and q's - gimbal lock and Lie algebras  rendering animated polygonal models with real-time lighting and shadows – SIMD  Folksonomy in the context-sensitive generation of Microformats Visualization - cleaning our lenses Semantics – the meaning of words – is a "coarse" discipline at best, and its relationship to what today is called Information theory is equally "disputed". Here is a quote – "Information is a semantic unity" – which others have argued as "ontologies are tautological; or they are not Ontology". This quotation from Tim Berners-Lee, the WWW protagonist, presents a "catharsis" of: A Semantic Web is not Artificial Intelligence (search for the Article). Information becomes the "paradox" to be resolved through an "ontological" tautology. This is "systems thinking" at its best – logic abhors tautology – being is "immersed" in it. Ontogeny "Cause and effect, means and ends, seed and fruit cannot be severed; for, the effect already blooms in the cause, the end pre-exists in the means, the fruit in the seed." - Ralph Waldo Emerson The Ontogeny of Environmental biology Zoonosis - Reference > Zoonosis Updates Adaptation and the Tree of Life The word is derived from the Greek words zoon (animal) (pronounced as zoo-on) and nosos (disease). Many serious diseases fall under this category. The plural of zoonosis is zoonoses, from which an alternative singular zoonose is derived by back-formation. The simplest definition of a zoonosis is a disease that can be transmitted from other animals to humans. 
A slightly more technical definition is a disease that normally exists in other animals, but also infects humans. The emerging interdisciplinary field of conservation medicine, which integrates human and veterinary medicine, and environmental sciences, is largely concerned with zoonoses.
    • Genome research and its malediction – Evolutionary Biology Moabite Stone Fringetail Purple Finch An Idea of Value for Semantic Ontologies Semantic Ontologies express the basis of relationships; true correspondence and mutual understanding amongst "sentient beings". One must differentiate Ontology as being from the comprehended ontological in order to appreciate that "being" is what generally is termed "consciousness" – living systems as ALIVE; ONTOGENY. The sense of naming "creatures" within the "Tree of Life" is fundamental to ONTOLOGY. Semantics offers a dichotomy to consider: Etiology and Ethology - causation and causality. Etiology is ontological and is the root of "myth and folklore" - Ethology may explain why certain "species" [of the human genus] are fond of making up stories about the Natural world. Etymology then must consider ontogeny, etiology and a generalized Ethology of a human family which evolution denies. General theory of reflection - Gödel and Tarski  Metalogical foundations - make the logic of choice an inter-changeable parameter  Reflective logical and semantic frameworks  Nuprl's constructive type theory as a reflective Metalogical framework  The Structure of Post-Axiomatic Mathematics  Research relations between formulas and functions on sets of formulas  Meta-Formulas - Need for Theory of Formulas  The Concept of Truth in Formalized Languages  On the Concept of Logical Consequence  Independence of the Axiom of Choice and the Generalized Continuum Hypothesis  relationship of relativity theory and idealistic mathematics in the light of philosophy
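The reflective themes listed above — Gödel, Tarski, formulas about formulas — can be made concrete in a few lines. The sketch below (illustrative only, not tied to any of the listed frameworks) is a quine: a program whose output is its own source code, built by applying a template to a quotation of itself. This is the same diagonal construction that produces Gödel's self-referential sentence and drives Tarski's undefinability-of-truth argument.

```python
# A quine: the template `src` is applied to a quotation of itself
# (via repr, the {!r} conversion), so the program prints exactly its
# own two lines of source.
src = 'src = {!r}\nprint(src.format(src))'
print(src.format(src))
```

Running it prints the two source lines verbatim: the quoted template reappears inside itself, which is the diagonal trick in miniature.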
    • The Group Concept - Notes "The theory of groups is, as it were, the whole of mathematics stripped of its matter and reduced to pure form". Jules-Henri Poincaré (1854-1912) Correspondence or Incommensurability? Technologic Considerations  Representation Theory + commensality or incommensurability  Categorical Geometry :: Fractal Geometry [topologic perfection]  Partial differential equations and harmonic functions  Tesseract and hypercube  Superstring Theory :: grand set theory :: symmetry :: supersymmetry  Self-rationalization [dangerous practice]  Differentiates four levels for possible analysis  Correspondence principle :: Quantum Mechanics – Inequalities | Uncertainty Ontogeny and its ONTOLOGY – questions of a philosophical nature After reading a modern philosopher like Emmanuel Levinas – 'Totality and Infinity' – the matter is much clearer: ontology and noesis are co-adjunct operators in the "metaphysics". In the Golden Age of Greek Philosophy this noetic framework was the didactic [teaching] of geometry, ethics and logic. Yet it is difficult to practice what you preach if your society is extremely immoral or even depraved. Today the "Golden Age of Physics" has propelled a global society into "stellar space" using another aspect of this "noetic framework", and that is eidetic noesis - which is "photographic" quality visualization. Yet the semantics remain unchanged - this is good! This is most certainly true of bioethics - one of the most compelling "interdisciplinary" fields, where even the most basic conceptions of western civilization can be challenged due to the "misanthropy" of its "unethical experimentations". One of the most recent examples of this is the "AIDS holocaust", which by now is threatening to become just as destructive to human populations as WW2.
    • The River: Origins of the Aids Pandemic Zoonosis - evolution - symbiosis; these terms may apply to the HIV/AIDS pandemic "prognosis" but will not survive a "parsec" in the hysterical world of the World Health Organization and its pharmacology-based medicine. An important footnote is that indeed the Salk vaccine was wholly adequate to effectively prevent poliomyelitis. Knowledge representation system The true crisis in correspondence - especially with "reality" - is realizing the very great deceit which has been perpetrated historically. Cosmic convergence is heralded by the vast changes in Earth's global weather patterns [thermodynamics]; this physical yet unpredicted change transcends the havoc of waste and destruction which mankind's greed has contributed. Spiritual Consciousness, in sympathy with those suffering terrible loss, looks to a "Holy God" whose "ways are far past finding out". The spirit our ears and eyes receive "patience" from: to understand or, if not, to search for the truth... this is harmony with cosmic convergence. The millennia are converging; space and time are actually warped by the incredible power of the cosmos which is "invested" in our "Planet Earth". The millennia have "jubilees": celebrations of history and cultural traditions. Globally the amount of intercultural reconciliation is greater than anyone could ever have predicted. Why is this? The Millennia are being "discovered". This means our unity will be restored. Meaning and Purpose are co-adjunct motivators of ontology, and society in general. Without meaning, purpose is counter-intuitive and begets, as it is colloquially stated, "the vicious circle". 
Millennial Moments - Historic Insights Discernment of Harmonic Convergence of the "physical" requires Spiritual Consciousness - this so-termed "higher consciousness" is "ontogenous" – One with being. Infosystems Intelligence The holistic unity of the natural world is an ontogeny in and of itself. Our ontological perception of it is also an ontogeny - observation by a living soul. This unity of body and soul is known as spirituality. And we are forewarned — only two spirits exist: that which is good and that which is reprobate... these spirits act derisively [against each other] and are the source of hostility and its human neurosis. Ontogeny is based upon living systems and their highly discrete purpose [survival]. The Hebrew Scriptures inscribe that Almighty God dwells in unapproachable light... his creation must be endowed with some very great qualities as well.
    • There is apparently considerable misunderstanding about "ethics" and its moral values. When it is written that "all have sinned and fallen short of the glory of God", sin must be atoned for... crime therefore will be punished. Deceit, hypocrisy and its "counter-intelligence" are all works and character traits of that "wicked one" which the Holy Spirit will rebuke. Whether it is the "axiomatic semantics of first order logic" or the ideologue of folk wisdom - "god fearing" people - what 'ought to be' will be. The milieu does not seem to grasp this; secular humanism and its evolutionary theories clutch at madness - why is this? Man is evil from his youth onwards - who can know it! The global Opium trade is one long-term example of this. Ontogeny provides the "superstructure" which a living world and its living systems abide in. This "cosmos" is more than the elements; it is also the very nature of reality. Supersymmetry - time reversal invariance - holistic ecology; these all "emanate" from a "creator". The uncanny similitude between the aloe plant and the moon jellyfish transcends habitat, but requires ideal conditions for optimal growth. Such 'morphological symmetry' permeates Nature. It is written that "the worlds are made of things which do not appear"! It must be immediately observed by anyone taking part in the "developments" which are being termed Web 2.0 that all computer-language-based technologies are "converging", i.e., becoming interoperable with each other. This was the whole point of the "Semantic Web" and its "Ontologies". 
Those who declare "paradigm shifts" and "subversion" practices while maintaining semantic "ambivalence" to a general usurpation of ethical practices and privacy requirements counsel "acquiescence" while "jacking" up the prices of their "inefficient" products regularly. Paradigm shifts of this kind generate "semantic drift", a dangerous practice in the highly abstract realm of cyberspace. Silicon Snake Oil salesmen everyone - all probability is ultimately ONE. Perhaps another point well taken: at the 'millennium', mankind's UNITY is being restored - rapidly. Semiotics, semiotic studies, or semiology is the study of signs and symbols, both individually and grouped into sign systems. It includes the study of how meaning is constructed and understood. This discipline is frequently seen as having important anthropological dimensions. However, some semioticians focus on the logical dimensions of the science. They examine areas belonging also to the natural sciences - such as how organisms make predictions about, and adapt to, their semiotic niche in the world (see semiosis). In general, semiotic theories take signs or sign systems as their object of study: the communication of information in living organisms is covered in biosemiotics or zoosemiosis.
    • Syntactics is the branch of semiotics that deals with the formal properties of signs and symbols. Terminology The term, which was spelled semeiotics (Greek: σημειωτικός, semeiotikos, an interpreter of signs), was first used in English by Henry Stubbes (1670, p. 75) in a very precise sense to denote the branch of medical science relating to the interpretation of signs. John Locke used the terms semeiotike and semeiotics in Book 4, Chapter 21 of An Essay Concerning Human Understanding (1690). Here he explains how science can be divided into three parts: All that can fall within the compass of human understanding, being either, first, the nature of things, as they are in themselves, their relations, and their manner of operation: or, secondly, that which man himself ought to do, as a rational and voluntary agent, for the attainment of any end, especially happiness: or, thirdly, the ways and means whereby the knowledge of both the one and the other of these is attained and communicated; I think science may be divided properly into these three sorts. —Locke, 1823/1963, p. 174 Locke then elaborates on the nature of this third category, naming it Σημειωτικη (Semeiotike) and explaining it as "the doctrine of signs" in the following terms: Nor is there anything to be relied upon in Physics, but an exact knowledge of medicinal physiology (founded on observation, not principles), semeiotics, method of curing, and tried (not excogitated, not commanding) medicines. —Locke, 1823/1963, 4.21.4, p. 175 In the nineteenth century, Charles Peirce defined what he termed "semiotic" as the "quasi-necessary, or formal doctrine of signs" that abstracts "what must be the characters of all signs used by... an intelligence capable of learning by experience" (Collected Papers of Charles Sanders Peirce, paragraph 2.227). 
Charles Morris followed Peirce in using the term "semiotic" and in extending the discipline beyond human communication to animal learning and use of signals. Saussure, however, viewed the most important area within semiotics as belonging to the social sciences:
    • It is... possible to conceive of a science which studies the role of signs as part of social life. It would form part of social psychology, and hence of general psychology. We shall call it semiology (from the Greek semeîon, 'sign'). It would investigate the nature of signs and the laws governing them. Since it does not yet exist, one cannot say for certain that it will exist. But it has a right to exist, a place ready for it in advance. Linguistics is only one branch of this general science. The laws which semiology will discover will be laws applicable in linguistics, and linguistics will thus be assigned to a clearly defined place in the field of human knowledge. —Cited in Chandler's "Semiotics for Beginners", Introduction. A Question of Totality - what Metaphysics can be! Metaphysics Defined - http://members.aol.com/Srabbitt1/index3.html According to Funk & Wagnall's Dictionary, Metaphysics is defined as: (1) "The branch of philosophy that investigates principles of reality transcending those of any particular science, traditionally including cosmology and ontology." (2) "All speculative philosophy". The second definition, at first glance, appears vague and almost meaningless in its generality. The term "speculative" carries with it the secondary meaning of risk along with that of contemplation, although we do not understand what risk lies in philosophy other than taking it seriously! The first definition appears to be quite PC (Politically Correct) and speaks in clearly nonjudgmental tones. It too seems simple enough, provided you know how to define philosophy, transcendence, cosmology and ontology. Philosophy - (1) The general inquiry into the most comprehensive principles of reality in general, or of some sector of it, as human knowledge or human values. (2) The love of wisdom, and the search for it. (3) A philosophical system; also a treatise on such a system. 
(4) The general laws that furnish the rational explanation of anything: the philosophy of banking. (5) Practical wisdom; fortitude Transcendence - To be independent of, or beyond Cosmology - The general philosophy of the universe considered as a totality of parts and phenomena subject to laws Ontology - The branch of metaphysics dealing with the philosophical theory of reality While Philosophy's multitude of meanings seems to encompass most of human intellectual activity, the other defined terms seem straightforward. Rather than restate
    • the metaphysics definition (1), we leave it to you to plug in the appropriate definitions, including the philosophical definition of choice. For the more skeptical among you, or those most pressed for timely answers for newcomers that can fit into a "Blipvert" of two lines in a chat room, Metaphysics definition (2) seems reasonable. So is that it? Have we learned all we need to know about Metaphysics, "the definition"? Certainly not! Delving ever deeper into the meaning of our room's namesake, I discovered a historically oriented, and perhaps more illuminating, definition. With all due respect to its source, "A History Of Philosophy" by A.G. Fuller (Prof. at USC), we have a definition which better satisfies the savage soul... Metaphysics - "In its popular and general sense – denotes the investigation of the essential and absolute nature of reality as a whole, or of the nature of being as such. Cf. ontology. The search for first principles; originally meant 'what comes after physics', and was used originally of the works of Aristotle that followed his Physics in the collection made by Andronicus. Used by Aquinas (St. Thomas) to designate knowledge of supernatural entities; by the Cartesians (Rene), of immaterial entities; by Kant, of constructive attempts to know the nature of things as they are in themselves, and of theories regarding objects of faith, like G-d, freedom and immortality; by Bergson and other intuitionists, of the immediate acquaintance with the real given by direct intuition of its nature, as contrasted with the falsifications of the nature of the real by the intellectual process..." So metaphysics wasn't that easy to define after all, was it? RECENT ADVANCES IN METAPHYSICS - E. J. Lowe 1. Philosophy, metaphysics and ontology There is a widespread assumption amongst non-philosophers, which is shared by a good many practicing philosophers too, that 'progress' is never really made in philosophy, and above all in metaphysics. 
In this respect, philosophy is often compared, for the most part unfavorably, with the empirical sciences, and especially the natural sciences, such as physics, chemistry and biology. Sometimes, philosophy is defended on the grounds that to deplore the lack of 'progress' in it is to misconceive its central aim, which is to challenge and criticize received ideas and assumptions rather than to advance positive theses. But this defense itself is liable to be attacked by the practitioners of other disciplines as unwarranted special pleading on the part of
    • philosophers, whose comparative lack of expertise in other disciplines, it will be said, ill-equips them to play the role of all-purpose intellectual critic. It is sometimes even urged that philosophy is now 'dead', the relic of a pre-scientific age whose useful functions, such as they were, have been taken over at last by genuine sciences. What were once 'philosophical' questions have now been transmuted, allegedly, into questions for more specialized modes of scientific inquiry, with their own distinctive methodological principles and theoretical foundations. This dismissive view of philosophy is at once shallow and pernicious. It is true that philosophy is not, properly speaking, an empirical science, but there are other disciplines of a non-empirical character in which progress most certainly can be and has been made, such as mathematics and logic. So there is no reason, in principle, why progress should not be made in philosophy. The four-category ontology has no difficulty in saying what 'ties together' the particular properties — that is, the modes — of an object. An object's modes are simply 'particular ways it is': they are characteristics, or features, or aspects of the object, rather than constituents of it. If properties were constituents of an object, they would need, no doubt, to be tied together somehow, either very loosely by coexisting in the same place at the same time, or more tightly by depending in some mysterious way either upon each other or upon some still more mysterious 'substratum', conceived as a further constituent of the object, distinct from any of its properties. It is precisely because a mode is a particular way this or that particular object is that modes cannot 'float free' or 'migrate' from one object to another — circumstances that pure trope theorists seem obliged to countenance as being at least metaphysically possible. 
Moreover, the four-category ontology allows us to say that the properties of a kind are tied to it, in the laws to which it is subject, in a manner which entirely parallels, at the level of universals, the way in which an individual object's modes are tied to that object. In both cases, the tie is simply a matter of the 'characterization' of a propertied entity by its various properties and consists in the fact that the properties are 'ways' the propertied entity is. Fig. 2 below may help to highlight the main structural features of the four-category ontology as I have just outlined it. In this diagram I use the term 'attribute', as suggested earlier, to denote the category of property-universals and, for simplicity of presentation, I am ignoring relational universals. (needs quotation)

    Kinds ---- characterized by ----> Attributes
      ^                                  ^
      | instantiated by                  | instantiated by
      |                                  |
    Objects -- characterized by ---->  Modes
    (Objects also exemplify Attributes, along the diagonal)

Fig. 2: The four-category ontology
    • An object O may exemplify an attribute A in either of two ways. O may instantiate a kind K which is characterized by A, in which case O exemplifies A dispositionally. Alternatively, O may be characterized by a mode M which instantiates A, in which case O exemplifies A occurrently. It may perhaps be doubted whether the four-category ontology provides an adequate metaphysical foundation for the more esoteric reaches of modern physics, such as the general theory of relativity and quantum physics. But I believe that even there it will serve well enough. THE METAPHYSICS OF REASON ABSTRACT: We will briefly review the developments in Physics and Mathematics during the last century. We find that while Science was born and grew under the auspices of Cartesian principles, 20th-century Physics and Mathematics defy the Cartesian premises. The emerging unified viewpoint on the cosmos renounces the dualisms (subject-object, reason-faith, and cogitation-being) and incorporates elements of the traditions. Modern Mathematics was marked by Cantor's study of infinity. An infinite quantity breaks the usual axiom that «the whole is greater than (the sum of) its parts»: an infinite quantity is the same size as a part of itself. Cantor found that there is a hierarchy of infinities. The «simplest» infinity, ℵ₀, corresponds to the infinity of the integers (the set N). The next level of infinity, ℵ₁, is represented by the real numbers (the set R). The chain of infinities continues with no end (ℵ₀, ℵ₁, ℵ₂, ℵ₃, …). How are two levels of infinity connected? We move from one level of infinity to the next by invoking the requirement of totality. For example, starting from the set of integers N, the set of all subsets of N has the cardinality of R. Every time we use the requirement of totality we move to a higher level of representation, where different laws apply (for example, the rules of arithmetic with ℵ₀ are not the same as the rules of arithmetic with ℵ₁). 
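The level jump described above — from a set to the totality of its subsets — can be checked concretely at finite scale, and Cantor's diagonal construction shows why no map from a set onto its powerset can succeed. A minimal sketch (the sample function f below is an arbitrary illustrative choice):

```python
# Cantor in miniature: for any set S, no function f: S -> P(S) is onto,
# because the diagonal set D = {x in S : x not in f(x)} differs from
# every f(x). Lifted to the integers, the same construction separates
# aleph_0 from the cardinality of the continuum.

from itertools import chain, combinations

def powerset(s):
    s = list(s)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

S = {0, 1, 2}
P = powerset(S)
assert len(P) == 2 ** len(S)  # 8 subsets: the "totality" jump at finite scale

# Any attempted enumeration f: S -> P(S) misses its own diagonal set.
f = {0: frozenset(), 1: frozenset({0, 1}), 2: frozenset({1, 2})}
D = frozenset(x for x in S if x not in f[x])
assert all(D != f[x] for x in S)   # D is not in the range of f
print(sorted(D))  # [0]
```

For each x, D and f(x) disagree on whether x itself is a member, so D cannot equal any f(x), whatever f we pick.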
It is not an accident that most of the paradoxes in Mathematical Logic involve the mixing of different levels; recall Russell's paradox, where we use the notion «the set of all sets». Mathematical systems cannot be purified of any internal contradictions, or presented as constructions of pure logic. Gödel has shown that we cannot separate mathematics from metamathematics. His ingenious numbering of mathematical propositions indicated that a proposition accepts a double reading: as a proposition of the theory and as a statement (or a comment) about the propositions of the theory. Furthermore, Gödel, using the diagonal lemma of Cantor, formulated the proposition G: a sentence which asserts of itself that it is not a theorem. From now on, we know that the dichotomy into true or false statements is not correct. We may encounter undecidable statements, statements which cannot be classified as true or false. The true statements exceed in number the proven statements, and the continent of truth cannot be explored using only the analytical method [2]. From Gödel's proof we understand that any organized system breaks down when we employ self-referentiality, when the system is
delivered to an endoscopic examination. In order to assess the system as a whole, we have to move outside the system, to another level of representation and knowledge. In Genesis God asks Adam to give names to all beings of Creation. Name, recalling father Pavel Florensky [5], is a human energy, which takes the mind beyond the subject to reach a world. Thus the name and the word constitute reality itself. We are asked, Adam and ourselves, to face and provide meaning to the cosmos. We have to decipher in a creative way the άλογος λόγος (the speechless word), as Origen defined the universe. According to St. Maximos the Confessor, man should unite the sensible beings. Then, going beyond the level of the sensible and entering the universe of the intelligible, man unifies all the different logoi into a single logos. Finally, in an act of immense αγάπη (love, eros), man presents the whole united universe as an offering to God. Thus man and nature participate in a cosmic liturgy, a Eucharist. Paul defines faith as "the examination of unseen things". Faith leads to knowledge, and then this knowledge to a new higher faith and so on... "from faith to faith and from knowledge to knowledge". Thus the whole knowledge of the visible and the invisible appears organized in levels. The knowing process leads to a real unity between the knower and the known. At the end, η γνώσις αγάπη γίνεται (knowledge becomes love), as St. Gregory of Nyssa says. Regarding now triadic structures, Christian faith rests upon this great mystery of a triadic God. Without any intention of entering into this deep theological issue, we could mention the relation between God who is revealed and those who are revealing Him, the Logos and the Holy Spirit. Gnosis and deification are achieved by the presence of both the Logos and the Spirit (διά του Υιού εν Αγίω Πνεύματι – through the Son in the Holy Spirit). We stressed already the emerging paradox. 
The interpretation and the understanding of the Science of the 20th century requires and suggests elements of the Tradition. The new-born knowledge reflects the 2500-year-old Sophia and gnosis. Somehow we discover what the Greeks and the Christians call ΑΙΩΝ, that is, Time – not a segment of time, not time in evolution, but Time in its duration and in its totality. The new underlying pattern of knowledge is composed of old elements. How to rearrange these old elements into a new structure is a question of aesthetics. We have to prove, over and over again, that the cosmos is cosmos. Or as Charles Sanders Peirce put it: "the universe as an argument is necessarily a great work of art, a great poem - for every fine argument is a poem and a symphony - just as every true poem is a sound argument..." The dimension of freedom, however, is not something created out of nothing by a God, for that would be to deal with it like some ontic thing caused by an agent. The dimension of freedom is indeed the dimension of Dasein, but Dasein's existence cannot be accounted for ontically-causally by way of creation, but rather only ontologically-transcendentally, by uncovering and stepping back into its conditions of possibility, for freedom is the transcendental dimension per se. The creature is an existent which of course depends on an 'other' (or is it 'another'), but not like a part which separates itself from the other. Creation out of nothing breaks the system; it posits a being outside the system, i.e. there where its freedom is possible. (78/149) (Not necessarily; for one contemporary example, Levinas posits God as 'absolutely other' – 'otherwise than being' – which cannot in this sense, theologically, be anything other than being itself – a true paradox!)
    • The question of how the foundling Moshe came to 'know and write' Hebrew. v) Ontology flattened - http://www.webcom.com/artefact/wrldshrg/lvnsoth5.html Lévinas, on the contrary, in 'Totality and Infinity', takes the appearance of the face of the other to be the point at which the Infinite or the Absolute intervenes with the word that teaches or instructs, like a master instructs his disciple. The point at which the master appears is, he claims, the origin of language. The intervention of the Absolute is strongly reminiscent of the Old Testament motif of God calling on Moses to lay down the Ten Commandments, and this impression is reinforced by Lévinas' insistence on the ethical impossibility of killing the other: "Thou shalt not kill". The appearance of the Infinite on the scene is in a quite literal sense a 'deus ex machina' in Lévinas' hands; in fact he introduces the idea of a "creatio ex nihilo" (e.g. 78/149) in order to "break" with the system of a (causal) totality, and introduces an understanding of a human being as a "separate existent" self-centred on its own enjoyment (cf. e.g. 91/166). But this totality is conceived as an ontic-causal system into which he introduces the dimension of freedom. Strangely enough, the 'deus ex machina' who performs the 'creatio ex nihilo' is a causal principle: http://www.kazuo-katase.com/texteng.htm But Katase discovered elementary geometries, light and shadow, complementarities in general, which tend towards traditional minimalism. For him, space is not an empty vacuum with objects placed in it, which is still observable in perspective terms as dead spatiality. Instead, space is an elementary moment of integration of all subjects that is narrated by the ground of being in the totality of nature on the globe; it is a system of relationships of tension that take effect within the paradox of complete stillness. 
Square and cube, circle, ring and sphere, line, rod, and cylinder are the basic forms for this; blue, red and yellow are the principal colors, white and black are the non-colors; monochrome is an ensemble style. An overview is always simultaneously an inner view. Born in Shizuoka in Japan in 1947, Katase has been living and working in Kassel, Germany, since 1976. He brought with him to the West a Zen tradition enlightened by modern Japanese philosophy, mainly represented by Kitaro Nishida (1870-1945). Nishida's philosophy of pure experience (3) emerged from the cultural opening of Japan toward the West that took place after 1868. By its very nature, Zen is incapable of becoming a world philosophy - and it has never attempted to be this, rejecting as it does explanatory discourse from the start; inner experience, image, and gesture are more important in it. Initially in the pragmatic and behaviorist thought of British philosophers, and later in the metaphysics of German Idealism and the romantic psychologisms that developed from it, Japanese philosophers recognized elements capable of bridging the gap between East and West(4). LAO TZU'S METAPHYSICS AND HIS CRITIQUE OF CONFUCIAN ETHICS - VINCENT SHEN http://www.crvp.org/book/Series03/III-4/chapter_ii.htm
Lao Tzu has given to Chinese Culture the most profoundly speculative system of metaphysics in the history of Chinese philosophy. However, his metaphysics was quite intimately related to his critique of the Confucian Ethics. In fact, Lao Tzu's metaphysics emerged first as a vehement critique of the `Too Human', that is, too anthropocentric ethical orientation in Confucianism. Lao Tzu's metaphysics is proposed as an ultimate solution to the impasse created by this too human ethics. It is characterized by an emphasis upon the ontological foundation of human nature and its re-insertion of human action into the cosmic spontaneity shared by all things as begotten by the Ultimate Tao. According to my interpretation, Lao Tzu was first of all a cultural critic of his time: his concept of Tao was proposed as an ulterior solution to the socio-political and spiritual crises of that society. By his penetrating criticism, fused with a profound praxis of life, he established a paradigm of social critique and critique of ideology for Chinese culture in general. His writings on the Tao and its virtues, entitled the Tao Te Ching ( ), have revealed to us the image of a society in a process of radical change. On the one hand was the disintegration of the cultural order in ancient China constituted by Chou Li ( ): the social institutions and politico-religious rites of the Chou Dynasty. On the other hand, new cultural elements were emerging, but without being able to stabilize themselves as a new social order. Viewed from this perspective, even though we possess very few historical accounts about the life of Lao Tzu, after a rigorous textual analysis of the Tao Teh Ching, we could still judge that its author had composed it in the epoch of the Warring States (480-221 BC).
Thus, we are justified in denying the only narrative tradition concerning Lao Tzu since Ssu-ma Ch'ien, according to which Lao Tzu was a keeper of the archives at the Chou Court and an elder contemporary of Confucius (551-479 BC). On the contrary, the Tao Teh Ching was composed much later than Confucius. It was in criticizing the society of the Warring States and the ethics of Confucianism that Taoism emerged as a vigorous way of thinking, and hence as a deep, fundamental trait of Chinese thinking and even of the Chinese attitude towards life and society in general. Formal Ontology, Common Sense and Cognitive Science Barry Smith - http://ontology.buffalo.edu/focscs.htm From: International Journal of Human-Computer Studies, 43 (1995), 641-667. Abstract Common sense is on the one hand a certain set of processes of natural cognition - of speaking, reasoning, seeing, and so on. On the other hand common sense is a system of beliefs (of folk physics, folk psychology and so on). Over against both of these is the world of common sense, the world of objects to which the processes of natural cognition and the corresponding belief-contents standardly relate. What are the structures of this world? How does the scientific treatment of this world relate to traditional and contemporary metaphysics and formal ontology?
Can we embrace a thesis of common-sense realism to the effect that the world of common sense exists uniquely? Or must we adopt instead a position of cultural relativism which would assign distinct worlds of common sense to each group and epoch? The present paper draws on recent work in computer science (especially in the fields of naive and qualitative physics), in perceptual and developmental psychology, and in cognitive anthropology, in order to consider in a new light these and related questions and to draw conclusions for the methodology and philosophical foundations of the cognitive sciences. It is then necessary to distinguish between:
O1. natural cognition as a totality of processes (within which we can distinguish various sub-totalities of language, reasoning, vision, etc.);
O2. the more or less coherently organized systems of pre-scientific beliefs (of folk physics, folk psychology, etc.), which are extractable from this totality of cognitive processes and which can be seen as playing a central role in the organization thereof;(2)
O3. the world (the system of reference-objects) to which the cognitive activities and beliefs in O2 primarily relate.
This gives rise to a corresponding division on the theoretical side between:
T1. sophisticated (scientific) theories of the processes in O1 (for example psychological theories of the workings of the human visual system);
T2. sophisticated (scientific) theories of the naive belief-systems in O2;(3)
T3. sophisticated (scientific) theories of the objects in O3.
Unfortunately, none of the listed dimensions is entirely unproblematic and each is the subject of as yet unresolved debates between different philosophical and methodological camps. Thus many cognitive scientists see work on developing theories under T3 as part and parcel of attempts to develop computer simulations of common-sense reasoning and beliefs under O1 and O2.
This association - which is manifested in the very terminology of `naive physics', `folk psychology', etc. - has led some to suppose that the scientific investigation of the structures of the common-sense world ought properly to involve the use of less sophisticated logical or mathematical tools than are available to those engaged in scientific investigations of other sorts.(4) This assumption is here rejected. Common-Sense Realism The thesis that there is only one world towards which natural cognition relates is a central plank of what philosophers in the course of history have identified as the doctrine of common-sense realism. This is a doctrine according to which:
a. we enjoy in our everyday cognitive activities a direct and wide-ranging relational contact with a certain stable region of reality called the common-sense world;
b. our everyday cognitive activities rest upon a certain core of interconnected beliefs - called `common sense' - which is in large part true to the common-sense world as it actually is, not least in virtue of the fact that such beliefs and our associated cognitive capacities have arisen through interaction with this world;
Appearance and Reality Our natural cognitive experiences are of course in many cases non-veridical, and thus the common-sense realist must confront the fact of error. At the same time, however, it must be pointed out that common sense is itself aware of the many sorts and species of error that are involved in our everyday cognitive endeavors. Thus common sense is not, in spite of its reputation, naive; it draws a systematic distinction between reality and appearance, or in other words between the way the world is and the way the world seems or appears via one or other of the sensory modalities and from the perspective of one or other perceiving subject in one or other context. The thesis that there is only one world towards which natural cognition relates must thus be understood as being compatible with the thesis that there are many different ways in which this world can appear to human subjects in different sorts of circumstances.(8) Perception as Discrimination The common-sense realist holds that perception is a source of veridical information about the common-sense world. As Neisser puts it: `Under normal circumstances, perception of the local environment is immediate, effortless, and veridical' (1987, p. 11). The putative information supplied by perception is always partial, and sometimes erroneous, but it can in every case be supplemented and corrected by the gathering of further information about the sides of objects we cannot see, about the future behavior of objects, and so on. Common Sense and Physics The common-sense realist must also confront the question of the relation between the common-sense world and the world that is described in the textbooks of standard physics.
Here again a number of different philosophical alternatives have been mapped out in the course of philosophical time, including the view that it is the common-sense world that is truly autonomous while the world of physics is to be awarded the status of a cultural artifact.(11) How could we discover that common sense is false? The doctrine of common-sense realism is not merely a doctrine concerning the ontological status and nature of the common-sense world. The doctrine also has an epistemological component, embracing a thesis affirming the existence of a network of actually existing relations between the objects in this world and ourselves as cognizing agents, relations which facilitate veridical cognition: the world and its subjects are as it were in tune with each other. Thus common-sense realism is to be contrasted not only with idealism but also with what might be called asymptotic or utopian realism, a view according to which cognitive access to reality can in principle be achieved, but only in the long run, with the ultimate perfection of our cognitive apparatus, when some future ultimate science will finally match up to the world as it is in itself. Like various forms of idealism, asymptotic realism denies what seems to the common-sense realist to be the evident fact that we are already in direct contact with the world, or with much of the furniture of the world, and have been so for a long time. (Moore 1959, p. 33) A Note on Model-Theoretic Semantics - I for one favor 'common sense'!
An adequate theory of natural cognition presupposes a theoretical understanding also of the structures of that common-sense world to which natural cognition relates. This thesis is at odds with the thesis taken for granted by many modern philosophers - a thesis rarely explicitly formulated - to the effect that the tasks of semantics and theory of reference are most properly to be realized by investigating not the concrete world of common sense but certain sorts of abstract structures. This applies very clearly, for example, to those formal philosophers working in the tradition of Montague, but the mentioned thesis is much more widely held, and it is nowadays almost always taken for granted that the job of the semantic theorist involves the construction of special set-theoretic models. The latter serve as surrogates for the things and events of the common-sense world, so that semantic investigation of the sentences of natural languages (and of the thoughts which these sentences express) remains at one remove from the world towards which - on a realist understanding - these sentences relate. This is a reasonable outcome if the aim of semantics is one of characterizing linguistic structure in its own right, i.e. independently of its concrete referential application. It is reasonable also if the aim of semantics is one of specifying the meanings of those syncategorematic terms - for example quantifiers and other logical constants - which play a role equally in all domains of discourse. It is through a treatment of such functional meanings - and this is the glory of post-Tarskian semantics - that set-theoretic models are able to help us to understand the truth-behavior of logically compound sentences in terms of a prior understanding of the truth-behavior of the logically simple sentences out of which they are constructed.
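The glory of post-Tarskian semantics just mentioned - that a model determines the truth of compound sentences recursively from the truth of the simple sentences out of which they are built - can be sketched in a few lines of code. This is my own minimal illustration; the connective names and the sample model are not from Smith's paper:

```python
# A minimal sketch of model-theoretic truth evaluation: the "model" assigns
# truth values to simple (atomic) sentences, and the evaluator computes the
# truth of logically compound sentences from those assignments alone.

def evaluate(sentence, model):
    """Recursively evaluate a propositional sentence against a model.

    `model` is a dict mapping atomic sentence letters to True/False;
    compound sentences are nested tuples: ('not', s), ('and', s1, s2),
    ('or', s1, s2).
    """
    if isinstance(sentence, str):          # logically simple sentence
        return model[sentence]
    op = sentence[0]
    if op == 'not':
        return not evaluate(sentence[1], model)
    if op == 'and':
        return evaluate(sentence[1], model) and evaluate(sentence[2], model)
    if op == 'or':
        return evaluate(sentence[1], model) or evaluate(sentence[2], model)
    raise ValueError(f"unknown connective: {op}")

model = {'p': True, 'q': False}            # the model: a surrogate "world"
print(evaluate(('and', 'p', ('not', 'q')), model))  # True
```

The point of the sketch is exactly the one in the text: the evaluator knows nothing about what 'p' and 'q' refer to; the model is a set-theoretic surrogate standing at one remove from the world the sentences are about.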
Where, however, our concern is precisely with the world-embrangled meanings of a natural language and with the associated world-embrangled systems of natural cognition, then this sort of semantic investigation must at the very least be supplemented by inquiries of a different sort, inquiries pertaining precisely to the structures of the common-sense world. Only thus, indeed, will we be in a position to establish the adequacy of set-theoretic models of natural language of the more usual sort. Against Methodological Solipsism The thesis that semantics ought properly to proceed by dealing only with certain abstract models tailored to and dependent upon the structures of the languages with which the semantic theorist deals, has its analogue, in the psychological sphere, in the doctrine of `representationalism' or `internalism'. This doctrine, which lies at the heart of much contemporary cognitive science, has roots in the view of Descartes to the effect that the external world could be in every respect completely different from what it is (could even not exist) and yet our thoughts would remain exactly the same. Perhaps the strongest statement of the internalist thesis has been formulated by Fodor in his paper (1980) on the methodology of cognitive psychology. If, Fodor argues, our psychological processes were to be conceived as relational in structure in the sense that they are intrinsically of or about certain corresponding real-world objects, then the investigation of such processes would have to involve the investigation also of those 'objectual' targets themselves. This, however, would rule out the possibility of a science of psychology. For the latter, before it could formulate laws of its own, would need to presuppose a theory of the objects of thought, and this, as Fodor puts it, would have to be a theory of everything, a universal pan-science embracing all other sciences as constituent disciplines.
A 'nomological' psychology would have to `attempt to specify environmental objects in a vocabulary such that environment/organism relations are law-instantiating
when so described.' But `we have no access to such a vocabulary prior to the elaboration (completion?) of the non-psychological sciences' (Fodor 1980, p. 300). Thus we could not construct a naturalistic psychology of reference unless we had some way of saying for example what salt is, which of its properties determine its causal relations with other things and with ourselves, and so on 'ad infinitum'. Folk Psychology and Folk Ontology The set of common-sense beliefs about external reality is of course part of a wider totality which includes also common-sense linguistics, common-sense economics, a common-sense theory of ethics and law and table-manners as well as much else, and it is by no means easily detachable from this wider background. Really existing common sense is a jumble of many different things, ranging from transient and culture-dependent prejudices to universally accepted truths as trivial as: an A is an A. Common sense includes a massive storehouse of factual knowledge about colors and sounds, about time and space, about what foods are edible and what animals are dangerous. How, then, can we bring some necessary order into this jumble? Recall, first of all, that in developing our theory of the common-sense world we shall not confine ourselves to the resources available at the level of common sense itself. Rather, we shall use in our theorizing the most sophisticated instruments available, paying attention to common-sense beliefs and reasoning processes only insofar as these help us to determine the nature and limits of that common-sense reality which is the proper object of our investigations. Anthropology The first clue to the nature of such pruning is provided by anthropology, and we shall in fact begin our pruning by restricting our attentions, as best we can, exclusively to those beliefs which are common to all cultures and societies. Each culture has of course its own culture-specific common-sense beliefs pertaining to external reality.
Anthropologists have, however, established that there is a non-trivial core of such beliefs which is, modulo variations in emphasis and calibration referred to above, common to all societies. Such beliefs belong to what the anthropologist Robin Horton calls `primary' theory, as contrasted with the `secondary' theories of a religious, mythical or scientific nature which pertain to what lies beyond or behind the world that is immediately given in perception and action. As Horton puts it: Primary theory really does not differ very much from community to community or from culture to culture. A particular version of it may be greatly developed in its coverage of one area of experience, and rather undeveloped in its coverage of another. … These differences notwithstanding, however, the overall framework remains the same. In this respect, it provides the cross-cultural voyager with his intellectual bridgehead. Primary theory gives the world a foreground filled with middle-sized (say between a hundred times as large and a hundred times as small as human beings), enduring, solid objects. These objects are interrelated, indeed inter-defined, in terms of a `push-pull' conception of causality, in which spatial and temporal contiguity are seen as crucial to the transmission of change. They are related spatially in terms of five dichotomies:
`left'/`right'; `above'/`below'; `in-front-of'/`behind'; `inside'/`outside'; `contiguous'/`separate'. And temporally in terms of one trichotomy: `before'/`at the same time'/`after'. Finally, primary theory makes two major distinctions amongst its objects: first, that between human beings and other objects; and second, among human beings, that between self and others. In the case of secondary theory, differences of emphasis and degree give place to startling differences in kind as between community and community, culture and culture. For example, the Western anthropologist brought up with a purely mechanistic view of the world may find the spiritualistic world-view of an African community alien in the extreme. (Horton 1982, p. 228) Developmental Psychology The idea that there is a non-trivial, true theory of reality that is common to all people has a long history, making itself felt already in connection with doctrines on the natural law in Ulpian and Aquinas.(23) The idea is often formulated in developmental terms, as for example in the work of the French Jesuit Claude Buffier who, in arguing against what he saw as the threat of skepticism initiated by the Cartesian philosophy, presents a view of common sense as a matter of certain dispositions given by nature to all men or, manifestly, to the great majority, so that when they have attained the use of reason they may pass common and uniform judgment concerning various objects of private opinion individually perceived. This judgment is not the consequence of any prior principle. (24) Or as Thomas Reid would have it: common-sense beliefs, in being shared by all of us past the stage of infancy, are such that they form an inevitable presupposition to our interchange with others. These beliefs are taken so much for granted that whatever is contrary to them appears not false but absurd. Dr. Benjamin Spock and his prognosis of kindness won the day for a new generation.
Cultural Relativism What of arguments to the effect that the common-sense world is a cultural artifact of Western science, a framework for organizing data which is presupposed by Western linguists, anthropologists and others and which is as it were foisted on the data obtained in the investigation of alien cultures in such a way as to make it appear that the common-sense world thereby `discovered' amounted to a cultural invariant? Views of this sort seem to be implied by the work of philosophers such as Quine, whose thesis of the inscrutability of reference might be held to support the claim that we can never know the ontology of an alien interlocutor since we can never enjoy data that is free of our own ontological imputations. As Kelley argues, however, Quine's very formulation of his own thesis is inconsistent, since it uses terms such as `language', `reference' etc. as if the referents of these terms were themselves 'scrutable'. Linguistic idealism in general is, as Kelley points out, incompatible with the knowledge used to explain and defend it. Linguistic idealists rely on a theory of human beings as objects of scientific knowledge, including theories of how language is learned and social practices inculcated, to explain in what sense the objects of knowledge and the truth of propositions depend on our conceptual scheme, and to justify their claim that they do so depend (Kelley 1986, p. 193).
Towards a Theory of the Common-Sense World Primary theory is a matter of beliefs relating to the objects of direct perceptions. This means: 1. Perceptions which do not involve the interpolation of any theory or interpretation, perceptions which are integrated directly (physiologically), rather than via some conceptually mediated process of deduction or inference. And it means 2. Perceptions which are typical or generic, in the sense that they do not involve special instruments or apparatus or special circumstances - as contrasted e.g. with perceptual experiences in the cinema or in the psychology laboratory or under special chemical influence. Such special cases are not significant from the point of view of the specification and delineation of the common-sense world (a state of affairs parallel, in some ways, to that which obtains in the field of research into linguistic universals). For the common-sense world is delineated by our beliefs about what happens in 'mesoscopic' reality in most cases and most of the time. It is oriented, in other words, about the focal instances of the phenomena of the everyday world, rather than about non-standard or deviant phenomena. "Where is the wisdom we have lost in knowledge? Where is the knowledge that we have lost in information?" T.S. Eliot, The Rock (1934) pt. 1 Thank you, W. Cooper Mathematics does play, however, also a more sovereign role in physics. This was already implied in the statement, made when discussing the role of applied mathematics, that the laws of nature must have been formulated in the language of mathematics to be an object for the use of applied mathematics. The statement that the laws of nature are written in the language of mathematics was properly made three hundred years ago; [8: It is attributed to Galileo] it is now more true than ever before.
In order to show the importance which mathematical concepts possess in the formulation of the laws of physics, let us recall, as an example, the axioms of quantum mechanics as formulated, explicitly, by the great physicist, Dirac. There are two basic concepts in quantum mechanics: states and observables. The states are vectors in Hilbert space, the observables self-adjoint operators on these vectors. The possible values of the observations are the characteristic values of the operators - but we had better stop here lest we engage in a listing of the mathematical concepts developed in the theory of linear operators. A much more difficult and confusing situation would arise if we could, some day, establish a theory of the phenomena of consciousness, or of biology, which would be as coherent and convincing as our present theories of the inanimate world. Mendel's laws of inheritance and the subsequent work on genes may well form the beginning of such a theory as far as biology is concerned. Furthermore, it is quite possible that an abstract argument can be found which shows that there is a conflict between such a theory and the accepted principles of physics. The argument could be of such abstract nature that it might not be possible to resolve the conflict, in favor of one or of the other theory, by an experiment. Such a situation would put a heavy strain on our faith in our theories and on our belief in the reality of the concepts which we form. It would give us a deep sense of frustration in our search for what I called "the ultimate truth." The reason that such a situation is conceivable is that, fundamentally, we do not know why our theories work so well.
Hence, their accuracy may not prove their truth and consistency. Indeed, it is this writer's belief that something rather akin to the situation which was described above exists if the present laws of heredity and of physics are confronted. Let me end on a more cheerful note. The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve. We should be grateful for it and hope that it will remain valid in future research and that it will extend, for better or for worse, to our pleasure, even though perhaps also to our bafflement, to wide branches of learning. Mathematics, rightly viewed, possesses not only truth, but supreme beauty - a beauty cold and austere, like that of sculpture, without appeal to any part of our weaker nature, without the gorgeous trappings of painting or music, yet sublimely pure, and capable of a stern perfection such as only the greatest art can show. The true spirit of delight, the exaltation, the sense of being more than Man, which is the touchstone of the highest excellence, is to be found in mathematics as surely as in poetry. - BERTRAND RUSSELL, Study of Mathematics The real problem of BRST supersymmetry - Physics requires Metaphysics Incommensurability is the real issue within this micro-macro dichotomy DIRAC STRUCTURES ON HILBERT SPACES A. PARSIAN and A. SHAFEI DEH ABAD (Received 5 May 1997 and in revised form 4 August 1997) Keywords and phrases. Dirac structure, maximally isotropic, Hilbert space, $L$-admissibility.
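The Dirac axioms quoted above - states as vectors in Hilbert space, observables as self-adjoint operators, measured values as the operators' characteristic values (eigenvalues) - can be illustrated numerically. This is my own sketch using the Pauli sigma-x observable as an example; it is not from Wigner's essay:

```python
import numpy as np

# Observables in quantum mechanics are self-adjoint (Hermitian) operators;
# the possible measured values are their characteristic values (eigenvalues).
# Here the observable is the 2x2 Pauli sigma_x matrix.

sigma_x = np.array([[0, 1],
                    [1, 0]], dtype=complex)

# Self-adjointness: the operator equals its own conjugate transpose.
assert np.allclose(sigma_x, sigma_x.conj().T)

# The characteristic values: the two possible measurement outcomes, -1 and +1.
eigenvalues, eigenvectors = np.linalg.eigh(sigma_x)
print(eigenvalues)

# The expectation value of the observable in a normalized state vector |psi>.
psi = np.array([1, 0], dtype=complex)            # a basis state
expectation = np.real(psi.conj() @ sigma_x @ psi)
print(expectation)                               # 0 for this state
```

The eigenvalues are real precisely because the operator is self-adjoint, which is why such operators can represent measurable quantities at all.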
The Becchi-Rouet-Stora-Tyutin (BRST) quantization method Richard Feynman's book Lectures on Gravitation (1962-63 lectures at Caltech), Addison-Wesley 1995, contains a section on Quantum Gravity by Brian Hatfield, who says: "... Feynman ... felt ... that ... the fact that a massless spin-2 field can be interpreted as a metric was simply a 'coincidence' ... In order to produce a static force and not just scattering, the emission or absorption of a single graviton by either particle [of a pair of particles] must leave both particles in the same internal state ... Therefore the graviton must have integer spin. ... when the exchange particle carries odd integer spin, like charges repel and opposite charges attract ... when the exchanged particle carries even integer spin, the potential is universally attractive ... If we assume that the exchanged particle is spin 0, then we lose the coupling of gravity to the spin-1 photon ... the graviton is massless because gravity is a long-ranged force and it is spin 2 in order to be able to
couple the energy content of matter with universal attraction ... Hence, the gravitational field is represented by a rank 2 tensor field ... the antisymmetric part behaves like a couple of spin-1 fields ... and therefore should be dumped. This leaves a symmetric tensor field ... the higher spin possibilities are neglected ..." Feynman's book also contains a Foreword by John Preskill and Kip S. Thorne in which they say: "... Feynman's investigations of quantum gravity eventually led him to a seminal discovery ... that a 'ghost' field must be introduced into the covariant quantized theory to maintain unitarity at one-loop order of perturbation theory ... it was eventually DeWitt ... and also Faddeev and Popov ... who worked out how to generalize the covariant quantization of Yang-Mills theory and gravitation to arbitrary loop order ...". Comments about Higgs fields, noncommutative geometry and the standard model Abstract: G. Cammarata, R. Coquereaux We make a short review of the formalism that describes Higgs and Yang-Mills fields as two particular cases of an appropriate generalization of the notion of connection. We also comment about the several variants of this formalism, their interest, the relations with noncommutative geometry, the existence (or lack of existence) of phenomenological predictions, the relation with Lie super-algebras etc. 'Spontaneous symmetry-breaking' is physics for 'phase change' - ice to water :: solid to liquid The Higgs mechanism requires "spontaneous symmetry breaking" of a scalar field potential whose minima are not zero, but which form a 3-sphere SU(2). In particular, one real component of the complex Higgs scalar doublet is set to v / sqrt(2), where v is the modulus of the 3-sphere of minima, usually called the vacuum expectation value. A good description of BRST cohomology is in the paper of Garcia-Compean, Lopez-Romero, Rodriguez-Segura, and Socolovsky.
As they state, the only force for which a renormalizable quantum theory is not well known is gravity. Do physics and its cosmology offer a plausible description of creation? As cosmologists and physicists push the boundary of our understanding of the universe ever closer to its beginning, one has to wonder whether the creation event itself is explainable by physics as we know it, or can ever know it. Though such a program still seems quite fantastic, not so long ago it seemed utterly unthinkable. A few theoretical physicists have started to work on the problem. One approach, still highly speculative, is to consider our entire universe as the result of a tiny quantum fluctuation in the vacuum. Under the right circumstances such a fluctuation could expand to scales unimaginably larger than the entire observable universe. Clearly, these questions are at the heart of humankind's quest to understand our place in the cosmos. They involve some of the most fundamental unanswered questions of physical science. But why, in a time of great national needs and budget deficits, should the U.S. taxpayer support such seemingly impractical research as that described above? Cosmology based solely upon "modern" physics and its quantum field debates is like the Tertullian theories of the medieval monastics ... egocentric. It is intuitive and compelling to theorize about cosmological origins and universal principles; even precocious - yet the magnitude, momentum and sheer unwieldy nature of such vastness accounts theology as greater than philosophy.
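The spontaneous symmetry breaking described above - a scalar potential whose minima are not at zero, with one field component set to v / sqrt(2) - can be sketched numerically for the textbook "Mexican hat" potential. The parameter values below are arbitrary illustrations of mine, not from the cited abstract:

```python
import numpy as np

# A sketch of spontaneous symmetry breaking: the potential
#   V(phi) = -mu^2 |phi|^2 + lam |phi|^4
# is minimized not at phi = 0 but on a sphere of field values with
# |phi| = v / sqrt(2), where v = mu / sqrt(lam) is the vacuum
# expectation value. We verify this by brute-force minimization.

mu2, lam = 1.0, 0.5          # illustrative parameters; mu2 stands for mu^2 > 0

def V(r):
    """Potential as a function of the field modulus r = |phi|."""
    return -mu2 * r**2 + lam * r**4

r = np.linspace(0.0, 3.0, 300001)
r_min = r[np.argmin(V(r))]                  # numerical location of the minimum
v = np.sqrt(mu2 / lam)                      # analytic vacuum expectation value

print(r_min, v / np.sqrt(2))                # the two agree: |phi| = v/sqrt(2)
```

Setting dV/dr = 0 gives r^2 = mu^2 / (2*lam), i.e. exactly v / sqrt(2); the numerical minimum matches the analytic one, and the nonzero minimum is the "phase change" the text alludes to.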
Insight into "nonmonotonic deductive object oriented database languages" and the noncommutative geometry of Dirac-Hilbert spaces - these are abstracts of scientific papers: ON THE QUANTUM BRST STRUCTURE OF CLASSICAL MECHANICS Dirac-Bergmann Observables for Tetrad Gravity Luca Lusanna Sezione INFN di Firenze, Largo E. Fermi 2, 50125 Firenze, Italy lusanna@fi.infn.it Abstract. The electromagnetic, weak, strong and gravitational interactions are described by singular Lagrangians, so that their Hamiltonian formulation requires the Dirac-Bergmann theory of constraints [1,2]. The requirements of gauge and/or diffeomorphism invariance, plus manifest Lorentz covariance in the case of flat space-time, force us to work with redundant degrees of freedom. In the standard SU(3)xSU(2)xU(1) model of elementary particles in Minkowski spacetime the reduction to the physical degrees of freedom is done only at the quantum level with the BRST method. However, in this way only infinitesimal gauge transformations in the framework of local quantum field theory are considered, so that there are many open problems: the understanding of finite gauge transformations and of the associated moduli spaces, the Gribov ambiguity dependence on the choice of the function space for the fields and the gauge transformations, the confinement of quarks, the definition of relativistic bound states and how to put them among the asymptotic states, the nonlocality of charged states in quantum electrodynamics, not to speak of the foundational and practical problems posed by gravity.
While behind the gauge freedom of gauge theories proper there are Lie groups acting on some internal space so that the measurable quantities must be gauge invariant, the gauge freedom of theories invariant under diffeomorphism groups of the underlying space-time (general relativity, string theory and reparametrization-invariant systems of relativistic particles) concerns the arbitrariness for the observer in the choice of the definition of "what is space and/or time" (and relative times in the case of particles), i.e. of the definitory properties either of space-time itself or of the measuring apparatuses.
    • Have Acoustic Oscillations been detected in the Current Cosmic Microwave Background Data? and Last ARCHEOPS flight

Abstract: The angular power spectrum of the Cosmic Microwave Background has been measured out to sufficiently small angular scale to encompass a few acoustic oscillations. We use a 'phenomenological fit' to the angular power spectrum to quantify the statistical significance of these oscillations and discuss the cosmological implications of such a finding.

Creation's "Lambda Presence" – VISIBLE AND INVISIBLE

An international team of astronomers using NASA's Hubble Space Telescope has created a three-dimensional map that provides the first direct look at the large-scale distribution of dark matter in the universe. This 'quintessential' state rests upon what physicists have discerned as the 'Bose-Einstein' state. The 'quest' for absolute zero – what the 19th century considered the 'ground state' – provided the 'classical observer' with the realization that zero kelvin (0 K) is actually another 'energy state'.

Image captions: Apollo astronaut walking on the Moon; theoretical fusion 'dynamo' developed by the European Agency; Bose-Einstein state laser in 'interstitial' superfluid (2005).

This 'state' represents 'profound incommensurability'; K-theory itself recognizes this 'infinitesimal' integral as a first-order paradigm for scientific understanding. The 'Josephson junction' provided quantum mechanics with a practical application – the SQUID, diagrammed below –
    • Image captions: SQUID magnetometer; Josephson junction – tunneling; Helium-2 superfluid; condensed matter physics; medical nanotechnology; MAPK/ERK pathway; ARCHEOPS experiment – February 2002 flight.

It is written as spoken by an angel – "For with God, nothing is impossible."

Are these 'metaphysical' paradigms: 'phenomenological fit' and 'cosmological implications'? Or allegorical descriptions of concepts inspired by statistical models?

Intuition
The verb to intuit means to grasp by intuition – implying MEMORY as its basis:
• quick and ready insight
• seemingly independent of previous experiences and empirical knowledge
• immediate apprehension or cognition [perceptual insight]
• knowledge or conviction gained by intuition
• the power or faculty of attaining direct knowledge or cognition
• without evident rational thought and inference
• the perceiving of the unconscious
Intuition is by definition not the same as an opinion based on experience, but may have been formed unconsciously by previous experiences. A person who has an intuitive opinion cannot (necessarily) fully explain why he or she holds that view.
    • Intuition is an unconscious form of knowledge. It is immediate and not open to rational/analytical thought processes. It differs from instinct, which does not have the experience element. It is memory based – that is why an 'intuitive interface' is how computer programmers delineate products that work and are easily grasped (learned). It is the highest form of skill acquisition in the Dreyfus and Dreyfus model (in practice). Intuition has advantages in solving complex problems and finding new results. Intuition is one source of common sense. It can also help in induction to gain empirical knowledge. Sources of intuition are feeling, experiences and knowledge. An important intuitive method is brainstorming. Intuition does not mean finding a solution immediately; sometimes it helps to sleep on it for a night. There is an old Russian maxim: "The morning is wiser than the evening" ("Утро вечера мудренее"). Intuition plays a key role in Romanticism. A situation which is or appears to be true but violates our intuition is called a paradox (a paradox can also be a logical self-contradiction). An example of this is the Birthday paradox. In the philosophy of Immanuel Kant, intuition is one of the basic cognitive faculties, equivalent to what might loosely be called perception. Kant held that our mind casts all of our external intuitions in the form of space, and all of our internal intuitions (memory, thought) in the form of time. Intuitionism is a position in the philosophy of mathematics derived from Kant's claim that all mathematical knowledge is knowledge of the pure forms of the intuition. Intuitionist logics are a class of logics, devised and advanced by Arend Heyting and Luitzen Egbertus Jan Brouwer and more recently by Michael Dummett, to accommodate intuitionism about mathematics (as well as anti-realism more generally).
These logics are characterized by rejecting the law of excluded middle: as a consequence they do not in general accept rules such as disjunctive syllogism and 'reductio ad absurdum'. Intuitionism is a form of constructivism. Intuition (MBTI) is one of the four axes of the Myers-Briggs Type Indicator.

Controversy
From Wikipedia, the free encyclopedia.
A controversy is a contentious dispute, a disagreement in opinions over which parties are actively arguing. Controversies can range from private disputes between two individuals to large-scale social upheavals. Controversies in mathematics and the sciences are generally eventually solved. It is the nature of controversies in the humanities that they cannot generally be conclusively settled and may be accompanied by the disruption of peace and even quarreling. In some cases, this may
    • be because the two sides to a dispute differ so much in their "givens" that in effect they are not having the same argument. In other cases, culture moves on, and the subject of the controversy becomes quaint in retrospect and increasingly irrelevant. Present-day areas of controversy include religion, politics, war, property, social class, and taxes. Controversy in matters of theology has traditionally been particularly heated, giving rise to 'odium theologicum'.

Counter-intelligence
From Wikipedia
Counterintelligence or counter-espionage is the act of seeking and identifying espionage activities, or using it as a 'diversion' to 'false flag' attention from another covert activity. Major nations have organizations which perform this role. Methods include surveillance of suspects and their communications, undercover agents, monitoring the behavior of legally accredited 'diplomatic personnel' (some of whom are sometimes actually spies or spy handlers), and similar means. Treachery is its mainstay! When spying is discovered, the agencies usually have arrest power, but it is often more productive to keep a careful eye on the spies to see what they know, where they go, and who they talk to. Furthermore, disinformation can be used to fool the spies and their sponsors, or make them cease their activities if they learn their information has become unreliable and/or their secrecy has been compromised. Intelligence and counter-intelligence activities occur not only between governments but between industries as well as criminal groups.

In law as jurisprudence, a controversy differs from a case, which includes all suits criminal as well as civil; a controversy is a purely civil proceeding. In the Constitution of the United States, the judicial power shall extend to controversies to which the United States shall be a party (Article III, Section 2).
The meaning to be attached to the word controversy in the constitution is that given above.

In propaganda
The term is not always used in a purely descriptive way. The use of the word tends itself to create controversy where none may have authentically existed, acting as a self-fulfilling prophecy. Propagandists, therefore, may employ it as a "tar-brush," pejoratively, and thus create a perceived atmosphere of controversy, discrediting the subject.

In advertising
On the other hand, controversy is also used in advertising to try to draw attention to a product or idea by labeling it as controversial, even if the idea has become widely accepted by a given segment of the population. This strategy has been known to be especially successful in promoting books and films.

In early Christianity
Many of the early Christian writers, among them Irenaeus, Athanasius, and Jerome, were famed as "controversialists"; they wrote works against perceived heresy or heretical individuals, works whose titles begin "Adversus...", such as Irenaeus' Adversus haereses. The Christian writers inherited from the classical rhetors the conviction that controversial confrontations, even over trivial matters, were a demonstration of intellectual superiority.
    • Oriental Viewpoint on Existential Phenomenology

Nietzsche once called Christ "a worthy opponent"; Watsuji Tetsurô might have said the same of Martin Heidegger. Watsuji wrote his Fûdo (Climates) as a direct response to Being and Time, to emphasize the spatial element of human existence which he felt was lacking in Heidegger's account. His magnum opus, the Rinrigaku (A Study of Ethics), was also written with Heidegger in mind: in arguing that human existence is always recognized as an existence with others, Watsuji objects to the idea that Heidegger's Dasein encounters the world before encountering social relationships. It turns out both of these objections are connected, for Watsuji defines spatiality as the dimension in which relationships take place, and those relationships in turn play a part in creating that spatiality. For Watsuji, social relatedness and spatiality define the world we live in, and he sees Heidegger's conception of world as deficient in this dimension. The question, then, is how accurate Watsuji's objections truly are. By comparing Watsuji's conception of world with Heidegger's, I will show that their views have much more in common than Watsuji suspects, and that, while some of his criticisms are legitimate, the two philosophies are not worlds apart.

Phenomenological Ontology and Consciousness - http://www.geocities.com/sartresite/sartre_theses1.html

Early Sartrean philosophy is one of a pursuit of being. It is an attempt to grasp being through an investigation of the way being presents to consciousness - phenomenological ontology. Phenomenological ontology refers to the study of being through its appearances. This simplistic definition needs further clarification. First, by phenomenon Sartre refers to the totality of appearances of a thing and not simply a particular appearance. As Wilfrid Desan writes, phenomenology is "a method which wants to describe all that manifests itself as it manifests itself." Moreover, H.J.
Blackham observes that in Sartre, "the objects of consciousness, the phenomena, the appearances of things, disclose what is really there as it really is, though never exhaustively." This position is better understood when we come to discuss Sartre's twofold division of being. There we shall find that the in-itself presented to the reflective consciousness in phenomena is a totalized being, being in its plenitude. Blackham continues: Consciousness implies and refers to an existence other than its own and to its own existence as a question. It is this relation of the pour-soi to the en-soi which is the foundation (and the only condition) of knowledge and action. Knowledge is necessarily intuition, the presence of consciousness to the object which it is not. This is the original condition of all experience. Before the object is defined and interpreted, consciousness constitutes itself by separating itself from it. Second, most commentators claim that Sartre preferred ontology to metaphysics because the former, as the study of being as being, presupposes the traditional claim of the precedence of essence over existence and the existence of human nature. Moreover, although ontology denotes the study of being, it "does not revive the ghosts of substance, soul, and God." Sartre claims that the basic distinction of Existentialism from other systems of thought is its claim of the precedence of existence over essence and the negation of a primordial human nature. Man first is, and then he makes his essence through the choices he makes. It would thus be inappropriate to use the term metaphysics since it jeopardizes this very distinction.
    • Consciousness is a being such that in its being, its being is in question in so far as this being implies a being other than itself. We can never have a consciousness which is stable; the basic characteristic of consciousness is its dynamicity, spontaneity, and freedom. He follows Husserl's principle of the intentionality of consciousness. Consciousness is first and foremost a consciousness of something. To say that consciousness is consciousness of something means that for consciousness there is no being outside of that precise obligation to be a revealing intuition of something - i.e. of a transcendent being. All forms of consciousness are likewise intentional. Imagination, as a form of consciousness, is intentional. One cannot just imagine; one must always be imagining something. Furthermore, even emotional consciousness is intentional. When Sartre defined emotion as a certain way of apprehending the world, it implies that emotion is a way of relating to the world. In this relation consists the intentionality of emotions. When one loves, one always loves something or somebody. Second, subjectivity is the consciousness of consciousness. Sartre says that consciousness is a being, the nature of which is to be conscious of its being. When applied to man, Sartre further claims that man is, for the reason that man thinks. Finally, is consciousness a 'nothingness'? It is nothingness in the sense that it is always not 'that thing'. It is always in the making, and to try to view it as permanent is to do injustice to its very definition. From this arises the assertion that there is no permanent entity which is the human self. Consciousness is either pre-reflexive or reflexive. Sartre sometimes uses the terms non-thetic consciousness or non-positional self-consciousness for pre-reflexive consciousness in discussing this kind of consciousness. It refers to the cogito prior to all forms of reflection.
What we have here is not knowledge, but an implicit consciousness of being consciousness of an object. The basic datum of Sartrean phenomenology is basically this kind of consciousness which is prior to all forms of reflection. On the other hand, reflective consciousness, which he sometimes terms thetic consciousness, is the consciousness of the reflecting cogito.

Image captions: Complexities of living tissue; Coastal Cleanup logo; Battle for the Mind – drugs kill.
    • Image captions: medical nanotechnology; cancer-fighting cells; Zhang – neural stem cells; optics – recognition; stem cells process model; motor neuron stem cell; pluripotential blastocysts.

EGH - http://www.sciencedaily.com/releases/2000/10/001011071804.htm

Study Reveals How Growth Factors Affect Human Stem Cells
ScienceDaily (Oct. 11, 2000) — Researchers have begun to probe the effects that growth factors have on the differentiation of human embryonic stem cells. According to the researchers, their efforts represent a step toward understanding how to direct human embryonic stem cells to become the more specialized cells that make up specific tissues such as brain and muscle. In a research article published in the October 10, 2000, issue of Proceedings of the National Academy of Sciences, research teams led by Howard Hughes Medical Institute investigator Douglas A. Melton and Hebrew University geneticist Nissim Benvenisty report that they applied eight growth factors to cultured human embryonic stem cells to observe their effects on cell differentiation. Human embryonic stem cells are undifferentiated cells that can develop into any of the specialized cell types found in the human body. Their developmental fate is influenced by the activity of a number of cellular signals, including growth factors. "Until now, no one had reported extensive and systematic studies on human embryonic stem cells," said Melton, who is at Harvard University.
    • Other research teams had performed similar studies on mouse embryonic stem cells, but Melton, Benvenisty and their colleagues saw the need to do a more comprehensive, systematic analysis of the effects of growth factors on the differentiation of human embryonic stem cells. By applying each of the eight growth factors to the cultured stem cells, the researchers were able to follow the developmental path that the cells chose while under the influence of a specific growth factor. The studies showed that each of the growth factors elicited subtle differences in effect. While none of the growth factors unequivocally directed differentiation toward a specific cell lineage, the studies hint that a combination and timing of growth factors might achieve such an end. Melton emphasized that the choice of growth factors was a practical one and by no means represents the broad spectrum of growth factors that might govern stem cell differentiation. "While we have demonstrated the potential for directing the differentiation of these cells for use in cell replacement therapy, even in the best case, this represents only an initial step forward," he said. "Besides choosing those growth factors that were available, we chose those for which we could detect receptors on the surface of the cells. There was no sense in adding a growth factor if the cells didn't express a receptor for that growth factor." The growth factors directed the stem cells to differentiate into three different categories—endodermal, ectodermal and mesodermal. Endodermal cells give rise to the liver and pancreas; ectodermal cells become brain, skin and adrenal tissues; and mesodermal cells become muscle. Furthermore, the researchers found that they could categorize the growth factors based on their effects on differentiation. One group of growth factors appeared to inhibit endodermal and ectodermal cells, but allowed differentiation into mesodermal cells.
A second group induced differentiation into ectodermal and mesodermal cells, and a third group allowed differentiation into all three embryonic lineages. "When an egg cell divides, it doesn't immediately tell its daughter cells to become nerve, brain or pancreatic cells," said Melton. "Rather, it first parses cells into the three general territories (germ layers)—ectoderm, mesoderm and endoderm. And, our studies showed that the growth factors encourage cells to develop into more of one germ layer and less of the other two." "In the best of all possible worlds, one would like to find growth factors that could be added to a human embryonic stem cell to make it become a cardiomyocyte to replace defective heart muscle or a pancreatic beta cell for transplantation into diabetics," said Melton. "But these studies strongly suggest that finding such a factor will be exceedingly unlikely." Also, said Melton, the finding that most of the growth factors inhibit differentiation of specific cell types suggests that use of growth factor inhibitors might prove as important as inducers in directing stem cell differentiation. Ultimately, he said, controlling stem cell differentiation will likely involve a strategy that employs multiple growth factors in a certain order and at certain times. "It may be a bit like educating a child, in which you don't designate children in kindergarten as doctors, lawyers or surgeons, but you give them some kind of general education. And, as they progress and show an interest in a specific field, you give them a more specialized education." Adapted from materials provided by Howard Hughes Medical Institute.
    • Howard Hughes Medical Institute (2000, October 11). Study Reveals How Growth Factors Affect Human Stem Cells.

Image captions: purpose-designed nanomaterial – nanomedicine; embryonic mice stem cells.

In a dramatic demonstration of what nanotechnology might achieve in regenerative medicine, paralyzed lab mice with spinal cord injuries have regained the ability to use their hind legs six weeks after a simple injection of a purpose-designed nanomaterial.

Photos taken by a scanning electron microscope of silicon nanowires before (left) and after (right) absorbing lithium. Both photos were taken at the same magnification. The work is described in "High-performance lithium battery anodes using silicon nanowires," published online Dec. 16 in Nature Nanotechnology.
    • Image captions: nanoplumbing – desalination technology; Tesla Motors – the 'roadster' being tested; entanglement detection and fractional quantum Hall effect in optical lattices; quantum information processing; dynamic model of the human heart; quantum 'anyons' and the fractal Hall effect – courtesy Microsoft.
    • Quantum Hall effect
From Wikipedia, the free encyclopedia
The quantum Hall effect (or integer quantum Hall effect) is a quantum-mechanical version of the Hall effect, observed in two-dimensional electron systems subjected to low temperatures and strong magnetic fields, in which the Hall conductivity σ takes on the quantized values

    σ = ν e² / h,

where e is the elementary charge and h is Planck's constant. The prefactor ν is known as the "filling factor", and can take on either integer (ν = 1, 2, 3, etc.) or rational fraction (ν = 1/3, 1/5, 5/2, 12/5 etc.) values. The quantum Hall effect is referred to as the integer or fractional quantum Hall effect depending on whether ν is an integer or fraction respectively. The integer quantum Hall effect is very well understood, and can be simply explained in terms of single-particle orbitals of an electron in a magnetic field (see Landau quantization). The fractional quantum Hall effect, however, is more complicated, and its existence relies fundamentally on electron-electron interactions.
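The quantized values σ = ν e²/h can be tabulated directly. A minimal sketch in Python; the function name and the sampled filling factors are illustrative choices, not from the source:

```python
from fractions import Fraction

E = 1.602176634e-19   # elementary charge e, in coulombs (exact SI value)
H = 6.62607015e-34    # Planck's constant h, in joule-seconds (exact SI value)

def hall_conductivity(nu):
    """Quantized Hall conductivity sigma = nu * e^2 / h, in siemens."""
    return float(nu) * E * E / H

# Integer and fractional filling factors mentioned in the excerpt above.
for nu in (1, 2, 3, Fraction(1, 3), Fraction(5, 2)):
    print(f"nu = {nu!s:>4}: sigma = {hall_conductivity(nu):.4e} S")
```

Each unit step in ν adds one conductance quantum e²/h ≈ 3.87 × 10⁻⁵ S (the inverse of the von Klitzing constant) to the Hall conductivity.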
    • Fractal Geometry – courtesy of D. Hofstadter

Hofstadter's butterfly
Energy levels and wave functions of Bloch electrons in rational and irrational magnetic fields
Douglas R. Hofstadter - Physics Department, University of Oregon, Eugene, Oregon 97403
Received 9 February 1976. Original paper: Phys. Rev. B 14, 2239–2249 (1976)

Abstract. An effective single-band Hamiltonian representing a crystal electron in a uniform magnetic field is constructed from the tight-binding form of a Bloch band by replacing ℏk by the operator p − eA/c. The resultant Schrödinger equation becomes a finite-difference equation whose eigenvalues can be computed by a matrix method. The magnetic flux which passes through a lattice cell, divided by a flux quantum, yields a dimensionless parameter whose rationality or irrationality highly influences the nature of the computed spectrum. The graph of the spectrum over a wide range of "rational" fields is plotted. A recursive structure is discovered in the graph, which enables a number of theorems to be proven, bearing particularly on the question of continuity. The recursive structure is not unlike that predicted by Azbel', using a continued fraction for the dimensionless parameter. An iterative algorithm for deriving the clustering pattern of the magnetic sub-bands is given, which follows from the recursive structure. From this algorithm, the nature of the spectrum at an "irrational" field can be deduced; it is seen to be an uncountable but measure-zero set of points (a Cantor set). Despite these features, it is shown that the graph is continuous as the magnetic field varies. It is also shown how a spectrum with simplified properties can be derived from the rigorously derived spectrum by introducing a spread in the field values. This spectrum satisfies all the intuitively desirable properties of a spectrum.
The spectrum here presented is shown to agree with that predicted by A. Rauh in a completely different model for crystal electrons in a magnetic field. A new type of magnetic "superlattice" is introduced, constructed so that its unit cell intercepts precisely one quantum of flux. It is shown that this cell represents the periodicity of solutions of the difference equation. It is also shown how this superlattice allows the determination of the wave function at non-lattice sites. Evidence is offered that the wave functions belonging to irrational fields are everywhere defined and are continuous in this model, whereas those belonging to rational fields are only defined on a discrete set of points. A method for investigating these predictions experimentally is sketched. ©1976 The American Physical Society. URL: http://link.aps.org/abstract/PRB/v14/p2239 DOI: 10.1103/PhysRevB.14.2239
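The finite-difference equation of the abstract (Harper's equation) can be diagonalized directly for a rational flux p/q, where the Bloch band splits into q magnetic sub-bands. A minimal sketch under simplifying assumptions: the function names and chosen flux are illustrative, and the Bloch phase across the magnetic unit cell is set to zero for brevity, so this samples the sub-bands rather than tracing their full widths:

```python
import numpy as np

def harper_matrix(p, q, ky=0.0):
    """q x q Harper matrix for rational flux p/q (flux quanta per lattice cell).

    Diagonal terms 2*cos(2*pi*(p/q)*n + ky) come from motion along one axis;
    unit hopping connects neighbouring sites along the other, with periodic
    wrap-around over the magnetic unit cell of q sites.
    """
    phi = p / q
    h = np.zeros((q, q))
    for n in range(q):
        h[n, n] = 2.0 * np.cos(2.0 * np.pi * phi * n + ky)
    for n in range(q - 1):
        h[n, n + 1] = h[n + 1, n] = 1.0
    h[0, q - 1] += 1.0   # periodic boundary of the magnetic unit cell
    h[q - 1, 0] += 1.0
    return h

# Flux 1/5: five sub-band energies, all within the tight-binding
# bandwidth [-4, 4] that bounds the whole butterfly.
energies = np.linalg.eigvalsh(harper_matrix(1, 5))
print(np.round(energies, 3))
```

Sweeping p/q over many rationals and plotting the eigenvalues against the flux reproduces the recursive "butterfly" graph described in the abstract.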
    • Lattice QCD
From Wikipedia, the free encyclopedia
In physics, lattice quantum chromodynamics (lattice QCD) is a theory of quarks and gluons formulated on a space-time lattice. That is, it is a lattice model of quantum chromodynamics, a special case of a lattice gauge theory or lattice field theory. At the moment, this is the best-established non-perturbative approach to solving the theory of quantum chromodynamics. Analytic or perturbative solutions in QCD are hard or impossible due to the highly nonlinear nature of the strong force. The formulation of QCD on a discrete rather than continuous space-time naturally introduces a momentum cutoff of order 1/a, which regularizes the theory. As a result, lattice QCD is mathematically well-defined. Most importantly, lattice QCD provides the framework for investigation of non-perturbative phenomena such as confinement and quark-gluon plasma formation, which are intractable by means of analytic field theories.

Condensed Matter Theory Group – Technical Aspects are Multiplying
I will place your 'transgressions' as far away as the east is from the west!

Composite fermions are a new class of particles discovered in condensed matter physics. A composite fermion is the bound state of an electron and an even number of quantized vortices (often thought of as an electron carrying an even number of magnetic flux quanta).
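The 1/a momentum cutoff mentioned in the lattice QCD excerpt is simply the edge of the lattice Brillouin zone. A minimal one-dimensional sketch; the function name, site count and spacing are illustrative assumptions, not from the source:

```python
import numpy as np

def lattice_momenta(n_sites, a):
    """Allowed momenta on a periodic 1-D lattice: k_n = 2*pi*n / (N*a).

    The N modes fill the first Brillouin zone (-pi/a, pi/a]; no momentum
    larger than pi/a exists, which is what regularizes the theory.
    """
    n = np.arange(-(n_sites // 2) + 1, n_sites // 2 + 1)
    return 2.0 * np.pi * n / (n_sites * a)

a = 0.1                            # lattice spacing (arbitrary units)
k = lattice_momenta(8, a)
print(len(k), k.max(), np.pi / a)  # 8 modes; the largest equals the cutoff pi/a
```

Shrinking a pushes the cutoff π/a toward infinity, which is the continuum limit the excerpt alludes to.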
    • In lattice QCD spacetime is represented not as continuous but as a crystalline lattice, vertices connected by lines. Quarks may reside only on vertices and gluons can only travel along lines. While this is understood to be a fiction, the hope is that as the spacing between vertices is reduced to zero, or to the Planck length, the theory will yield meaningful results.

Ontogeny and its Ontologies
Molecular Biology and its Ontogeny – Ribosome

Celebrating the Three Domain Hypothesis - press release from the University of Illinois at Urbana-Champaign (USA)
Thirty years ago this month, researchers at the University of Illinois published a discovery that challenged basic assumptions about the broadest classifications of life. Their discovery – which was based on an analysis of ribosomal RNA, an ancient molecule essential to the replication of all cells – opened up a new field of study, and established a first draft of the evolutionary "tree of life."

If there was a set of universal ethical principles that applied to all cultures, philosophies, faiths and professions, it would provide an invaluable framework for dialogue. – Larry Colero, Crossroads Programs Inc.

What is capable of bridging the gaps within the rational-irrational paradigms of psychology, while developing the relationship between commutative logic and its emotional intelligence theory? The answer is very important for the cause of gaining a perspective of the millennia that is neither trivial nor excessively ethnocentric. Ethical virtues are esteemed as 'common good' – their importance is emphasized in consideration of the age-old paradox of philosophy – how to understand the adamant principles of reason in their relationship with 'the soul'.
The question of virtue is dependent upon effective use of human potentialities: empathy for rational discernment [as emotional intelligence] and an intuitive grasp of reality – making a virtuous person someone whom others can trust, looking to them for insight and leadership.
    • The purpose of discernment is to actually comprehend what is going on, hopefully for the cause of living a meaningful life. Discernment of the Spirit is a gift/talent that provides spiritual insight - even into history itself; which from the viewpoint of the millennia is a very useful talent to have. Discernment of the spirit and what is generally referred to as 'common sense' are very closely related and, at important moments, the same. Some will deny this spiritual aspect of existence, promoting instead its mentality as being 'purely physical'. Be that as it may, the difference [spiritual - physical] is sublime. And the truth of spiritual reality can be understood as GOOD in its 'true way', being divided from that which denies this 'good' and promotes violence. One aspect of the millennia which promotes spiritual discernment is worthy of both theological and philosophical consideration: the authentic Judeo-Christian tradition and its relevance to the continuum of human existence - especially its 'just' relations with other traditions of spiritual values. This millennial 'period' – i.e. 1992-2008 – has a panoramic aspect: Jerusalem 3000 and the Millennial Jubilees of A.D. 2000 in relationship to a global celebration of history which is amazing to consider. These 'contingencies' were and are being enjoyed by a global audience [in spite of the world's enmity] thanks to the discreteness and privacy of the global internet and its World Wide Web. Are these developments merely coincidence, or is this 'momentous' development a product of the desire for kinship and peace, usurping a military-industrial edifice which would use this incredible power for selfishness and greed? Consider the quality of light that fiber optic networks are capable of providing; it may be appreciated as a remarkable achievement of 'momentous' importance.
Disaster relief efforts and environmental monitoring efforts have greatly benefited - helping to save lives, relieve suffering and even forecast large-scale geological disasters. An utterly amazing variety of scientific achievements and discoveries have also benefited from collaborative sharing and communications - helping to better organize what 'may become of them'. In terms of the millennia this can be understood as 'prophetic reality': being shown a vision of things which 'will be', people can prepare themselves for what must be done to counteract great calamities and work [being prepared] to help with what can be done. It cannot be emphasized strongly enough – the difference between 'prophetic reality' and the 'prophetic vision' which the Holy Scriptures attain. Indeed the vision of mankind which Biblical Prophecy presents has the purpose of 'salvation', and that is to impart to the reader (or listener) both God's love and the Almighty's Omnipresence! The relevance of the spiritual to the physical is being demonstrated for all who want to understand. In fact, technologic developments are making the IMMENSE Universe of astronomic observation that 'space' exploration is revealing even more compelling than 'starry nights' already do! Two of these:
• The Bose/Einstein Effect [matter as 'entropic energy' waves]
• The Fractal Hall Effect [information as 'geometric' - transactional]
are being demonstrated (and developed) during this period. Both 'undergird' the correspondence
principle which quantum physics needs to understand for the communication of 'change'. Energy efficiency will greatly increase because of these and associated technologic advancements as 'quantum computing' becomes a reality. Both were also first 'theoretical' insights, 'science fiction'.

Theoretical Tachyon Synchrotron – Proton Therapy

Sriyan Mantra

One insight concerning the investigation and development of teleportation and 'quantum entanglement': physicist Richard Feynman is quoted as having said that "if you think you understand quantum mechanics, you don't understand quantum mechanics." Or sometimes he is cited thus: "I think I can safely say that nobody understands quantum mechanics."

 Zero Point Energy (ZPE) Quantum Vacuum Fluctuations

Our world may be a giant hologram
15 January 2009, by Marcus Chown, New Scientist, issue 2691

Scientists worldwide have been looking for gravitational waves: ripples in space-time thrown off by superdense astronomical objects such as neutron stars and black holes. GEO600 [a project based in Germany] has not detected any gravitational waves so far, but it might inadvertently have made the most important discovery in physics for half a century. For many months the GEO600 team members had been scratching their heads over inexplicable noise plaguing their giant detector. Then, out of the blue, a researcher approached them with an explanation; in fact, he had even predicted the noise before he knew they were detecting it. According to Craig Hogan, a physicist at the Fermilab particle physics lab in Batavia, Illinois, GEO600 has stumbled upon the fundamental limit of space-time: the point where space-time stops behaving like the smooth continuum Einstein described and instead dissolves into "grains", just as a newspaper photograph dissolves into dots as you
zoom in. "It looks like GEO600 is being buffeted by the microscopic quantum convulsions of space-time," says Hogan.
__________________________________________________________________________________________

Could our three dimensions be the ultimate cosmic illusion? A German detector is picking up a hint that we are all mere projections. (Image: Ledomira/Stock.xchng)

Here are some very recent (summer of 2008) deep-space images produced by the Hubble Space Telescope, published in celebration of its 100,000th orbit of our amazing planet. Here the concept of 'commensurability' and principles of incommensurability take on new meaning. Quantum information provides a unique conceptual framework and a 'long awaited' return to the original 'vision' of scientists during the early stages of 'The Golden Age of Physics', and that is the question of 'the vacuum' and its virtuality. One long-hallowed paradigm of metaphysics is 'Dichotomy', which is very useful in appreciating different viewpoints and concepts relating to the same (or similar) 'phenomenon'. A new paradigm is 're-emerging', and that is 'Trichotomy', where the virtuality of spacetime becomes metaphysical and therefore 'integral' to Reason and its Logic. The promise of Quantum Field Theory is being realized in ways which may never have been considered, such as the true Omnipresence of Light, while other aspects considered in pop culture as purely science fiction are now rapidly being demonstrated (according to their actual nature and its adamant constraints) with all diligence. The practical aspects of these things are equally amazing; a few of them are presented as pictures (under the fair-use principle): quantum computing, condensed-matter technologies, optical interferometry and holographic projection, nanotechnology and its 'fullerene' conceptuality, superseding the servo-mechanical Newtonian principles of Classical Physics.
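As an aside, the 100,000-orbit figure lends itself to a back-of-the-envelope check. The sketch below is assumption-laden: it takes Hubble's April 1990 launch date, an assumed orbital period of roughly 96.5 minutes, and an approximate milestone date in August 2008; with those inputs the estimate lands close to 100,000 orbits.

```python
# Rough sanity check of Hubble's "100,000th orbit" milestone.
# All three inputs below are assumptions for illustration, not official figures.
from datetime import date

launch = date(1990, 4, 24)        # Hubble's launch date
milestone = date(2008, 8, 11)     # approximate date of the milestone (assumed)
period_minutes = 96.5             # approximate orbital period (assumed)

elapsed_minutes = (milestone - launch).days * 24 * 60
orbits = elapsed_minutes / period_minutes
print(f"estimated orbits: {orbits:,.0f}")  # on the order of 100,000
```

A period anywhere in the plausible 95-97 minute range gives the same order of magnitude, so the published figure is consistent with simple arithmetic.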
Holographic Imaging

IASA – EuSA

LISA - European Space Agency

Fermi Lab Complex

Each proton is made of three quarks, but the individual masses of these quarks only add up to about 1% of the proton's mass. (Illustration: Forschungszentrum Julich/Seitenplan/NASA/ESA/AURA-Caltech)

Physicists have now confirmed that the apparently substantial stuff is actually no more than fluctuations in the quantum vacuum. The researchers simulated the frantic activity that goes on inside protons and neutrons. These particles provide almost all the mass of ordinary matter. Each proton (or neutron) is made of three quarks, but the individual masses of these quarks only add up to about 1% of the proton's mass. So what accounts for the rest of it? Theory says it is created by the force that binds quarks together, called the strong nuclear force. In quantum terms, the strong force is carried by a field of virtual particles called gluons, randomly popping into existence and disappearing again. The energy of these vacuum fluctuations has to be included in the total mass of the proton and neutron, but it has taken decades to work out the actual numbers. The strong force is described by the equations of quantum chromodynamics, or QCD, which are too difficult to solve in most cases. So physicists have developed a method called lattice QCD, which models smooth space and time as a grid of separate points. This pixelated approach allows the complexities of the strong force to be simulated approximately by computer.

from: "It's confirmed: Matter is merely vacuum fluctuations", 19:00 20 November 2008, by Stephen Battersby

This 'reality' is present in 'living systems' as well. Biochemistry teaches that cellular energy, the 'proton motive force', can be represented as "the sum of the electrical and chemical potentials, called the electrochemical potential, which when divided by nF gives the proton motive force."
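Both figures above can be checked with short arithmetic. The numbers in the sketch below are illustrative textbook values chosen here as assumptions, not values from the articles: rough current-quark masses for the proton's two up quarks and one down quark, and typical membrane-potential and pH values for the proton motive force (for protons, n = 1, so the electrochemical potential is divided simply by F).

```python
# Two back-of-the-envelope checks; all input values are illustrative assumptions.

# 1) Quark masses vs. proton mass: a proton is two up quarks plus one down quark,
#    and their current-quark masses are tiny next to the proton's total mass.
m_up, m_down = 2.2, 4.7   # MeV/c^2, approximate current-quark masses
m_proton = 938.3          # MeV/c^2, proton mass
quark_fraction = (2 * m_up + m_down) / m_proton
print(f"quark mass fraction: {quark_fraction:.1%}")  # about 1%, as the text states

# 2) Proton motive force in one common form:
#    delta_p = delta_psi - (2.303 * R * T / F) * delta_pH
R = 8.314            # J / (mol K), gas constant
T = 310.0            # K (about 37 C)
F = 96485.0          # C / mol, Faraday constant
delta_psi = -150.0   # mV, membrane potential (illustrative)
delta_pH = 0.5       # pH difference across the membrane (illustrative)

z_factor = 2.303 * R * T / F * 1000.0  # mV per pH unit, roughly 61.5 mV at 37 C
pmf = delta_psi - z_factor * delta_pH
print(f"proton motive force: {pmf:.0f} mV")
```

With these inputs the electrical term dominates, and the result falls near the -200 mV scale usually quoted for energized membranes, which is all this sketch is meant to show.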
Molecular Biology has never been VIEWED so comprehensively - Nanotechnology from TAINANO

Courtesy of New Scientist

Uprooting Darwin's tree

IN JULY 1837, Charles Darwin had a flash of inspiration. In his study at his house in London, he turned to a new page in his red leather notebook and wrote, "I think". Then he drew a spindly sketch of a tree.

How can Physics and its 'calculations' become amenable to spiritual principles? In the Gospel of John, Jesus the Messiah spoke these words to a 'Samaritan woman' who was drawing water from a well: "God is a spirit and those that worship him must worship him in Spirit and in Truth". This 'insight' corresponds with what philosophic tradition, both East and West, discerns as the 'Rationality of Truth' and the irrationality of deceit. The growth and development of individuals, families, communities and even worldwide cultures succeed as rational adjustments to reality, and respect for the rights of others becomes a living testament to a 'common good'. This ethical logic is reason, being demonstrated physically [as real properties] in fractal geometry on a sub-microscopic scale. This being considered, the courage of conviction is needed in order to stand up for the truth, for what is known to be right. Reason [as Logic] divides the rational from the irrational, and this 'discernment' can provide 'advocacy' to the cause of justice [as True Purpose].

Universal Ethics are advanced at the millennium, providing a 'Rational Framework' for discourse. The principles have been organized into three categories for ease of use: personal, professional, and global ethics.

PRINCIPLES OF PERSONAL ETHICS

Personal ethics might also be called morality, since they reflect general expectations of any person in any society, acting in any capacity. These are the principles we try to instill in our children, and expect of one another without needing to articulate the expectation or formalize it in any way.
Principles of Personal Ethics include:
 Concern for the well-being of others
 Respect for the autonomy of others
 Trustworthiness and honesty
 Willing compliance with the law (with the exception of civil disobedience)
 Basic justice; being fair
 Refusing to take unfair advantage
 Benevolence; doing good
 Preventing harm

PRINCIPLES OF PROFESSIONAL ETHICS

Individuals acting in a professional capacity take on an additional burden of ethical responsibility. For example, professional associations have codes of ethics that prescribe required behavior within the context of a professional practice such as medicine, law, accounting, or engineering. These written codes provide rules of conduct and standards of behavior based on the principles of Professional Ethics, which include:
 Impartiality; objectivity
 Openness; full disclosure
 Confidentiality
 Due diligence/duty of care
 Fidelity to professional responsibilities
 Avoiding potential or apparent conflict of interest

Even when not written into a code, principles of professional ethics are usually expected of people in business, employees, volunteers, elected representatives and so on.

PRINCIPLES OF GLOBAL ETHICS

Global ethics are the most controversial of the three categories, and the least understood. Open to wide interpretation as to how or whether they should be applied, these principles can sometimes generate emotional responses and heated debate. Principles of Global Ethics include:
 Global justice (as reflected in international laws)
 Society before self/social responsibility
 Environmental stewardship
 Interdependence and responsibility for the "whole"
 Reverence for place
Each of us influences the world simply by existing, and it is always wise to "think globally." An added measure of accountability is placed on globally influential enterprises such as governments and transnational corporations. (Responsibility comes with power whether we accept it or not.) One of the burdens of leadership is to influence society and world affairs in a positive way. Can a person, nation or company truly be "successful" while causing human suffering or irreparable environmental damage? A more modern and complete model of success also considers impact on humanity and the earth's ecology.

CO-EXISTENCE OF PRINCIPLES

Principles can only provide guidance. There are myriad situations that will never lend themselves to an easy formula, and the principles can only be used to trigger our conscience or guide our decisions. (As stated earlier, they are also useful for ethics education.) It is important to note that principles of personal ethics are the first checkpoint in any situation, often overriding those at the professional and global levels. For example, when judging whether a corporation has been socially responsible, we still need to consider principles of personal ethics as prerequisites. Contributions to charities and the like (doing good) may appear to be in the interests of society, but lose their significance if the corporation has not also taken responsibility to prevent or minimize the damage done by its core business operations. Similarly, trustworthiness is fundamental to professionalism, and so on. As well, there are many times when principles will collide with other principles. Let's say you are a scientist who has been coerced by a corrupt military dictatorship into designing a biological weapon. Since the project is top secret, you have a professional duty to maintain confidentiality.
But if there were an opportunity to inform United Nations observers, global and personal principles would justify divulging confidential information to protect the overall good of humanity. (Compare this to selling confidential information for personal gain.) Still, the scientist faces a tough decision, since they or their family could be harmed as a result of the whistle-blowing. This is where the principles must be viewed in the context of universality.

© Crossroads Programs, Inc. www.universalthics.com
Appendix

Ronen, Ruth: "Incommensurability and Representation"

Ontology and Ontogeny - TWO INTERPRETATIONS OF MODEL THEORY

The four basic ontologies of semantic interpretation - language modeling.

 Rhetoric and Incommensurability
 "Rationality and Judgment"
 Relativism Refuted: A Critique of Contemporary Epistemological Relativism
 KNOWLEDGE, JUSTIFICATION, AND TRUTH
 Correspondence and Justification - BonJour, Princeton dissertation
 The Correspondence Principle and Incommensurability
 "methodology of scientific research programs"
 "tradition-constituted enquiry"
 sustained attempts to overcome the assumptions of logical empiricism
 apprehension - theory-change is rational http://www.bu.edu/wcp/Papers/Scie/ScieMine.htm
 a tradition's "internal failure" is coherent [in its becoming]
 crucial questions about the epistemology and ontology of traditions
 Ontogeny - The Tree of Life: Cold Start? http://www.astrobio.net/news/article302.html
 Emmanuel Levinas, "Trace of the Face", translated by A. Lingis, Tijdschrift voor Filosofie, 1963
 Totality and Infinity
 The New Scientist - numerous citations from this excellent publication
 Otherwise than Being or Beyond Essence, Duquesne University Press
 Merci W. Cooper http://www.dartmouth.edu/~matc/MathDrama/reading/Wigner.html
 http://www.ditext.com/bonjour/bonjour0.html
 http://www.incommensurability.com/about.htm
 http://www.linguistik.uni-erlangen.de/~rrh/papers/ontologies/dublin.html
 http://www.chass.utoronto.ca/french/as-sa/ASSA-No5/RR11.htm
 http://www-out.bell-labs.com/project/classic/papers/NeoRef.html
 http://www.uow.edu.au/arts/sts/bmartin/dissent/documents/AIDS/Hooper04/evidence.html
 http://search.msn.com/results.aspx?q=Representation+Theory+%2b+communalism+&FORM=SSPW2
 http://www.non-gmoreport.com/sample_articles.php
 http://plato.stanford.edu/entries/truth-correspondence/#8
 http://www.quodlibet.net/petridis-levinas.shtml
 ftp://ftp.alainconnes.org/shahnlong.pdf
 http://noncommutativegeometry.blogspot.com/2007/07/non-standard-stuff.html
 http://plato.stanford.edu/entries/truth-correspondence/

Appendix – continued

 http://www.mel.nist.gov/psl/psl-ontology/part12/act_occ.th.html
 http://www.hiddenmeanings.com/cosmos.html
 http://tolweb.org/tree/phylogeny.html
 http://www.nai.arc.nasa.gov/news_stories/news_detail.cfm?ID=94
 http://www.linguistik.uni-erlangen.de/~rrh/papers/ontologies/dublin.html

Network Information Flow - Networking and Information Theory
Semantic Web Services Initiative
'Semiotics for Beginners'
Bioethics must be a watchdog of 'Military Research and Development'
Information Theory for Mobile Ad-Hoc Networks
Biologically-Inspired Cognitive Architectures (BICA) – 'dangerous'
http://www.aber.ac.uk/media/Documents/S4B/semiotic.html
http://www.swsi.org/

Selected bioethics readings:
 "US progressives fight for a voice in bioethics", Nature - www.nature.com/nature/journal/v437/n7061/full/437932a.html
 "Global bioethics - myth or reality?", BioMed Central - www.biomedcentral.com/1472-6939/7/10
 "Bioethics at the bench", Nature - www.nature.com/nature/journal/v440/n7088/full/4401089b.html
 "Appendix 3: Top 50 'Bioethics' Books on Amazon, US & UK" [DOC] - www.biomedcentral.com/content/supplementary/1472-6939-7-10-S3.doc
 "Cultivate bioethics!", Nature Reviews Genetics - www.nature.com/nrg/journal/v7/n8/full/nrg1928.html
 "SENS: An Engineer's Point of View on Feasibility and Bioethics" [PDF] - www.liebertonline.com/doi/pdf/10.1089/rej.2006.0522
 "Science, politics and the President's Council on Bioethics", Nature Biotechnology - www.nature.com/nbt/journal/v22/n5/full/nbt0504-509.html
 "Perspective: Longevity Research and Bioethics" [PDF] - www.liebertonline.com/doi/abs/10.1089/blr.2004.23.542
 "Bioethics: authors & referees @ npg" - www.nature.com/authors/editorial_policies/bioethics.html
 "Meetings and Events", American Society for Bioethics and Humanities [PDF] - www.liebertonline.com/doi/pdf/10.1089/jpm.2004.7.732