Despite being tempting and easy to understand, thought experiments carry a high risk of reinforcing the preconscious biases and assumptions of the researcher, student or listener. This "injection of bias" happens because, upon hearing a story, the listener preconsciously fills in the blanks and unknowns with hidden, mute assumptions, which then never reach the conscious "debate." As you may expect, this phenomenon especially affects universities in Western countries with biases about non-Western cultures.
The deconstruction of the Chinese Room
1. The deconstruction of the Chinese Room
Thought-experiments are Breeding Grounds for Bias, Prejudice, Preconception and Discrimination
Bogdan BOCȘE
Managing Partner Knosis.ai
2. About Bogdan
● Chief Executive Geek at Knosis.ai and Envisage.ai
○ AAI*: artificial and augmented intelligence
● 14+ years of experience in big data and solution architecture, startup and corporate
● Founder of DeepVISS
○ Open-source ways to make machines talk to each other about how humans learn
○ Applied Mathematics and Interdisciplinary Research in Education and Psychology of Intelligence
“To an open mind there are no closed doors.”
3. Agenda
1. The Chinese Room - an overview
2. Other thought experiments
a. The Turing Test
b. The Babel Library
3. Hidden fallacies of thought experiments:
a. The fallacy of immutability over formalism
b. The fallacy of the axiom of (forced) choice
c. The fallacy of assuming all language is formal and algorithmic
d. The unthrown exceptions and undeclared unknown unknowns
4. Paths forward & Tools for Researchers
a. Listeners as Compilers
b. Deconstruction is NOT the same as destruction
c. Avoid confusion between <Intuition> and <Knowledge>
d. The catastrophe of boundless abstractions: loss of significance
e. Similitude (tight, strict) vs. Simile (loose, permissive)
5. Our Work at Knosis.ai
4. What is the Chinese Room?
● Thought experiment proposed by John Searle in his paper "Minds, Brains, and Programs", published in Behavioral and Brain Sciences in 1980
● Assumptions:
○ Assume there is a Computer that behaves as if it understands Chinese, in writing, by interpreting a program for parsing messages and composing replies in Chinese
○ The Computer performs its task so convincingly that it comfortably passes the Turing test
● Proposal:
○ What if we were to replace the Computer that is “able” to interpret Chinese with a human person (or a team of humans) that would execute precisely the same Program (instructions) for parsing and generating messages in Chinese?
● Question:
○ Could we then say that the Chinese Room has the ability to speak Chinese?
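The "Program" in Searle's setup can be made concrete with a minimal sketch: a purely syntactic rulebook mapping incoming symbol strings to outgoing symbol strings. The rules and phrases below are hypothetical placeholders of my own, not Searle's; the point is only that whoever executes the lookup needs no understanding of the symbols.

```python
# A minimal sketch of the Chinese Room's "Program": a syntactic rulebook
# that maps input symbol strings to output symbol strings. The entries
# are illustrative placeholders, not from Searle's paper.

RULEBOOK = {
    "你好": "你好！很高兴认识你。",        # greeting -> greeting reply
    "你会说中文吗？": "当然，我说得很流利。",  # "Do you speak Chinese?" -> "Of course, fluently."
}

FALLBACK = "请再说一遍。"  # "Please say that again."

def chinese_room(message: str) -> str:
    """Whoever executes this lookup -- a CPU, or a person following a
    paper rulebook -- produces exactly the same reply, without needing
    to understand a single symbol."""
    return RULEBOOK.get(message, FALLBACK)

print(chinese_room("你好"))           # 你好！很高兴认识你。
print(chinese_room("天气怎么样？"))    # 请再说一遍。 (unknown input -> fallback)
```

The sketch also exposes one of the hidden assumptions discussed later: the rulebook is immutable, so the "room" can never coin a new word or revise its own rules.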
5. (image-only slide)
6. The Chinese room argument holds that a digital
computer executing a program cannot have a
"mind", "understanding" or "consciousness",
regardless of how intelligently or human-like the
program may make the computer behave.
The argument is intended to refute a position
Searle calls strong AI: "The appropriately
programmed computer with the right inputs and
outputs would thereby have a mind in exactly
the same sense human beings have minds."
The conclusions of thought experiments are,
however, often corrupted and polluted by the
assumptions allowed or enforced by the
proposer of the thought experiment.
Such assumptions include:
● The machine cannot repair itself
● The machine cannot make copies of itself
● The machine cannot autonomously decide
its physical position (no autonomous
movement).
● The machine cannot make willful or
accidental changes to its body or to its
mind (code).
8. The Turing Test is actually a two-bladed scissors (for cutting the “powers” of conversation):
● Linguistic reductionism
○ The entrapment of linguistic formalism -> immutable rigidity of rules
○ No new words are formed or suggested
○ None of the unknowns can be described or inspected in further detail
○ There is no means to gauge para-verbal cues regarding reaction and affect.
○ The entire purpose of the language exchanged is to achieve persuasive imitation.
● Corporal reductionism
○ The entrapment of the interlocutors
○ Lack of material exchange (e.g. objects)
○ Lack of shared stimuli (e.g. isolated in different rooms)
○ The speaker (writer) and the listener (reader) are forced to communicate as if they have no awareness
of their bodies, of their surroundings, or of each other’s
The Turing Test and the imitation game
9. …force the exact same form…
A Procrustean bed is an arbitrary standard
to which exact conformity is forced.
In Edgar Allan Poe's influential crime story
"The Purloined Letter" (1844), the private
detective Dupin uses the metaphor of a
Procrustean bed to describe the Parisian
police's overly rigid method of looking for
clues.
10. Looking at subjects of the thought experiment as if they were inmates in a prison.
(or, at least, partially deprived of free will)
11. The Library of Babel
At present it contains all possible pages of 3200
characters, about 10^4677 books.
Any text you find in any location of the library will
be in the same place in perpetuity.
Try it yourself
Sometimes how one asks the question is more
important than finding the answer.
(1) Choose a chamber
(2) Choose a wall
(3) Choose a shelf
(4) Choose a book & a page
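The 10^4677 figure can be reproduced with a back-of-the-envelope count, assuming (as on libraryofbabel.info) a 29-symbol alphabet, 3200 characters per page, and 410 pages per book — these parameters are the site's, not stated on this slide:

```python
# Back-of-the-envelope count of distinct books in the Library of Babel.
# Assumed parameters (matching libraryofbabel.info): 29 symbols
# (26 letters, space, comma, period), 3200 characters per page,
# 410 pages per book.
ALPHABET = 29
CHARS_PER_PAGE = 3200
PAGES_PER_BOOK = 410

distinct_pages = ALPHABET ** CHARS_PER_PAGE        # every possible page
distinct_books = distinct_pages // PAGES_PER_BOOK  # pages grouped into books

# Order of magnitude = number of decimal digits minus one.
magnitude = len(str(distinct_books)) - 1
print(f"about 10^{magnitude} books")
```

Grouping the universe of pages into 410-page books recovers the quoted order of magnitude.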
13. The fallacy of immutability over formalism
What it is:
● immutable, adjective. unchanging over
time or unable to be changed.
● The assumption that the objects in the
scene can only perform the actions and
transitions prescribed by the script
● Examples:
○ It is assumed that the people working in
the Chinese room cannot leave and cannot
take breaks (for the duration of the
experiment)
○ It is assumed that none of the topics
discussed have to do with the respective
bodies of the interlocutors
What it contradicts:
● In practice (at large scale), the actions and
transitions of an actor or an object are NOT
bound by the linguistic definition of the script.
The actor may, of his/her own free will, choose
to exit (escape) a role, or may, by mistake,
deviate from the original script.
● (At small scale) Heisenberg’s uncertainty
principle is enough to show that any body
of mass diverges informationally from the
image or from the definition that any
observer has fixed.
14. The fallacy of the axiom of (forced) choice
What it is:
● Informally put, the axiom of choice says
that given any collection of bins, each
containing at least one object, it is possible
to construct a set by arbitrarily choosing
one object from each bin, even if the
collection is infinite. Formally, it states that
for every indexed family (S_i)_{i∈I} of
nonempty sets, there exists an indexed
family (x_i)_{i∈I} such that x_i ∈ S_i for
every i ∈ I. The axiom of choice was
formulated in 1904 by Ernst Zermelo in
order to formalize his proof of the
well-ordering theorem.
What it contradicts:
● Only bounded (finite) entropy can be
present in a finite volume (Bekenstein
bound).
● No system can, in the observable
universe, expend an infinite amount of
work to perform, at least once, the
prescribed Supertask (e.g. picking from
infinitely many non-empty baskets).
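For reference, the Bekenstein bound invoked above limits the entropy S of any system of energy E enclosed in a sphere of radius R:

```latex
S \;\le\; \frac{2\pi k_B R E}{\hbar c}
```

So a finite volume holding finite energy admits only finitely many distinguishable states, which is the physical obstacle to enacting any Supertask over infinitely many bins.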
15. The fallacy of assuming all* language is formal and algorithmic
What it is:
● Saying that something* is formal is
equivalent to saying that “the vessel
which holds the rules which govern the
existence of said something* does not
change its form/shape/topology”.
● A formalism has at least a core set of rules
(a kernel) which is deemed immutable
(and often impenetrable).
● Such assumptions come at the expense of
the freedoms (degrees of freedom,
entropy, free energy) of the actors involved
in using the prescribed language in a
given script.
What it contradicts:
● The ability of language to form and to adapt.
○ Forming new words for newly-observed
phenomena or newly-defined intentions
○ Forming new language by mixing, especially
during speech, one’s native language with words
from an international language.
● The ability of the interlocutors to negotiate and to
agree upon changing the rules of the conversation
or the form of the conversation.
○ Describing new phenomena using new signals or
symbols which are inspired by local sensing rather
than previous formal convention.
○ Finding a name for a new, unnamed fruit after
first tasting it
● The choice to _wait_ for more information to
become available is open to most agents in
real-life situations.
16. The unthrown exceptions and undeclared unknown unknowns
What it is:
● ”<Unthrown> exceptions” may refer to
situations where the (en)actor of the formalism
(the person or machine from a thought
experiment) would observe _in-situ anomalies_
(considerable differences between what is
expected and what is observed) and would be
unwilling or unable to report such an anomaly to
the architect of the formalism.
● Concealing evidence that contradicts expectations
only to avoid the complexities and perils involved
in scrutinizing the core/ground rules (axioms,
articles of the constitution, the Ten Commandments).
What it is:
● “Undeclared unknown unknowns” are a
form of epistemic misrepresentation, which
can be either willful (done to mislead) or
negligent (done in alleged good faith).
● Example:
○ Pretending to be sure of something you
are not sure about.
○ Falsely pretending to hold knowledge that
you know the other cannot verify.
17. Intuition as a magic mirror.
● Thought experiments rely on intuition
and imagination in order to
mentally-enact the realization of the
proposed narrative under the proposed
assumptions (i.e. anonymous axioms).
● Intuition is the opening (i.e. availability
to expend time and mental resources)
of the reader and writer to also allow
the usage of partially-, weakly-, or
loosely-defined relationships of
similitude->simile->similarity in the
furling and unfurling of the narrative.
19. Listeners as Compilers
● In computing, tombstone diagrams (or
T-diagrams) consist of a set of “puzzle
pieces” representing compilers and other
related language processing programs.
They are used to illustrate and reason
about transformations from a source
language (left of T) to a target language
(right of T) realised in an implementation
language (bottom of T).
● Each listener can be thought of as executing
a compiler (corresponding to known
languages, including body language) that
can output thoughts, questions, answers,
as well as descriptions for new compilers.
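As a loose illustration of the analogy (the class and example names below are hypothetical, not from the talk), a tombstone diagram can be modeled as a (source, target, implementation) triple, and “listening” as composing such triples end to end:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TDiagram:
    """A tombstone (T-) diagram: translates `source` into `target`,
    and is itself realised in the `impl` language."""
    source: str
    target: str
    impl: str

    def compose(self, other: "TDiagram") -> "TDiagram":
        """Chain two translators: the output language of `self`
        must be the input language of `other`."""
        if self.target != other.source:
            raise ValueError("languages do not line up")
        return TDiagram(self.source, other.target, self.impl)

# A listener "compiling" spoken English into private thoughts,
# then thoughts into a written reply.
hear = TDiagram("spoken-English", "thoughts", "brain")
reply = TDiagram("thoughts", "written-English", "brain")
conversation = hear.compose(reply)
print(conversation.source, "->", conversation.target)
```

The type check in `compose` is the point of T-diagrams: only translators whose languages line up can be chained.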
20. Deconstruction is NOT the same as destruction
Deconstruction ≠ Destruction
Deconstruction ≡ Disassembly
(Taking something apart to see what it is made of and how/why it works the way it does)
(Just like debugging)
(Or like reverse engineering)
(Deconstruction is the act of inventorying hidden assumptions)
21. Avoid confusion between <Intuition> and <Knowledge>, while allowing both types of
statements.
● The formalism of reducing the result of evaluating
ALL statements to a bipolar, flat, uni-linear True or
False is incongruent with the levels of uncertainty
most observers experience for most observations in
our Known Universe.
● Recommended alternative: Subjective Logic
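A minimal sketch of Subjective Logic's basic building block, the binomial opinion — a (belief, disbelief, uncertainty, base-rate) tuple rather than a bare True/False. The numbers below are illustrative only:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Opinion:
    """Binomial opinion from Subjective Logic.
    Invariant: belief + disbelief + uncertainty == 1;
    base_rate is the prior probability of the statement."""
    belief: float
    disbelief: float
    uncertainty: float
    base_rate: float

    def __post_init__(self):
        total = self.belief + self.disbelief + self.uncertainty
        assert abs(total - 1.0) < 1e-9, "opinion mass must sum to 1"

    def projected_probability(self) -> float:
        """P(x) = b + a*u: the uncertainty mass is apportioned
        according to the prior, instead of being collapsed away."""
        return self.belief + self.base_rate * self.uncertainty

# "Probably true, but I have not verified it myself."
hunch = Opinion(belief=0.6, disbelief=0.1, uncertainty=0.3, base_rate=0.5)
print(hunch.projected_probability())  # → 0.75
```

Unlike a flat True/False, the `uncertainty` component keeps <Intuition> (high u) distinguishable from <Knowledge> (u near 0) even when both project to the same probability.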
22. The catastrophe of boundless abstractions.
● Collapse of the virtual-to-real entropic
representation ratio: too few bits on the “map”
(symbol) representing too many possible states of
the underlying area (the signified).
● Categorical collapse of assumed dependencies
and entropic under-representation.
● See also:
○ Loss of Significance
○ Catastrophic Cancellation
(especially as applied to weights of large neural models)
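Catastrophic cancellation is easy to demonstrate: subtracting two nearly-equal floats wipes out almost all significant bits. A standard textbook example (not from the slides) compares computing 1 - cos(x) for small x directly versus via the algebraically equivalent 2·sin²(x/2):

```python
import math

x = 1e-8
# Naive: cos(x) is so close to 1.0 that the subtraction cancels
# every significant digit — the result underflows to exactly 0.0.
naive = 1.0 - math.cos(x)
# Stable: the identity 1 - cos(x) = 2*sin(x/2)**2 replaces the
# subtraction of near-equal values with a well-conditioned product.
stable = 2.0 * math.sin(x / 2) ** 2
print(naive)   # 0.0 — all information lost
print(stable)  # ~5e-17, the correct value (≈ x**2 / 2)
```

The same mechanism — too few retained bits representing too many underlying states — is the numerical face of the "entropic under-representation" described above.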
23. Similitude (tight, strict) vs. Simile (loose, permissive)
A model is said to have similitude with the real
application if the two share geometric similarity,
kinematic similarity and dynamic similarity.
A simile is a figure of speech involving the
comparison of one thing with another thing of a
different kind, used to make a description more
emphatic or vivid (e.g. as brave as a lion).
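Dynamic similarity can be made concrete: a scale model behaves like the full-scale flow only if dimensionless groups such as the Reynolds number Re = ρvL/μ match. A sketch with illustrative values (approximate properties of water at room temperature; the hull dimensions are invented for the example):

```python
def reynolds(density: float, velocity: float, length: float,
             viscosity: float) -> float:
    """Re = rho * v * L / mu — ratio of inertial to viscous forces."""
    return density * velocity * length / viscosity

# Water at ~20 °C (approximate values).
RHO, MU = 998.0, 1.0e-3  # kg/m^3, Pa·s

# Full-scale hull: 10 m long, moving at 2 m/s.
full = reynolds(RHO, 2.0, 10.0, MU)
# 1:10 model in the same fluid: to keep Re identical (dynamic
# similarity), the model must be towed 10x faster.
model = reynolds(RHO, 20.0, 1.0, MU)
assert full == model
```

This is similitude in the tight sense of the slide: equality of dimensionless numbers, not a loose "the model looks like the ship" simile.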
24. Our Work at Knosis.ai
● Data augmentation
○ Descriptive
■ Labeling
■ Tagging
○ Adversarial
■ Fake Detection
■ Fake Injection
○ Elicitation
■ Imaginative projection
■ Connotational projection
● Hybrid, Full-Duplex Learning between Human and Machine Agents
○ H2M2H Learning
○ Recursive auto-poiesis and reflection
● The Immersion of Directed Multi-graphs in Embedding Fields. (Paper)