Revising the Theory of Information

Pavel Luksha. This paper, presented at ISSS 2003 and the INTAS project seminar series, redefines the concept of information as the interaction between the 'memory' and 'environment' components of a system.

1. TICS: An Approach to the Concept of Information
Presentation for the INTAS seminar, 21-23 September 2003, Moscow
Pavel Luksha, Alexander Plekhanov

2. Why Another Concept of Information?

- A universal concept of science, a "third element" which is "neither matter nor energy" [Wiener, 1948]
- Some 150 definitions exist [Capurro, 1992]
- Vagueness of the concept leads to poor use and even misuse (instead of sticking to a certain commonly accepted definition)
- Yet, we suggest that it is possible to construct a uniform concept of information that will incorporate most (intuitive) definitions and itself become a basis for a meta-theory

3. Three Approaches to a Concept of Information

- Materialistic: information is a material phenomenon or essence
- Idealistic: information is not a material phenomenon or essence
- "Fictionalistic": it is neither phenomenon nor essence, but a "fiction" (or a mental construct)
- For the scientific study of information, it is necessary to consider information as an objective material phenomenon (the rationalistic tradition [Winograd, Flores, 1986])
- At the same time, it is an object of meta-theory, not theory (information systems exist on various 'layers of existence')

4. Major Groups of Systems with Information

- There is a great diversity of systems in which information is thought to exist, with completely different material substrates:

  technical:
  - technical systems with information (computers, robots, telecommunication means)

  biological:
  - self-reproducing systems, starting from self-reproducing (SR) macromolecules
  - biological organisms with a central nervous system
  - pre-social organization of higher animals

  social:
  - "super-organism" insect populations
  - social groups of various levels of organization (from micro-groups to societies)
  - psychic life (emerging socially, as shown by Vygotsky [1978])
  - technical systems storing/transmitting social information

5. Invariant Property

- We suggest that the universal, or invariant, property of this diversity of "systems with information" is memory:

  1. technical:
  - permanent and operative memory devices (memory and quasi-memory of technical systems)

  2. biological:
  - genetic innate memory
  - individual memory in the central nervous system

  3. social:
  - genetic memory of social behavior
  - individual memory
  - social memory in individual memories and technical devices

6. Starting from Interaction

- Every complex system is in constant interaction with its environment
- For complex systems, these interactions are (a) diverse (more than one state exists) and (b) regularly repeated (Ashby's analysis of cybernetic systems)
- Understanding information can start from the analysis of interactions in systems of lesser complexity (computers, DNA, neurons) and move towards systems of higher complexity (psyches, societies)
- Consider two types of systems (similar ideas in Cottam [2003]); a minimal sketch of the first type follows below:
  - 'discrete', or 'digitized', which can be described as finite state automata (FSA)
  - 'continuous' (can be reduced to FSA, but cannot be induced from them)

[Diagram: system <-> environment]

"Physics is no longer a science about properties of nature's elements; it is about relations and interactions" (physicist Yu. Rumer, lecturing)

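To make the 'discrete' case concrete, the following minimal sketch treats such a system as a finite state automaton whose state changes through regularly repeated interactions with its environment. The states, environment elements and transition table are invented purely for illustration; only the FSA form itself comes from the slide.

```python
# Minimal sketch of a 'discrete' system as a finite state automaton (FSA).
# States, environment elements and transitions are invented examples.

class DiscreteSystem:
    def __init__(self, transitions, initial_state):
        # transitions: {(state, environment_element): next_state}
        self.transitions = transitions
        self.state = initial_state

    def interact(self, environment_element):
        """One regularly repeated interaction with the environment."""
        self.state = self.transitions.get(
            (self.state, environment_element), self.state)  # unknown input: state unchanged
        return self.state


# A toy two-state system interacting with two kinds of environment elements.
toy = DiscreteSystem(
    transitions={("rest", "stimulus"): "active",
                 ("active", "quiet"): "rest"},
    initial_state="rest",
)
for element in ["stimulus", "quiet", "stimulus"]:
    print(element, "->", toy.interact(element))
```

Each (state, environment element) pair in the transition table is one interaction type, which is the sense in which, on the next slide, memory elements can be identified according to the diversity of interaction types.
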
7. What Is Memory

- Memory can be most generally defined as a phenomenon of reflection:
  - the structure and organization of one system are reflected in another system, 'stored' there for some time, and used in interactions between these systems
- Memory can be localized in a given system as its component
  - DNA coding a cell, "cognitive eigenvalues" in the brain, social memory in society, etc.
- According to the diversity of interaction types, memory elements can be identified (various interaction types correspond to various memory elements, but only in the 'discrete' case)

[Diagram: memory <-> environment]

8. What Is SAFE

- System adaptive functioning environment (SAFE):
  - The structure and organization of system memory can be put into an unambiguous one-one correspondence with a set of objects and links in the system's environment.
  - In the basic case, a memory can contain nothing but representations of objects with which the system regularly interacts.
  - The relationship of memory and environment thus distinguishes a specific set of objects and links in the latter: the SAFE.
- Some similar concepts:
  - the 'Umwelt' of von Uexküll [von Uexküll, 1982]
  - the 'field' in the theory of Kurt Lewin [Lewin, 1950]
  - gestalt and environmental psychology, e.g. [Lettvin et al., 1959], [Gibson, 1986]
  - the Lifeworld of Agre and Horswill [Agre, Horswill, 1997]

[Diagram: memory <-> SAFE]

9. Defining Information

- We propose to consider as informational only processes in which
  - the content of system memory
  - is explicitly revealed
  - through interactions (representation/transformation)
  - with its environment (more precisely, the SAFE)
- In the basic ('discrete') case, information is
  - an interaction between an element of memory and an element of SAFE
- A distinction should be drawn between
  - different informations (qualitatively new interactions) and
  - copies of the same information (copies of the same interaction)
- Similar concepts:
  - Gibson's affordances
  - Maturana's system activities, etc.

10. Information as Regulation as Information

- Each such interaction between memory and SAFE is a mutual change of the interacting parts of memory and SAFE:
  - a change in the environment through its interaction with memory is a process of regulation, while
  - a change in memory through its interaction with the environment is an identification, or a representation.
- Thus, an interaction between memory and SAFE always has two aspects: it is an information and a program at the same time. We propose to call them information/programs.
- Models obtained with this approach generalize the traditional models of information theory and the traditional models of cybernetics; we propose to call them information/cybernetic models (or information/cybernetic systems, ICS).

11. Actualization and Potentialization

- In system dynamics, it is necessary to distinguish the actual and potential forms of information/programs:
  - actual = interactions (current, or accomplished within some period of time) between elements of SAFE and elements of memory
  - potential = interactions which are possible in principle but are not currently observed (i.e., all possible interactions between memory and SAFE)
- The cycle of change between actualization and potentialization: a swinging-pendulum metaphor
- Information/programs are potentially contained inside memory AND inside the environment. They can only be actualized as an interaction of the two.

[Diagram: memory <-> SAFE]

12. Principle of Information Relativity

- Information 'omnipresence' (Brillouin [1956], Stonier [1972]): information as a physical entity opposing entropy
- Our suggestion: there can never exist information in general.
  - Any specific information/program exists only as part of the operation cycle of some specific finite ICS.
  - Only if an object in the environment is represented in memory can a corresponding information/program exist (and be realized).
- Informational closure of a complex system is implied:
  - Only those objects of the environment that are reflected in a system's memory may actualize corresponding information/programs.
  - The variety of information/programs is limited by the volume of system memory.
- A universal principle of relativity of information/program existence:
  - If there is no element of system memory representing some object of the environment, then, relative to this system, there is no information/program, even if the system regularly interacts with this object. At the same time, relative to another system which possesses a proper element of memory, a corresponding information/program does exist.

13. How Does It Fit with Various Concepts?

- A few examples (from [Scott, 2003]; the list can be extended):
- Ashby [1956]: "a cybernetic system is closed to information" = information is strictly defined by system memory
- Konorski [1962]: "information cannot be separated from its utilization" = information is regulation, and 'manifests itself' through interactions
- Maturana, Pask: "a cybernetic system is organizationally closed" = depending on system organization (and memory organization), the information contained in the system differs
- Stonier: "measure of order" = regularities (including sequences of repeated interactions) can be considered as order in a system; more information means more (and more diverse) regularities
- The key definitions of Shannon [1948] and Kolmogorov [1965] are considered below (matrix model)

14. Basic Information System

- A set of relations/interactions between a given memory and SAFE
- Memory and SAFE are two groups of objects/links isomorphic to each other; without a "reference system", two equally valid descriptions of "which one plays which role" are possible
- There must be a uniform physical basis for the interactions, a similarity ('lock and key', 'eye and light')

[Diagram: SAFE and system memory as clutched cog-wheels (the "clutched cog-wheels" model)]

15. Information/Cybernetic System (ICS)

- The structure is differentiated into (a) a component that interacts (communicates/regulates) with the external environment, and (b) a component that interacts (communicates/regulates) with internal components, but not with the external environment.

[Diagram, a "perceptor" model ("three cog-wheels?"): the information/cybernetic system (automaton) consists of a controlling device (memory) and an executing device (memory/SAFE); internal regulation and communication = regulative interactions inside the automaton over an internal communication channel; external regulation and communication = regulative interactions between the automaton and the external SAFE (contact environment) over an external communication channel; the contact environment is linked to the non-contact environment through regular interactions (exchange of matter and energy)]

16. Communication Model of ICS

- Communication is not possible without an alphabet, a representation of signals in memory
- Signal or message = element of SAFE
- A message can be transported in a communication channel, but information exists only as the interaction between the message and the recipient
- In a system with differentiated internal memory, at least two informations exist (one as the interaction with the external SAFE, another as the interaction with the internal SAFE)

[Diagram: SAFE, memory, and memory/SAFE; the internal interaction between components of the information/cybernetic system runs over the internal communication/regulation channel, and the interaction between the information/cybernetic system and its environment runs over the external communication/regulation channel]

17. Shannon's Communication Model

- Three states of a signal: (a) entering the communication channel [encoding], (b) being transported or waiting in the channel [transmission], (c) exiting the channel [decoding]
- Two communicating agents: (a) for each one, the other is part of its external SAFE; (b) each one has "her own" information (the hermeneutic principle that "what is said is not what is heard"); (c) for adequate communication, a certain similarity between the agents must exist

[Diagram: a 1st and a 2nd system, each with its own memory and memory/SAFE, acting as sender/recipient 1 and sender/recipient 2 with coding/decoding (sending/receiving) devices 1 and 2]

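A small sketch of this scheme (the codebooks and messages are invented examples, not the authors' notation) shows the three signal states and why "what is said is not what is heard" when the two agents' memories only partially coincide:

```python
# Sketch of encoding -> channel -> decoding with two agents, each using its
# own memory (codebook). Codebooks and messages are invented examples.

def encode(meaning, codebook):
    return codebook[meaning]                     # (a) meaning -> signal entering the channel

def decode(signal, codebook):
    inverse = {v: k for k, v in codebook.items()}
    return inverse.get(signal, "<not represented in memory>")

sender_memory    = {"danger": "110", "food": "011"}
recipient_memory = {"danger": "110", "shelter": "011"}   # partially different memory

signal = encode("food", sender_memory)           # (a) encoding
# (b) transmission: here the signal travels through the channel unchanged
heard = decode(signal, recipient_memory)         # (c) decoding

print(signal, "->", heard)   # '011' -> 'shelter': what is said is not what is heard
```

For adequate communication the two codebooks would have to coincide on the signals used, which is the "certain similarity between the agents" required above.
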
18. Dynamic Aspect of Information/Programs

- An information/cybernetic system is a dynamic system that passes through a certain set of states: information/program interactions between memory and SAFE
- These states are regularly reproduced in some (pre-determined) sequence, forming a cycle of ICS operation
- Evidently, a macro-cycle is not a completely arbitrary set of information/programs in an arbitrary sequence; it is a quasi-targeted process related to the teleological aspect of ICS operation, and so are its sub-cycles
- Such a cycle can be represented as an attractor:
  - a focus point (evolution towards some final state, as in automata)
  - a limit cycle (repetition of a loop)
  - a strange attractor (the case of complex synergetic systems)

19. Hierarchy of Functioning Cycles

Simplified example: higher-animal reproductive behavior

- We consider the organization of functioning cycles as a hierarchical, Russian-doll structure (the 'discrete' case):

  elementary cycle of operation (micro-cycle):
  - organization: actualization of a single information/program
  - properties: cannot be decomposed into lower-level (lower-complexity) cycles on a given level of abstraction

  sub-cycle (meso-cycle):
  - organization: a combination of several information/programs (in a certain sequence)
  - properties: can be decomposed into lower-level meso-cycles and micro-cycles; has a determined goal

  ICS operation cycle (macro-cycle):
  - organization: a (repeated) cycle with a final major goal state and the full variety of information/programs
  - properties: can be decomposed into meso- and micro-cycles; a repeated cycle in self-maintaining and self-reproducing systems

[Diagram: the full cycle of reproduction (macro-cycle) decomposed into meso-cycles such as courting and breed care]

20. Sources of Activity in ICS

- The source of activity in an ICS can be either internal memory or the external SAFE (and also spontaneous activity from the external memory / internal SAFE).
- Two major types of meso-cycles (determined by the cause-effect sequence); a sketch contrasting them follows below:
  - memory-driven [active, or pro-active, 'behavior']: a rigid sequence of information/programs in which elements of SAFE are 'expected' (resulting, e.g., in inefficient reflexes in animal behavior [Dewsbury, 1978] or magic rituals [Lévi-Strauss, 1962])
  - SAFE-driven [re-active 'behavior']: the touch-string principle, an explanation for 'if-then' sequences

[Diagram: a memory-driven meso-cycle (1) and a SAFE-driven meso-cycle (2) running between the external SAFE, memory, and memory/SAFE]

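The two meso-cycle types can be contrasted in a short sketch (the concrete information/programs and rules are invented placeholders): a memory-driven cycle runs its rigid sequence whether or not the 'expected' SAFE elements are present, while a SAFE-driven cycle fires only the if-then information/programs whose triggering events actually occur.

```python
# Sketch of the two meso-cycle types. 'seek', 'prey appears', etc. are
# invented placeholders, not the authors' examples.

def memory_driven_cycle(program_sequence, safe_events):
    """Pro-active: memory dictates a rigid sequence and 'expects' SAFE elements."""
    log = []
    for expected in program_sequence:
        present = expected in safe_events
        log.append((expected, "actualized" if present else "run blindly"))  # e.g. inefficient reflex
    return log

def safe_driven_cycle(reaction_rules, safe_events):
    """Re-active ('touch-string'): SAFE events trigger if-then information/programs."""
    return [(event, reaction_rules[event])
            for event in safe_events if event in reaction_rules]

print(memory_driven_cycle(["seek", "grasp", "consume"], safe_events={"seek", "consume"}))
print(safe_driven_cycle({"prey appears": "chase", "threat appears": "flee"},
                        safe_events=["prey appears", "noise"]))
```
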
21. Adequacy/Efficiency

- One-one relations do not describe the full variety of potential interactions between memory and SAFE (e.g., non-complementary junctions in DNA replication, inefficient reflexes)
- All information/programs can be distinguished by their value, a degree of adequacy/efficiency:
  - if the actualization of some set of information/programs ensures a macro-cycle (or one of its sub-cycles) that is optimal according to some criterion, then this set of information/programs is adequate/efficient.
- Two criteria can be suggested:
  - the evolutionary, or survival, criterion: actualization of a given set of information/programs assures the maximal repetition of a given macro-cycle or meso-cycle
  - the functional criterion: actualization of a given set of information/programs achieves the maximal efficiency ratio in a given macro-cycle or meso-cycle
- These summarize more traditional criteria:
  - adequate identification (exact recognition of signals, when a predator is recognized as a predator, or a symbol as a symbol)
  - efficient transformation, when a required result is achieved with maximal accuracy and minimal 'cost' (time and energy)
- Based on these criteria, it is evident that, in natural complex systems (which emerged through evolutionary processes), one-one relations of memory and SAFE correspond to adequate/efficient information/programs

22. Major Conclusions

- The key to information processes is found in the dynamic relation between a complex system and its environment
- Information systems are systems with memory (and not only living, autopoietic or complex ones). The study of systems with memory may become a new focus for FIS / UTI
- This approach permits a solution to many methodological problems, and an alternative representation of system statics/dynamics (the matrix model) which can be quite enlightening (e.g., the adjustment to Ashby's law).
  - It also bridges traditional users of the paradigm (e.g., computer scientists) and scholars looking for a 'clarified paradigm' (scientists in the humanities and biology)
- Applications are already evident in the social sciences (e.g., a systematic theory of society) and in biology

23. Matrix Model

- Models the 'discrete' ICS
- The sides of the matrix represent the elements of memory and SAFE (the element base), aligned in one-one relation to each other
- The cells of the matrix correspond to element interactions, i.e. information/programs: those on the main diagonal are adequate/efficient, the others (also possible) are inadequate/inefficient
- The matrix may represent a model of Shannon's communication with errors
- Two possibilities: (a) any distortion leads to total inadequacy (adequacy is 0 or 1); (b) given similarity between elements of the element base, slight distortions lead to incomplete adequacy (between 0 and 1)
- Cells can be assigned a quantity of occurrence (matrix Q), derived indicators of probability/frequency (matrices P, F), or a "state of actualization" (matrix A)

"Matrix is around you"

[Diagram: a 4x4 matrix with memory elements 1-4 on one side and SAFE elements 1-4 on the other]

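A minimal sketch of the matrix bookkeeping (Python with NumPy; the element base of size 4 and the counts in Q are invented for illustration) shows how the matrices Q, P, F and A mentioned on the slide relate to each other, and how the main diagonal separates adequate from inadequate information/programs:

```python
import numpy as np

# Sketch of the 'discrete' matrix model. Rows are memory elements, columns
# are SAFE elements, aligned one-one; cell (i, j) is the information/program
# actualized as the interaction of memory element i with SAFE element j.

Q = np.array([[5, 1, 0, 0],      # Q: how many times each interaction occurred
              [0, 6, 0, 0],
              [0, 0, 4, 2],
              [0, 0, 0, 7]])

F = Q                            # F: frequencies of occurrence (raw counts here)
P = Q / Q.sum()                  # P: probabilities of the individual interactions
A = (Q > 0).astype(int)          # A: 1 if an information/program was actualized at all

adequate_variety   = int(np.trace(A))                 # main diagonal: adequate/efficient
inadequate_variety = int(A.sum()) - adequate_variety  # off-diagonal: inadequate/inefficient
print(P.round(2))
print(adequate_variety, inadequate_variety)           # 4 adequate, 2 inadequate
```

Taking F as the raw counts is an assumption; the slide only states that P and F are derived from Q.
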
24. Communication of ICS in the Matrix Model: Conjoining Matrices

- The junction of two matrices represents the junction of memory and SAFE through an intermediate (external memory / internal SAFE)
- The intermediate is thus 'poly-functional' (consideration of the matrix helps explain why it is typically not an initiator of activities); SAFE-driven or memory-driven cycles occur through it
- If no distortions exist in the internal or external relation, internal memory and external SAFE stand in one-one relation

[Diagram: the external memory / internal SAFE as the junction between internal memory and external SAFE]

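The junction can be sketched as a composition of the two relations (Python with NumPy; the matrices are invented, distortion-free examples): if neither the internal nor the external relation introduces distortions, the product of the two matrices is again a one-one relation between internal memory and external SAFE.

```python
import numpy as np

# Sketch of conjoining two matrices through an intermediate. M_int relates
# internal memory to the intermediate (external memory / internal SAFE);
# M_ext relates the intermediate to the external SAFE. Their boolean product
# gives the overall internal-memory <-> external-SAFE relation.

M_int = np.eye(3, dtype=int)                      # internal relation, no distortions
M_ext = np.array([[0, 1, 0],                      # external relation: a mere relabelling
                  [1, 0, 0],
                  [0, 0, 1]])

overall = (M_int @ M_ext > 0).astype(int)         # composition of the two relations
print(overall)                                    # still one-one (a permutation matrix)
```

Any off-diagonal 'distortion' in either matrix would break the one-one character of the overall relation, which is the condition stated on the slide.
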
25. Example: Watt's Governor

- A classical example of a "first-order" model of regulation with feedback

[Diagram, the feedback loop: steam pressure produces the working impact and changes the speed of the working shaft; the shaft lifts up / pulls down the regulator weight; the weight lifts up / pulls down a lever; the lever opens / closes a pressure gate, which in turn changes the steam pressure]

26. Example: Matrix Model of Watt's Governor (4 Matrices)

[Diagram: four 2x2 matrices, one per sub-system, with the state pairs: weight goes down / weight goes up; pressure falls / pressure grows; shaft slows down / shaft speeds up; pressure gate closed / pressure gate opened]

27. Example: Watt's Governor (Cont.)

[Diagram: the four sub-systems of the governor (weight, steam in boiler, working shaft, pressure gate) mapped onto the ICS roles: internal memory (controlling unit), memory/SAFE (execution unit no. 1, effector), memory/SAFE (execution unit no. 2, receptor), and external SAFE (contact environment)]

- For a full cycle, two information/programs in each of the four sub-systems must be actualized (V_F = 8)
- No repetitions (I_F = V_F = 8)
- No inadequate/inefficient information/programs (I_A = V_A = I_F = V_F = 8)

28. Quantitative Measures of Information/Programs

Measures of information/program variety:
- V_F = Σ_j Σ_i A_ij (full variety)
- V_A = Σ_j Σ_i A_ij for i = j (adequate/efficient variety)
- V_N = Σ_j Σ_i A_ij for i ≠ j (inadequate/inefficient variety); V_N = V_F - V_A

Measures of information/program quantity with copies (to be indicated as "IPions"?):
- L = lim (T→∞) T/R (length of the macro-cycle)
- I_F = Σ_j Σ_i F_ij (full quantity)
- I_A = Σ_j Σ_i F_ij for i = j (adequate/efficient quantity)
- I_N = Σ_j Σ_i F_ij for i ≠ j (inadequate/inefficient quantity); I_N = I_F - I_A

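A sketch of these measures (Python with NumPy), applied to the Watt's governor example of the previous slide under the simplifying assumption that each of the four sub-systems is a 2x2 matrix whose two diagonal information/programs are each actualized exactly once per full cycle:

```python
import numpy as np

# Variety and quantity measures from this slide, checked against the
# Watt's governor figures of slide 27 (V_F = I_F = V_A = I_A = 8).

def measures(A, F):
    V_F = A.sum()                       # full variety
    V_A = np.trace(A)                   # adequate/efficient variety (main diagonal)
    V_N = V_F - V_A                     # inadequate/inefficient variety
    I_F = F.sum()                       # full quantity (with copies)
    I_A = np.trace(F)                   # adequate/efficient quantity
    I_N = I_F - I_A                     # inadequate/inefficient quantity
    return V_F, V_A, V_N, I_F, I_A, I_N

# Four sub-systems (weight, pressure gate, steam pressure, working shaft),
# each a 2x2 matrix with only its two adequate cells actualized once.
subsystems = [np.eye(2, dtype=int)] * 4
totals = np.array([measures(m, m) for m in subsystems]).sum(axis=0)
print(dict(zip(["V_F", "V_A", "V_N", "I_F", "I_A", "I_N"], totals.tolist())))
# -> V_F = V_A = I_F = I_A = 8 and V_N = I_N = 0, matching the slide
```
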
29. Limits to Variety and an Adjustment to the Requisite Variety Law

- The law of requisite variety, as introduced by W. Ashby [Ashby, 1964], can be re-considered and re-stated
- The variety of information/programs must be at least as great as the variety of controlled objects or events (the various elements of SAFE). If there is no memory element for a given object or event in the environment, then there is no information/program that can control it.
- On the other hand, an upper bound of system variety can be pointed out: it is the square of the number of controlled environment disturbances (SAFE elements). In an efficiently operating information/cybernetic system, the variety of controlling information/programs tends to the number of SAFE elements.

Limits for information/program variety (k = number of SAFE elements):
- k ≤ V_F ≤ k² (full variety)
- V_A = k (adequate/efficient variety)
- 0 ≤ V_N ≤ k² - k (inadequate/inefficient variety)
- variety efficiency ratio = V_A / V_F

30. Limits to Information/Program Quantity with Copies

- Assuming that each potential information/program appears in the macro-cycle at least once, the following correspondences can be drawn:

Limits for information/program quantity with copies:
- I_A ≤ L (limit on the adequate/efficient information quantity)
- I_N ≤ L - I_A (limit on the inadequate/inefficient information quantity)
- V_A ≤ I_A (relation between adequate/efficient variety and quantity)
- V_F ≤ I_F (relation between full variety and quantity)
- information/program efficiency ratio = I_A / I_F

31. Measures of Shannon and Kolmogorov

Kolmogorov's measure (Shannon's measure is discussed below):
- K(X) = min |p| : U(p) = X
- K(X|Y) = min |p| : U(p, Y) = X
- I(Y:X) = K(X) - K(X|Y)

- Kolmogorov defines information through information, referring to the properties of Turing automata
- Shannon implicitly assumes "a conscious recipient" of information who has a "function of expectation" and thus may be "surprised" by information
- Yet, Shannon's measure is one possible static measure of (dis)organization in information interactions, which can easily be derived from the matrix model as one possible characteristic of matrix P
- In the matrix model, if the target (final) state of the system is considered as X, then a measure analogous to Kolmogorov's indicates the (minimal) number of information/programs required to reach X from a state Y:
  - L = L(X) = I_A
  - L_2 = L(Y:X) = L(X) - L(Y) < I_A

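The slide does not spell out which characteristic of matrix P is meant, so the following sketch (Python with NumPy; the matrices are invented) takes one natural reading, the Shannon entropy of the distribution of memory-SAFE interactions, as a static measure of (dis)organization:

```python
import numpy as np

# One possible way to read Shannon's measure off matrix P: the entropy of
# the distribution of memory-SAFE interactions. This particular reading is
# an assumption; the example matrices are invented.

def shannon_entropy(P):
    p = P[P > 0]                        # ignore interactions that never occur
    return float(-(p * np.log2(p)).sum())

P_ordered    = np.eye(4) / 4            # only adequate interactions, uniformly
P_disordered = np.full((4, 4), 1 / 16)  # all interactions equally likely

print(shannon_entropy(P_ordered))       # 2.0 bits
print(shannon_entropy(P_disordered))    # 4.0 bits: more 'disorganization'
```
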
