
Harry Halpin: Artificial Intelligence versus Collective Intelligence


  1. Artificial Intelligence versus Collective Intelligence. Harry Halpin, <H.Halpin@ed.ac.uk>
  2. Always Historicize! We want scientifically-grounded two-way traffic between philosophy and the Web. Technologies are ideas given flesh, the exteriorization of the conceptual structures and utopian impulses of humanity, and so are alien only insofar as their history and materiality are unknown. Artifactualization does not happen of its own accord, but reflects the ontological assumptions of its historical period. What is most interesting about the Web is that it makes clear that older philosophical categories like "mind" and "language" have to be fundamentally rethought. Thesis: The Web is the rise of an increasingly "intertwingled" techno-social assemblage that displaces the previously stable ontological assumption of the individual.
  3. J.C.R. Licklider. A disciple of Norbert Wiener, the psychologist J.C.R. Licklider tirelessly pursued the vision of man-machine symbiosis, a vision that closely coupled man and machine due to their complementary abilities. Inspired by Cold War missile defense systems like Project Whirlwind, Licklider became the institutional architect of the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA), providing him the funding needed to assemble a "galactic network" of researchers to implement the first step of his symbiosis: interactive computing through time-sharing.
  4. "Man-Computer Symbiosis" by J.C.R. Licklider (1960): "The fig tree is pollinated only by the insect Blastophaga grossorum. The larva of the insect lives in the ovary of the fig tree, and there it gets its food. The tree and the insect are thus heavily interdependent: the tree cannot reproduce without the insect; the insect cannot eat without the tree; together, they constitute not only a viable but a productive and thriving partnership. This cooperative 'living together in intimate association, or even close union, of two dissimilar organisms' is called symbiosis. The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today."
  5. The Foundations of AI. To historicize the birth of AI in the 1950s and 1960s: AI was conceived as a technological solution to the problem of labour in an era of full employment and traditional factory production. What was needed were intelligent but servile machines to replace the skilled individual laborer's manual work. The paradigmatic problem of both analytic philosophy and cognitive science is to explain the intelligence of the human individual: what properties of the individual human deserve credit for intelligence, and why? AI attempted to define the individual on a level of abstraction and then implement this computationally. "Everyone will learn how to program. That is how we will speak to the servants." John McCarthy
  6. Cartesian Frameworks. AI was built on a robust and often unstated Cartesian framework: 1. The individual-world dichotomy is primary. 2. Intelligence is to be explained in terms of representational states implemented inside the head of the individual, and the ways in which such states are manipulated and transformed. 3. Intelligent action is then the behavior of the body in response to the world, as an outcome of general-purpose reasoning processes. 4. In humans, these representations are implemented neurally; in machines they may be implemented in other media.
  7. Augmenting the Human Intellect. Douglas Engelbart had in parallel generated his own vision of human augmentation that shared much of the same conceptual groundwork with Licklider's "man-machine symbiosis." Human augmentation differed from man-machine symbiosis in a subtle yet powerful manner by putting the human at the center of the symbiosis, focusing on using the machine to extend the human. Unlike AI, Engelbart's thesis on collective intelligence has never been phrased in terms of its underlying philosophical assumptions, but only articulated as an engineering project.
  8. Engelbart's Forgotten Legacy. "By augmenting human intellect we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Man's population and gross product are increasing at a considerable rate, but the complexity of his problems grows still faster, and the urgency with which solutions must be found becomes steadily greater in response to the increased rate of activity and the increasingly global nature of that activity. Augmenting man's intellect, in the sense defined above, would warrant full pursuit by an enlightened society if there could be shown a reasonable approach and some plausible benefits." Augmenting Human Intellect: A Conceptual Framework by D. Engelbart. (Douglas Engelbart at his Bootstrap Institute.)
  9. Engelbart and the Web. The human augmentation framework went beyond the physical interface that allowed maximal efficiency of communication, and instead focused on the digital organization of information itself. Engelbart and his researchers developed a system, the NLS (oNLine System), that was an early predecessor of the World Wide Web. NLS allowed any text to be hierarchically organized in a series of outlines, with summaries, allowing a user to move through various levels of information and "link" information, foreseeing the development of hypertext, and also to "publish" information in a Journal for others to use and comment upon. Engelbart's vision could not be realized on the primitive computers of his day, with their limited memory, and his zeal for efficiency made NLS so arcane that, even when it was put on the ARPANet as the Network Information Center (NLS was used to publish many of the early publications of the IETF), NLS never experienced viral growth.
  10. "Forget intelligence completely, in other words; take the project as one of constructing the world's largest hypertext system, with CYC functioning as a radically improved (and active) counterpart for the Dewey decimal system. Such a system might facilitate what numerous projects are struggling to implement: reliable, content-based searching and indexing schemes for massive textual databases." Brian Cantwell Smith, The Owl and the Electric Encyclopedia, Artificial Intelligence, 47: 251-288 (1991). Around the same time, Tim Berners-Lee and Robert Cailliau invented the core protocols of the Web as a universal information space. Douglas Engelbart met Tim Berners-Lee at the 1991 ACM Hypertext Conference in San Antonio, Texas. Tim Berners-Lee's paper was rejected: the Web was too simple to be interesting, and didn't guarantee two-way link consistency.
  11. The Collective Nature of Human Intelligence. Recent work in tracking the behavior of individuals finds that their behavior - ranging from movement to turn-taking in conversation - can be reliably predicted by appealing to the behavior of others in their social network with a high degree of accuracy (over 40 to 80% of variation over a wide variety of tasks), without any appeal to planning, reasoning, or verbal language. Pentland (2007) claims "that important parts of our personal cognitive processes are caused by the network via unconscious and automatic processes such as signaling and imitation, and that consequently, important parts of our intelligence depend upon network properties." 'On the collective nature of human intelligence', Adaptive Behavior, 15(2): 189-198.
  12. The Individual-World Distinction is Not Tenable. Although this may be true in some cases, it would be too primitive to describe the ant as totally at the mercy of its environment. Intelligence in general - collective or not - leaves traces behind in the environment. The classic example is the pheromone trail of the ant, in which trails get reinforced as more ants use a particular trail; this has been shown to be an efficient way of navigating the environment. This shows how individuals with limited memory can use the shaping of the environment as an external memory. Culture, ranging from the design of cities to Wikipedia, can be considered collective cognition extended into the environment. This usage of the environment has a number of advantages over direct individual-to-individual communication. To modify Pentland's thesis: the collective activity of individuals and their modifications to the environment are responsible for intelligence. Paul Baran's Networking Diagram
  13. Against Internal Representations. Contrary to the Cartesian assumptions of classical artificial intelligence, in their study of frog vision, what Maturana and others discovered was that the frog's eye "speaks to the brain in a language already highly organized and interpreted, instead of transmitting some more or less accurate copy of the distribution of light upon the receptors." J. Lettvin, H. Maturana, W. McCulloch, and W. Pitts, "What the frog's eye tells the frog's brain", Proceedings of the Institute of Radio Engineers, 47(11): 1940-1951 (1959). This discovery caused Maturana to reconceptualize the foundations of cognitive science in terms of autopoiesis: that "living organization is a circular organization which secures the production or maintenance of the components that specify it in such a manner that the product of their functioning is the very same organization that produces them." Humberto Maturana and Francisco Varela, Autopoiesis and Cognition (1973).
  14. Autopoiesis: An Example. A frog is autopoietic precisely because its internal metabolism is inside a boundary, the frog-skin, that defines its organization as a frog. Second, the components, the organs, are inside the frog-skin and self-reproducing. Yet autopoietic systems are not entirely closed, for the frog's consumption of gadflies and other interactions with the environment are done in the service of maintaining its own organization as a frog, since eating allows it to bring in energy to maintain its metabolism. Despite the biological favoritism of Maturana and Varela, there is nothing inherent in autopoiesis that restricts its components to biology in all possible worlds.
  15. Defining Collective Intelligence. What we are searching for is a notion that can define an individual body without resorting to making biological tissue some sort of "wonder tissue" (Dennett). If the individual can be defined via autopoiesis, and to maintain its autopoiesis the individual must increasingly incorporate non-biological components, then the individual is no longer a static, closed system, but an open and dynamic system capable of assimilating and decoupling from various components as it goes in and out of autopoiesis, including digital representations and other biological beings. The obvious objection could be that the biological component is reproducing itself, while the non-biological component is not.
  16. Why Collective Intelligence Now? In an era of mass unemployment and distributed labor, while AI-enabled robots are economically unfeasible, social computation as exemplified by crowd-sourcing via the Web is now central to the modern economy. It is cheaper to distribute a task using human computation through interfaces like Amazon Mechanical Turk than to build a robot to do the task! The Web and the Semantic Web are attempts to build an architecture for a new kind of intelligence in which humans are components of a larger intelligent system.
  17. Otto and the Extended Mind. Extended Mind Thesis: Otto suffers from Alzheimer's disease, and like many Alzheimer's patients, he relies on information in the environment to help structure his life. Otto carries a notebook around with him everywhere he goes. When he learns new information, he writes it down. When he needs some old information, he looks it up. Andy Clark and David Chalmers, Analysis 58: 10-23 (1998). What is the difference between this and an act of memory? The mind is therefore extended into the environment. What if there are other people in the environment? Or knowledge representations? Imagine Otto has a knowledge representation-driven Semantic Web system helping him, or del.icio.us-generated bookmarks. The evolution of digital technology is the logic of humans overcoming their own biological heritage. Far from replacing or replicating human intelligence, digital technology fundamentally complements human intelligence.
  18. To Collective Intelligence. When one eliminates the human subject as a transcendental individual, one opens the door for new collective subjects, subjects composed of what were formerly thought of as individuals. When this collectivity is put into practice, not as an abstract or utopian notion, but in confrontation with the concrete and practical problems that threaten these new subjects, collective intelligence is born. The Web and its development into the Semantic Web make much more sense as "a common space in which we could all interact", a technological and universalizing system of information that has the possibility of creating universal collective intelligence. "To a computer, then, the web is a flat, boring world devoid of meaning... This is a pity, as in fact documents on the web describe real objects and imaginary concepts, and give particular relationships between them... Adding semantics to the web involves two things: allowing documents which have information in machine-readable forms, and allowing links to be created with relationship values." Tim Berners-Lee, 1994
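
The pheromone-trail mechanism from slide 12 (individuals with limited memory using the environment as external memory) can be sketched as a toy simulation. This is a minimal illustration, not anything from the talk: the trail names, deposit rule, and evaporation rate are all assumptions. Ants choose between two trails in proportion to pheromone level, deposit pheromone in inverse proportion to trail length, and pheromone evaporates; positive feedback lets the colony converge on the shorter trail even though no individual ant compares the trails.

```python
import random

def simulate(steps=5000, evaporation=0.01, seed=0):
    """Toy stigmergy model (illustrative parameters): two trails between
    nest and food, ants pick a trail in proportion to its pheromone,
    deposit pheromone inversely to trail length, and pheromone evaporates."""
    random.seed(seed)
    pheromone = {"short": 1.0, "long": 1.0}   # environment as external memory
    length = {"short": 1.0, "long": 2.0}      # the long trail costs twice as much
    for _ in range(steps):
        total = pheromone["short"] + pheromone["long"]
        # each ant chooses probabilistically, weighted by pheromone level
        trail = "short" if random.random() < pheromone["short"] / total else "long"
        # shorter trail gets more pheromone per unit time (more round trips)
        pheromone[trail] += 1.0 / length[trail]
        # evaporation acts as collective forgetting, so old trails decay
        for t in pheromone:
            pheromone[t] *= 1.0 - evaporation
    return pheromone

p = simulate()
print(p["short"] > p["long"])  # the colony converges on the shorter trail
```

The intelligence here lives in neither ant nor trail alone: remove the evaporation (the environment's "forgetting") and the colony can lock onto whichever trail was reinforced first.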
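
Berners-Lee's "links to be created with relationship values" (slide 18) can be illustrated as subject-relationship-object triples, the data model the Semantic Web later standardized as RDF. The following is a minimal sketch under that reading; the URIs, data, and helper function are invented for illustration and are not from the talk.

```python
# A link with a relationship value is a (subject, relationship, object)
# triple, rather than an untyped hyperlink. All URIs below are illustrative.
triples = [
    ("http://example.org/fig-tree", "pollinatedBy", "http://example.org/blastophaga"),
    ("http://example.org/blastophaga", "livesIn", "http://example.org/fig-tree"),
    ("http://example.org/halpin", "authorOf", "http://example.org/ai-vs-ci"),
]

def objects(subject, relationship):
    """Follow only the links from a subject that carry a given relationship value."""
    return [o for s, r, o in triples if s == subject and r == relationship]

# A machine can now ask a question an untyped link cannot express:
# not "what does this page link to?" but "what pollinates the fig tree?"
print(objects("http://example.org/fig-tree", "pollinatedBy"))
```

The design point is that the relationship value moves meaning from the human reader into the link itself, which is what lets documents become machine-readable in the sense the quote describes.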