14-HumanMachines.ppt
Transcript

  • 1. CSc 375 SOCIAL ISSUES IN COMPUTING. Department of Computer Science, City College of New York, Spring 2006. Copyright © 2006 by Abbe Mowshowitz
  • 2. TOPIC 14. HUMAN MACHINES
    • Intellectual Challenge to Human Identity
    1. “Man is nothing but a meat machine.” - Marvin Minsky
  • 3. HUMAN MACHINES
    • Intellectual Challenge to Human Identity
    • 2. Strong form of Church’s Thesis:
    Any effective procedure can be programmed on a computer. AND Everything humans can do can be expressed as effective procedures.
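    To make the first clause concrete, here is a minimal sketch (a hypothetical example, not from the slides): Euclid's greatest-common-divisor algorithm is an effective procedure, a finite mechanical recipe, and it translates directly into a short program.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, mechanical procedure expressed as a program."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b) until the remainder is 0
    return a

print(gcd(48, 36))  # 12
```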
  • 4. HUMAN MACHINES
    • A. Intellectual Challenge to Human Identity
    • 3. Turing Test (Alan Turing, "Computing Machinery and Intelligence," 1950): a reformulation of the question "Can machines think?"
  • 5. HUMAN MACHINES
    • A.3. Turing Test
    • Based on the “Imitation Game” in which
    • A human interrogator converses by means of a teletypewriter with two unidentified respondents
    • - one a machine
    • - one a person
    • Object: distinguish between person and machine
  • 6. HUMAN MACHINES
    • A.3. Turing Test
    If, in repeated trials with different human subjects, the interrogator cannot distinguish the machine from the person with better than 50% accuracy, the machine is said to simulate human intelligence.
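    A minimal sketch of this 50% criterion follows; the helper name and the trial outcomes are invented for illustration and are not part of the original test protocol.

```python
def simulates_intelligence(correct_identifications: list[bool]) -> bool:
    """True if the interrogator does no better than chance (at most 50% accuracy)."""
    accuracy = sum(correct_identifications) / len(correct_identifications)
    return accuracy <= 0.5

# Hypothetical outcomes of 10 trials (True = interrogator correctly identified the machine).
trials = [True, False, False, True, False, True, False, False, True, False]
print(simulates_intelligence(trials))  # accuracy 0.4 -> True
```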
  • 7. Japanese scientists have unveiled a "female" android called Repliee Q1 (Source: BBC News, July 28, 2005). "Professor Hiroshi Ishiguro of Osaka University says one day robots could fool us into believing they are human."
  • 8. HUMAN MACHINES
    • A. Intellectual Challenge to Human Identity
    • 4. Information processing model (Simon and Newell, 1964)
    • a. Science of information processing (IP) can be independent of particular IP mechanisms
    • b. Thinking can be explained in IP terms
    • c. IP theories can be formulated in programming languages and simulated on a computer
  • 9. HUMAN MACHINES
    • B. Practical Challenge
    • Simon’s continuum of behavior:
    • Capabilities of computer applications have been moving steadily toward the non-programmed end since the 1960s
    PROGRAMMED <----------> NON-PROGRAMMED
  • 10. HUMAN MACHINES
    • C. Historical Perspective
      • Earth pushed from center stage by the heliocentric theory of Copernicus
      • Humans linked to apes by Darwin
      • Denied conscious control by Freud
      • Now AI would turn us into machines without free will
  • 11. HUMAN MACHINES
    • D. Challenge Posed by AI
    • 1. Until the 1980s, AI was an obscure academic discipline
    • 2. Support for research came mostly from government (DoD in the US)
        • Intelligent command and control
        • Guidance for automatic weapons
        • Spying, reconnaissance
  • 12. HUMAN MACHINES
    • D.2. Support
    • Spying/reconnaissance calls for:
        • Natural language processing
        • Speech recognition
        • Image processing
      • Examples: unmanned armored tank; automated co-pilot
  • 13. HUMAN MACHINES
    • D. Challenge Posed by AI
    • 4. AI in the spotlight
        • Japan Fifth Generation Computer Project
          • Announced 1980 by MITI
          • Projected as a 10-year, $500 million program
          • Proposed to develop revolutionary computer systems incorporating AI concepts: problem solving functions, intelligent interfaces, inference & knowledge-based functions
  • 14. HUMAN MACHINES
    • D.4. AI in the spotlight
        • U.S. answer to Fifth Generation Project
          • DARPA’s Strategic Computing Program
          • $600 million in funding
          • Envisioned developing AI systems and advanced computer technology
  • 15. HUMAN MACHINES
    • D. Commercial Developments in AI
    • 1. Emergence of start-up firms marketing expert systems with
        • Knowledge base containing rules of thumb
        • Inference engine that makes decisions based on rules in a specialized domain
    • Major challenge: capturing the knowledge of a human expert and embedding it in the system (knowledge engineering); see the sketch below
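    A minimal sketch of this rule-plus-inference-engine architecture, using invented credit-authorization rules of thumb (loosely echoing the "authorizer's assistant" example on the next slide); the rule contents and function names are assumptions for illustration, not any real system's knowledge base.

```python
# Knowledge base: rule-of-thumb (condition set -> conclusion) pairs. Contents are invented.
RULES = [
    ({"over_limit", "good_history"}, "refer_to_human"),
    ({"over_limit", "bad_history"}, "decline"),
    ({"within_limit"}, "approve"),
]

def infer(facts: set[str]) -> set[str]:
    """Inference engine: forward chaining, firing any rule whose conditions all hold."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known - facts  # conclusions derived from the initial facts

print(infer({"over_limit", "good_history"}))  # {'refer_to_human'}
```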
  • 16. HUMAN MACHINES
    • D. Commercial Developments in AI
    • 2. Expert system applications are now widespread (e.g., authorizer’s assistant)
    • 3. Motivation: reduce labor costs, provide consistent performance
  • 17. HUMAN MACHINES
    • D. Differences in Cognitive Capabilities of Humans and Machines?
    • 1. Affirmative answer calls for rejection of Church’s Thesis (e.g., Dreyfus’ argument based on need for a body)
    • 2. What cannot be done today may be realized tomorrow.
  • 18. HUMAN MACHINES
    • E. Ethical Implications of AI
    • 1. Robots and ‘ethical brakes’
    • (e.g., Asimov’s three laws of robotics)
    • 2. Are there areas of application that should be off limits to machines?
        • Judgment in criminal cases?
        • Psychotherapy?
  • 19. HUMAN MACHINES
    • F. Literary Perspectives
    • 1. Background
        • Early post-WWII period
        • Stimulated by the first computers
        • Themes echo those in science fiction and futuristic literature
  • 20. HUMAN MACHINES
    • F.2. Central Figures
        • Robots
          • Capek’s R.U.R. ( Rossum’s Universal Robots ), 1923
          • Czech word “robota” means work
          • Robots limited in action by design
          • Usually obedient servants
  • 21. HUMAN MACHINES
    • F.2. Central Figures
        • Androids (or humanoids): living beings not created by human birth
        • Computers: based loosely on contemporary machines
        • Hybrids (cyborgs): mixed organic and computer components
    • Distinctions hard to maintain in practice.
  • 22. HUMAN MACHINES
    • F.3. Major Themes
        • Dehumanization
          • Over-reliance on machines
          • Disregard of human needs
          • Excessive standardization and inflexibility
        • Identity crisis
          • Superfluity (nothing to do)
          • Myth of regeneration
  • 23. HUMAN MACHINES
    • F.3. Major Themes
        • Persistence of human impulses (especially in negative utopias)
        • Unanticipated consequences (Pandora’s Box)
        • Knowledge and power (cosmic mind, computer as God)
          • Forbidden knowledge and the Word of God
          • Sorcerer’s apprentice, Golem
          • Computers as adjunct to social power
  • 24. HUMAN MACHINES
    • F.3. Major Themes
        • Partnership
          • Human-robot interaction
          • Synthesis of man and machine into higher entity
  • 25. HUMAN MACHINES
    • F. Literary Perspectives
    • 4. Views and attitudes
        • Ambivalence towards technology
        • Machine take-over of human functions
        • Anxiety over role of humans in society
        • Fear of diminished human worth and dignity