9worlds robots

Robots are no longer creatures of science fiction, nor even restricted to industrial and warfare contexts, but are moving into sensitive domestic worlds such as homes, hospitals and schools. How will laws about liability, privacy, evidence etc. apply in this brave new world? How do we avoid creating knee-jerk moral-panic laws which may restrict the value of robotics to society?

  • “When the robot is sad it hunches its shoulders forward and looks down. When it is happy, it raises its arms, looking for a hug… When frightened, it cowers and stays like that until it is soothed with gentle strokes on the head.” Mimics a one-year-old child. Potential 24-hour companion for kids in hospital.
  • NOW: industrial robots – good for dangerous, repetitive tasks, especially in hazardous environments, e.g. down mines, in oceans, in space.
  • NATO Hummingbird drone – very small. Issues of the ethics of cyber- and robo-war. Drone planes – removing the human from the loop to get rid of the “command” delay. Inability to distinguish civilian from military targets, child soldiers from adults, surrender from other actions, etc. Proportionality of total war vs “human” enemies – no fear of body bags to inhibit warfare.
  • Robot bomb defuser – cf. The Hurt Locker.
  • Paro has been found to reduce stress in patients and their caregivers. Paro stimulates interaction between patients and caregivers, has been shown to have a psychological effect on patients, improving their relaxation and motivation, and improves the socialisation of patients with each other and with caregivers. Since 2003: certified World’s Most Therapeutic Robot by Guinness World Records. Used with Alzheimer’s patients, and with kids etc. where real animals are not possible. Paro has five kinds of sensors – tactile, light, audition, temperature and posture – with which it can perceive people and its environment. With the light sensor, Paro can recognise light and dark. It feels being stroked or beaten via the tactile sensor, and being held via the posture sensor. With its audio sensor, Paro can also recognise the direction of a voice, and words such as its name, greetings and praise. Paro can learn to behave in a way the user prefers, and to respond to a new name. For example, if you stroke it every time you touch it, Paro will remember your previous action and try to repeat it in order to be stroked. If you hit it, Paro remembers its previous action and tries not to repeat it.
  • "We aim to realize mass marketing of cheap robots costing 100,000 yen to 200,000 yen (about $1,000 to $2,000), no matter whether they look like a typical humanoid robot," Kitashima said (2013). But see also Paro, and Kirobo – a robot companion for the ISS. The company also leases a powered exoskeleton suit called HAL, which assists wearers to walk and perform other activities, to a hospital in Kanagawa to help rehabilitation programmes. Some Japanese resistance to humanoid robots (BBC, 2010) – and not as useful: robot guides in hospitals withdrawn. Bed -> wheelchair transfers.
  • Google cars have clocked up 140,000 miles – not seen as legal to be allowed out on their own, although Arizona, Nevada and California have passed laws specifically legalising them. But also smart roads, sensor-equipped; smart chips in cars enabling surveillance and some control; etc. Also Stanford, Oxford. In June 2011 the Nevada state legislature passed Assembly Bill No. 511, which does two things. First, it allows the Nevada Department of Transportation to create rules and regulations regarding the use of self-driving cars, so that they can be used legally on the road. Second, it requires the Department to designate areas in which these vehicles can be tested. The cars in question use a wide array of sensors, GPS technology and a little help from an artificial intelligence program to function.
  • Roxxxy, Aug 2010, US! Snores and can discuss Man U football – more like a man than a woman… Douglas Hines apparently wants to charge $7,000 to $9,000 for Roxxxy and an attached laptop running its software. Read more: http://news.cnet.com/8301-17938_105-10432597-1.html#ixzz1QKDzjHke
  • Sales of professional/service domestic robots => 11.5m in 2011 (double the 2008 figure). Japan is the leader; South Korea is aiming at one robot per home by 2015. The US/West lag behind – cultural issues? Technophobia vs loving the alien. The “uncanny valley”.
  • Instead of Three Laws of Robotics, we have devised Five Ethics for Roboticists, which are aimed at researchers, designers, manufacturers, suppliers and maintainers of robots.
  • Serious work on roboethics – a term coined in 2004 by Gianmarco Veruggio, lead author of the European Robotics Research Network Roboethics Roadmap – is no fantasy. Several countries have drafted ethical or standards frameworks for robotics, notably South Korea and Japan, and there is a growing literature on autonomous robot weapons. However, much of it focuses on the safety, accountability and even morality of robots’ behaviour, and tends to be fairly aspirational, with few hard-line rules on which of the manufacturer, programmer, user, adapter, owner or operator of a robot is responsible if “something goes wrong”.
  • Omission vs commission. How far does learning go? Is there a difference between insecurity of robots and disclosure to third parties – marketers, scammers, governments? And the general privacy scene re disclosure, where data is centrally accumulated and processed without clear, explicit, informed consent?
  • Exemption of producers from liability: the producer is freed from all liability if he proves: that he did not put the product into circulation; that the defect causing the damage came into being after the product was put into circulation by him; that the product was not manufactured for profit-making sale; that the product was neither manufactured nor distributed in the course of his business; that the defect is due to compliance of the product with mandatory regulations issued by the public authorities; that the state of scientific and technical knowledge at the time when the product was put into circulation was not such as to enable the defect to be discovered (on this point, the Member States are permitted to take measures by way of derogation); or, in the case of a manufacturer of a component of the final product, that the defect is attributable to the design of the product or to the instructions given by the product manufacturer.
  • Is there any value here in trying to come up with rules for “robots” in general, or just for each robot as it arises? Cf. privacy – every robot which gathers data about the humans in its surroundings does have a similarity to other bugs. Industry codes are in fact the main regulatory mechanism currently – industrial robots, toys? Is this holding back domestic robotics in the West? E.g. the current regulator of new medical tech has not addressed the issue of robots yet (nor privacy in smart pacemakers).
  • A 2 mph speed limit, with a man 60 yards in front. A speed limit for driverless cars has already been suggested (2013). If we don’t regulate tech early, bad things happen and the public loses trust – cf. GM crops. What happens when the first care robot maims a child?

    1. Lilian Edwards, Professor of E-Governance, University of Strathclyde. Nine Worlds, August 2013
    2. FICTION: Terminator 2 (mp4); Robbie the Robot; Maria. REALITY: Asimo; Nao; ROXXY; Geminoid F
    3. Metropolis, 1927
    4. 1956 / 1965
    5. Nao, U Herts, 2010; ASIMO (Honda), 2002–
    6. PARO
    7. HELPER ROBOTS: Japan’s population is ageing rapidly, with over 22% aged 65 or older, overworked kids and few immigrant/low-paid carers. MOBISERV EU project, 2013. HAL exoskeleton.
    8. ROXXY, New Jersey, 2010; Ishiguro’s GEMINOID F, 2013
    9. • Human-intelligent robots / “Strong AI” • The “singularity” • Hence not: robots having “legal personality” (can sue, be accused of crimes) • Transhumanism (human mind into metal container) • Cyborgism (human mind/body enhanced by robotics) – or not very much!
    10. • Here-and-now robots: “about as intelligent as dishwashers”, or lobsters (Winfield) • Not necessarily or even often humanoid • Features: “intelligence”; autonomous action; learning and adaptation; embodiment (cf. Skynet, Google, Twitterbots/agents); mobility
    11. • Robots now moving into consumer, domestic and “caring” environments – not just industrial/military • Current/near-current legal issues – 5–10 years away • Not just about hypothetical morality, ethics or philosophy – real problems beginning • Privacy; liability; crime; evidence; road traffic law! • Ethics and social issues, e.g. underage sexbots, saving lives with driverless cars, leaving old people alone, environment, employment impact, robowar and unequal conflict/civilian and humanitarian impact… • More interesting than Skynet!
    12. • Robots as a legal category – general regulation? Legal analogies: • Person (legal personality) • Slave (lesser legal personality – cf. the Roman law of slavery to get round agency issues re bots) • Animal? (animate, sub-legal personality, unpredictable, cf. cats; some anthropomorphism) • Tool – machine – car – manufactured object – “product liability” for consumer safety • Fails to capture the aspect of learning/adaptivity/unpredictability • Or the “state of the art” defense • Software?
    13. • Problems caused? • *Liability for harm caused (who is responsible?) • *Privacy (is a care robot in the home or hospice 24/7 surveillance? Control and vulnerable people?) • Criminal law (can a driverless car lose its license? Can a robot surgeon or mining robot kill? Can a robot carer “give evidence” in court re a suspicious death?) • Humanitarian law (can a robot soldier break the laws of war? If a drone plane takes out civilians, can the US (say) be brought to the ICC?)
    14. • A robot may not injure a human being or, through inaction, allow a human being to come to harm. • A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law. • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
    15. Edwards’ Three Laws for Roboticists (from EPSRC Sandpit, 2010): 1. Robots are multi-use tools. Robots should not be designed solely or primarily to kill, except in the interests of national security. 2. Humans are responsible for the actions of robots. Robots should be designed and operated as far as is practicable to comply with existing laws and fundamental rights and freedoms, including privacy. 3. Robots are products. As such, they should be designed using processes which assure their safety and security (which does not exclude their having a reasonable capacity to safeguard their integrity).
    16. • Robots are manufactured artefacts, so they should not be designed in a deceptive way to exploit vulnerable users (“their machine nature should be transparent”) • It should always be possible to find out who is legally responsible for a robot (cf. the registered keeper of a car? The person who “signed” the contract to buy the robot?) • See Winfield, New Scientist, 9 May 2011
    17. • A military robot kills a civilian by mistake • A mining robot excavates the wrong area and a landslip results, damaging civilian houses • A Roomba trips up an old person, who hurts herself • A care robot fails to stop one child from hitting another (?) as not in its programmed remit, or to report an old person swallowing too many pills (not too few) • A sex robot “learns” one kind of behaviour from person A (e.g. caning) that causes harm to person B • A driverless car is hacked so that it has an accident, leading to economic/physical harm
    18. • Negligence. Issues: proof of fault, i.e. breach of a duty of care? Contractual exclusion of liability? • Who is liable – manufacturer, programmer, “trainer”, owner, leaser, user? • Product liability = strict liability for the manufacturer if defect => damage. US/EU differences. In the EU: state-of-the-art defense; does not currently apply if the defect causing the damage came into being after the product was put into circulation (learning); 3rd-party interference – what if, e.g., a robot car is hacked?
    19. • Animals: cf. PARO. Liability of the custodian (cf. user, or owner – not necessarily the same) = strict liability, or liability after some notice (“one bite”). Issues: are robots tame or wild? Bad analogy? Nature changes; harms very different. • Children: VERY divergent civil/common-law traditions etc. E.g. in Scots law there is no automatic responsibility of a parent for a child’s delicts. • Robots have no ability to reach “maturity”?
    20. • Contract: allocate liability by contract. Fair to consumers? The new Facebook T&C? The “small print” and “shrink wrap” problems. • An insurance market. Cf. cars – things we own that are very likely to hurt others or ourselves. Likely to arise for driverless cars anyway, but other robot classes? • Establishing actuarial risks is very difficult. Compulsory insurance of the owner? • We don’t require insurance for dishwashers – or even pets!
    21. • Is there value to considering robots as a special category for “rules”, cf. Asimov? • Is there value to considering robots as a special category even for liability? Not clear. • Are we worried about the growth of domestic surveillance by robots of our most vulnerable people? (cf. Google Glass – public!) • Lack of jurisdictional harmonisation of laws will be a huge issue: Japanese robot, software from the US, used in the EU? • Do we want THIS?
