Designing cyborgs

Presentation for Hybrid Days, making the point that we are part of technologies rather than them being part of us, so our technologies (at least the softer and collective ones) are cyborgs.

  • Technologies, as W. Brian Arthur observes, involve the orchestration of phenomena to some use. Sometimes that orchestration is pre-decided and embedded in the technology itself, built into the fabric of physical machines, software, rules and laws: factories, legal systems, and railways, for example. These I describe as hard technologies. Sometimes the orchestration is more like a jazz improvisation that we, to a greater or lesser extent, make up as we go along: language, computers, wikis and screwdrivers, for example, can be used for myriad purposes and orchestrate myriad phenomena. These I describe as soft technologies.
    Softness and hardness depend upon your point of view and the context in which you view the technology: a computer is soft for a programmer but hard for the user of an ATM, for example. Neither is suitable in all cases. Soft technologies give creativity and flexibility, but at the cost of effort and active invention: soft approaches to washing clothes, for example, take a day to do what a washer-dryer does in a few minutes. Hard technologies give efficiency, speed, and freedom from error, but at the cost of creativity and flexibility. Soft is hard; hard is easy. Ideally, we should be able to choose how soft or hard our technologies are, depending on our contextual needs.
    All technologies are assemblies, and almost all, in a given context, contain a mixture of soft and hard. Humans are not just users of soft technologies but are a part of them: they are the orchestrators who play an active and continuing role in their creation. In a sense, therefore, almost all technologies are cybernetic organisms, cyborgs: a blend of human and machine.
    Meanwhile, through applications of collective intelligence such as Google Search's PageRank, tag clouds, wikis, or collaborative filters that recommend things to us based on the actions of the crowd, the actions of people are combined with algorithms to create a single actor, an entity that does work and that influences how individuals behave. Thus we are parts of machines not only as individuals but also as collectives. But, depending on algorithms and interaction designs, collectives are as likely to embody stupid mobs as wise crowds.
    This cyborg amalgam of human and technology is as old as humanity itself: from the moment we began to talk or to extract grubs with sticks, we have been a part of technology and it has been a part of us. However, thanks to the efflorescence of adjacent possibilities that emerge with each new technology, the technological assemblies of which we are a part inevitably become more complex and interconnected at an ever-increasing speed. Our potential for good and our potential for bad increase at the same rate, and the ways we exist as humans can become more machine or more human. Unless we are in control of the softness and hardness of our cyborg bodies, and unless we design the collectives of which we are a part to be wise, we can become either cogs in a machine or victims of emergence, the unconscious and unwilling creators of our own nightmares. In this conference we will explore ways to enable our inevitable and unstoppable co-evolution as cybernetic organisms that enhance and enable control of our own destinies.
  • A useful fact to remember.
  • Examples of collectives (note that not all involve computers):
    recommender systems
    voting in elections
    crowds gathering on the street
    social navigation and stigmergy
    termites
  • Collectives in the sense of the Borg: a set of linked agents that act, in some ways, as a single entity.
  • A closer look:
    e.g. a crowd: see another crowd - if there is a crowd, join it - be part of the crowd - others join
    e.g. a tag cloud: collect tags for the group - aggregate and normalise weights - weed out the old ones - display in different font sizes
    e.g. a collaborative filter: collect user behaviour around resources - identify similarities - display recommendations
    Note the feedback loop: the output may be mined by a different crowd or by individuals, but it is most interesting when it affects the original crowd.
    Note that the collective is an entity: it may itself be part of the crowd.
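  • The tag cloud pipeline in this note (collect - aggregate - normalise - display) can be sketched in a few lines of Python. This is only an illustration: the function name, tags, and font-size range are hypothetical, not from the talk.

```python
from collections import Counter

def tag_cloud(tags, min_size=10, max_size=32):
    """Collect tags, aggregate counts, normalise weights to font sizes."""
    counts = Counter(tags)                      # aggregate the crowd's tags
    lo, hi = min(counts.values()), max(counts.values())
    span = (hi - lo) or 1                       # avoid division by zero
    return {
        tag: min_size + (n - lo) * (max_size - min_size) // span
        for tag, n in counts.items()            # normalise to the size range
    }

# The most-used tag gets the largest font, which shapes what the crowd
# sees next: the feedback loop described above.
cloud = tag_cloud(["cats", "cats", "cats", "dogs", "dogs", "fish"])
```

    The display step (font size) is what closes the loop back to the original crowd: bigger tags are more visible and therefore more likely to be reused.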
  • Flickr tags, 2 years apart: almost the same.
    But note temporal parcellation (hot tags).
    The rich get richer while the poor get poorer.
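  • The rich-get-richer dynamic (the Matthew Effect) can be illustrated with a toy preferential-attachment simulation, where each new tagging act picks an existing tag with probability proportional to its current count. The tag names and counts are hypothetical, purely for illustration.

```python
import random

def simulate_tagging(steps, seed=1):
    """Preferential attachment: popular tags are proportionally more
    likely to be picked again, so early leads tend to snowball."""
    rng = random.Random(seed)
    counts = {"sunset": 1, "beach": 1, "wedding": 1}
    for _ in range(steps):
        tags = list(counts)
        weights = [counts[t] for t in tags]   # weight by current popularity
        counts[rng.choices(tags, weights=weights)[0]] += 1
    return counts

# After many steps the distribution is typically highly skewed, which is
# one reason a tag cloud can look almost unchanged years later.
final = simulate_tagging(1000)
```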
  • Simpy and del.icio.us: note the enormous similarity of the tags, even down to the weighting.
    Actually, Simpy users are often disaffected del.icio.us users, but even so: uncanny.
    Insufficient parcellation.
  • Still a big issue: these are Flickr's top tags from the last 24 hours. They seem to be spam ads; note the different user names, and the bizarre sponsored results!
  • Uriah Heep: ‘an internal yes-man’
  • How much do I want to share? The need for fine-grained control over what I reveal of myself, to whom, and when.
  • Technological...
    RSS
    JSON
    RLOs
    ELF/OKI
    Learn from the learners.
    Prepare for change: it will happen - use it.
    Signposts, not fenceposts.
    Deferred design; exaptations; replication and variation; death as a teacher.
    Scale; islands and isthmuses.
    Soft and hard security: technical reliability, controllable access, transparency, identity.
    Awareness of others; multiple channels; dialogue.
    What shape is a learning environment? What rules? Meaningful signposts.
    Self-organising: it is only a learning environment if it contains motivated learners. The large and slow-moving provide constraint.
    Everything and everyone connected.
    Multiple scales: the large and slow influence the small and fast. Hierarchies.
  • A screwdriver is a single tool, but it can be many technologies. This is an important distinction.
  • The same tool can be many technologies: a screwdriver is a different technology when used to stir paint than when used to tighten a screw. To a computer programmer, a sales terminal is a soft device that can become whatever he or she wants it to be; for a sales assistant, it is a hard technology that forces one kind of behaviour (‘the computer says no’). The same tool orchestrates different phenomena for different purposes.
  • Ursula Franklin notes that there are technologies of prayer as much as technologies of electronics and metallurgy. It ain't what you do, it's the way that you do it. Laws and legal systems are technologies.
  • And that is how they evolve: not so much by adaptation and new innovation (though that happens) but mainly by mixing and mashing. So technologies are made of technologies; again, which level you look at depends on your point of view. A transport system is a technology, as is a car, as are the rules of the road, as is a car radio. All are parts of a whole.
  • Soft technologies are flexible, allowing people to be creative. Paintbrushes and paint are soft technologies that can be used in many ways.
  • Hard technologies reduce choices: the orchestration of phenomena is embedded in rules, laws, physical parts, and so on - e.g. automatic vs manual transmission.
  • Hard technologies have their processes embedded, whether in laws, rules, software or hardware. Notably, LMSs embed implicit pedagogies. Hard technologies tell us what to do; they reduce choices. That makes things easy, as well as reliable, fast, and free from error.
  • The orchestration is part of the technology, so a hard technology is complete.
  • By which I mean that soft technologies are more difficult (and more unreliable, and slower). We have to invent social technologies and to literally be a part of them. Softer technologies increase the adjacent possible by enabling new choices, and/or making them more likely. They enable creativity. More choices come at a price: we have to make them. That is one thing that makes soft technologies difficult, or hard.
  • We have to find ways to use soft technologies: without the parts we add, they are not technologies at all, just tools waiting for something to happen.
  • Because many different things can happen, we can orchestrate phenomena in many ways, so soft technologies are flexible.
  • People do. It's a dance, and we are the partners to our technologies.
  • The general principles of softening involve making things adaptable, using signposts rather than fence posts, opening up new uses and, above all, aggregating: adding new technologies to increase the adjacent possible. These may involve automation but, if so, not at the cost of previous capacities.
    Hardening typically involves automating things that were formerly manual, but not just automation per se: it has to replace something softer. Automation that forces a particular way of doing things is hard. Filtering means removing possibilities (a good example: adaptive systems that show only what they judge relevant, rather than suggesting possible alternatives or highlighting things of value). Hard technologies explicitly limit choices.
  • Mashing up is the most effective method of making systems as hard or as soft as needed.

    1. Designing Cyborgs. Jon Dron, Technology Enhanced Knowledge Research Institute (TEKRI), Athabasca University. Hybrid Days, 2011
    2. We are here
    3. -40 is the same in Fahrenheit and Celsius
    4. cyborg: “creatures simultaneously animal and machine, who populate worlds ambiguously natural and crafted” (Donna Haraway, A Cyborg Manifesto)
    5. everything you need to know about this presentation, for the time-poor: (most) humans are not part-technology; (most) technology is part-human (literally, not metaphorically)
    6. Collective cyborgs
    7. The Collective
    8. The Collective
    9. The Collective: information gathering, information processing, information presentation
    10. Kinds of collective: direct, mediated, stigmergic (sign-based, sematectonic)
    11. Kinds of collective: direct, mediated, stigmergic (sign-based, sematectonic)
    12. Collective types: direct (e.g. flocks, shoals, herds, crowds, 2nd Life); stigmergic (sematectonic, e.g. ant nest tidying; sign-based, e.g. termites, ant trails, money markets, Wikipedia edits); mediated (e.g. tag clouds, Google Search, reputation systems, rating systems, collaborative filters)
    13. Control in social systems: individual control (ownership, autonomy); negotiated control (collaboration, dialogue); collective control (cooperation, sharing); publisher control (hierarchies, structure)
    14. but
    15. wise crowd or stupid mob?
    16. the Matthew Effect: flickr tags over 2 years apart
    17. the Matthew Effect: flickr tags over 2 years apart
    18. savannas
    19. the problem of evil
    20. confirmation bias and filter bubbles http://en.wikipedia.org/wiki/File:Fred_Barnard07.jpg
    21. trust and privacy
    22. Effective collectives: 1. adaptability; 2. stigmergy; 3. evolvability; 4. parcellation; 5. trust; 6. sociability; 7. constraint; 8. context; 9. connectivity; 10. scale. Dron, J. (2007). Control and Constraint in E-Learning: Choosing When to Choose. Hershey, PA: Idea Group International.
    23. Soft and hard technologies
    24. technology • screwdriver • paint stirrer • murder weapon • tin opener • back scratcher • door-stop • chisel • beard curler • awl • iPad stand • sink unblocker • door handle • tyre lever • plant aerator • paint scraper • etc.
    25. It depends on your point of view
    26. technology: “the orchestration of phenomena for some use” (W. Brian Arthur). Arthur, W. B. (2009). The Nature of Technology: what it is and how it evolves. New York, USA: Free Press.
    27. technology http://upload.wikimedia.org/wikipedia/commons/6/65/Shint%C5%8D_prayer.jpg
    28. All technologies are assemblies http://upload.wikimedia.org/wikipedia/commons/a/a1/Heath_Robinson_WWI.png By W. Heath Robinson, via Wikimedia Commons
    29. Soft technologies: active orchestration of phenomena by people
    30. Hard technologies: orchestration of phenomena embedded in the technology
    31. Hard is easy
    32. Hard is complete http://commons.wikimedia.org/wiki/File:Linkware.jpg
    33. Hard is brittle
    34. Soft is hard
    35. Soft is incomplete http://commons.wikimedia.org/wiki/File:Jigsaw.svg
    36. Soft is flexible
    37. what makes a soft technology complete
    38. Design patterns: soft (adapt, aggregate, recommend, extend); hard (automate, replace, filter, limit)
    39. Artificial apes. Our technologies are not just reflections of us or things that we use. They are, in part or in whole, made of us
    40. Good cyborg/bad cyborg • humans are part of technologies and humans are in control • humans are part of technologies and technologies are in control
    41. some danger signs that a technology is too hard • rules that cannot be broken • easy paths • ‘the computer says no’
    42. some danger signs that a technology is too soft • repetition of boring tasks • the need for skill • complexity and puzzlement
    43. the holy grail: not too hard, not too soft, just right
    44. Assembly
    45. http://jondron.athabascau.ca • jond@athabascau.ca
