De Liddo & Buckingham Shum, JURIX 2012
Presentation Transcript

  • JURIX 2012 Workshop on Argumentation Technology for Policy Deliberations: A Collective Intelligence Tool for Evidence-Based Policy & Practice. Anna De Liddo & Simon Buckingham Shum, Knowledge Media Institute, The Open University, Milton Keynes, MK7 6AA, UK. http://evidence-hub.net/
  • Human-Centred Computing for CCI (Argumentation-based Collective Intelligence). [Diagram: strands of work spanning Practice (CoPHV), Policy (Rcyp Hub) and Research (OLnet, Ed Future); human dynamics of engagement and UX design; computational services including Cohere free semantics, NLP/XIP discourse analysis and analytics; the EH simplified model, structure widget, threaded interface, searches and agents.]
  • Collective Intelligence: How do we Crowdsource Policy Deliberation? Collective intelligence is an umbrella term for the augmented capabilities that the existence of a community can enable. It refers to the intelligence that emerges from the coexistence of multiple people in the same environment, which can be either a real-world environment or a virtual one. We look at the Web and at what intelligent hints, supports, and behaviors can be tracked and can emerge from the coexistence of a collective of people online.
  • Collective Intelligence. Successful CI tools have been developed, especially in the IT business and e-commerce sectors, that use user traces to build user profiles and suggest actions based on those profiles. These CI tools support quite simple user objectives, such as deciding what book to buy (Amazon), finding the picture or video that matches a need (Flickr, YouTube), or deciding what music to listen to (Last.fm). Collecting fragmented user traces works well for collecting and exploiting CI in the business and commerce sectors.
  • Contested Collective Intelligence. On the other hand, if we look at the social and political sector, or at higher-level organizational and business strategy issues, we need to support more complex user goals, such as: understanding policy actions; learning environmental responses in order to adapt organizational actions; understanding the economic crisis and its possible implications for the community; etc. CI tools that aim to support users in more complex knowledge work need to be conceived and designed so that users' collective intelligence can be captured and shared in a much richer and more explicit way.
  • Contested Collective Intelligence. In the design space of CI systems, where there is insufficient data to confidently compute an answer, where there is ambiguity about the trustworthiness of environmental signals, and where there is uncertainty about the impact of actions, more powerful scaffolding for thinking and discourse is required to support the emergence of CI around complex socio-political dilemmas.
  • First Prototype Tool for Contested Collective Intelligence (CCI). With Cohere, users can make their thinking visible and sharable with online communities by: ✓ collaboratively annotating the Web, ✓ leveraging lists of annotations into meaningful knowledge maps, and ✓ engaging in structured online discussions.
  • Cohere Conceptual Model. Cohere builds on a conceptual model which consists of four main user activities through which users can make their thinking visible and contribute to the development of Collective Intelligence around specific issues:
  • Collaboratively Annotate Web Resources
  • Make Semantic Connections
  • Explore, Filter and Make Sense. Watch the demo video at: http://www.youtube.com/watch?v=Fcn2ab9PYo4. Watch the Open Deliberation model video at: http://www.youtube.com/watch?v=vthygbKA2Mg
  • Despite the success of the web annotation paradigm… people seem to struggle to make semantic connections; moreover, too many semantics often produce redundancy and duplication. This led to the second design iteration: a new simplified data model and a new interface for connection making.
  • Experimenting with CCI in a real case of Educational Policy: The OLnet project. The issue: lack of evidence of OER effectiveness. olnet.org
  • OLnet project: The wider research question. RQ: How can we help researchers and practitioners in the OER field to contribute evidence of OER effectiveness and to investigate that evidence collaboratively?
  • Approach: Contested Collective Intelligence. Our approach to CI focuses on capturing the hidden knowledge of the OER movement and leveraging it so that it can be: ✓ debated (building and confronting arguments), ✓ evaluated (assessing evidence), and ✓ put to use (distilling claims to inform OER policy and practice).
  • What & Why: The Evidence Hub provides ✓ the OER community with a space to harvest evidence of OER effectiveness, and ✓ policy makers with a community-generated knowledge base for making evidence-based decisions on educational policy.
  • The Evidence Hub: Mapping the Social and Discourse Ecosystem. Social Ecosystem: People (Contributors), Projects, Organizations. Discourse Ecosystem: Key Challenges (Themes), Issues, Solutions, Claims, Evidence, Resources. olnet.org
  • Social Ecosystems (Org and Projects)
  • The Discourse Ecosystem Elements: A Simplified Data Model
  • Evidence Types: From simple data, anecdotes and stories to literature analysis and experimental results.
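  The simplified data model above can be sketched in code. This is a hypothetical illustration of the idea (typed discourse nodes linked by typed connections), not the Evidence Hub's actual schema; all names below are assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each contribution is a typed node, and nodes are
# linked by typed connections. NODE_TYPES and EVIDENCE_TYPES paraphrase
# the element lists on the slides; the class names are illustrative.

NODE_TYPES = {"Key Challenge", "Issue", "Solution", "Claim", "Evidence", "Resource"}

# Evidence spans a spectrum of rigor, as the slide notes.
EVIDENCE_TYPES = ["data", "anecdote", "story", "literature analysis", "experimental result"]

@dataclass
class Node:
    node_type: str                                # one of NODE_TYPES
    text: str
    themes: list = field(default_factory=list)    # e.g. OER themes

@dataclass
class Connection:
    source: Node
    label: str                                    # e.g. "supports", "challenges"
    target: Node

# Usage: link a piece of evidence to the claim it supports.
claim = Node("Claim", "OER improves informal learners' outcomes")
evidence = Node("Evidence", "Pre/post test gains in an open course pilot")
link = Connection(evidence, "supports", claim)
```

  Keeping the connection labels as plain strings mirrors the deck's design lesson: a small, simple vocabulary is easier for contributors than a rich semantic one.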
  • CI: Community Actions. Social Ecosystem elements (People/Contributors, Projects, Organizations) and Discourse Ecosystem elements (Key Challenges, Issues, Solutions, Claims, Evidence, Resources) support five community actions: Locating (adding geo-location info), Following (expressing interest), Connecting (adding to the widget), Promoting (voting on connections, ordering/prioritizing lists), and Theming (adding OER themes).
  • A simplified UI for connection making….
  • Widget Interface for Connection Making
  • The Evidence Hub: Some Facts and Figures. The Evidence Hub alpha version launched in April 2011, with 50 users from 35 different countries, including key OER people.
  • Some Facts and FiguresOpened to the public at OpenEd11 in Utah. olnet.org
  • Some facts and figures: Engagement. 108 contributors; 3,054 visits from 1,053 unique visitors from 57 different countries. olnet.org
  • Some facts and figures on Content: 304 OER projects and organizations; 129 OER research claims; 79 OER issues; 89 proposed solutions; 323 evidence items; and 553 resources. olnet.org
  • Reflection on Initial User Testing & Interviews. Feedback from users shows that the EH is perceived as "relevant", "organized", "desirable" and "engaging", but sometimes "sophisticated" and "complex". Next steps: improving the user experience by creating summary views, facilitating and simplifying content seeding, and providing better displays and filters on the content. olnet.org
  • Feedback from Lab-Based User Testing. Fragmented approach to argument construction (widget interface): easy to contribute to, but it increases miscategorization (interpretation biases on how content should be labeled under specific argumentation categories), increases duplication of content, and decreases argumentation coherence. This led to the third design iteration…
  • Third Design Iteration: The CoPHV Evidence Hub
  • Research by Children and Young People Evidence Hub: A mixed threaded/widget interface
  • Collective Intelligence Development Trajectories: Facilitating Content Seeding. 1) Web annotation to support seeding: an Evidence Hub bookmarklet allows people to capture evidence by highlighting and annotating free Web resources and OERs. olnet.org
  • 2) Combining Human and Machine Annotation: The Hewlett Grant Reports Project. [Figure: template report → results → XIP-annotated report.] De Liddo, A., Sándor, Á. and Buckingham Shum, S. (2012). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work (CSCW) Journal, 21(4), 417–448.
  • Discourse analysis with the Xerox Incremental Parser. Detection of salient sentences based on rhetorical markers. BACKGROUND KNOWLEDGE: "Recent studies indicate…", "…the previously proposed…", "…is universally accepted…". NOVELTY: "…new insights provide direct evidence…", "…we suggest a new approach…", "…results define a novel role…". OPEN QUESTION: "…little is known…", "…role…has been elusive", "Current data is insufficient…". CONTRASTING IDEAS: "…unorthodox view resolves…paradoxes", "In contrast with previous hypotheses…", "…inconsistent with past findings…". SIGNIFICANCE: "studies…have provided important advances", "Knowledge…is crucial for…understanding", "valuable information…from studies". SUMMARIZING: "The goal of this study…", "Here, we show…", "Altogether, our results…indicate…". GENERALIZING: "…emerging as a promising approach", "Our understanding…has grown exponentially…", "…growing recognition of the importance…". SURPRISE: "We have recently observed…surprisingly", "We have identified…unusual…", "The recent discovery…suggests…intriguing roles".
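  A minimal sketch of marker-based salient-sentence detection in the spirit of the slide. The marker lists are a small subset paraphrased from the slide; real XIP performs full incremental parsing, not regex matching, so everything here is an illustrative assumption.

```python
import re

# Hypothetical rhetorical-marker lexicon (subset of the slide's categories).
MARKERS = {
    "NOVELTY": [r"\bnew insights?\b", r"\bnovel role\b", r"\bsuggest a new\b"],
    "OPEN_QUESTION": [r"\blittle is known\b", r"\bdata is insufficient\b"],
    "CONTRAST": [r"\bin contrast with\b", r"\binconsistent with\b"],
    "SUMMARIZING": [r"\bgoal of this study\b", r"\bhere, we show\b"],
}

def tag_sentence(sentence):
    """Return the rhetorical categories whose markers the sentence contains."""
    found = []
    for category, patterns in MARKERS.items():
        if any(re.search(p, sentence, re.IGNORECASE) for p in patterns):
            found.append(category)
    return found

print(tag_sentence("In contrast with previous hypotheses, little is known about X."))
# → ['OPEN_QUESTION', 'CONTRAST']
```

  A sentence matching any category is treated as salient; sentences matching none are ignored, which is how marker-based filtering cuts a long report down to its argumentatively interesting sentences.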
  • XIP Annotations to Cohere. A XIP CONTRAST category converts into the semantic connection label "describes contrasting ideas in" (example, PROBLEM_CONTRAST_: "First, we discovered that there is no empirically based understanding of the challenges of using OER in K-12 settings."). A PROBLEM category converts into the annotation node icon ("Issue" = light bulb). Named entities extracted by XIP convert into tags. The annotation node is connected back to the report node.
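  The conversion step above can be sketched as a pair of lookup tables. The two mappings (CONTRAST → connection label, PROBLEM → "Issue"/light-bulb icon) come from the slide; the function and dictionary names, and the fallback values, are illustrative assumptions.

```python
# Hypothetical mapping tables from XIP output categories to Cohere elements.
XIP_TO_CONNECTION_LABEL = {
    "CONTRAST": "describes contrasting ideas in",   # from the slide
}

XIP_TO_NODE_ICON = {
    "PROBLEM": ("Issue", "light bulb"),             # from the slide
}

def xip_to_cohere(xip_label, sentence, entities):
    """Convert one XIP-annotated sentence into a Cohere-style annotation node.

    entities: named entities extracted by XIP, which become tags.
    Fallbacks ("Note", "is about") are assumptions for unmapped labels.
    """
    node_type, icon = XIP_TO_NODE_ICON.get(xip_label, ("Note", None))
    return {
        "type": node_type,
        "icon": icon,
        "text": sentence,
        "tags": list(entities),
        # label of the connection back to the source report node
        "connection_label": XIP_TO_CONNECTION_LABEL.get(xip_label, "is about"),
    }

node = xip_to_cohere(
    "PROBLEM",
    "There is no empirically based understanding of the challenges "
    "of using OER in K-12 settings.",
    ["OER", "K-12"],
)
```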
  • Human annotation and machine annotation. Report 1: ~19 sentences machine-annotated vs 22 human-annotated, 11 of which coincide with the human annotation, plus 2 consecutive sentences of human annotation. Report 2: 71 sentences machine-annotated vs 59 human-annotated, 42 of which coincide with the human annotation.
  • 3) Collaborative PDF Annotation. A high percentage of policy reports and documents are in PDF format: we have a concept demo of direct PDF annotation shared back to Cohere. Future developments could power the Evidence Hub with PDF annotation so that users can share evidence of policy arguments and impact while working directly with PDFs. Steve Pettifer, Utopia: http://getutopia.com/ olnet.org
  • Collective Intelligence Development Trajectories: 4) Better visualization and filtering of content; 5) Discourse analytics. When social and discourse elements become too many in number and complexity, how can we make sense of them? Toward CI visualization and analysis: adding more formal logics to evaluate arguments, and developing discourse analytics to create summaries, identify gaps, localize interests, and focus contributions.
  • Discourse Network Visualization. Watch the demo video at: http://www.youtube.com/watch?v=Fcn2ab9PYo4. Watch the Open Deliberation model video at: http://www.youtube.com/watch?v=vthygbKA2Mg
  • Social Network Visualization
  • Theoretical questions for future work: How to evaluate arguments? Automatic mechanisms (based on argument computation) vs community-led mechanisms (such as voting and reputation systems). How to make optimal use of both human and machine annotation and argumentation skills? How to exploit machine consistency while reducing information overload and noise? How to exploit the unique human capacities to abstract, filter for relevance, etc.? How to cope with visual complexity (new search interfaces, focused and structured network searches, collective filtering, identifying argument structures)? How do we crowdsource policy deliberation? What is the right interface? What is the architecture of participation?
  • References
  • De Liddo, A., Sándor, Á. and Buckingham Shum, S. (2012). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work (CSCW) Journal, 21(4), 417–448.
  • Buckingham Shum, Simon (2008). Cohere: Towards Web 2.0 Argumentation. In: Proc. COMMA'08: 2nd International Conference on Computational Models of Argument, 28–30 May 2008, Toulouse, France. Available at: http://oro.open.ac.uk/10421/
  • De Liddo, Anna and Buckingham Shum, Simon (2010). Cohere: A prototype for contested collective intelligence. In: ACM Computer Supported Cooperative Work (CSCW 2010), Workshop: Collective Intelligence in Organizations: Toward a Research Agenda, February 6–10, 2010, Savannah, Georgia, USA. Available at: http://oro.open.ac.uk/19554/
  • Buckingham Shum, Simon and De Liddo, Anna (2010). Collective intelligence for OER sustainability. In: OpenED2010: Seventh Annual Open Education Conference, 2–4 Nov 2010, Barcelona, Spain. Available at: http://oro.open.ac.uk/23352/
  • De Liddo, Anna (2010). From open content to open thinking. In: World Conference on Educational Multimedia, Hypermedia and Telecommunications (Ed-Media 2010), 29 Jun 2010, Toronto, Canada. Available at: http://oro.open.ac.uk/22283/
  • De Liddo, Anna and Alevizou, Panagiota (2010). A method and tool to support the analysis and enhance the understanding of peer-to-peer learning experiences. In: OpenED2010: Seventh Annual Open Education Conference, 2–4 Nov 2010, Barcelona, Spain. Available at: http://oro.open.ac.uk/23392/
  • Buckingham Shum, Simon (2007). Hypermedia Discourse: Contesting networks of ideas and arguments. In: Priss, U., Polovina, S. and Hill, R. (eds.) Conceptual Structures: Knowledge Architectures for Smart Applications. Berlin: Springer, pp. 29–44.

Thanks for Your Attention! Anna De Liddo, anna.deliddo@open.ac.uk, http://people.kmi.open.ac.uk/anna/