
Discourse Centric Collective Intelligence for the Common Good



Slides of my invited talk given at the Computational Decision Making and Data Science Workshop in Belgrade, Serbia, in June 2018: http://cdmdsw2018.fon.bg.ac.rs/



  1. Discourse Centric Collective Intelligence for the Common Good. idea.kmi.open.ac.uk. Dr. Anna De Liddo, Research Fellow
  2. Anna De Liddo, Research Fellow; Lucia Lupi, PhD Student (Urban Informatics); Michelle Bachler, Senior Project Officer; Alberto Ardito, Web Developer; Retno Lasarti, PhD Student (Explainable AI)
  3. Research themes: Collective Intelligence; Online Deliberation; Human Dynamics of Engagement; Analytics & Visualization; Crowdsourcing ideas, arguments and facts; Structured Discourse and Argumentation; Political Communication; a new class of Online Deliberation tools; Contested Collective Intelligence for the Common Good (visual and argumentation-based CI); Urban Informatics; Social Innovation; Computational Services & Dialogic Agents
  4. Collective Intelligence: Aggregation Approach • CI generated by machine aggregation of networked but isolated human intelligence • A wider challenge or work task is parcelled into micro-tasks that are then allocated to a crowd • Crowds work in isolation and the system meaningfully aggregates contributions • Crowdsourcing, crowdfunding, prediction markets, ideation systems
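To see the aggregation pattern in miniature: below is a minimal sketch, assuming a simple labelling task and majority-vote aggregation. The function and task names are illustrative, not taken from any specific platform mentioned in these slides.

```python
from collections import Counter

def aggregate_microtasks(responses):
    """Majority-vote aggregation of answers collected in isolation.

    `responses` maps a micro-task id to the answers given by workers
    who never see each other's contributions.
    """
    results = {}
    for task_id, answers in responses.items():
        winner, votes = Counter(answers).most_common(1)[0]
        results[task_id] = {"answer": winner, "support": votes / len(answers)}
    return results

# A wider task (labelling a photo archive) parcelled into micro-tasks,
# each answered independently by three workers:
crowd_answers = {
    "img-01": ["cat", "cat", "dog"],
    "img-02": ["dog", "dog", "dog"],
}
print(aggregate_microtasks(crowd_answers))
# img-01 -> 'cat' with 2/3 support; img-02 -> 'dog' unanimously
```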
  5. Aggregation approaches to CI • do not require any group awareness or collective understanding of the problems at hand • do not support social interaction and communication • offer no improvement of users’ activity or personal learning. They are therefore less suitable: • to improve societal awareness and civic intelligence [De Liddo et al. 2012, Schuler et al. 2018] • when decision-makers need to share information and move toward consensual decisions [Romero et al. 2015].
  6. When tackling complex and contested problems: • there may not be one worldview, or one clear option • evidence can be ambiguous or of dubious reliability, requiring the construction of plausible, possibly competing narratives • growth in intelligence results from learning, which is socially constructed through different forms of discourse, such as dialogue and debate. Contested Collective Intelligence (De Liddo 2012)
  7. Contested Collective Intelligence: Co-Creation Approach • CI is generated by small to large-scale communities which work together, in mutual awareness and toward a collective goal • Enables sensemaking, reflection, idea revision and “change” of personal actions and understandings as a consequence of the activities of others • Supports learning cycles which lead to collective change and improvement.
  8. Collective Intelligence Spectrum. Model of Collective Intelligence (CI): from sensing the environment, to interpreting it, to generating good options, to taking decisions and coordinating action... [Figure: Collective Sensing → Collective Sensemaking → Collective Ideation → Collective Decision → Collective Action]
  9. Collective Intelligence Spectrum. Model of Collective Intelligence (CI): from sensing the environment, to interpreting it, to generating good options, to taking decisions and coordinating action... [Same spectrum figure as the previous slide.]
  10. Social media, community ideation and question-answering are proliferating on the Web
  11. Setting the Problem: no way to identify where ideas contrast • Poor debate: no tools to identify where ideas contrast, where people disagree and why... • Rewards popularity over critical thinking
  12. Flat listing of posts and no insight into the logical structure of ideas and arguments, such as the coherence or evidential basis of an argument.
  13. No support for idea refinement and improvement. These tools are increasingly used to support online debate and facilitate citizens’ engagement in policy and decision-making, yet they are fundamentally chronological views which offer no support for idea refinement and improvement. LINK to PETITION: http://www.change.org/en-GB/petitions/stand-against-russia-s-brutal-crackdown-on-gay-rights-urge-winter-olympics-2014-sponsors-to-condemn-anti-gay-laws
  14. No way to assess the quality of any given idea. LINK to QUORA: http://www.quora.com/Physics/Do-wormholes-always-have-black-holes-at-the-beginning#answers
  15. Setting the Problem • Poor debate: no tools to identify where ideas contrast, where people disagree and why • Poor idea evaluation: no mechanisms to identify, contribute and discuss the evidence for an idea • Poor summarization and visualization • Shallow contributions and cognitive clutter • Platform islands and balkanization. This hampers: • the quality of users’ participation • the quality of proposed ideas • effective assessment of the state of the debate.
  16. A new class of Collective Intelligence and Online Deliberation platforms that make the structure and status of a dialogue or debate visible. Coming from research on argumentation and CSAV, these tools make users’ lines of reasoning and (dis)agreements visually explicit: • Deliberatorium • Debategraph • Cohere • CoPe_it! • Problem&Proposals • YourView • The Evidence Hub
  17. A Common Data Model: simplified IBIS. IBIS (Issue-Based Information System) adds a simple semantic structure to the online conversation and has been demonstrated to be usable by lay people in different public debates (Iandoli et al. 2009, Klein 2012).
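To make the simplified IBIS model concrete, here is a minimal sketch in Python. The three node types (Issue, Idea, Argument) follow the shared IBIS vocabulary; the class names and the example content are illustrative assumptions, not the schema of any one platform listed above.

```python
from dataclasses import dataclass, field

@dataclass
class Argument:
    text: str
    stance: str          # "pro" (supports) or "con" (challenges)

@dataclass
class Idea:              # a position/answer responding to an Issue
    text: str
    arguments: list = field(default_factory=list)

@dataclass
class Issue:             # a question opened for deliberation
    question: str
    ideas: list = field(default_factory=list)

# One thread of a public debate, semantically structured instead of flat:
issue = Issue("How should the city reduce traffic congestion?")
idea = Idea("Introduce a congestion charge")
idea.arguments.append(Argument("A similar charge worked in London", "pro"))
idea.arguments.append(Argument("It penalises low-income commuters", "con"))
issue.ideas.append(idea)
```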
  18. Collective Applied Intelligence and Analytics for Social Innovation • Produced an ecosystem of collective intelligence tools that have been validated with 9 different social innovation (SI) communities
  19. Pain Point Prioritization of Common Social Media for deliberation-based social innovation • Very High: poor commitment to action; poor summarization; poor visualization • High: lack of participation; poor idea evaluation; shallow contributions • Moderate: cognitive clutter; lack of innovation • Minor: platform islands and balkanization; non-representative decisions
  20. Collective Intelligence Spectrum. Model of Collective Intelligence (CI): from sensing the environment, to interpreting it, to generating good options, to taking decisions and coordinating action... [Figure: Collective Sensing → Collective Sensemaking → Collective Ideation → Collective Decision → Collective Action]
  21. Collaborative Web Annotation and Knowledge Mapping: http://litemap.open.ac.uk
  22. Internationalization to English and German. Working with LiteMap: 1. Get the LiteMap bookmarklet 2. Harvest, annotate and classify contributions from the Utopia discussion forum 3. Connect and map out the key issues and arguments visually with LiteMap
  23. Available at: cidashboard.net
  24. Since its first launch in 2015, LiteMap has been used by over 2,000 users in 10 different countries and over 100 community groups, producing 560 maps, confirming an emerging public and educational impact. • For Local Area Coordinators in Leicester, LiteMap has proved to improve agency and promote digital skills • A Brazilian community of 1,300 teachers uses LiteMap to carry out collaborative work and coordinate online course activities, improving collaborative online learning and collective inquiry.
  25. Collective Intelligence Spectrum. Model of Collective Intelligence (CI): from sensing the environment, to interpreting it, to generating good options, to taking decisions and coordinating action... [Figure: Collective Sensing → Collective Sensemaking → Collective Ideation → Collective Decision → Collective Action]
  26. Structured Online Discussion and Argumentation-based Decision Making: debatehub.net. DebateHub is an online discussion tool which goes beyond simple commenting and facilitates activities such as collective ideation, structured debate, and collective decision making.
  27. • Facilitation features, such as merging, moving and splitting ideas, to avoid duplication and redundancy and to improve idea structuring • Analytics and visualizations to help sensemaking of the debate • A phased deliberation process (sketched below) in which online communities can alternate ideation, discussion and voting to support idea selection and decision making.
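A rough sketch of the phased process as a small state machine. The three phases (ideation, discussion, voting) come from the slide; the gating logic and the method names, including the merge facilitation step, are my own illustration, not DebateHub's actual API.

```python
PHASES = ["ideation", "discussion", "voting", "closed"]

class Deliberation:
    """Toy model of a community alternating through debate phases."""

    def __init__(self):
        self.phase = "ideation"
        self.ideas = []
        self.votes = {}

    def advance(self):
        # A facilitator moves the whole community to the next phase.
        self.phase = PHASES[PHASES.index(self.phase) + 1]

    def submit_idea(self, text):
        assert self.phase == "ideation", "new ideas only during ideation"
        self.ideas.append(text)

    def merge(self, a, b, merged_text):
        # Facilitation feature: merge duplicate ideas into one.
        self.ideas = [i for i in self.ideas if i not in (a, b)] + [merged_text]

    def vote(self, idea, weight=1):
        assert self.phase == "voting", "votes are cast only in the voting phase"
        self.votes[idea] = self.votes.get(idea, 0) + weight

d = Deliberation()
d.submit_idea("Car-free Sundays")
d.submit_idea("Free Sunday buses")
d.merge("Car-free Sundays", "Free Sunday buses", "Car-free Sundays with free buses")
d.advance()   # -> discussion
d.advance()   # -> voting
d.vote("Car-free Sundays with free buses")
```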
  28. Available at: cidashboard.net
  29. Phased, dialogue-based decision making. Collectives reach faster agreement when they reflect on what they hate rather than on what they like. Uses the bag of lemons / bag of stars method (Klein and Garcia 2014), illustrated below.
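As a rough illustration of the two schemes: the sketch below assumes each participant spends a fixed token budget, either endorsing favourites ("stars") or flagging ideas to reject ("lemons"). The tallying rule is the gist of the method as read from the slide; the idea names are invented.

```python
def tally(ballots, ideas):
    """Sum the tokens each idea received across all ballots."""
    totals = {idea: 0 for idea in ideas}
    for ballot in ballots:
        for idea, tokens in ballot.items():
            totals[idea] += tokens
    return totals

ideas = ["A", "B", "C"]

# Bag of stars: spend tokens on ideas you LIKE; keep the maximum.
stars = tally([{"A": 2, "B": 1}, {"A": 1, "C": 2}], ideas)
print(max(ideas, key=stars.get))    # 'A' (3 stars)

# Bag of lemons: spend tokens on ideas you want to REJECT; keep the
# minimum. Reflecting on what you dislike prunes the pool faster.
lemons = tally([{"B": 2, "C": 1}, {"B": 1, "C": 2}], ideas)
print(min(ideas, key=lemons.get))   # 'A' survives with zero lemons
```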
  30. Since its first launch in 2015, DebateHub has been used mostly in the social innovation sector (OuiShare, Wisdom Hacker, DS&NY, CSPC, UTOPIA, I4P) and by two urban community networks for democratic decision making (Ganemos Madrid and AutoConsulta Ciudadana) in Spain.
  31. Lessons Learned from Real-World Deployments
  32. Technical lessons learned • Our CI model works • Success in sharing data between different components and data models • Getting large-scale community testing off the ground is hard: we need to tackle integration with existing communities’ platforms • User interface work is of paramount importance: CI works best when it is transparent
  33. Methodological lessons learned • A co-creation approach to CI cannot emerge unless the community can recognize itself as such, involved in some form of common process: there is a need for an existing community • Participation is hard and follows a power law: how to ensure engagement, neutrality, etc.? • Different communities need different CI tools/enablers across the CI spectrum
  34. New Modes of Engagement with Televised Political Debate through Audience Feedback
  35. The past….
  36. The past…. The present…
  37. The past…. The present… The Future?
  38. Research Questions • Is this new “participation experience” really informative, and to what extent does it improve citizens’ confidence about the issues discussed? • Do social media voices truly capture the richness of citizens’ reactions to political debates? • What could we learn about the audience of televised election debates, and about the debate as a media event, if we had better analytical tools to scrutinize the audience’s understanding and reactions?
  39. Real-Time Audience Feedback: Objectives • Promoting active engagement by enabling the audience to react to the televised debates in a new unintrusive, yet expressive and timely manner • Harnessing and analysing viewers’ reactions to better understand the audience and their debate experience • Enabling self and collective reflection, sensemaking and learning through advanced analytics and visualisations • Providing new metrics to assess the debate as a media event, in terms of its capability to engage the audience emotionally, intellectually, critically and democratically.
  40. A New Method to Harness Audience Reactions: “Soft” Feedback • Instant • Nuanced meaning • Discourse-based: provided in the form of discourse elements • Voluntary and non-intrusive • Enabling analytics and visualisations. (A sketch of one such feedback datum follows.)
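A minimal sketch of what one “soft feedback” datum might look like: an instant, discourse-based reaction tied to a moment in the broadcast. The three categories mirror the flashcards on the next slides; the field names and card texts are illustrative assumptions, not the Democratic Reflection data model.

```python
import time

# Discourse elements a viewer can tap, instead of a bare like/dislike:
CARDS = {
    "emotion": "This makes me angry",
    "trust": "I trust what the speaker is saying",
    "information": "I'd like to know whether this claim is true",
}

def capture(viewer_id, category, debate_start):
    """Record a timestamped, discourse-based reaction for later
    analytics, visualisation and replay."""
    return {
        "viewer": viewer_id,
        "category": category,
        "card": CARDS[category],   # nuanced meaning, not a bare 'like'
        "offset_s": round(time.time() - debate_start, 1),
    }

# e.g. 83 seconds into the debate a viewer taps a trust card:
start = time.time() - 83
print(capture("viewer-42", "trust", start))
```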
  41. A paper prototype: the flashcard experiment • 18 flashcards in 3 categories: emotion, trust, information need • 15 participants watched the second Clegg-Farage debate live • Video annotations in Compendium (and YouTube!)
  42. Trust Cards. Designed to provide insights into the main motivations for the audience’s trust/distrust… with the goal of distinguishing between trust in the speaker, trust in the debate content, and pre-existing beliefs.
  43. Emotion Cards. Designed to provide insights into the audience’s emotional reactions to the debate; they can be used as a proxy to assess people’s engagement with the speakers and the debated topics.
  44. Question Cards. Designed to provide insights into the audience’s information needs… to inform the type of information analysis and visualizations to be implemented in the EDV replay platform, in order to make the audience’s viewing experience more informative.
  45. A paper prototype: the flashcard experiment
  46. Clegg’s vs. Farage’s reaction triggers
  47. Exploring one speaker’s perceived performance in detail
  48. Exploring one speaker’s perceived performance in detail
  49. Farage: “…actually sixty-two percent of the people that were surveyed in that British car manufacture interview they want serious reform within the European Union if they're gonna stay as members. So, far from the top line being true, two-thirds of them are saying unless we get reform then the time has come to leave the EU.” Farage: “You can't do that. You haven't got this power. You haven't got this control.” Clegg: “Yes, you do. Yes, you do.” Farage: “We do not have that power as members of the European Union and that's the truth of it.” Clegg: “Yes, you do. Yes.” Who to TRUST? Is this TRUE? Where can I find more info on this?
  50. Who are the outliers?
  51. Self-Reflection: how do I differ from, or comply with, the group? Me vs. the Group
  52. Self-Reflection: how do I differ from, or comply with, the group?
  53. From Paper Prototype to an Instant Audience Feedback Web App • For citizens/users at large • For analysts (political analysts, digital journalists) • For domain experts (politicians, media broadcasters). Check it out at: democraticreflection.org
  54. 2015 Election Debate • Panel of 400 people • Experiment in the wild
  55. 2017 Election Debate • Mobile application • First analytics interface • New feedback-intensity interaction • 2 panels of 20 people • Experiment in the wild
  56. Visual Analytics • Personal/self-reflection analytics • Collective analytics, to be viewed during the live event or replay, or post hoc, as both static and dynamic visualisations
  57. Advantages of the Real-Time Audience Feedback Method. The instant, nuanced feedback method we propose: • provides similarly powerful insights into the audience • while preserving the accountability of the results and addressing issues of scale • enables new mechanisms of civic learning and collective sensemaking
  58. Key Risks of Technological Enhancements • Powerful analytical tools are often used as persuasive tools, but the same tools can be used to improve civic engagement and learning • User profiling is increasingly used by big corporations to target people, but it can also be used by governments to provide better services and to design effective civic learning experiences • How do we design for this second class of applications and try to prevent misuse of the technology?
  59. Advanced Visual Interfaces to Improve Sensemaking of Political Debate
  60. Democratic Replay
  61. Lessons Learned from User Testing of Democratic Replay (comparison with BBC replay). Democratic Replay enables the main sensemaking capabilities: • “unexpected insights on the debaters and on what they said” • to “reflect on the debate in a deeper way” • significantly better “ways to evaluate facts and evidence” • “focusing on different aspects of the debate” • “reconstructing the arguments that the speakers made” • “assessing personal assumptions” and “changing some initial assumptions held before the debate”
  62. Lessons Learned from User Testing of Democratic Replay • If we want to support people’s capability to question assumptions and think critically, we need to design spaces for personal reflection and sensemaking. • Individual sensemaking processes need human–machine support. • New tools are needed to bridge political debate across community platforms: a visual analytics and data science approach.
  63. Future Research Challenges
  64. How to Enable Very Large-Scale Public Deliberation? A pervasive challenge for scaling up the adoption of CI platforms is: • enabling collective sensemaking across community platforms • defining the architecture of effective participation • moving from discussion-based ideation to collective decision making, closing the decision-making-to-action cycle
  65. Collective Intelligence Spectrum. Model of Collective Intelligence (CI): from sensing the environment, to interpreting it, to generating good options, to taking decisions and coordinating action... [Figure: Collective Sensing → Collective Sensemaking → Collective Ideation → Collective Decision → Collective Action]
  66. Interfaces for Sensemaking which Build on Minimal Meaningful Participation: Real-Time Analytics, Argument Mining, Fact Checking and Human-Machine Annotation. A pervasive challenge for building CI platforms is balancing a critical tension between: • the need to structure and curate contributions from many people, in order to maximise the signal-to-noise ratio and provide more advanced CI services • versus permitting people to make contributions with very little useful indexing or structure (a toy illustration follows).
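One way to picture this tension: accept free-form posts (minimal friction for contributors), but try to recover IBIS-style structure automatically so CI services still get signal, leaving a human to re-index what the machine gets wrong. The keyword rules below are a deliberately crude stand-in for a real argument-mining model; everything here is an illustrative assumption.

```python
# Toy auto-indexer: guess an IBIS node type from surface cues, so
# unstructured contributions still feed structured CI services.
CUES = {
    "issue": ("how ", "why ", "should ", "?"),
    "argument": ("because", "however", "evidence", "but "),
}

def suggest_type(post: str) -> str:
    text = post.lower()
    for node_type, cues in CUES.items():
        if any(cue in text for cue in cues):
            return node_type
    return "idea"  # default; a human annotator can re-index later

assert suggest_type("Should we pedestrianise the high street?") == "issue"
assert suggest_type("Because footfall rose 20% in trials") == "argument"
```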
  67. Interfaces for Explicability and Conversational Intelligence, to improve the trust and accountability of machine predictions. CI works best when it is transparent: • participants want to understand how their contributions are integrated, and must be given access to visible expressions of analytics processes • on the other hand, the complexity of the underlying process can also scare participants away, and much raw analytics data is hard to interpret without training
  68. How to transform socio-technical innovation into a collective learning process having a public/collective value?
  69. Care for: • awareness, transparency and explicability • users’ engagement, interaction, empowerment
  70. New Technologies to Support Dialogue between Citizens and Organisations, and with Institutions • Dialogue-based action • Political dialogue • Evidence-based dialogue • Local dialogue & geo-deliberation • Object-oriented dialogue • Human-machine dialogue
  71. Collective Intelligence for the Common Good community: ci4cg.org. Several international workshops and 2 special issues.
  72. Thank you for listening! Please feel free to contact me at anna.deliddo@open.ac.uk. To learn more about our work, please visit the research group website at: idea.kmi.open.ac.uk
