New Ways of Deliberating Online:
An Empirical Comparison of Network and Threaded Interfaces for Online Discussion
Anna De Liddo, Simon Buckingham Shum
Knowledge Media Institute, The Open University, Walton Hall
MK76AA, Milton Keynes, United Kingdom
{anna.deliddo, simon.buckingham.shum}@open.ac.uk
Abstract:
One of the Web’s most phenomenal impacts has been its capacity to connect and harness the ideas of many people seeking to tackle a problem. Social media appear to have played specific and significant roles in helping communities form and mobilize, even to the level of political uprisings. Nevertheless, the online dialogue spaces we see on the Web today are often re-purposed social networks that offer no insight into the logical structure of the ideas, such as the coherence or evidential basis of an argument. This hampers both the quality of citizen participation and the effective assessment of public debate. We report on an exploratory study in which we observed users’ interaction with a new tool for online deliberation and compared network and threaded visualizations of arguments. The results of the study suggest that network visualization of arguments can effectively improve online debate by facilitating higher-level inferences and making the debate more engaging and fun.
Keywords: Argumentation, Computer Supported Argument Visualisation (CSAV), Online Deliberation, Collective Intelligence
1. 6th International Conference on e-Participation,
ePart2014
Sept 1-3, 2014
Trinity College, Dublin, Ireland
New Ways of Deliberating Online:
An Empirical Comparison of Network and
Threaded Interfaces for Online Discussion
Anna De Liddo & Simon Buckingham Shum
Knowledge Media Institute
The Open University, UK
3. Setting the Problem
• Poor debate: no tools to identify where ideas contrast,
where people disagree and why… popularity
vs critical thinking
4. Setting the Problem
Today’s platforms offer fundamentally chronological views which provide:
• no insight into the logical structure of the ideas, such as the
coherence or evidential basis of an argument
• no support for idea refinement and improvement
LINK to PETITION:
http://www.change.org/en-GB/petitions/stand-against-russia-s-brutal-crackdown-on-gay-rights-urge-winter-olympics-2014-sponsors-to-condemn-anti-gay-laws
5. Setting the Problem
• No ways to assess the quality of any given idea
LINK to QUORA:
http://www.quora.com/Physics/Do-wormholes-always-have-black-holes-at-the-beginning#answers
6. Pain Point Prioritization
• Very High: Poor Commitment to Action; Poor Summarization; Poor Visualization
• High: Lack of Participation; Poor Idea Evaluation; Shallow Contribution
• Moderate: Cognitive Clutter; Lack of Innovation
• Minor: Platform Islands and Balkanization; Non-representative Decisions
7. Setting the Problem
This hampers:
• the quality of users’ participation,
• the quality of proposed ideas, and
• the effective assessment of the state of the debate.
8. A new class of Collective Intelligence and
Online Deliberation Platforms
These platforms make the structure and status of a dialogue or debate visible.
Coming from research on Argumentation and CSAV, these tools
make users’ lines of reasoning and (dis)agreements visually
explicit.
• Deliberatorium
• Debategraph
• Cohere
• CoPe_it!
• Problem&Proposals
• YourView
• The Evidence Hub
13. Motivation
• No systematic evaluation of the merits or otherwise of
network vs. threaded interfaces.
• The larger the number of participants in the discussion,
and the higher the complexity of the discourse ontology,
the clumsier and less usable graph visualizations
become.
• Resistance to the use of graph visualization of arguments
in certain application domains.
• It therefore remains unclear what advantages and
affordances different graphical representations of
arguments offer for supporting online discussion.
14. Research Question
• The focus of the study is on reading and searching
online deliberation platforms.
RQ:
• Does an interactive, self-organizing network
visualization of arguments provide advantages over a
more conventional threaded interface for reading and
search?
15. The Study
• An exploratory user study to observe users’ performance on
three information-seeking tasks, and
• to compare their performance using two different user
interfaces for argument visualization (threaded vs. network
visualization of arguments).
• A grounded theory analysis of the study videos to
show how different graphical interfaces for the
representation of arguments affect the way in which users
read and understand the online discourse.
16. Participants
• 10 subjects (two groups of 5)
• members of different Open University departments
• with widely mixed IT expertise
• randomly allocated to the two groups; the IT expert/non-expert
ratio in each group was approximately the same
• median age 40 (range 32 to 48)
• native or near-native English speakers
17. The Online Environment: The Evidence Hub
http://evidence-hub.net/
• The Evidence Hub is a collective intelligence and online
deliberation tool that supports argumentative knowledge
construction by crowdsourcing contributions of:
issues, potential solutions, research claims, and the related
evidence for or against them.
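The data model the slide describes (issues, solutions, claims, and evidence joined by typed relations) can be sketched as a small typed graph. This is a hypothetical illustration, not the actual Evidence Hub schema; all class and relation names are assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an Evidence Hub-style data model (not the real schema):
# typed nodes (issue, solution, claim, evidence) joined by typed, directed links.

@dataclass(frozen=True)
class Node:
    id: str
    kind: str   # "issue" | "solution" | "claim" | "evidence"
    text: str

@dataclass(frozen=True)
class Link:
    source: str      # node id
    target: str      # node id
    relation: str    # e.g. "addresses", "supports", "challenges" (illustrative)

@dataclass
class Hub:
    nodes: dict = field(default_factory=dict)
    links: list = field(default_factory=list)

    def add(self, node: Node) -> None:
        self.nodes[node.id] = node

    def connect(self, source: str, relation: str, target: str) -> None:
        self.links.append(Link(source, target, relation))

hub = Hub()
hub.add(Node("i1", "issue", "How can online debate surface disagreement?"))
hub.add(Node("s1", "solution", "Visualize arguments as a typed network"))
hub.add(Node("e1", "evidence", "User study: network view aided inference"))
hub.connect("s1", "addresses", "i1")
hub.connect("e1", "supports", "s1")

print(len(hub.nodes), len(hub.links))  # 3 2
```

Typed links are what distinguish this from a plain thread: each contribution declares *how* it relates to existing content, not just *after what* it was posted.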
18. Study Conditions
• Two different versions of the Evidence Hub,
• with different user interfaces for argument visualization,
• but the same database, to ensure that participants in the two
groups received exactly the same quantity and type
of information.
21. Tasks
• Identifying solutions to an issue (Task 1)
• Identifying synergies between solutions (Task 2)
• Identifying contrasts in the wider debate (Task 3)
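Each of the three tasks amounts to a simple query over a typed link set, which is what a network view exposes directly. A minimal sketch, with illustrative node IDs and relation names (not the Evidence Hub API):

```python
# Hypothetical typed link set: (source, relation, target)
links = [
    ("s1", "addresses", "i1"),
    ("s2", "addresses", "i1"),
    ("e1", "supports", "s1"),
    ("e1", "supports", "s2"),   # shared evidence hints at a synergy
    ("c1", "challenges", "s2"), # an explicit contrast in the debate
]

def solutions_for(issue):
    """Task 1: solutions directly linked to an issue."""
    return {s for s, rel, t in links if rel == "addresses" and t == issue}

def synergies(a, b):
    """Task 2: nodes that support both solutions."""
    backers = lambda sol: {s for s, rel, t in links if rel == "supports" and t == sol}
    return backers(a) & backers(b)

def contrasts():
    """Task 3: all 'challenges' links in the wider debate."""
    return [(s, t) for s, rel, t in links if rel == "challenges"]

print(solutions_for("i1"))    # {'s1', 's2'}
print(synergies("s1", "s2"))  # {'e1'}
print(contrasts())            # [('c1', 's2')]
```

In a threaded interface these answers must be assembled by reading posts sequentially; in the network interface they correspond to visible link patterns.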
22. Emerging Metrics
Metrics used to compare users’ performance
across the three tasks:
• Task Accomplishment
• Data Model Interpretation
• Emotional Reactions
23. Task Accomplishment
The main reason for task failure was misunderstanding the task.
“Accomplished incorrectly” occurred mostly in the linear interface group:
• the linear structure of the interface does not help users interconnect
and compare content;
• users focused more on the content than on the
argumentation process,
• which led to digressions and incorrect responses.
“Accomplished easily” occurred mostly in the network visualization
interface group. This is mainly due to the
• visual hints provided by the network representation:
• link labels and colors were particularly predominant in determining
success;
• less attention was paid to the iconography of the nodes;
• connection density was very effective in providing the answer to the task.
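The two visual hints the study found most effective, link-label colouring and connection density, are cheap to compute from the link set. A minimal sketch; the colour palette and relation names are assumptions, not the tool's actual styling:

```python
# Hypothetical mapping of link label -> display colour (illustrative palette)
LINK_COLOURS = {
    "supports": "green",
    "challenges": "red",
    "addresses": "blue",
}

links = [  # (source, relation, target); illustrative data
    ("e1", "supports", "s1"),
    ("e2", "supports", "s1"),
    ("c1", "challenges", "s1"),
    ("s1", "addresses", "i1"),
]

def colour(relation):
    """Colour a link by its label; unknown labels fall back to grey."""
    return LINK_COLOURS.get(relation, "grey")

def degree(node):
    """Connection-density proxy: how many links touch the node."""
    return sum(1 for s, _, t in links if node in (s, t))

# s1 touches all four links, so it visually stands out in the network view:
print(degree("s1"), colour("challenges"))  # 4 red
```

A reader scanning the graph can answer "which solution is most contested?" from node size and red-link count alone, without opening any posts.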
24. Data Model Interpretation
• DMI was better supported by network visualization of
arguments.
• Category misinterpretation and uncertainty in data model
interpretation tended to occur more frequently in Group 1 than in
Group 2.
• Network visualization of arguments supports a complete and
correct understanding of both the categories and the data model.
• Learning-by-example mechanism: as users explore a new argument
map, they reinforce their understanding of the data model.
25. Emotional Reactions
• A general sense of surprise and positivity toward the network
visualization was recorded, which easily sparked into liking and even
excitement. This also increased users’ confidence with the tool (“I feel
confident, I’m pretty sure this is the answer” - P7).
• The main object of surprise was the self-arranging graph (“it is like a
jelly, it is so fantastic!” - P7; “it is all shifting! It is interesting… I quite
like that!” - P8).
• Emotional reactions to the linear interface involved general skepticism,
which could also decay into confusion and feeling lost (“I think I am
lost” - P1; “I am now burning… because I haven’t worked out how to do it”;
“that is all the page I am looking at… ohhh ok, I give up!” - P5).
• The main reasons were the need for “too many clicks” to seek information and
frequent “changes of context”, which often provoked disorientation.
26. Conclusions
• Linear interfaces perform worse, especially when the
information is nested within more articulated
argumentation chains (Task 2).
• Network-like representation and visual hints such as
network structure, iconography, and link labels and
colors facilitate the identification of argumentation
chains, thus supporting indirect connections and higher-level
inferences about how the content connects.
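The "higher-level inference" the conclusions describe, seeing that two contributions are indirectly connected through a chain of typed links, is a path search over the argument graph. A minimal sketch with illustrative data (the traversal direction and node IDs are assumptions):

```python
from collections import deque

# Illustrative argument graph as an adjacency list:
# e1 supports s1, s1 addresses i1, i1 is also addressed by s2
edges = {
    "e1": ["s1"],
    "s1": ["i1"],
    "i1": ["s2"],
}

def chain(start, goal):
    """Breadth-first search for a shortest chain of links from start to goal."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in edges.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain exists

print(chain("e1", "s2"))  # ['e1', 's1', 'i1', 's2']
```

In the threaded interface a reader would have to hold this three-hop chain in memory across separate pages; in the network view the whole path is visible at once.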
27. Conclusions
• Data model interpretation is also improved by
argument visualization. Notably, exposing the data model
in the form of argument maps appears to enable a learning-by-example
mechanism, whereby users reinforce their
understanding of the data as they navigate the user
interface.
• The effectiveness of network visualization of arguments, and
the positive impact it has on argument reading and
comprehension, increases as information complexity
increases: the bigger the discussion, the better network
visualizations perform compared to linear-threaded
visualizations.
28. Conclusions
• There is an element of fun and excitement associated with
dynamic network visualization of arguments.
• Network visualization of arguments augments online discussion
by providing a layer of structure that helps improve human
comprehension of the argumentation structure behind the
online discourse.
• These findings open new avenues for combining social media
discourse with advanced network visualizations of discourse
elements (issues, solutions and arguments) to improve
users’ engagement, sensemaking and appreciation of large-scale
online deliberation processes.
29. Catalyst’s Online Deliberation Tools
• Collaborative knowledge production
• Collaborative web annotation and knowledge mapping
• Structured online discussion and argumentation
• Social network analysis and visualization
• Advanced analytics for attention mediation and deliberation diagnostics
31. LiteMap
• Internationalization to English and German
1. Get the LiteMap bookmarklet
2. Harvest, annotate and classify contributions from the Utopia discussion forum
3. Connect and map out the key issues and arguments visually with LiteMap
35. Follow us on Twitter: @CATALYST_FP7
Watch the demos on YouTube: CATALYST FP7
Get involved!
36. Tools: evidence-hub.net, debatehub.net, litemap.open.ac.uk
Dr. Anna De Liddo
Research Fellow in Collective Intelligence Infrastructures
KMi, Open University, UK
Thanks for your time!
Email: anna.deliddo@open.ac.uk
Twitter: Anna_De_Liddo
HomePage: http://kmi.open.ac.uk/people/member/anna-de-liddo
Evidence Hub Website: http://evidence-hub.net/
Projects:
Catalyst: http://catalyst-fp7.eu/
EDV – Election Debate Visualizations: http://edv-project.net/