Analyzing and Addressing Failed Questions on Social Q&A



  1. “How much change do you get from 40$?” The world’s libraries. Connected.
  2. “What is Warm Mix Asphalt?”
  3. “What color are your underwear?”
  4. Analyzing and Addressing Failed Questions on Social Q&A. Chirag Shah, Marie Radford, Lynn Connaway, Erik Choi, & Vanessa Kitzie.
  5. The Background
     1. Online question-answering (Q&A) services are becoming increasingly popular among information seekers (Yahoo! Answers, WikiAnswers, Google Answers, Quora, etc.).
     HOWEVER:
     2. There is no guarantee that a question will be answered.
     3. The large volume of content on some SQA sites leaves participants unable to answer every question.
     4. Some questions may be suitable for Q&A site A, while others may be more suitable for Q&A site B.
  6. The Goal
     1. Analyze why some questions on social Q&A sites fail.
     2. Develop a typology of failures for questions.
     3. Since Q&A services encompass social Q&A (SQA) and virtual reference services (VRS), the study analyzes failed questions from SQA and proposes how such questions could be restructured or redirected via VRS.
  7. Information seeking questions
     The study focuses only on "failed information seeking questions":
     1. Advice and opinion seeking questions (e.g., "Is happiness a choice?") are hard to answer.
     2. Previous research argues that SQA generates more conversational questions, while VRS generates more informational questions.
     3. A focus on bridging Q&A services (SQA + VRS): suggesting VRS features for failed questions on SQA.
  8. Data Collection
     Used the Yahoo! Search API (Application Programming Interface) to collect unresolved questions from November 2011 to March 2012.
     - 13,867 such questions were collected.
     - Of these, 4,638 (about 33%) received zero answers and were thus considered to have "failed".
     - 200 (about 5%) failed information seeking questions were identified for analysis.
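The filtering step on this slide can be sketched in a few lines of Python. This is illustrative only: the actual Yahoo! Search API calls are not shown (and `num_answers` is an assumed field name, not the API's real schema); the toy data is merely shaped like the study's counts (13,867 collected, 4,638 with zero answers).

```python
# Illustrative sketch of the data-collection filtering step.
# The field name "num_answers" is a hypothetical stand-in for whatever
# the Yahoo! Search API actually returned per question.

def find_failed_questions(questions):
    """Keep only questions that received zero answers ("failed")."""
    return [q for q in questions if q.get("num_answers", 0) == 0]

# Toy data shaped like the study's counts: most questions get answers,
# roughly a third get none.
questions = (
    [{"id": i, "num_answers": 1} for i in range(9229)]           # answered
    + [{"id": 9229 + i, "num_answers": 0} for i in range(4638)]  # failed
)

failed = find_failed_questions(questions)
print(len(questions))                              # 13867 collected
print(len(failed))                                 # 4638 failed
print(round(len(failed) / len(questions) * 100))   # 33 (percent)
```

The 200-question analysis sample would then be drawn from `failed`.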
  9. Data Analysis
     - Used a grounded theory approach (Glaser & Strauss, 1967).
     - Two coders analyzed the data and constructed a typology of failed information seeking questions (ICR = 90.50%).
     - Coders agreed that some questions have minor attributes contributing to failure.
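For readers unfamiliar with the ICR figure, a minimal sketch of inter-coder reliability as simple percent agreement follows. Note this is an assumption: the slide reports ICR = 90.50% but does not say whether simple agreement or a chance-corrected statistic (e.g., Cohen's kappa) was used; the coder data below is invented for illustration.

```python
# Inter-coder reliability as simple percent agreement (one common ICR
# measure; the study's exact statistic is not specified on the slide).

def percent_agreement(codes_a, codes_b):
    """Share of items that both coders labeled identically, as a percentage."""
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a) * 100

# Toy example: two coders label 20 questions and disagree on one item.
coder1 = ["unclear"] * 10 + ["complex"] * 10
coder2 = ["unclear"] * 9 + ["complex"] * 11
print(percent_agreement(coder1, coder2))  # 95.0
```

Applied to the study's 200 sampled questions, the same comparison over the two coders' category labels would yield the reported 90.50%.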
  10. Data Analysis
      Typology of failed information seeking questions:
      1. Unclear
         - Ambiguity
         - Lack of information
         - Poor syntax
      2. Complex
         - Too complex and/or overly broad
         - Excessive information
      3. Inappropriate
         - Socially awkward
         - Prank
         - Sloths
      4. Multi-questions
         - Related questions
         - Un-related questions
  11. Data Analysis: Results
  12. Finding
      The largest proportion of the failed questions: too complex and/or overly broad (n=68).
      Example: “What were the effects of slavery as an institution in Frederick Douglas Narrative of the life of frederick dou?”
      - A lack of perceived effort on the asker’s part to craft a coherent question may cause difficulties in its subsequent interpretation.
      - Questions in this category involve topics too complex and/or specialized for most people to address.
      - Sometimes they are too narrowly focused on a specific place or person.
  13. Finding
      The second most significant attribute of failure: lack of information (n=28).
      Example: “How much would transmission swap cost?”
      - Inadequate information increases the chance that potential respondents misinterpret the asker’s intent.
      - Questions lacking information often discourage responses, as potential respondents may perceive them as too complicated to address.
  14. Finding
      The third most significant attribute of failed questions: multiple related questions in the body of a single question (n=26, 13%).
      Example:
      Title: “What is Warm Mix Asphalt?” (Q1)
      Content: “I recently ….. could be placed and made at a lower temperature. How long has industry been using this product successfully? (Q2) Does it last as long as new pavement that is placed at higher temperature and contains less receycled material? (Q3)”
  15. Finding
      The third most significant attribute of failed questions: multiple related questions in the body of a single question (n=26, 13%).
      - Asking more than one question at once discourages people from responding, since they must address each question and attempt to translate all of them into a single information need.
      - Even if all of the questions are related and intended to provide enough information to explicate the asker’s information need, multiple questions may instead impair understanding.
  16. Finding
      The last significant proportion of the failed questions: ambiguity (n=21).
      Example: “How much change do you get from 40$?”
      - Questions that are too vague or too broad may cause misunderstanding of their meaning and/or invite multiple interpretations.
      - The lack of a coherent and/or clear statement of the asker’s information need discourages responses, since people have only a murky understanding of what the asker is looking for.
  17. Conclusion
      1. Identifying why questions fail could be the first step toward helping information seekers revise their questions.
      2. Since there is little in the literature addressing failed questions in VRS or SQA, this typology could be used to better understand why some questions fail.
      3. Testing the typology presented here could help experts (librarians) better assist end users by identifying when and how it is appropriate to clarify questions.
  18. Possible applications
      1. Employing VRS techniques
      - Librarians in face-to-face and virtual environments rely on a process of clarifying or negotiating the reference question (Ross, Nilsen, & Radford, 2009) in order to translate the user’s initial statement of an information need into a strong research query that returns relevant results.
      - This process of negotiation is largely absent in SQA, so these findings suggest that question “modification” would help compensate for this absence or, minimally, provide feedback allowing the user to construct (or reconstruct) a better question.
  19. Possible applications
      2. Incorporating relevance feedback within the SQA platform
      - Relevance feedback represents a self-directed “question negotiation”.
      - It helps the asker identify elements pertinent to addressing his/her information need.
      - It provides the user with feedback on how to reformulate his/her question (using the coding scheme developed in the study).
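One way such platform feedback might look, sketched under clear assumptions: the category keys follow the typology slide, but the hint wording and the function names are invented for illustration and do not come from the study.

```python
# Hypothetical sketch: mapping failure categories from the study's typology
# to reformulation hints an SQA platform could surface as relevance feedback.

FAILURE_HINTS = {
    "ambiguity": "State exactly what you want to know; avoid vague wording.",
    "lack_of_information": "Add context: what, where, and any constraints.",
    "too_complex_or_broad": "Narrow the scope to one answerable aspect.",
    "multiple_questions": "Split this into separate, focused questions.",
}

def feedback_for(categories):
    """Return reformulation hints for the failure categories a question hit."""
    return [FAILURE_HINTS[c] for c in categories if c in FAILURE_HINTS]

# A question coded as lacking information AND bundling several questions:
hints = feedback_for(["lack_of_information", "multiple_questions"])
for h in hints:
    print("-", h)
```

In a deployed system, the categories would come from coding the question (manually or by a trained classifier), and the hints would be shown to the asker before or after posting.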
  20. Funding & Acknowledgements
      Cyber Synergy: Seeking Sustainability through Collaboration between Virtual Reference and Social Q&A Sites
      $250,000 for 2011-2013
      Funded by IMLS, OCLC, & Rutgers University
      Co-PIs: Marie Radford (RU), Lynn Silipigni Connaway (OCLC), & Chirag Shah (RU)
  21. Questions?