"How much change do you get from 40$? Analyzing and Addressing Failed Questions on Social Q&A

Paper presented at the 75th Annual Meeting of the American Society for Information Science and Technology, October 26-30, 2012, Baltimore, Maryland.

Slide notes

  • Content analysis of 200 questions from Yahoo! Answers was conducted by a team of two coders in order to develop a typology for investigating why these informational questions failed to get answers. No previous work identified in the literature review had attempted to characterize failed informational questions within an SQA context. The analysis therefore followed the constant comparative method associated with grounded theory to identify the characteristics of failed questions, based on deductions informed by related literature, prior expertise, and simple human judgment. The two coders first coded 20% of the questions independently to see how each characterized the reasons for failure; the initial intercoder reliability (ICR) score was below 80%. The major disagreement was the split between the “ambiguity” category and the “too complex, overly broad” category, so the coders revisited those questions to establish how to differentiate between the two categories and to reach more consistent agreement on the coding scheme. The final ICR score was 90.50%, and the coders also agreed that some questions carry minor attributes of failure. (A minimal sketch of how such an agreement score can be computed appears after these notes.)
     
  • There are four major attributes explaining why some information-seeking questions fail; each major attribute also has more specific sub-categories of failure.
     
  • Data results
  • The third significant attribute of failed questions is that multiple related questions are posed within the body of a single question (n=26, 13%), which may cause confusion regarding the asker’s intended information-seeking goal.
     
  • Multiple related questions represent the asker’s desire to clarify what he/she is looking for in order to satisfy his/her information need. However, asking more than one question simultaneously appears to discourage responses, since respondents must address each question and attempt to translate all of the questions into a single information need. Therefore, even if all of the questions are related and intended to provide enough information to explicate the asker’s information need, multiple questions may conversely impair that understanding.
     
  • The last significant attribute of failed questions is ambiguity. Questions that are too vague or too broad may cause misunderstanding regarding their meaning and/or foster multiple interpretations. These questions reveal that the lack of a coherent and/or clear manifestation of the asker’s information need discourages responses, as people’s murky understanding of what the asker is looking for impairs interpretation of the question.
     
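The slides report an intercoder reliability (ICR) score of 90.50% but do not publish code or name the agreement measure used. The following is a minimal sketch, assuming one failure category per question per coder, of how simple percent agreement and Cohen's kappa could be computed; the example labels are illustrative, not data from the study.

```python
# Minimal sketch (not the authors' code): intercoder agreement for two coders
# who each assign one failure category per question. Labels below are
# illustrative examples drawn from the typology's sub-categories.
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of questions that both coders placed in the same category."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for the agreement expected by chance."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(coder_a) | set(coder_b))
    return (observed - expected) / (1 - expected)

# Hypothetical codes for a handful of questions from the first 20% coding pass.
a = ["ambiguity", "too complex/overly broad", "lack of information",
     "related questions", "prank"]
b = ["too complex/overly broad", "too complex/overly broad", "lack of information",
     "related questions", "prank"]
print(percent_agreement(a, b))  # 0.8
print(cohens_kappa(a, b))       # ≈ 0.75
```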

Transcript

  • 1. The world’s libraries. Connected. “How much change do you get from 40$?”
  • 2. The world’s libraries. Connected. “What is Warm Mix Asphalt?”
  • 3. The world’s libraries. Connected. “What color are your underwear?”
  • 4. The world’s libraries. Connected. Analyzing and Addressing Failed Questions on Social Q&A Chirag Shah, Marie Radford, Lynn Connaway, Erik Choi, & Vanessa Kitzie
  • 5. The world’s libraries. Connected. The Background BACKGROUND 1. Online question-answering (Q&A) services are becoming increasingly popular among information seekers (Yahoo! Answers, WikiAnswers, Google Answers, Quora, etc.) HOWEVER.. 2. There is no guarantee that a question will be answered. 3. The large volume of content on some SQA sites renders participants unable to answer all of the questions 4. Some questions may be suitable for Q&A site A while others may be more suitable for Q&A site B
  • 6. The world’s libraries. Connected. The Goal GOAL 1. Analyzing why some questions on social Q&A sites fail 2. Developing a typology of failures for questions 3. Since Q&A services encompass social Q&A (SQA) and virtual reference services (VRS), the study analyzes failed questions from SQA and proposes how such questions could be restructured or redirected through VRS
  • 7. The world’s libraries. Connected. Information seeking questions The study focuses only on "failed information-seeking questions" 1. Advice- and opinion-seeking questions (e.g., Is happiness a choice?) are hard to answer 2. Previous research argues that SQA generates more conversational questions, while VRS generates more informational questions. 3. A focus on bridging Q&A services (SQA + VRS) - Suggesting features of VRS for failed questions on SQA
  • 8. The world’s libraries. Connected. Data Collection Using the Yahoo! Search API (Application Programming Interface) to collect unresolved questions from November 2011 to March 2012 - 13,867 such questions were collected - Of these, 4,638 (about 33%) questions received zero answers and were thus considered to have “failed” - 200 (about 5%) failed information-seeking questions were identified for analysis (a minimal filtering sketch appears after the transcript)
  • 9. The world’s libraries. Connected. Data Analysis - Using grounded theory (Glaser & Strauss, 1967) - Two coders analyzed the data and constructed a typology of failed information-seeking questions (ICR - 90.50%) - The coders agreed that some questions have minor attributes of failure
  • 10. The world’s libraries. Connected. Data Analysis Typology of failed information seeking questions 1. Unclear - Ambiguity - Lack of information - Poor syntax 2. Complex - Too complex and/or overly broad - Excessive information 3. Inappropriate - Socially Awkward - Prank - Sloths 4. Multi-questions - Related questions - Un-related questions (the typology is restated as a data structure after the transcript)
  • 11. The world’s libraries. Connected. Data Analysis RESULTS
  • 12. The world’s libraries. Connected. Finding The first significant proportion of the failed questions: too complex and/or overly broad (n=68) “What were the effects of slavery as an institution in Frederick Douglas Narrative of the life of frederick dou?” - A lack of perceived effort on the asker’s part to craft a coherent question may cause difficulties in its subsequent interpretation - Questions in this category involve topics too complex and/or specialized for most people to address - Sometimes they are too narrowly focused on specific places or people
  • 13. The world’s libraries. Connected. Finding The second most significant attribute of failure: lack of information (n=28) “How much would transmission swap cost?” - Inadequate information increases the chance of potential respondents misinterpreting the asker’s intent. - Questions lacking information often discourage responses as they can be perceived by potential respondents as being too complicated to address.
  • 14. The world’s libraries. Connected. Finding The third significant attribute of failed questions: multiple related questions are assigned in one body of a question (n=26, 13%) Title: “What is Warm Mix Asphalt? Q1 Content: “I recently ….. could be placed and made at a lower temperature. How long has industry been using this product successfully? Q2 Does it last as long as new pavement that is placed at higher temperature and contains less receycled material?Q3
  • 15. The world’s libraries. Connected. Finding The third significant attribute of failed questions: multiple related questions are assigned in one body of a question (n=26, 13%) - Asking more than one question simultaneously discourages people from responding, since they must address each question and attempt to translate all of the questions into a single information need - Even if all of the questions are somehow related and intended to provide enough information to explicate the asker’s information need, multiple questions may conversely impair that understanding.
  • 16. The world’s libraries. Connected. Finding The last significant proportion of the failed questions: ambiguity (n=21) “How much change do you get from 40$?” - Questions that are too vague or too broad may cause misunderstanding regarding their meaning and/or foster multiple interpretations - Lack of a coherent and/or clear manifestation of the asker’s information need discourages responses, as people’s murky understanding of what the asker is looking for impairs interpretation of the question
  • 17. The world’s libraries. Connected. Conclusion 1. Identifying why questions fail could be the first step toward helping information seekers revise their questions. 2. Since there is little in the literature addressing failed questions in VRS or SQA, this typology could be used to better understand why some questions fail. 3. Testing the typology presented here could help experts (librarians) better assist end-users by identifying when and how it is appropriate to clarify questions
  • 18. The world’s libraries. Connected. Possible applications 1. Employing VRS techniques - Librarians in face-to-face and virtual environments rely on a process of clarifying or negotiating the reference question (Ross, Nilsen, & Radford, 2009) in order to translate the user’s initial statement of an information need into a strong research query that returns relevant results. - This process of 'negotiation' is largely absent in SQA, so these findings suggest that "modification" would help to compensate for this absence or, minimally, provide feedback to allow the user to construct (or reconstruct) a better question.
  • 19. The world’s libraries. Connected. Possible applications 2. Incorporating relevance feedback within the SQA platform. - Relevance feedback represents a self-directed “question negotiation” - Helping the asker identify the elements pertinent to addressing his/her information need - Providing the user with feedback on how to reformulate his/her question, using the coding scheme developed in the study (a rough heuristic sketch appears after the transcript)
  • 20. The world’s libraries. Connected. Funding & Acknowledgements Cyber Synergy: Seeking Sustainability through Collaboration between Virtual Reference and Social Q&A Sites $250,000 for 2011-2013 Funded by IMLS, OCLC, & Rutgers University Co-PIs Marie Radford (RU), Lynn Silipigni Connaway (OCLC), & Chirag Shah (RU) http://www.oclc.org/research/activities/synergy.html
  • 21. The world’s libraries. Connected. Questions?
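Slide 8 reduces 13,867 collected unresolved questions to 4,638 zero-answer ("failed") questions and then to a 200-question sample. Below is a minimal sketch of that filtering and sampling step, assuming the collected questions have already been saved locally as JSON; the file name and the answer_count field are hypothetical, and the Yahoo! Search API call itself as well as the screening for information-seeking (vs. conversational) questions are omitted.

```python
# Minimal sketch, not the authors' pipeline: keep only zero-answer ("failed")
# questions and draw a fixed-size sample for content analysis.
# "unresolved_questions.json" and the "answer_count" field are assumptions.
import json
import random

def select_failed_sample(path, sample_size=200, seed=2012):
    with open(path, encoding="utf-8") as f:
        questions = json.load(f)  # e.g. the 13,867 collected unresolved questions

    failed = [q for q in questions if q.get("answer_count", 0) == 0]
    print(f"{len(failed)} of {len(questions)} questions received zero answers")

    random.seed(seed)  # make the sample reproducible
    return random.sample(failed, min(sample_size, len(failed)))

# sample = select_failed_sample("unresolved_questions.json")
```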
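For reference, the typology on slide 10 restated as a small data structure, with the counts reported on slides 12-16 for the four most frequent sub-categories. Only the labels and counts come from the slides; the identifier names are mine.

```python
# Typology of failed information-seeking questions (slide 10): major attributes
# mapped to their sub-categories, plus the sub-category counts reported in the
# findings (slides 12-16).
FAILURE_TYPOLOGY = {
    "Unclear": ["Ambiguity", "Lack of information", "Poor syntax"],
    "Complex": ["Too complex and/or overly broad", "Excessive information"],
    "Inappropriate": ["Socially awkward", "Prank", "Sloths"],
    "Multi-questions": ["Related questions", "Un-related questions"],
}

REPORTED_COUNTS = {
    "Too complex and/or overly broad": 68,
    "Lack of information": 28,
    "Related questions": 26,  # multiple related questions in one post (13%)
    "Ambiguity": 21,
}
```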
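Slides 18-19 propose giving askers relevance-feedback-style hints, based on the coding scheme, on how to reformulate a question before it fails. The sketch below is a rough illustration of that idea only: the heuristics and thresholds are my assumptions, not the authors' scheme, and they cover just three of the typology's sub-categories.

```python
# Rough illustrative heuristics (not the authors' method) that map a draft
# question onto a few typology categories and return reformulation hints.
def question_feedback(title: str, body: str = "") -> list[str]:
    text = f"{title} {body}".strip()
    words = text.split()
    hints = []

    # Multi-questions: several question marks suggest bundled questions.
    if text.count("?") > 1:
        hints.append("Multi-questions: consider splitting this into separate posts.")

    # Lack of information: very short questions rarely carry enough context.
    if len(words) < 6:
        hints.append("Lack of information: add context about what you already know.")

    # Too complex and/or overly broad: very long questions are hard to address.
    if len(words) > 120:
        hints.append("Too complex/overly broad: narrow this to one specific point.")

    return hints or ["No obvious issues detected; a human question negotiation "
                     "step (as in VRS) could still help."]

# Example: the warm-mix asphalt question from slide 14 bundles three questions.
print(question_feedback(
    "What is Warm Mix Asphalt?",
    "How long has industry been using this product successfully? "
    "Does it last as long as pavement placed at a higher temperature?",
))
```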