UNU MERIT Wikipedia Survey

Overview of the Wikipedia Survey data analysis. The survey is a collaboration between UNU MERIT and the Wikimedia Foundation.

Transcript

  • 1. Wikipedia Survey • Wikimania, Buenos Aires, 26 August 2009 • Collaborative Creativity Group, United Nations University MERIT • Rishab Ghosh, Ruediger Glott, Philipp Schmidt • http://ccg.merit.unu.edu • schmidt@merit.unu.edu
  • 2. BACKGROUND • Wikimedia Foundation & United Nations University MERIT • First official Wikipedia survey (for readers and contributors) • Questionnaire developed with community input and building on existing research • Translated into 22 languages by Wikipedia community
  • 3. BACKGROUND • Online survey hosted at MERIT. Code reviewed by WP technical community • Survey went live November 2008 • Link to survey was posted in page headers of WP sites • Staggered across different language editions to deal with traffic loads
  • 4. RESEARCH QUESTIONS / OBJECTIVES • Who is contributing to Wikipedia and how? • Who is using Wikipedia? • What are users' and contributors' perceptions of quality? • Pragmatic findings that help the WMF improve use and benefits. • Establish baseline for possible monitoring system (panel studies)
  • 5. SCOPE • Questionnaire contains 50+ questions (with sub-questions) on a broad variety of topics and is broken down into sections: – General, Contributing, Reading, Non-contributors, Ex-contributors • 310,000 users/contributors accessed the survey • 175,000 valid responses
  • 6. ANALYSIS • Extensive data cleaning (removed more than 3500 cases) • First sub-reports shared with WMF – Survey Overview (available via blog) – Non-contributors (for WMF presentation) – Quality
  • 7. ANALYSIS - NEXT STEPS • August 2009 – Share moderately anonymized data with WMF and make additional sub-reports available • November 2009 – Publish comprehensive survey report (including all sub-reports) • Post publication – Open access to all fully anonymized data
  • 8. LANGUAGE EDITION SURVEYS • 22 languages (incl. 2 surveys for Chinese) • Started with the largest language editions • Added further editions based on interest from the WMF, availability of volunteer translators, and diversity of the sample • Top 5 language editions account for ~80% of respondents • Russian is the largest single group (responses tested for manipulation)
  • 9. LANGUAGE EDITION SURVEYS
  • 10. LOCATION • Responses from 231 countries
  • 11. USER/ACTIVITY TYPES • Readers 66% Contributors 31% • Contributors: 4 hrs / week • Additional categorization based on focus areas
  • 12. AGE • Quartiles: 18 yrs, 22 yrs, 30 yrs, 85 yrs • Average age by type: All respondents 25.22, Readers 24.79, Contributors 26.14, Female 23.79, Male 25.69
  • 13. GENDER • Gender by user type • Female: 30% of readers, 12.5% of contributors
  • 14. EDUCATION • High levels of education (esp. given avg ages) • Contributors slightly higher than readers (~ 50% with tertiary education)
  • 15. MOTIVATIONS TO CONTRIBUTE • Ranked motivations (1st - 4th)
  • 16. REASONS FOR NOT CONTRIBUTING
  • 17. HOW TO INCREASE CONTRIBUTION • I would be much more likely to contribute if …
  • 18. FOCUS AREAS AND EXPERTISE • Culture & Arts is the most popular focus area, followed by Technology & Applied Sciences (then History, Geography) • 70-90% of contributors self-identify as “experts” • Highest shares of experts in technical and scientific fields • Focus areas do not correspond perfectly with expertise levels: “Geography & Places” attracts a high share of contributors, but comparatively low levels of expertise.
  • 19. FOCUS AREAS AND EXPERTISE
  • 20. PERCEPTIONS OF QUALITY • Quality compared to “traditional” encyclopedia – Reliability (only category where “traditional” received higher scores) – Broadness – Variety – Depth – Understandability – Timeliness • Compare reader and contributor responses
  • 21. QUALITY - RELIABILITY • The information provided is correct
  • 22. QUALITY - DEPTH • The information provides deep understanding of a topic
  • 23. QUALITY - VARIETY • A wide range of topics is dealt with
  • 24. PERCEPTIONS OF QUALITY • Contributors are both more critical (reliability, understandability) and more supportive (all other dimensions) than readers. • Relationship between transparency, understanding of the processes and mechanisms, and perception of quality.
  • 25. ANNEX – ADDITIONAL TABLES
  • 26. MOTIVATIONS TO CONTRIBUTE
  • 27. REASONS FOR NOT CONTRIBUTING
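
Slide 6 mentions extensive data cleaning (more than 3,500 cases removed) before analysis. The deck does not describe the actual cleaning rules, so the following is only a minimal pandas sketch of the kind of validity filtering such a step might involve; the file name and the column names (respondent_id, consent, completion_time_s) are hypothetical.

    import pandas as pd

    def filter_valid_responses(raw: pd.DataFrame) -> pd.DataFrame:
        """Drop obviously invalid cases: missing consent, duplicate
        submissions, and implausibly fast completions."""
        df = raw[raw["consent"] == True]                  # consenting respondents only
        df = df.drop_duplicates(subset="respondent_id")   # one row per respondent
        df = df[df["completion_time_s"] >= 60]            # drop sub-minute "speeders"
        return df

    raw = pd.read_csv("wikipedia_survey_raw.csv")         # hypothetical raw export
    valid = filter_valid_responses(raw)
    print(f"removed {len(raw) - len(valid)} cases, kept {len(valid)} valid responses")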
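
Slides 11 and 13 report shares by user type (Readers 66%, Contributors 31%) and the female share within each group (30% of readers, 12.5% of contributors). Assuming a cleaned table with hypothetical user_type and gender columns, shares of this kind can be tabulated as below; this is an illustration, not the survey's actual analysis code.

    import pandas as pd

    valid = pd.read_csv("wikipedia_survey_valid.csv")     # hypothetical cleaned file

    # Share of readers vs contributors among all respondents (slide 11)
    print(valid["user_type"].value_counts(normalize=True))

    # Gender composition within each user type (slide 13):
    # a row-normalized crosstab gives the female share among readers vs contributors
    print(pd.crosstab(valid["user_type"], valid["gender"], normalize="index"))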
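
Slide 12 gives age quartiles (18, 22, 30, 85 years) and mean ages for the whole sample and for readers, contributors, women, and men. A minimal sketch of how such summaries could be computed, again with hypothetical column names (age, user_type, gender):

    import pandas as pd

    valid = pd.read_csv("wikipedia_survey_valid.csv")     # hypothetical cleaned file

    # Quartile boundaries and the maximum, as listed on the slide
    print(valid["age"].quantile([0.25, 0.5, 0.75]), valid["age"].max())

    # Mean age overall, by user type, and by gender
    print(valid["age"].mean())
    print(valid.groupby("user_type")["age"].mean())       # Readers vs Contributors
    print(valid.groupby("gender")["age"].mean())          # Female vs Male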
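
Slides 20-24 compare readers' and contributors' agreement with quality statements across six dimensions (reliability, broadness, variety, depth, understandability, timeliness). A sketch of that comparison, assuming each dimension is stored as a numeric agreement score in the cleaned table; the column names and scale are assumptions.

    import pandas as pd

    valid = pd.read_csv("wikipedia_survey_valid.csv")     # hypothetical cleaned file

    quality_items = ["reliability", "broadness", "variety",
                     "depth", "understandability", "timeliness"]

    # Mean agreement per quality dimension, readers vs contributors side by side
    comparison = valid.groupby("user_type")[quality_items].mean().T
    print(comparison)   # rows: dimensions, columns: mean score per user type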
