An evaluation of the potential of Web 2.0 API's for social research

Mechant, P. & Courtois, C. (2011). An evaluation of the potential of Web 2.0 API's for social research. In: COST Action IS0906: New challenges and methodological innovations in European Media Audience Research, Zagreb, Croatia, 7 April 2011, pp. 43-44.


Presentation Transcript

  • Cédric Courtois and Peter Mechant, IBBT-MICT, Ghent University. Using APIs in Web 2.0 user research: The case of networked public expectancies and feedback preferences on YouTube
    • YouTube accounts for over 20% of HTTP traffic
    • Over 50,000 new videos every day, mostly user-generated
    • RQ 1: Who do these uploaders expect to view their videos?
    • RQ 2: How do they know this public is attained?
    • RQ 3: Are these expectancies reliable?
    • Unfunded, multi-method research project
    • Consists of three phases:
      • Qualitative exploration (20 face-to-face interviews)
      • Quantitative validation (two samples: N = 450 and N = 242)
      • Longitudinal analysis of online feedback (N = 242)
    • Combination of self-report and platform data (Google API); see the data-linking sketch after this list
    • Results in press:
      • Courtois, C., Mechant, P., Ostyn, V. & De Marez, L. (in press). Uploaders' Definitions of the Networked Public on YouTube and their Feedback Preferences: A Multimethod Approach. Behaviour & Information Technology.
      • Courtois, C., Mechant, P. & De Marez, L. (in press). Teenage Uploaders on YouTube: Networked Public Expectancies, Online Feedback Preference and Received On-Platform Feedback. Cyberpsychology, Behavior, and Social Networking.
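    To illustrate the combination of self-report and platform data, the sketch below (Python, using pandas) joins a hypothetical survey export with harvested platform data on the uploader's YouTube username; the file and column names are illustrative assumptions, not the authors' actual pipeline.

      import pandas as pd

      # Self-report data: one row per respondent, keyed on the YouTube
      # username entered in the survey (hypothetical export).
      survey = pd.read_csv("survey.csv")       # youtube_username, expectancy and feedback scales
      # Platform data harvested through the API: one row per uploader (hypothetical export).
      platform = pd.read_csv("harvest.csv")    # youtube_username, video counts, view counts, ...

      # Link both sources into a single analysis table.
      combined = survey.merge(platform, on="youtube_username", how="inner")
      combined.to_csv("combined_dataset.csv", index=False)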
    • Phase 1: qualitative exploration
    • Theoretical framework:
      • Audiences versus publics: passive versus active, critically engaged
      • Concept of Networked Publics
      • Collective Effort Model (Social Loafing): engage in collective task?
      • Importance of feedback in computer-mediated communication (confirmation of the ‘imagined audience’)
    • Analysis: combination of deductive and inductive coding
      • All instances of groups: differentiation
      • All instances of feedback types and the attributed importance
    • Phase 1: qualitative exploration (results)
    • Three types of networked publics mapped onto two dimensions
      • Identified offline public: physical acquaintances, socially embedded
      • Identified online public: like-minded online in-group
      • Unidentified online public: unfamiliar online out-group
    • Phase 1: qualitative exploration (results)
    • Seeming contingencies between public types and feedback
      • Identified offline public: offline feedback (real-life comments/conversations) and off-platform online feedback (via IM, E-mail, SNS)
      • Identified online public: on and off-platform feedback (via comments, rates, views)
      • Unidentified online public: online on-platform feedback
    • Expectancy strengths
      • Identified public types > unidentified public types
      • Offline public type > online public types
    Subsequent quantitative phase: operationalization and testing of qualitative results
    • Phase 2: quantitative study (methodology)
      • Step 1: selection of potential respondents through the Google API (harvested into a database; see the API sketch after this list)
      • Step 2: invitation through comments underneath their videos: a call to action with a hyperlink to a survey about the uploader's latest video, measuring public expectancies and feedback importance (N = 450, mean age = 23.70, SD = 12.32, 73% male)
      • Step 3: harvesting of respondents’ public profile data
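    A minimal sketch of Steps 1 and 3 against the current YouTube Data API v3 (the study itself used the then-current Google GData API). The search query, database layout and field choices are illustrative assumptions, and an API key is required; Step 2 (posting invitation comments) needs authenticated write access and is omitted here.

      import sqlite3
      import requests

      API_KEY = "YOUR_API_KEY"  # assumption: a valid YouTube Data API v3 key

      def search_recent_videos(query, max_results=50):
          """Step 1: find recently uploaded videos matching a query, return (video, uploader) pairs."""
          resp = requests.get(
              "https://www.googleapis.com/youtube/v3/search",
              params={"part": "snippet", "q": query, "type": "video",
                      "order": "date", "maxResults": max_results, "key": API_KEY},
          )
          resp.raise_for_status()
          return [(item["id"]["videoId"], item["snippet"]["channelId"])
                  for item in resp.json().get("items", [])]

      def harvest_channel(channel_id):
          """Step 3: harvest an uploader's public profile data (channel snippet and statistics)."""
          resp = requests.get(
              "https://www.googleapis.com/youtube/v3/channels",
              params={"part": "snippet,statistics", "id": channel_id, "key": API_KEY},
          )
          resp.raise_for_status()
          items = resp.json().get("items", [])
          return items[0] if items else None

      # Step 1 continued: store candidate respondents in a local database.
      con = sqlite3.connect("respondents.db")
      con.execute("CREATE TABLE IF NOT EXISTS uploaders (video_id TEXT, channel_id TEXT)")
      for video_id, channel_id in search_recent_videos("vlog"):
          con.execute("INSERT INTO uploaders VALUES (?, ?)", (video_id, channel_id))
      con.commit()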
  • Phase 2: quantitative study (results)
    The majority of findings from the qualitative phase are supported:
    • H1a: Offline public > Online publics: ✓ and ✗ (within-subjects ANOVA)
    • H1b: Identified publics > Unidentified public: ✓
    • H2a: Identified offline public expectancy → Offline and off-platform online feedback importance (+): ✓ (structural equation modeling)
    • H2b: Identified online public expectancy → On- and off-platform online feedback importance (+): ✓
    • H2c: Identified offline public expectancy → On-platform online feedback importance (+): ✗
    Model fit: χ²(108, N = 450) = 223.29, p < .001, CFI = .98, TLI = .97, RMSEA = .05, 90% CI [.04, .06]
    • Implications
      • The majority of users upload for people they have an affinity with; the broader YouTube community is not important to them
      • Strong contingencies between the type of expected public and the feedback that is appreciated (lack of feedback does not imply an unsatisfied uploader)
      • Yet: do uploaders who seek online feedback get what they want? A second study was set up to test this …
    • Second study
      • Quantitative study on teenagers (N = 242, age 12-18), partial replication of previous study: all three public types and online feedback importance were included
      • Results show an exact replication of the first study’s results (expectancy strengths and expected public-feedback contingencies)
      • Additional analysis: analysis of longitudinal growth of received online on-platform feedback (comments, rates, views)
    • Methodology
      • Steps 1-3: identical to the first study: invite respondents through the API, gather self-report survey data, and harvest profile and latest-video information
      • Steps 4-5: collect video information at fixed one-month intervals, yielding three measures (see the polling sketch after this list)
      • Hypotheses:
      • H1-3: Online identified public (similar interest, activity): more views, comments and rates
      • H4-6: Online unidentified public (online out-group): no effect on the number of views, comments and rates
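    A minimal sketch of Steps 4-5 under the same assumptions as the earlier API snippet: the public statistics of each respondent's latest video are re-read at every measurement wave and appended as a timestamped row, which after three monthly runs yields the panel of views, comments and rates (approximated here by the v3 likeCount).

      import csv
      import datetime
      import requests

      API_KEY = "YOUR_API_KEY"  # assumption: a valid YouTube Data API v3 key

      def video_statistics(video_id):
          """Read the current view, comment and like counts of one video."""
          resp = requests.get(
              "https://www.googleapis.com/youtube/v3/videos",
              params={"part": "statistics", "id": video_id, "key": API_KEY},
          )
          resp.raise_for_status()
          items = resp.json().get("items", [])
          return items[0]["statistics"] if items else {}

      # Run once per wave (T1, T2, T3), e.g. from a monthly cron job.
      with open("video_ids.txt") as f:              # hypothetical list of respondents' latest videos
          video_ids = [line.strip() for line in f if line.strip()]

      with open("feedback_panel.csv", "a", newline="") as out:
          writer = csv.writer(out)
          today = datetime.date.today().isoformat()
          for vid in video_ids:
              stats = video_statistics(vid)
              writer.writerow([today, vid,
                               stats.get("viewCount"),
                               stats.get("commentCount"),
                               stats.get("likeCount")])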
    • Analysis and results (latent growth modeling)
      • H1-3: Online identified public (similar interest, activity): more views (partial ✓), comments (partial ✓) and rates (✓)
      • H4-6: Online unidentified public (online out-group): no effect on views (✓), comments (✓) and rates (✓)
      • Hence, expectancies are accurate …
    T1 = after filling out the survey, T2 = one month later, T3 = two months later. Separate models were fitted for views, comments and rates (a generic growth specification is sketched below).
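    For reference, a generic linear latent growth specification consistent with three monthly measures reads as follows (standard textbook form, not necessarily the exact model estimated in this study):

      y_{it} = \eta_{0i} + \lambda_t \, \eta_{1i} + \varepsilon_{it}, \quad \lambda_{T1} = 0, \ \lambda_{T2} = 1, \ \lambda_{T3} = 2

    where y_{it} is the feedback count (views, comments or rates) of uploader i at wave t, \eta_{0i} is the latent intercept (initial feedback level), \eta_{1i} is the latent slope (monthly growth), and the expected-public scores enter as predictors of both latent factors.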
    • Conclusion
      • Results of the first study replicate well, even in a different context (replications are important)
      • Combining multiple data sources strengthens the design (large amounts of informative data are freely available)
      • This holds for recruitment as well as for dedicated analysis
    Thank you for listening… Any questions? Contact: cedric.courtois@ugent.be, www.mict.be