Crowdsourcing in business

Crowdsourcing applications in the business world: the Netflix Prize and other examples.
Team D9, Boston University School of Management, MBA 2014
Gian Calvesbert, Kobi Magnezi, Will Reid, Sarah Rubinton, Abhishek Sinha



1. Crowdsourcing the Lab: Learning from “The Netflix Prize”
   Team D9: Gian Calvesbert, Kobi Magnezi, Will Reid, Sarah Rubinton, Abhishek Sinha
2. Agenda: Innovation; Crowdsourcing; Netflix Prize; Analysis of Process and Results; Updates; Key Learnings and Takeaways
3. The traditional approach to problem solving: Many expensive R&D processes eventually fail. Success rates are low, and the cost of failure is high.
4. The crowdsourcing approach to problem solving: “The act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call.” (Jeff Howe, Wired)
5. Why yes:
   • Less expensive
   • Larger and broader knowledge base
   • Greater potential for creativity, “outside the box” thinking
   • Fun, attractive for more casual contributors
   Why not:
   • Potential to expose strategic or product weaknesses to competitors
   • Complicates intellectual property issues
   • Contributors may not be aligned behind larger goals
6. Two Models for Crowdsourcing Innovation
   Third-party platforms:
   • Manage relationship between solvers and searchers
   • Provide hub for potential solvers with a variety of problems
   • Build a community and network
   • Protect anonymity
   Internally run contests:
   • Control over all aspects of contest rules and entry capability
   • Immediate access to all solutions
   • “Test as you go” process
   • Control over collaboration
7. The Netflix Prize
   • Prizes: 1% improvement: $50K; 10% improvement: $1M
   • Announced October 2, 2006
   • Provided the public with a real data set of 1.4M+ unique records
   • Netflix blog: updated contest leaders and online collaborative environment for solvers
8. The Netflix Prize: Results
   • Several leading teams combined over 800 algorithms, developing a solution that beat the Netflix algorithm by 10.05% (see the blending sketch after this transcript)
   • Collaboration between teams allowed shared resources to create a more powerful algorithm
   • Lots of publicity over the success of crowdsourcing and the “future” of science
   • “A $1 Million Research Bargain for Netflix, and Maybe a Model for Others”
   • “The Netflix Prize Was Brilliant -- Google and Microsoft should steal the idea”
9. “Would you tell me, please, which way I ought to go from here?” “That depends a good deal on where you want to get to.” “I don’t much care where.” “Then it doesn’t much matter which way you go.”
10. Success?
   • As of April 16th, Netflix still had not implemented the 10.05% improvement solution, over 5 years from the announcement date
   • Grading criteria were based on Root Mean Squared Error (RMSE); see the sketch after this transcript
   • Media backlash over “Success”
11. Intellectual Property
   • Ownership of IP for the new algorithm stayed with the solvers
   • Winning team from AT&T, Commendo Research and Consulting, and individual contributors
   • Netflix gained a non-exclusive license to the solutions
   • Encouraged more solutions from employees of other companies
12. The Fallout
   • 2006: Netflix business primarily based on through-the-mail DVDs
   • 2009: Market shift to VOD: much larger, more complex datasets and analytics
   • “Additional accuracy gains that we measured did not seem to justify the engineering effort needed to bring them into a production environment.”
13. Why did Netflix fail?
   • Implementation team not involved in winner selection
   • Contest period too long
   • Solution not aligned with goals
14. Crowdsourcing works best with:
   • Diversity of opinion: everyone can interpret the facts in their own way
   • Independence: opinions of one actor are not influenced by the opinions of others
   • Decentralization: different specializations
   • Aggregation: a method for successfully making a decision
15.
   • 10 $10,000 awards for various cloud service applications
   • Best feature
   • Best Code Contribution
   • Runs March-September, 2013
   • Smaller overall awards for smaller projects
   • “Quick hit”, easy-to-implement contributions
16. Problem Definition; Knowledge Selection; Experimentation Cycles; Determining Value
17. Problem Definition; Knowledge Selection; Experimentation Cycles; Determining Value
18. Learning from Past Mistakes
   Key Improvements:
   • Shorter timeframe reduces risk of major technological change
   • Smaller prize and shorter time serve to limit scope of work
   • Nomination process screens submissions
   Remaining Concerns:
   • Unclear goal of innovation
   • Unmanaged collaboration environment
   • Licensing model denies Netflix full access to intellectual property it paid to develop
19. The crowd doesn’t solve all problems, but appropriate and intentional use of crowdsourcing can produce results for companies looking to innovate.
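Two technical points in the transcript above may benefit from a concrete illustration: the leading teams’ blending of hundreds of algorithms (slide 8) and the Root Mean Squared Error grading criterion (slide 10). The following is a minimal Python sketch, not the contestants’ actual code; all ratings, model names, and blend weights are made-up illustrative assumptions.

# Sketch: score a few toy rating predictors with RMSE (the Netflix Prize
# metric) and combine them with a weighted average, the basic idea behind
# the ensembles the leading teams blended. Data and weights are invented.
import math

# Hypothetical held-out ratings (1-5 stars) and predictions from three toy "models".
actual = [4.0, 3.0, 5.0, 2.0, 4.0]
model_preds = {
    "global_mean": [3.6, 3.6, 3.6, 3.6, 3.6],   # always predicts the overall average
    "model_a":     [4.5, 3.4, 5.5, 2.6, 4.4],   # tends to over-predict
    "model_b":     [3.5, 2.6, 4.6, 1.6, 3.5],   # tends to under-predict
}

def rmse(predicted, observed):
    # Root Mean Squared Error: sqrt(mean((prediction - observation)^2))
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

# Linear blend: a weighted average of the individual predictions.
weights = {"global_mean": 0.2, "model_a": 0.4, "model_b": 0.4}
blend = [sum(weights[name] * preds[i] for name, preds in model_preds.items())
         for i in range(len(actual))]

for name, preds in model_preds.items():
    print(f"{name:12s} RMSE = {rmse(preds, actual):.4f}")
print(f"{'blend':12s} RMSE = {rmse(blend, actual):.4f}")

On this made-up data the blend scores a lower RMSE than any single predictor because model_a and model_b err in opposite directions; the winning teams applied the same principle at far larger scale, tuning blend weights over hundreds of diverse algorithms to reach the 10.05% improvement cited on slide 8.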
