Crowdsourcing the Lab:
Learning from “The Netflix Prize”
Team D9
Gian Calvesbert
Kobi Magnezi
Will Reid
Sarah Rubinton
Abhishek Sinha
Agenda
Innovation
Crowdsourcing
Netflix Prize
Analysis of Process and Results
Updates
Key Learnings and Takeaways
The traditional approach for problem solving
Many expensive R&D processes eventually fail.
Success rates are low, and the cost of failure is high.
The Crowdsourcing approach for problem solving
“The act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call.” – Jeff Howe, Wired
Why yes
•Less expensive
•Larger and broader knowledge base
•Greater potential for creativity, “outside the box” thinking
•Fun, attractive for more casual contributors
Why not
•Potential to expose strategic or product weaknesses to competitors
•Complicates intellectual property issues
•Contributors may not be aligned behind larger goals
Two Models for Crowdsourcing Innovation
Third Party Platforms
 Manage relationship between solvers and searchers
 Provide hub for potential solvers with a variety of problems
 Build a community and network
 Protect anonymity
Internally Run Contests
 Control over all aspects of contest rules and entry capability
 Immediate access to all solutions
 “Test as you go” process
 Control over collaboration
The Netflix Prize
Announced October 2, 2006
1% improvement: $50K Progress Prize; 10% improvement: $1M Grand Prize
Provided the public with a real data set of 1.4M+ unique records
Netflix blog: updated contest leaders and hosted an online collaborative environment for solvers
The Netflix Prize: Results
Several leading teams combined over 800 algorithms, developing a solution that beat the Netflix algorithm by 10.05%
Collaboration between teams allowed them to pool resources and create a more powerful algorithm
Lots of publicity over the success of crowdsourcing and the “future” of science
“A $1 Million Research Bargain for Netflix, and
Maybe a Model for Others”
“The Netflix Prize Was Brilliant -- Google
and Microsoft should steal the idea”
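The winning solution blended the outputs of hundreds of component models. A minimal sketch of one common blending scheme, a weighted average of per-model predictions — the function name, weights, and ratings below are illustrative, not the winning team's actual method:

```python
def blend(predictions, weights):
    """Combine several models' predicted ratings for one
    user/movie pair via a weighted average."""
    assert len(predictions) == len(weights)
    total = sum(weights)
    return sum(p * w for p, w in zip(predictions, weights)) / total

# Three hypothetical models predict a star rating; weights
# favor the model that performed best on held-out data.
blended = blend([3.6, 4.1, 3.9], [0.5, 0.3, 0.2])
print(blended)
```

In practice the blend weights themselves were fit against held-out data, which is part of why the final ensemble was so costly to reproduce in production.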
Alice: Would you tell me, please, which way I ought to go from here?
The Cat: That depends a good deal on where you want to get to.
Alice: I don't much care where.
The Cat: Then it doesn't much matter which way you go.
Success?
As of April 16th, Netflix still had not implemented the 10.05% improvement solution, more than five years after the contest was announced
Grading criteria were based on Root Mean Squared Error (RMSE)
Media backlash over the contest's “success”
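Submissions were scored by Root Mean Squared Error against held-out ratings. A minimal sketch of the metric (the function name and sample ratings are illustrative):

```python
import math

def rmse(predicted, actual):
    """Root Mean Squared Error: square the per-rating errors,
    average them, then take the square root."""
    assert len(predicted) == len(actual)
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted)
    )

# Hypothetical model predictions vs. actual 1-5 star ratings
print(rmse([3.8, 2.1, 4.6, 1.9], [4, 2, 5, 2]))
```

Because squaring penalizes large errors disproportionately, some participants argued RMSE was the wrong yardstick for recommendation quality — a criticism echoed in the media backlash.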
Intellectual Property
 Ownership of IP for the new algorithm stayed with the solvers
 Winning team from: AT&T, Commendo Research and Consulting, and individual contributors
 Netflix gained a non-exclusive license to the solutions
 Encouraged more solutions from employees of other companies
The Fallout
2006: Netflix business primarily based on through-the-mail DVDs
2009: Market shift to VOD, with much larger, more complex datasets and analytics
Netflix: “additional accuracy gains that we measured did not seem to justify the engineering effort needed to bring them into a production environment.”
Why did Netflix fail?
Implementation team not involved in winner selection
Contest period too long
Solution not aligned with goals
Crowdsourcing works best when four conditions hold:
Diversity of Opinion: everyone can interpret the facts in their own way
Independence: opinions of one actor are not influenced by the opinions of others
Decentralization: contributors draw on different specializations
Aggregation: a method exists for successfully combining opinions into a decision
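The aggregation condition can be illustrated with a toy simulation: when guesses are independent, diverse, and unbiased, a simple average lands far closer to the truth than a typical individual guess (all numbers below are illustrative):

```python
import random

random.seed(0)
truth = 100.0  # hypothetical quantity the crowd is estimating

# Independent, diverse guesses: each actor is noisy but unbiased
guesses = [truth + random.gauss(0, 25) for _ in range(500)]

# Aggregation by simple averaging
crowd_estimate = sum(guesses) / len(guesses)

crowd_error = abs(crowd_estimate - truth)
avg_individual_error = sum(abs(g - truth) for g in guesses) / len(guesses)

# The aggregated estimate beats the average individual by a wide margin
print(crowd_error, avg_individual_error)
```

If the independence condition is violated (actors copy each other), the errors stop canceling out and the averaging advantage largely disappears.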
The Netflix Cloud Prize
 10 $10,000 awards for various cloud service applications
  Best Feature
  Best Code Contribution
 Runs March-September, 2013
 Smaller overall awards for smaller projects
 “Quick hit”, easy-to-implement contributions
Problem Definition
Knowledge Selection
Experimentation Cycles
Determining Value
Learning from Past Mistakes
Key Improvements
• Shorter timeframe reduces risk of major technological change
• Smaller prize and shorter time serve to limit scope of work
• Nomination process screens submissions
Remaining Concerns
• Unclear goal of innovation
• Unmanaged collaboration environment
• Licensing model denies Netflix full access to intellectual property they paid to develop
The crowd doesn’t solve all
problems, but appropriate and
intentional use of crowdsourcing can
produce results for companies looking
to innovate.


Editor's Notes

  • #11 Alice: Would you tell me, please, which way I ought to go from here? The Cat: That depends a good deal on where you want to get to. Alice: I don't much care where. The Cat: Then it doesn't much matter which way you go.
  • #12 Thanks Will. So was the Netflix competition a success? Well, the winning team did achieve the milestone set by Netflix, but it came at a cost. As Will mentioned, the winning algorithm was a combination of hundreds of algorithms, and in the end management could not justify using it. Even now, five years later, the company has not implemented the final solution, though it did use an intermediate one. There was also considerable criticism from participants that the evaluation method, Root Mean Squared Error, was not appropriate for this analysis. Netflix's decision not to use the algorithm drew media backlash and negative press; reporters began calling it "The Myth of Crowdsourcing."
  • #13 Intellectual property was another issue that came up after this competition. The winners were a team of research scientists from AT&T's research labs, and they granted Netflix only a working license to the algorithm; ownership stayed with the developers. This is in contrast with the InnoCentive model, where all IP rights are transferred to the "Seekers." The Netflix challenge was a highly mathematical and statistical problem that required researchers from different companies to collaborate on a solution. Of all the people from 180 countries around the world who submitted solutions, only a handful came close to providing any substantial improvement to the algorithm.
  • #14 So what was the fallout of this competition? Netflix wasn't able to implement the solution due to high development costs. The company was sued by people who feared that private information had been used to run the contest, and it faced a similar problem with a second competition it tried to launch in 2009, which was to be based on user demographic data. The time frame for the first competition was also too long: the three-year timeline for developing the algorithm was unnecessary, and by the time the solution was announced and accepted, the company's model had moved on from mailing DVDs to video on demand, which involved much larger and more complex datasets.
  • #16 http://en.wikipedia.org/wiki/The_Wisdom_of_Crowds