Does It Have Legs?
Test Management Forum
Balls Brothers, London
25th April 2012
Warning To Delegates (Old & New)
This is NOT a presentation per se.
This is a facilitated discussion workshop.
These slides were written to:
1) Stimulate interactive discussion.
2) Give that discussion some shape.
3) Provide a minimum level of content for the session if
we can’t spark and sustain a discussion.
About Me
• Academic background in computer science.
• Graduated in ’98, straight into contracting;
punctuated by a 5 year perm stint.
• These days working at or above programme level,
often troubleshooting for big transformational
programmes.
• Head Of Testing at the BBC is probably my most
notable role.
• Currently available (part-time work is of
particular interest).
I’m fully independent and have nothing to declare!
I have no personal or commercial relationship with any of
the vendors/authors/publishers cited in these slides.
• As ably demonstrated by this slide, I’m a PowerPoint
amateur of the lowest possible calibre.
• I’m a very inexperienced speaker (although I did this
talk at the Test Management Summit last Feb).
• I probably should be more neutral. Crowdsourcing has
really captured my imagination and I think that in the
right circumstances it can be brilliant.
My Primary Objectives
• Impart some of what I’ve learned.
• Arrive, through discussion, at a realistic view of
the broad pros, cons, applicability, etc. in the hope
that it will be possible to:
– Assuage some fears that may be out there
– Tempt people to look into this (if just 1 person here
does a pilot in their organisation I’ll be happy).
• Identify some potentially useful resources.
• Give my own answer to “Does It Have Legs?”.
• Get you guys talking and thinking about the topic
(by far the most important objective!).
Session Outline
• Me: Crowdsourcing - A brief introduction.
• Discussion: Real world example(s).
• Me: Crowdsourced Testing - A quick overview.
• Group Poll: Who has experience as a client and/or crowd member?
• Discussion: Motivations/pros/cons/applicability.
• Me: My own case study from Summer 2011.
• Me: Map case study experience to the output of the
previous group discussion (addressing any gaps between the two).
Session Outline Continued…
• Discussion: Questions about the case study.
• Me: Crowdsourcing experiences beyond testing (time permitting).
• Me: A look ahead to the future.
• Me: Conclusions.
• Discussion: Reactions to my conclusions or view of the
future? Crowdsourcing anecdotes outside of testing? Other
reflections directly or tangentially related to the topic?
• Me: Close.
***I’ll try to accommodate questions as we go***
Introduction To ‘Crowdsourcing’
A term first coined by journalist/author Jeff Howe
who offers two definitions:
“The act of taking a job traditionally performed by a
designated agent (usually an employee) and
outsourcing it to an undefined, generally large
group of people in the form of an open call.”
“The application of Open Source principles to fields
outside of software”.
• Jeff Howe’s (seemingly inactive) blog is here.
• I’ve read his excellent book, which contains lots of examples.
• Key lessons are that:
– Money is rarely a motivator for crowd members.
– The crowd knows when it is being exploited;
treading this line is hard for crowd users.
Google Trends shows that more people are looking into it.
Lots going on in this space:
• Venture Capitalists are active in this area. Allegedly over
$280m invested in 2011.
• There are a number of high-profile hub sites.
• Discussions on things like a code of practice for the ethical
treatment of crowd members illustrate the area’s growing
maturity.
My Real-World Example
• Simon Cowell crowdsources market research and publicity for the acts.
• This isn’t just free for Mr Cowell: viewers actually pay to access premium
voting lines, and the media rights are heavily exploited.
• Having de-risked the product gestation, both financially and creatively, the
punters then dutifully buy the product that they said they wanted and paid to
vote for.
• Unadulterated genius.
• And with the format being syndicated around the world, it’s an excellent
example of commercial crowdsourcing at work.
Crowdsourced Testing: A Quick Overview
• I first heard of it (and the broader crowdsourcing topic) at
the Jul 2008 TMF in a talk by James Whittaker (who said it
was part of Microsoft’s vision of the future).
• Piqued my interest as I was at the BBC wrestling with
extensive browser/OS coverage requirements triggered by
the BBC Charter.
• uTest.com was the only player in the game back then.
• uTest had raised about $12m at that stage ($37m last time I
looked), all of it after the credit crunch.
Crowdsourced Testing: A Quick Overview
• Unlike crowdcasting (e.g. logo design), where the winner takes
all, in testing everyone can contribute a piece of the overall
effort.
• That makes testing ideally suited to engaging a motivated crowd.
• Go to the brilliant Zappers event run by TCL (uTest’s UK
affiliate) and you’ll meet excellent crowd testers who are
motivated by non-monetary factors (more on that shortly).
• The existing maturity of test management tools has provided
an excellent basis for the (essential) crowd management.
uTest
• US-based company, pilot system launched in Feb ’08.
• Circa 53,600 in the crowd (circa 3300 in the UK). But how many active?
• Sales team in US (at least some of them), handing over to TCL (uTest’s UK
affiliate) to project manage the cycles.
• A functional management interface (but not commensurate with the
funding received or the time since launch).
• iPhone app available (but buggy as of last summer).
• Fixed price model for the client, testers paid per bug or artefact on the
other side of the equation.
• TCL PM drives crowd selection, but with a client steer.
• Daily reporting (if desired) with an optional test lead to sanity check
submissions at minimal cost (approx £8 per day).
• Offers functional and non-functional (although I’m not sure about the
repeatability of the latter). Covers mobile, web and desktop.
Alternatives To uTest
I’ve briefly looked at, but not used:
– Bugfinders (UK company, client pays per bug).
– Centre4Testing (Leverages C4T’s candidate pool, UK crowd,
formally engaged to deliver coverage. In quoting for my case
study they wanted a lot more info and lead time than uTest.
About 10% dearer than uTest on the 1st cycle but about 50%
cheaper for subsequent cycles. Also slightly cheaper on 1st cycle
if you exclude uTest’s new client discount).
I’m aware of, but have not looked at:
– 99tests (Seemingly not associated with 99designs.com)
I’m sure there are (and will be) others.
Crowdsourced Testers: Some Motivations
• Practising skills/applying theory.
• Escapism from the bureaucracy of the day job.
• Indulging desire to break things.
• Raising profile within the testing community/job market *
• Exploring testing as a career choice.
• Taking a trip down memory lane (for ex-practitioners).
• Ego boost.
• Benchmarking bug hunting capability against peers (for kudos).
• Professional/social networking within the testing community.
• Trying/researching the latest (or upcoming) software.
• Altruism/making a difference/being part of something.
• Gain potential insights on the offerings of competitors.
• Having fun/alleviating boredom.
• For many (most?), money is not a material factor and they have a day job for subsistence.
* I’ve since met the uTest team lead for my case study and would happily engage him in a full-time role.
Crowdsourced Testing: Some Pros
• Cheap (at least superficially). Significantly more eyeballs on the case £ for £; compares
extremely favourably with a single contractor (more on that later).
• Externalise accommodation costs (desks, PCs, servers, etc).
• Ultra-rapid ramp-up time (vis-à-vis recruitment) and feedback loop (bugs can potentially start arriving very quickly).
• Flexible and rapid engagement model for reactively rolling testers on and off the project (e.g.
good for post-release testing).
• Mitigates costs of test environment provision in the face of platform proliferation.
• 24/7 productivity with evening and weekend work at zero marginal cost.
• Potentially cheap form of usability testing if you know how to frame the survey.
• Potentially cheap form of accessibility testing.
• Access to a huge diversity of testers with an enormous breadth of experience.
• Some things need testing in the wild.
• Suited for multi time zone testing.
• Suited for localisation testing (e.g. language).
• The ultimate in laying your software bare for scrutiny by 100% independent testers.
• Can benefit from completely fresh thinking if you solicit feature suggestions from the testers.
• Can be used to confirm the suitability of operations procedures around aspects like user administration.
Crowdsourced Testing: Some Cons
• Lack of direct accountability (there are some ‘soft’ sanctions like star ratings and feedback to platform operator).
• System knowledge and trust vested in un-contracted resources (scope to reverse engineer system under test?)
• Crowd members may detect vulnerabilities, not report them and then subsequently exploit them.
• Could be unsettling to in-house testers.
• Could be seen as de-valuing testing within the organisation.
• If it’s a desktop application and it gets leaked – a watermark system may be scant consolation.
• Testers may (probably?) care less than their internal counterparts.
• Need to provide strangers with access to your test instance.
• PR risks associated with exposing early-stage code.
• Almost certainly a greater variability in the quality of defect reports versus an internal team.
• Management and auditability of coverage (and test data) is probably harder to achieve.
• Access to lots of environments but at the expense of configuration control (e.g. knowledge of Windows patches).
• Exposing intellectual property (to public and competitors).
• Some tests may be considered too ‘core’ to delegate to the crowd.
• Danger of defect overload (especially if duplicates aren’t managed).
• Crowd is unlikely to meet very specific requirements for subject matter expertise.
• The more specific the test requirement, the less likely it is that the crowd can resource that requirement.
• SLA management – there isn’t any!
• Continuity of participation not guaranteed (e.g. a top performing crowd member may go off travelling).
• Could confer unwarranted confidence; relies on stakeholders both hearing and understanding these downsides.
Crowdsourced Testing: Applicability
• Start-ups that can’t sustain in house test resource (or established organisations that have no test resources).
• Organisations that aren’t exclusively reliant on the crowd (notwithstanding my point above).
• Agile teams that want out-of-hours productivity (especially for smoke testing purposes) and/or to mitigate the
effect of having a single embedded tester working in mini-waterfalls.
• Non-regulated public facing systems that need to undergo testing out in the wild (corporate systems involving
sensitive/financial data are much less appropriate).
• Organisations where the workload is highly bi-modal, since this approach can smooth out the peaks and
troughs.
• Places that lack the time/environments to test in-scope OS/browser combinations.
• Environments where exploratory testing has been culturally accepted.
• Environments that may want to target users in specific geographies to enable, for example, localisation testing
out in the wild or to expose ways in which users from various cultures may use the software differently.
• Environments that produce the proverbial $3 mobile app.
• Sites that are subject to continual functional change and/or will experience high traffic volume.
• Environments that are looking for a sanity check/mop-up to complement their existing in-house test team.
• Sites where public buy-in is important to the marketing and brand-building effort.
• Sites that are intended to be intuitive enough for the public to use with no training.
• Environments producing low-risk software, i.e. on which lives and large commercial revenues aren’t
dependent.
• Environments that have test requirements around usability and accessibility but can’t afford specialist advice.
• Bleeding edge software that would benefit from the kind of tyre-kicking that crowdsourced testing enables.
Case Study: The Context
• One of Europe’s largest price comparison sites with profits measured in
• Wholly reliant on TDD (within Kanban), save for one overworked manual
tester located on the continent.
• Requirement was for post-release testing of a UK-only web proposition.
• No time or money to recruit contract testers.
• 3 week window of opportunity for testing but with scope for it to slip.
• No meaningful collateral from which to derive tests (just a load of binned tickets).
• Engaged by the Programme Director who wanted to de-risk the testing
and was open to suggestions. He knew about uTest but had no
time/patience with working through the relevant questions.
• I wanted crowdsourcing experience so I offered to take this off his plate.
Case Study: The Solution
• I became the uTest interface and clarified the commercials, legals,
process etc. Need to read the terms – uTest can quote you as a
case study unless you opt out.
• uTest sales team willing to progress paperwork over the weekend.
• Commissioned a UK-only crowd for 3 calendar weeks of exploratory
testing at a cost of $3000 which factored in a $600 new client
discount. We paid $250 to have a team lead (in addition to the TCL
PM) to sanity check submissions. $3,250=£2,075.
• uTest provided a fixed IP to enable test traffic to be excluded from the site analytics.
• Coverage: Windows: Chrome, FF, IE 7/8/9 (+ Mac/Safari).
• Testers were given a minimal briefing with goals comparable in
detail to a charter within session-based test management.
• TCL PM provided lightweight daily reports tailored to our
requirements (weekdays only).
Case Study: The Outcome
• TCL invited 35 testers of which 17 accepted with 10 submitting defects.
Around 80 defects (from memory) with a rejection rate circa 10%.
Some useful site review feedback was also provided (at no extra cost).
• The reporting and defect sanity checking worked well and made the
internal triage process more effective.
• Bug detection continued during weekends and evenings through to the
early hours of the morning.
• Rightly or wrongly, the client was delighted with the results.
• Programme Director – who is a contractor – has vowed to “try and use
uTest in as many future roles as possible as it worked brilliantly”.
• Whilst recognising that the bar had been set low for adding value
(post-release, no internal testers, non-complex web app etc) I also felt
positive about the experience.
• I felt it was too cheap; I wonder what time horizon the uTest backers
have in mind for turning a profit.
My Crowdsourcing Experiences Beyond Testing
• I used to work at Credit Market Analysis which ingests, scrubs and aggregates
crowdsourced emails containing unstructured dealer quotes for credit default
swaps to enable premium analytics to be sold.
• I managed the process of crowdsourcing both a name and a logo for a start-up
company I’ve been involved with which just happens to owe its very existence
to crowdsourcing. I wrote the brief in both cases and used:
www.squadhelp.com for the name (these guys also do crowdsourced testing)
www.99designs.com for the logo.
In both cases the end-results were excellent, but the process was a shambles
and the commercial opportunity was embarrassingly under-exploited by the
sites concerned.
My Guesses About The Future
• Will not replace in-house testing but will become more prevalent as the awareness of the broader
crowdsourcing paradigm increases.
• Aggressive start-ups and new media organisations without an engineering culture will lead the way.
• uTest will remain dominant courtesy of its head start and funding. If the pie looks big enough a
blue chip might step in, either as a competitor or, more likely, to give uTest’s backers their exit. A
cloud infrastructure provider may be interested.
• Public service role? Potentially useful to the probation service or entities supporting rehabilitation.
• Uptake will be driven by structural market factors such as:
– Increased adoption of exploratory testing.
– Further erosion in the status of manual testers.
– Proliferation of devices requiring coverage (smartphones, tablets, etc).
– An increasing acceptance of cloud-based solutions.
Conclusions
• A rich and burgeoning topic area in which venture
capitalists are quite active.
• Still a young area with massively un-tapped potential, but
established enough to not be dismissed out of hand.
• uTest looks dominant but there are other options.
• In the right hands and circumstances it can be a very
powerful addition to your context-sensitive toolbox.
• My uTest experience was positive but the bar was set low.
• Its disadvantages will be unacceptable for some clients.
• In my view, crowdsourced testing does have legs which
will see its adoption rise over the coming years.
• Thanks for listening and participating.
• I’m around for the rest of the day (and the drinks).
• Let me know if you pilot this approach or if you’re
interested in engaging me as a consultant for this or
other aspects of your testing regime (I’m on LinkedIn).
• Hopefully it will be possible to rejuvenate this thread:
Paul Gerrard's 2009 TMF Thread About Crowdsourced Testing