Analytics and Witch Doctoring: A Cure for the Black Box Mentality
February 1, 2011
O’Reilly Strata Conference
J.C. Herz, Batchtags LLC
jc@tripledex.com
Analytics: An Intervention
• Original Sin
• Enterprise Pathologies
• Critical Questions
Analytics: Occult Phenomenon
Very powerful
Don’t understand it
Practitioners possess arcane knowledge
Alchemy
Secret Algorithms
Greek Letters Are Your Kryptonite
High Status Helplessness
• If you understood the technology, you’d be one of those people whose job it is to make technology work.
• You know, underlings
Executive ADD - Kaching!
• Re-starts are where consulting shops make their money
Mid-Life Crisis
Shiny Pebble Syndrome
• Infoviz Porn: Visualization with no use case
  – Ex: Social Network visualization. Why?
  – START with a use case and work forward
• Demo Envy: just because it looks slick doesn’t mean it’s possible, or even advisable, to pipe your data into it.
A Ballad of Spectacular Information Display
• Time Magazine, 1976
• Telex text routing: information off the wire goes to terminals, properly foldered
• Z8 terminal display awes executives
• Pneumatic system not eliminated
Technical Reality vs. Leadership Attention Span
Half Ass Syndrome
• Halfway into the project, jump off into the next problem.
• Haven’t refined results or hypothesis
• Failure blamed on technology, but it’s really loss of interest and desire for instant gratification
Shelfware Syndrome
• The guy who was driving the program left...
• Approach-Avoidance conflict --> pilot-itis
• A US agency has $30M of software that hasn’t been installed… some of it with maintenance contracts.
• Base Model vs. Fully Loaded
  – One enterprise bought $12M worth of Autonomy before figuring out that the add-ons they needed would be another $22M.
Customization Before Testing
Critical Question: What is the Validation Test?
• Formulating the validation test keeps both the customer and the developer focused - and honest
• Suggest pay for performance, and see if the developer or vendor freaks out.
• Make sure validation is ongoing - in case the ground is shifting (a minimal sketch follows below)
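A minimal Python sketch (not from the deck) of what “ongoing validation” can look like in practice: re-score predictions against realized outcomes on a schedule instead of validating once at launch. The file name (outcomes.csv), the column names, and the 0.75 pay-for-performance floor are illustrative assumptions.

import csv
from datetime import date

ACCURACY_FLOOR = 0.75  # illustrative pay-for-performance threshold

def load_outcomes(path):
    # Assumed export: one row per prediction, with "predicted" and "actual" columns.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def hit_rate(rows):
    # Fraction of predictions confirmed by what actually happened.
    if not rows:
        return 0.0
    return sum(1 for r in rows if r["predicted"] == r["actual"]) / len(rows)

if __name__ == "__main__":
    rows = load_outcomes("outcomes.csv")
    score = hit_rate(rows)
    print(f"{date.today()}: validated {len(rows)} predictions, hit rate {score:.1%}")
    if score < ACCURACY_FLOOR:
        print("Below the agreed floor -- revisit the model before acting on it.")

Run on a recurring schedule, this keeps the validation test alive in case the ground shifts.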
Data Due Diligence & Auditing
Shame
Shame
Ugly Babies
Critical Question: Data Quality
• How complete is it?
  – Ex: 600 custom fields, only two have more than 50% coverage (a coverage-audit sketch follows below)
• How accurate is it? How do you know?
• How consistent is it?
  – Good test: make three calls to different parts of the company, to get an answer to a factual question that doesn’t require calculation.
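The “600 custom fields, only two above 50% coverage” finding is easy to reproduce on your own data. A minimal sketch, assuming the records can be exported to a CSV file; the file name crm_export.csv and the 50% threshold are illustrative.

import csv

COVERAGE_THRESHOLD = 0.5  # "more than 50% of records have a value"

def field_coverage(path):
    # Return {field_name: fraction of rows with a non-empty value}.
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return {}
    return {
        field: sum(1 for r in rows if (r.get(field) or "").strip()) / len(rows)
        for field in rows[0]
    }

if __name__ == "__main__":
    coverage = field_coverage("crm_export.csv")  # assumed export of the custom fields
    usable = sorted(f for f, c in coverage.items() if c > COVERAGE_THRESHOLD)
    print(f"{len(usable)} of {len(coverage)} fields exceed {COVERAGE_THRESHOLD:.0%} coverage")
    print(usable)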
Critical Question: Half-Life of Data
• How long is the data accurate?
• How long is the data useful? (a staleness check is sketched below)
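One hedged way to put a number on the half-life question: count how much of the data is older than the window over which you still trust it. The last_updated column, its date format, and the 180-day window below are assumptions for illustration.

import csv
from datetime import date, datetime

USEFUL_FOR_DAYS = 180  # assumed "still useful" window

def stale_fraction(path, today=None):
    # Fraction of records last touched more than USEFUL_FOR_DAYS ago.
    today = today or date.today()
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return 0.0
    stale = sum(
        1 for r in rows
        if (today - datetime.strptime(r["last_updated"], "%Y-%m-%d").date()).days > USEFUL_FOR_DAYS
    )
    return stale / len(rows)

if __name__ == "__main__":
    print(f"{stale_fraction('records.csv'):.1%} of records are past their assumed useful life")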
Critical Question: Real World Context
• Without real world data, “behavioral” metrics are misleading
• Where is the transactional data that validates insights from non-transactional data? (see the sketch below)
• How would you prove the magic analytics WRONG?
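One way to ask the data itself, sketched under assumptions (hypothetical file and column names; statistics.correlation needs Python 3.10+): join the behavioral scores to actual transactions and check whether they move together at all.

import csv
from statistics import correlation  # Python 3.10+

def load_column(path, key):
    # Read {customer_id: numeric value} from an assumed CSV export.
    with open(path, newline="") as f:
        return {r["customer_id"]: float(r[key]) for r in csv.DictReader(f)}

if __name__ == "__main__":
    engagement = load_column("behavioral_scores.csv", "engagement_score")
    revenue = load_column("transactions.csv", "revenue_90d")
    shared = sorted(set(engagement) & set(revenue))
    r = correlation([engagement[c] for c in shared], [revenue[c] for c in shared])
    print(f"{len(shared)} customers matched; engagement-vs-revenue correlation = {r:.2f}")
    # A correlation near zero is one concrete way to prove the magic analytics wrong.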
Data: Gut Check
• Are you prepared to spend painful amounts of money cleaning up your data?
• Crack heads if people don’t share data?
• Make business units accountable for their data?
• Play hardball to make sure data is not stored in single-application proprietary formats?
Critical Questions: Workflow
• What workflow changes will this proposed capability require?
• People hate changing their workflow, even if it’s an improvement
• Never attribute to stupidity what can be attributed to laziness
• What is your plan for changing workflow? How do you enforce it?
The perfect application that no one uses is still worthless
Process
Politics
What are you prepared to do?
Critical Question: Consequences
• What actions are you willing to take on the basis of validated analytic insight?
  – Change your product?
  – Change your marketing budget?
  – Change people’s job descriptions?
  – Re-allocate R&D budgets?
• What actions are you not willing to take?
Stakes
Critical Question: Tempo
• How fast will a decision be made on the basis of analytic insight?
• Quarterly?
• Daily?
• Within seconds?
• Milliseconds?
• Never?
• Realtime vs. Continuous vs. Batch
Precision vs. Accuracy
When precision exceeds accuracy, you’re setting yourself up for analytic failure (worked example below)
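A small worked illustration (numbers made up) of precision exceeding accuracy: readings that repeat almost exactly, yet sit far from the true value, so the extra decimal places imply a certainty the measurement does not have.

from statistics import mean, stdev

TRUE_VALUE = 100.0                              # what the quantity actually is
readings = [92.31, 92.28, 92.33, 92.30, 92.29]  # tightly clustered, but biased

spread = stdev(readings)                  # precision: how repeatable the readings are
bias = abs(mean(readings) - TRUE_VALUE)   # accuracy error: how far they sit from reality

print(f"spread (precision) = {spread:.3f}, bias (accuracy error) = {bias:.2f}")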
OODA Loop (Observe, Orient, Decide, Act)
Which of these does an analytic tool/technology do?
Before You Rip ‘n’ Replace:
What is the exit cost of this technology?
Does “turnkey” mean monoculture?
Business Payoff vs. Intellectual Appeal
• Social Network Analysis
• Operations Research
• Market Segmentation
• Competitive Intelligence
• Pilots to test new analyst tools with tiny amounts of generic data
• 360º Lead Scoring
• Validate Marketing Effectiveness
Questions?
• J.C. Herz jc@tripledex.com (202) 213-3151
Base Model vs. Fully Loaded
“Never attribute to stupidity what can be explained by laziness.”
Ugly Babies & Pretty Babies
