Gayle L. McDowell | Founder / CEO, CareerCup
gayle | in/gaylemcdgayle
The Architecture of Interviews
Consistency + Efficiency + High Bar + Happiness
June 7, 2016 | Talent42
Candidates are
frustrated & confused
They Don’t Know…
 How many interviews
 Who will be interviewing
 If they’ll code? How?
 What they need to know
 How the decision gets made
 WHY?
Lots of myths (and misinformation)!
Companies need
consistency & efficiency
high bar & happiness
Consistency & Efficiency
Consistency
 Outcome
 Process
 Questions
Efficiency
 Speedy process
 Able to expedite
 Minimal overhead
 Minimal false negatives
High Bar & Happiness
High Bar
 Minimize false positives
 Good, adaptable people
Happiness
 Enjoyable experience
 Makes company look good
 Transparency
The Process
Resume Selection
Intro Call w/ Recruiter
Email that outlines process
Code Assessment
Phone Interview
~4 onsite interviews
Discussion & Decision
“Sell” Call / Dinner
Stuff I’ll Discuss
 Bar Raisers vs. Hiring Committees
 Offline Work
 Homework vs. code assessment tools
 Question Style
 Knowledge, algorithms, pair programming
 Coding Platform
 Real code vs. pseudocode
 Whiteboard vs. computer
01. Bar Raisers or Hiring Committees
So different, yet so similar
Bar Raisers and HCs
Offer transparency
Offer consistency
Keep bar high
Facilitate change
Can override manager
Hiring Committee
Cons
 Overhead
 Delays
 Un-empowering
 Can feel “black box”
 Need good feedback
Pros
 Cross-company consistency
 Keeps bar high
 Easier to improve process
Who’s it good for?
Companies that:
 See 5 or more dev candidates per week
 Want to improve their process
 Hire for the company, not the team
 Are not very knowledge-focused
Easier to implement early!
Hiring Committee: Best Practices
Meet at least 2x per week
Multiple HCs:
 Beware of bar creep / inconsistencies
Let interviewers observe the HC
Train interviewers to write feedback
Quality of decisions rests on feedback
Bar Raisers
Cons
 Need consistency across the company
 Need to scale team
Pros
 Many of the HC benefits:
 Consistency
 High bar
 Transparency
 But easier to implement
 No bottleneck
Bar Raisers: Best Practices
 Select people who are inherently good
 Experienced at interviewing
 Nice, empathetic
 Smart & can challenge the candidate
 Train them thoroughly
 Empower them
 Assign outside of team
 Watch out for scale/exhaustion!
02. Offline Assessments
Homework, code assessment tools, etc.
Offline Assessments
 Homework Projects
 Code Assessment Tools
Homework Projects
Big
Very Practical
Some love this
Less cheating
 Except: algos
Too immediate
Needs eng time
Disproportionate workload
Scales poorly for candidate
Homework: Best Practices
Show candidate interest first
< 4 hours
 If >4, onsite project review
Architecture, not algorithms
Define review criteria
Avoid confusion with company work
Homework: Who It’s Good For
Language focused
 Low priority on algorithms / thought process
Experienced candidates (maybe)
Code Assessment Tools
Fast, cheap eval
 More candidates
 Non-traditional
Sets expectations for onsite
Consistent data point
Cheating
May turn off senior candidates
Implementation Options
Everyone
Just your “maybe” candidates
Fast-Track
Who It’s Good For
Small, mid-sized, and big companies
Value algorithms / problem solving
Lots of candidates
Want to look at non-traditional candidates
Code Assessment: Best Practices
Show candidate interest first
Beware of cheating
 (But no biggie!)
Clear expectations
Pick GREAT questions
 Similar to real interviews
 Unique questions
1 – 2 hour test
03. Question Style
Pair programming, algorithms, knowledge
What To Ask
Knowledge
Algorithms
Design/Architecture
Pair programming
Knowledge Questions
Good when you can’t train the skill easily
Best practice:
 In-depth, if at all
 Keep it a discussion
Algorithm Questions
Smart matters.
Good for everyone
Best practices:
 Clear expectations with interviewers & candidates
 Ask medium-to-hard & unusual questions (see the sketch below)
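The deck doesn’t name a specific question, so purely as an illustration of the rough difficulty being recommended (my example, not the speaker’s): given two unsorted integer arrays, find the pair, one element from each, with the smallest absolute difference. A minimal Python sketch:

```python
# Illustrative only -- the deck does not prescribe a particular question.
# "Medium" here means the candidate needs one real insight (sort + two
# pointers), not memorized trivia.

def smallest_difference(a: list[int], b: list[int]) -> tuple[int, int]:
    # Return the pair (x from a, y from b) minimizing |x - y|.
    if not a or not b:
        raise ValueError("both arrays must be non-empty")

    a, b = sorted(a), sorted(b)          # O(n log n + m log m)
    i = j = 0
    best = (a[0], b[0])

    while i < len(a) and j < len(b):
        if abs(a[i] - b[j]) < abs(best[0] - best[1]):
            best = (a[i], b[j])
        # Advance the pointer at the smaller value; moving the larger one
        # could only widen the gap.
        if a[i] < b[j]:
            i += 1
        else:
            j += 1
    return best


if __name__ == "__main__":
    print(smallest_difference([23, 5, 10, 18], [19, 3, 14]))  # (18, 19)
```

The specific problem matters less than the shape: hard enough that strong candidates still have to think, and unusual enough that they haven’t rehearsed it.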
Design/Architecture
Great for experienced candidates
Shows communication skills
Best practice:
 Prep candidates. Big unknown!
Pair Programming
 Many candidates enjoy it
 Feels fair & real world
 Assesses code style / structure
 Shows interpersonal interaction
 Less understood
 Not great for algos
 Interviewer really matters
 Biased by tools
Pair Programming: Best Practices
 Prep/warn candidates
 Need GREAT interviewer
 Give choice of problems
 Okay/good to pick unreasonably big problems
 Guide candidates
 (Okay to ask questions, not know the tools, etc.)
04. Coding Platform
Whiteboard vs. Computers
Why We Make Them Code
Can they put “thoughts” into “actions”?
Do they show good structure and style?
Do they think about the impact of decisions?
Why not pseudocode?
A Game with Secret Rules
… and this is for a simple problem
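The original slide annotates a real code snippet; as a stand-in, here is a minimal sketch (my own illustration, not the slide’s code) of how many small, unstated rules even a trivial problem, reversing the words in a sentence, forces a candidate to settle:

```python
# Illustrative only -- not the snippet from the original slide.
# Even "reverse the words in a sentence" hides several "secret rules"
# that pseudocode quietly skips over.

def reverse_words(sentence: str) -> str:
    # Rule: what counts as a word separator? (Here: any run of whitespace.)
    words = sentence.split()

    # Rule: what happens on empty or all-whitespace input?
    # (Here: return an empty string rather than raising.)
    if not words:
        return ""

    # Rule: is original spacing preserved? (Here: no -- output is normalized
    # to single spaces, which may or may not be what the interviewer meant.)
    return " ".join(reversed(words))


if __name__ == "__main__":
    print(reverse_words("the quick brown fox"))  # fox brown quick the
    print(repr(reverse_words("   ")))            # ''
```

Each comment marks a decision the candidate has to make explicitly once real code is required, which is exactly the argument of the next slide.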
Don’t Allow Pseudocode
Unpredictable playing field
Details matter
If “real code” is too hard for them…
How to Code
Big practical stuff
 Use a computer
 Pair programming
Small stuff (algorithm-focused)
 Computer or whiteboard
But how to code?
Whiteboard vs. computer
A Case for Computers
 Realistic. Allows tools.
 Candidates feel more comfortable
 (Especially experienced & diversity candidates)
 Faster to write (often)
 More code
The Downside of Computers
Often write stupid stuff
Desperate attempts to get the code to compile
Communication shuts down
Biased by tools/laptop
“Transition” between algorithm & code
Computer: Best Practices
Let candidate bring laptop
Instruct: not every detail needs to be right
Encourage communication and thinking
Recognize the bias!
A Case for Whiteboards
Encourages thinking & communication
More language-agnostic
Consistent across candidates
 A better laptop / better tools don’t matter
It’s “standard”
The Downside of Whiteboards
Slow to write
Artificial environment
Can be intimidating
Whiteboard: Best Practices
Encourage shorthand (see the sketch below)
Be upbeat & encouraging
Reasonable expectations
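To make “shorthand” concrete, here is a rough sketch (my illustration, not from the deck) of the kind of whiteboard-acceptable shortcut that is still real code, unlike pseudocode: the candidate leans on an assumed helper and fills it in only if asked.

```python
# Illustrative only -- an example of reasonable whiteboard shorthand.
# The logic is real code; the helper is assumed up front and written
# out only if the interviewer asks for it.

def most_common_word(text: str) -> str:
    counts: dict[str, int] = {}
    for w in split_words(text):              # shorthand: assume the helper exists
        counts[w] = counts.get(w, 0) + 1
    return max(counts, key=counts.get)       # "take the max by count"

# The assumed helper, written out here so the sketch actually runs:
def split_words(text: str) -> list[str]:
    return text.lower().split()


if __name__ == "__main__":
    print(most_common_word("the cat saw the dog"))  # the
```

On a whiteboard the interviewer would typically accept the first function alone, with the helper named but not written; that is shorthand rather than pseudocode because every line is still syntactically real.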
Recommendations
If skill-focused: then Computer
If algos-focused: then Whiteboard
If a little of each: then Either/or
 Both can work!
 … with proper training
 Why not let the candidate choose?
05. Last Remarks
Things to Consider
 Bar Raisers or Hiring Committees
 Code assessment tools
 Pair programming (for practical stuff)
 Whiteboard (or pick-your-poison) for algorithm stuff
There is no perfect system
THANK YOU
gayle@gayle.com
gayle | in/gaylemcdgayle
