I've been thinking a lot recently about how to help a digital delivery team be more Lean, by highlighting to them (in a simple way) just what they're spending their time on. I've come up with a model for visualising time spent on features, on infrastructure, on bugs, and on the other stuff (e.g. overheads, or the cost of doing business).
The question I pose, and try to answer, in this slide deck is: "How do I decide what to do next?"
* I'd be interested to hear how you decide what to do next.
* What tools and techniques do you use in this area?
* How might this model help you focus on generating value?
1. "… the process of managing software that is built and implemented as a product … to generate biggest possible value to the business …"
http://en.wikipedia.org/wiki/Software_product_management
5. problems with this simple model
The simple split is features (+ve, visible, stakeholder ♥) versus bugs. But:
* features are not always feasible
* where do dependencies, spikes and tech debt fit?
* users don't like what we built
* stakeholders change their mind
* the market context changes
7. a new model
A two-by-two grid: visible vs invisible work, +ve vs -ve value.
* features (visible, +ve): remember, this is what our stakeholders love; lots of this
* bugs (visible, -ve): errors in implementation
* infrastructure (invisible, +ve): anything which you need to do to enable a feature to be delivered to an appropriate level of quality
* £ of doing business (invisible, -ve): tech debt, spikes, tech design, user testing, meetings, training, anything not in the other boxes ...
8. where to spend our time
On the same grid:
* features: mostly here please
* bugs: yes here too, but there's a limit before stakeholders fall out of love
* infrastructure: bare minimum
* £ of doing business: also, bare minimum
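To make "where we spend our time" visible to the team, the four quadrants can be turned into a simple report. Here is a minimal sketch in Python; the ticket list, field names and hours are made up for illustration, and only the quadrant names come from the model:

```python
from collections import defaultdict

# Each piece of work is tagged with one quadrant of the model:
# features / bugs are visible; infrastructure / cost-of-doing-business are invisible.
tickets = [
    {"title": "search filters",     "quadrant": "features",       "hours": 21},
    {"title": "checkout 500 error", "quadrant": "bugs",           "hours": 5},
    {"title": "CI pipeline",        "quadrant": "infrastructure", "hours": 13},
    {"title": "sprint ceremonies",  "quadrant": "cost",           "hours": 9},
]

def time_by_quadrant(tickets):
    """Sum hours per quadrant, so the team can see where time really goes."""
    totals = defaultdict(int)
    for ticket in tickets:
        totals[ticket["quadrant"]] += ticket["hours"]
    return dict(totals)

totals = time_by_quadrant(tickets)
for quadrant, hours in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{quadrant:15} {hours:3}h")
```

Run weekly over the team's real tracker data, a tally like this makes it obvious when "bare minimum" quadrants are quietly eating most of the week.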
11. how to plan for?
On the same grid, bugs are triaged immediately. For features and infrastructure:
1. build an MVP to prove the requirement and test any assumptions
2. then deliver the bare minimum of enablers
3. then just enough to maximise the value before pivoting
14. which things should we watch out for?
* features which don't have a real user in the story are invisible, and therefore most likely infrastructure
* 'requirements' = perceived user needs; test all assumptions
* large batches, e.g. 3 months between UX designs completed and feature build
* bug reports which are really new requirements
* bugs you don't need to fix (not "must-be quality" - see the Kano model)
* spikes which aren't timeboxed
* meetings you don't need
* doing more than the bare minimum