Graphorisms applied v0.03

A graphic language for articulating the state and boundaries of complex systems at a high level. The goal is high-level abstraction so that falsely reductionist data and approximations are not allowed in. High-level metaphor is used to convey meaning, which can then be expanded on.

Published in: Business, Technology
Comments
  • Let me know if you would like access to the original graphorism files for download, etc.
  • This is probably horrible as a stand-alone presentation, but it explains complex processes using a high level of graphic abstraction.

    It is used to get people seeing systems, state and risk from a shared perspective. There is a lot of systems thinking going on here.

    Feel free to expand on it and adapt it to your domain of expertise. It is licensed under CC Attribution.

Transcript

  • 1.
  • 2. Systems thinking & graphorisms for capital allocation, Nick Gogerty
    Seek feedback on this prototype project
    Describe complex problems & environments as systems
    Graphorisms = graphic models & metaphors
    Allocate capital to stable, long-lived, higher-margin competitive systems
  • 3. Systems thinking: framing problems
    A system involves flows or a repeating process
    Change and the dynamics of change
  • 4. Process
    Cycles or repeats: growth clockwise, shrinkage counter-clockwise
    Abstract, with an open boundary
    Operates in an environment and interacts at its boundaries
    Purposely shown as abstract
    Stasis is death
    Has limits (input, flow or output)
  • 5. Narrative systems thinking
    Talking, showing and walking through some problems
    Talk through the paths
    Normal accidents: hidden system paths
  • 6. Shared perspective & people's goals
    Growth/decline (increasing or decreasing an externality of the system)
    Stability (maintaining the status quo; usually the focus of risk management is prevention of instability)
    Termination of the current system (wholesale change)
  • 7. Critical instability & system capacity
    Risk: tight coupling
    Grains-of-sand / avalanche model (log normal events) from linear input; a toy simulation follows below
    The straw on the camel's back is not important
    Why? Tension & capacity
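The avalanche point on slide 7 can be made concrete with a toy sandpile model. The sketch below is not from the deck; the grid size, threshold and grain count are arbitrary assumptions chosen so it runs quickly. It only illustrates the claim that a strictly linear input (one identical grain per step) produces wildly unequal events whose size is set by accumulated tension and capacity, not by the triggering grain.

```python
# A minimal sketch, assuming arbitrary parameters: a toy 2-D sandpile.
import random

N = 20            # grid side (assumed)
THRESHOLD = 4     # grains a cell can hold before it topples (assumed)
GRAINS = 30_000   # steady, strictly linear input: one grain per step

grid = [[0] * N for _ in range(N)]
avalanche_sizes = []

for _ in range(GRAINS):
    r, c = random.randrange(N), random.randrange(N)
    grid[r][c] += 1                       # the "straw": always the same size
    topples = 0
    stack = [(r, c)]
    while stack:
        i, j = stack.pop()
        if grid[i][j] < THRESHOLD:
            continue                      # tension below capacity: nothing happens
        grid[i][j] -= THRESHOLD           # the cell sheds its load
        topples += 1
        if grid[i][j] >= THRESHOLD:
            stack.append((i, j))          # still over capacity: topple again
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < N and 0 <= nj < N:
                grid[ni][nj] += 1         # neighbours absorb the load...
                if grid[ni][nj] >= THRESHOLD:
                    stack.append((ni, nj))  # ...and may topple in turn
            # grains pushed past the edge simply leave the system
    if topples:
        avalanche_sizes.append(topples)

sizes = sorted(avalanche_sizes)
print(len(sizes), "avalanches from", GRAINS, "identical grains")
print("median size:", sizes[len(sizes) // 2], " largest:", sizes[-1])
```

Running it typically yields a median avalanche of only a few topples alongside occasional avalanches hundreds of times larger, which is the sense in which the last straw itself is unimportant.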
  • 8. Butterfly wing flaps aren't important: system tension (sensitivity) is
    Causal loops attenuate; over-focusing on individual links is useless and counterproductive ("reductionist folly")
    Tension among components and processes is what matters ("specific events are irrelevant")
    Understand homeostasis: the boundaries of stability, and how failure thresholds shift
  • 9. Risk types
    Concentration (low diversity, low redundancy)
    Tight coupling / too efficient, over-optimized
    Over capacity / tension
    Failure by design: an over-optimized system = normal accident
    Model truth: all components fail; all systems end
  • 10. Graphorism
  • 11. System questions make a graphorism
    Required elements: relationship, state and boundary (one possible encoding is sketched below)
    What are the actors/resources and context?
    What is the system, process, output or input to be explained?
    What are the boundaries and capacities of the system (links, resources)?
    What is the state or dynamic to be shown?
    Identify tensions (safety/stability vs. efficiency)
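Slide 11's checklist can be read as a small schema. The sketch below is one possible encoding, not anything defined in the deck; every class and field name is invented for illustration. It captures actors, relationships (links), state and boundary/capacity, and makes "tension" computable as used capacity over available capacity, the ratio slide 16 refers to.

```python
# A minimal sketch, assuming one possible encoding of slide 11's checklist.
# None of these class or field names come from the deck; they are illustrative.
from dataclasses import dataclass, field
from enum import Enum

class State(Enum):                 # a subset of the states listed on slide 17
    GROWING = "growing"
    HOMEOSTATIC = "homeostatic"    # stable / governed / robust
    SHRINKING = "shrinking"
    FAILED = "failed"

@dataclass
class Actor:                       # subject/noun: process, person, entity or resource
    name: str
    lifecycle_limit: str = "unknown"   # all components fail; note when/how

@dataclass
class Link:                        # relationship (verb) between two actors
    source: str
    target: str
    coupling: str = "loose"        # "tight" coupling is itself a risk flag
    capacity: float = 1.0          # boundary: how much flow the link can carry
    load: float = 0.0              # current flow through the link

    @property
    def tension(self) -> float:
        """Used capacity over available capacity (slide 16's ratio)."""
        return self.load / self.capacity if self.capacity else float("inf")

@dataclass
class Graphorism:
    context: str                   # a graphorism requires context (slide 13)
    actors: list = field(default_factory=list)
    links: list = field(default_factory=list)
    state: State = State.HOMEOSTATIC

    def tensions(self):
        """Which relationships are closest to their capacity boundary?"""
        return {f"{l.source}->{l.target}": round(l.tension, 2) for l in self.links}

# Invented usage: a tightly coupled link running at 92% of capacity stands out.
g = Graphorism(
    context="credit flowing from lender to borrower",
    actors=[Actor("lender"), Actor("borrower")],
    links=[Link("lender", "borrower", coupling="tight", capacity=100, load=92)],
)
print(g.state.value, g.tensions())   # homeostatic {'lender->borrower': 0.92}
```

The point of encoding it at all is only that the checklist becomes checkable: a graphorism with no context, boundaries or tensions simply cannot be constructed.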
  • 12. Graphorisms: goals and uses
    Graphic metaphor + aphorism
    Not a complete state diagram
    A language for subject, object, verb and state representations
    Designed for cocktail napkins
    Graphical abstractions of how systems grow, change and end
    A fuzzy flow chart or operations diagram that is easy to learn and share
  • 13. Graphorism rules
    Not a literal representation: real outputs are sloppy and lumpy, not smooth
    Purposely abstract away details of the system ("reduce false complexity and completeness")
    Presented as open for debate
    A graphorism requires context
    Layer specific ("not holistic")
    Multiple graphorisms are used for components, relationships and conditional states
  • 14. Elements
    Process (growing, shrinking, dead, failing)
    Resource competition ("crowding/capacity")
    Sub-system components (linked/related)
    Scale variant (limited presentation) = more is different (network of networks)
    Process environment ("stable, failing to collapse, growing, tightly coupled, loosely coupled")
    Boundary (input, output, capacity for flow)
    Fuzzy noise and error bounds / the futility of detailed measurement
  • 15. Actors
    Subjects/nouns (process/person/entity)
    Lifecycle limit
    Resources (money, energy, carbon, air, time)
    Boundary / barrier or threshold
    Links / chains (often sub-processes themselves)
  • 16. Relationships of components (verbs)
    Throughput change
    Component and system tension ratio with an instability threshold
    Capacity risk
  • 17. States: system dynamics (adjectives/modifiers)
    Linked / dependent / clustered
    Symbiotic (stable) vs. parasitic (detrimental)
    Activity level: static (= dead), homeostatic (stable / governed / robust)
    Failing/failed (at the component or system level); point failure ≠ system failure
    Growing/shrinking (increasing/decreasing)
    Tension/capacity/limit (used/available)
    Coupled component dependency (tight/loose)
  • 18. Rules & universal system questions
    All systems end. When?
    All components fail. What happens then?
    All systems are linked. Which links count?
    All things change. What is the next adjacency?
    Everything is finite. What are the capacities and tensions?
  • 19. Sources of risk / system failure
    Capacity / tension / brittleness
    Supply / externality
    Hidden path (normal accident)
    Age / component failure
  • 20. Tragedy of the commons
  • 21. Gresham's law (cluster evolution risk) in banking, insurance & CDS
  • 22. Gresham's process
    Designed / behaviorally prone to failure and instability
    Needs regulation or control
    Tragedy of the commons due to feedback delay (a toy simulation follows below)
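The feedback-delay mechanism on slide 22 can be sketched as a toy loop. Nothing below comes from the deck: the share drift, loss rate and delay are invented numbers. It only illustrates the claimed dynamic in banking or insurance terms: when the cost of bad underwriting surfaces years after the business is won on price, the under-pricing process out-competes the prudent one for exactly as long as the feedback is delayed.

```python
# A minimal sketch with assumed numbers: Gresham-style dynamics under delayed feedback.
DELAY = 5            # years before true losses become visible (assumed)
YEARS = 15

cheap_share = 0.5                      # market share of the under-pricing writer
pending_losses = [0.0] * DELAY         # losses written now, recognised DELAY years later

for year in range(YEARS):
    # Buyers see only price, so share drifts toward the under-pricing writer (assumed drift).
    cheap_share = min(0.95, cheap_share + 0.05)

    # The cheap writer under-reserves: losses accrue now but surface later.
    pending_losses.append(cheap_share * 0.30)     # assumed loss rate on cheap business
    realised = pending_losses.pop(0)              # losses from DELAY years ago arrive

    print(f"year {year:2d}  cheap-writer share {cheap_share:.2f}  realised losses {realised:.2f}")
```

The output shows share migrating to the under-pricing writer while realised losses read zero for the first DELAY years, after which the queued losses arrive all at once: a tragedy of the commons created purely by the delay.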
  • 23. Wicked problems
  • 24. Graphorisms may help communicate "wicked problems"
    No unique "correct" view of the problem
    Many possible intervention points
    Often a-logical, illogical or multi-valued
    Different views of the problem and its solutions are contradictory
    The problem solver is out of contact with the problems and solutions
    Considerable uncertainty / ambiguity
    Problems are interconnected with other problems
    Data are often uncertain or missing
  • 25. Capital Allocation
  • 26. Systems thinking applied to capital allocation to make money
    Can you identify the competitive cluster & its lifecycle stability?
    What is the source of value / pricing power?
    What are the relative growth factors?
    How are hidden paths minimized?
  • 27. Not value or growth investing
    Systems-thinking investing: "thoughtful capital allocation"
    The process driving margin & capital returns
    The process driving sustainability
    Capacity constraints & tension
  • 28. First rule: don't lose capital/value. First rule: survive!
    Safety always trades off against efficiency or speed
    Goal: compounding growth (system time); a worked illustration follows below
    Risk = loss of a value-creating system in the portfolio (not volatility)
    Lumpy (true, natural), not falsely smooth
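Slide 28's "don't lose capital / compounding" rule is at bottom a statement about geometric versus arithmetic returns, which a few lines of arithmetic make visible. The return figures below are invented for illustration, not taken from the deck.

```python
# A minimal sketch with assumed numbers: same average return, different compounding.
steady = [0.08] * 10                  # +8% every year
lumpy = [0.40, -0.24] * 5             # also averages +8%/year, but lumpy

def compound(returns, start=1.0):
    """Grow one unit of capital through a sequence of annual returns."""
    value = start
    for r in returns:
        value *= 1 + r
    return value

print("steady path:", round(compound(steady), 3))   # ~2.159
print("lumpy path :", round(compound(lumpy), 3))    # ~1.364 -- same average, less wealth
print("a 50% loss needs a", round((1 / 0.5 - 1) * 100), "% gain just to break even")
```

The two paths have the same average annual return; only the one that avoids the large loss keeps the compounding, which is why loss of the value-creating system, not volatility per se, is treated as the risk.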
  • 29. Price as abstraction / opinion
    A price is two opposing views about value expressed at a point in time
    99% can't beat buy-and-hold (they don't understand the value-creation process)
    99% of price generation is useless information about the value-creating system
  • 30. 1. Think in competitive clusters
    The customer choice set
    Seek stable, low innovation = niche survival
    Inverse Gresham's law (good processes crowding in)
    Winner take most
    Capacity boundaries?
    Niche capacity relative to what?
  • 31. Unit of moat = innovation archetypes (the 10 Doblin types)
    Business model innovation
    Networking innovation
    Enabling process
    Core process
    Product performance
    Product system
    Service
    Channel
    Brand
    Customer experience
  • 32. Innovation as evolution
  • 33. Three blind men view the elephant (the moat)
    Customers know what they like but may not be able to articulate the rationale
    Competitors know what they can't do
    Neither may know the how, what or why of the moat
  • 34. Moat types & metrics (depth/duration); a rough valuation sketch follows below
    Belief/brand (Coke, Wrigley, P&G)
    Learned behavior (MS Office, Geico): switching cost
    Geography (Wal-Mart, BNSF): displacement cost
    Social signalling (Rolex): belief barrier
    Network and scale effects (BNSF, iTunes): switching cost
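One hedged way to give slide 34's "depth/duration" metrics numbers is to read depth as the excess return the moat defends and duration as the years before competition closes the gap. This is my interpretation, not the deck's; the function, the discount rate and every input below are assumptions.

```python
# A minimal sketch, assuming depth = excess return and duration = years it persists.
def moat_value(invested_capital, excess_return, duration_years, discount_rate=0.08):
    """Present value of the excess returns the moat is expected to defend."""
    annual_excess = invested_capital * excess_return
    return sum(annual_excess / (1 + discount_rate) ** t
               for t in range(1, duration_years + 1))

# Deep but short-lived vs. shallow but durable (illustrative numbers only).
print(round(moat_value(100, 0.15, 5)))    # ~60
print(round(moat_value(100, 0.08, 25)))   # ~85: duration can matter more than depth
```

On these made-up numbers the shallow, long-lived moat is worth more than the deep, short-lived one, which is why the slide pairs depth with duration rather than measuring either alone.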
  • 35. The sloppy process of system life
    Details are less important than states
    "Moat" = margin homeostasis
    Understand that tension among components is risk
    Carrying capacity: everything is a niche
  • 36. Risk
    Inflection points, tight coupling, "tension"
    Events are less important (system time counts)
    Capacity vs. demand?
    Source of moat / cluster shape
  • 37. Key questions
    All things end. When?
    What is the relative value-creating process for survival?
    How stable are the shape of the cluster and its evolution in the niche?
    What are the boundaries and carrying capacities?
  • 38. Buffett example: Lubrizol, $9B
    A complex entity (a system of system moats), a portfolio of clusters
    Multiple dominant clusters; the moat shows up in long-term ROC (critical pieces of bigger systems, low price sensitivity)
    Stable 40% dominance in some markets
    Distribution, technical, brand and price-leadership moats
    Visible in high historical ROEs and a stable cluster (outside the commodity box)
    Management knows how to allocate capital to earn cluster returns
