While there are many methods and much practical advice on how to deliver projects more efficiently, there is still little guidance on exactly how we ensure we’re focused on the most valuable projects. Yes, we need to define objectives and measures, break the larger project into smaller, more manageable chunks, deliver value iteratively, and ensure continuous improvement and change management. But if we do all of that, are we successful as project teams? Are we maximizing our value to our respective organizations? What happens when our stakeholders don’t know what they value? In essence, our project teams are typically only as effective at delivering value as the projects we work on. And therein lies the problem.
Value delivery is more than just delivering on budget, on spec, and on time. To take it even further, it is more than delivering the value stated in the project’s objectives. So it isn’t the project failure rate that is the important metric; we should be focused on the overall organizational success rate. And to improve the overall organizational success rate, we must make good decisions about how to define and prioritize value. This presentation focuses on how we make strategic decisions that drive organizational value, further increasing the value we deliver through effective project delivery.
As times have changed, we have witnessed an increase in innovation and technology across all industries. Yet there remains a constant in all of this change—us. This section of the presentation provides a high-level overview of this important constant.
Image Source: “Financial Crisis Montage” by D.F. Shapinsky http://picasaweb.google.com/lh/photo/WmTwW-vqSUvArQo1TW_ITg
Much energy is being spent debating the merits of regulation; the focus is certainly on controlling the situation. New financial reforms in the U.S., like:
- Increased authority for regulators to monitor everything from mortgages to complex derivative securities
- Forcing financial firms to reduce debt and hold more capital in reserve
- Increased transparency into derivative trading through open marketplaces (vs. private transactions)
- Creating a new consumer protection agency to curb abusive lending
…are all good ideas. Yet governance alone has proven not to be enough. This is not a talk on the merits of financial reform. Rather, we will delve into why I believe there is a deeper problem to address, one that has been with us throughout our history.
As far back as the 1600s, our biases have challenged our definition of, and behaviors toward, value. MacKay’s account of tulip mania is a good example of how irrational we can be when it comes to making decisions in uncertain times. In the painting, we see monkeys instead of humans in a marketplace, trading tulips. The analogy is closer than you may think; we’ll explore it further throughout this presentation.
Image Source: "A Satire of Tulip Mania" by Jan Brueghel the Younger, ca. 1640, http://researchguides.austincc.edu/economics
Content Source: “Memoirs of Extraordinary Popular Delusions and the Madness of Crowds”, Charles MacKay (1841)
While the esteemed cartoon character, Pogo, was referring to humans and our impact on the environment, the quote also applies to our decision making skills. WE are the enemy…not technology, not innovation…it is US. Studies since the 1970s have shown that we are biased decision makers, meaning we make systematic errors in judgment in the face of complexity and uncertainty.
Source: http://en.wikipedia.org/wiki/Pogo_(comics)
Much of the credit for our understanding of these behavioral flaws, also known as cognitive biases, stems from the seminal research begun by Amos Tversky and Daniel Kahneman in the 1970s. Tversky and Kahneman sought to understand why human beings are not endowed with rational probabilistic thinking and optimal behavior under uncertainty. Their collaboration over a 30-year period revolutionized the scientific approach to decision making, ultimately affecting all social sciences and many related disciplines. Their research does much to debunk the basis of academic economic theory, which is rooted in the assumption (among others) that humans are rational beings. Our irrationality in decision making is an important factor in value delivery. Only by understanding our biases can we mitigate them to make more robust decisions about how we define and implement value in our organizations. Kahneman was awarded the Nobel Prize in economics in 2002 for his contributions to the field of behavioral economics.
We all operate under a set of heuristics, or shortcuts, that reduce the complex tasks of assessing probabilities and predicting values. We will show how these heuristics can be useful, and then highlight cases where they lead to severe and systematic errors in judgment that hinder our ability to realize the benefits of maximizing value delivery at the project level. Again, remember we are trying to maximize value at the organizational or corporate level.
The intuitive system draws from our past experiences and enables us to make rapid calls “straight from the gut” (a reference to Jack Welch). Think of this as your internal rulebook that serves as a shortcut to save you time and effort. Jack Welch is an admirable leader. Yet we don’t admire Mr. Welch, or other gut decision makers, for the quality of their decisions so much as for their courage to make them. Gut decisions testify to the confidence of the decision maker, no doubt an invaluable trait in a leader. We’ll see later how this same confidence, in times of uncertainty, may result in sub-optimal decisions. A 2002 survey (Miller and Ireland, 2005) revealed that 45% of executives said they rely more on instinct than on facts and figures in running their businesses.
The intuitive system certainly has its place. Understanding the limitations associated with the biases inherent in the intuitive system is also important.
Image Source: http://upload.wikimedia.org/wikipedia/commons/5/50/Getty_kouros.jpg
Since executives make hundreds of decisions daily, the time-consuming demands of rational decision making are often not viable. Can you imagine an executive taking the time to assess all data and options before making every decision? That is not only impractical but also unnecessary. However, it is the rapid-fire decisions arising from our intuitive system that are the basis for cycles of corruption. These rapidly arrived-at decisions are less than consciously considered and made without full awareness of the extent to which a small wrongdoing can escalate. And because such decision making is based largely in the unconscious, we tend not to be aware of its flaws.
Source: “The Cognitive and Social Psychology of Contagious Organizational Corruption”, Brooklyn Law Review (2004), Darley, J.M.
While the implications of our cognitive biases have gained traction in the world of investment finance, little practical advice has resulted on how we can improve corporate decision making, and thus corporate value. We will apply the academic research and empirical lessons learned about our cognitive biases from behavioral finance to corporate decision making, from the highest levels down to project-level decisions and execution. We will:
- Introduce three cognitive biases that inhibit corporate decision making
- Propose a solution and actionable steps to begin making more holistic, unbiased decisions, from the corporate level down to the project level
In general, people tend to overestimate their ability to perform well. This, in turn, leads to impulsive decisions, as managers think they know more than they really do or are overly confident in their own abilities. Therefore, they seek less help and direction in making major decisions.
Our self-confidence makes it difficult to assess the true odds of a decision and the inherent variability that is part of uncertain situations. New product introductions, M&A, and “bet the farm” decisions are accompanied by extremely volatile variables. Yet we justify these decisions through detailed business cases that forecast costs and benefits five years out, down to the penny…right. We’re that good? An executive who projects great confidence in a plan is more likely to get it approved than one who lays out all the risks and uncertainties surrounding it. We seldom see confidence as a warning sign…a hint that the bias is at work in many of us. What makes a bias such as overconfidence so dangerous for decision making is its invisibility. We are apt to believe in the quality of our decision making, particularly where those processes have led us to success in the past.
The overconfidence bias has been identified as one of the main reasons for the failure of the Denver airport baggage system project. Although the project was technically very ambitious, the contractor (BAE Systems) assumed that all technical obstacles could be overcome within the project timeframes. Key areas of failure:
1. Underestimation of complexity
2. Firm commitments in the face of massive risks and uncertainty
3. Acceptance of change requests with significant system design implications, which further added to the impacts of #1 and #2
Image Source: Keeshond Club of America http://swankeekeeshonden.com/National/Air.html
Content Sources: “The role of cognitive biases in project failure”, Kailash Awati http://eight2late.wordpress.com/2009/05/08/the-role-of-cognitive-biases-in-project-failure/; “Denver Airport Saw the Future. It Didn’t Work”, Kirk Johnson, New York Times http://www.nytimes.com/2005/08/27/national/27denver.html?ex=1282795200&en=55c1a4d8ddb7988a&ei=5088&partner=rssnyt&emc=rss
Schlitz’s leadership believed that consumers couldn’t tell beers apart, cited market research as evidence, and decided to use a cheaper brewing process. Although the company received constant evidence, in the form of lower sales, that customers did not like the beer brewed with the new process, it stuck with the new strategy, which led to disaster. Schlitz, once the third largest brewer in the US, went into decline and was bought by its rival Stroh in 1982.
The availability bias can substantially and unconsciously influence decisions, since individuals readily assume that their beliefs and assumptions are true. Because of the availability heuristic, perceptions of risk may be erroneous, with potentially disastrous impacts.
Availability leads to an overestimation of the probabilities of frequent or recent events and an underestimation of those of rare or distant events.
I was working on a project with the FAA on September 11, and I saw firsthand all the additional security measures that were put in place at airports immediately following the terrorist attack. In light of the very recent (at the time) and dramatic events of September 11, travellers considered the probability of danger when travelling by air to be much greater than when travelling by car. It was far easier to imagine something bad happening when travelling by air. This was the availability heuristic at work. The truth of the matter was that, firstly, air travel had never been safer than in the months following September 11, on account of the massively increased security. And secondly, with far more people hitting the roads, there were inevitably many more fatal accidents. Upon examination of the statistics, it was far more dangerous to drive than to fly (US road fatalities in October–December 2001 were well above average), and yet the availability bias made many feel that driving was the smarter choice.
Image Source: “A Closer Look at the Global Financial Crisis” GOOD Magazine Liam Johnstone http://awesome.good.is/transparency/usersubmissions/financialcrisis/johnstone/johnstone.jpg
Our biases are pervasive and resistant to feedback. Attempts to counteract our intuition, and thus our cognitive biases, must be holistic and structural in nature. Added governance and regulation, while helpful, are inadequate for addressing the root cause. Our four steps:
1. Identify “at risk” decisions and processes
2. Consider the improbable through an explicit exploration of major uncertainties
3. Consider the unpopular and broader view to avoid confirmation and availability bias
4. Build quality into a structured decision making process
Not all decisions are equally susceptible to our biases, and not all decisions warrant the additional rigor required to mitigate them. Two types of decisions require special attention:
1. Rare, one-of-a-kind strategic decisions
2. Repetitive, but high stakes, decisions that shape a company’s strategy over time
It’s important to take an inventory of your different decisions and highlight those that fit in these buckets for special attention.
These rare, one-of-a-kind decisions are typically made by a small subgroup of executives. The process is ad hoc and informal. Because of the size and importance of these decisions, it is normal to have armies of analysts and consultants perform detailed due diligence, including market research, customer research, financial forecasting, etc. Yet, with all of this effort and focus on making the right decision, we often do not take into consideration our inherent biases. We will depend on information that:
- Is static in nature – does not take into consideration (or underweights the probability of) variable outcomes, due to the false accuracy presented in a detailed spreadsheet
- Overly weights more recent data (availability bias)
Ultimately, we will still make decisions based on our intuitive decision process, or our gut, because we tend to be overly confident in our own abilities (illusion of control), even if the data does not support our decisions.
These important repetitive decisions, while typically formal, are still affected by our biases. Too often, formal governance or investment committees are formed, but do not have the guard rails necessary to avoid the pitfalls of our biases. How often is it the loudest guy/gal in the room who gets his/her project pushed through? Regardless of portfolio prioritization frameworks and processes, it is the squeaky wheel who gets the budget.
We will highlight two ways to explore uncertainties:
- Sensitivity analysis
- Monte Carlo Simulation
The example here comes from an engagement with a Fortune 1000 insurance company that was considering different initiatives to increase premiums sold. Rather than focusing on a static set of assumptions, the sensitivity analysis was able to show that minor improvements in underwriter efficiency would result in meaningful gains in premiums. The sensitivity analysis allowed the decision maker to fine-tune assumptions to get to a reasonable estimate of potential gains, eventually leading to a portfolio of operations and technology initiatives to maximize underwriter efficiency. Assessing ranges for the decision’s inputs to represent uncertainty, instead of precise point values, clearly has advantages. When you allow yourself to use ranges and probabilities, you move away from making concrete assumptions in areas you don’t know for a fact. The limitations of sensitivity analysis are mainly the time it takes to develop the scenarios and the involvement of the right people to create the appropriate assumptions.
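A one-way sensitivity check of this kind can be sketched in a few lines of Python. This is a minimal illustration, not the engagement’s actual model: the two drivers and baseline figures echo the example slide, while the 10% swing and the linear premiums model are assumptions for demonstration.

```python
# A minimal one-way sensitivity-analysis sketch. The baseline figures
# (underwriting hours, premium per hour) come from the example slide;
# the +/-10% swing and the premiums_sold() model are illustrative.

BASELINE = {
    "underwriting_hours": 64_144,   # total underwriting hours per year
    "premium_per_hour": 3_813.0,    # average premium ($) sold per hour
}

def premiums_sold(underwriting_hours, premium_per_hour):
    """Objective: total premiums sold per year."""
    return underwriting_hours * premium_per_hour

def one_way_sensitivity(model, baseline, swing=0.10):
    """Vary each driver by +/-swing while holding the others at baseline."""
    base_value = model(**baseline)
    results = {}
    for driver in baseline:
        low = dict(baseline, **{driver: baseline[driver] * (1 - swing)})
        high = dict(baseline, **{driver: baseline[driver] * (1 + swing)})
        results[driver] = (model(**low) - base_value,
                           model(**high) - base_value)
    return base_value, results

if __name__ == "__main__":
    base, impacts = one_way_sensitivity(premiums_sold, BASELINE)
    print(f"Baseline premiums: ${base:,.0f}")
    for driver, (down, up) in impacts.items():
        print(f"{driver:>20}: -10% -> {down:+,.0f}   +10% -> {up:+,.0f}")
```

Ranking the drivers by the size of their swings (a "tornado" view) shows the decision maker which assumption deserves the most scrutiny before committing.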
Instead of the static “what if” analysis of sensitivity analysis, MCS creates alternative scenarios based on ranges for all variables to assess the impact on the desired objective. For each scenario, a specific value is randomly generated for each of the unknown variables. These specific values then go into a formula to compute an output for that single scenario. This process can go on for thousands of scenarios. MCS is more complex and requires greater effort. It is a recommended tool for large initiatives with uncertain outcomes. In this example, we worked with a marketing company to assess the potential benefits of implementing a business process management system. Rather than focusing solely on the best case or likely case, it was important to help the executive stakeholders understand the potential risk of loss. The big potential positive outcomes were balanced with the potential risk, so that the executives could make an educated decision.
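A minimal Monte Carlo sketch along these lines follows. The triangular distributions and their parameters are purely illustrative assumptions (they are not the marketing company’s figures); a real model would elicit low/likely/high estimates for each driver from stakeholders.

```python
# A minimal Monte Carlo Simulation sketch. The triangular ranges below
# are illustrative assumptions, not figures from the engagement.
import random
import statistics

random.seed(42)  # reproducible runs

N_SCENARIOS = 10_000

def simulate_net_benefit():
    """One scenario: draw each uncertain driver, compute the outcome ($M)."""
    benefit = random.triangular(2.0, 40.0, 12.0)   # (low, high, mode)
    cost = random.triangular(3.0, 12.0, 6.0)
    return benefit - cost

outcomes = [simulate_net_benefit() for _ in range(N_SCENARIOS)]

p_loss = sum(1 for x in outcomes if x < 0) / N_SCENARIOS
print(f"Median net benefit: ${statistics.median(outcomes):,.1f}M")
print(f"Probability of loss: {p_loss:.0%}")
print(f"Best 5% of scenarios exceed: ${sorted(outcomes)[int(0.95 * N_SCENARIOS)]:,.1f}M")
```

The point of the exercise is the full distribution, not a single number: the same run yields the likely case, the upside, and the probability of loss that the slide highlights.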
Allow participation in discussion by skill and experience rather than by rank, through prediction markets and other forms of crowdsourcing.
Surowiecki acknowledges that crowds have their limitations too, but highlights four elements that are required for a wise crowd:
- Prior knowledge – group members should have some prior knowledge of the subject; random guessing does not contribute to the collective wisdom of the group
- Diverse opinion – offer the opportunity for input to all functions and levels to get a broader perspective
- Independence – members of the group must decide independently and not be skewed by their peers
- Decentralization – any organizational reporting relationships must be removed and freedom of thought encouraged
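The independence element above is easy to demonstrate with a toy simulation: when estimates are independent and individually noisy, their average lands far closer to the truth than most members get on their own. The jar-of-jelly-beans setup, true value, and noise level here are illustrative assumptions.

```python
# A toy illustration of the wisdom-of-crowds effect: many independent,
# noisy, unbiased estimates averaged together beat most individuals.
import random

random.seed(7)

TRUE_VALUE = 1_000   # e.g. jelly beans in a jar (illustrative)
CROWD_SIZE = 500

# Independent estimates: each person is unbiased but individually noisy.
estimates = [random.gauss(TRUE_VALUE, 300) for _ in range(CROWD_SIZE)]
crowd_estimate = sum(estimates) / CROWD_SIZE

crowd_error = abs(crowd_estimate - TRUE_VALUE)
beaten = sum(1 for e in estimates if abs(e - TRUE_VALUE) > crowd_error)
print(f"Crowd average: {crowd_estimate:,.0f} (error {crowd_error:,.0f})")
print(f"The crowd beats {beaten / CROWD_SIZE:.0%} of its individual members")
```

If the estimates are correlated instead of independent (everyone anchors on the loudest voice in the room), the averaging benefit largely disappears, which is exactly why Surowiecki's independence condition matters.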
Best Buy created an internal prediction market available to all employees in 2006. The goals of the marketplace were to provide senior leaders with leading indicators for the most critical initiatives in the company and to engage employees in an efficient manner. Best Buy learned that the prediction markets improved their forecasting ability, especially for decisions with extreme uncertainty like new product and service introductions.
Source: Consensus Point http://www.consensuspoint.com/images/BestBuy-CaseStudy-Approved.pdf
Photo Source: http://www.puppetgov.com/2009/10/13/dont-wear-a-blue-shirt-to-best-buy/
The path to improvement is truly a journey. That journey starts with acknowledging that we are indeed susceptible to consistent and systematic errors in judgment. The last step is the key to ensuring sustainable improvements in decision making—creating a foundational process that builds quality of decisions into the fabric of the organization. We do this through:
- Pre-mortems
- Stage gating
- Devil’s advocate
The concept of the pre-mortem comes from Gary Klein, a research psychologist. We recently incorporated a pre-mortem into a major organizational change initiative, and it raised some concerns (transient leadership and ownership) that typically would not have been raised so early in the project’s life. In doing so, we were able to create mitigation plans and work with the organization’s leadership to build stability as a key building block of the project’s foundation. Although many project teams engage in pre-project risk assessments, the pre-mortem offers some unique benefits. Many risk assessment techniques identify potential problems early on, but the collaborative nature and prospective-hindsight approach of the pre-mortem serve to draw out team members’ perspectives and thoughts that might otherwise go unsaid. In describing weaknesses that no one else has mentioned, team members feel valued for their intelligence and experience, and others learn from them. The process does not have to involve an extensive discussion. The value comes from the contrarian thinking that it drives.
One of the biases not discussed earlier, but a reality in large program investments, is the inability to end projects because of the sunk costs incurred: “We’ve already spent $5M on this project. How can we end it now?” Stage gating is an ideal framework for organizations leveraging an Agile, iterative approach to project implementations. The aim of stage gating is to build the quality of decisions in the immediate term, with the understanding that the longer term outcome is uncertain. Better intermediate-term decisions can be made at critical milestones, when there is newer and more accurate information. Typical actions at stage gating points are to rehabilitate projects that have slipped, kill projects that are beyond saving, and reallocate funding to attractive unfunded projects. Ultimately, this creates greater alignment of project objectives and execution with broader organizational objectives and value realization.
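The sunk cost logic can be made concrete with a tiny go/no-go rule: at each gate, compare only the remaining cost against the expected remaining benefit; what has already been spent is deliberately ignored. The dollar figures below are illustrative.

```python
# A minimal sketch of a stage-gate go/no-go rule that deliberately
# ignores sunk cost: only the money still to be spent and the value
# still expected matter. The figures are illustrative assumptions.

def stage_gate_decision(spent_so_far, remaining_cost, expected_remaining_benefit):
    """GO only if finishing the project is worth it from here forward.

    spent_so_far is accepted as an argument purely to show that a sound
    rule never reads it: sunk cost must not influence the decision.
    """
    return "GO" if expected_remaining_benefit > remaining_cost else "STOP"

# "We've already spent $5M -- how can we end it now?"
# If finishing costs $3M more but is now expected to return only $2M,
# the $5M is gone either way; spending the $3M just loses another $1M.
decision = stage_gate_decision(
    spent_so_far=5.0,               # $M, sunk
    remaining_cost=3.0,             # $M, still to spend
    expected_remaining_benefit=2.0, # $M, value if completed
)
print(decision)  # -> STOP
```

The discipline the rule encodes, re-deciding on forward-looking numbers at every gate, is what lets stage gating kill projects that are beyond saving and free the funding for better ones.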
The team starts with a contrarian hypothesis at the onset of a decision and proceeds with identifying flaws in the research and thesis as the strategy clarifies. The devil’s advocate role is an uncomfortable one. It requires management’s commitment to creating an environment of healthy challenge and debate, not only at the beginning of a project, but throughout its lifecycle.
Value Before Delivery
Applying lessons learned from cognitive psychology to increase organizational value
Value Before Delivery
The next, and bigger, step in value delivery begins with improving the way we make strategic decisions at the organizational level
THESIS STATEMENT
Are we working on the right projects?
[Diagram: External Influences & Drivers → Mission & Vision → Strategy → Goals → Objectives → Performance Indicators → Capabilities → Requirements; the upper levels are decisions that define organizational value, the lower levels are typical value delivery at the project level]
As we improve the way we deliver on projects, we continue to get better at ensuring that we work on the right features and capabilities. Yet our value delivered to the organization is still capped by the types of projects we work on.
A Constant Among the Change
BACKGROUND
The global credit and financial crisis starting in 2007 is one of the worst ever. There are signs that we are on the road to economic recovery. Along with the recovery are many initiatives to reform our financial system.
Delving deeper into the root causes of the major financial crises throughout our history, we see a consistent theme–our biases. “A Satire of Tulip Mania” recounts the tulip mania that swept Holland in the 1630s. Single tulip bulbs sold for 10x the annual income of a skilled craftsman.
“We have met the enemy and he is us.”
- Pogo
We are biased decision makers.
Amos Tversky
Daniel Kahneman
Cognitive research has demonstrated that there are at least two systems of reasoning or decision making: 1) the intuitive and 2) the rational. It is the intuitive system that serves as the basis for our cognitive biases.
Am I real?
Malcolm Gladwell calls the intuitive system “rapid cognition” or “thin slicing.” In Blink, an expert recognizes a statue as a fake at first glance, or within a “blink,” after an initial group of experts exhaustively studied and analyzed the statue and confirmed its authenticity.
Rapid fire decisions are the basis for cycles of corruption.
Top Three Biases In Corporate Decision Making
OUR BIASES
…which leads to an underestimation of odds and uncertainty in complex decisions.
“It was hubris”
- Richard de Neufville
Source: “Denver Airport Saw the Future. It Didn’t Work”, Kirk Johnson, NYT
With high expectations to be the most advanced baggage handling system in the world, the system at Denver Int’l was finished 16 months late, riddled with problems. After almost a decade of struggling to fix the problems, the project was abandoned…billions of dollars over budget.
Confirmation bias is our tendency to seek and interpret information that confirms our preconceptions...
Need to change + Strong beliefs + Confirmation bias = Disaster
Driven by the need to lower costs and meet increased demand, Schlitz started a cheaper brewing process in the 1970s and cited research that consumers couldn’t tell beers apart. Although customers voted through lower sales, Schlitz continued with the flawed strategy and went into a severe decline.
Availability bias is our tendency to judge an event as likely or frequent if it is easy to imagine or recall…
…thus narrowing our vision and discounting events outside of our immediate memory
Photo by -wink- on Flickr
Immediately after the September 11 terrorist attacks, travellers decided that travelling to their destination by car was far safer than travelling by air. The reality was that air travel had never been safer, with all the increased security. Our viewpoint was certainly narrowed.
“A Closer Look at the Global Financial Crisis”, GOOD Magazine, Liam Johnstone
Unsound business practices were taken to the extreme chasing the short-sighted “success” of competition…leading to incorrect incentives…leading to unsustainable irrational behavior…leading to the current financial crisis.
Four Steps to More Robust Decisions
PROPOSED SOLUTIONS
1. Identify “at risk” decisions and processes.
Rare, one of a kind strategic decisions–major M&A, crucial technology choices, and “bet the company” investments. These decisions are characterized as extremely uncertain and highly complex. Both characteristics increase the opportunity for our biases to lead us astray.
Beware of Biases!
Repetitive, but high stakes decisions that shape a company’s strategy over time–R&D allocations in a pharmaceutical company, capital expenditure decisions, and new product launches. The frequency and complexity of these investment decisions make them prime candidates for greater rigor in protecting against our biases.
2. Consider the improbable through an explicit exploration of major uncertainties…
Objective: Increase premiums sold
Assumptions: Total underwriting hours per year = 64,144; Average premium per hour = $3,813
1. Identify specific performance drivers/variables
2. Assess impact of changing each major driver on desired objective
Example for illustrative purposes only
As large decisions increase in complexity and variability, performing sensitivity analysis can be useful. Sensitivity analysis provides a means to understand a decision’s drivers and the sensitivity of results to changes in these drivers.
Embrace uncertainty
Likely Case = $7M
Best Case = $27M
There’s a 10% probability of a loss
Example for illustrative purposes only
Another tool that takes the analysis of uncertainty even further is Monte Carlo Simulation (MCS). MCS is a method for iteratively evaluating different probabilities and scenarios through a computer generated simulation.
3. Consider the unpopular and broader view to avoid confirmation and availability bias…
“Large groups of people are smarter than an elite few, no matter how brilliant. These groups are better at solving problems, fostering innovation, coming to wise decisions, even predicting the future.”
- James Surowiecki, The Wisdom of Crowds
Identify potential problems early
Engage employees in the business’ success
Provide insights into business processes
Prediction markets are competitive exchanges or markets. They are a great way to engage employees, involve them in the creation of projections and ideas, and ultimately socialize and distribute decision inputs within the organization.
4. Build quality into a structured decision making process…
Before a project starts, ask this question…
“We’re looking in a crystal ball, and this project has failed. What are all the reasons why you think the project failed?”
A pre-mortem is the hypothetical opposite of a post-mortem. The pre-mortem explores why a project might fail, before it starts. It is a simple idea with significant benefits, not only encouraging but legitimizing dissent.
[Diagram: Project funding and resources requested → Cost Benefit Analysis (CBA) Review → Stage Gate 1 → Stage Gate 2 → Stage Gate 3 → Project Completed, with a GO/STOP decision at each gate; iterative value delivery and confirmation]
Stage gating is a way to ensure consistent monitoring of projects. Rather than committing to full funding of a project at the beginning of a large investment decision, key milestones are laid out at intervals throughout the project, with go/no-go decisions built in.
Inserting a devil’s advocate into the process is another mechanism for capturing dissension. A “challenge team” whose sole role is to act as a contrarian viewpoint throughout the lifecycle of a decision and project creates a healthy tension in large, complex decisions.