The document provides tips and techniques for estimating software development timelines. It recommends doubling the initial time estimate and rounding up to the next unit on the time scale to account for uncertainty. Key steps include defining requirements, identifying parameters to validate, and drawing on historical data. Estimates should be expressed as confidence intervals rather than precise figures. Prioritization weighs each feature's relative benefit, cost, and risk. Clear communication and removing personal biases are also important.
The document discusses various techniques for estimating software project timelines and schedules. It recommends doubling your initial time estimate and then rounding up to the next unit on the time scale (e.g. hours to days). It also stresses accounting for impediments by padding estimates. Other estimation techniques mentioned include defining parameters, using historical data, confidence intervals, fuzzy logic based on lines of code, the Delphi method, and prioritizing requirements based on factors like benefit, penalty, cost and risk. The document recommends preferring hour estimates over days for easier scaling and warns against assuming more developers will solve scheduling problems or negotiating estimates. It provides references for further reading on software estimation.
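As a rough illustration of the padding heuristic described above (double the raw estimate, then move up one unit on the time scale), here is a minimal sketch; the unit ladder and function name are assumptions for illustration, not from the talk:

```python
# Sketch of the "double, then round up to the next unit" heuristic.
# The unit ladder below is an assumption for illustration.
UNITS = ["hours", "days", "weeks", "months"]  # ascending time scale

def pad_estimate(value, unit):
    """Double the raw estimate, then bump it to the next unit on the scale."""
    doubled = value * 2
    idx = UNITS.index(unit)
    next_unit = UNITS[min(idx + 1, len(UNITS) - 1)]  # clamp at the top
    return doubled, next_unit

# e.g. a raw guess of "4 hours" becomes "8 days" under this heuristic
print(pad_estimate(4, "hours"))
```

The point of the heuristic is not precision but compensating for the systematic optimism of raw gut estimates.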
Many developers are often asked by project owners to give time estimates for features or bug fixes. But how many developers have the ability to provide project owners a reasonable estimate? Many developers will just follow irrational formulas or arbitrary methods to create a number that is not only wrong, but costly. "Stop Multiplying by 4" will teach developers of all skill levels easy techniques to provide accurate estimates. We will start with a small calibration exercise to find out how good you are. We will then go over procedures to improve accuracy. At the end of the talk, you will possess the skills to get you started on improving the certainty of your estimates.
The document discusses techniques for estimating software development timelines. It recommends doubling the initial time estimate and rounding up to the next unit on the time scale. Factors like involving other people and unexpected issues mean schedules are difficult to accurately predict. Breaking tasks down into smaller parts and using historical data, testing, and confidence intervals can help. Prioritization methods include urgency matrices and spreadsheets weighing factors like benefits, costs, and risks. The document provides references for further reading on software estimation.
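The spreadsheet-style prioritization mentioned above, weighing factors like benefit, cost, and risk, can be sketched as a simple weighted score in the spirit of Wiegers-style relative prioritization; the weights, feature names, and scores below are hypothetical:

```python
# Sketch of a weighted prioritization score: value delivered (benefit of
# having the feature plus penalty of not having it) divided by weighted
# cost and risk. Higher scores rank higher. All numbers are hypothetical.

def priority(benefit, penalty, cost, risk, cost_w=1.0, risk_w=0.5):
    """Relative priority: value / (weighted cost + weighted risk)."""
    value = benefit + penalty
    return value / (cost * cost_w + risk * risk_w)

features = {
    "checkout redesign": priority(benefit=9, penalty=7, cost=5, risk=3),
    "dark mode":         priority(benefit=4, penalty=1, cost=2, risk=1),
}
for name, score in sorted(features.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
```

The scores only mean anything relative to each other, which is exactly how a prioritization spreadsheet is used.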
This document discusses how to close the gap between measuring product metrics and learning from those metrics through an iterative process. It emphasizes establishing the right metrics to track, ensuring metrics are actionable by tying them to goals that change behaviors, and communicating metrics and plans to take action on results across teams. The document provides examples of metrics to track at different stages of product development and recommends establishing a single source of truth for metrics as well as communication checklists.
[CXL Live 16] When, Why and How to Do Innovative Testing by Marie Polli (CXL)
The document discusses different strategies for innovative testing on websites, including iterative and innovative testing. Iterative testing involves changing small elements like button colors or copy, while innovative testing involves changing multiple elements or entire designs. It recommends using innovative testing when an iteration won't work, testing potential is small, or traffic is low. This allows uncovering bigger gains. Innovative testing carries more risk but also higher potential benefit than iterative testing. The document advises making sure the testing process is solid by asking questions like whether the test matches the hypothesis and goals are set up correctly. It also recommends continuing qualitative research for feedback.
[CXL Live 16] Beyond Test-by-Test Results: CRO Metrics for Performance & Insi... (CXL)
Individual tests drive insights & ROI, but the most sophisticated optimizers look beyond what an individual test is telling them and use data to optimize their overall testing performance.
In this talk, Claire will dive into the specifics of how to track, improve, and drive insight from performance metrics for your conversion program, so you can not only run better tests, but get more out of your investment in CRO.
[CXL Live 16] How to Utilize Your Test Capacity? by Ton Wesseling (CXL)
Ton Wesseling gave a presentation at ConversionXL Live in Austin on March 31st 2016 about utilizing test capacity. He discussed optimizing conversions through the ROAR model of risk, optimization, automation and re-thinking. Wesseling emphasized fully using a company's test capacity for impactful A/B tests and separating that capacity for IT releases, campaigns and behavioral learning. He advised celebrating failures to encourage risk-taking and continuous learning.
HostingLabs was created by a small team at HOSTING to rapidly develop and test new technologies using an agile "Skunk Works" approach. It aims to foster innovation, thought leadership, staff development and community engagement through open source projects. Some examples of previous internal projects and current public projects are provided. The presentation encourages supporting employee satisfaction, learning, engagement and automating tasks to increase productivity. It promotes measuring the impact of changes using metrics.
The document provides 10 guidelines for running effective A/B tests:
1. Have one key metric per experiment to clarify decision making.
2. Use your key metric to calculate statistical power and determine required sample size.
3. Run experiments for the planned duration without early stopping.
4. Don't search for differences across many segments to avoid false positives.
5. Ensure experiment groups are balanced to avoid bucketing issues.
6. Don't overcomplicate methods when basics suffice.
7. Be cautious launching changes that didn't hurt without evidence of benefit.
8. Involve data scientists in the entire process for better design and analysis.
9. Only analyze people actually exposed to variations.
This document provides guidelines for A/B testing, including prioritizing test ideas based on estimated new conversions per day, creating tests by running a power analysis and having incremental tests, analyzing tests by monitoring health metrics, and making decisions carefully based on analysis results. It recommends calculating potential impact, having a data scientist involved, and not launching on neutral results to avoid technical debt.
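The power analysis step mentioned above, using the key metric to determine required sample size, might look like the following normal-approximation sketch for a two-proportion test; the fixed z-values assume a two-sided 5% significance level and 80% power:

```python
import math

def sample_size_per_group(p_baseline, mde):
    """Approximate per-group sample size for a two-proportion z-test.

    p_baseline: current conversion rate, e.g. 0.05
    mde: minimum detectable effect as an absolute lift, e.g. 0.01
    The z-constants below assume two-sided alpha=0.05 and 80% power.
    """
    z_alpha = 1.96  # two-sided 5% significance
    z_beta = 0.84   # 80% power
    p1, p2 = p_baseline, p_baseline + mde
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / mde ** 2)

# e.g. baseline 5% conversion, hoping to detect a 1-point absolute lift
print(sample_size_per_group(0.05, 0.01))
```

Note how quickly the required sample grows as the detectable effect shrinks, which is why guideline 2 insists on running this calculation before the test rather than stopping early.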
This presentation discusses good practices for successful project management. It recommends 1) defining success criteria at the start, 2) identifying drivers, constraints and flexibility, 3) defining release criteria during planning, and 4) negotiating achievable commitments. Additional tips include writing detailed plans, estimating efforts not time, tracking progress openly and honestly, and conducting retrospectives. The presentation notes that requirements will constantly change and the key is to simplify change management.
No estimates - 10 new principles for testing (Vasco Duarte)
This document outlines 10 principles for software development without estimates. It begins by discussing trusting or changing your process (Principle 1) and shortening feedback cycles (Principle 2). Data is presented showing estimates are often inaccurate, with 80% of projects being late or over budget. Principle 3 states to believe data over estimates. Alternatives to estimate-driven decision making are suggested in Principle 4. Principles 5-8 discuss testing for value, measuring progress with working software, and understanding predictable system outputs. Principle 9 advocates using methods with proven track records over hoping estimates will improve. The transformation begins with individuals, per Principle 10.
This document summarizes NCLab's digital skills training services for organizations. It offers customized, practice-based courses in technologies like SQL, Python, Java, data analytics, and machine learning. Training is through hands-on exercises with verified learning outcomes. Courses can be used to upskill employees, adapt to new technologies, and help new hires learn quickly. Target customers include workforce agencies, companies, community colleges, and universities.
This document introduces the A3 problem solving process. The A3 process provides a structured approach to address complex problems involving multiple causes across an organization. It involves planning to understand the current and target conditions, analyzing root causes, developing countermeasures through experiments, checking the results, and acting on lessons learned. An example is provided of using the A3 process to address an increase in serious defects found in code releases. The example walks through planning, root cause analysis identifying potential causes like insufficient testing time and large stories, developing countermeasures like weekly backlog grooming and test automation, and checking for results like reduced defects.
Ecommerce Conversion World, London, March 23, 2017 - Ton Wesseling keynote (Online Dialogue)
Keynote by Ton Wesseling of Online Dialogue at the Ecommerce Conversion World event in London on March 23, 2017 about "Online Experiments" and explaining the ROAR model.
The document discusses improving software project estimates by putting more effort into understanding project scope, gaining experience in the relevant domain and technology, and accounting for unanticipated work. It notes that estimates are often too optimistic due to the "planning fallacy". Better approaches include basing estimates on past actuals, recording time ongoing, and communicating lessons learned beyond just hour estimates. The goal is to provide more accurate and useful information to decision-makers while acknowledging estimates will always have some uncertainty for large projects.
You Got Your Engineering in my Data Science - Addressing the Reproducibility ... (jonbodner)
Presented at PyData DC 2016.
Data science is the backbone of modern scientific discovery and industry. It makes sense of everything from cancer trials to package delivery logistics. But all is not well with data science. Over the past decade, multiple studies have been found to be unreliable and non-reproducible when other scientists tried to recreate their results. This is due to a variety of factors, including fraud, pressure to publish, improper data handling practices, and bugs in analytic tools.
The problems faced by data science mirror problems that software engineering has been trying to solve. While there are no silver bullets to guarantee quality software, techniques have been developed over time that have improved quality and reliability. Some of these techniques, including open source, version control, automation, and fuzzing could be adapted to the data science domain to improve reliability and help address the reproducibility crisis.
The document provides guidance on problem solving techniques, with an emphasis on identifying the root cause rather than just fixing the problem. It outlines 8 keys to better problem solving, including keeping analyses simple, focusing on performance differences rather than possible causes, thoroughly documenting all steps, and maintaining discipline. The document then provides a specific methodology for problem solving, beginning with defining whether the problem is new or long-standing, writing a general and defined problem statement, considering any recent changes, establishing performance metrics for evaluation, and writing a final problem statement incorporating the metrics and comparison strategy.
A/B Testing and the Infinite Monkey Theory (UseItBetter)
Surveys show that on average only 1 out of 7 A/B tests run by e-commerce businesses turns out to be successful. Lukasz Twardowski, the CEO of UseItBetter, explains how some of the most successful online businesses master this process by turning it into an iterative, evidence-led experimentation-at-scale programme.
What is the story with agile data? Keynote, Agile 2018 (Troy Magennis)
This document discusses using data to improve agile practices and outcomes. It argues that agile has lost the "data war" by not capturing and utilizing data from teams effectively. It suggests that data needs to be handled safely to avoid embarrassing people and destroying the utility of historical data. Better ways are needed to measure outcomes rather than just output, and to balance predictability with creativity. The document also discusses visualizing and managing dependencies, comparing performance across teams, and using the right metrics depending on a team's characteristics and challenges. The overarching message is that data needs to be used carefully and conversationally to drive the right actions and improve agile practices.
D. Aitcheson. How to make forecasts that are actually accurate. (Agile Lietuva)
Fed up with endless arguments over whether a story should be 3 points or 5 points? Irritated at having to provide estimates to your management that you know are probably going to be wrong? We found that there is a BETTER WAY. Can we make the unpredictable world of product development a little more predictable?
Cycle Time Analytics: Reliable #NoEstimates Forecasting Using Data, Troy Mage... (Lean Kanban Central Europe)
If you are struggling to forecast project delivery dates and cost, want to eliminate the story estimation process because you feel it is waste, or need to build the business case for hiring more staff, then this session is relevant to you. All estimates have uncertainty, and understanding how multiple uncertain factors compound is the first step to improving project and team predictability. A major benefit of Lean is the lightweight capture of cycle time metrics. This session looks at how to use historical cycle time data to answer questions of forecasting and staff skill balancing, and compares the benefits of cycle time analysis over current planning techniques such as velocity, burn-down charts, and cumulative flow diagrams. It takes you on a journey of what to do after capturing cycle time data, and what to do if you have no history to rely upon. Reducing reliance on developer estimation (popularized by the Twitter hashtag #NoEstimates) is good general advice, but having the tools to plan and manage teams and projects is still important to maintain support at the executive level. This session details approaches to getting the numbers you need while minimizing unnecessary overhead and estimating only the factors that matter most.
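A cycle-time forecast of the kind this session describes can be sketched with a small Monte Carlo simulation; the historical cycle times below are made up, and the single-item-in-progress assumption is a deliberate simplification:

```python
import random

# Hypothetical historical cycle times (days per finished work item);
# in practice these come straight off your own board's history.
cycle_times = [2, 3, 1, 5, 8, 2, 4, 3, 6, 2]

def forecast_days(backlog_size, history, trials=10_000, seed=42):
    """Monte Carlo forecast: sample one cycle time per remaining item,
    sum them, repeat many times, and report the 85th percentile total
    as a conservative delivery forecast.

    Assumes one item in progress at a time; a team working N items in
    parallel would roughly divide the result by N.
    """
    rng = random.Random(seed)  # fixed seed for repeatable output
    totals = sorted(
        sum(rng.choice(history) for _ in range(backlog_size))
        for _ in range(trials)
    )
    return totals[int(trials * 0.85)]

print(forecast_days(20, cycle_times), "days to clear 20 items (85th pct)")
```

Because the simulation resamples real history, it captures the team's actual variability without anyone producing a story-point estimate.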
Adaptive Case Management Workshop 2014 - Keynote (Keith Swenson)
This is the first talk from the 3rd International Workshop on Adaptive Case Management and Non-Workflow BPM. Conference overview at: http://acm2014.blogs.dsv.su.se/
This document discusses strategies for improving clinical trial site performance through quantifying site metrics and providing feedback. It recommends generating site-specific performance reports using data from the IVRS, calculating metrics like screening and enrollment rates. These reports should be shared with sites via email on a regular basis to start evidence-based conversations about performance. It also suggests using a web-based platform to provide ongoing feedback through features like leaderboards, awards, and educational resources in order to build relationships with sites and motivate improved performance. While data is important, it's also critical to understand the human factors influencing performance and support sites in addressing challenges.
Program Studi S1 Sistem Informasi
Fakultas Sains dan Teknologi
Universitas Islam Negeri Sultan Syarif Kasim Riau
Backlink ke website resmi kampus:
http://sif.uin-suska.ac.id/
http://fst.uin-suska.ac.id/
http://www.uin-suska.ac.id/
Referensi ke Graham et.al (2006)
Metric Abuse: Frequently Misused Metrics in OracleSteve Karam
This is a presentation I created for RMOUG 2014 which I was sadly unable to attend. However, I wanted to share it with the Oracle community so that you can learn a bit about metrics that are frequently cited, frequently demonized, and frequently misused. In this deck we will go through the steps to diagnose issues and what NOT to blame as you go through the process.
The topics and concepts discussed here were originally formed in a blog post on the OracleAlchemist.com site: http://www.oraclealchemist.com/news/these-arent-the-metrics-youre-looking-for/
This document discusses test design techniques, including identifying test conditions from requirements or code, specifying test cases with inputs, expected outputs and pre/post conditions, and implementing test procedures or scripts. It provides examples of testing a marketing campaign for a mobile phone company, including setting up customer records and running specific tests for low-credit teenagers. The importance of prioritizing tests and scheduling test procedures is also covered.
7 DIGITAL BEST PRACTICES FOR HR PROFESSIONALS: WAYS TO RECRUIT AND HIRE FASTERHuman Capital Media
Industry experts predict that all successful businesses will soon become 100% digital. The biggest challenge is the “how”. The HR world is rapidly changing; global candidates are becoming more tech-savvy and rely on mobile-enabled interactions. Now more than ever, HR teams are looking to attract and retain the right talent, increase employee satisfaction, and enrich the overall recruiting and onboarding experience.
Here’s the time to make 2017 a year of digitization and transformation. The plan: revamp all of your current HR processes and swap paper for an eSignature solution for everything ranging from offer letters, new hire approvals, candidate NDAs, and contractor agreements.
Join this webinar to learn about the 7 important ways that HR teams can make meaningful progress in their digital transformation. You will also learn:
Top HR pains in the workplace
How other companies achieved success in digital transformation
How leveraging an eSignature solution can help you save time and money
This document summarizes the keynote presentation "Oracle DBA Best Practices" by Dennis Williams. It discusses best practices for Oracle DBAs, including ensuring proper backup and recovery procedures, monitoring database and server performance, keeping skills up to date through training and conferences, and developing strong people skills to effectively communicate the value of database administration. The presentation provides an overview of different types of DBAs and their roles, and emphasizes the importance of documentation, testing, and change management processes.
This document discusses test design techniques, including identifying test conditions, designing test cases, implementing tests through procedures or scripts, and determining the formality of test documentation. It provides definitions for key terms like test case, test condition, and traceability. It also covers analyzing requirements to identify test conditions, designing specific test cases, writing procedures to implement tests in a certain order, and prioritizing tests in an execution schedule. Exploratory testing involves minimum planning and is hands-on.
The document discusses issues with how testing is commonly done on projects and provides recommendations for improving the testing process. It notes that testing is often an afterthought and not given proper time, resources or importance. It recommends planning testing from the start, having clear requirements, proper tools, defined roles and processes, as well as measurements to track progress. Effective testing requires involvement from all stakeholders and aspects like acceptance criteria, facilitated user acceptance testing, and following good testing practices.
Software Development in the Brave New worldDavid Leip
The document discusses the agile software development methodology of Extreme Programming (XP). It provides an overview of XP, including its values, practices, and roles. It notes that XP focuses on communication, simplicity, feedback, and courage. Key practices include pair programming, user stories, planning iterations based on velocity, and daily stand-up meetings. The document also covers challenges and lessons learned with adopting XP.
The document discusses the agile software development methodology of Extreme Programming (XP). It provides an overview of XP, including its values, practices, and roles. It notes that XP focuses on communication, simplicity, feedback, and courage. Key practices include pair programming, user stories, planning games, and frequent small releases. The document also covers challenges and lessons learned with adopting XP.
This document discusses different methods for conducting retrospectives in Agile software development. It outlines several common retrospective structures including using three questions to gather data on what went well, what didn't go well, and what puzzles the team; using a starfish model to gather data on what to keep doing, start doing, stop doing, and less of; and using a timeline to map out significant, problematic, and good events over the project. The document also discusses setting the stage, gathering data, generating insights, deciding on actions, and closing out the retrospective. The goal of retrospectives is for teams to reflect on how to continuously improve.
Rejuvenating Agile Operations By Putting Lead And Cycle Time Front And Centre.Zan Kavtaskin
Agile methodologies such as Scrum, Extreme Programming and DSDM emerged in the 1990s and most of them were inspired by the Lean Manufacturing movement. While Lean Manufacturing focuses on increasing value and reducing cycle time, work in progress and lead time, Agile methodologies tend to focus on methods. Over the past few decades these methods became dogmatic, businesses struggle to align these methods with their goals and practitioners become disenchanted when they run out of Agile methods to increase delivery speed.
During this presentation Zan will present some of his research and show how it is possible to amalgamate Agile methods, Lean Manufacturing and Data Science to get your business back on track.
See the full analysis here:
https://medium.com/@zankavtaskin/list/research-rejuvenating-agile-operations-by-putting-lead-and-cycle-time-front-and-centre-766cc7993007
Rinse and Repeat : The Spiral of Applied Machine LearningAnna Chaney
The document describes a four step process for improving machine learning performance through continuous evaluation and retraining. Step 1 involves assessing performance using human judgment. Step 2 optimizes operating thresholds. Step 3 retrains the model with examples from humans. Step 4 redeploys the updated model and repeats the process. The goal is to continuously refine models through an iterative "spiral" approach as performance gains diminish over time.
Pin the tail on the metric v00 75 min versionSteven Martin
This presentation shows a different approach to metrics. Instead of listing the Top 10 field-tested metrics, we first talk about goals as prerequisites for metrics. Next, we discuss characteristics of good and bad metrics. We end with walking through an activity called “Pin the Tail on the Metric,” a technique to facilitate the critical thinking needed to determine what types of metrics can help your organization discuss trade-offs, options, and ultimately make better forward-looking decisions.
This document discusses keys to a high performance workplace, including managing information overload, interruptions, commitments, energy, and priorities. It notes that information overload can negatively impact focus, decision making, and well-being. Strategies are presented for decreasing email frequency and volume, handling emails using the 4D method, automating processes, and scheduling focused times to check email. The importance of prioritizing tasks, harnessing peak energy periods, and managing commitments and interruptions are also covered.
PROFES 2018, Wolfsburg: Talk by Tilman Seifert (Principal IT Consultant at QAware)
=== Please download slides if blurred! ===
Abstract: Processes cannot just be judged as ``good'' or ``efficient''---they must be appropriate for the type of project. As the type of a project changes over time,
the processes must adjust in order to stay efficient and appropriate.
We accompanied the transformation of a large and fast-growing project, using agile development methods and cloud-native technologies, from the very first steps of a prototype to the development of a customer-ready product.
This experience report shows patterns we found on the way.
It argues that systematic process evolution can be done without documentation overhead or relying on questionable process KPIs.
We only used information which is available anyway; this includes our archive of sprint retro boards which allows to create a clear picture of the project's evolution, regarding both the process and the product quality.
In this presentation will examine Over Processing within Six Sigma. Over Processing, unnecessary steps will waste time causing slow down in production lines. This waste can result in loss of time and money.
Similar to Stop multiplying by 4: Practical Software Estimation (20)
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
20 Comprehensive Checklist of Designing and Developing a WebsitePixlogix Infotech
Dive into the world of Website Designing and Developing with Pixlogix! Looking to create a stunning online presence? Look no further! Our comprehensive checklist covers everything you need to know to craft a website that stands out. From user-friendly design to seamless functionality, we've got you covered. Don't miss out on this invaluable resource! Check out our checklist now at Pixlogix and start your journey towards a captivating online presence today.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
4. I'll tell you how I estimate things. I sit down and figure out how long I think
it would take me to do it. Then I double that time and push it up to the next
point on the "Time Progression Scale", where the scale runs: seconds, minutes,
hours, days. So if I think something is going to take me 4 hours, I double it to
8 and then say it will take me 8 days to get it all completely done, start to
finish. And I'm usually right, because there are so many impediments. If you are
the only one doing the work, your schedule can be very accurate. Once you have
to involve other people, you might as well just make numbers up. You might as
well say "I think this will take me Blue days to get done."
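The "double it and move up the scale" heuristic can be sketched in a few lines, purely as a toy illustration of the anecdote above (the function name is mine, and this is the practice the talk argues against, not a recommendation):

```python
def naive_estimate(hours: float) -> str:
    """The 'double it and move up a unit' heuristic described above:
    double the gut-feel number, then report it one step further up the
    time-progression scale (seconds -> minutes -> hours -> days)."""
    scale = ["seconds", "minutes", "hours", "days"]
    doubled = hours * 2
    # the gut feel was in hours, so the doubled number gets reported in days
    return f"{doubled:g} {scale[scale.index('hours') + 1]}"

print(naive_estimate(4))  # a 4-hour gut feel becomes "8 days"
```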
5.
6. Requirements are the key
Measure what is measurable, and make measurable what is not so
-Galileo Galilei
7. "Wordy" Expression
Our new contact form for the sales team to call or email potential leads will
require the potential leads to provide the following contact information: First
and Last name, Email address, and Phone number
8. Misplaced Modifier
The contact form has fields for entering a valid First and Last
Name, Email address and Phone number
Top Tip: Prevent children from ingesting dangerous medicines by
locking them in a childproof cupboard. 3 children per cupboard is a
good fit
-Periwinkle Jones @peachesanscream
9. The contact form has fields for First and Last Name, Email address
and Phone number. All fields are required and must be validated

Valid phone numbers:
(202) 456-1111
555-1212
212-867-5309
1-800-MATTRES
911
+44 871 984 6352
+852 2280 2898
+91 11 2679 1234
10. Ambiguous terms and how to clarify them:
fast, rapid, efficient → Use a set time: "5 seconds"
valid; including but not limited to; etc.; and so on → Describe what is valid or invalid; use a comprehensive list
maximize, minimize, optimize, at least, between, several → Be sure to include appropriate values
simple, easy, quick, user-friendly → Describe what makes it so
reasonable, when necessary → How do you make this judgment?

Source: Software Requirements 2 – Karl Wiegers
11. In order to estimate, you must define the parameters of what you
are estimating:
UTF-8
First and Last name must be between 3 and 100 characters
Email complies with RFC 822 and no longer than 300 characters
Phone Number < 25 characters and validated with Foo-Bar REST
service
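As a sketch, the parameters above translate directly into checks. The length limits come from the slide; the email regex is a simplified stand-in for a full RFC 822 validator, and the Foo-Bar REST call stays a comment since it is a hypothetical service:

```python
import re

# Rules from the slide above:
NAME_RANGE = range(3, 101)   # First and Last name: 3-100 characters
EMAIL_MAX = 300              # RFC 822 compliant, no longer than 300 chars
PHONE_MAX = 25               # phone number < 25 characters

def validate_contact(first, last, email, phone):
    errors = []
    for label, name in (("first", first), ("last", last)):
        if len(name) not in NAME_RANGE:
            errors.append(f"{label} name must be 3-100 characters")
    # simplified stand-in for a real RFC 822 address parser
    if len(email) > EMAIL_MAX or not re.match(r"[^@\s]+@[^@\s]+", email):
        errors.append("email is not valid")
    if len(phone) >= PHONE_MAX:
        errors.append("phone number too long")
    # in the full flow the phone would also be sent to the (hypothetical)
    # Foo-Bar REST service for validation
    return errors

print(validate_contact("Al", "Jones", "al@example.com", "212-867-5309"))
```

With a one-character first name the call above reports the name-length error; a fully valid submission returns an empty list.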
12. Historical data
Dry run / Unit test
Confidence Interval (CI)
It is better to be roughly right than precisely wrong.
- John Maynard Keynes
13. What is the wingspan of a 747?
How far is NY from LA?
The average house in the United States uses how many gallons/liters of
water per day?
Francis Scott Key wrote the lyrics, but not the music, for the American
National Anthem
28 degrees Fahrenheit is colder than -15 degrees Celsius.
America On Line purchased Netscape.
15. Q. What is the wingspan of a 747?
A. 211 ft (64 m)
Q. How far is NY from LA?
A. 2,808 mi (4,519 km)
Q. The average house in the United States uses how many
gallons/liters of water per day?
A. 350 gal (1,324 l)
Q. Francis Scott Key wrote the lyrics, but not the music, for the
American National Anthem
A. True
Q. 28 degrees Fahrenheit is colder than -15 degrees Celsius.
A. False
Q. America On Line purchased Netscape.
A. True
16. Fuzzy Logic / T-Shirt Sizing

Size         Average LOC
Very Small           127
Small                253
Medium               500
Large              1,014
Very Large         1,988

Source: Software Estimation 2 – Steve McConnell
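A minimal sketch of how the size table can feed a rough estimate, assuming you also track your own LOC-per-hour rate from historical data (the 25 LOC/hour figure below is invented for illustration):

```python
# T-shirt sizing as a lookup: classify a feature, then convert the size
# to an expected line count using the averages from the slide above.
AVG_LOC = {
    "very small": 127,
    "small": 253,
    "medium": 500,
    "large": 1014,
    "very large": 1988,
}

def rough_effort_hours(size: str, loc_per_hour: float) -> float:
    """loc_per_hour should come from your own historical data."""
    return AVG_LOC[size] / loc_per_hour

# e.g. if history says you produce ~25 LOC/hour, a "medium" feature:
print(round(rough_effort_hours("medium", 25), 1))  # 20.0 hours
```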
19. Priorities – Urgency Matrix

              Important          Not Important
Urgent        High Priority
Not Urgent    Medium Priority    Low Priority

Source: Software Requirements 2 – Karl Wiegers
20. Priorities – Prioritization Spreadsheet
Source: Software Requirements 2 – Karl Wiegers
https://www.microsoftpressstore.com/store/software-requirements-9780735679665

Feature                        Relative   Relative   Total             Relative           Relative
                               Benefit    Penalty    Value   Value %   Cost     Cost %    Risk     Risk %   Priority
Validate Phone                 2          4          8       13.1      1        9.1       1        14.3     0.81
Connect to Service             5          3          13      21.3      2        18.2      1        14.3     0.84
Report on Data                 9          7          25      41.0      5        45.5      3        42.9     0.61
Mark off for contacted leads   5          5          15      24.6      3        27.3      2        28.6     0.59
Totals                         21         19         61      100.0     11       100.0     7        100.0
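The spreadsheet can be reproduced in a few lines. Wiegers' value formula is 2 × benefit + penalty, and the priorities in the table match his default weights of 1.0 for cost and 0.5 for risk:

```python
# Feature ratings from the spreadsheet above (customer rates benefit and
# penalty, developers rate cost and risk, each on a 1-9 scale).
features = {
    "Validate Phone":               dict(benefit=2, penalty=4, cost=1, risk=1),
    "Connect to Service":           dict(benefit=5, penalty=3, cost=2, risk=1),
    "Report on Data":               dict(benefit=9, penalty=7, cost=5, risk=3),
    "Mark off for contacted leads": dict(benefit=5, penalty=5, cost=3, risk=2),
}

def prioritize(features, cost_weight=1.0, risk_weight=0.5):
    total_value = sum(2 * f["benefit"] + f["penalty"] for f in features.values())
    total_cost = sum(f["cost"] for f in features.values())
    total_risk = sum(f["risk"] for f in features.values())
    out = {}
    for name, f in features.items():
        value_pct = 100 * (2 * f["benefit"] + f["penalty"]) / total_value
        cost_pct = 100 * f["cost"] / total_cost
        risk_pct = 100 * f["risk"] / total_risk
        # priority = value % divided by the weighted cost and risk percentages
        out[name] = round(value_pct / (cost_weight * cost_pct
                                       + risk_weight * risk_pct), 2)
    return out

print(prioritize(features))
```

Sorting the result in descending order gives the delivery order: Connect to Service, Validate Phone, Report on Data, then Mark off for contacted leads.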
29. Thank You!
Chuck Reeves @manchuck
http://joind.in/talk/view/10634

Software Estimation: Demystifying the Black Art – by Steve McConnell | ISBN-13: 978-0735605350
Software Requirements: Practical Techniques for Gathering and Managing Requirements – by Karl Wiegers
How to Measure Anything: Finding the Value of Intangibles in Business – by Douglas Hubbard | ISBN-13: 978-1118539279
Dev Hell Podcast – Episode 29: Snappy Answers to Stupid Questions
Editor's Notes
With companies like Google and Spotify not caring about deadlines, why should we care about estimating software?

Why should we estimate? Estimation is not just about meeting deadlines. Managers need to know if the cost of building is worth the effort. Banks, budgets and backers need to gauge how their investment is going to be used.

Why should developers estimate? Developers own the code. Magne Jørgensen and Stein Grimstad proved that if you have any inkling of the budget or timeline, your estimate will be biased. Project managers and owners know this information and might try to "fit" the cost into what they are willing to spend.
What do developers do wrong? Developers are capable of taking a problem and breaking it down into smaller parts, yet when it comes to estimating, we follow arbitrary methods that can cause projects to go over budget or, worse, fail completely.
A story about estimation that went "well": I needed a developer who was working on another project. He was working with someone who had been programming for 30 years. Neither was sure how long it would take to complete the huge laundry list of items. I sat down with both of them, went over the whole project and broke it up into smaller parts. We then came up with estimates for each item, then sat down with the project manager to come up with a timeline the client was happy with. The project went along for a month, then panic: one of the libraries we were using did not work with some version of Safari. Everyone scrambled to get a fix in. There was fear that we would miss the deadline, which would set back my project. We got a fix in, deployed all the changes, and when the smoke cleared, we found that we had launched a day early.
How did we do it? When dealing with estimates, you are fighting a battle against uncertainty. Remember this: it's easy to estimate what you know; it's hard to estimate what you don't know; it's very hard to estimate what you don't know you don't know. **Use driving-to-work example**
Requirements – Wordy Expressions: Managers try to "sell" requirements to you, which means that sometimes they will add "fluff" to a requirement. Wordy expressions add useless information to a statement.
Requirements – Misplaced Modifiers: A misplaced modifier happens when a word, phrase or clause is improperly separated from the word it describes. In this case, "valid" sits next to First and Last name, so it modifies only those fields. Does that mean we can enter an invalid email address or phone number? Groucho Marx has a famous one: "I shot an elephant in my pajamas the other day. How it got into my pajamas I'll never know."
Requirements – Clarity: Now that we have cleaned up the format and grammar of the requirement, we look for smells in it. For example, what is considered a "valid" phone number?
Requirement Smells: If estimation is a battle against uncertainty, requirement gathering is a battle against ambiguity. Words used in requirements that cause ambiguity I like to call requirement smells. Like code smells, these words or phrases raise red flags about the requirements. Take the time to clarify the requirements before making an estimate.
Requirements – Final: Once you have defined the requirements as best you can, you can start estimating. We now see that our one requirement for a contact form has grown into a much bigger project than we thought. Break the requirements into smaller, more manageable units; we can then look at which parts carry the most uncertainty.
Tools for Estimating: Before getting into some techniques, we are going to need more information. Using the following tools we can then break down our requirement.

Historical data: Take current data about development and proxy it to new requirements. Start tracking metrics like LOC, number of functions, and average LOC per function. Apply time to each of those metrics to get a rough idea of how long it takes to produce each: LOC per hour or per day. I wrote a script that would run git commit every hour to help with this.

Dry run / unit test: You don't need a full-stack testing framework, but you can test out some critical functions. If you have no previous experience with a service, you have no historical data to base your estimate on. Spending an hour or two testing out the logic can give you better insight into the complexity of the requirement. Even if you have worked on something similar in the past, do a dry run for the more complex tasks. I was asked to connect to the OH tax service using SOAP. In the past I had made many SOAP calls, so my estimate reflected that experience. After spending about 15 hours of my 12-hour estimate, I was unable to make the connection for technical reasons and the requirement was dropped (I was told I needed to use .NET or Java, not PHP, to use the service).

Confidence Interval (CI): This is a statistical model that represents uncertainty, calculated from means and variances. We see them in the real world with hurricane paths. They are great because we do not need to "pad" our estimate. The interval uses a high and low range that represents our 90% confidence that the "true" value lies between them.
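As a sketch of the confidence-interval idea, historical task durations can be turned into a 90% interval instead of a single number (the sample data below is invented, and a t-value would be more appropriate than the 1.645 z-value for samples this small):

```python
import statistics

def ninety_pct_interval(samples):
    """90% confidence interval for the mean of historical durations.
    1.645 is the z-value for 90% confidence."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)          # sample standard deviation
    half = 1.645 * sd / (len(samples) ** 0.5)
    return (round(mean - half, 1), round(mean + half, 1))

# invented history: hours spent on eight similar past tasks
past_hours = [6, 9, 7, 12, 8, 10, 7, 9]
low, high = ninety_pct_interval(past_hours)
print(f"estimate: {low}-{high} hours")
```

Reporting "7.4 to 9.6 hours" carries the uncertainty with it, so no arbitrary padding is needed.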
Calibration Exercise: We are going to do a practice calibration test. Three questions have a numeric answer; for these, try to use a confidence interval. Three questions are true or false; do not answer those with null ;).
Equivalent Bet: This works on a gut feeling. Imagine a spinner that pays out 90% of the time. You choose between your estimate and betting on the spinner. If you choose the spinner, you are most likely not confident in your value. If you choose your estimate, you might be overconfident and your range too wide.

Repetition: Take a lunch break, come back in an hour and try your estimate again. Clear your mind and try the estimate again. Don't read your first estimate before retrying, to avoid anchoring to your original estimate.

Pros and Cons: Make a list of things that will happen if your estimate is right and if it is wrong. This brings clarity to the problem and removes some bias. After the list, try again.

Absurdity Test: Narrow down your range by starting with absurd values for your CI and shrinking them. For example, for the wingspan of a 747, a range of 1 to 1,000 ft is absurd. Does 80 to 250 ft sound better? What about 180 to 220 ft?
Fuzzy Logic: A simple estimation tool to get an idea of effort. Classify features into Very Small, Small, Medium, Large, Very Large. You then have an idea of how much work is needed based on historical data or industry averages. Keep track of your estimates in hours.
Wideband Delphi (AKA Group Estimation): Based on the statistical Law of Large Numbers, whereby the more samples you have, the closer your average is to the true value. This requires a team of at least 4 people and works best with about 10 total. First choose a coordinator. The coordinator presents the requested feature, takes estimates from each member, and averages the numbers. The coordinator then presents the data to the team. A vote is cast and if it fails, the team estimates again. It is critical that estimates are kept secret to avoid bias. Traditional Wideband Delphi fails to represent uncertainty since a single average is voted on; I recommend that you collect ranges and then average the high and low numbers.
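The range-averaging variant recommended above might look like this sketch (the team's ranges are invented):

```python
# Collect a secret (low, high) range from each estimator, then average
# the lows and the highs separately to keep the uncertainty visible.
def combine_ranges(ranges):
    lows, highs = zip(*ranges)
    return (sum(lows) / len(lows), sum(highs) / len(highs))

team_estimates = [(4, 10), (6, 12), (5, 9), (8, 16)]  # hours, per estimator
print(combine_ranges(team_estimates))  # (5.75, 11.75)
```

The team then votes on the combined range rather than a single point.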
Bayes' Theorem: As stated at the beginning of this presentation, we are going to avoid math as much as possible. However, you cannot talk about probabilities without talking about Bayes' theorem. It was developed by Thomas Bayes in the 18th century and is widely used in many applications today, from predictive text to suggested ads. The simplicity of the equation allows for highly complex Bayesian trees. The takeaway from Bayes' theorem is this: when you get more information about a probability, you must revise your original estimate. The estimate at the start of a project might not reflect the estimate halfway through; by that point you have more clarity and can better predict the outcome.
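A toy Bayes update in the spirit of the note above; all of the probabilities are invented for illustration:

```python
def bayes(prior, likelihood, evidence):
    """P(H|E) = P(E|H) * P(H) / P(E)"""
    return likelihood * prior / evidence

# Invented scenario: prior belief that the task finishes on time is 70%.
# Suppose on-time tasks pass their first code review 90% of the time,
# and across all tasks the first review passes 75% of the time.
# Passing the first review should raise our confidence:
posterior = bayes(prior=0.7, likelihood=0.9, evidence=0.75)
print(round(posterior, 2))  # 0.84
```

The point is the shape of the update, not the numbers: new evidence moves the original estimate.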
Priorities: Once all the features have estimates attached, how do you decide when each item gets done? Most of the time we order them High, Medium, or Low. How effective is that? Based on surveys of project managers, ~85% of tasks become High, ~10% Medium and ~5% Low. An importance/urgency matrix is an easy way to avoid this trap: by comparing importance against urgency, you get a real grasp of the priority scale.
Priorities – Prioritization Spreadsheet: This is the ultimate way to reduce bias. Have the customer rate the relative benefit of each feature on a scale of 1-9, where 9 is extremely valuable, then estimate the penalty on a scale of 1-9, where 9 means a serious impact on the business. Developers rate the relative cost on a scale of 1-9, where 1 is quick and easy, then rate the technical risk, where 1 means you can do it in your sleep. Then sort the list descending by priority, where priority is Value % divided by the weighted sum of Cost % and Risk %.
Politics: Dealing with the rest of the company can be a challenge. There will be politics everywhere you go, but you can curb some of the resistance you will get. When dealing with managers or product owners, remember this: you are responsible for the code. Imagine a patient about to be operated on. The patient is on the table and sees the doctor washing his hands. If the patient yells at the doctor to stop washing his hands and get into the OR to start, is the doctor going to listen? The patient is the boss because he is paying the doctor, but the doctor is going to act in the manner that benefits the patient.
Remove people from the problem: Everyone will want everything done yesterday. You also have your own needs for the project that must be met. Focusing on what is best for the project (or, better yet, the dollar amount) helps change the perspective. In the earlier example of connecting to the Foo-Bar REST service: if development will cost $2,000 plus another $500 per month to maintain, but saves $2.00 per lead, and you only get 2 leads a day, the savings are only about $120 per month. If a sales rep is demanding that this feature be implemented, show that there is a loss per month with this service and make the priority lower.
Focus on interests, not positions: If the estimate takes the project past the ship date, work out with the project owners what you can deliver in the time available. Try to get features implemented with known bugs and workarounds that you can fix after the ship. If this is needed for a trade show, work out a "white page" demoing the features that you won't be able to deliver.
DO NOT NEGOTIATE YOUR ESTIMATE!!!!! If you start down this path, managers won't believe your estimates. They will stop asking you for estimates, or assume you are over- or under-estimating. Using the doctor example again: if a doctor diagnoses you with cancer and tells you that you only have 6 months to live, would the doctor look at your face, see your sorrow, and change the estimate to 3 years?