This document introduces various graphical methods in Minitab that can be used at different stages of the DMAIC cycle for process improvement projects. It discusses histograms, run charts, control charts, box plots, dot plots, scatter plots, marginal plots, matrix plots, and Pareto diagrams. For each type of graph, it provides an example using sample data and step-by-step instructions for creating the graph in Minitab. The document emphasizes that graphics are useful for visualizing relationships in data and communicating findings to others.
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Six Sigma Introduction (J. García-Verdugo)
Six Sigma is a structured methodology that uses statistical methods to eliminate defects and reduce process variation. It is applied through projects led by specialists using the DMAIC cycle of Define, Measure, Analyze, Improve, and Control. The goals of Six Sigma are to fully meet customer needs economically and achieve breakthrough process improvement and profitability. It supports existing quality programs to make them more successful by providing a framework to consistently deliver precisely defined financial contributions.
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Cause and Effect An... (J. García-Verdugo)
This document discusses a cause and effect matrix tool used in Six Sigma process improvement. It provides instructions for creating a cause and effect matrix, including identifying key customer outputs, rating their importance, evaluating the correlation between process inputs and each output, and calculating total scores to identify important inputs. The document includes an example of a cause and effect matrix applied to a cleaning process, rating inputs like training and regulations on their impact to outputs of clean, undamaged parts. It suggests the matrix helps determine which inputs and process steps require further investigation.
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Z Transformation (J. García-Verdugo)
The document provides an overview of Z-transformation and capability calculations based on defective units. It explains the theoretical background of Z-transformation and how it can be used to determine the portion of production outside specifications. An example shows how to calculate the percentage of a normal distribution that is above a given value using Z-transformation. The document also discusses how to calculate sigma values from defect portions and provides tables to convert between defects per million opportunities (DPMO) and sigma.
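The Z-transformation calculation summarized above can be sketched in a few lines of Python. The process mean, standard deviation, and specification limit below are illustrative values, not figures from the training material:

```python
from statistics import NormalDist

# Illustrative values (not from the training material).
mean, sigma, usl = 100.0, 5.0, 110.0    # process mean, std dev, upper spec limit

z = (usl - mean) / sigma                # Z-transformation of the spec limit
p_above = 1 - NormalDist().cdf(z)       # portion of production above the USL
dpmo = p_above * 1_000_000              # defects per million opportunities

print(f"z = {z:.2f}, fraction above USL = {p_above:.5f}, DPMO = {dpmo:.0f}")
```

Here a z of 2.0 corresponds to roughly 2.3% of production outside the upper limit, or about 22,750 DPMO.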
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Statistical Methods (J. García-Verdugo)
This document discusses statistical process control and provides examples of statistical concepts and tools. It begins by explaining why statistics are needed for process improvement, specifically to understand variability and stability. It then gives examples of X-bar control charts to show differences between a stable, controlled process (Process A) and an unstable, uncontrolled process (Process B). Further concepts introduced include sources of variation, process capability, probability, the normal distribution, and descriptive statistics. Analytical approaches and a selection of statistical techniques are presented for analyzing different data types.
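The stable-versus-unstable comparison on an X-bar chart rests on simple limit arithmetic. A minimal sketch, assuming invented subgroup means and an assumed within-subgroup standard deviation:

```python
from statistics import mean

# Invented subgroup means (subgroup size n = 4) and an assumed
# within-subgroup standard deviation.
subgroup_means = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
n = 4
sigma_within = 0.4

center = mean(subgroup_means)                 # center line of the X-bar chart
ucl = center + 3 * sigma_within / n ** 0.5    # upper control limit
lcl = center - 3 * sigma_within / n ** 0.5    # lower control limit

# Points outside the limits signal an unstable, uncontrolled process.
out_of_control = [m for m in subgroup_means if not lcl <= m <= ucl]
print(f"CL = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}, signals: {out_of_control}")
```

With these values every subgroup mean falls inside the limits, the pattern of a stable "Process A".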
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Process Capability (J. García-Verdugo)
The document discusses process capability analysis and metrics. It provides information on calculating and interpreting process capability ratios Cp and Cpk using Minitab. Key steps in building a capability study include identifying rational subgroups, collecting a short-term dataset of 30-50 points, and analyzing the data to determine if the process is stable and normally distributed. Process capability can be estimated using pooled standard deviation for potential capability or overall standard deviation for true process capability.
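The Cp and Cpk ratios mentioned above can be computed directly from a sample and the specification limits; the data below are invented for illustration:

```python
from statistics import mean, stdev

# Invented short-term sample and specification limits.
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.9, 10.1, 10.0, 10.0]
lsl, usl = 9.0, 11.0

mu, s = mean(data), stdev(data)
cp = (usl - lsl) / (6 * s)                # potential capability (spread only)
cpk = min(usl - mu, mu - lsl) / (3 * s)   # capability allowing for an off-center mean

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

Because this sample happens to be centered exactly between the limits, Cp and Cpk coincide; any shift of the mean would pull Cpk below Cp.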
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Thought Process Map (J. García-Verdugo)
The document provides an example of a thought process map used to structure thoughts, ideas, questions, and tools for developing a project strategy. It includes examples of how thought process maps can be used to plan cost optimization, quality improvement, and other projects. The maps allow graphical presentation of the critical thoughts, paths and questions to guide the project through identification of goals, potential issues to address, and actions needed.
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Analysis of Measure... (J. García-Verdugo)
This document provides an introduction to measurement system analysis. It discusses key concepts such as accuracy, precision, bias, repeatability, reproducibility, and linearity. Accuracy refers to how close a measurement is to the true value, while precision describes the variation of repeated measurements. Sources of variation include the measurement system itself and actual process variation. The document emphasizes that measurement system variation must be determined and separated from process variation in order to improve the actual process. It provides examples of stability, correlation, and the precision-to-tolerance ratio as ways to evaluate measurement systems.
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Attributive Data (MSA) (J. García-Verdugo)
This document provides an overview of analyzing measurement systems, specifically for attributive or subjective measurements. It discusses evaluating measurement systems using the Kappa coefficient, which calculates inter-rater reliability for classification data. A Kappa value above 0.7 indicates an acceptable measurement system, while a value near 1 represents an excellent system. The document uses examples to demonstrate how to calculate Kappa values from classification data involving two or more raters. Analyzing measurement systems with Kappa helps determine if a measurement process is reliable before making process changes.
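Cohen's Kappa for two raters can be sketched as follows. The pass/fail judgements are fabricated to show the calculation; note that the resulting Kappa of 0.6 would fall below the 0.7 acceptance threshold mentioned above:

```python
# Fabricated pass/fail judgements from two raters on the same 50 parts.
rater_a = ["pass"] * 20 + ["fail"] * 5 + ["pass"] * 5 + ["fail"] * 20
rater_b = ["pass"] * 20 + ["pass"] * 5 + ["fail"] * 5 + ["fail"] * 20

n = len(rater_a)
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Expected agreement by chance, from each rater's marginal proportions.
categories = set(rater_a) | set(rater_b)
p_expected = sum(
    (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
)

# Kappa: agreement beyond chance, scaled to the maximum possible.
kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"observed = {p_observed:.2f}, expected = {p_expected:.2f}, kappa = {kappa:.2f}")
```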
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W2 Non Normal Data (J. García-Verdugo)
This document discusses evaluating non-normally distributed data sets and transforming them to normal distributions. It describes the Box-Cox and Johnson transformations, which can be used to normalize data. Box-Cox transforms the data directly, while Johnson "distorts" a normal distribution to model the data distribution. The document provides examples applying the Johnson transformation to uniform and other non-normal data sets. Graphical analyses show the transformations successfully produce normal distributions for analysis and process capability evaluations.
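The Box-Cox transformation has a simple closed form, (x^lambda - 1)/lambda for positive x, with the natural log as the lambda = 0 case. A sketch with an invented right-skewed sample; Minitab itself searches for the lambda that best normalizes the data, while here lambda is simply fixed at 0 (the log transform):

```python
import math

def box_cox(x, lam):
    """Box-Cox transformation for positive x: (x**lam - 1)/lam, or ln(x) at lam = 0."""
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1) / lam

# Invented right-skewed sample (e.g. cycle times); the log transform often
# normalizes multiplicative, right-skewed data.
skewed = [1.2, 1.5, 2.0, 2.8, 4.1, 6.3, 10.5]
transformed = [box_cox(x, 0) for x in skewed]
print([round(t, 3) for t in transformed])
```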
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W2 Chi Square Test (J. García-Verdugo)
The Chi Square test can be used for three purposes: testing goodness of fit, testing independence, and testing homogeneity. It is used to determine whether a sample comes from a known distribution, whether two characteristics are independent of each other, or whether multiple samples come from the same population. The test calculates an observed Chi Square value and compares it with a critical value from Chi Square tables to decide whether the null hypothesis can be rejected. For a goodness-of-fit test on coin-flipping data, the observed Chi Square value exceeded the critical value, indicating the coin was likely biased rather than fair.
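The coin-flip goodness-of-fit test described above reduces to a one-line statistic. The counts below are illustrative (60 heads in 100 tosses, against a fair-coin expectation of 50/50):

```python
# Illustrative coin-flip counts: [heads, tails] observed vs. expected for a fair coin.
observed = [60, 40]
expected = [50, 50]

# Chi-square statistic: sum of (O - E)^2 / E over all categories.
chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Critical value for alpha = 0.05 with 1 degree of freedom, from chi-square tables.
critical = 3.841

print(f"chi-square = {chi_sq:.2f}, critical = {critical}")
if chi_sq > critical:
    print("Reject H0: the coin is unlikely to be fair.")
```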
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W2 Multi-vari Studies (J. García-Verdugo)
This document discusses procedures for conducting and analyzing multi-vari studies. It begins with an overview, explaining that multi-vari studies examine how multiple factors interact and influence process outputs. The document then covers planning a study, collecting data, analyzing the data, and reporting results. An example study looks at contamination levels and examines the effects of factors such as day, shift, and time using tools like ANOVA. The results indicate that shift has a significant impact on contamination levels.
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W2 Design Of Experimen... (J. García-Verdugo)
The document provides an introduction to design of experiments (DOE). It discusses key components of experiments including input variables, output variables, experimental design, and validity. It also covers 2^k factorial designs, introducing factors and their levels. 2^k factorial experiments are presented as an uncomplicated way to start with DOE techniques. The final pages provide examples of 2^k factorial design matrices and show how to calculate effects to determine influential factors.
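Calculating effects from a 2^k design matrix works as sketched below, using an invented 2^2 experiment with coded factor levels:

```python
# Invented 2^2 factorial: factors A and B coded -1/+1, one response per run.
runs = [
    (-1, -1, 20.0),
    (+1, -1, 30.0),
    (-1, +1, 22.0),
    (+1, +1, 36.0),
]

def effect(index):
    """Main effect = mean response at the high level minus mean at the low level."""
    high = [y for *x, y in runs if x[index] == +1]
    low = [y for *x, y in runs if x[index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

effect_a = effect(0)   # (30 + 36)/2 - (20 + 22)/2
effect_b = effect(1)   # (22 + 36)/2 - (20 + 30)/2
print(f"effect of A = {effect_a}, effect of B = {effect_b}")
```

The larger effect of A (12 units vs. 4 for B) would mark A as the more influential factor.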
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W4 Taguchi Robust Designs (J. García-Verdugo)
The document describes Taguchi's method for robust design. It discusses using design of experiments to create robust products and processes by minimizing variation from the target value. The key aspects are:
- Taguchi proposed applying DOE techniques to optimize settings for factors influencing a process to make it robust against noise factors like materials and environment.
- Taguchi advocated designing processes to be "on-target" rather than just meeting specifications to reduce costs from variation.
- An example application looks at factors influencing the size of ceramic pieces after baking, with the furnace position as the noise factor. Analysis of the experiment aims to find settings minimizing output variation.
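The robustness goal in Taguchi's method is usually scored with a signal-to-noise ratio. A minimal sketch using the "nominal is best" formula and fabricated measurements taken across noise conditions (such as furnace positions):

```python
import math

# Fabricated responses for one control-factor setting, measured across
# noise conditions (e.g. furnace positions); target size is 10.0.
measurements = [10.2, 9.9, 10.1, 9.8]

mean = sum(measurements) / len(measurements)
var = sum((m - mean) ** 2 for m in measurements) / (len(measurements) - 1)

# Taguchi "nominal is best" signal-to-noise ratio: higher means more robust
# (less variation relative to the mean).
sn_ratio = 10 * math.log10(mean ** 2 / var)
print(f"mean = {mean:.3f}, S/N = {sn_ratio:.1f} dB")
```

In the analysis, the control-factor settings with the highest S/N ratio are the ones that minimize output variation.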
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W3 Sample Size (J. García-Verdugo)
A sample size that is too small increases the risk of overlooking effects that are truly present, because the test has low statistical power. With a larger sample size this risk decreases, but costs and time increase. The key factors in determining sample size are the desired power, the significance level, the expected effect size, and the standard deviation. Given values for these factors, sample size calculators can determine the sample needed for a given hypothesis test.
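For a two-sided one-sample z-test, the formula behind such calculators is n = ((z_alpha/2 + z_beta) * sigma / delta)^2. A sketch with illustrative inputs:

```python
import math
from statistics import NormalDist

# Illustrative inputs for a two-sided one-sample z-test.
alpha, power = 0.05, 0.80   # significance level and desired power
sigma, delta = 2.0, 1.0     # standard deviation and effect size to detect

z = NormalDist()
z_alpha = z.inv_cdf(1 - alpha / 2)   # about 1.96
z_beta = z.inv_cdf(power)            # about 0.84

# Required sample size, rounded up to the next whole observation.
n = ((z_alpha + z_beta) * sigma / delta) ** 2
print(f"required sample size: about {math.ceil(n)}")
```

Halving the effect size delta roughly quadruples the required sample, which is where the cost/risk trade-off in the summary comes from.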
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W32 DOE Center Points... (J. García-Verdugo)
The document discusses using center points in two-level factorial designs to check for linearity. It provides an example of a chemical engineer running a 2x2 design on reaction time and temperature and adding center points. The results show the center points do not significantly deviate from linearity, indicating the model is linear. A second example shows center points having a significant effect, suggesting curvature. The document also discusses incorporating block factors into a design, including an example of using two types of catalysts as block factors.
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W3 Median Tests (J. García-Verdugo)
This document discusses median tests that can be used as alternatives to the analysis of variance (ANOVA) when its assumptions are violated. It describes Mood's median test and the Kruskal-Wallis test, indicating that Mood's test is robust against outliers while Kruskal-Wallis is more robust against unequal distributions. An example analyzes reject rates from a pneumatic module test using both tests to support the conclusions from an initial ANOVA.
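The first step of Mood's median test, counting observations above the grand median in each group, can be sketched as follows with invented reject-rate data; the resulting counts table would then be tested with a chi-square statistic:

```python
from statistics import median

# Invented reject rates from three test stations.
groups = {
    "A": [2.1, 2.4, 2.2, 2.6, 2.3],
    "B": [2.8, 3.1, 2.9, 3.0, 2.7],
    "C": [2.2, 2.5, 2.4, 2.3, 2.6],
}

# Mood's median test: count observations above the grand median per group.
grand = median(v for vals in groups.values() for v in vals)
counts = {g: sum(v > grand for v in vals) for g, vals in groups.items()}
print(f"grand median = {grand}, above-median counts = {counts}")
```

Station B sits almost entirely above the grand median, the kind of imbalance the subsequent chi-square test on the counts table would flag.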
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W2 Correlation and Reg... (J. García-Verdugo)
The document discusses correlation and regression analysis. It provides an overview of key concepts like the regression coefficient, correlation coefficient, and fitted line plots. It also describes how to calculate regression using the method of least squares and how to validate factors using tools like t-tests, ANOVA, and regression. An example is shown analyzing the relationship between softening temperature measured at a supplier vs. a customer. The correlation between the two factors is calculated to be 0.834, indicating a strong positive correlation.
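The correlation coefficient and least-squares line can both be computed from the standard sums of squares. The paired supplier/customer temperatures below are invented, so the resulting r differs from the 0.834 reported in the document:

```python
from statistics import mean

# Invented paired measurements: softening temperature at supplier (x)
# and at customer (y).
x = [120.0, 122.0, 125.0, 128.0, 130.0]
y = [121.0, 121.5, 126.0, 127.0, 131.0]

mx, my = mean(x), mean(y)
sxx = sum((a - mx) ** 2 for a in x)
syy = sum((b - my) ** 2 for b in y)
sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))

r = sxy / (sxx * syy) ** 0.5    # Pearson correlation coefficient
slope = sxy / sxx               # least-squares regression coefficient
intercept = my - slope * mx

print(f"r = {r:.3f}, fitted line: y = {intercept:.2f} + {slope:.3f} x")
```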
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W3 Fractional Factoria... (J. García-Verdugo)
This document discusses fractional factorial designs, which reduce the number of experimental runs needed compared to full factorial designs. It provides examples of fractional factorial designs with different numbers of factors and resolutions. Lower resolutions mean greater confounding between effects. Plackett-Burman designs can assess more factors than fractional factorials with the same number of runs. Fractional designs allow screening of many factors efficiently while sequential or folded designs provide more information.
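A half fraction is built by generating one factor from an interaction of the others. A sketch of a 2^(3-1) design with generator C = AB, which confounds C with the AB interaction (a resolution III design):

```python
from itertools import product

# 2^(3-1) half fraction: full factorial in A and B, with C generated as C = A*B.
# The generator confounds the main effect of C with the AB interaction.
design = [(a, b, a * b) for a, b in product((-1, +1), repeat=2)]
for run in design:
    print(run)
```

Four runs instead of eight; the defining relation I = ABC (every run satisfies a*b*c = +1) is exactly what determines which effects are confounded.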
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W1 Rifle Shot Studies (J. García-Verdugo)
One part of this training involves structuring process improvement efforts. Successful completion of the Measure and Analyze phases is key, as it builds the foundation for improvement. "Rifle Shot Studies" should concentrate on one or two input variables highly related to the problem. They are completed as small, fast pilot studies in less than a week to evaluate factors and build foundations for later decisions. Executed early in a project, "Rifle Shot Studies" help unify the team by demonstrating the value of its ideas and making decisions easier through established facts.
Six Sigma Mechanical Tolerance Analysis 1 (David Panek)
David A. Panek has 18 years of experience in cost engineering. He has expertise in tolerance analysis, Monte Carlo techniques, cost estimation, and neural costing. The document discusses different methods for tolerance analysis including worst case, statistical, and six sigma approaches. It also defines terms related to process variation and discusses measures of process capability like Cp and Cpk. Guidelines are provided for designing optimized tolerances through establishing process standard deviation and computing probabilities to achieve tight assembly gaps.
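The worst-case and statistical (root-sum-square) stack-ups compared in the document differ only in how part tolerances are combined; a sketch with invented tolerances:

```python
import math

# Invented linear stack of four part tolerances (plus/minus values, same units).
tolerances = [0.10, 0.05, 0.08, 0.12]

worst_case = sum(tolerances)                      # every part at its limit at once
rss = math.sqrt(sum(t ** 2 for t in tolerances))  # statistical (root-sum-square) stack

print(f"worst case = ±{worst_case:.3f}, statistical (RSS) = ±{rss:.3f}")
```

The statistical stack is markedly tighter because it assumes independent, centered variation rather than all parts at their limits simultaneously, which is what lets designers pursue tighter assembly gaps.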
Javier Garcia-Verdugo Sanchez - Six Sigma Training - W2 Measurement System ... (J. García-Verdugo)
This document discusses measurement system analysis for continuous measurements. It introduces the Gage R&R study as a tool to assess measurement systems. Key indices for evaluating measurement systems are the Percentage of Tolerance (P/T) and the Percentage of Repeatability and Reproducibility (%R&R). P/T assesses how much of the specification tolerance is consumed by measurement error, while %R&R evaluates measurement error relative to total process variation. The document provides guidelines for properly conducting a Gage R&R study and interpreting its results.
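The two indices can be computed directly from the study's variance estimates. The standard deviations and limits below are invented, and acceptance thresholds vary by source:

```python
# Invented standard deviations estimated from a Gage R&R study.
sigma_ms = 0.05          # measurement system (repeatability + reproducibility)
sigma_total = 0.30       # total observed variation (process + measurement)
usl, lsl = 11.0, 9.0     # specification limits

pt_ratio = 6 * sigma_ms / (usl - lsl) * 100   # Percentage of Tolerance (P/T)
pct_rr = sigma_ms / sigma_total * 100         # %R&R against total variation

print(f"P/T = {pt_ratio:.1f}%, %R&R = {pct_rr:.1f}%")
```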
The document summarizes an intern's work on an EMS engineering internship. It discusses their tasks analyzing transformer tap positions and voltage/VAR performance across different utility areas, identifying issues affecting state estimator residuals. Notable findings included improved residuals from setting transformer taps to nominal values and enabling tap estimation. However, tap estimation worsened residuals for some utilities with low observability. The intern highlighted areas for further model and measurement improvements and automated their analysis for future use.
The document discusses various facility layout strategies and concepts. It defines facility layout as determining the placement of departments, workgroups, machines, and stock areas. Key layout formats discussed include process layout, product layout, group technology layout, and fixed-position layout. Assembly line balancing concepts are also covered, including precedence diagrams and determining cycle times and workstation loads.
This document discusses adapting kanban methods to fixed-size projects. It proposes using earned value management (EVM) techniques like tracking planned value, earned value, and actual costs to monitor project progress. Cumulative flow diagrams (CFDs) are recommended to visualize work flow and ensure the team is delivering at the desired throughput. When scope changes occur, multiple options should be evaluated using CFDs to determine their impact on capacity, timeline, and planned value before agreeing on an approach with stakeholders.
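The EVM quantities mentioned above combine into simple performance indices; a sketch with invented figures:

```python
# Invented earned value snapshot for a fixed-scope project.
planned_value = 50_000.0   # budgeted cost of work scheduled to date
earned_value = 45_000.0    # budgeted cost of work actually completed
actual_cost = 48_000.0     # money actually spent to date

spi = earned_value / planned_value   # schedule performance index (< 1: behind schedule)
cpi = earned_value / actual_cost     # cost performance index (< 1: over budget)

print(f"SPI = {spi:.2f}, CPI = {cpi:.2f}")
```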
This document provides guidance on validating a rating curve using graphical and numerical tests. It describes plotting new gauging data against the existing rating curve and confidence limits to check fit. Graphical tests illustrated include period-flow deviation scattergrams, stage-flow deviation diagrams, and cumulative deviation plots to identify bias over time. Numerical tests like Student's t-test are also described to statistically compare new gaugings to the original dataset and determine if the rating curve should be recalculated. The document provides an example validation analysis for a station in Pargaon.
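The numerical check can be sketched as a one-sample t statistic on the deviations of new gaugings from the existing curve, testing against a mean deviation of zero (the data below are invented):

```python
from math import sqrt
from statistics import mean, stdev

# Invented percentage deviations of new gaugings from the existing rating curve.
deviations = [1.2, -0.8, 2.1, 1.5, 0.9, 1.8, -0.2, 1.1]

# One-sample t statistic against a mean deviation of zero (no bias).
n = len(deviations)
t = mean(deviations) / (stdev(deviations) / sqrt(n))
print(f"n = {n}, t = {t:.2f}")
# Compare |t| with the critical value for n - 1 degrees of freedom from t tables;
# a large |t| suggests systematic bias, i.e. the rating curve may need recalculation.
```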
A Visual Guide to Design of Experiments using Quantum XL (Ramon Balisnomo)
An introductory course on developing transfer functions through Design of Experiment (DOE) using the statistical software Quantum XL (QXL). The presentation's purpose is to: (1) give practical advice on the trade-off between the required number of experiments and the accuracy of the transfer function; (2) showcase the user interface for Quantum XL, which integrates DOE and Monte Carlo seamlessly in one package.
Study of Surface Roughness measurement in turning of EN 18 steel (IRJET Journal)
This document presents a study that uses response surface methodology to optimize surface roughness in the turning of EN 18 steel. Experimental work was conducted using a CNC lathe machine with spindle speed, feed rate, and depth of cut as input variables. A central composite design and Design Expert software were used to develop a test plan and analyze results. Regression equations were developed relating surface roughness to the input parameters. Confirmation experiments found less than 2.32% error between predicted and experimental surface roughness values, validating the developed model. Optimal parameters were identified as 1740.68 rpm spindle speed, 0.82 mm/min feed rate, and 1.27 mm depth of cut to achieve minimum surface roughness.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W2 Non Normal DataJ. García - Verdugo
This document discusses evaluating and transforming non-normal distributed data sets to normal distributions. It describes the Box-Cox and Johnson transformations which can be used to normalize data. The Box-Cox tries to directly transform data while Johnson "distorts" a normal distribution to model the data distribution. The document provides examples applying the Johnson transformation to uniform and non-normal data sets. Graphical analyses show the transformations successfully produce normal distributions for analysis and process capability evaluations.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W2 Chi Square TestJ. García - Verdugo
The Chi Square test can be used for three purposes: testing goodness of fit, testing independence, and testing homogeneity. It is used to determine if a sample comes from a known distribution, if two characteristics are independent of each other, or if multiple samples come from the same population. The test calculates an observed Chi Square value and compares it to a critical value from Chi Square tables to determine if the null hypothesis can be rejected or not. For a goodness of fit test on coin flipping data, the observed Chi Square value exceeded the critical value, indicating the coin was likely manipulated rather than fair.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W2 Multi - vari StudiesJ. García - Verdugo
This document discusses procedures for conducting and analyzing multivariable studies. It begins by providing an overview and explaining that multivariable studies examine how multiple factors interact and influence process outputs. The document then discusses planning a study, collecting data, analyzing the data, and reporting results. It provides an example study looking at contamination levels and examines the effects of factors like day, shift, and time using tools like ANOVA. The results indicate shift has a significant impact on contamination levels.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W2 Design Of Experimen...J. García - Verdugo
The document provides an introduction to design of experiments (DOE). It discusses key components of experiments including input variables, output variables, experimental design, and validity. It also covers 2k factorial designs, which introduce factors and their levels. 2k factorial experiments are presented as an uncomplicated way to start with DOE techniques. The final pages provide examples of 2k factorial design matrices and calculating effects to determine influential factors.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Taguchi Robust DesignsJ. García - Verdugo
The document describes Taguchi's method for robust design. It discusses using design of experiments to create robust products and processes by minimizing variation from the target value. The key aspects are:
- Taguchi proposed applying DOE techniques to optimize settings for factors influencing a process to make it robust against noise factors like materials and environment.
- Taguchi advocated designing processes to be "on-target" rather than just meeting specifications to reduce costs from variation.
- An example application looks at factors influencing the size of ceramic pieces after baking, with the furnace position as the noise factor. Analysis of the experiment aims to find settings minimizing output variation.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 Sample Size J. García - Verdugo
A sample size that is too small increases the risks of overlooking important effects and detecting effects that are not truly present. With a larger sample size, the risks decrease but costs and time increase. The key factors in determining sample size are the desired power, significance level, expected effect size, and standard deviation. Sample size calculators can then determine the necessary sample for a given hypothesis test based on specifying values for these factors.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W32 DOE Center Points...J. García - Verdugo
The document discusses using center points in two-level factorial designs to check for linearity. It provides an example of a chemical engineer running a 2x2 design on reaction time and temperature and adding center points. The results show the center points do not significantly deviate from linearity, indicating the model is linear. A second example shows center points having a significant effect, suggesting curvature. The document also discusses incorporating block factors into a design, including an example of using two types of catalysts as block factors.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 Median Tests J. García - Verdugo
This document discusses median tests that can be used as alternatives to the analysis of variance (ANOVA) when its assumptions are violated. It describes Mood's median test and the Kruskal-Wallis test, indicating that Mood's test is robust against outliers while Kruskal-Wallis is more robust against unequal distributions. An example analyzes reject rates from a pneumatic module test using both tests to support the conclusions from an initial ANOVA.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W2 Correlation and Reg...J. García - Verdugo
The document discusses correlation and regression analysis. It provides an overview of key concepts like the regression coefficient, correlation coefficient, and fitted line plots. It also describes how to calculate regression using the method of least squares and how to validate factors using tools like t-tests, ANOVA, and regression. An example is shown analyzing the relationship between softening temperature measured at a supplier vs. a customer. The correlation between the two factors is calculated to be 0.834, indicating a strong positive correlation.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 Fractional Factoria...J. García - Verdugo
This document discusses fractional factorial designs, which reduce the number of experimental runs needed compared to full factorial designs. It provides examples of fractional factorial designs with different numbers of factors and resolutions. Lower resolutions mean greater confounding between effects. Plackett-Burman designs can assess more factors than fractional factorials with the same number of runs. Fractional designs allow screening of many factors efficiently while sequential or folded designs provide more information.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W1 Rifle Shot StudiesJ. García - Verdugo
One part of this training involves structuring process improvement efforts. Successful completion of the measure and analysis phase is key, as it builds the foundation for improvement. "Rifle Shot Studies" should concentrate on one or two input variables highly related to the problem. They are completed in small, fast pilot studies in less than a week to evaluate factors and build foundations for later decisions. "Rifle Shot Studies" executed early in a project help the team become unified by demonstrating the value of their ideas and making decisions easier with established facts.
Six Sigma Mechanical Tolerance Analysis 1David Panek
David A. Panek has 18 years of experience in cost engineering. He has expertise in tolerance analysis, Monte Carlo techniques, cost estimation, and neural costing. The document discusses different methods for tolerance analysis including worst case, statistical, and six sigma approaches. It also defines terms related to process variation and discusses measures of process capability like Cp and Cpk. Guidelines are provided for designing optimized tolerances through establishing process standard deviation and computing probabilities to achieve tight assembly gaps.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W2 Measurement System ...J. García - Verdugo
This document discusses measurement system analysis for continuous measurements. It introduces the Gage R&R study as a tool to assess measurement systems. Key indices for evaluating measurement systems are the Percentage of Tolerance (P/T) and Percentage of Range and Repeatability (%R&R). P/T assesses how much of the specification tolerance is used by measurement error while %R&R evaluates measurement error relative to total process variation. The document provides guidelines for properly conducting a Gage R&R study and interpreting its results.
The document summarizes an intern's work on an EMS engineering internship. It discusses their tasks analyzing transformer tap positions and voltage/VAR performance across different utility areas, identifying issues affecting state estimator residuals. Notable findings included improved residuals from setting transformer taps to nominal values and enabling tap estimation. However, tap estimation worsened residuals for some utilities with low observability. The intern highlighted areas for further model and measurement improvements and automated their analysis for future use.
The document discusses various facility layout strategies and concepts. It defines facility layout as determining the placement of departments, workgroups, machines, and stock areas. Key layout formats discussed include process layout, product layout, group technology layout, and fixed-position layout. Assembly line balancing concepts are also covered, including precedence diagrams and determining cycle times and workstation loads.
This document discusses adapting kanban methods to fixed-size projects. It proposes using earned value management (EVM) techniques like tracking planned value, earned value, and actual costs to monitor project progress. Cumulative flow diagrams (CFDs) are recommended to visualize work flow and ensure the team is delivering at the desired throughput. When scope changes occur, multiple options should be evaluated using CFDs to determine their impact on capacity, timeline, and planned value before agreeing on an approach with stakeholders.
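The earned value management quantities mentioned above reduce to a few arithmetic identities; the status-date figures below are made-up example values.

```python
# EVM status snapshot: planned value (PV), earned value (EV),
# and actual cost (AC) at a reporting date (illustrative numbers)
pv, ev, ac = 100_000, 80_000, 90_000

sv, cv = ev - pv, ev - ac        # schedule / cost variance (negative = behind / over budget)
spi, cpi = ev / pv, ev / ac      # performance indices (< 1 means behind schedule / over cost)
print(f"SV={sv} CV={cv} SPI={spi:.2f} CPI={cpi:.2f}")
```

Here the team has delivered 80k of planned 100k value at a cost of 90k, so both indices sit below 1, signalling the kind of deviation a CFD review would then investigate.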
This document provides guidance on validating a rating curve using graphical and numerical tests. It describes plotting new gauging data against the existing rating curve and confidence limits to check fit. Graphical tests illustrated include period-flow deviation scattergrams, stage-flow deviation diagrams, and cumulative deviation plots to identify bias over time. Numerical tests like Student's t-test are also described to statistically compare new gaugings to the original dataset and determine if the rating curve should be recalculated. The document provides an example validation analysis for a station in Pargaon.
A Visual Guide to Design of Experiments using Quantum XLRamon Balisnomo
An introductory course on developing transfer functions through Design of Experiment (DOE) using the statistical software Quantum XL (QXL). The presentation's purpose is to: (1) give practical advice on the trade-off between the required number of experiments and the accuracy of the transfer function; (2) showcase the user interface for Quantum XL, which integrates DOE and Monte Carlo seamlessly in one package.
Study of Surface Roughness measurement in turning of EN 18 steelIRJET Journal
This document presents a study that uses response surface methodology to optimize surface roughness in the turning of EN 18 steel. Experimental work was conducted using a CNC lathe machine with spindle speed, feed rate, and depth of cut as input variables. A central composite design and Design Expert software were used to develop a test plan and analyze results. Regression equations were developed relating surface roughness to the input parameters. Confirmation experiments found less than 2.32% error between predicted and experimental surface roughness values, validating the developed model. Optimal parameters were identified as 1740.68 rpm spindle speed, 0.82 mm/min feed rate, and 1.27 mm depth of cut to achieve minimum surface roughness.
Simply Thank You is a company that specializes in designing reward and recognition programs to boost engagement for both customers and employees. They offer a variety of reward types including gift cards, points systems, and tangible gifts that can be mixed and matched to create customized programs. With over 20 years of experience designing successful programs, Simply Thank You helps large businesses and SMEs align their engagement strategies with their brands and goals.
Our Pace EMBA consulting group has been hired to help PepsiCo develop strategies to revitalize the hydration beverage category and to recommend a new product in this category based on consumer research. We researched hydration industry trends, consumer perceptions of PepsiCo hydration brands, consumers' perceived needs in this category, consumer behavior with respect to hydration products, and competing products and companies. From this research, here are our recommendations to PepsiCo regarding the company's hydration strategy.
G-CLOUD: ARCHITECTURE, INFORMATION SECURITY, AND COMPLIANCE WITH LEGISLATI...ActiveCloud
Vyacheslav Aksyonov
Information security systems architect in the cloud solutions development and implementation department at ActiveCloud
The use of cloud computing technologies for government services is becoming a worldwide trend. More than twenty European countries use, or plan to use, cloud computing technologies in the public sector.
The document proposes changes to improve Minitab's customer support system. It suggests splitting support into two sections: Support for downloads/licensing/macros and a Knowledgebase/FAQ. It recommends displaying top FAQs first before allowing users to search help or call support. The layout would be updated with clearer naming and headings to help customers more easily find the information and support options they need.
This document provides an overview of topics covered in a Lean Six Sigma Green Belt workshop, including potential project pitfalls, defining a project charter, risk analysis, cost-benefit analysis, and go/no-go decision making. It also discusses process mapping and selecting critical-to-quality characteristics. The document contains examples and definitions to explain process mapping techniques and the components and purpose of process maps.
This document discusses factorial design for pharmaceutical experiments. It defines factorial design as an experiment where two or more factors are each studied at different levels or values. The document then describes different types of factorial designs, including full factorial designs with two or three levels, and fractional factorial designs used when there are more than five factors. It also explains how factors and levels are coded numerically for the experiments.
This document presents a course on the statistical program Minitab 15. It explains the program's features, such as its ease of use, graphics, and statistical and quality tools. It also describes how to create projects, worksheets, and graphs, and how to perform data analyses such as descriptive statistics and one-way ANOVA, including interpretation of the results.
The document discusses fractional factorial designs, which use a fraction of the total number of combinations in a full factorial design to reduce the number of required runs. It describes how effects become confounded in fractional designs and how design resolution relates to confounding. It provides examples of 2-level and 3-level fractional factorial designs, and discusses other types of designs like Plackett-Burman, central composite, and Taguchi designs. The key benefits of fractional factorial designs are reducing the number of required runs when there are many factors to investigate.
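The run-saving idea behind fractional factorials can be made concrete with a small sketch: a 2^(4-1) half-fraction is built from a full 2^3 design in A, B, C, with the fourth factor aliased through the generator D = ABC (defining relation I = ABCD, resolution IV). This is a generic textbook construction, not a design taken from the document.

```python
from itertools import product

# Build the 8 runs of a 2^(4-1) half-fraction (coded levels -1/+1)
runs = []
for a, b, c in product([-1, 1], repeat=3):
    d = a * b * c          # generator: D is confounded with the ABC interaction
    runs.append((a, b, c, d))

for run in runs:
    print(run)
print(len(runs), "runs instead of", 2**4)   # 8 runs instead of 16
```

Because every run satisfies A·B·C·D = +1, the main effect of D cannot be separated from the ABC interaction; this is precisely the confounding that design resolution quantifies.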
Advanced DOE with Minitab (presentation in Costa Rica)Blackberry&Cross
This document describes using a split-plot design for a wind tunnel experiment to optimize the aerodynamic performance of a racecar. The experiment had 4 factors, with 2 that were hard-to-change (front and rear ride heights) and 2 that were easy-to-change (yaw angle and grill cover). A split-plot design was used to reduce the total time needed, collecting data from 45 runs over 10 hours instead of 36 runs over 30 hours. The analysis accounted for two sources of error and showed several significant factors for improving downforce and reducing drag.
2.0 Introduction
2.1 Objectives
2.2 Meaning of Descriptive Statistics
2.3 Organisation of Data
2.3.1 Classification
2.3.1.1 Frequency Distribution with Ungrouped and Grouped Data
2.3.1.2 Types of Frequency Distribution
2.3.2 Tabulation
2.3.3 Graphical Presentation of Data
2.3.3.1 Cumulative Frequency Curve or Ogive
2.3.4 Diagrammatic Presentation of Data
2.4 Summarisation of Data
2.4.1 Measures of Central Tendency
2.4.2 Measures of Dispersion
2.4.3 Skewness and Kurtosis
2.4.4 Advantages and Disadvantages of Descriptive Statistics
2.5 Meaning of Inferential Statistics
2.5.1 Estimation
2.5.2 Point Estimation
2.5.3 Interval Estimation
2.6 Hypothesis Testing
2.6.1 Statement of Hypothesis
2.6.2 Level of Significance
2.6.3 One Tail and Two Tail Test
2.7 Errors in Hypothesis Testing
2.7.1 Type I Error
2.7.2 Type II Error
2.7.3 Power of a Test
2.8 General Procedure for Testing a Hypothesis
This document provides an introduction to using Minitab statistical software. It outlines the Minitab layout, menus, and some basic tools. Specifically, it discusses the file, edit, data, calc, stat, and graph menus. It provides an example using the data and calculator tools to calculate total defects by summing defect columns. The goal is to familiarize users with navigating Minitab and using some common tools.
Walle produces a full range of hydraulic seals, categorized by purpose, including rotary seals, all kinds of O-rings (including FKM O-rings), step seals, glyd rings, oil seals, various combination seals, gaskets, spring-energised seals, food machinery seals, and more.
This document provides an overview of statistical quality control techniques. It describes the three main categories of statistical quality control as statistical process control, descriptive statistics, and acceptance sampling. Control charts are introduced as a key tool of statistical process control, and the differences between variable and attribute control charts are explained. Process capability, six sigma methodology, and acceptance sampling plans are also overviewed.
New Clustering-based Forecasting Method for Disaggregated End-consumer Electr...Peter Laurinec
This paper presents a new method for forecasting the load of individual electricity consumers using smart grid data and clustering. Data from all consumers are clustered to create more suitable training sets for the forecasting methods. Before clustering, the time series are efficiently preprocessed by normalisation and by computing time-series representations with a multiple linear regression model. The final centroid-based forecasts are rescaled using the saved normalisation parameters to produce a forecast for every consumer. The method is compared with the approach that builds a separate forecast for each consumer. Evaluation and experiments were conducted on two large smart meter datasets, from residences in Ireland and factories in Slovakia.
The results show that the clustering-based method improves forecasting accuracy and reduces large (maximum) errors. It is also more scalable, since a model does not have to be trained for every consumer.
The document discusses using Six Sigma methodology to reduce in-process rejections at a manufacturing unit producing wheel cylinders. It analyzes the main causes of rejection, implements solutions, and measures the results. The key defects causing rejection were identified as main bore shift and M10 damage. Solutions like correcting hydraulic leaks, improving clamping, and modifying casting processes reduced monthly rejections from 205 to 15 and increased the sigma rating from 2.92 to 3.62, saving approximately $42,780 annually.
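The sigma ratings quoted above follow the standard conversion from defects per million opportunities (DPMO) to a long-term sigma level with the conventional 1.5-sigma shift. A minimal sketch, using the classic benchmark DPMO values rather than the document's own production data:

```python
from statistics import NormalDist

def sigma_level(dpmo):
    # Long-term sigma rating: normal quantile of the yield,
    # plus the conventional 1.5-sigma long-term shift
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

print(round(sigma_level(66_807), 2))   # ~3.0 (classic "three sigma" DPMO)
print(round(sigma_level(3.4), 2))      # ~6.0 (classic "six sigma" DPMO)
```

Feeding in the plant's before-and-after defect rates in the same way is how a rejection drop like 205 to 15 per month translates into a higher sigma rating.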
This document provides an overview and guidelines for portraying accounting postings in a T-account format simulation with automated summarization and reconciliation. It outlines preparing the general ledger accounts, cost elements, and cost objects to be used, then sketching the postings by entering references to debit and credit the applicable accounts. The postings are depicted in separate views for general ledger accounts/cost elements and cost objects for clarity.
Mba om 14_statistical_qualitycontrolmethodsNiranjana K.R.
This document provides an overview of statistical quality control techniques including:
- Describing categories of statistical quality control and how to measure quality characteristics.
- Explaining sources of variation, process capability, and how to set control limits for control charts.
- Detailing different types of control charts for variables and attributes including x-bar, R, p, and c charts.
- Defining three sigma and six sigma process capability and how they relate to acceptable defect levels.
- Discussing challenges in measuring quality in service organizations and potential metrics that could be monitored.
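The x-bar and R control limits listed above can be sketched with the standard table constants for subgroups of size n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114); the subgroup measurements below are made-up example data.

```python
# Three subgroups of five measurements each (illustrative values)
subgroups = [[5.02, 5.01, 4.99, 5.00, 5.03],
             [4.98, 5.00, 5.02, 4.99, 5.01],
             [5.01, 5.03, 5.00, 4.98, 5.02]]

A2, D3, D4 = 0.577, 0.0, 2.114                    # constants for n = 5

xbars = [sum(g) / len(g) for g in subgroups]      # subgroup means
ranges = [max(g) - min(g) for g in subgroups]     # subgroup ranges
xbarbar = sum(xbars) / len(xbars)                 # grand mean (center line)
rbar = sum(ranges) / len(ranges)                  # average range

ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar
print(f"x-bar chart: LCL={lcl_x:.4f}  CL={xbarbar:.4f}  UCL={ucl_x:.4f}")
print(f"R chart:     LCL={lcl_r:.4f}  CL={rbar:.4f}  UCL={ucl_r:.4f}")
```

The A2/D3/D4 shortcut estimates process spread from the average range, which is why variable control charts can be maintained on the shop floor without computing a standard deviation.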
Lecture6 Chapter3- Function Simplification using Quine-MacCluskey Method.pdfUmerKhan147799
This document discusses function minimization using the Quine-McCluskey algorithm. It begins with the objectives of simplifying functions in Sum-of-Products form using this algorithm. The 4 steps of the algorithm are then described: 1) generate prime implicants, 2) construct a prime implicant table, 3) reduce the table by removing essential prime implicants and applying column/row dominance, 4) solve the table using Petrick's method or branching. An example problem is worked through, generating 10 prime implicants and reducing the table to find the essential prime implicants and a minimum solution.
Managing Agile Software Development QuantitativelyJayGray
This document discusses quantitatively managing agile software development. It proposes using readily available data to benchmark agile team performance and optimize agile release contracts. The first paper explains how to benchmark agile releases. The second discusses optimizing agile release timeboxes based on team performance and constraints. The third explains managing dynamic changes to the agile "contract" using statistical process control. It advocates continuously updating benchmarks to help government agencies objectively evaluate supplier performance and negotiate contracts.
The document discusses estimating the parameters of a Cobb-Douglas production function, Q = A·K^α·L^β, using econometric methods. Least squares regression yields the estimates α = 0.206925 and β = 0.952008, and changes in K and L explain 95% of the variation in Q. Finally, it determines that the industry exhibits increasing returns to scale, since α + β is greater than 1.
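Because the Cobb-Douglas form is linear in logarithms (ln Q = ln A + α·ln K + β·ln L), the elasticities can be estimated by ordinary least squares on logged data. A minimal sketch on synthetic, noise-free data with assumed parameters (not the document's dataset), which OLS recovers exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
K = rng.uniform(10, 100, 50)                 # synthetic capital inputs
L = rng.uniform(10, 100, 50)                 # synthetic labour inputs
A, alpha, beta = 2.0, 0.21, 0.95             # assumed "true" parameters
Q = A * K**alpha * L**beta                   # noise-free Cobb-Douglas output

# OLS on the log-linear form: ln Q = ln A + alpha*ln K + beta*ln L
X = np.column_stack([np.ones_like(K), np.log(K), np.log(L)])
coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
lnA_hat, alpha_hat, beta_hat = coef

print(round(alpha_hat, 3), round(beta_hat, 3))   # recovers 0.21 and 0.95
print("returns to scale:",
      "increasing" if alpha_hat + beta_hat > 1 else "constant or decreasing")
```

With α + β ≈ 1.16 > 1 the synthetic industry shows increasing returns to scale, mirroring the conclusion drawn in the document.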
This document summarizes quality analysis data from the machine shop (HMC) of a leading manufacturing company. It includes process flow charts and data on turning, facing, milling and drilling operations with measurement charts and capability analysis. Key findings are that the current process efficiency is 55.24% and sigma level is 3.2, below the company's ISO standards. The document recommends upgrading machinery, implementing computer numerical control, hiring skilled workers, and following six sigma to improve quality and efficiency.
This document summarizes quality analysis results from Machine Shop (HMC) including turning, facing, milling and drilling data. Key findings are:
- Process efficiency is 55.24% with most time spent on operation (79 minutes) and inspection (30 minutes).
- Capability analysis shows Cp and Cpk values ranging from 0.574 to 1.66, indicating some processes are capable while others need improvement.
- Total monthly costs are 284,000 rupees with highest for electricity (150,000 rupees).
- Current sigma level is 3.2 based on defect parts per million, below the company's ISO 9001 certification level.
- Recommendations
The document discusses various aspects of manufacturing plant design including plant layouts, scheduling, and inventory management. It provides details on different types of plant layouts such as functional, product, and cellular layouts. It also covers topics like single machine scheduling, parallel machine scheduling, critical path analysis, economic order quantity, and master production scheduling. Plant location selection and material requirement planning are analyzed through examples and formulas. The document is an in-depth reference on manufacturing design and production planning concepts.
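The economic order quantity mentioned above follows the classic square-root formula; the demand and cost figures below are illustrative assumptions.

```python
import math

def eoq(D, S, H):
    # Economic order quantity: D = annual demand (units),
    # S = ordering cost per order, H = annual holding cost per unit
    return math.sqrt(2 * D * S / H)

print(eoq(12_000, 50, 3))   # ~632.5 units per order
```

The formula balances ordering cost against holding cost: doubling demand raises the optimal order size only by a factor of √2, not 2.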
This document provides an outline for reviewing key concepts in production and operations management. It covers topics such as organizing the production function, productivity and forecasting methods, facility planning, quality management, inventory management, and emerging trends like supply chain management, lean operations, and total quality management. Examples are provided to illustrate productivity improvements at Starbucks and Taco Bell. Forecasting techniques like moving average, weighted average, and regression analysis are defined. The critical path method and Gantt charts are explained as project management tools.
This document provides a case study and questions about the bottled water market. Key points include:
- Demand and supply graphs are used to show the impact of increases in gasoline prices on the equilibrium price and quantity of bottled water.
- Elasticity is discussed, showing how a more inelastic demand curve results in a smaller change in equilibrium quantity compared to a more elastic curve.
- Regression analysis is performed to estimate demand equations for summer and winter. Supply is also estimated.
- Graphs are drawn and analyzed to illustrate changes in demand vs. quantity demanded and changes in supply vs. quantity supplied from changes in input prices like gasoline.
- Simplified demand and supply equations are
This document summarizes key aspects of thermo compression bonding (TCB) technology from the perspective of a machine vendor. It discusses three core TCB capabilities that are essential for high yield: accuracy, co-planarity, and sophisticated bond control. Maintaining high accuracy during temperature ramping from cold to hot states is critical. Co-planarity must also be maintained during temperature ramping to avoid yield issues. Sophisticated hybrid bond control, as used in the TC-CUF process, tightly controls bond line thickness despite thermal expansion movements.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Multiple RegressionJ. García - Verdugo
The document discusses multiple regression analysis to model the relationship between inputs and outputs of a process. An example is provided of using measurements from 8 X-ray tubes to determine which input parameters influence the width, length, and unbalancedness of the focal spot, which are the critical outputs. Regression analysis indicates that ambient temperature and humidity define the necessary evaporation temperature to control cooling water usage. The analysis allows determining the required air amount based on water temperature to maintain ammonia loss between set limits.
Similar to Javier Garcia - Verdugo Sanchez - Six Sigma Training - W1 Minitab - Graphical Methods
The document discusses the concept of poka yoke, which are error-proofing devices used in manufacturing to prevent defects. It describes Shigeo Shingo's definition of poka yoke as preventing inadvertent mistakes. It also discusses different types of inspection processes and how poka yoke aims to eliminate defects by detecting errors early in the production process through the use of simple error-proofing devices built into operations. Shingo's method uses poka yoke systems and devices to achieve zero defects, zero waste, and zero delays in production.
Javier Garcia - Verdugo Sanchez - Trabajo en equipo y dirección de reunionesJ. García - Verdugo
This document describes work teams and how they function effectively. It explains that teams are groups of people committed to a common objective who work interdependently. It details the characteristics of an effective team, including establishing norms for communication, decision making, and use of time. It also explains that effective teams run their meetings in cycles that include defining objectives and giving feedback.
Javier Garcia - Verdugo Sanchez - The 8D (Eigth Disciplines) MethodologyJ. García - Verdugo
This document provides an overview of the 8-discipline (8-D) problem solving methodology. It discusses the origins of the 8-D system in a US military standard from 1974. The 8-D process uses a team-based approach and eight disciplines to identify and address problems, with the goals of preventing recurrences and achieving continuous improvement. The document also references additional files and graphics that are included to help explain 8-D concepts.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Autocorrelation and...J. García - Verdugo
The document discusses autocorrelation and cross correlation analysis of time series data. It provides an example of measuring daily body weight over 4 weeks and finds autocorrelation at a lag of 1 day. This indicates dependence between successive daily measurements. The document also analyzes viscosity measurements taken hourly and finds autocorrelation up to a lag of 4 hours. An autoregressive model is fitted to account for this autocorrelation. Finally, the document examines cross correlation between methane feed rate and CO2 concentration measurements taken minute-by-minute. The largest correlation is found at a lag of -1 minute, suggesting the CO2 is affected by methane feed rate from the previous minute.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Reliability J. García - Verdugo
The document discusses reliability, including definitions of reliability, reliability phases, reliability importance, reliability calculations for serial and parallel systems, and Weibull analysis. Reliability is defined as the probability that a product or system will function as intended without failure over a specified period of time. There are generally three failure phases: infant mortality with early high failure rates, random failures, and wear out with increasing failure rates over time. Reliability is important for customers, cost savings, and competitiveness. Calculations can determine the reliability of serial and parallel systems based on component reliabilities. Weibull analysis involves plotting failure data to determine the appropriate failure distribution.
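The serial and parallel system calculations mentioned above reduce to two product formulas; the component reliabilities below are illustrative values.

```python
from math import prod

def series_reliability(rs):
    # A series system works only if every component works
    return prod(rs)

def parallel_reliability(rs):
    # A parallel (redundant) system fails only if all components fail
    return 1 - prod(1 - r for r in rs)

print(series_reliability([0.95, 0.95, 0.95]))    # ~0.857
print(parallel_reliability([0.95, 0.95, 0.95]))  # ~0.999875
```

Three 95%-reliable components in series are noticeably worse than any single one, while the same components in parallel push system reliability close to certainty, which is why redundancy is the standard remedy for critical functions.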
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Monte Carlo Simulat...J. García - Verdugo
This document describes using Monte Carlo simulation to analyze variations in manufacturing processes and electrical circuits. It discusses generating random input variables based on their distributions, calculating output results using equations, and analyzing the output distribution to determine process capability and variation. An example simulates variations in five stacked metal parts and calculates the total dimension distribution. Another simulates variations in resistor values in an electrical circuit and calculates the distribution of the output voltage.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Lean IntroJ. García - Verdugo
The document provides an overview of lean manufacturing principles and concepts. It discusses the history and evolution of manufacturing approaches from hand crafting to mass production to lean. Key lean concepts are defined, including just-in-time, single piece flow, takt time, cycle time, visual controls, 5S, waste elimination and pull vs push systems. The phases of a typical lean project are outlined as define, measure, analyze, improve and control. Overall the document serves as an introductory guide to lean manufacturing principles, tools and terminology.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Statistical Toleran...J. García - Verdugo
The document discusses statistical tolerance analysis and six sigma tolerancing. It covers topics like worst case tolerancing, root sum of squares tolerancing, statistical tolerancing for linear and non-linear applications. Various tolerance analysis methods like worst case analysis, statistical tolerance analysis and vector analysis are described. Examples of calculating tolerances for assembly gaps and performing statistical tolerance analysis on automotive brake disk assembly are presented.
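The difference between worst-case and root-sum-of-squares (RSS) tolerancing mentioned above can be shown on a simple linear stack; the five part tolerances below are made-up example values.

```python
import math

# Stack of five part tolerances (illustrative +/- values, linear stack-up)
tols = [0.10, 0.05, 0.08, 0.05, 0.12]

worst_case = sum(tols)                      # every part at its limit simultaneously
rss = math.sqrt(sum(t**2 for t in tols))    # statistical (root sum of squares)

print(f"worst case: +/-{worst_case:.3f}")
print(f"RSS:        +/-{rss:.3f}")
```

RSS gives a noticeably tighter stack-up than worst case because it assumes the part deviations are independent and unlikely to all reach their limits at once; that assumption is exactly what six sigma tolerancing then adjusts for shifted, non-centered processes.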
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 Analysis of CovariatesJ. García - Verdugo
This document discusses the analysis of covariates to account for variation from uncontrollable factors in experimental designs. It provides an example of using filament diameter as a covariate in an experiment investigating the effect of production lines on tensile strength. Accounting for diameter significantly increases the amount of variation explained. The document also examines using locator pin position as a covariate in an experiment on printed circuit board assembly.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 The Binary Logistic...J. García - Verdugo
The document discusses binary logistic regression and provides an example. It analyzes data from a study of 100 men investigating the relationship between age and risk of coronary heart disease. Logistic regression is used to estimate the effect of age on the probability of disease. The analysis finds that for each one year increase in age, the odds of disease increase by 13% (odds ratio of 1.13).
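The reported odds ratio relates to the logistic regression coefficient through the exponential: OR = e^β per unit increase of the predictor. A minimal sketch, back-deriving the coefficient from the stated odds ratio of 1.13 for illustration (not from the study's raw data):

```python
import math

beta = math.log(1.13)                  # coefficient implied by OR = 1.13 per year

print(round(math.exp(beta), 2))        # 1.13 -> odds ratio per one year of age
print(round(math.exp(10 * beta), 2))   # ~3.39 -> odds multiply over a decade
```

Because effects are multiplicative on the odds scale, a 13% increase per year compounds to more than a tripling of the odds over ten years.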
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W4 StarterJ. García - Verdugo
This document outlines the agenda and expectations for a Six Sigma training workshop taking place over the course of a week. The workshop will focus on teaching tools used in the Six Sigma DMAIC process for defining, measuring, analyzing, improving, and controlling processes. Participants will review work from the previous week and learn new tools such as regression analysis, design of experiments, statistical tolerancing, and reliability analysis. Time will also be spent on project reviews where teams will report their progress, results, and next steps. The goals are for participants to learn how to apply these tools in their work to drive improvements and financial benefits in their projects.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 QFD Customer Requir...J. García - Verdugo
This document provides an overview of Quality Function Deployment (QFD). QFD is a structured approach to defining customer needs and translating them into product design requirements and specifications. It involves gathering customer requirements, prioritizing them based on importance, and determining how well competitors meet those needs. Multiple "Houses of Quality" are used to map customer needs to functional requirements to design parameters and ensure the final product will satisfy customers. The process involves gathering customer input, analyzing competitor performance, setting goals for improvements, and calculating development priorities to guide product planning and design.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 Financial Integration J. García - Verdugo
This document provides guidelines for calculating the financial benefits of Six Sigma projects. It discusses categories for project benefits such as cost reduction, cost avoidance, cash flow improvement, and growth. It explains how to calculate benefits for different types of projects, including volume projects, cost reduction projects, and cost avoidance projects. Required data for calculating benefits includes production costs, capacities, part numbers, and cost centers. Financial benefits are typically tracked over a period of 12 months after a project is completed.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 Confidence Intervals J. García - Verdugo
1) The document discusses confidence intervals, which provide a range of values that are likely to include an unknown population parameter based on a sample.
2) Confidence intervals can be calculated for a mean, standard deviation, and proportion based on the sample size and desired confidence level, usually 95%.
3) Examples are provided for calculating 95% confidence intervals for the mean, standard deviation, and proportion to estimate unknown population parameters based on sample data.
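The steps above can be sketched for the mean with a large-sample 95% interval; the measurements below are made-up example data, and for small samples a t quantile would be more appropriate than the normal quantile used here.

```python
from statistics import NormalDist, mean, stdev

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]
n = len(data)

z = NormalDist().inv_cdf(0.975)        # ~1.96 for 95% confidence
half = z * stdev(data) / n**0.5        # half-width: z * s / sqrt(n)

print(f"95% CI for the mean: {mean(data):.2f} +/- {half:.2f}")
```

Raising the confidence level widens the interval, and quadrupling the sample size halves it, which is the basic trade-off the examples in the document illustrate.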
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 Complex Designs J. García - Verdugo
The document discusses evaluating experiments with multiple responses that may conflict. It provides examples of using a response optimizer and overlaid contour plots in Minitab to determine optimal factor settings that maximize desired results. The response optimizer calculates optimal settings by maximizing a desirability function. Overlaid contour plots show the factor space where all responses meet limits, allowing selection of preferred regions. An example optimizes a rubber mix for tires using these tools.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 Full Factorial Desi...J. García - Verdugo
The document discusses full factorial designs for analyzing experiments with two or more factors and levels. It provides examples of 2-factor and 3-factor full factorial designs, and how to customize, evaluate, and analyze the results using statistical software. Graphical analysis methods like effect plots and residual diagnostics are demonstrated. Response surface methodology for investigating quadratic effects is also introduced.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W3 StarterJ. García - Verdugo
This document provides an agenda and overview for a Green Belt Six Sigma training workshop taking place over one week. It outlines the safety procedures, timing, purpose and expectations for the training. The agenda lists the topics to be covered each day, including recaps of previous weeks, exercises using Six Sigma tools like DOE and RSM, and project reviews. Participants are expected to actively engage in team exercises and apply what they learn to their projects. The training aims to teach the structured use of Six Sigma tools to help participants meet project expectations and goals.
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODELgerogepatton
As digital technology becomes more deeply embedded in power systems, protecting the communication networks of Smart Grids (SG) has emerged as a critical concern. Distributed Network Protocol 3 (DNP3) is a multi-tiered application layer protocol extensively used in Supervisory Control and Data Acquisition (SCADA)-based smart grids to facilitate real-time data gathering and control. Because the interconnection of these networks makes them vulnerable to a variety of cyberattacks, robust Intrusion Detection Systems (IDS) are necessary for early threat detection and mitigation. To address this, the paper develops a hybrid Deep Learning (DL) model specifically designed for intrusion detection in smart grids, combining a Convolutional Neural Network (CNN) with Long Short-Term Memory (LSTM) networks. A recent DNP3 intrusion detection dataset, which focuses on unauthorized commands and Denial of Service (DoS) cyberattacks, was used to train and test the model. The experiments show that the CNN-LSTM method detects smart grid intrusions much better than other deep learning classifiers, improving accuracy, precision, recall, and F1 score and achieving a high detection accuracy of 99.50%.
Introduction: e-waste – definition; sources of e-waste; hazardous substances in e-waste; effects of e-waste on environment and human health; need for e-waste management; e-waste handling rules; waste minimization techniques for managing e-waste; recycling of e-waste; disposal and treatment methods of e-waste; mechanism of extraction of precious metals from leaching solution; global scenario of e-waste; e-waste in India; case studies.
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTjpsjournal1
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon
reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been
referred to as the "New Great Game." This research centres on the power struggle, considering
geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil
politics, and conventional and nontraditional security are all explored and explained by the researcher.
Using Mackinder's Heartland, Spykman Rimland, and Hegemonic Stability theories, examines China's role
in Central Asia. This study adheres to the empirical epistemological method and has taken care of
objectivity. This study analyze primary and secondary research documents critically to elaborate role of
china’s geo economic outreach in central Asian countries and its future prospect. China is thriving in trade,
pipeline politics, and winning states, according to this study, thanks to important instruments like the
Shanghai Cooperation Organisation and the Belt and Road Economic Initiative. According to this study,
China is seeing significant success in commerce, pipeline politics, and gaining influence on other
governments. This success may be attributed to the effective utilisation of key tools such as the Shanghai
Cooperation Organisation and the Belt and Road Economic Initiative.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces the hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system including the required equation to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram which sets the priorities and requirements of the system is presented. The proposed approach allows setup to advance their power stability, especially during power outages. The presented information supports researchers and plant owners to complete the necessary analysis while promoting the deployment of clean energy. The result of a case study that represents a dairy milk farmer supports the theoretical works and highlights its advanced benefits to existing plants. The short return on investment of the proposed approach supports the paper's novelty approach for the sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line which enhances the safety of the electrical network
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesChristina Lin
Traditionally, dealing with real-time data pipelines has involved significant overhead, even for straightforward tasks like data transformation or masking. However, in this talk, we’ll venture into the dynamic realm of WebAssembly (WASM) and discover how it can revolutionize the creation of stateless streaming pipelines within a Kafka (Redpanda) broker. These pipelines are adept at managing low-latency, high-data-volume scenarios.
Javier Garcia - Verdugo Sanchez - Six Sigma Training - W1 Minitab - Graphical Methods
1. Introduction in Minitab: Graphical Methods

[Title-slide example plots: Histogram of Water Content (normal fit; Mean 6,054; StDev 0,2541; N 72), Boxplot of Water Content, Time Series Plot of Water Content, Scatterplot of Receiving Check vs Final Check]
Week 1
Knorr-Bremse Group
Graphical Methods & the DMAIC Cycle

[DMAIC cycle diagram with the typical tools of each phase:]
- Define (D): Project Charter (SMART), Business Score Card, QFD + VOC, Strategic Goals, Project Strategy, Documentation
- Measure (M): Baseline Analysis, Process Map, C+E Matrix, Measurement System, FMEA, Process Capability, Definition of Critical Inputs
- Analyze (A): FMEA, Statistical Tests, Multi-Vari Studies, Regression
- Improve (I): Adjustment to the Optimum, Simulation, Tolerancing, Statistical Tests
- Control (C): Maintain Improvements, SPC, Control Plans

Graphical methods are used always and everywhere!

Knorr-Bremse Group 07 BB W1 Graphics 07, D. Szemkus/H. Winkler
2. The Use of Graphs

In every phase of the DMAIC cycle you will need to answer questions during your project work. In general we can find answers to these questions with three methods, in the following order:

1. Practical: what kind of practical relations exist?
2. Graphical: how can I present that graphically?
3. Analytical: which analytical methods can I use to get the proof?

Graphics are useful in every project in two ways: they help to visualize the relations and to communicate them.
About this Module

In this module you will be introduced to the use of the software „Minitab". After a short time you will be able to create several different graphs and understand where to use them:

• Histogram
• Run Chart (Control Chart)
• Box Plot
• Dot Plot
• X-Y Scatter Plot
• Marginal Plot
• Matrix Plot
• Pareto Diagram
• Cause and Effect Diagram
3. Histogram

• For this example we need the file: WATER CONTENT.MTW
• The variable (Y) is the water content of a mixing process. The process runs 24 hrs a day, 6 days a week, in 3 shifts. The water content should be held in the range of 5,5 – 7 %. This is checked every 2 hrs.

Menu: Graph > Histogram… (select a type of graph)

Worksheet excerpt:

Day  Time  Shift  Water Content
1    6     1      5,67
1    8     1      6
1    10    1      6,27
1    12    1      6,33
1    14    2      6,53
1    16    2      5,93
1    18    2      6
1    20    2      6,27
1    22    3      6,07
1    0     3      6,33
1    2     3      6,13
1    4     3      6,07
2    6     1      6,33
2    8     1      6,47
2    10    1      6
Histogram

Menu: Graph > Histogram… > Simple. Select a graph variable.

[Histogram of Water Content: Frequency (0–14) vs Water Content (5,6–6,4)]

You can adjust the graph by double-clicking the item you would like to change. On the next page we will change the number of intervals.
4. Histogram

Menu: Graph > Histogram… > Simple
1. Select the X axis with a double click.
2. Select "Binning" (Edit X Scale…).
3. Change the number of intervals.

[Histogram of Water Content with adjusted intervals: 5,50 to 6,55 in steps of 0,15]
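The interval logic behind the binning step can be sketched in a few lines of Python. This is a hypothetical helper for illustration only, not Minitab's own code; decimal commas from the slides become decimal points in code.

```python
def histogram_bins(data, lo, hi, n_bins):
    """Count how many observations fall into each of n_bins
    equal-width intervals between lo and hi."""
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for x in data:
        # clamp the top edge so x == hi lands in the last bin
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1
    return counts

# A few water-content values, binned from 5.50 to 6.55 in 7 intervals of 0.15
sample = [5.67, 6.00, 6.27, 6.33, 6.53, 5.93, 6.00, 6.27, 6.07]
print(histogram_bins(sample, 5.50, 6.55, 7))
```

Changing `n_bins` reproduces the effect of editing the number of intervals in the dialog.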
Histogram

Menu: Graph > Histogram… > With Fit and Groups > Multiple Graphs > By Variables

[Histogram of Water Content, normal fit, one panel per day (panel variable: Day):
Day 1: Mean 6,133; StDev 0,2292; N 12
Day 2: Mean 6,183; StDev 0,2241; N 12
Day 3: Mean 6,161; StDev 0,2386; N 12
Day 4: Mean 5,85; StDev 0,2560; N 12
Day 5: Mean 5,911; StDev 0,1871; N 12
Day 6: Mean 6,083; StDev 0,2241; N 12]
5. Histogram

Menu: Graph > Histogram… > With Fit and Groups

[Histogram of Water Content, normal fit, all days overlaid in one panel. Legend:
Day 1: Mean 6,133; StDev 0,2292; N 12
Day 2: Mean 6,183; StDev 0,2241; N 12
Day 3: Mean 6,161; StDev 0,2386; N 12
Day 4: Mean 5,85; StDev 0,2560; N 12
Day 5: Mean 5,911; StDev 0,1871; N 12
Day 6: Mean 6,083; StDev 0,2241; N 12]
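The per-day legend values (Mean, StDev, N) are plain group statistics. A minimal Python sketch of that computation, using a hypothetical `group_stats` helper and a small excerpt of the data:

```python
import statistics
from collections import defaultdict

def group_stats(rows):
    """Mean, sample StDev and N per group, the numbers shown in the
    panel legend. `rows` holds (group, value) pairs."""
    groups = defaultdict(list)
    for group, value in rows:
        groups[group].append(value)
    return {g: (statistics.fmean(v), statistics.stdev(v), len(v))
            for g, v in groups.items()}

# Day-1 and day-2 excerpts from the WATER CONTENT worksheet
stats = group_stats([(1, 5.67), (1, 6.00), (1, 6.27), (1, 6.33),
                     (2, 6.33), (2, 6.47), (2, 6.00)])
print(stats)
```

With the full worksheet (12 values per day) this reproduces the legend table above.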
Run Chart – Time Series Plot

Run charts use the same set of data as histograms, but show graphically the behavior over a certain time range. Create a run chart with the same set of data.

Menu: Stat > Time Series > Time Series Plot…
6. Run Chart – Time Series Plot

Menu: Stat > Time Series > Time Series Plot… > Simple

[Time Series Plot of Water Content: Water Content (5,50–6,50) vs Index (1–72)]
Run Chart – Time Series Plot

Menu: Stat > Time Series > Time Series Plot… > With Groups. Select a group variable.

[Time Series Plot of Water Content, grouped by Shift (1, 2, 3)]
7. From a Run Chart to a Control Chart

The individuals chart is the simplest graph within statistical process control (SPC).

Menu: Stat > Control Charts > Variable Charts for Individuals > Individuals…

As the output you get the mean value and the control limits, based on the mean ± 3 StDev.
From a Run Chart to a Control Chart

Menu: Stat > Control Charts > Variable Charts for Individuals > Individuals…

[I Chart of Water Content: Individual Value vs Observation (1–72); X̄ = 6,054; UCL = 6,638; LCL = 5,469]
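The control limits are easy to reproduce, with one subtlety worth noting: with X̄ = 6,054 and UCL = 6,638 the implied sigma is about 0,195, smaller than the overall sample StDev of 0,2541, because Minitab estimates sigma for an individuals chart from the average moving range divided by the constant d2 = 1,128 rather than from the sample StDev. A Python sketch following that convention (hypothetical helper, small made-up data):

```python
import statistics

def i_chart_limits(data):
    """Center line and 3-sigma control limits for an individuals chart.
    Sigma is estimated from the mean moving range divided by d2 = 1.128
    (the convention used for I charts), not from the sample StDev."""
    center = statistics.fmean(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    sigma = statistics.fmean(moving_ranges) / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

# Made-up water-content readings for illustration
lcl, cl, ucl = i_chart_limits([6.0, 6.2, 5.9, 6.1, 6.0, 6.3])
print(lcl, cl, ucl)
```

Run on the full 72-point worksheet, this style of estimate yields limits close to the chart's UCL = 6,638 and LCL = 5,469.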
8. Box Plot

Menu: Graph > Boxplot… > One Y > Simple

[Boxplot of Water Content, annotated with the 5%, 25%, 50%, 75% and 95% points]

It represents 90% of the data and their distribution. It is very powerful if the data are split into subgroups, see next page.
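The five annotated points are plain percentiles. A minimal Python sketch of computing them (note that Minitab's default box-plot whiskers actually extend to the furthest point within 1,5 x IQR; the 5%/95% whiskers described on the slide are one common annotation convention):

```python
import statistics

def box_points(data):
    """The 5th, 25th, 50th, 75th and 95th percentiles: the five points
    sketched on the slide's box plot, so the whiskers span the
    central 90% of the data."""
    q = statistics.quantiles(data, n=20)  # cut points at 5%, 10%, ..., 95%
    return q[0], q[4], q[9], q[14], q[18]

p5, p25, median, p75, p95 = box_points(list(range(1, 101)))
print(p5, p25, median, p75, p95)
```

Applied to the Water Content column, these points reproduce the box and whiskers of the plot above.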
Box Plot

Menu: Graph > Boxplot… > One Y > With Groups

[Boxplot of Water Content vs Day (1–6)]
9. Dot Plot

Menu: Graph > Dotplot… > One Y > Simple

This diagram is very similar to a histogram, but all the data are always shown.

[Dotplot of Water Content: 5,60–6,44]

We also have the possibility to split the data into subgroups (By variable). Try some possibilities.
Dot Plot

Menu: Graph > Dotplot… > One Y > With Groups

[Dotplot of Water Content vs Shift (1, 2, 3)]
10. X-Y Scatter Plot

Menu: Graph > Scatterplot… > Simple

With scatter plots we can compare two rows of continuous data and visualize their relation.

An example: the results show the softening temperatures measured during the final check at the supplier and at the incoming inspection of the customer. The results of two different plastic types are listed in two columns. File: SoftenTemp.mtw
X-Y Scatter Plot

Menu: Graph > Scatterplot… > Simple, or Graph > Scatterplot… > With Regression

In the scatter plot menu Minitab offers the option to add a regression line. This subject will be discussed in week 2.

[Scatterplot of Receiving Check vs Final Check, without and with regression line; Final Check 160–240, Receiving Check 160–210]
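The regression line Minitab overlays is an ordinary least-squares fit. A self-contained Python sketch with made-up check temperatures (the helper name and data are illustrative):

```python
def fit_line(xs, ys):
    """Ordinary least-squares line, returned as (intercept, slope):
    the kind of fit overlaid by Scatterplot > With Regression."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Made-up receiving-check readings that track the final check closely
b0, b1 = fit_line([160, 180, 200, 220], [162, 181, 199, 222])
print(b0, b1)
```

A slope near 1 with a small intercept would indicate that the supplier's final check and the customer's incoming inspection agree well.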
11. X-Y Scatter Plot

Menu: Graph > Scatterplot… > With Connect and Groups

With Minitab you have the possibility to adjust the graphs to your needs. We need some entries in the data display.

[Scatterplot of Receiving Check vs Final Check, connected and grouped by Material (1, 2)]
Marginal Plot

Menu: Graph > Marginal Plot… > With Histogram

A further possibility for visualization is a combination of plots, using the same data as before. We combine a scatter plot with either a histogram, a box plot or a dot plot.

[Marginal Plot of Receiving Check vs Final Check, with histograms on the margins]
12. Matrix Plot

Menu: Graph > Matrix Plot… > Matrix of plots, Simple

This is helpful if the problem is more complex. You visualize the relations; it may serve as a starting point for further investigation.

File: Contamination.mtw

Worksheet excerpt:

Day  Shift  Sample time  Temp  Pressure  Contamination %
1    1      1            91    48        2
1    1      2            97    52        2
1    1      3            88    44        2
1    1      4            87    43        1
1    2      1            109   50        6
Matrix Plot

Menu: Graph > Matrix Plot… > Matrix of plots, Simple

[Matrix Plot of Contamination %, Temp, Pressure: pairwise scatter plots of the three variables]

What can we learn here? What are the next possible steps?
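A matrix plot is the graphical counterpart of a correlation table. A plain-Python sketch of the Pearson coefficient, applied to the worksheet excerpt shown on the previous page (illustrative only; a real analysis would use the full file):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient: the numeric counterpart of
    eyeballing one panel of a matrix plot."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# The five rows of the Contamination.mtw excerpt
temp = [91, 97, 88, 87, 109]
pressure = [48, 52, 44, 43, 50]
contamination = [2, 2, 2, 1, 6]
print(pearson(temp, contamination))
```

A strong positive coefficient between Temp and Contamination % would point to temperature as the first input to investigate further.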
13. Pareto Diagram

Pareto diagrams sort events by their frequencies, e.g. defects as a function of their occurrence. A rule of thumb says that 20% of the causes are responsible for 80% of the effects.

Example: during an inspection process 4 different types of defects were monitored over 4 weeks. File: PARETO.CONTROL.MTW
Pareto Diagram

Menu: Stat > Quality Tools > Pareto Chart

[Pareto Chart of Defects:]

Defect       Count  Percent  Cum %
Weight       431    44,2     44,2
Flaws        293    30,0     74,2
Color        132    13,5     87,7
Deformation  120    12,3     100,0
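The Count, Percent and Cum % rows under the chart are straightforward to reproduce. A Python sketch using the slide's defect counts (decimal commas shown as points in code):

```python
def pareto_table(counts):
    """Sort defect counts descending and add percent and cumulative
    percent columns, as in a Pareto chart's summary table."""
    total = sum(counts.values())
    rows, cum = [], 0.0
    for name, count in sorted(counts.items(), key=lambda kv: -kv[1]):
        pct = 100.0 * count / total
        cum += pct
        rows.append((name, count, round(pct, 1), round(cum, 1)))
    return rows

# The defect counts behind the slide's Pareto chart
table = pareto_table({"Weight": 431, "Flaws": 293,
                      "Color": 132, "Deformation": 120})
print(table)
```

Here the first two defect types already account for about 74% of all defects, which is where an improvement project would focus first.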
14. Pareto Diagrams

The same data as before; the diagrams on a weekly scale. The required set-up of the data is shown.

[Pareto Chart of Reason by W 1 to 4: four panels (one per week), each counting the defects Weight, Flaws, Color, Def]

[Worksheet excerpt: columns Defects W1 … Defects W4 holding one defect name per row, stacked into the columns Reason and W 1 to 4 for plotting]
Cause and Effect Diagram

石川 馨 Kaoru Ishikawa (* 1915, Tokyo; † 16 April 1989) developed the „Ishikawa Diagram" (1943), also called „Cause and Effect Diagram": a graphic tool that helps identify, sort, and display possible causes of a problem or quality characteristic.

1. Identify and define the effect (objective or problem).
2. Identify the main categories, like the 6 M's: Material, Man, Machine, Measurement, Method, Mother Nature.
3. Identify causes influencing the effect.
4. Add detailed levels.
5. Analyze the diagram, e.g. with the help of a Pareto diagram.

Circle what you can measure or take action on.
15. Cause and Effect Diagram

The Cause and Effect diagram is an excellent tool to present e.g. brainstorming results. It groups collected inputs with respect to the output. It is also named Ishikawa or Fishbone diagram.
Cause and Effect Diagram: Example

Or see a Black Belt for further information or examples.
16. Cause and Effect Diagram

Menu: Stat > Quality Tools > Cause and Effect
File: Fischbone.mtw

[Cause-and-Effect Diagram for "Quality Problems" with the main branches Measurements, Material, Personnel, Environment, Methods and Machines. The collected causes include: Weight, Diameter, Length, Granulate size, Dust, Cutting quality, Homogeneity, Glass distribution, Granulation temp, Surface condition, Electrical charge, too few checks, Nozzle plate, Cutting condition, Cutting technique, Hot material in cold pipe, Dryer temp, Silo loading, Silo de-loading, Conveyor design, Dust collector, Transport system, Transport Extern, Transport Intern]

It is also possible to generate sub-branches for each main branch, e.g. if Material is split into internal and external causes.
Cause and Effect Diagram

[The same Cause-and-Effect Diagram shown a second time, with the "Fishbone" and "Flat" display options for the categories]
17. Cause and Effect Diagram

[Example fishbone diagram in German, analyzing the failure causes ("Fehlerursachen") of an EAI system that fails or was stopped unplanned. Main branches: Mensch (man), Maschine (machine), Material, Methoden (methods), Mitwelt (environment) and Messsystem (measurement system). Causes include operator errors, untested programs, missing or wrong test plans, program errors (EDM, RPG, XPPS), database locks, undefined key values, power failure, OS errors, virus, clock set wrongly, network switch overload and provider faults.]

Available fishbone tools are, e.g., Minitab, MS Visio, MS PowerPoint.
Summary

The following graphical tools have been created with Minitab:

• Histogram
• Run Chart (Control Chart)
• Box Plot
• Dot Plot
• X-Y Scatter Plot
• Marginal Plot
• Matrix Plot
• Pareto Diagram
• Cause and Effect Diagram