This document summarizes a thesis on parametric and nonparametric methods applied to conjoint analysis. It discusses rating conjoint analysis, choice-based conjoint analysis, and market segmentation. It provides an example application of rating conjoint analysis to evaluate an anti-theft bicycle patent. It also discusses parametric and nonparametric statistical procedures, including multiple regression, permutation methods, and the Average Marginal Component Effect. Finally, it proposes guidelines for best practices in conjoint analysis applications.
Undergraduate Project written by EBERE on ANALYSIS OF VARIATION IN GSK (Ebere Uzowuru)
This document discusses analysis of variance (ANOVA) and its use in quality control and manufacturing processes. It provides background on the development of ANOVA, beginning in the 1930s and growing rapidly after World War II. It then discusses key concepts in ANOVA like partitioning variance, comparing multiple group means, and hypothesis testing. The rest of the document discusses specific tools used with ANOVA like control charts, histograms, Pareto charts, fishbone diagrams and their uses in identifying sources of variation and improving processes.
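The one-way ANOVA described above can be sketched in a few lines of Python with SciPy; the measurement data below is hypothetical, invented only to illustrate the comparison of group means.

```python
from scipy import stats

# Hypothetical measurements of the same part from three machines;
# machine C is deliberately shifted to create a real group difference.
a = [10.1, 9.9, 10.0, 10.2]
b = [10.0, 10.1, 9.8, 10.1]
c = [10.8, 11.0, 10.9, 11.1]

# One-way ANOVA: test the null hypothesis that all group means are equal.
f_stat, p_value = stats.f_oneway(a, b, c)
print(p_value < 0.05)  # a small p-value flags machine-to-machine variation
```

A significant result would then prompt the follow-up tools the document mentions (control charts, Pareto charts, fishbone diagrams) to locate the source of the variation.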
This document discusses various approaches to demand estimation in marketing research, including consumer surveys, observational research, consumer clinics, and market experiments. It then provides details on regression analysis techniques, including scatter diagrams, the regression line, ordinary least squares estimation, and tests of significance. Multiple regression analysis is also covered.
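The ordinary least squares step described above can be sketched as follows; the price/quantity observations are hypothetical and chosen to lie exactly on a line so the fitted coefficients are easy to check.

```python
import numpy as np

# Hypothetical price/quantity observations for a linear demand curve.
price = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
qty = np.array([90.0, 80.0, 70.0, 60.0, 50.0])

# Ordinary least squares fit of qty = a + b * price.
# np.polyfit returns coefficients from highest degree down: [slope, intercept].
b, a = np.polyfit(price, qty, 1)
print(a, b)  # intercept 100, slope -10 for this constructed data
```

With real survey or experiment data the points would scatter around the line, and the tests of significance the document mentions (t-tests on the coefficients, R-squared) would quantify how well the regression line fits.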
Conjoint analysis is a technique used to understand what attributes of a product are most important to consumers. It involves having consumers rank hypothetical products that combine different levels of attributes. This allows researchers to determine the partial utilities (or part-worths) that consumers assign to each attribute level and understand their relative importance. Conjoint analysis provides insight into consumer preferences while maintaining realism by considering attributes together rather than separately. It estimates the tradeoffs consumers make between attributes and can uncover important drivers of choice.
The Chi-Square Test of Association is used to determine whether there is a statistically significant association between two categorical variables. This technique is used to determine whether a relationship exists between any two business parameters of a categorical data type.
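As a short sketch, the test can be run with SciPy's `chi2_contingency`; the 2x2 contingency table below is hypothetical (say, region by product preference).

```python
from scipy import stats

# Hypothetical 2x2 contingency table: region (rows) vs preference (columns).
table = [[20, 30],
         [30, 20]]

# chi2_contingency applies Yates' continuity correction to 2x2 tables
# by default and also returns the expected counts under independence.
chi2, p_value, dof, expected = stats.chi2_contingency(table)
print(dof, round(chi2, 2))
```

If `p_value` falls below the chosen significance level (commonly 0.05), the association between the two categorical variables is deemed statistically significant; for this table it does not, despite the visible asymmetry.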
This document outlines the key concepts and procedures for sampling design in marketing research. It discusses the differences between sample and census approaches, and outlines the five main steps in the sampling design process: defining the target population, determining the sampling frame, selecting a sampling technique, determining sample size, and executing the sampling process. It then classifies and describes various sampling techniques, including probability methods like simple random sampling and stratified sampling, and nonprobability methods like convenience sampling and snowball sampling. Examples and illustrations are provided to explain each technique.
Optimization in pharmaceutics & processing (Jamia Hamdard)
The document discusses various optimization techniques used in pharmaceutical formulation, processing, and manufacturing. It describes techniques like evolutionary operations, simplex methods, and statistical experimental designs that are used to optimize factors in pharmaceutical experiments to develop formulations with desirable responses. The document also provides examples of applying these techniques to optimize different variables like concentration levels, processing parameters, and component ratios.
This document discusses various optimization techniques used in pharmaceutical formulation and processing. It begins by defining optimization as making something as perfect or functional as possible. It describes the key parameters in optimization problems including independent and dependent variables as well as constrained and unconstrained problems. The document then explains several statistical design approaches to optimization including evolutionary operation, simplex method, and search methods. It provides examples of how these techniques are applied and the general steps involved in using a search method for optimization.
The Paired Sample T Test is used to determine whether the mean of a dependent variable (for example, weight, anxiety level, salary, or reaction time) is the same in two related groups. It is particularly useful for measuring results before and after a particular event, action, or process change.
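A before/after comparison of the kind described can be sketched with SciPy's `ttest_rel`; the anxiety scores below are hypothetical and the test operates on the per-person differences.

```python
from scipy import stats

# Hypothetical anxiety scores for the same 5 people, measured before
# and after a process change; pairing matters because each "after"
# value is compared to its own "before" value.
before = [80, 75, 90, 85, 70]
after = [78, 72, 88, 84, 69]

t_stat, p_value = stats.ttest_rel(before, after)
print(round(t_stat, 2), p_value < 0.05)
```

A small p-value here would indicate that the mean difference between the paired measurements is unlikely to be zero, i.e. the event genuinely shifted the scores.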
2. sem exploratory factor analysis (Toshali Dey)
This document provides an overview of factor analysis, including its purposes, key steps, and assumptions. Factor analysis is used to reduce a large set of variables into a smaller set of underlying factors or dimensions while preserving as much of the original information as possible. The main steps are designing the analysis, selecting variables, ensuring sufficient sample size, checking assumptions, extracting factors, interpreting the results, and rotating factors to aid interpretation. The goal is to identify the fundamental constructs or dimensions that explain relationships among variables.
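The factor-extraction step can be illustrated without a dedicated library by inspecting the eigenvalues of the correlation matrix; this is a hedged sketch using synthetic data with a single latent factor, and the Kaiser "eigenvalue > 1" rule is just one of several retention criteria.

```python
import numpy as np

# Synthetic data: 6 observed variables all driven by ONE latent factor
# plus noise, so one underlying dimension should explain them.
rng = np.random.default_rng(42)
n = 200
factor = rng.normal(size=n)
data = np.column_stack([factor + 0.3 * rng.normal(size=n) for _ in range(6)])

# Eigenvalues of the correlation matrix; the Kaiser criterion retains
# factors whose eigenvalue exceeds 1 (the variance of one variable).
corr = np.corrcoef(data, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)
retained = int(np.sum(eigenvalues > 1.0))
print(retained)  # the single latent factor is recovered
```

In a full analysis this screening step would be followed by estimating loadings and rotating them to aid interpretation, as the summary above describes.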
Optimization techniques and various methods of optimization
Full Factorial Design
Introduction to Contour Plots
Introduction of Response Surface Design
Classification
Characteristics of Design
Matrix and analysis of design with case study
Equation of Full and Reduced mathematical models in experimental designs
Central Composite designs
Taguchi and mixtures designs
Application of experimental designs in pharmacology for reduction of animal
The document provides an overview of optimization techniques used in pharmaceutical formulation and processing. It begins by defining optimization and its importance in developing drug products that meet bioavailability and mass production requirements. The key parameters of optimization discussed are problem type (constrained vs unconstrained), variables (independent vs dependent), and application areas (formulation vs processing). Several optimization techniques are then outlined, including evolutionary operations, simplex method, Lagrangian method, search method, and canonical analysis. Examples of each technique are provided, such as using simplex to optimize an analytical method or the Lagrangian method to optimize tablet formulation based on two variables.
Optimization Techniques In Pharmaceutical Formulation & Processing (APCER Life Sciences)
Optimization techniques are used in pharmaceutical formulation and processing to improve quality while reducing costs. Various methods like evolutionary operation, simplex lattice, and Lagrangian methods can be used to optimize multiple variables. Optimization helps identify the ideal levels of factors like concentration, temperature, and mixing time to achieve the desired response. The techniques allow for more efficient development and large-scale production of drugs.
This document discusses Quality by Design (QbD) principles and Design of Experiments (DOE) methodology. It explains that QbD aims to design quality into products and processes through an understanding of key factors and their interactions. DOE provides a systematic approach to determine these factors and optimize conditions through carefully designed experiments. Common DOE steps include screening experiments to identify important factors, followed by optimization experiments to determine optimal levels and robustness testing to ensure consistent performance under variations.
The document discusses various optimization methods used in the pharmaceutical industry including evolutionary operations, simplex method, Lagrangian method, search method, and canonical analysis. It provides examples of how each method can be applied to optimize different parameters of a tablet formulation such as concentrations of excipients, compression force, and disintegrant levels to minimize disintegration time and friability while meeting constraints. The search method example involves using a five-factor central composite design to optimize tablet properties and identify the best formulation based on constraints for multiple response variables.
The independent sample t-test is a statistical method of hypothesis testing that determines whether there is a statistically significant difference between the means of two independent samples. It is helpful when an organization wants to determine whether there is a statistical difference between two categories or groups or items and, furthermore, if there is a statistical difference, whether that difference is significant.
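The comparison described above can be sketched with SciPy's `ttest_ind`; the two groups below are hypothetical satisfaction scores from independent samples (no pairing between them).

```python
from scipy import stats

# Hypothetical satisfaction scores from two independent customer groups.
group_a = [5, 7, 6, 8, 7]
group_b = [3, 4, 2, 5, 4]

# Independent-samples t-test (equal variances assumed by default).
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(t_stat > 0, p_value < 0.01)
```

The sign of the statistic tells which group's mean is larger, and the p-value answers the question posed in the summary: whether the observed difference is statistically significant.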
Optimization techniques in pharmaceutical formulation and processing (Reshma Fathima K.)
This document discusses various optimization techniques used in pharmaceutical formulation and processing. It defines optimization as making something as perfect or effective as possible. The advantages of optimization include reducing costs, saving time, and improving safety, reproducibility, and efficacy. Key optimization parameters discussed include problem type (constrained or unconstrained), variables (independent and dependent), and applied optimization methods like evolutionary operation, simplex method, search method, Lagrangian method, and canonical analysis.
The document outlines the key topics covered in Chapter 15, which include frequency distribution, measures of location, variability and shape related to frequency distributions, hypothesis testing procedures, and cross-tabulation. It provides examples of computing common statistics like the mean, median, range and standard deviation. The chapter introduces hypothesis testing methodology, outlining steps like formulating hypotheses, selecting a test, and determining significance levels and types of errors. Examples are given of computing test statistics and determining probabilities. Cross-tabulation and related statistics like chi-square are also listed as chapter topics.
A FUZZY AHP APPROACH FOR SUPPLIER SELECTION PROBLEM: A CASE STUDY... (ijmvsc)
Supplier selection is one of the most important functions of a purchasing department, since by selecting the best supplier, companies can save material costs and increase competitive advantage. However, this decision becomes complicated in the case of multiple suppliers, multiple conflicting criteria, and imprecise parameters. In addition, the uncertainty and vagueness of the experts' opinions is a prominent characteristic of the problem. Therefore, Fuzzy AHP, an extensively used multi-criteria decision-making tool, can be utilized as an approach to the supplier selection problem. This paper presents the application of Fuzzy AHP in a gear motor company, determining the best supplier with respect to selected criteria. The contribution of this study is not only the application of the Fuzzy AHP methodology to the supplier selection problem, but also a comprehensive literature review of multi-criteria decision-making problems. In addition, by stating the steps of Fuzzy AHP clearly and numerically, this study can serve as a guide for implementing the methodology in other multiple-criteria decision-making problems.
Frequent pattern mining is an analytical algorithm that is used by businesses and is accessible in some self-serve business intelligence solutions. The FP Growth analytical technique finds frequent patterns, associations, or causal structures from data sets in various kinds of databases such as relational databases, transactional databases, and other forms of data repositories.
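The following is not FP-Growth itself but a brute-force sketch of the frequent-itemset counting that FP-Growth performs efficiently; the toy transaction database and support threshold are hypothetical.

```python
from itertools import combinations
from collections import Counter

# Toy transaction database; find itemsets present in >= 3 transactions.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
    {"bread", "milk"},
]
min_support = 3

# Count every 1- and 2-item subset of each transaction (brute force;
# FP-Growth avoids this enumeration via a compressed prefix tree).
counts = Counter()
for t in transactions:
    for k in (1, 2):
        for itemset in combinations(sorted(t), k):
            counts[itemset] += 1

frequent = {s for s, c in counts.items() if c >= min_support}
print(sorted(frequent))
```

For this toy data the pair ("bread", "milk") is frequent, which is the kind of association (items bought together) the technique surfaces from transactional databases.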
This document discusses optimization techniques used in pharmaceutical development. It defines optimization as making a formulation or process perfect by finding the best use of resources while considering all influencing factors. It describes independent and dependent variables, different optimization methods like evolutionary operation, simplex method, and statistical experimental designs including factorial, response surface, and Plackett-Burman designs. The advantages of optimization include determining important variables, measuring interactions, and allowing extrapolation to find the best product. Optimization has applications in formulation development, dissolution testing, tablet coating, and capsule preparation.
Predictive analytics uses historical data to predict whether above-the-line (ATL) advertising is more effective than below-the-line (BTL) advertising, and to target customer segments and characteristics.
This document discusses factorial design, which is an experimental design technique used for optimization. It involves studying the effects of two or more factors simultaneously. There are two main types: full factorial design, which tests all possible combinations of factors and levels, and fractional factorial design, which reduces the number of runs when there are many factors. Factorial designs allow evaluation of both main effects and interaction effects. They are useful for formulation development and method optimization in chromatography. Software is used to analyze the results of factorial experiments.
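The full factorial enumeration and the main-effect calculation described above can be sketched as follows; the response function is a hypothetical deterministic toy model, with factor C deliberately given no effect.

```python
from itertools import product

# Full factorial design for three two-level factors (A, B, C), coded -1/+1:
# all 2^3 = 8 combinations are run.
runs = list(product([-1, 1], repeat=3))
print(len(runs))  # 8

# Hypothetical response: main effects of A and B plus an A x B
# interaction; factor C intentionally has no effect in this toy model.
def response(a, b, c):
    return 10 + 3 * a - 2 * b + 0.5 * a * b

# Main effect of A: mean response at A = +1 minus mean at A = -1.
high = [response(*r) for r in runs if r[0] == 1]
low = [response(*r) for r in runs if r[0] == -1]
effect_a = sum(high) / len(high) - sum(low) / len(low)
print(effect_a)  # 6.0, i.e. twice the +3 coefficient on A
```

Interaction effects are estimated the same way using the product of factor columns, which is exactly what full factorial designs make possible and what fractional designs partially confound to save runs.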
This document discusses various classical and advanced optimization techniques. It begins with an overview of classical techniques like single/multivariable optimization and methods using Lagrange multipliers or Kuhn-Tucker conditions. Numerical methods are then introduced, including linear programming, integer programming, and nonlinear programming. Advanced techniques like hill climbing, simulated annealing, genetic algorithms, and ant colony optimization are also summarized. These optimization methods are inspired by natural processes and use techniques such as local search, positive feedback, and path pheromones to find approximate solutions.
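The local-search idea underlying hill climbing (and refined by simulated annealing's occasional uphill moves) can be sketched on a toy one-dimensional objective; the function, step size, and iteration count here are hypothetical choices, not a definitive implementation.

```python
import random

# Toy objective: minimize f(x) = (x - 3)^2, minimum at x = 3.
def f(x):
    return (x - 3.0) ** 2


# Hill climbing: propose a random neighbor, keep it only if it improves.
# Simulated annealing would instead accept some worsening moves with a
# temperature-dependent probability to escape local optima.
random.seed(0)
x, step = 0.0, 1.0
for _ in range(2000):
    candidate = x + random.uniform(-step, step)
    if f(candidate) < f(x):
        x = candidate
print(round(x, 2))
```

On this convex toy problem plain hill climbing converges to the global minimum; the advanced methods in the summary exist precisely because real objectives have many local optima where this greedy rule gets stuck.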
This document provides an overview of data analysis techniques including analysis of variance (ANOVA), regression, correlation, and multivariate statistical analysis. It discusses understanding and interpreting ANOVA, regression, correlation matrices, and exploring factor analysis, multiple discriminant analysis, and cluster analysis. The document also provides examples of interpreting statistical output from ANOVA, regression, and correlation analysis.
This document discusses the Plackett-Burman method for optimizing media in industrial processes. It begins by introducing media optimization and some traditional methods. It then describes the Plackett-Burman design, which allows for testing multiple variables simultaneously using a minimal number of experiments. The method involves designing experiments that test variables at high and low levels. The effects of each variable are then calculated and analyzed using statistical tests to identify the most important factors to optimize. Key advantages of the Plackett-Burman method are that it allows evaluation of many variables with fewer experiments than traditional methods.
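The 8-run Plackett-Burman design for up to 7 two-level factors can be constructed from the standard cyclic generator row; this is a sketch of the design construction only, with the statistical analysis of effects left out.

```python
import numpy as np

# Standard Plackett-Burman generator for N = 8 runs: + + + - + - -.
# Rows 1-7 are cyclic shifts of the generator; a final row of all -1
# completes the design, so 7 factors are screened in just 8 runs.
gen = np.array([1, 1, 1, -1, 1, -1, -1])
rows = [np.roll(gen, i) for i in range(7)]
rows.append(-np.ones(7, dtype=int))
design = np.array(rows)  # shape (8, 7): 8 runs x 7 factors

# Orthogonality check: factor columns are mutually uncorrelated, which
# is what lets each main effect be estimated independently.
gram = design.T @ design
print(np.array_equal(gram, 8 * np.eye(7, dtype=int)))  # True
```

Each factor's effect is then estimated as the difference between its mean response at the high and low levels, and the largest effects identify the variables worth optimizing further.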
Optimization techniques in pharmaceutical processing (Rohit)
The document discusses various optimization techniques used in pharmaceutical formulation and processing. It describes classical optimization techniques like calculus that find maxima and minima of functions. It also discusses applied methods like evolutionary operations, the simplex method, the Lagrangian method, and search methods. These methods are used to optimize multiple variables to develop formulations that meet specifications for properties like hardness, dissolution rate, and friability. Graphical techniques like response surfaces and contour plots are used to visualize optimization problems and solutions.
Business plan of an innovative patent Rocket R1 (Paolo Balasso)
This Business Plan was written by:
Paolo Balasso
Mattia Meggiorin
Christian Sanson
It aims to estimate the profitability of an innovative patent using analytical methods such as Conjoint Analysis (carried out in R), and to evaluate market positioning and product demand estimation.
If you are interested in the patent, please feel free to contact the owner at this link:
http://www.asap-firmware.com
If you are interested in commissioning a market research study like the one this paper provides, please write to this email:
p.balasso@hotmail.it
Business Plan - Analysis of Patent Opportunities (Paolo Balasso)
Market Analysis, Strategy Identification, and Competitive Analysis of a firm holding a patent for a localizer designed for bicycles. An economic and financial analysis has been conducted in order to assess the viability and potential gains of this business.
This analysis has been done applying the knowledge developed during the course of statistic methods and applications to a real business. A conjoint analysis was conducted to estimate the partial worths of the different features of running shoes. This analysis helps to figure out which attribute and level is most important according to the constumer's view and costumize the offer of the firm considering the different interests of the market. It allows to orientate the product to a target market.
Discrete event simulation presentation using Anylogic software. A food distribution, organized by Caritas association, has been modeled in order to
significantly improve its volunteering activity. University project. In Italian
The document provides an overview of new products from Doosan, including two new wheel loaders, three new material handlers, and a new power tilting coupler. The wheel loaders feature a hydrostatic transmission system for the first time and offer improved fuel efficiency, enhanced control, and selectable power modes. The two new material handlers are designed for scrap handling and provide better visibility, reduced fuel consumption, versatile attachments, and meet Tier 4 standards. The new power tilting coupler allows excavator buckets to tilt 90 degrees left or right for increased flexibility.
La topología de red se refiere a la configuración física de cómo los dispositivos en una red están conectados entre sí. Existen varios tipos de topologías de red, incluyendo topología de bus, donde todos los dispositivos se conectan a una línea de transmisión común; topología de anillo, donde los dispositivos se comunican en secuencia formando un bucle; y topología de estrella, donde los dispositivos se conectan a un concentrador central.
Este documento muestra un ejemplo básico de código HTML para darle color de fondo rosado a una página web. Se incluye la etiqueta <body> y el atributo "style" para establecer el color de fondo como "pink".
El documento describe cómo crear formularios en HTML utilizando la etiqueta <form> para agrupar los diferentes elementos de entrada como cuadros de texto, botones y listas desplegables. Incluye dos ejemplos de formularios, uno más simple con diferentes controles de entrada y otro más complejo con una tabla que contiene los campos para registrar datos de empleados. Finalmente, pide crear un formulario más complejo de registro de clientes utilizando todas las etiquetas vistas hasta el momento.
This document summarizes Doosan equipment used by two large infrastructure contractors, Branch Highways and H&E Equipment. It describes how Branch Highways relied on Doosan excavators to handle an increased workload of 35-40 projects in 2015, praising the machines' versatility, reliability, and productivity. It provides details on two specific Doosan excavators used on large highway and bridge projects in Virginia. The summary highlights how H&E Equipment was impressed by the performance of initial Doosan machines, leading them to purchase additional Doosan equipment to meet jobsite needs.
The document discusses new products from Doosan, including new articulated dump trucks and expanded excavator attachments. Doosan introduces two new hydraulic breakers and details its 48-hour parts guarantee. The magazine provides construction equipment field test results, success stories from Doosan customers, and specifications for Doosan articulated dump trucks, excavators, wheel loaders, and log loaders.
The document provides information on new material handling equipment from Doosan, including three new material handlers and grapples for handling various materials. The new models - the DX225MH-3, DX300MH-5, and DX210WMH - are purpose-built for sorting and handling waste and recyclables. They have features like elevated cabs for visibility, straight booms for reach, and grapples for grasping materials. The machines offer various power modes for different applications and come with a three-year telematics subscription for monitoring.
The document discusses new products from Doosan, including two new wheel loaders, five new crawler excavators and three new wheel excavators. It also profiles a new log loader and clamp attachment for excavators. The wheel loaders and excavators are equipped with features for improved fuel efficiency, visibility, serviceability and versatility. The log loader provides more horsepower, weight and durability for logging tasks.
This document provides an overview of statistics used in meta-analysis. It discusses key concepts like odds ratios, relative risk, confidence intervals, heterogeneity, and fixed and random effects models. It also summarizes different types of meta-analyses including realist reviews, meta-narrative reviews, and network meta-analyses. Software for performing meta-analyses and potential pitfalls in systematic reviews are also briefly covered.
Errors in chemical analysis can be random or systematic. Random errors cause imprecise results while systematic errors lead to inaccurate results by introducing bias. Common sources of systematic error include faulty instrumentation, non-ideal chemical behaviors in analytical methods, and personal biases of experimenters. Systematic errors can be detected through frequent calibration of instruments, analysis of reference standards, independent verification methods, blank determinations, and evaluation of results from varying sample sizes. Controlling for systematic errors is important for obtaining reliable analytical data.
Statistics applied to the interdisciplinary areas of marketingCarol Hargreaves
Optimising price and marketing mix.
Concept of learning. When an account/product has too little sales data, bayesian shrinkage allows us to borrow information from other accounts.
Deals with outliers, by shrinking estimates towards each other.
Allows one hierarchical model instead of multiple models.
More robust, stable estimates with significant regional and account variation in estimates that cannot be done in a classical linear model.
Provides price elasticity measure that shows the impact of price changes on volume
The document discusses different sampling techniques used in research. It describes probability sampling methods like simple random sampling, systematic sampling, stratified random sampling, and multistage cluster sampling which ensure that each population element has a known chance of selection. It also covers non-probability sampling which uses arbitrary selection. Key advantages of probability sampling include controlling for bias and representing the population, while non-probability sampling has lower costs. Sample size is based on desired precision, population variability, and confidence level.
Descriptive Statistics and Interpretation Grading GuideQNT5.docxtheodorelove43763
This document outlines a sampling and data collection plan to test whether implementing a Total Quality Management (TQM) system will increase product quality at PhoenixSolar. The target population includes production workers, managers, engineers, technicians, and customers who will provide insights through focus groups and surveys. A sample size of 385 is needed for a 95% confidence level. Internal employees will participate in exploratory focus groups, while external groups like technicians and customers will complete paper, email, and installation surveys. Validity, reliability, and privacy protocols are defined. The plan is to analyze responses over six months to determine if TQM increases quality and customer satisfaction at PhoenixSolar.
Data Analysis Of An Analytical Method Transfer ToDwayne Neal
To provide the basis for a PDA task force discussion to arrive at a consensus of best industry practices for data analysis of method transfers. The discussion is also relevant to method validation activities.
Sensitivity Analysis, Optimal Design, Population Modeling.pptxAditiChauhan701637
Sensitivity analysis is the study of the unreliability related to output and input of mathematical model or numerical system which can be divided and allocated to various sources.
The process of outcome under possible speculation to find out the impact of a variable under sensitivity analysis can be useful for a range of purpose, consisting -
1. In the existence of unreliability, prefer testing of the results of a model or system.
2. Enhanced understanding of correlation between input and output variables in a model or system.
Sensitivity analysis methods:
There are many number of methods to study the sensitivity analysis, many of which have been developed to address one or more of the limitations discussed above. By the type sensitivity analysis measurement they are differentiate, be it based on variance decompositions, partial derivatives or elementary effects.
The application-of-fmea-to-a-medication-reconciliation-process-12337721246904...Sathish Kumar
The document discusses applying Failure Mode and Effects Analysis (FMEA) to improve a hospital's medication reconciliation process upon patient admission. FMEA is a systematic method used in high-risk industries to identify potential failures, reduce risks, and document the process. A team used FMEA to analyze the medication reconciliation process, identifying failure modes like inaccurate medication histories. They calculated risk priority numbers and prioritized issues, then recommended process improvements like a single shared medication list and formalized reconciliation. Lessons included the importance of leadership, staff buy-in, and following through on recommendations.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
Un tentativo di utilizzare il Controllo di Qualità Inter Laboratorio (Allargato) per definire e misurare la Qualità delle prestazioni Analitiche; in questo caso nell'ambito dello Screening del Sangue Occulto nelle Feci
Assignment Pharmacoeconomics Fatma Adel SolimanAsia Smith
This document provides an overview of decision analysis and Markov modeling. It discusses the steps in decision analysis, including identifying the decision, specifying alternatives, drawing the decision tree structure, specifying costs/outcomes/probabilities, performing calculations, and conducting sensitivity analysis. The key steps in Markov modeling are also outlined, such as choosing health states, determining transitions between states, setting the cycle length/number, estimating transition probabilities, and calculating costs and outcomes. The advantages of Markov modeling include flexibility and modeling dynamic systems, while disadvantages include greater complexity and the memoryless assumption.
Harmonization of Laboratory Testing, 08 04-2017Ola Elgaddar
1) Harmonization of laboratory testing aims to make test results from different laboratories comparable by adjusting for differences in measurement procedures and methods.
2) Factors like reproducibility, linearity, heterogeneity, calibration, commutability, stability, sustainability, and value assignment of measurement procedures and reference materials can affect the possibility and success of harmonization efforts.
3) Global initiatives are working to prioritize measurands, coordinate harmonization activities, and promote the standardization and statistical harmonization of test results.
Quality assurance and quality control programs are necessary to ensure the reliability and accuracy of analytical environmental data. An inter-laboratory study by the EPA showed wide variation in nutrient concentration measurements between laboratories. Measurement of total dissolved solids and electrical conductivity also showed significant variation between laboratories. Shewhart control charts can be used to monitor the statistical control of analytical procedures and identify sources of random and systematic error by tracking the spread and displacement of results from control samples over time. Control limits on the charts indicate thresholds for corrective action to maintain method accuracy.
Quality assurance and quality control programs are necessary to ensure the validity and reliability of analytical environmental data. Several studies have shown large variations in results for identical samples analyzed by different laboratories. AQC programs establish procedures for sample collection, analysis, calibration, quality control checks, and data reporting. Key aspects include standard methods, analyst training, instrument maintenance, calibration verification, internal quality control samples, and inter-laboratory sample exchanges to check for accuracy. Control charts can be used to monitor results and identify any loss of statistical control that could indicate errors have been introduced. Both precision and accuracy are important to consider when evaluating results.
Quality assurance and quality control programs are necessary to ensure the reliability and accuracy of analytical environmental data. An inter-laboratory study by the EPA showed wide variation in nutrient concentration measurements between laboratories. Measurement of total dissolved solids and electrical conductivity also showed significant variation between laboratories. Shewhart control charts can be used to monitor the statistical control of analytical procedures and identify issues by tracking results from quality control samples against mean values and standard deviations. Key aspects of a quality assurance program include sample handling procedures, standardized analytical methods, analyst training, instrument maintenance, calibration procedures, analytical quality control tests, data management, and control chart monitoring.
Optimization techniques in Pharmaceutical formulation and processing AREEBA SHAFIQ
This document provides an overview of optimization techniques used in pharmaceutical formulation and processing. It discusses key concepts like the definition of optimization, advantages of systematic optimization approaches, and common optimization parameters. It also describes important terms in optimization like factors, effects, interactions, coding, and confounding. Different experimental design approaches are explained, including factorial designs, fractional factorial designs, Plackett-Burman designs, central composite designs, Box-Behnken designs, and mixture designs. The document concludes by noting the seven main steps involved in the optimization methodology for drug delivery systems.
This document discusses audit sampling techniques including:
1. Audit sampling involves examining less than 100% of items in a population to obtain evidence about the population's characteristics. Key is selecting a representative sample.
2. Statistical sampling uses probability theory to evaluate results and measure sampling risk, making it more defensible than non-statistical sampling.
3. When planning samples, auditors must consider objectives, populations, possible stratification, sampling units, and desired reliability. Sample sizes are chosen based on tolerable deviation/error rates and allowable risks.
This document discusses key concepts related to sampling and optimization in research. It begins by defining a population and sample, and explains why sampling is commonly used. It then covers probability and non-probability sampling methods. The document also discusses factors that influence sample representativeness such as sampling procedure, sample size, and participation rates. Finally, it introduces the concept of optimization and describes how it is applied in research to refine formulations and processes.
2020 trends in biostatistics what you should know about study design - slid...nQuery
2020 Trends In Biostatistics - What you should know about study design.
In this free webinar you will learn about:
-Adaptive designs in confirmatory trials
-Using external data in study planning
-Innovative designs in early-stage trials
To watch the full webinar:
https://www.statsols.com/webinar/2020-trends-in-biostatistics-what-you-should-know-about-study-design
Similar to Balasso paolo tesi di laurea magistrale in ingegneria gestionale (20)
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...shadow0702a
This document serves as a comprehensive step-by-step guide on how to effectively use PyCharm for remote debugging of the Windows Subsystem for Linux (WSL) on a local Windows machine. It meticulously outlines several critical steps in the process, starting with the crucial task of enabling permissions, followed by the installation and configuration of WSL.
The guide then proceeds to explain how to set up the SSH service within the WSL environment, an integral part of the process. Alongside this, it also provides detailed instructions on how to modify the inbound rules of the Windows firewall to facilitate the process, ensuring that there are no connectivity issues that could potentially hinder the debugging process.
The document further emphasizes on the importance of checking the connection between the Windows and WSL environments, providing instructions on how to ensure that the connection is optimal and ready for remote debugging.
It also offers an in-depth guide on how to configure the WSL interpreter and files within the PyCharm environment. This is essential for ensuring that the debugging process is set up correctly and that the program can be run effectively within the WSL terminal.
Additionally, the document provides guidance on how to set up breakpoints for debugging, a fundamental aspect of the debugging process which allows the developer to stop the execution of their code at certain points and inspect their program at those stages.
Finally, the document concludes by providing a link to a reference blog. This blog offers additional information and guidance on configuring the remote Python interpreter in PyCharm, providing the reader with a well-rounded understanding of the process.
Blood finder application project report (1).pdfKamal Acharya
Blood Finder is an emergency time app where a user can search for the blood banks as
well as the registered blood donors around Mumbai. This application also provide an
opportunity for the user of this application to become a registered donor for this user have
to enroll for the donor request from the application itself. If the admin wish to make user
a registered donor, with some of the formalities with the organization it can be done.
Specialization of this application is that the user will not have to register on sign-in for
searching the blood banks and blood donors it can be just done by installing the
application to the mobile.
The purpose of making this application is to save the user’s time for searching blood of
needed blood group during the time of the emergency.
This is an android application developed in Java and XML with the connectivity of
SQLite database. This application will provide most of basic functionality required for an
emergency time application. All the details of Blood banks and Blood donors are stored
in the database i.e. SQLite.
This application allowed the user to get all the information regarding blood banks and
blood donors such as Name, Number, Address, Blood Group, rather than searching it on
the different websites and wasting the precious time. This application is effective and
user friendly.
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024Sinan KOZAK
Sinan from the Delivery Hero mobile infrastructure engineering team shares a deep dive into performance acceleration with Gradle build cache optimizations. Sinan shares their journey into solving complex build-cache problems that affect Gradle builds. By understanding the challenges and solutions found in our journey, we aim to demonstrate the possibilities for faster builds. The case study reveals how overlapping outputs and cache misconfigurations led to significant increases in build times, especially as the project scaled up with numerous modules using Paparazzi tests. The journey from diagnosing to defeating cache issues offers invaluable lessons on maintaining cache integrity without sacrificing functionality.
Mechatronics is a multidisciplinary field that refers to the skill sets needed in the contemporary, advanced automated manufacturing industry. At the intersection of mechanics, electronics, and computing, mechatronics specialists create simpler, smarter systems. Mechatronics is an essential foundation for the expected growth in automation and manufacturing.
Mechatronics deals with robotics, control systems, and electro-mechanical systems.
Software Engineering and Project Management - Introduction, Modeling Concepts...Prakhyath Rai
Introduction, Modeling Concepts and Class Modeling: What is Object orientation? What is OO development? OO Themes; Evidence for usefulness of OO development; OO modeling history. Modeling
as Design technique: Modeling, abstraction, The Three models. Class Modeling: Object and Class Concept, Link and associations concepts, Generalization and Inheritance, A sample class model, Navigation of class models, and UML diagrams
Building the Analysis Models: Requirement Analysis, Analysis Model Approaches, Data modeling Concepts, Object Oriented Analysis, Scenario-Based Modeling, Flow-Oriented Modeling, class Based Modeling, Creating a Behavioral Model.
Null Bangalore | Pentesters Approach to AWS IAMDivyanshu
#Abstract:
- Learn more about the real-world methods for auditing AWS IAM (Identity and Access Management) as a pentester. So let us proceed with a brief discussion of IAM as well as some typical misconfigurations and their potential exploits in order to reinforce the understanding of IAM security best practices.
- Gain actionable insights into AWS IAM policies and roles, using hands on approach.
#Prerequisites:
- Basic understanding of AWS services and architecture
- Familiarity with cloud security concepts
- Experience using the AWS Management Console or AWS CLI.
- For hands on lab create account on [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
# Scenario Covered:
- Basics of IAM in AWS
- Implementing IAM Policies with Least Privilege to Manage S3 Bucket
- Objective: Create an S3 bucket with least privilege IAM policy and validate access.
- Steps:
- Create S3 bucket.
- Attach least privilege policy to IAM user.
- Validate access.
- Exploiting IAM PassRole Misconfiguration
-Allows a user to pass a specific IAM role to an AWS service (ec2), typically used for service access delegation. Then exploit PassRole Misconfiguration granting unauthorized access to sensitive resources.
- Objective: Demonstrate how a PassRole misconfiguration can grant unauthorized access.
- Steps:
- Allow user to pass IAM role to EC2.
- Exploit misconfiguration for unauthorized access.
- Access sensitive resources.
- Exploiting IAM AssumeRole Misconfiguration with Overly Permissive Role
- An overly permissive IAM role configuration can lead to privilege escalation by creating a role with administrative privileges and allow a user to assume this role.
- Objective: Show how overly permissive IAM roles can lead to privilege escalation.
- Steps:
- Create role with administrative privileges.
- Allow user to assume the role.
- Perform administrative actions.
- Differentiation between PassRole vs AssumeRole
Try at [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
Applications of artificial Intelligence in Mechanical Engineering.pdfAtif Razi
Historically, mechanical engineering has relied heavily on human expertise and empirical methods to solve complex problems. With the introduction of computer-aided design (CAD) and finite element analysis (FEA), the field took its first steps towards digitization. These tools allowed engineers to simulate and analyze mechanical systems with greater accuracy and efficiency. However, the sheer volume of data generated by modern engineering systems and the increasing complexity of these systems have necessitated more advanced analytical tools, paving the way for AI.
AI offers the capability to process vast amounts of data, identify patterns, and make predictions with a level of speed and accuracy unattainable by traditional methods. This has profound implications for mechanical engineering, enabling more efficient design processes, predictive maintenance strategies, and optimized manufacturing operations. AI-driven tools can learn from historical data, adapt to new information, and continuously improve their performance, making them invaluable in tackling the multifaceted challenges of modern mechanical engineering.
Accident detection system project report.pdfKamal Acharya
The Rapid growth of technology and infrastructure has made our lives easier. The
advent of technology has also increased the traffic hazards and the road accidents take place
frequently which causes huge loss of life and property because of the poor emergency facilities.
Many lives could have been saved if emergency service could get accident information and
reach in time. Our project will provide an optimum solution to this draw back. A piezo electric
sensor can be used as a crash or rollover detector of the vehicle during and after a crash. With
signals from a piezo electric sensor, a severe accident can be recognized. According to this
project when a vehicle meets with an accident immediately piezo electric sensor will detect the
signal or if a car rolls over. Then with the help of GSM module and GPS module, the location
will be sent to the emergency contact. Then after conforming the location necessary action will
be taken. If the person meets with a small accident or if there is no serious threat to anyone’s
life, then the alert message can be terminated by the driver by a switch provided in order to
avoid wasting the valuable time of the medical rescue team.
Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...Transcat
Join us for this solutions-based webinar on the tools and techniques for commissioning and maintaining PV Systems. In this session, we'll review the process of building and maintaining a solar array, starting with installation and commissioning, then reviewing operations and maintenance of the system. This course will review insulation resistance testing, I-V curve testing, earth-bond continuity, ground resistance testing, performance tests, visual inspections, ground and arc fault testing procedures, and power quality analysis.
Fluke Solar Application Specialist Will White is presenting on this engaging topic:
Will has worked in the renewable energy industry since 2005, first as an installer for a small east coast solar integrator before adding sales, design, and project management to his skillset. In 2022, Will joined Fluke as a solar application specialist, where he supports their renewable energy testing equipment like IV-curve tracers, electrical meters, and thermal imaging cameras. Experienced in wind power, solar thermal, energy storage, and all scales of PV, Will has primarily focused on residential and small commercial systems. He is passionate about implementing high-quality, code-compliant installation techniques.
Zener Diode and its V-I Characteristics and Applications
Balasso paolo tesi di laurea magistrale in ingegneria gestionale
1. UNIVERSITÀ DEGLI STUDI DI PADOVA
Facoltà di Ingegneria
Dipartimento di Tecnica e Gestione dei Sistemi Industriali
Master's thesis in Management Engineering (Tesi di Laurea Magistrale in Ingegneria Gestionale)
PARAMETRIC AND NONPARAMETRIC METHODS
APPLIED TO CONJOINT ANALYSIS
Supervisor: Prof. Luigi Salmaso
Co-supervisor: Prof. Devin Caughey
Co-supervisor: Prof. Teppei Yamamoto
Candidate: Paolo Balasso
Academic year 2015/2016
2. Index
Sections: INTRODUCTION, RATING CA, CHOICE-BASED CA, MARKET SEGMENTATION, CONCLUSIONS

INTRODUCTION OF CONJOINT ANALYSIS
- Data input and procedure
- Applications
PARAMETRIC CONJOINT ANALYSIS
- Limits and shortcomings
- Application to analyze a new patent
- Partial-worths estimation
- Market share estimation and sales forecasting
- Parametric bootstrap
NONPARAMETRIC CONJOINT ANALYSIS
- Average Marginal Component Effect (AMCE)
- FWER simulation
- Application to the food and beverage sector
CONCLUSIONS
3. Types of conjoint analysis

                            METRIC CONJOINT ANALYSIS      CHOICE-BASED CONJOINT ANALYSIS
Data required               Ratings or rankings           Choices within profiles
Parametric procedures       K-way ANOVA,                  Multinomial logit analysis
                            multiple regression
Nonparametric procedures    Average Marginal Component Effect (AMCE), permutation methods
4. Parametric methods
Anti-theft patent for bicycles

A rating marketing experiment was applied for a company interested in evaluating its patent: an anti-theft product for bicycles with an innovative characteristic.

Three attributes were taken into account:

Integration (a characteristic that keeps the GPS device safe from the burglar), with three levels:
- Fully integrated
- External, camouflaged
- External, visible

Maintenance/installation (a characteristic about charging the battery), with three levels:
- Difficult, technician needed
- Difficult, no technician needed
- Easy

Sound alarm (presence of a sound alarm), with two levels:
- Yes: the alarm is present
- No: the alarm is not present

The goal: to figure out whether full integration and the insertion of an alarm could be a competitive advantage allowing the firm to gain a higher market share.
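In metric conjoint analysis the part-worths of these levels can be estimated by regressing ratings on dummy-coded attributes. A minimal sketch of that step (Python with NumPy rather than the R used in the thesis; the full-factorial design and the "true" part-worths below are invented for illustration):

```python
import numpy as np
from itertools import product

# Hypothetical full-factorial design: 3 x 3 x 2 = 18 profiles.
integration = ["full", "ext_camouflaged", "ext_visible"]
maintenance = ["difficult_tech", "difficult_no_tech", "easy"]
alarm = ["yes", "no"]
profiles = list(product(integration, maintenance, alarm))

def dummy_code(profile):
    # Reference levels (coded all-zero): ext_visible, easy, no alarm.
    integ, maint, al = profile
    return [1.0,                                # intercept
            float(integ == "full"),
            float(integ == "ext_camouflaged"),
            float(maint == "difficult_tech"),
            float(maint == "difficult_no_tech"),
            float(al == "yes")]

X = np.array([dummy_code(p) for p in profiles])

# Invented "true" part-worths, used only to simulate one respondent's ratings.
beta_true = np.array([5.0, 2.0, 1.0, -1.5, -0.5, 1.2])
rng = np.random.default_rng(0)
y = X @ beta_true + rng.normal(0.0, 0.3, size=len(profiles))

# OLS recovers the part-worths (up to sampling noise).
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta_hat, 2))
```

Each estimated coefficient is the utility gain of a level relative to its reference level, which is exactly what the relative-importance comparisons on the following slides are built on.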
6. Parametric methods: Example
Assumptions and diagnostics

"Most statistical tests rely upon certain assumptions about the variables used in the analysis. When these assumptions are not met the results may not be trustworthy, resulting in a Type I or Type II error, or over- or under-estimation of significance or effect size(s)."
Osborne, Jason & Elaine Waters, North Carolina State University and University of Oklahoma

The data indicate that the assumptions of normality and homoscedasticity may be violated; this is confirmed by the diagnostic procedure.
7. Nonparametric methods
A new permutation method

Run a regression for each respondent and store the obtained estimates. This approach does not require normality or homoscedasticity, only the more relaxed assumption of exchangeability. The method is proposed by Finos in "Permutation tests for between-unit fixed effects in multivariate generalized linear mixed models" (2014).

P-values:

                         Sign test     Wilcoxon
(Intercept)              0.00e-16      3.61e-06
Full-integ               0.00e-16      3.78e-06
External-camouflaged     2.26e-10      6.98e-06
Complex-technician       7.05e-12      4.16e-06
Complex-no-technician    1.562e-03     1.18e-02
Sound-alarm-yes          4.74e-09      6.66e-06
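The per-respondent procedure can be sketched as follows (Python with SciPy rather than the thesis's R; the design matrix, sample sizes, and coefficients are all invented): fit one OLS regression per respondent, then apply distribution-free sign and Wilcoxon signed-rank tests to each column of stored estimates.

```python
import numpy as np
from scipy.stats import wilcoxon, binomtest

rng = np.random.default_rng(1)
n_resp, n_prof = 40, 18          # hypothetical panel: 40 respondents, 18 profiles

# Hypothetical design matrix: intercept plus two binary attribute dummies.
X = np.column_stack([np.ones(n_prof),
                     rng.integers(0, 2, n_prof),
                     rng.integers(0, 2, n_prof)]).astype(float)

beta_pop = np.array([5.0, 1.0, 0.0])   # the second attribute effect is truly null

# Step 1: one regression per respondent, storing the coefficient estimates.
betas = np.empty((n_resp, 3))
for i in range(n_resp):
    beta_i = beta_pop + rng.normal(0.0, 0.2, 3)     # respondent heterogeneity
    y = X @ beta_i + rng.normal(0.0, 0.5, n_prof)
    betas[i], *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 2: distribution-free tests on each column of stored estimates.
pvals = {}
for j, name in enumerate(["intercept", "effect1", "effect2"]):
    n_pos = int((betas[:, j] > 0).sum())
    pvals[name] = {"sign": binomtest(n_pos, n_resp, 0.5).pvalue,
                   "wilcoxon": wilcoxon(betas[:, j]).pvalue}
    print(name, pvals[name])
```

Because the tests operate on the signs and ranks of the respondent-level estimates, no normality or homoscedasticity of the ratings is needed, mirroring the sign-test and Wilcoxon columns in the table.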
8. Parametric methods-Example
Market share – Parametric bootstrap
In order to incorporate uncertainty into the model, we ran a simulation in which, in each loop, the beta vector is drawn using the estimates and the standard errors of the betas.
y_ij(s) = x_j' beta(s) + e_ij(s)

where y_ij(s) is the rating of product j by respondent i in simulation s, x_j is a vector of 0/1 dummy variables, beta(s) are coefficients drawn for each simulation from generated normal distributions, and e_ij(s) are error terms drawn from a generated normal distribution for each simulation. For each simulation, the market share (MKS) of the products is then calculated.
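A minimal sketch of this parametric bootstrap, assuming hypothetical estimates, standard errors, product profiles, and a first-choice rule for turning simulated ratings into a market share (none of the numbers below are the thesis values):

```python
# In each simulation s, draw beta(s) from normal distributions centred
# at the estimates with the estimated standard errors, draw error terms,
# compute simulated ratings, and derive a market share via a
# first-choice rule. The result is a distribution of market shares.
import numpy as np

rng = np.random.default_rng(1)
beta_hat = np.array([5.0, 1.5, 0.8, -0.5])   # hypothetical estimates
beta_se  = np.array([0.3, 0.2, 0.2, 0.2])    # hypothetical std. errors
sigma    = 1.0                               # hypothetical residual sd
# Rows: competing products as 0/1 dummy profiles (intercept + 3 dummies)
products = np.array([[1, 1, 0, 0],
                     [1, 0, 1, 0],
                     [1, 0, 0, 1]])
n_sims, n_resp = 2000, 100

shares = np.zeros((n_sims, len(products)))
for s in range(n_sims):
    beta_s = rng.normal(beta_hat, beta_se)                 # beta(s)
    eps = rng.normal(0, sigma, size=(n_resp, len(products)))
    ratings = products @ beta_s + eps                      # y_ij(s)
    choice = ratings.argmax(axis=1)                        # first-choice rule
    shares[s] = np.bincount(choice, minlength=len(products)) / n_resp

print("mean market shares:", shares.mean(axis=0))
```

Instead of a point estimate per product, `shares` holds a full distribution of market shares that can be used to build different scenarios.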
9. Average Marginal Component Effect (AMCE)
Advantages
- Weaker assumptions than the other usual methods: AMCE does not require normality or homoscedasticity.
- Randomization of the profiles across respondents: the randomized design replaces the fractional and orthogonal designs typical of other approaches, which confound the interaction effects.
- AMCE allows one to choose the distribution of the treatment components actually used in the experiment, making it possible to create a design that simulates the real-world distribution of the treatment.
Shortcomings
- Its statistical properties need to be tested further.
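Under a fully randomized design, the AMCE for a given attribute level can be estimated by a simple regression of the binary choice on dummy-coded levels (the regression estimator of Hainmueller, Hopkins & Yamamoto, 2014). A sketch on synthetic data, with hypothetical attributes and effect sizes:

```python
# With independently randomized attributes, the OLS coefficient on each
# level dummy estimates that level's AMCE: the average change in the
# probability of choosing a profile when the attribute moves from the
# baseline to that level, marginalizing over the other attributes.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
# Independently randomized binary attributes (hypothetical names)
organic = rng.integers(0, 2, n)
crunchy = rng.integers(0, 2, n)
# Binary choices generated with true AMCEs of 0.20 and 0.10
p = 0.4 + 0.20 * organic + 0.10 * crunchy
chosen = (rng.random(n) < p).astype(float)

X = np.column_stack([np.ones(n), organic, crunchy])
amce, *_ = np.linalg.lstsq(X, chosen, rcond=None)
print("AMCE(organic):", round(amce[1], 3))
print("AMCE(crunchy):", round(amce[2], 3))
```

The estimated coefficients recover the true average marginal component effects without any distributional assumption on the errors.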
10. Average Marginal Component Effect (AMCE)
The Family Wise Error Rate (FWER) is the probability of making one or more Type I errors over the whole set of considered hypotheses (Marcus et al., 1976). If the FWER equals alpha (in this case set to 0.05), the test can be considered exact. Note that the values are higher, especially when interactions are considered. Corrections for multiplicity are useful to reduce the FWER, so further simulations were conducted implementing the Bonferroni, Holm, Hochberg, Benjamini-Hochberg and Benjamini-Yekutieli adjustments.
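The effect of a multiplicity correction on the FWER can be illustrated with a small simulation under the global null, here with the Holm step-down procedure (the number of hypotheses and simulations is illustrative):

```python
# Simulate m true null hypotheses tested at alpha = 0.05, with and
# without the Holm adjustment. Without correction the FWER approaches
# 1 - (1 - alpha)^m; Holm keeps it at or below alpha.
import numpy as np

rng = np.random.default_rng(3)
alpha, m, n_sims = 0.05, 20, 4000

def holm_reject(pvals, alpha):
    """Holm step-down: reject the k-th smallest p while p <= alpha/(m-k)."""
    order = np.argsort(pvals)
    reject = np.zeros(len(pvals), dtype=bool)
    for k, idx in enumerate(order):
        if pvals[idx] <= alpha / (len(pvals) - k):
            reject[idx] = True
        else:
            break
    return reject

raw_err = holm_err = 0
for _ in range(n_sims):
    p = rng.random(m)                      # p-values uniform under H0
    raw_err += (p < alpha).any()           # any false rejection, unadjusted
    holm_err += holm_reject(p, alpha).any()

print(f"FWER unadjusted: {raw_err / n_sims:.3f}")
print(f"FWER Holm:       {holm_err / n_sims:.3f}")
```

With m = 20 the unadjusted FWER is around 1 - 0.95^20, roughly 0.64, while the Holm-adjusted rate stays near alpha.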
11. Average Marginal Component Effect (AMCE)
[Figure: FWER adjustment procedures for main effects and for interaction effects, comparing Bonferroni-Holm, Benjamini-Hochberg and Benjamini-Yekutieli.]
12. Average Marginal Component Effect (AMCE)
CONJOINT ANALYSIS APPLIED TO THE FOOD AND BEVERAGE SECTOR
Choice-based marketing experiment in which an American granola producer is interested in figuring out what kind of product may obtain the highest market share and how the levels of each attribute affect the choice of purchasing the product.

Attribute     Levels
Price         $3.99, $5.99, $8.99
Organic       yes, no
Consistency   chewy, plain, crunchy
Taste         cereal, chocolate, coconut, strawberries

Attribute     Level         Estimate   Std. Err   z value     Pr(>|z|)    Significance   Holm adjust.
Consistency   plain         0.0392     0.005      69.273      4.29e-08    ***            8.58e-06
Consistency   crunchy       0.0899     0.006      141.066     3.46e-41    ***            1.38e-38
Organic       no            -0.1567    0.005      -277.191    4.11e-165   ***            3.29e-162
Price         $5.99         -0.0896    0.006      -147.767    2.07e-45    ***            1.04e-42
Price         $8.99         -0.1605    0.006      -257.044    1.04e-141   ***            6.27e-139
Taste         chocolate     0.1678     0.006      268.345     1.28e-154   ***            8.96e-152
Taste         coconut       0.0769     0.006      121.243     7.85e-30    ***            2.36e-27
Taste         strawberries  0.0563     0.008      65.856      4.53e-07    ***            4.53e-05

From the simulation, the Holm adjustment seems to provide good control of the Family Wise Error Rate.
13. MARKET SEGMENTATION
The general goal of market segmentation is to find groups of customers that differ in important ways associated with product interest, market participation, or response to marketing efforts. One way is to use a priori segmentations, as proposed in the paper "Market Segmentation with Choice-Based Conjoint Analysis" by Wayne S.
Steps:
- Collect a priori segmentation information for each respondent
- Choose a statistical approach to apply to the CA data (in our case, AMCE)
- Run the method for each a priori cluster and apply a multiplicity adjustment (Holm)
- Interpret the results
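The steps above can be sketched as a loop over the a priori segments, testing the effect within each segment and Holm-adjusting the resulting p-values. Segment labels, attributes, and effect sizes below are all synthetic:

```python
# Per-segment analysis: estimate the effect of one attribute level on
# choice within each a priori segment (here, a simple two-sample test
# on choice rates), then apply the Holm adjustment across segments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 4000
segment = np.where(rng.random(n) < 0.5, "healthy", "unhealthy")
organic = rng.integers(0, 2, n)
# Hypothetical segment-specific effects of the "organic" level
p = np.where(segment == "healthy", 0.35 + 0.25 * organic, 0.45 + 0.10 * organic)
chosen = (rng.random(n) < p).astype(float)

pvals, labels = [], []
for seg in ("healthy", "unhealthy"):
    mask = segment == seg
    a = chosen[mask][organic[mask] == 1]
    b = chosen[mask][organic[mask] == 0]
    t, pval = stats.ttest_ind(a, b)        # difference in choice rates
    pvals.append(pval)
    labels.append(seg)

# Holm adjustment across the segment-level p-values
order = np.argsort(pvals)
adj = np.empty(len(pvals))
for rank, idx in enumerate(order):
    adj[idx] = min(1.0, (len(pvals) - rank) * pvals[idx])
adj = np.maximum.accumulate(adj[order])[np.argsort(order)]  # monotone step-down
for seg, pv, ap in zip(labels, pvals, adj):
    print(f"{seg}: raw p = {pv:.2e}, Holm-adjusted p = {ap:.2e}")
```

The Holm-adjusted p-values per segment play the role of the "Holm adj." columns in the table below.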
Level          Holm adj. (Healthy)   Holm adj. (Unhealthy)
plain          9.89e-04              1.98e-09
crunchy        1.16e-10              1.76e-34
no             0.00e+00              1.55e-51
$5.99          2.42e-10              6.94e-41
$8.99          2.92e-31              1.96e-117
chocolate      3.74e-29              3.85e-134
coconut        2.26e-08              1.33e-19
strawberries   5.54e-173             4.75e-07
14. GUIDELINES FOR CA APPLICATIONS
Finally, we try to provide a best-practice guideline for a conjoint analysis experiment.
Procedures:
- Collect data from respondents using profiles with a randomized design
- Run choice-based CA with AMCE or an mnlogit model
- Apply the Holm adjustment for multiplicity
- Estimate market share and sales forecasting (B2B, B2C)
Tools or services:
- Cost for each response: 99c
- Open-source software
- www.revolutionanalytics.com/companies-using-r
15. CONCLUSIONS: NEW CONTRIBUTIONS TO CA
- AMCE method: it requires weaker assumptions and allows one to obtain more reliable outcomes
- Nonparametric approaches: a nonparametric (permutation) method used to validate the estimators of parametric approaches
- Bootstrap method: used to account for the uncertainty in market share estimations
Editor's Notes
Good morning. I will talk about parametric and nonparametric approaches applied to conjoint analysis, a topic I developed over about three months at MIT in Boston.
After briefly introducing this statistical method, I will discuss parametric approaches, which are very common and widely used by companies, then their limits and disadvantages, and finally focus on nonparametric methods. CA can be used for business analysis: it allows making forecasts and estimating sales, market share, and partial worths (partial utilities).
Conjoint analysis can be divided into two types, metric and choice-based, which use different types of input: respectively, ratings or rankings, and vectors of 1s and 0s indicating the choice or non-choice of a given profile. Parametric statistical procedures can then be applied: an ANOVA model for metric CA and a multinomial logit model for choice-based CA; among the nonparametric approaches there are permutation methods and finally AMCE, a model developed by MIT professors.
This is the application of a parametric CA approach to the analysis of a business in which a company was interested in assessing the profitability of one of its patents. In particular, the product was an anti-theft device for bicycles. We took three characteristics into account: integration, difficulty of installation, and sound alarm.
These are the results of the CA; in particular, this chart shows the partial utilities for each characteristic of the product. We can see that the presence of the sound alarm and the fully integrated characteristic are well regarded by consumers. If we then estimate the market share, we see that the product in question, "Rocket", reaches a good market share with its innovative characteristics, about 50%.
However, this approach requires restrictive assumptions which, if violated, can lead to erroneous estimates and therefore to wrong decisions. Here we see that the assumption of equal variances is violated. This is confirmed by the diagnostic procedure, in particular in the first chart. The second chart shows that the normality assumption may also be violated.
This is a method created to validate the estimates obtained from the parametric approach. A regression analysis is applied to each respondent. This yields a matrix in which each column contains the coefficients of all respondents for a given characteristic of the product. The Wilcoxon test is then applied to each column, yielding the p-values. Since they are all significant below the 0.05 threshold, we can validate all the estimates obtained from the previous model.
I will use the standard deviations and the estimates of the coefficients and residuals to generate the partial utilities and residuals for each simulation. For each simulation I will then compute the predicted values and hence the market shares. After 1000-2000 simulations, this is the final chart, in which I no longer have a point estimate of market share per product but a probability distribution that I can use to build different scenarios.
It does not require assumptions of homoscedasticity and normality, only a randomization of the profiles across respondents, which can be seen as an advantage. In fact, while in classical conjoint analysis methods respondents are given the same set of product profiles from which the partial utilities are derived, in this method the profiles are randomized per respondent, allowing inference on a more varied set of profiles.
The drawback of this method is that it is very recent and requires further validation.
It is the probability that, over the whole set of considered hypotheses, at least one true hypothesis is rejected. This chart shows the FWER as the attributes and attribute levels increase. Blue indicates a low value, brown a high one. As the number of factors considered increases, the FWER increases. In particular, if the interactions between attributes are also considered, this indicator quickly reaches one.
Here it becomes necessary to apply multiplicity correction procedures. Note how the first two procedures, Holm and Benjamini-Hochberg, provide a good correction. The third correction, Benjamini-Yekutieli, is instead too conservative, never rejecting H0; indeed, blue appears in all the cases considered.
AMCE is the nonparametric conjoint analysis statistical method invented by MIT professors last year. I applied this statistical method to the analysis of another business in the food and beverage sector, granola. We can see that all the attributes considered are significant, and therefore that all these characteristics influence the probability of purchasing the product, even after applying the Holm adjustment.