This document provides an overview of key concepts in statistics for management including data collection methods, primary and secondary data, sampling techniques, measures of central tendency, measures of dispersion, probability theory, probability distributions, the normal distribution, and hypothesis testing. It defines primary and secondary data collection methods. It also describes probability sampling techniques like simple random sampling, stratified sampling, and cluster sampling as well as non-probability techniques. Key concepts around measures of central tendency like the mean, median and mode are explained. The document also covers variance, standard deviation, and the normal distribution. It concludes with defining the null and alternative hypotheses.
Linear programming is a mathematical optimization technique used to maximize or minimize an objective function subject to constraints. It involves decision variables, an objective function that is a linear combination of the variables, and linear constraints. The key assumptions of linear programming are certainty, divisibility, additivity, and linearity. It allows improving decision quality through cost-benefit analysis and considers multiple possible solutions. However, it has disadvantages like fractional solutions, complex modeling, and inability to directly address time effects.
Quantitative techniques are statistical and programming methods that help decision makers analyze problems, especially business problems, using quantitative data. They have evolved from early applications in the 19th century to today where they are used widely. They can be classified into statistical techniques, which analyze collected data, and programming techniques, like linear programming, that model relationships to find optimal solutions. Quantitative techniques help businesses with tasks like resource allocation, strategy selection, and decision making. However, they have limitations like not accounting for intangible human factors.
Operational research emerged in 1885 when Frederick Taylor emphasized applying scientific analysis to production methods. Operational research systematically studies the basic structure, relationships, and functions of an organization using an interdisciplinary team approach from fields like statistics, engineering, and management. The goal is to help management make better decisions by developing mathematical models to quantitatively solve problems and find optimal solutions, though perfect answers may not always be possible. Computers are often needed to solve the complex mathematical models.
This is a presentation from video on 'Introduction to Operations Research' available at the end of this presentations and directly at https://youtu.be/PSOW3_gX2OU
Topics like Organisations of Operations Research, History of Operations Research Role of Operations Research(OR), Scope of Operations Research(OR), Characteristics of Operations Research(OR), Attributes of Operations Research(OR).
This video also talks about Models of Operations Research
• Degree of abstraction
o Mathematical models
o Language models
o Concrete models
• Function
o Descriptive models
o Predictive models
o Normative models
• Time Horizon
o Static models
o Dynamic models
• Structure
o Iconic or physical models
o Analog or schematic models
o Symbolic or mathematical models
• Nature of environment
o Deterministic models
o Probabilistic models
• Extent of generality
o General model
o Specific models
Decision theory deals with determining the optimal course of action when alternatives have uncertain consequences. There are several key concepts: decision alternatives are available options; states of nature are uncontrollable events; and payoff is the numerical outcome of alternatives and states. The decision process involves defining the problem, listing states, identifying alternatives, expressing payoffs, and applying a model to select the optimal alternative based on criteria. Decision making can occur under certainty, risk, or uncertainty depending on what is known about states and payoffs. Different techniques are used depending on the environment.
Social accounting is a process that applies accounting principles to measure a company's social and environmental performance. It identifies and quantifies a firm's social costs and benefits to determine if its strategies are consistent with social priorities and making information available to the public. Social accounting approaches include the classical profit-focused view, descriptive reporting of social activities, and indicators that measure net income plus contributions to human resources, the public, the environment and products/services. The objectives are to measure a company's social contribution and ensure its alignment with social welfare.
Questionnaire /Schedule design is a systematic approach/process of including relevant questions in a questionnaire in such a way that the best or accurate responses are obtained from respondent with very little / no discomfort on the part of the respondent as well as the enumerator.The most important part of the survey process is the creation of questions that accurately measure the opinions, experiences and behaviors of the target group. Questionnaire / Schedules design is one of the most critical stages in the survey research process and therefore has to be given the utmost attention. This power point presentation will guide you through schedules and questionnaire design.
This document provides an introduction to business statistics. It defines statistics as the science of collecting, organizing, analyzing, and interpreting numerical data. The document notes that statistics can refer to both quantitative information and the methods used to analyze that information. It describes the key stages of a statistical analysis: data collection, organization, presentation, analysis, and interpretation. The document also discusses whether statistics is a science or an art and the important functions of statistics like providing definiteness, enabling comparison, and aiding in prediction.
Linear programming is a mathematical optimization technique used to maximize or minimize an objective function subject to constraints. It involves decision variables, an objective function that is a linear combination of the variables, and linear constraints. The key assumptions of linear programming are certainty, divisibility, additivity, and linearity. It allows improving decision quality through cost-benefit analysis and considers multiple possible solutions. However, it has disadvantages like fractional solutions, complex modeling, and inability to directly address time effects.
Quantitative techniques are statistical and programming methods that help decision makers analyze problems, especially business problems, using quantitative data. They have evolved from early applications in the 19th century to today where they are used widely. They can be classified into statistical techniques, which analyze collected data, and programming techniques, like linear programming, that model relationships to find optimal solutions. Quantitative techniques help businesses with tasks like resource allocation, strategy selection, and decision making. However, they have limitations like not accounting for intangible human factors.
Operational research emerged in 1885 when Frederick Taylor emphasized applying scientific analysis to production methods. Operational research systematically studies the basic structure, relationships, and functions of an organization using an interdisciplinary team approach from fields like statistics, engineering, and management. The goal is to help management make better decisions by developing mathematical models to quantitatively solve problems and find optimal solutions, though perfect answers may not always be possible. Computers are often needed to solve the complex mathematical models.
This is a presentation from video on 'Introduction to Operations Research' available at the end of this presentations and directly at https://youtu.be/PSOW3_gX2OU
Topics like Organisations of Operations Research, History of Operations Research Role of Operations Research(OR), Scope of Operations Research(OR), Characteristics of Operations Research(OR), Attributes of Operations Research(OR).
This video also talks about Models of Operations Research
• Degree of abstraction
o Mathematical models
o Language models
o Concrete models
• Function
o Descriptive models
o Predictive models
o Normative models
• Time Horizon
o Static models
o Dynamic models
• Structure
o Iconic or physical models
o Analog or schematic models
o Symbolic or mathematical models
• Nature of environment
o Deterministic models
o Probabilistic models
• Extent of generality
o General model
o Specific models
Decision theory deals with determining the optimal course of action when alternatives have uncertain consequences. There are several key concepts: decision alternatives are available options; states of nature are uncontrollable events; and payoff is the numerical outcome of alternatives and states. The decision process involves defining the problem, listing states, identifying alternatives, expressing payoffs, and applying a model to select the optimal alternative based on criteria. Decision making can occur under certainty, risk, or uncertainty depending on what is known about states and payoffs. Different techniques are used depending on the environment.
Social accounting is a process that applies accounting principles to measure a company's social and environmental performance. It identifies and quantifies a firm's social costs and benefits to determine if its strategies are consistent with social priorities and making information available to the public. Social accounting approaches include the classical profit-focused view, descriptive reporting of social activities, and indicators that measure net income plus contributions to human resources, the public, the environment and products/services. The objectives are to measure a company's social contribution and ensure its alignment with social welfare.
Questionnaire /Schedule design is a systematic approach/process of including relevant questions in a questionnaire in such a way that the best or accurate responses are obtained from respondent with very little / no discomfort on the part of the respondent as well as the enumerator.The most important part of the survey process is the creation of questions that accurately measure the opinions, experiences and behaviors of the target group. Questionnaire / Schedules design is one of the most critical stages in the survey research process and therefore has to be given the utmost attention. This power point presentation will guide you through schedules and questionnaire design.
This document provides an introduction to business statistics. It defines statistics as the science of collecting, organizing, analyzing, and interpreting numerical data. The document notes that statistics can refer to both quantitative information and the methods used to analyze that information. It describes the key stages of a statistical analysis: data collection, organization, presentation, analysis, and interpretation. The document also discusses whether statistics is a science or an art and the important functions of statistics like providing definiteness, enabling comparison, and aiding in prediction.
Risk & Uncertainty in Managerial Decision Making (Managerial economics)Mateen Altaf
1) The document discusses risk and uncertainty in managerial decision making. It defines decision making as choosing among alternatives and outlines the decision making process.
2) It describes decisions under risk as having known probabilities and decisions under uncertainty as having unknown probabilities. Most major organizational decisions are made under uncertainty.
3) It provides a case study of Nirala Sweets, a Pakistani sweets company, and does a SWOT analysis identifying their strengths in quality, weaknesses in rewards and prices, and threats from competitors.
Melvin T Mathew presents on linear programming problems. A linear programming problem involves determining the optimal allocation of limited resources to meet objectives. It includes a set of simultaneous linear equations or inequalities that represent resource restrictions and a linear objective function expressing total profit or cost. Linear programming is defined as a method of determining an optimal program of interdependent activities given available resources, with the objective of maximizing profit or minimizing cost subject to constraints. The solution shows the optimal amounts to produce, sell, or purchase to satisfy objectives and constraints. Linear programming is a powerful technique that can be used to solve production scheduling, manufacturing, and marketing problems.
This document discusses criteria for good scaling in measurement. It defines key concepts like measurement, constructs, scales, and the primary scales of measurement - nominal, ordinal, interval, and ratio scales. It explains the meaning and purpose of scaling and scales. It also discusses criteria for good measurement like unidimensionality, validity, reliability, practicality, and sensitivity. It emphasizes that for a scale to be valid, it must also be reliable, so reliability is a necessary but insufficient condition for validity.
Different topics of management accountingkomal goyal
This document discusses responsibility centers and product life cycles. It defines responsibility centers as entities within an organization that are responsible for managing revenue, expenses, and investment funds. There are four main types of responsibility centers: cost centers, profit centers, investment centers, and revenue centers. The document also discusses target costing, which is setting a target cost for a product based on the desired selling price and profit. It outlines the objectives, applications, and limitations of using target costing. Additionally, the document covers value chain analysis and Porter's value chain model for analyzing a firm's activities to identify sources of competitive advantage. Finally, it defines transfer pricing as the price charged between different parts of the same company.
The document provides an overview of linear programming, including its applications, assumptions, and mathematical formulation. Some key points:
- Linear programming is a tool for maximizing or minimizing quantities like profit or cost, subject to constraints. 50-90% of business decisions and computations involve linear programming.
- Applications in business include production, personnel, inventory, marketing, financial, and blending problems. The objective is to optimize variables like costs, profits, or resources while meeting constraints.
- Assumptions of linear programming include certainty, linearity/proportionality, additivity, divisibility, non-negativity, finiteness, and optimality at corner points.
- A linear programming problem is modeled mathemat
This document discusses index numbers, which are statistical tools used to measure relative changes in variables such as prices or quantities over time. It defines index numbers and outlines their key features and types, including price, quantity, value, simple and composite index numbers. The document also describes several methods for constructing index numbers, such as Laspeyre's method, Paasche's method, Fisher's ideal method and consumer price indexes. Index numbers are expressed as percentages and measure the effect of changes over periods of time.
Operations Research - Meaning, Origin & CharacteristicsSundar B N
Operations research (OR) is a scientific approach to problem solving that uses quantitative analysis. It originated during World War II when the British military used empirical data and basic statistics to develop tactics. OR is characterized by its use of decision making, information technology, quantitative solutions, simulation, optimization, and interdisciplinary team-based work. It aims to uncover new problems and provide the best performance under given circumstances through mathematical modeling.
This document provides an overview of a course on managing decision under uncertainties taught by Dr. Elijah Ezendu. The learning objectives are to recognize the importance of managing decisions, identify the effects of risks and uncertainties on decisions, identify sources and levels of uncertainties, and manipulate uncertainties effectively while managing decisions. It defines certainty, risk, and uncertainty and differentiates between risk and uncertainty. It discusses various sources of uncertainty including demand structure, supply structure, competitors, internal forces, and time. It also covers levels of uncertainty, biases that hinder effective decision-making, handling uncertainties, limitations of tools like net present value analysis and real options, and the concept of decision profiling to examine past choices.
Cost Accounting Vs Management Accounting & Management Accounting Vs Financial...Uttar Tamang ✔
This Slide includes:
1. Cost Accounting Vs Management Accounting
2. Management Accounting Vs Financial Accounting
3. Types of Accounting
4. Difference between Cost, Management and Financial Accounting with basis
Questionnaire construction is presented by Prakash Aryal. Questionnaires can be used for primary research and involve asking respondents questions either in person or through mail/online surveys. Key steps in constructing a questionnaire include determining the type of survey, developing questions, organizing the question sequence and layout, and pilot testing. Questions should avoid ambiguity, bias, and double meanings. Both open-ended and closed-ended questions can be used, with closed-ended questions being easier to analyze but potentially limiting responses. The order and format of questions is also important to make the questionnaire smooth, logical and easy for respondents to follow.
Managerial Economics- Introduction,Characteristics and ScopePooja Kadiyan
This document provides an introduction to the scope of managerial economics. It defines managerial economics as the integration of economic theory with business practice to facilitate decision-making. The key areas covered in the scope of managerial economics include microeconomic analysis of the firm, acceptance and use of macroeconomic variables, a normative approach, and an emphasis on case studies. Microeconomics is applied to operational issues like production, costs, pricing, and investment. Macroeconomics is applied to the business environment, including factors like government policies, foreign trade, and the overall economic system.
Linear programming is an optimization technique used to maximize or minimize a linear objective function subject to linear constraints. It involves defining variables, constraints, and an objective function in terms of those variables. The optimal solution is found by systematically considering all extreme points of the feasible region defined by the constraints. Linear programming has wide applications in fields like production, transportation, finance, and resource allocation.
The document discusses economics and business economics. Economics is defined as the study of how individuals and groups allocate scarce resources. Business economics applies economic theories and techniques to solve business problems and aid management decision making. It uses micro and macroeconomic approaches to understand issues like demand, costs, profits, and external factors that influence business. The key aspects of business economics are demand forecasting, cost analysis, profit analysis, and capital management. Overall, the document outlines the basic concepts, scope, importance and determinants of demand within the field of business economics.
Non-sampling errors occur in surveys and censuses due to factors other than sampling and can happen at various stages:
- Specification errors occur during planning due to issues like incomplete population coverage or ambiguous questions.
- Ascertainment errors happen during data collection due to inaccurate recording or ambiguous instructions.
- Tabulation errors take place during analysis through mistakes in coding, analysis or presentation of results.
Some common sources of non-sampling errors include lack of proper planning, incomplete or inaccurate responses, ambiguous definitions, and errors in data processing. Measures like pre-testing questionnaires, hiring experienced staff, and cross-checking data can help reduce non-sampling errors.
Operations research is a scientific approach to problem solving and decision making that is useful for managing organizations. It has its origins in World War II and is now widely used in business and industry. Some key areas where operations research models are applied include forecasting, production scheduling, inventory control, and transportation. Models are an essential part of operations research and can take various forms like physical, mathematical, or conceptual representations of real-world problems. Models are classified in different ways such as by their structure, purpose, solution method, or whether they consider deterministic or probabilistic systems. Operations research techniques help solve complex business problems through mathematical analysis and support improved organizational performance.
The document discusses analyzing multivariate time series of five energy futures (crude oil, ethanol, gasoline, heating oil, natural gas) using vector autoregressive (VAR) and vector error correction (VEC) models. It finds the futures are cointegrated using Johansen and Engle-Granger tests, indicating they share a common stochastic trend. A VAR(1) model is estimated and found stable. The VEC model captures the error correction behavior as futures return to their long-run equilibrium. Forecasts are generated and limitations of the Engle-Granger approach discussed.
Forecasting and decision making are important for businesses to plan effectively amid risk and uncertainty. Economic forecasting helps businesses understand changes in the broader environment so they can formulate strategies. Demand forecasting also allows businesses to predict sales and allocate resources appropriately. Qualitative techniques like expert opinions and surveys, and quantitative techniques like time series analysis are commonly used for demand forecasting. The results of forecasting assist both businesses and governments in planning investments and policies.
This document discusses strategic business units (SBUs) and provides Dabur India Limited as a case study. It defines SBUs and their characteristics. Dabur's structure consists of multiple SBUs including consumer care, international business, and retail divisions. The document outlines Dabur's history, vision, markets, manufacturing units, and strategies around promotion, distribution, regional branding, and pricing. It analyzes Dabur's product portfolio using the BCG matrix and provides recommendations to focus on national branding and innovation. In conclusion, it states that Dabur has a strong distribution network and is one of India's most trusted FMCG companies with aggressive strategies to capture various market segments.
Decision making under condition of risk and uncertaintysapna moodautia
The document discusses decision making under conditions of certainty, risk, and uncertainty. It defines each condition and explains how the degree of knowledge affects the decision. Certainty exists when all alternatives and outcomes are known. Risk exists when probabilities can be estimated but information is imperfect. Uncertainty exists when probabilities cannot be estimated due to limited information. Modern approaches to decision making under uncertainty include risk analysis, decision trees, and preference theory.
Here are some potential sources of secondary data that could help analyze this business problem:
- Industry reports on the retail sector that provide market size, growth trends and future projections at the state and national level. This can help assess expansion opportunities.
- Demographic data from census providing population stats, income levels etc. of potential target cities which impact retail spending.
- Economic indicators like GDP, disposable income figures of different states. This helps compare market potential of states.
- Competitor analysis studying expansion strategies of other retail chains operating in similar segments.
- Feasibility studies or reports conducted previously for other retailers exploring new markets.
Collecting and analyzing such secondary data sources can help get an overview
This document provides information on conducting surveys. It discusses the different types of surveys, including cross-sectional and longitudinal surveys. It also describes various modes of survey administration like observation surveys, personal interviews, telephone interviews, mail surveys, and internet surveys, highlighting their advantages and disadvantages. Additionally, it covers key aspects of survey design like developing a questionnaire, sampling plan, and different types of questions (open-ended, closed-ended, dichotomous, multiple choice, rating scale, and rank order) that can be used in a survey. The document provides guidance on writing clear, unbiased questions for surveys.
Risk & Uncertainty in Managerial Decision Making (Managerial economics)Mateen Altaf
1) The document discusses risk and uncertainty in managerial decision making. It defines decision making as choosing among alternatives and outlines the decision making process.
2) It describes decisions under risk as having known probabilities and decisions under uncertainty as having unknown probabilities. Most major organizational decisions are made under uncertainty.
3) It provides a case study of Nirala Sweets, a Pakistani sweets company, and does a SWOT analysis identifying their strengths in quality, weaknesses in rewards and prices, and threats from competitors.
Melvin T Mathew presents on linear programming problems. A linear programming problem involves determining the optimal allocation of limited resources to meet objectives. It includes a set of simultaneous linear equations or inequalities that represent resource restrictions and a linear objective function expressing total profit or cost. Linear programming is defined as a method of determining an optimal program of interdependent activities given available resources, with the objective of maximizing profit or minimizing cost subject to constraints. The solution shows the optimal amounts to produce, sell, or purchase to satisfy objectives and constraints. Linear programming is a powerful technique that can be used to solve production scheduling, manufacturing, and marketing problems.
This document discusses criteria for good scaling in measurement. It defines key concepts like measurement, constructs, scales, and the primary scales of measurement - nominal, ordinal, interval, and ratio scales. It explains the meaning and purpose of scaling and scales. It also discusses criteria for good measurement like unidimensionality, validity, reliability, practicality, and sensitivity. It emphasizes that for a scale to be valid, it must also be reliable, so reliability is a necessary but insufficient condition for validity.
Different topics of management accountingkomal goyal
This document discusses responsibility centers and product life cycles. It defines responsibility centers as entities within an organization that are responsible for managing revenue, expenses, and investment funds. There are four main types of responsibility centers: cost centers, profit centers, investment centers, and revenue centers. The document also discusses target costing, which is setting a target cost for a product based on the desired selling price and profit. It outlines the objectives, applications, and limitations of using target costing. Additionally, the document covers value chain analysis and Porter's value chain model for analyzing a firm's activities to identify sources of competitive advantage. Finally, it defines transfer pricing as the price charged between different parts of the same company.
The document provides an overview of linear programming, including its applications, assumptions, and mathematical formulation. Some key points:
- Linear programming is a tool for maximizing or minimizing quantities like profit or cost, subject to constraints. 50-90% of business decisions and computations involve linear programming.
- Applications in business include production, personnel, inventory, marketing, financial, and blending problems. The objective is to optimize variables like costs, profits, or resources while meeting constraints.
- Assumptions of linear programming include certainty, linearity/proportionality, additivity, divisibility, non-negativity, finiteness, and optimality at corner points.
- A linear programming problem is modeled mathemat
This document discusses index numbers, which are statistical tools used to measure relative changes in variables such as prices or quantities over time. It defines index numbers and outlines their key features and types, including price, quantity, value, simple and composite index numbers. The document also describes several methods for constructing index numbers, such as Laspeyre's method, Paasche's method, Fisher's ideal method and consumer price indexes. Index numbers are expressed as percentages and measure the effect of changes over periods of time.
Operations Research - Meaning, Origin & CharacteristicsSundar B N
Operations research (OR) is a scientific approach to problem solving that uses quantitative analysis. It originated during World War II when the British military used empirical data and basic statistics to develop tactics. OR is characterized by its use of decision making, information technology, quantitative solutions, simulation, optimization, and interdisciplinary team-based work. It aims to uncover new problems and provide the best performance under given circumstances through mathematical modeling.
This document provides an overview of a course on managing decision under uncertainties taught by Dr. Elijah Ezendu. The learning objectives are to recognize the importance of managing decisions, identify the effects of risks and uncertainties on decisions, identify sources and levels of uncertainties, and manipulate uncertainties effectively while managing decisions. It defines certainty, risk, and uncertainty and differentiates between risk and uncertainty. It discusses various sources of uncertainty including demand structure, supply structure, competitors, internal forces, and time. It also covers levels of uncertainty, biases that hinder effective decision-making, handling uncertainties, limitations of tools like net present value analysis and real options, and the concept of decision profiling to examine past choices.
Cost Accounting Vs Management Accounting & Management Accounting Vs Financial...Uttar Tamang ✔
This Slide includes:
1. Cost Accounting Vs Management Accounting
2. Management Accounting Vs Financial Accounting
3. Types of Accounting
4. Difference between Cost, Management and Financial Accounting with basis
Questionnaire construction is presented by Prakash Aryal. Questionnaires can be used for primary research and involve asking respondents questions either in person or through mail/online surveys. Key steps in constructing a questionnaire include determining the type of survey, developing questions, organizing the question sequence and layout, and pilot testing. Questions should avoid ambiguity, bias, and double meanings. Both open-ended and closed-ended questions can be used, with closed-ended questions being easier to analyze but potentially limiting responses. The order and format of questions is also important to make the questionnaire smooth, logical and easy for respondents to follow.
Managerial Economics- Introduction,Characteristics and ScopePooja Kadiyan
This document provides an introduction to the scope of managerial economics. It defines managerial economics as the integration of economic theory with business practice to facilitate decision-making. The key areas covered in the scope of managerial economics include microeconomic analysis of the firm, acceptance and use of macroeconomic variables, a normative approach, and an emphasis on case studies. Microeconomics is applied to operational issues like production, costs, pricing, and investment. Macroeconomics is applied to the business environment, including factors like government policies, foreign trade, and the overall economic system.
Linear programming is an optimization technique used to maximize or minimize a linear objective function subject to linear constraints. It involves defining variables, constraints, and an objective function in terms of those variables. The optimal solution is found by systematically considering all extreme points of the feasible region defined by the constraints. Linear programming has wide applications in fields like production, transportation, finance, and resource allocation.
The document discusses economics and business economics. Economics is defined as the study of how individuals and groups allocate scarce resources. Business economics applies economic theories and techniques to solve business problems and aid management decision making. It uses micro and macroeconomic approaches to understand issues like demand, costs, profits, and external factors that influence business. The key aspects of business economics are demand forecasting, cost analysis, profit analysis, and capital management. Overall, the document outlines the basic concepts, scope, importance and determinants of demand within the field of business economics.
Non-sampling errors occur in surveys and censuses due to factors other than sampling and can happen at various stages:
- Specification errors occur during planning due to issues like incomplete population coverage or ambiguous questions.
- Ascertainment errors happen during data collection due to inaccurate recording or ambiguous instructions.
- Tabulation errors take place during analysis through mistakes in coding, analysis or presentation of results.
Some common sources of non-sampling errors include lack of proper planning, incomplete or inaccurate responses, ambiguous definitions, and errors in data processing. Measures like pre-testing questionnaires, hiring experienced staff, and cross-checking data can help reduce non-sampling errors.
Operations research is a scientific approach to problem solving and decision making that is useful for managing organizations. It has its origins in World War II and is now widely used in business and industry. Some key areas where operations research models are applied include forecasting, production scheduling, inventory control, and transportation. Models are an essential part of operations research and can take various forms like physical, mathematical, or conceptual representations of real-world problems. Models are classified in different ways such as by their structure, purpose, solution method, or whether they consider deterministic or probabilistic systems. Operations research techniques help solve complex business problems through mathematical analysis and support improved organizational performance.
The document discusses analyzing multivariate time series of five energy futures (crude oil, ethanol, gasoline, heating oil, natural gas) using vector autoregressive (VAR) and vector error correction (VEC) models. It finds the futures are cointegrated using Johansen and Engle-Granger tests, indicating they share a common stochastic trend. A VAR(1) model is estimated and found stable. The VEC model captures the error correction behavior as futures return to their long-run equilibrium. Forecasts are generated and limitations of the Engle-Granger approach discussed.
Forecasting and decision making are important for businesses to plan effectively amid risk and uncertainty. Economic forecasting helps businesses understand changes in the broader environment so they can formulate strategies. Demand forecasting also allows businesses to predict sales and allocate resources appropriately. Qualitative techniques like expert opinions and surveys, and quantitative techniques like time series analysis are commonly used for demand forecasting. The results of forecasting assist both businesses and governments in planning investments and policies.
This document discusses strategic business units (SBUs) and provides Dabur India Limited as a case study. It defines SBUs and their characteristics. Dabur's structure consists of multiple SBUs including consumer care, international business, and retail divisions. The document outlines Dabur's history, vision, markets, manufacturing units, and strategies around promotion, distribution, regional branding, and pricing. It analyzes Dabur's product portfolio using the BCG matrix and provides recommendations to focus on national branding and innovation. In conclusion, it states that Dabur has a strong distribution network and is one of India's most trusted FMCG companies with aggressive strategies to capture various market segments.
Decision making under condition of risk and uncertaintysapna moodautia
The document discusses decision making under conditions of certainty, risk, and uncertainty. It defines each condition and explains how the degree of knowledge affects the decision. Certainty exists when all alternatives and outcomes are known. Risk exists when probabilities can be estimated but information is imperfect. Uncertainty exists when probabilities cannot be estimated due to limited information. Modern approaches to decision making under uncertainty include risk analysis, decision trees, and preference theory.
Here are some potential sources of secondary data that could help analyze this business problem:
- Industry reports on the retail sector that provide market size, growth trends and future projections at the state and national level. This can help assess expansion opportunities.
- Demographic data from census providing population stats, income levels etc. of potential target cities which impact retail spending.
- Economic indicators like GDP, disposable income figures of different states. This helps compare market potential of states.
- Competitor analysis studying expansion strategies of other retail chains operating in similar segments.
- Feasibility studies or reports conducted previously for other retailers exploring new markets.
Collecting and analyzing such secondary data sources can help get an overview
This document provides information on conducting surveys. It discusses the different types of surveys, including cross-sectional and longitudinal surveys. It also describes various modes of survey administration like observation surveys, personal interviews, telephone interviews, mail surveys, and internet surveys, highlighting their advantages and disadvantages. Additionally, it covers key aspects of survey design like developing a questionnaire, sampling plan, and different types of questions (open-ended, closed-ended, dichotomous, multiple choice, rating scale, and rank order) that can be used in a survey. The document provides guidance on writing clear, unbiased questions for surveys.
A questionnaire is a research instrument consisting of questions used to gather information from respondents. There are two main types of questionnaires: open-ended questionnaires that allow free responses and closed-ended questionnaires that provide answer choices. Well-designed questionnaires keep questions concise and simple, assure respondent anonymity, and are pretested to identify issues before widespread use. Questionnaires provide an efficient way to collect standardized self-reported data from a large number of people but rely on respondents and may receive incomplete answers.
This document discusses the design and development of questionnaires for research. It begins by defining a questionnaire as a standardized set of questions used to collect responses from participants. It then outlines the functions of questionnaires in translating research objectives into specific questions, standardizing responses, and facilitating data collection and analysis. The document discusses different types of questionnaires and the systematic process of questionnaire design. It presents the "flowerpot approach" involving general to specific questioning. Key steps are outlined, including developing an introductory section to motivate participation. Basic principles of writing clear, unbiased questions are also provided.
This document discusses research methods and designs, focusing on surveys. It defines surveys and describes their purpose, which includes providing information, explaining situations, identifying and solving problems, and measuring change. The main types of survey designs discussed are cross-sectional, longitudinal, trend studies, cohort studies, and panel studies. Advantages and disadvantages of different designs are compared. Guidelines for developing questionnaires and conducting interviews are also provided.
Tools and techniques for data collection.pptxJuruJackline
These the tools and techniques used for data collection when carrying out community diagnosis in public health setting.
The slides looked into details the various tools and how they can be used in the data collection depending on the type of data you would like to collect.
Questionnaire design for beginners (Bart Rienties)Bart Rienties
This document provides an introduction to questionnaire design. It discusses the objectives of using questionnaires which are to understand why they are used, the process of constructing them, and key features of good question design. It also covers strengths and limitations of questionnaires, the survey process, maximizing response rates, and types of questions. The document aims to provide guidance on best practices for designing and implementing effective questionnaires.
This document discusses different methods for collecting data in scientific research, focusing on questionnaires and interviews. It provides details on how to design and administer questionnaires, including defining objectives, writing questions, and pilot testing. It also describes structured, semi-structured, and in-depth interviews. Focus group discussions are explained as a way to stimulate conversation around a topic and cross-check opinions. Questionnaires allow collecting large amounts of subjective and objective data but depend on honesty, while interviews provide more context and understanding but are more time intensive.
The contents of this presentation includes the introduction, steps involved in a survey, pros and cons as well as the sources of error. The contents are designed to support the researchers and students in their basics.
This document provides an overview of the survey design process. It discusses determining the need for a survey by establishing goals, objectives and expected outcomes. It covers designing survey questions, avoiding biases, pretesting and revising questions. It addresses technical aspects of building a survey such as question types, logic and validation. It also discusses sampling, collecting responses, cleaning data and reporting results. The overall process emphasizes establishing a clear need, designing unbiased questions and using survey results to inform actions.
Collecting Research Data With Questionnaires And InterviewsAmanda Walker
Questionnaires and interviews are methods used to collect research data on phenomena that cannot be directly observed, such as opinions, values, and experiences. Both methods require similar steps in design, data collection, analysis, and interpretation. Questionnaires provide standardized answers from a sample but cannot probe responses, while interviews can probe but are more time-consuming and expensive. Careful planning is needed for both methods to train staff and develop well-designed questions and procedures.
Methods of collecting data
Survey, methods and type, response rate, variable language
Hands on: Graphical techniques II, SPSS
Questionnaire design
Tips on writing a research paper
Individual project: article critique
The document discusses various tools used for data collection in research such as observation schedules, interview schedules, interview guides, questionnaires, rating scales, checklists, and document schedules. It provides details on how each tool is used, the differences between schedules and questionnaires, and guidelines for constructing effective schedules and questionnaires. A pilot study or pretesting is recommended to test the data collection tools, identify any issues, and make necessary revisions before the full research study.
This document discusses primary and secondary data collection methods. It defines primary data as data that is collected for the first time, while secondary data refers to data that was previously collected by another source. Some key points made include:
- Secondary data is collected before primary data in order to understand what is already known about a topic before conducting new research.
- Primary data collection is usually more costly and time-consuming than using secondary data.
- Sampling techniques like simple random sampling, stratified sampling, and cluster sampling aim to select a representative sample from a population.
- Survey construction should consider question type (open-ended, closed-ended, scaled response) and design (user-friendly format,
A questionnaire is a research instrument consisting of a series of questions and other prompts for the purpose of gathering information from respondents.
This document outlines the 10 step process for designing an effective questionnaire: 1) Specify the information needed, 2) Determine the interview method, 3) Design question content, 4) Structure questions properly, 5) Determine question wording and phrasing, 6) Establish question sequence, 7) Design the form and layout, 8) Reproduce the questionnaire, 9) Pretest the questionnaire using personal interviews with a sample similar to the target population, and 10) Edit and finalize the questionnaire based on pretest results. The goal is to obtain all necessary information through clear, unbiased questions in a logical order that motivates respondents to answer accurately.
The document discusses different methods and techniques for collecting data, focusing on interviews and surveys. It provides details on conducting interviews, including structured, semi-structured, and unstructured styles. It also discusses considerations for interviews such as who to interview and what type of interview to use. For surveys, it discusses using questionnaires to collect both qualitative and quantitative information from a sample of a population. Key aspects of surveys include using a fixed design, collecting standardized data from many individuals, and selecting a representative sample.
Demo presentation SVPISTM Sampling and scales.pptxPradeep513562
This document discusses research methodology concepts including sampling techniques, scales of measurement, and validity and reliability. It describes common sampling techniques like simple random sampling, stratified sampling, and cluster sampling. It explains that probability sampling allows results to be generalized while nonprobability sampling does not. Various scales of measurement are outlined including nominal, ordinal, interval, and ratio scales. The key difference between reliability and validity is explained - reliability refers to consistency of measurement and validity refers to measuring what was intended.
This document provides guidance on creating compelling research manuscripts. It discusses identifying quality research publications and constructing the structure of a research article. It outlines the typical sections of a research paper like introduction, literature review, methodology, results, and discussion. It also addresses title, abstract, conclusions, and future research. Instructional methods include lectures and demonstrations. The document recommends resources like reference managers and academic databases. It emphasizes writing clearly and concisely while avoiding plagiarism.
This document provides an overview of organizational behavior including its objectives, outcomes, major contributing disciplines, and evolution. The objectives are to understand individual and group behavior, apply OB knowledge to business, and develop better workplace relationships. Regarding evolution, the document discusses the classical approach focusing on efficiency, the neo-classical approach emphasizing human relations, and the modern approach combining classical and social science concepts. Major disciplines influencing OB include psychology, sociology, social psychology, anthropology, and political science.
This document discusses who conducts business research and the research process. It provides examples of syndicated data providers, specialty business research firms, communication agencies, consultants, and trade associations that conduct business research. It describes why studying business research is important and defines business research. The document outlines the research process including problem discovery and definition, determining the unit of analysis, and methods of research such as descriptive, correlational, experimental, exploratory, and explanatory research. It discusses classifying research based on application and logic. Finally, it covers the research stages from clarifying the research question to reporting results.
The document discusses identifying and defining research problems. It provides information on:
1) What constitutes a research problem - it is an issue or concern that an investigator presents and justifies studying to address a management issue.
2) How to identify a research problem - this involves searching for problems, reading about the topic, taking notes, seeking advice, and keeping the topic interesting.
3) The importance of clearly defining the research problem so the research yields useful information for management and addresses the core issue. A well-defined problem guides the research process.
The document defines key concepts related to sampling, including population, sample, sampling methods, and errors. It discusses different types of sampling methods like probability sampling (simple random sampling, stratified sampling, cluster sampling) and non-probability sampling (convenience sampling, judgement sampling). It also explains sampling frame, sampling frame error, random sampling error, and non-response error that can occur in sampling. The document provides steps involved in conducting a sample survey from defining the target population to selecting the sampling technique and sample size.
15 Qualitative Research Methods Overview.pptPradeep513562
This document provides an overview of qualitative research methods, including definitions of qualitative research, distinctions from quantitative research, philosophical foundations, different approaches, and when to use qualitative research. It discusses how qualitative research focuses on meanings, experiences, and interpretations. While there is no single definition, qualitative research generally uses naturalistic and interpretive approaches to understand peoples' social worlds from their perspectives.
This chapter discusses new digital media and how marketers use various media channels. It describes different types of media including broadcast, print, narrowcast, and pointcast. It also covers social media, blogs, online communities and how companies can build communities. The chapter discusses search engine optimization and paid search marketing. It provides examples of how companies measure the effectiveness of online advertising campaigns.
This document outlines the Tamil Nadu Transparency in Tenders Act and Rules which provide transparency in public procurement in Tamil Nadu. It defines procuring entities, categories and types of procurement contracts. It describes the roles of Tender Inviting and Accepting Authorities and the tendering process including notice, evaluation, acceptance and special provisions. The objective is to regulate procedures for inviting and accepting tenders in a transparent manner.
The document discusses key concepts in measurement including scales, validity, and reliability. It defines scales as a series of items arranged according to value for quantification. There are four main types of scales: nominal, ordinal, interval, and ratio. Validity refers to a scale measuring what it intends to measure and is classified into face, content, criterion, and construct validity. Reliability is the degree to which a scale yields consistent results and is measured through internal consistency and test-retest methods. Accuracy in measurement depends on both validity and reliability.
This document discusses measurement scales and how to establish the reliability and validity of measurement instruments. It describes the four main types of scales - nominal, ordinal, interval, and ratio - and provides examples of each. Rating scales and ranking scales are presented as two categories for developing attitudinal scales. The document emphasizes the importance of establishing the goodness of measures through item analysis and testing the reliability and validity of instruments.
This document defines key concepts in research including variables, constructs, propositions, and hypotheses. It explains that concepts are generalized ideas about objects, while constructs are concepts measured by multiple variables. Propositions make statements relating concepts, and hypotheses are testable statements about relationships between variables. The document also discusses different types of variables like independent and dependent variables, and different types of reasoning used in research like deductive and inductive reasoning.
This document discusses ethical issues and security challenges related to information technology. It covers topics like computer crime, hacking, privacy, censorship, cyberlaw, and strategies for improving security such as encryption, firewalls, biometrics, and fault tolerant systems. The document provides examples and definitions to explain these concepts over several pages. It also includes two case studies and questions to help readers apply the concepts.
This document provides an overview of data resource management and database concepts. It discusses the business value of data resource management and the advantages of a database management approach over traditional file processing. It also defines key database terms and concepts such as data warehouses and data mining. Different database structures like hierarchical, network, relational, and multidimensional models are explained. The document concludes with a discussion of the database development process and three case studies on data resource management.
This document provides an overview of database concepts and types of databases. It discusses the advantages of database management over traditional file processing, including reduced data redundancy and improved data integration and standardization. The document also describes several types of databases, including operational databases, distributed databases, external databases, hypermedia databases, data warehouses, and the use of data mining.
This chapter discusses the research proposal. The purpose of a research proposal is to present the research question and its importance, discuss related past research, and suggest the necessary data. All research has a sponsor that provides funding, either a company or academic institution. Developing a proposal allows the researcher to plan steps, serve as a guide, and estimate time and budgets. Proposals can be internal or external and range from exploratory to large-scale professional studies costing millions. The proposal should be structured into modules that are tailored for the intended audience. Key modules include an executive summary, problem statement, research objectives, literature review, importance, design, analysis, results, qualifications, budget, schedule, and appendices.
Google launched in 1998 with an innovative search strategy that ranked results based on popularity and keywords. It now performs over a billion searches per day across many languages and countries. Google generates most of its revenue from advertising on its sites. It offers many products including search, advertising, applications and enterprise products. The document discusses how companies like Google develop products online, focusing on attributes, branding, support services and labeling to create value for customers. It provides examples of how Google and other companies have built their brands and launched new products online over time.
The document outlines various topics related to online distribution channels and e-business models. It describes the three major functions of a distribution channel as transactional, logistical, and facilitating. It then explains the different types of online channel intermediaries and models, including brokers, agents, online retailing/e-commerce, m-commerce, and social commerce. It also discusses how the Internet is affecting channel length and functions.
The document discusses the importance of using analytics and data-driven decision making in businesses. It argues that relying solely on intuition and experience can be limiting, while incorporating analytics provides benefits such as understanding customer behaviors, predicting market shifts, and improving efficiency. The document also outlines key factors ("THE ANALYTICAL DELTA") that are necessary for successfully implementing analytics initiatives, including accessible high-quality data, enterprise-wide support and leadership, clear strategic targets, and sufficient analytical talent.
The document discusses the prerequisites for effective data analytics. It outlines seven key aspects of data that must be addressed: structure, uniqueness, integration, quality, access, privacy, and governance. For each aspect, examples are provided of how companies have leveraged their data through analytics. The document emphasizes that while perfect data is impossible, organizations should focus on the most critical data needed for decision making and prioritize analysis over endless data cleanup efforts. High-level management support and clear roles for data ownership are important for effective data governance.
Best practices for project execution and deliveryCLIVE MINCHIN
A select set of project management best practices to keep your project on-track, on-cost and aligned to scope. Many firms have don't have the necessary skills, diligence, methods and oversight of their projects; this leads to slippage, higher costs and longer timeframes. Often firms have a history of projects that simply failed to move the needle. These best practices will help your firm avoid these pitfalls but they require fortitude to apply.
SATTA MATKA SATTA FAST RESULT KALYAN TOP MATKA RESULT KALYAN SATTA MATKA FAST RESULT MILAN RATAN RAJDHANI MAIN BAZAR MATKA FAST TIPS RESULT MATKA CHART JODI CHART PANEL CHART FREE FIX GAME SATTAMATKA ! MATKA MOBI SATTA 143 spboss.in TOP NO1 RESULT FULL RATE MATKA ONLINE GAME PLAY BY APP SPBOSS
𝐔𝐧𝐯𝐞𝐢𝐥 𝐭𝐡𝐞 𝐅𝐮𝐭𝐮𝐫𝐞 𝐨𝐟 𝐄𝐧𝐞𝐫𝐠𝐲 𝐄𝐟𝐟𝐢𝐜𝐢𝐞𝐧𝐜𝐲 𝐰𝐢𝐭𝐡 𝐍𝐄𝐖𝐍𝐓𝐈𝐃𝐄’𝐬 𝐋𝐚𝐭𝐞𝐬𝐭 𝐎𝐟𝐟𝐞𝐫𝐢𝐧𝐠𝐬
Explore the details in our newly released product manual, which showcases NEWNTIDE's advanced heat pump technologies. Delve into our energy-efficient and eco-friendly solutions tailored for diverse global markets.
Understanding User Needs and Satisfying ThemAggregage
https://www.productmanagementtoday.com/frs/26903918/understanding-user-needs-and-satisfying-them
We know we want to create products which our customers find to be valuable. Whether we label it as customer-centric or product-led depends on how long we've been doing product management. There are three challenges we face when doing this. The obvious challenge is figuring out what our users need; the non-obvious challenges are in creating a shared understanding of those needs and in sensing if what we're doing is meeting those needs.
In this webinar, we won't focus on the research methods for discovering user-needs. We will focus on synthesis of the needs we discover, communication and alignment tools, and how we operationalize addressing those needs.
Industry expert Scott Sehlhorst will:
• Introduce a taxonomy for user goals with real world examples
• Present the Onion Diagram, a tool for contextualizing task-level goals
• Illustrate how customer journey maps capture activity-level and task-level goals
• Demonstrate the best approach to selection and prioritization of user-goals to address
• Highlight the crucial benchmarks, observable changes, in ensuring fulfillment of customer needs
Zodiac Signs and Food Preferences_ What Your Sign Says About Your Tastemy Pandit
Know what your zodiac sign says about your taste in food! Explore how the 12 zodiac signs influence your culinary preferences with insights from MyPandit. Dive into astrology and flavors!
How to Implement a Real Estate CRM SoftwareSalesTown
To implement a CRM for real estate, set clear goals, choose a CRM with key real estate features, and customize it to your needs. Migrate your data, train your team, and use automation to save time. Monitor performance, ensure data security, and use the CRM to enhance marketing. Regularly check its effectiveness to improve your business.
How to Implement a Strategy: Transform Your Strategy with BSC Designer's Comp...Aleksey Savkin
The Strategy Implementation System offers a structured approach to translating stakeholder needs into actionable strategies using high-level and low-level scorecards. It involves stakeholder analysis, strategy decomposition, adoption of strategic frameworks like Balanced Scorecard or OKR, and alignment of goals, initiatives, and KPIs.
Key Components:
- Stakeholder Analysis
- Strategy Decomposition
- Adoption of Business Frameworks
- Goal Setting
- Initiatives and Action Plans
- KPIs and Performance Metrics
- Learning and Adaptation
- Alignment and Cascading of Scorecards
Benefits:
- Systematic strategy formulation and execution.
- Framework flexibility and automation.
- Enhanced alignment and strategic focus across the organization.
Industrial Tech SW: Category Renewal and CreationChristian Dahlen
Every industrial revolution has created a new set of categories and a new set of players.
Multiple new technologies have emerged, but Samsara and C3.ai are only two companies which have gone public so far.
Manufacturing startups constitute the largest pipeline share of unicorns and IPO candidates in the SF Bay Area, and software startups dominate in Germany.
[To download this presentation, visit:
https://www.oeconsulting.com.sg/training-presentations]
This PowerPoint compilation offers a comprehensive overview of 20 leading innovation management frameworks and methodologies, selected for their broad applicability across various industries and organizational contexts. These frameworks are valuable resources for a wide range of users, including business professionals, educators, and consultants.
Each framework is presented with visually engaging diagrams and templates, ensuring the content is both informative and appealing. While this compilation is thorough, please note that the slides are intended as supplementary resources and may not be sufficient for standalone instructional purposes.
This compilation is ideal for anyone looking to enhance their understanding of innovation management and drive meaningful change within their organization. Whether you aim to improve product development processes, enhance customer experiences, or drive digital transformation, these frameworks offer valuable insights and tools to help you achieve your goals.
INCLUDED FRAMEWORKS/MODELS:
1. Stanford’s Design Thinking
2. IDEO’s Human-Centered Design
3. Strategyzer’s Business Model Innovation
4. Lean Startup Methodology
5. Agile Innovation Framework
6. Doblin’s Ten Types of Innovation
7. McKinsey’s Three Horizons of Growth
8. Customer Journey Map
9. Christensen’s Disruptive Innovation Theory
10. Blue Ocean Strategy
11. Strategyn’s Jobs-To-Be-Done (JTBD) Framework with Job Map
12. Design Sprint Framework
13. The Double Diamond
14. Lean Six Sigma DMAIC
15. TRIZ Problem-Solving Framework
16. Edward de Bono’s Six Thinking Hats
17. Stage-Gate Model
18. Toyota’s Six Steps of Kaizen
19. Microsoft’s Digital Transformation Framework
20. Design for Six Sigma (DFSS)
To download this presentation, visit:
https://www.oeconsulting.com.sg/training-presentations
Building Your Employer Brand with Social MediaLuanWise
Presented at The Global HR Summit, 6th June 2024
In this keynote, Luan Wise will provide invaluable insights to elevate your employer brand on social media platforms including LinkedIn, Facebook, Instagram, X (formerly Twitter) and TikTok. You'll learn how compelling content can authentically showcase your company culture, values, and employee experiences to support your talent acquisition and retention objectives. Additionally, you'll understand the power of employee advocacy to amplify reach and engagement – helping to position your organization as an employer of choice in today's competitive talent landscape.
❼❷⓿❺❻❷❽❷❼❽ Dpboss Matka Result Satta Matka Guessing Satta Fix jodi Kalyan Final ank Satta Matka Dpbos Final ank Satta Matta Matka 143 Kalyan Matka Guessing Final Matka Final ank Today Matka 420 Satta Batta Satta 143 Kalyan Chart Main Bazar Chart vip Matka Guessing Dpboss 143 Guessing Kalyan night
Digital Marketing with a Focus on Sustainabilitysssourabhsharma
Digital Marketing best practices including influencer marketing, content creators, and omnichannel marketing for Sustainable Brands at the Sustainable Cosmetics Summit 2024 in New York
Discover timeless style with the 2022 Vintage Roman Numerals Men's Ring. Crafted from premium stainless steel, this 6mm wide ring embodies elegance and durability. Perfect as a gift, it seamlessly blends classic Roman numeral detailing with modern sophistication, making it an ideal accessory for any occasion.
https://rb.gy/usj1a2
The APCO Geopolitical Radar - Q3 2024 The Global Operating Environment for Bu...APCO
The Radar reflects input from APCO’s teams located around the world. It distils a host of interconnected events and trends into insights to inform operational and strategic decisions. Issues covered in this edition include:
The Genesis of BriansClub.cm Famous Dark WEb PlatformSabaaSudozai
BriansClub.cm, a famous platform on the dark web, has become one of the most infamous carding marketplaces, specializing in the sale of stolen credit card data.
Call8328958814 satta matka Kalyan result satta guessing➑➌➋➑➒➎➑➑➊➍
Satta Matka Kalyan Main Mumbai Fastest Results
Satta Matka ❋ Sattamatka ❋ New Mumbai Ratan Satta Matka ❋ Fast Matka ❋ Milan Market ❋ Kalyan Matka Results ❋ Satta Game ❋ Matka Game ❋ Satta Matka ❋ Kalyan Satta Matka ❋ Mumbai Main ❋ Online Matka Results ❋ Satta Matka Tips ❋ Milan Chart ❋ Satta Matka Boss❋ New Star Day ❋ Satta King ❋ Live Satta Matka Results ❋ Satta Matka Company ❋ Indian Matka ❋ Satta Matka 143❋ Kalyan Night Matka..
2. Data Collection Method
It is crucial to understand how the data should be collected in order to make
it reliable
Primary
• It is the data which has been freshly collected and for the first time i.e. the
data is original in nature.
Merits
• The reliability and credibility is high.
• The researcher can control the quality of data
Demerits
• It is time consuming method.
• The cost of collecting data is high.
3. Primary Data Collection Methods
• Interview
Interview is conducted which involves one to one questions and
interrogation
• Observation
Participant
Non Participant
Disguised
• Survey/Questionnaire
4. Secondary data
• This is the data which has been previously acquired by some other
researcher and then published. Thus, it is not an original data.
Merits
• It is easier to collect secondary data.
• • It is quick and consumes less time in collection.
• Sometimes the data can accurate entirely, thus it provides answer to all the
research questions.
Demerits
• The definitions used in the research must be carefully read as it must
reflect that the data was carried for another purpose. Terms may have
entirely different meanings which may not be relevance for research.
5. Factors Affecting Choice of Data Collection
Method
• The method must gather all information required. The choosen
method must be as per the resources available and budget.
• The time constraint should be also considered.
• The researchers must check whether respondents are available and
what is the geographical spread of the sample.
Sources of Secondary data
• Internal Sources of Data: Storage device, financial data
• External Sources of Data: Government data, consultancy,literatures
6. A Classification of Survey Methods
Traditional
Telephone
Computer-Assisted
Telephone
Interviewing
Mail
Interview
Mail
Panel
In-Home Mall
Intercept
Computer-Assisted
Personal
Interviewing
E-mail Internet
Survey
Methods
Telephone Personal Mail Electronic
7. Questionnaire Definition
• A questionnaire is a formalized set of questions
for obtaining information from respondents.
8. Questionnaire Objectives
• It must translate the information needed into a
set of specific questions that the respondents can
and will answer.
• A questionnaire must uplift, motivate, and
encourage the respondent to become involved in
the interview, to cooperate, and to complete the
interview.
• A questionnaire should minimize response error.
9. Specify the Information Needed
Design the Question to Overcome the Respondent’s Inability and Unwillingness to
Answer
Determine the Content of Individual Questions
Decide the Question Structure
Determine the Question Wording
Arrange the Questions in Proper Order
Reproduce the Questionnaire
Specify the Type of Interviewing Method
Identify the Form and Layout
Eliminate Bugs by Pre-testing
Questionnaire Design Process
10. Choosing Question Structure –
Unstructured Questions
• Unstructured questions are open-ended
questions that respondents answer in their own
words.
What is your occupation?
Who is your favorite actor?
What do you think about people who shop at
high-end department stores?
11. Choosing Question Structure –
Structured Questions
• Structured questions specify the set of
response alternatives and the response format.
A structured question may be multiple-choice,
dichotomous, or a scale.
12. Choosing Question Structure –
Multiple-Choice Questions
• In multiple-choice questions, the researcher provides a
choice of answers and respondents are asked to select one
or more of the alternatives given.
Do you intend to buy a new car within the next six
months?
____ Definitely will not buy
____ Probably will not buy
____ Undecided
____ Probably will buy
____ Definitely will buy
____ Other (please specify)
13. Choosing Question Structure –
Dichotomous Questions
• A dichotomous question has only two response
alternatives: yes or no, agree or disagree, and so on.
• Often, the two alternatives of interest are
supplemented by a neutral alternative, such as “no
opinion,” “don't know,” “both,” or “none.”
Do you intend to buy a new car within the next six
months?
_____ Yes
_____ No
_____ Don't know
14. Choosing Question Structure – Scales
• Scales were discussed in detail in Chapters 8 and 9:
Do you intend to buy a new car within the next six months?
DefinitelyProbably Undecided Probably Definitely
will not buy will not buy will buy will buy
1 2 3 4 5
15. Choosing Question Wording –
Define the Issue
• Define the issue in terms of who, what, when, where, why, and way (the six
Ws). Who, what, when, and where are particularly important.
Which brand of shampoo do you use?
(Incorrect)
Which brand or brands of shampoo have you
personally used at home during the last month?
In case of more than one brand, please
list all the brands that apply. (Correct)
16. Determining the Order of Questions
Opening Questions
• The opening questions should be interesting, simple, and
non-threatening.
Type of Information
• As a general guideline, basic information should be
obtained first, followed by classification, and, finally,
identification information.
Difficult Questions
• Difficult questions or questions which are sensitive,
embarrassing, complex, or dull, should be placed late in the
sequence.
17. Determining the Order of Questions
Effect on Subsequent Questions
• General questions should precede the specific
questions (funnel approach).
Q1: “What considerations are important to you in
selecting a department store?”
Q2: “In selecting a department store, how important
is convenience of location?”
(Correct)
18. Determining the Order of
Questions
Logical Order
The following guidelines should be followed for
branching questions:
• The question being branched (the one to which the
respondent is being directed) should be placed as close
as possible to the question causing the branching.
• The branching questions should be ordered so that the
respondents cannot anticipate what additional
information will be required.
19. Form and Layout
• Divide a questionnaire into several parts.
• The questions in each part should be numbered,
particularly when branching questions are used.
• The questionnaires should preferably be precoded.
• The questionnaires themselves should be numbered
serially.
20. Sampling
• Sampling is a statistical tool which helps to know the characteristics of
the universe or population by examining only a small part of it. The
values obtained from the study of sample, such as the average and
variance are known as statistic.
21. Sample vs. Census
Conditions Favoring the Use of
Type of Study Sample Census
1. Budget Small Large
2. Time available Short Long
3. Population size Large Small
4. Variance in the characteristic Small Large
5. Cost of sampling errors Low High
6. Cost of nonsampling errors High Low
7. Nature of measurement Destructive Nondestructive
8. Attention to individual cases Yes No
22. The Sampling Design Process
Define the Target Population
Determine the Sampling Frame
Select Sampling Technique(s)
Determine the Sample Size
Execute the Sampling Process
23. Define the Target Population
The target population is the collection of elements or objects that possess the
information sought by the researcher and about which inferences are to be
made. The target population should be defined in terms of elements, sampling
units, extent, and time.
• An element is the object about which or from which the information is
desired, e.g., the respondent.
• A sampling unit is an element, or a unit containing the element, that is
available for selection at some stage of the sampling process.
• Extent refers to the geographical boundaries.
• Time is the time period under consideration.
24. Define the Target Population
Important qualitative factors in determining the sample size
• the importance of the decision
• the nature of the research
• the number of variables
• the nature of the analysis
• sample sizes used in similar studies
• incidence rates
• completion rates
• resource constraints
25. Classification of Sampling Techniques
Sampling Techniques
Nonprobability
Sampling Techniques
Probability
Sampling Techniques
Convenience
Sampling
Judgmental
Sampling
Quota
Sampling
Snowball
Sampling
Systematic
Sampling
Stratified
Sampling
Cluster
Sampling
Other Sampling
Techniques
Simple Random
Sampling
26. Choosing Nonprobability vs.
Probability Sampling
Conditions Favoring the Use of
Factors Nonprobability
sampling
Probability
sampling
Nature of research Exploratory Conclusive
Relative magnitude of sampling
and nonsampling errors
Nonsampling
errors are
larger
Sampling
errors are
larger
Variability in the population Homogeneous
(low)
Heterogeneous
(high)
Statistical considerations Unfavorable Favorable
Operational considerations Favorable Unfavorable
27. Technique Strengths Weaknesses
Nonprobability Sampling
Convenience sampling
Least expensive, least
time-consuming, most
convenient
Selection bias, sample not
representative, not recommended for
descriptive or causal research
Judgmental sampling Low cost, convenient,
not time-consuming
Does not allow generalization,
subjective
Quota sampling Sample can be controlled
for certain characteristics
Selection bias, no assurance of
representativeness
Snowball sampling Can estimate rare
characteristics
Time-consuming
Probability sampling
Simple random sampling
(SRS)
Easily understood,
results projectable
Difficult to construct sampling
frame, expensive, lower precision,
no assurance of representativeness.
Systematic sampling Can increase
representativeness,
easier to implement than
SRS, sampling frame not
necessary
Can decrease representativeness
Stratified sampling Include all important
subpopulations,
precision
Difficult to select relevant
stratification variables, not feasible to
stratify on many variables, expensive
Cluster sampling Easy to implement, cost
effective
Imprecise, difficult to compute and
interpret results
Strengths and Weaknesses of
Basic Sampling Techniques
28. Measure of Central tendency
• This depicts the middle point of any data distribution. The measures
of central tendency are also known as measures of location
• Mean
• Median
• Mode
29. Definition of the mean
• Given a sample of n data points, x1, x2, x3, … xn, the
formula for the mean or average is given below.
pts
data
number
the
pts
data
the
of
sum
the
n
x
x
30. Find the mean
• My 5 test scores for Calculus I are 95, 83, 92, 81, 75. What is the
mean?
• ANSWER: sum up all the tests and divide by the total number of tests.
• Test mean = (95+83+92+81+75)/5 = 85.2
31. Find the median.
• Here are a bunch of 10 point quizzes from MAT117:
• 9, 6, 7, 10, 9, 4, 9, 2, 9, 10, 7, 7, 5, 6, 7
• As you can see there are 15 data points.
• Now arrange the data points in order from smallest to largest.
• 2, 4, 5, 6, 6, 7, 7, 7, 7, 9, 9, 9, 9, 10, 10
• Calculate the location of the median: (15+1)/2=8. The eighth piece of
data is the median. Thus the median is 7.
• By the way what is the mean???? It’s 7.13…
32. The mode
• The mode is the most frequent number in a collection of data.
• Example A: 3, 10, 8, 8, 7, 8, 10, 3, 3, 3
• The mode of the above example is 3, because 3 has a frequency of 4.
• Example B: 2, 5, 1, 5, 1, 2
• This example has no mode because 1, 2, and 5 have a frequency of 2.
• Example C: 5, 7, 9, 1, 7, 5, 0, 4
• This example has two modes 5 and 7. This is said to be bimodal.
33. Measure of Dispersion
• The second attribute of a data is to learn how far the data is spread,
i.e. its variability. It is possible that the mean of all the data set is
same, but they may vary in variability. Thus, it is significant to study
how the data is spread or dispersed
• Range: Difference between highest point and lowest point
• Variance
• Standard Deviation
34. Variance
• Each population is characterized by variance which is denoted by (read as sigma squared). The
formulae to calculate variance is derived by dividing the sum of squared distances between the
mean and each observation, finally dividing by the entire population
35. Standard Deviation
• The population standard deviation is the square root of average of
the squared distances of the observations from the mean. Thus, it is
the square root of variance
36. Probability Theory
• Theory of probability states that "If an experiment is performed
repeatedly under essentially homogeneous and similar conditions,
the result of what is commonly termed as an outcome may be unique
or certain indefinite but may be certainly one of the various
possibilities depending on the experiments."
37. Probability approaches
• Classical Approach
• This approach assumes that all possible outcomes of an experiment
are mutually exclusive and equally likely.
• When we draw a card at random from well shuffled, bridge ace has
the same chance of being drawn, i.e. I in 52 or 1/52, the probability of
drawing a red card is 26/52 = 1/2
38. Empirical approach
• All possible outcomes are known you can use Classical but how about
• Will this tree fall within next winter ?
• The likelihood that the tree will fall is much smaller than it will stand.
How much smaller? This is the type of question that requires
references to empirical data.
• The probability of an event is determined objectively by repetitive
empirical observations,
39. Axiomatic Approach
• a type of probability that has a set of axioms (rules) attached to it. For
example, you could have a rule that the probability must be greater
than 0%, that one event must happen, and that one event cannot
happen if another event happens. the entire theory is developed by
logic of deduction
40. Probability Distribution
• Probability distribution is related to frequency distribution ,how the
• outcomes of a said event are expected to vary, there are two types
1. Discrete Probability Distribution
2. Continuous Probability Distribution
• Discrete Probability Distribution
• Distributions where only limited number of values can be Listed
• Eg: Probability of a student getting selected in a class of three section
for the game ?
41. Continuous Probability Distribution
• This comprises of variables which can take any value, within a
specified range.
• All the possible outcomes cannot be listed because there are
numerous variable and outcomes within a range. e.g.
• Measuring the level of ppm in air quality index will vary near sea level
(Mumbai) and cities like Delhi. Thus, the variable can assume any
value here.
43. There is nothing to worry about understanding
the concept of normal distribution The bell curve
• Imagine an example (Fuel efficiency of a bike,)
• Collect data and plot the data points, Most probably according to the
theory behind the Normal distribution you shall get a Bell shaped
curve.
• Not clarified yet, Okay lets talk about frequency of some events man
made or natural ( Weight of students, rain fall ,temperature, financial
data ,sales etc)
• For these, datum close to the mean are frequent and the data away
from mean or less and less frequent and they are sometimes called as
outliers.
44. Characteristics
• We say the data is "normally distributed”
• The Normal Distribution has:
• mean = median = mode
• symmetry about the center
• 50% of values less than the mean
and 50% greater than the mean
• A bell curve / normal curve has predictable standard deviations that
follow the 68.26 95.44 99.74 rule .
• The area under the cure is always equal to 1
45. Cont...
• The mean (average) is always in the center of a bell curve or normal
curve.
• A bell curve / normal curve has only one mode, or peak. Mode here
means “peak”; a curve with one peak is unimodal; two peaks is
bimodal, and so on.
• A bell curve / normal curve is symmetric. Exactly half of data points
are to the left of the mean and exactly half are to the right of the mean.
• The two tails never touch the horizontal lines they extend indefinitely
so its –infinity to + infinity at the horizontal sides
46. • There are many different normal
distributions, with each one depending
on two parameters:
1.Population mean, μ(Mu) and
2.Population standard deviation, σ(Sigma).
• These two determine the shape of the
curve
• Can we look how the changes in Mu (L 2
R) and Sigma (Breadth) appears
• What you mean there's low S.D? can we
relay on the data?
47.
48. Hypothesis Testing
Null Hypothesis (H0)
• A statement in which no difference or effect is expected. If the null
hypothesis is not rejected,no changes will be made.
Alternate/Alternative Hypothesis(Ha)
• A statement that some difference or effect is expected. Accepting the
alternative hypothesis will lead to changes in opinions or actions.
50. Descriptive Hypotheses
• Describes the existence, size, form or distribution of some variable.
- 60% of investors favors cash dividend.
- MBA institutes facing problems in placement
51. Relational Hypotheses
• Describes the relationship between two or more variables.
The greater the stress experienced in the job the lower the job-
satisfaction.( directional)
Women are better than men
There is a relationship between age and job-satisfaction. (non-
directional)
52. Relational Hypotheses
• Correlational Hypotheses
Only shows the correlation between two or more variables but no
claims are made that one causes the other.
• Explanatory Hypotheses.
Claims are made that one variable causes other to occur.
53. Importance of Hypotheses
• Guides the direction of study;
• Identifies the facts relevant for the study;
• Helps in the selection of Research Design;
• Helps in providing the framework in which the results have to be
given.
54. Characteristics of a Good Hypothesis
• Adequate for the purpose
i) Should address the original problem
ii) Clearly identifies the variables relevant in the study.
iii) Helps in knowing the research design
iv) Helps in organizing the results of the study.
55. Characteristics of a Good Hypothesis
• Testable
i) Uses acceptable techniques
ii) Simple requiring few conditions
iii) Explanation can be given from the given theoretical framework.
56. Characteristics of a Good Hypothesis
• Better than its rivals
i) Explains more facts than its rivals
ii) Greater variety or scope of facts
57. Steps for Hypothesis Testing
Draw Research Conclusion
Formulate H0 and H1
Select Appropriate Test
Choose Level of Significance
Determine Prob
Assoc with Test Stat
Determine Critical
Value of Test Stat
TSCR
Determine if TSCR
falls into (Non)
Rejection Region
Compare with Level
of Significance,
Reject/Do not Reject H0
Calculate Test Statistic TSCAL
58. Type I Error
• Occurs if the null hypothesis is rejected when it is in fact true.
• The probability of type I error ( α ) is also called the level of
significance.
Type II Error
• Occurs if the null hypothesis is not rejected when it is in fact false.
• The probability of type II error is denoted by β .
• Unlike α, which is specified by the researcher, the magnitude of β
depends on the actual value of the population parameter
(proportion).
It is necessary to balance the two types of errors.
Choose Level of Significance
59. • Power of a Test
The power of a test is the probability of rejecting the null hypothesis
when it is false and should be rejected. Although is unknown, it is
related to .
• An extremely low value of (e.g., 0.001) will result in intolerably high
errors. So it is necessary to balance the two types of errors.
60. chi-square statistic
• The chi-square statistic is used to test the statistical significance of
the observed association in a cross-tabulation. It assists us in
determining whether a systematic association exists between the two
variables.
61. Correlation
• Chi-square test depict whether there is any relation between two
variables but it does not define what relation exist between
• Correlation means that between two series or group of data there
exist some casual connections." Correlation is an analysis of the co-
variation between two or more variables
62. Types of Correlation
• Positive Correlation
If one variable increases the other also increases and vice versa
• Negative Correlation
If one variable increases the other decreases and vice versa
63. Degrees of Correlation
• Perfect Positive Correlation When two variables change in the same
proportion in same direction. In this case,coefficient of correlation is
(r = + l).
• Perfect Negative Correlation When two variables change in the same
proportion in opposite directions. In this case, coefficient of
correlation is (r = -1).
• If there is no relation between two sets of variaAbsence of
Correlation bles, i.e. change in one has no effect on the change in
other variable, degree of correlation is zero (r = 0).
64. Regression
• It is often more important to find out what the relation actually is, in
order to estimate or predict one variable and the statistical technique
appropriate to such a case is called regression analysis
• Regression is the statistical tool which will help to estimate or predict
the unknown values of one variable from known values of another
variable
• Regression equation of y on x : y = a + bx + e
65. Assumptions of Regression Analysis
• Linearity The relationship between two variables must be linear.
• Normality of Error Distribution The error terms or possible value of
error terms should be normally distributed
• Independence of Error The errors must not be dependent on each
other and there should not be any pattern followed by the errors.
• Homoskedasticity The error terms should not change or vary with the
value of independent (predictor) variables. This property is called
homoskedasticity.
66. Types of Regression
• Simple Regression (One DV)
• Multiple Regression (Multiple DV)
Utility of Regression Analysis
• Determination of rule of change in variables.
• Helps in estimating the event like changes in value sales
or profit.
• Calculation of coefficient of correlation.
68. Binomial
• Expresses the probability of one set of dichotomous alternatives, known as
success and failure.
• It is computed by (q + p)^n
q = Failure
p = Success
n = Total number of experiments
Characteristics
• All the trials are independent of each other.
• The probability of success in any trial 'p' is constant for each trial. The
probability of failure, q = I— p is also constant.
• Thus p=0.5 and q =I-P
69. Cont..
• These conditions are satisfied if we toss a coin, say five times and
want to know the probability of two heads resulting from these five
tosses. It be HTHTT,THHTT, THTTH.. (n=5,x=2)
70. • Regardless of the values of n, the distribution is symmetrical, when P
= 0.5
• When P is greater than 0.5, the distribution is negatively skewed
asymmetrical distribution, with the peak occurring to the right of the
centre. Like if P = 0.9
• When P is less than 0.5, the distribution is positively skewed and
asymmetrical with the peak occurring to the left of the centre. For P =
0.1
72. t-distribution
• A symmetrical bell shaped distribution that is contingent on sample
size ,has a mean of 0 and a standard deviation equal to 1.
• A univariate t-test is used for testing hypotheses involving some
observed mean against some specified value .
• When sample is greater than 30 the results from t-test and z-test are
almost same.So t-test is appropriate for small sample and S.D is
unknown
• The shape of the t-distribution is influenced by degrees of freedom
• The number of observation minus the number of constraints or
assumptions needed to calculate the statistical term.
73. T-Test /student’s t-test-Prof Gosset
• When size of the sample is less than 30, theory of sampling is called
as small sample. If size of sample is small, normality of distribution
cannot be applied. It is also called as one sample t-test
• t-test may be used to test the significance between the difference of
sample mean and population mean
74. Example
• The mean height of Indian adults ages 20 and older is about 66.5
inches (69.3 inches for males, 63.8 inches for females).
• H0: µHeight = 66.5 ("the mean height is equal to 66.5")
H1: µHeight ≠ 66.5 ("the mean height is not equal to 66.5")
75. Independent sample t-test/two sample
• The Independent Samples t Test compares the means of two
independent groups in order to determine whether there is statistical
evidence that the associated population means are significantly
different.
• The Independent Samples t Test can only compare the means for two
(and only two) groups. It cannot make comparisons among more than
two groups
76. Example
• In our sample dataset, students reported their typical time to run a
mile, and whether or not they were an athlete. Suppose we want to
know if the average time to run a mile is different for athletes versus
non-athletes
• The hypotheses for this example can be expressed as:
• H0: µnon-athlete - µathlete = 0 ("the difference of the means is equal to
zero")
• H1: µnon-athlete - µathlete ≠ 0 ("the difference of the means is not equal to
zero")
78. Paired /dependent sample t-test
• The Paired Samples t Test compares the means of two measurements
taken from the same individual, object, or related units
• These "paired" measurements can represent thing like:
oA measurement taken at two different times (e.g., pre-test and post-
test score with an intervention administered between the two time
points)
• The purpose of the test is to determine whether there is statistical
evidence that the mean difference between paired observations is
significantly different from zero. The Paired Samples t Test is a
parametric test
79. Applications
The Paired Samples t Test is commonly used to test the following:
• Statistical difference between two time points
• Statistical difference between two conditions
• Statistical difference between two measurements
• Statistical difference between a matched pair
80. Hypotheses
• The hypotheses can be expressed in two different ways that express
the same idea and are mathematically equivalent:
• H0: µ1 = µ2 ("the paired population means are equal")
H1: µ1 ≠ µ2 ("the paired population means are not equal")
• OR
• H0: µ1 - µ2 = 0 ("the difference between the paired population means
is equal to 0")
• H1: µ1 - µ2 ≠ 0 ("the difference between the paired population means
is not 0")
81. • Formula Where Sx
x¯diff = Sample mean of the differences
n = Sample size (i.e., number of observations)
sdiff= Sample standard deviation of the differences
sx¯ = Estimated standard error of the mean (s/sqrt(n))
82. Z-test-Professor Fisher
• In case of large sample, where sample size is greater than 30,we apply
Z-test which is based on normal distribution
• A technique used to test the hypothesis that proportions are
significantly different for two independent groups
• When a researcher wants to test the sample correlation against any
other value of r or if it is desired to test whether the two given sample
have come from same population or not, the Z-test is used.
83. Chi-Square Test
• The Chi-Square Test of Independence determines whether there is an
association between categorical variables (i.e., whether the variables
are independent or related)
• The Chi-Square Test of Independence is commonly used to test the
following:
• Statistical independence or association between two or more
categorical variables
84. Hypotheses
• The null hypothesis (H0) and alternative hypothesis (H1) of the Chi-
Square Test of Independence can be expressed in two different but
equivalent ways:
• H0: "[Variable 1] is independent of [Variable 2]"
H1: "[Variable 1] is not independent of [Variable 2]"
• OR
• H0: "[Variable 1] is not associated with [Variable 2]"
H1: "[Variable 1] is associated with [Variable 2]"
85. Example
• In the sample dataset, respondents were asked their gender and
whether or not they were a cigarette smoker. There were three
answer choices: Nonsmoker, Past smoker, and Current smoker.
Suppose we want to test for an association between smoking
behavior (nonsmoker, current smoker, or past smoker) and gender
(male or female) using a Chi-Square Test of Independence (we'll
use α = 0.05).