The document discusses using Six Sigma's DMAIC (Define, Measure, Analyze, Improve, Control) approach to evaluate the relevance of Technology Business Incubators (TBIs) in India. Data was collected on the occupancy rates of various TBIs across different sectors including ICT, bio-tech, instrumentation, and agriculture. The data was analyzed using ANOVA and interaction plots to assess intra-sector and inter-sector relevance of the TBIs. The results of the Define, Measure, and Analyze phases are presented, showing the occupancy rates and analyses conducted for each sector.
This document discusses methods for testing whether a data set is normally distributed. It describes both graphical and statistical tests for normality, including Q-Q plots and the Kolmogorov-Smirnov, Shapiro-Wilk, and Lilliefors tests. It then provides a detailed example of how to perform the Kolmogorov-Smirnov test for normality on a set of height data.
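For readers who want to reproduce this kind of check, a minimal Python sketch follows; the height values are invented placeholders, and the parameters of the reference normal are estimated from the sample (the situation the Lilliefors correction addresses).

```python
# Hypothetical sketch: Kolmogorov-Smirnov and Shapiro-Wilk normality tests
# on a small sample of heights (values invented for illustration).
import numpy as np
from scipy import stats

heights = np.array([162.0, 168.5, 171.2, 158.9, 175.4, 169.8, 173.1,
                    165.7, 160.3, 177.6, 166.4, 170.9])

# K-S test against a normal distribution with the sample mean and SD.
# (Estimating parameters from the data makes the test conservative;
# Lilliefors' correction addresses exactly this.)
ks_stat, ks_p = stats.kstest(heights, 'norm',
                             args=(heights.mean(), heights.std(ddof=1)))

# Shapiro-Wilk as a cross-check.
sw_stat, sw_p = stats.shapiro(heights)

print(f"K-S: D = {ks_stat:.3f}, p = {ks_p:.3f}")
print(f"Shapiro-Wilk: W = {sw_stat:.3f}, p = {sw_p:.3f}")
```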
This document contains output from statistical analyses performed on panel data using Stata. The analyses include:
1. Correlation analysis, pooled OLS regression, and tests for multicollinearity to examine the relationship between variables.
2. Specification error tests to check if the model is correctly specified.
3. Tests for normality of residuals to check model assumptions.
4. Panel regression using fixed effects and random effects models.
5. Tests to compare the fixed and random effects models and check for heteroskedasticity and autocorrelation.
In summary, the document analyzes relationships between variables in panel data and tests assumptions and specifications of regression models fit to the data.
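As a rough illustration of this workflow outside Stata, the sketch below runs a pooled OLS, a VIF-based multicollinearity check, a residual-normality test, and a demeaned (within) fixed-effects fit in Python; the file name and the columns y, x1, x2, firm are assumptions, not taken from the document.

```python
# Minimal sketch (not the Stata commands from the document): pooled OLS with
# multicollinearity and residual-normality checks, plus a within estimator.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from scipy import stats

df = pd.read_csv("panel.csv")            # hypothetical panel data file
X = sm.add_constant(df[["x1", "x2"]])

# 1. Pooled OLS
pooled = sm.OLS(df["y"], X).fit()
print(pooled.summary())

# 2. Variance inflation factors (multicollinearity check)
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns)
print(vif)

# 3. Normality of residuals (Jarque-Bera)
print(stats.jarque_bera(pooled.resid))

# 4. Fixed effects via the within transformation (demeaning by firm)
cols = ["y", "x1", "x2"]
demeaned = df[cols] - df.groupby("firm")[cols].transform("mean")
fe = sm.OLS(demeaned["y"], demeaned[["x1", "x2"]]).fit()
print(fe.params)
```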
11:20 Louvard - adjusting your level of competence to the difficulty of a CTO (Euro CTO Club)
1) Adjusting the difficulty level of chronic total occlusion (CTO) percutaneous coronary intervention (PCI) cases to the skill level of the operator is important. This can be done through patient selection based on predictors of success and operator experience.
2) A team-based approach, use of new devices and techniques, individual case volumes, and proctoring can help improve CTO PCI success rates.
3) Scores like the J-CTO score and the new CL-SCORE can help predict procedural success and guide patient selection and referral. Maintaining a database is useful for monitoring outcomes and individual operator success.
Comparing Machine Learning Algorithms in Text Mining (Andrea Gigli)
In this project I compare different machine learning algorithms on different text mining tasks.
ML algorithms: Naive Bayes, Support Vector Machine, Decision Trees, Random Forest, and Ordinal Regression
Tasks considered: Classifying Positive and Negative Reviews, Predicting Review Stars, Quantifying Sentiment Over Time, Detecting Fake Reviews
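A small scikit-learn sketch of the first task (positive/negative review classification) is shown below; the review texts and labels are toy examples, not the project's data.

```python
# Illustrative sketch (assumed data): comparing Naive Bayes and a linear SVM
# on a positive/negative review classification task with scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

reviews = ["great product, loved it", "terrible, broke after a day",
           "works as expected", "awful customer service",
           "excellent value", "would not buy again"]   # toy examples
labels = [1, 0, 1, 0, 1, 0]                            # 1 = positive, 0 = negative

for name, clf in [("Naive Bayes", MultinomialNB()),
                  ("Linear SVM", LinearSVC())]:
    pipe = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    scores = cross_val_score(pipe, reviews, labels, cv=3)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```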
This study analyzed the relationship between the Indian rupee-US dollar exchange rate and other macroeconomic variables like foreign institutional investments (FII), current account deficit (CAD), and trade balance over time. Graphical analysis showed CAD and trade balance moving together while FII and foreign exchange reserves moved in opposite directions of trade balance and CAD. Cointegration and vector autoregression tests confirmed a long-term cointegrating relationship. The error correction model found long-term causality from CAD to the rupee rate and from past FII, CAD, and other variables to trade balance. Granger causality tests indicated short-term uni-directional causality from FII to CAD and trade balance. In conclusion, the
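A hedged sketch of the kind of cointegration and Granger-causality checks described here, using statsmodels; the file and column names (usd_inr, fii, cad, trade_balance) are assumptions for illustration only.

```python
# Hypothetical sketch: Engle-Granger cointegration and Granger-causality
# tests of the kind described in the study summary above.
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

df = pd.read_csv("macro.csv", parse_dates=["date"], index_col="date")
# assumed columns: usd_inr (exchange rate), fii, cad, trade_balance

# Engle-Granger test: is the rupee rate cointegrated with CAD?
t_stat, p_value, _ = coint(df["usd_inr"], df["cad"])
print(f"Engle-Granger: t = {t_stat:.2f}, p = {p_value:.3f}")

# Granger causality: does FII help predict CAD (up to 4 lags)?
grangercausalitytests(df[["cad", "fii"]], maxlag=4)
```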
A Study on the Short Run Relationship b/w Major Economic Indicators of US Eco... (aurkoiitk)
The objective of this study was to develop an economic indicator system for the US economy that will help to forecast the turning points in the aggregate level of economic activity. Our primary concern is to study the short run relationship between the major economic indicators of the US economy (e.g. GDP, Money Supply, Unemployment Rate, Inflation Rate, Federal Fund Rate, Exchange Rate, Government Expenditure & Receipt, Crude Oil Price, Net Import & Export).
The document is a scorecard for Swindon Walk Bundling Centre summarizing their performance against key performance indicators (KPIs) for 2012/2013. It shows that in the area of people, the centre achieved their target of zero lost time accidents by implementing proactive safety management practices like weekly safety audits, safety walk rounds, and daily safety huddles to raise awareness and remove hazards before accidents occurred. Their sick absence rate was also lower than the previous year.
This document summarizes key concepts in building multiple regression models, including:
1) Analyzing nonlinear variables, qualitative variables, and building and evaluating regression models.
2) Transforming variables to improve model fit, including using indicator variables for qualitative data.
3) Common model building techniques like stepwise regression, forward selection, and backward elimination.
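As a sketch of the backward-elimination idea with an indicator (dummy) variable, not the document's own procedure, one can repeatedly drop the least significant term until all remaining p-values clear a threshold; the data file and column names below are hypothetical.

```python
# Rough sketch: backward elimination by p-value with statsmodels, including
# dummy (indicator) variables for a qualitative predictor.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("sales.csv")                                   # hypothetical data
df = pd.get_dummies(df, columns=["region"], drop_first=True,
                    dtype=float)                                # indicator variables
predictors = [c for c in df.columns if c != "sales"]

alpha = 0.05
while True:
    X = sm.add_constant(df[predictors])
    model = sm.OLS(df["sales"], X).fit()
    pvals = model.pvalues.drop("const")
    worst = pvals.idxmax()
    if pvals[worst] <= alpha:
        break                                  # all remaining terms significant
    predictors.remove(worst)                   # drop the weakest predictor

print(model.summary())
```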
1. The document describes an analysis of response surface methodology to model two responses (Y1 and Y2) based on three factors (A, B, C).
2. A central composite design with 17 runs was used to collect data on the responses across varying levels of the factors. Response surface regressions were then used to model each response as a function of the factors and their interactions.
3. For response Y1, the regression identified factors B and C as significant, while for response Y2, factors C and B*B were found to be significant based on a 95% confidence level. Contour and surface plots of the responses are also presented.
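A minimal sketch of fitting such a second-order response surface with statsmodels formulas, assuming the 17 central-composite runs were exported with columns A, B, C and Y1 (names are illustrative, not the document's data):

```python
# Illustrative sketch: second-order response surface for one response as a
# function of three coded factors A, B, C, using a statsmodels formula.
import pandas as pd
import statsmodels.formula.api as smf

ccd = pd.read_csv("ccd_runs.csv")   # assumed columns: A, B, C, Y1

model = smf.ols(
    "Y1 ~ A + B + C + I(A**2) + I(B**2) + I(C**2) + A:B + A:C + B:C",
    data=ccd).fit()
print(model.summary())              # terms with p < 0.05 are the significant effects
```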
The document contains multiple decision problems involving expected monetary value (EMV) calculations. The first problem involves determining the optimal act from among three acts (A1, A2, A3) based on their payoffs under three possible states of nature (S1, S2, S3) and the given probabilities. The optimal act determined using EMV is A1. Another problem involves determining whether a proposal should be accepted or rejected based on the EMV of each decision. The EMV calculation shows the decision should be to reject the proposal. A third problem involves calculating EMV under different criteria to determine the best investment from among stocks, bonds, and debentures.
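The EMV logic itself is a one-line expectation per act; the sketch below uses made-up probabilities and payoffs purely to show the calculation, not the document's figures.

```python
# Tiny sketch with invented payoffs: pick the act with the highest expected
# monetary value (EMV) given probabilities of the states of nature.
probs = {"S1": 0.5, "S2": 0.3, "S3": 0.2}            # assumed probabilities
payoffs = {                                           # assumed payoff table
    "A1": {"S1": 120, "S2": 80, "S3": -20},
    "A2": {"S1": 100, "S2": 90, "S3": 10},
    "A3": {"S1": 60,  "S2": 70, "S3": 50},
}

emv = {act: sum(probs[s] * pay[s] for s in probs) for act, pay in payoffs.items()}
best = max(emv, key=emv.get)
print(emv, "-> optimal act:", best)
```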
This document describes a project to reduce scrap rates in piston casting at a foundry from 7% to 2% over 4 months using optimization techniques. The foundry casts pistons for Tata trucks. Testing of process parameters like cooling time, gate size, water temperature and metal temperature identified relationships to reduce scrap. Statistical modeling determined the optimal settings were a cooling time of 47 seconds, gate size of 132 mm^2, water temperature of 32°C and metal temperature of 792°C to minimize scrap to an estimated 5.28%. The document recommends applying fuzzy logic, genetic algorithms or artificial intelligence to further improve the optimization model.
The document is a dissertation report submitted by Parmod Kumar that examines the behavioural analysis of P22 and P91 steels after TIG welding and post weld heat treatment (PWHT) processes. It includes an introduction to the materials, literature review on welding of P22 and P91 steels, identified research gaps, methodology adopted for the experimental plan, findings from the experiments, results and discussion, and plans for future work. The experimental plan involves TIG welding of P22 and P91 steel samples, analyzing hardness and microstructure after welding, conducting PWHT at varying hold times, and assessing the impact of PWHT on hardness and microstructure.
This document presents a SWOT analysis comparing the strengths and weaknesses of different quality strategies, including JIT, Lean, LSS, Six Sigma, TQM, and others. Tables of data on strengths and weaknesses are provided. Statistical analyses were conducted to determine if there are significant differences in the means for strengths and weaknesses among the different methods. Both analyses found significant differences, indicating at least one method has a mean strength or weakness that is different from the others.
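A one-way ANOVA of this kind can be reproduced in a few lines; the strength ratings below are invented for illustration only.

```python
# Hedged sketch with invented scores: one-way ANOVA testing whether mean
# strength ratings differ across quality strategies.
from scipy import stats

strengths = {                                  # hypothetical rating data
    "JIT":       [7, 6, 8, 7, 6],
    "Lean":      [8, 7, 9, 8, 8],
    "Six Sigma": [9, 8, 9, 9, 8],
    "TQM":       [6, 7, 6, 7, 7],
}

f_stat, p_value = stats.f_oneway(*strengths.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# p < 0.05 would indicate at least one strategy's mean differs from the rest
```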
The document discusses multi response optimization of friction stir welding (FSW) parameters for aluminum alloy AA6105 using the Taguchi method. It first provides background on welding processes including FSW. It then discusses factors that affect FSW quality like rotational speed, welding speed, and tilt angle. The document proposes using an L9 orthogonal array experiment with three control factors and mechanical properties and microstructure as response variables. The methodology involves conducting experiments, applying Taguchi analysis and ANOVA to optimize parameters for properties like tensile strength and hardness. The research aims to minimize defects and improve joint quality of AA6105. Equipment for FSW and response testing is also detailed.
The document proposes optimizing the mechanical properties of AA1100 metal matrix composites (MMCs) through mixture design of experiments (DoE). AA1100 alloy will be reinforced with silicon, copper and magnesium particles via stir casting. Response variables like hardness and compressive strength will be measured. Mixture DoE will then be used to optimize compositional and process parameters like percentage of reinforcements, stirring speed and time to achieve the required mechanical property ranges for car bodies. The methodology involves preparing composite samples, testing them, and analyzing the results to optimize the formulation and processing of AA1100 MMCs.
This document provides a synopsis for a presentation on integrating Lean Six Sigma and Industry 4.0 tools to manage quality in the Indian textile industry. It includes an introduction, literature review, identified research gaps, problem formulation, research objectives, proposed methodology, and research plan. The proposed methodology involves developing an integrated Lean Six Sigma and Industry 4.0 model called LSS 4.0 to address limitations of existing quality management techniques and help textile SMEs improve operational performance. A case study would validate and test the LSS 4.0 model in an Indian textile company. The research is expected to take 38 months to complete.
This document describes a study conducted to analyze factors that affect student admission rates at engineering institutes. A quantitative strategy called Box-Behnken design was used to examine the effects of 9 factors and their interactions on student response percentage. Statistical analysis found that brand name, location, fees, placement percentage, and certifications had significant positive effects, while factors like location and fees had negative effects. Response surface methodology and contour plots are presented to show relationships between factors and student response. The analysis provides insights for engineering institutes to optimize admission rates.
0000. the blockchain-revolution-an-analysis-of-regulation-and-technolo (Dr. Bikram Jit Singh)
This document provides an overview of blockchain technology and its potential applications and regulatory landscape. It defines key concepts like distributed ledger technology and differentiates digital currencies from blockchain. Blockchain allows for the decentralized verification and recording of transactions through a peer-to-peer network. The technology has applications beyond digital currencies, including for smart contracts that can automate transactions. Regulators globally are assessing how to oversee blockchain to support innovation while mitigating risks.
The document summarizes the application of Six Sigma's DMAIC approach to improve the process capability of PVC pipe extrusion. It analyzes critical process parameters like feeder RPM, barrel zone temperatures, die zone temperatures, and haul off RPM using tools like correlation, regression, ANOVA, and t-tests. Significant parameters identified are feeder RPM, BZ3T, DZ2T, and DZ3T. The document proposes using Taguchi's method of parametric optimization to improve the process by setting control factor levels for the significant parameters.
This document discusses measurement system analysis (MSA), which is used to evaluate statistical properties of process measurement systems. MSA determines if current measurement systems provide representative, unbiased and minimal variability measurements. The document outlines the MSA process, including preparing for a study, evaluating stability, accuracy, precision, linearity, and repeatability and reproducibility. Accuracy looks at bias while precision considers repeatability and reproducibility. MSA is required for certification and helps identify process variation sources and minimize defects.
The document discusses methods to enhance RAM (reliability, availability, maintainability) of systems. It provides a regression equation that models availability percentage as a function of reliability and maintainability percentages, based on analysis of data from different machines. Various graphs and statistical analyses are also presented to compare mean time between failures (MTBF) of the machines and identify differences between them.
Tensile strength and hardness tests were conducted on aluminum alloy welded specimens using a UTM and Vickers hardness tester located at CITCO, IDFC, Chandigarh. The tensile test used an UTM machine to apply loads to specimens based on ASTM standards and record the stress-strain curves to evaluate tensile strength and elongation. Hardness tests used a Vickers hardness tester to indent specimens with a 1 kgf load and record the impression values to determine material hardness. Test certificates in the appendices provide hardness and tensile strength data for the base aluminum alloys and welded samples tested.
This document discusses measurement system analysis (MSA) and gauge repeatability and reproducibility (R&R) studies. MSA is used to evaluate different aspects of a measurement system like bias, linearity, stability, repeatability and reproducibility. R&R studies focus specifically on repeatability and reproducibility. Key terms are defined, including bias, repeatability, reproducibility, stability, linearity, attribute R&R parameters like effectiveness, misses, false alarms, and bias, and how to analyze variable measurement data using analysis of variance. Guidelines for acceptable levels of R&R parameters are also provided.
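For the variable (ANOVA-based) part of a gauge R&R study, a rough Python sketch of the variance-component arithmetic is given below; the data file and the column names part, operator, y are assumptions, and negative component estimates are clipped to zero as is common practice.

```python
# Rough sketch: ANOVA-based gauge R&R variance components from a crossed
# study with columns `part`, `operator` and repeated measurements `y`.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

msa = pd.read_csv("gage_study.csv")     # assumed measurement data
p = msa["part"].nunique()
o = msa["operator"].nunique()
n = len(msa) // (p * o)                 # replicates per part-operator cell

aov = anova_lm(smf.ols("y ~ C(part) * C(operator)", data=msa).fit())
ms = aov["sum_sq"] / aov["df"]          # mean squares

repeatability = ms["Residual"]
interaction   = max((ms["C(part):C(operator)"] - ms["Residual"]) / n, 0)
operator_var  = max((ms["C(operator)"] - ms["C(part):C(operator)"]) / (p * n), 0)
part_var      = max((ms["C(part)"] - ms["C(part):C(operator)"]) / (o * n), 0)

grr   = repeatability + operator_var + interaction
total = grr + part_var
print(f"%GR&R = {100 * (grr / total) ** 0.5:.1f}%")   # <10% is typically acceptable
```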
Keywords: six sigma; foundry SMEs; small and medium-sized enterprises; design of experiments; DOE; measurement system analysis; MSA; failure mode and effects analysis; FMEA; non-conforming products; cost of poor quality; hypothesis testing; defects per million opportunities; DPMO; process capability; DMAICS; analysis of variance; ANOVA; India; make-to-order foundries; scrap reduction; productivity.
Keywords: six sigma, DMAIC project, scrap, rework, analysis of variance, ANOVA, design of experiments, DOE, process audit sheets, India, foundries, foundry industry, SMEs, small and medium–sized enterprises, die casting
The document summarizes research on optimizing machining parameters for CNC turning of aluminum alloy 7020 using Response Surface Methodology (RSM). It provides background on CNC turning, tool materials, workpiece material (Al 7020), and machining parameters to be optimized (cutting speed, feed rate, depth of cut). The document also reviews previous literature on optimizing machining parameters for aluminum alloys and describes the methodology used in the present study.
This document outlines a Six Sigma project to optimize the backup power systems at Baba Banda Singh Bahadur Engineering College using diesel generator sets manufactured by Cummins India Ltd. The goal is to enhance the mileage (units generated per liter of diesel) of the generator sets. The project will follow the DMAIC methodology over 5-7 months. Initial data collection and analysis found the current sigma level of the generator sets to be around 1.0, indicating a need for improvement. Various factors that could impact mileage will be analyzed using tools like root cause analysis, ANOVA, and regression analysis to identify opportunities.
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECT (jpsjournal1)
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been referred to as the "New Great Game." This research centres on the power struggle, considering geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil politics, and conventional and nontraditional security are all explored and explained by the researcher. Using Mackinder's Heartland, Spykman's Rimland, and Hegemonic Stability theories, it examines China's role in Central Asia. The study adheres to the empirical epistemological method and takes care to remain objective, critically analysing primary and secondary research documents to elaborate the role of China's geo-economic outreach in Central Asian countries and its future prospects. According to this study, China is seeing significant success in trade, pipeline politics, and gaining influence over other governments, a success attributable to the effective utilisation of key tools such as the Shanghai Cooperation Organisation and the Belt and Road Economic Initiative.
Introduction: e-waste definition, sources of e-waste, hazardous substances in e-waste, effects of e-waste on environment and human health, need for e-waste management, e-waste handling rules, waste minimization techniques for managing e-waste, recycling of e-waste, disposal and treatment methods of e-waste, mechanism of extraction of precious metals from leaching solution, global scenario of e-waste, e-waste in India, case studies.
Embedded machine learning-based road conditions and driving behavior monitoring (IJECEIAES)
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
6th International Conference on Machine Learning & Applications (CMLA 2024) (ClaraZara1)
The 6th International Conference on Machine Learning & Applications (CMLA 2024) will provide an excellent international forum for sharing knowledge and results in the theory, methodology and applications of Machine Learning & Applications.
A SYSTEMATIC RISK ASSESSMENT APPROACH FOR SECURING THE SMART IRRIGATION SYSTEMS (IJNSA Journal)
The smart irrigation system represents an innovative approach to optimize water usage in agricultural and landscaping practices. The integration of cutting-edge technologies, including sensors, actuators, and data analysis, empowers this system to provide accurate monitoring and control of irrigation processes by leveraging real-time environmental conditions. The main objective of a smart irrigation system is to optimize water efficiency, minimize expenses, and foster the adoption of sustainable water management methods. This paper conducts a systematic risk assessment by exploring the key components/assets and their functionalities in the smart irrigation system. The crucial role of sensors in gathering data on soil moisture, weather patterns, and plant well-being is emphasized in this system. These sensors enable intelligent decision-making in irrigation scheduling and water distribution, leading to enhanced water efficiency and sustainable water management practices. Actuators enable automated control of irrigation devices, ensuring precise and targeted water delivery to plants. Additionally, the paper addresses the potential threat and vulnerabilities associated with smart irrigation systems. It discusses limitations of the system, such as power constraints and computational capabilities, and calculates the potential security risks. The paper suggests possible risk treatment methods for effective secure system operation. In conclusion, the paper emphasizes the significant benefits of implementing smart irrigation systems, including improved water conservation, increased crop yield, and reduced environmental impact. Additionally, based on the security analysis conducted, the paper recommends the implementation of countermeasures and security approaches to address vulnerabilities and ensure the integrity and reliability of the system. By incorporating these measures, smart irrigation technology can revolutionize water management practices in agriculture, promoting sustainability, resource efficiency, and safeguarding against potential security threats.
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODEL (gerogepatton)
As digital technology becomes more deeply embedded in power systems, protecting the communication networks of Smart Grids (SG) has emerged as a critical concern. Distributed Network Protocol 3 (DNP3) represents a multi-tiered application layer protocol extensively utilized in Supervisory Control and Data Acquisition (SCADA)-based smart grids to facilitate real-time data gathering and control functionalities. Robust Intrusion Detection Systems (IDS) are necessary for early threat detection and mitigation because of the interconnection of these networks, which makes them vulnerable to a variety of cyberattacks. To solve this issue, this paper develops a hybrid Deep Learning (DL) model specifically designed for intrusion detection in smart grids. The proposed approach is a combination of the Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) algorithms. We employed a recent intrusion detection dataset (DNP3), which focuses on unauthorized commands and Denial of Service (DoS) cyberattacks, to train and test our model. The results of our experiments show that our CNN-LSTM method is much better at finding smart grid intrusions than other deep learning algorithms used for classification. In addition, our proposed approach improves accuracy, precision, recall, and F1 score, achieving a high detection accuracy rate of 99.50%.
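A hypothetical Keras sketch of a CNN-LSTM classifier of this general shape is given below; the window length, feature count, and layer sizes are assumptions, not the paper's architecture.

```python
# Hypothetical sketch of a CNN-LSTM intrusion classifier; shapes and
# hyperparameters are assumed, not taken from the paper.
import tensorflow as tf

TIMESTEPS, FEATURES, CLASSES = 50, 12, 2   # assumed DNP3 window shape

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, FEATURES)),
    tf.keras.layers.Conv1D(64, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=20)
```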
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressions (Victor Morales)
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
Understanding Inductive Bias in Machine Learning (SUTEJAS)
This presentation explores the concept of inductive bias in machine learning. It explains how algorithms come with built-in assumptions and preferences that guide the learning process. You'll learn about the different types of inductive bias and how they can impact the performance and generalizability of machine learning models.
The presentation also covers the positive and negative aspects of inductive bias, along with strategies for mitigating potential drawbacks. We'll explore examples of how bias manifests in algorithms like neural networks and decision trees.
By understanding inductive bias, you can gain valuable insights into how machine learning models work and make informed decisions when building and deploying them.
Synthesizing tbi relevance in india through six sigma approach
1. SYNTHESISING TBI RELEVANCE IN INDIA THROUGH SIX SIGMA APPROACH
By
Dr. Bikram Jit Singh
Professor
MMDU, Mullana
2. INTRODUCTION
- TBIs and their significance
-- Status in India
-- Classification of TBIs
-- Relevance concept
LITERATURE SURVEY
- TBIs
-- Need for TBIs
-- Existing work on the relevance of TBIs
PROBLEM FORMULATION
- What happens without TBIs
-- Indian context
-- Why study relevance
METHODOLOGY ADOPTED
- Six Sigma's DMAIC approach to judge the relevance of TBIs
CASE FINDINGS
- Define, Measure & Analyse phases
- Result appraisal
CONCLUSIONS
REFERENCES
4. MEASURE PHASE
[Main Effects Plot for Avg. Occupancy of TBIs (data means, in %): Mean Occupancy (roughly 8.0 to 11.5%) plotted against Thrust Area - Services, ICT Sector, Bio-Tech Sector, Agriculture, Advanced Technologies.]
5. MEASURE PHASE
Low occupancy is because of meager awareness and non-availability of TBIs in India.
[Graphical summary for Avg. Occupancy of TBIs in India (N = 30): Anderson-Darling normality test A-squared = 0.57, p-value = 0.125; Mean = 9.7333, StDev = 3.3930, Variance = 11.5126, Skewness = 0.80, Kurtosis = 0.52; Minimum = 5.00, 1st Quartile = 7.00, Median = 9.50, 3rd Quartile = 11.25, Maximum = 19.00; 95% CI for Mean 8.4664 to 11.0003, for Median 8.0000 to 10.7713, for StDev 2.7022 to 4.5613.]
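The same style of summary can be reproduced outside Minitab; the sketch below runs descriptive statistics and an Anderson-Darling normality test with pandas and scipy on placeholder occupancy values (the slide's raw data are not reproduced here).

```python
# Minimal sketch (placeholder data, not the slide's 30 observations):
# descriptive statistics plus an Anderson-Darling normality check.
import pandas as pd
from scipy import stats

occupancy = pd.Series([5, 7, 8, 9, 9.5, 10, 11, 11.5, 12, 19])  # placeholder values

print(occupancy.describe())                 # mean, quartiles, min, max
ad = stats.anderson(occupancy, dist="norm")
print("A-squared =", round(ad.statistic, 2))
print("critical values:", ad.critical_values, "at", ad.significance_level, "%")
```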
6. MEASURE PHASE
DATA MEASURED FOR INTRA-RELEVANCE ANALYSIS
(WITHIN A THRUST AREA)
DATA MEASURED FOR INTER-RELEVANCE ANALYSIS
(AMONG VARIOUS THRUST AREAS)
8. ANALYSE PHASE
Plan for Synthesizing Relevance of TBIs
Sr. No. | Type of Analysis | Technique / Tool Used
1 | Intra Sector | Unstacked ANOVA (One-Way Analysis of Variance)
2 | Inter Sector | Interaction Plot
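A hedged sketch of both steps in this plan, assuming the relevance measurements are available as a data frame with columns thrust_area, tbi_service and relevance (names are illustrative, and multiple observations per group are assumed for the ANOVA):

```python
# Hedged sketch of the two planned analyses: a one-way ANOVA for
# intra-sector relevance and a simple interaction-style plot across sectors.
import pandas as pd
import matplotlib.pyplot as plt
from scipy import stats

df = pd.read_csv("relevance.csv")   # assumed columns: thrust_area, tbi_service, relevance

# Intra-sector: one-way ANOVA of relevance across TBI services within one thrust area
sector = df[df["thrust_area"] == "A"]
groups = [g["relevance"].values for _, g in sector.groupby("tbi_service")]
print(stats.f_oneway(*groups))

# Inter-sector: mean relevance per service, one line per thrust area
means = df.pivot_table(index="tbi_service", columns="thrust_area", values="relevance")
means.plot(marker="o")
plt.ylabel("Mean relevance")
plt.show()
```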
28. Relevance data (columns: Thrust Area, TBI Services, Relevance)
A G1 0.53
A G2 0.62
A G3 0.74
A G4 0.62
A G5 0.76
B G1 0.62
B G2 0.6
B G3 0.65
B G4 0.56
B G5 0.7
C G1 0.57
C G2 0.64
C G3 0.58
C G4 0.39
C G5 0.51
D G1 0.42
D G2 0.62
D G3 0.54
D G4 0.52
D G5 0.35
E G1 0.22
E G2 0.4
E G3 0.41
E G4 0.52
E G5 0.49
[Residual Plots for Relevance: Normal Probability Plot, Residuals Versus Fits, Histogram of Residuals, Residuals Versus Observation Order.]
INTER-SECTOR RELEVANCE ANALYSIS
29. Two-way ANOVA: Relevance Versus Thrust Area & TBI Services
Source DF SS MS F P
Thrust Area 4 0.201344 0.0503360 5.76 0.005
TBI Services 4 0.043064 0.0107660 1.23 0.336
Error 16 0.139736 0.0087335
Total 24 0.384144
S = 0.09345 R-Sq = 63.62% R-Sq(adj) = 45.44%
Individual 95% CIs For Mean Based on
Thrust Pooled StDev
Area Mean ---+---------+---------+---------+------
A 0.654 (------*-------)
B 0.626 (------*-------)
C 0.538 (-------*------)
D 0.490 (-------*------)
E 0.408 (------*------)
---+---------+---------+---------+------
0.36 0.48 0.60 0.72
Individual 95% CIs For Mean Based on
TBI Pooled StDev
Services Mean --+---------+---------+---------+-------
G1 0.472 (----------*----------)
G2 0.576 (----------*----------)
G3 0.584 (----------*----------)
G4 0.522 (----------*----------)
G5 0.562 (----------*----------)
--+---------+---------+---------+-------
0.400 0.480 0.560 0.640
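For reference, the same two-way ANOVA can be reproduced with statsmodels if the 25-row table above is saved as a CSV; the column names below are assumed, and this is an illustration rather than a re-run of the slide's Minitab session.

```python
# Illustrative sketch: two-way ANOVA of Relevance versus Thrust Area and
# TBI Services (no interaction term, matching the slide's layout).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

data = pd.read_csv("relevance_table.csv")   # assumed columns: thrust_area, tbi_service, relevance
fit = smf.ols("relevance ~ C(thrust_area) + C(tbi_service)", data=data).fit()
print(anova_lm(fit))                        # slide reports thrust area significant, services not
```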
42. OVERALL RELEVANCE-ANALYSIS OF INTER-TBI SERVICES
[Multi-Vari Chart for Relevance by Thrust Area - TBI Services: Relevance (0.2 to 0.8) plotted against TBI Services G1 to G5, with separate lines for Thrust Areas A to E.]