The document summarizes a study on determining the optimal disassembly sequence of a cell phone to minimize the number of workstations and idle times. Data on three workers disassembling 25 parts of a Samsung cell phone were analyzed using a sequencing algorithm from operations research. The analysis found a total estimated time of 154 minutes, with idle times of 9, 15, and 13 minutes for the three workers respectively. This allows workers to be placed so as to minimize idle time on a disassembly line.
This document summarizes a report on developing a total cost model for additive manufacturing (AM), also known as 3D printing. It conducted experiments using laser sintering (LS) and selective laser melting (SLM) to understand build failure rates, manual post-processing costs, and how costs decrease with higher machine utilization. The results provide a methodology to analyze the full economic costs of AM and understand when it may have a viable business case compared to traditional manufacturing.
IRJET-Design Optimization of Free Standing Communication Tower using Genetic ... (IRJET Journal)
This document describes research optimizing the design of free standing communication towers using a genetic algorithm approach. It summarizes the objectives of minimizing tower weight while satisfying structural constraints. The study develops a genetic algorithm using MATLAB to optimize 7 configurations of communication towers. Results show the genetic algorithm yields a 1-2% reduction in optimal tower weight compared to other optimization methods like particle swarm optimization. The genetic algorithm is validated on benchmark problems and applied to optimize the member sizing of communication towers with different bracing configurations. Overall, the research demonstrates genetic algorithms can effectively optimize structural designs like tower configurations for reduced weight.
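The optimization loop such a study describes can be sketched compactly. The fitness function below is a toy stand-in (total member size with a penalty for violating a single capacity constraint), not the paper's structural model; the population size, mutation rate, variable bounds, and penalty weight are all illustrative assumptions.

```python
import random

def fitness(x, min_capacity=10.0):
    """Toy objective: total member 'weight' plus a large penalty when a
    stand-in capacity constraint (sum of sizes >= min_capacity) is violated."""
    weight = sum(x)
    violation = max(0.0, min_capacity - weight)
    return weight + 1000.0 * violation  # penalty method for the constraint

def genetic_algorithm(n_vars=7, pop_size=30, generations=200,
                      mutation_rate=0.1, seed=42):
    rng = random.Random(seed)
    pop = [[rng.uniform(0.5, 5.0) for _ in range(n_vars)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)              # lower fitness is better
        survivors = pop[:pop_size // 2]    # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_vars)  # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_vars):         # bounded Gaussian mutation
                if rng.random() < mutation_rate:
                    child[i] = min(5.0, max(0.5, child[i] + rng.gauss(0, 0.3)))
            children.append(child)
        pop = survivors + children
    best = min(pop, key=fitness)
    return best, fitness(best)

best, best_fitness = genetic_algorithm()
print(round(best_fitness, 2))
```

Because half the population is carried over unchanged each generation, the best solution never worsens, and the run converges toward the constraint boundary where the penalty vanishes.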
WHEN DID PRECISION ENGINEERING START?
Precision Engineering was first published in January 1979; since 1986 it has also been known to many of its readers as the Journal of the American Society for Precision Engineering. With effect from 2000, it assumed a new look, proudly proclaiming itself the Journal of the International Societies for Precision Engineering and Nanotechnology.
This document summarizes a bachelor's thesis that analyzes how additive manufacturing (3D printing) could impact the after-sales services supply chain at Fokker Services B.V. The author identifies four key areas where 3D printing could help reduce costs: 1) addressing obsolescence issues when original parts are no longer available, 2) providing a lower-cost production alternative for some spare parts, 3) enabling part consolidation through redesign, and 4) potentially allowing for decentralization if airline operators invest in 3D printing. A data model is used to identify 235 spare parts that could benefit from 3D printing. Further analysis of 15 parts found 4 that are suitable for 3D printing and likely to provide cost savings.
This document discusses the implementation of design for manufacturing (DFM) principles in mechanical engineering education. It provides examples of applying DFM to dimensional tolerancing using standards to ensure functionality and manufacturability. Advanced computer technologies like CAD/CAM and AI are also discussed in improving the link between design and manufacturing to better implement DFM concepts. The authors aim to integrate DFM training earlier in the manufacturing engineering curriculum.
IRJET- Optimization of RC Column and Footings using Genetic Algorithm (IRJET Journal)
This document discusses using genetic algorithms to optimize the design of reinforced concrete columns and footings to minimize cost. A visual basic application code was developed to design RC columns and footings according to code specifications. The design variables like column dimensions, reinforcement size and spacing were optimized using a genetic algorithm coded in MATLAB. The optimized design results were compared to standard design results and found to provide a more economical design that meets all structural requirements.
This document summarizes research on the mechanical behavior of 3D printed parts. It discusses how factors like layer thickness, build orientation, and temperature can influence properties of printed samples. Tensile strengths of polyether-ether-ketone (PEEK) samples were significantly affected by layer thickness and raster angle. The document also outlines applications of 3D printing like medical implants and prototypes. A variety of materials that can be 3D printed are listed, including metals, polymers, and biopolymers.
Fatigue and fracture behavior of additively manufactured metals after heat tr... (TAV VACUUM FURNACES)
Additive Manufacturing (AM) is any of various processes of making three-dimensional solid objects from a digital file.
Unlike subtractive manufacturing methods that start with a solid block of material and then cut away the excess to create a finished part, additive manufacturing builds up a part (or features onto parts) layer by layer from geometry described in a 3D design model.
For many decades, AM processes have been used for rapid prototyping. Over the last few years, additive manufacturing has gained incredible interest in all industry facets: from aerospace applications to simple one-off consumer home builds. This technology has immense versatility and flexibility, due to its ability to create complex geometries with customizable material properties.
Discover how additive manufacturing of metals makes it possible to design and build lightweight parts in real time, and understand the potential of vacuum heat treatments for 3D printed parts.
Our Industrial Modeling Service (IMS) involves several important (but rarely implemented) methods to significantly improve and advance your existing models and data. Since good decision-making requires good models and data, IMS is ideally suited to support this continuous-improvement endeavour. IMS is specifically designed either to co-exist with your existing design, planning, and scheduling applications, or to feed the same models and data seamlessly into our Industrial Modeling and Programming Language (IMPL) to create new value-added applications. The following techniques form the basis of our IMS offering.
IRJET- Finite Element Analysis of Passenger Vehicle Bumper (IRJET Journal)
This document presents a finite element analysis of passenger vehicle bumpers to improve safety performance. The study models and simulates bumpers in LS-DYNA software to analyze deformation, impact force, stress distribution, and energy absorption using different materials and designs. Results show that increasing the bumper thickness from 3 mm to 5 mm significantly reduces maximum strain from 50% to 15% and decreases maximum deformation from 476 mm to 423 mm, improving the bumper's ability to absorb impact energy.
IRJET- A Study on the Behavior of Grid Slab Subjected to Seismic Loading (IRJET Journal)
This document discusses multi-cornered thin-wall sections for improving vehicle crashworthiness and occupant protection. It analyzes the collapse behavior of square, hexagonal, and octagonal cross-sections as well as a newly introduced 12-edge section through computer simulations. The 12-edge section was found to have high energy absorption capacity through stable collapse. Nonlinear finite element analysis was performed using ABAQUS to evaluate the response of the different section geometries under axial loading and over a range of steel grades. The goal was to maximize the specific energy absorption of the cross-sections to better manage crash energy and improve occupant safety.
Application of non traditional optimization for quality improvement in tool h... (iaemedu)
This document discusses the application of non-traditional optimization techniques for quality improvement in tool holders. It begins by introducing the use of Taguchi's design of experiments, response surface methodology, genetic algorithms, and particle swarm optimization to optimize process parameters and minimize defects. The document then reviews past literature on using various optimization methods like genetic algorithms and response surface methodology in metal cutting processes. Finally, the document describes applying these techniques to solve the specific problem of optimizing parameters in a face profile grinding process to reduce rejection rates.
Application of non traditional optimization for quality improvement in tool ... (iaemedu)
This document discusses applying non-traditional optimization techniques to improve quality in tool holders. It begins with an abstract that outlines using Taguchi's design of experiments, response surface methodology, and genetic algorithms to optimize grinding process parameters and minimize defects. The document then reviews relevant literature on using optimization methods like genetic algorithms and response surface methodology to determine optimal cutting conditions. It presents the methodology used, which includes conducting experiments using an L9 orthogonal array to evaluate control parameters like cutting speed, work head speed, and feed rate. The goal is to develop a mathematical model relating these parameters to quality measures and then use the model within genetic algorithms to find optimal parameter settings that minimize defects.
IRJET- Simulation and Analysis of Step Light Mid Part using Mold Flow Analysis (IRJET Journal)
This document summarizes a study that used mold flow analysis and the Taguchi method to optimize the injection molding process parameters for a step light mid part. The researchers 3D modeled the part, performed meshing, and selected polycarbonate as the material. They identified four critical parameters (mold surface temperature, melt temperature, injection time, and V/P switch-over) and used an L9 orthogonal array to design experiments varying the parameters at three levels. Nine experiments were conducted and analyzed for signal-to-noise ratios to determine the optimum parameters. Simulation results showed fill time, pressure distribution, and velocity, and identified the optimum gate location. Comparing the simulation and experimental trial results validated the optimized parameters.
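The signal-to-noise step in a Taguchi analysis like this one reduces to a short calculation. For a smaller-the-better quality characteristic, the S/N ratio of each run is -10 log10 of the mean squared response; the replicate values below are hypothetical placeholders, not the study's measurements.

```python
import math

def sn_smaller_the_better(responses):
    """Taguchi S/N ratio (dB) for a smaller-the-better characteristic."""
    mean_sq = sum(y * y for y in responses) / len(responses)
    return -10.0 * math.log10(mean_sq)

# Hypothetical replicate measurements for three of the nine L9 runs.
runs = {
    "run1": [0.21, 0.24, 0.22],
    "run2": [0.15, 0.16, 0.14],
    "run3": [0.30, 0.28, 0.33],
}
ratios = {run: sn_smaller_the_better(y) for run, y in runs.items()}
best = max(ratios, key=ratios.get)  # higher S/N is better
print(best)
```

A full analysis would average the S/N ratios per parameter level across all nine runs to pick the optimum setting of each factor.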
Development of models using genetic programming for turning (IAEME Publication)
The document summarizes a study that used genetic programming to develop models for predicting surface roughness and cutting forces during the turning of Inconel 718 alloy with coated carbide tools. Experiments were conducted according to a Taguchi orthogonal array with cutting speed, feed rate, and depth of cut as input variables and surface roughness, cutting force, feed force, and thrust force as output responses. Genetic programming was used to generate mathematical models relating the input and output variables. The models were validated against experimental data and were able to predict the output values reasonably well. The models can be used to optimize the turning process and identify parameters that achieve a better surface finish and higher material removal rates.
IRJET- Optimization of Machining Facility Layout by using Simulation: Cas... (IRJET Journal)
This document describes a case study of optimizing the facility layout of a machining facility through simulation. The existing layout is analyzed using FlexSim simulation software. An alternative layout is generated using Systematic Layout Planning methodology and also analyzed using FlexSim. Simulation results show the proposed layout reduces material handling distance by 40-65%, time to start first machining by 65-97%, and throughput time by 3-12% compared to the existing layout. This case study demonstrates how simulation can be used to evaluate facility layout alternatives and optimize performance.
This is a presentation I made for the Pacific Design and Manufacturing conference in February 2014. There were three presentations, and this was the first one. It takes a look at some of the new materials in 3D printing.
This document discusses modeling safety features for integration in the design process. It defines safety features and categorizes them into two main groups: safety features incorporated in machines and those outside of machines. Nine criteria are identified to characterize safety features. The methodology involves defining, identifying, classifying, and characterizing safety features. It also represents the design process as a feedback loop and establishes an algorithm for integrating safety considerations at the earliest design stages. The goal is to proactively address risks to improve safety.
On July 10th, Innovate UK and the KTN held a business innovation day to showcase 30 of the Innovate UK projects currently active in the area of Additive Manufacturing. The presentations and pitches made on the day are now available to download. Topic 3 focuses on post-processing.
This document contains an exam for a Bachelor of Commerce degree in Operations Research. It includes 5 questions testing various concepts in operations research. Question 1 has parts on inventory reasons, queuing arrivals, and formulating a linear programming problem. Question 2 involves multiple decision criteria for an advertising program choice. Question 3 covers holding costs and an economic order quantity problem. Question 4 tests project management phases and performing a critical path analysis. Question 5 has parts on queuing behavior factors and analyzing a healthcare queuing system.
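The economic order quantity part of such an exam is typically solved with the classic Wilson formula, EOQ = sqrt(2DS/H), where D is annual demand, S the ordering cost per order, and H the annual holding cost per unit. The figures below are illustrative, not taken from the exam paper.

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Economic order quantity: sqrt(2DS/H)."""
    return math.sqrt(2 * demand * order_cost / holding_cost)

# Illustrative figures: 1200 units/year, $50 per order, $6 holding per unit-year.
q = eoq(1200, 50, 6)
total_cost = (1200 / q) * 50 + (q / 2) * 6  # ordering cost + holding cost
print(round(q, 1), round(total_cost, 2))
```

A useful check: at the optimum order quantity, annual ordering cost and annual holding cost are equal.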
This document outlines the sections and questions for an Operations Research exam, including:
Section A contains 8 multiple choice questions worth 1 point each about operations research concepts like infeasible solutions, initial basic feasible solutions, and transportation problems.
Section B contains 5 longer answer questions worth 2 points each, exploring topics such as the phases of operations research, linear programming terms and concepts, and transportation algorithms.
Section C includes 3 essay questions worth 5 points each, requiring explanations of important operations research techniques, transportation models, and solving sample linear programming problems graphically and using the transportation simplex method.
This document contains a 10 question exam on digital signal processing. The questions cover topics such as the discrete Fourier transform (DFT) and its properties, the fast Fourier transform (FFT) algorithm and its efficiency improvements over the DFT, Butterworth filters including determining filter parameters and converting between filter types, realizing transfer functions in cascade and parallel forms, divide and conquer DFT computation, overlap save and add methods for linear filtering, and FIR system structures including direct, cascade and lattice forms. Short notes are also requested on Goertzel's algorithm, circular convolution, and Parseval's theorem.
This document is a 4-page examination for an Operations Research course containing 7 questions worth a total of 100 marks. Students are asked to answer any 5 of the 7 questions, which cover topics such as linear programming, transportation problems, assignment problems, sequencing problems, queuing theory, simulation, game theory, and replacement problems. Questions involve solving models mathematically as well as discussing concepts in operations research.
This document contains information about an operations research exam including:
1) Two questions about formulating mathematical models for optimization problems involving animal feed mixtures and solving a linear programming problem.
2) Two questions about assigning jobs to mechanics to maximize return and solving a transportation problem to minimize costs.
3) Two questions involving determining optimal job sequences to minimize time and performing a critical path analysis on a project with activity times.
4) Two questions about determining optimum order levels and production quantities to minimize inventory and shortage costs.
5) A question about determining performance measures for a tax consulting firm with Poisson arrivals and exponential service times.
6) Two questions involving solving a two-player zero-sum game using dominance rules
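The queuing question above (Poisson arrivals, exponential service times) is the textbook single-server M/M/1 model, whose performance measures have closed forms. The arrival and service rates below are hypothetical, not the exam's data.

```python
def mm1_measures(lam, mu):
    """Steady-state performance measures for a single-server M/M/1 queue
    (Poisson arrivals at rate lam, exponential service at rate mu)."""
    assert lam < mu, "need utilization below 1 for a steady state"
    rho = lam / mu                  # server utilization
    return {
        "rho": rho,
        "L": rho / (1 - rho),       # average number in the system
        "Lq": rho ** 2 / (1 - rho), # average number waiting in queue
        "W": 1 / (mu - lam),        # average time in the system
        "Wq": rho / (mu - lam),     # average waiting time in queue
    }

# Hypothetical rates: 8 clients arrive per hour, one consultant serves 10/hour.
m = mm1_measures(8, 10)
print(round(m["rho"], 2), round(m["L"], 2), round(m["Wq"], 2))
```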
1) A project is a temporary endeavor to create a unique product or service, while operations are ongoing activities.
2) Progressive elaboration is the process where a project concept becomes more detailed and focused over time as planning and research activities continue.
3) Project scope defines only the work required to complete the project.
Sequencing problems in Operations Research (Abu Bashar)
The document discusses sequencing problems and various sequencing rules used to optimize outputs when assigning jobs to machines. It describes Johnson's rule for minimizing total completion time when scheduling jobs on two workstations: select the job with the shortest processing time; if that time falls on the first workstation, schedule the job as early as possible, otherwise as late as possible. It provides an example of applying Johnson's rule to schedule five motor repair jobs at the Morris Machine Company across two workstations. Finally, it discusses Johnson's three-machine rule for sequencing jobs across three machines.
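Johnson's two-workstation rule is straightforward to implement: repeatedly pick the unscheduled job with the smallest single processing time; if that time falls on the first workstation, place the job as early as possible, otherwise as late as possible. The five repair-job times below are placeholders, not the Morris Machine Company figures from the document.

```python
def johnsons_rule(jobs):
    """jobs: dict of name -> (time_on_m1, time_on_m2).
    Returns the sequence minimizing makespan on two machines."""
    front, back = [], []
    remaining = dict(jobs)
    while remaining:
        # job with the smallest single processing time on either machine
        name = min(remaining, key=lambda j: min(remaining[j]))
        t1, t2 = remaining.pop(name)
        if t1 <= t2:
            front.append(name)    # shortest time on machine 1: schedule early
        else:
            back.insert(0, name)  # shortest time on machine 2: schedule late
    return front + back

def makespan(sequence, jobs):
    """Completion time of the last job on machine 2 for a given order."""
    end1 = end2 = 0
    for name in sequence:
        t1, t2 = jobs[name]
        end1 += t1                   # machine 1 processes jobs back to back
        end2 = max(end2, end1) + t2  # machine 2 waits for machine 1's output
    return end2

jobs = {"A": (4, 5), "B": (17, 7), "C": (14, 12), "D": (9, 2), "E": (11, 6)}
seq = johnsons_rule(jobs)
print(seq, makespan(seq, jobs))
```

The `makespan` helper also makes it easy to verify the rule by comparing against other permutations of the same jobs.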
This document provides lecture notes on operations research (OR). It begins with introductions to key OR terminology and concepts, then outlines the 7-step methodology typically used in OR studies. A brief history of OR is given, noting its origins in the UK during World War II to aid military operations. Basic OR concepts are defined, including decision variables, constraints, objective functions, and optimal solutions. An example problem involving two mines is presented to illustrate these concepts.
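A small two-variable linear program like the two-mine example can be solved by enumerating the vertices of the feasible region, since an optimum of an LP lies at a vertex. The cost and output coefficients below follow the familiar two-mine illustration but should be treated as assumed data rather than the notes' exact figures.

```python
from itertools import combinations

# Minimize 180*x + 160*y (daily operating cost of running two mines x and y days)
# subject to ore output requirements: 6x + y >= 12, 3x + y >= 8, 4x + 6y >= 24,
# with x, y >= 0. All coefficients are illustrative.
constraints = [  # each row (a, b, c) means a*x + b*y >= c
    (6, 1, 12),
    (3, 1, 8),
    (4, 6, 24),
    (1, 0, 0),   # x >= 0
    (0, 1, 0),   # y >= 0
]

def cost(x, y):
    return 180 * x + 160 * y

def feasible(x, y, tol=1e-9):
    return all(a * x + b * y >= c - tol for a, b, c in constraints)

best = None
for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue  # parallel boundaries: no unique intersection point
    x = (c1 * b2 - c2 * b1) / det  # Cramer's rule for the 2x2 system
    y = (a1 * c2 - a2 * c1) / det
    if feasible(x, y) and (best is None or cost(x, y) < best[0]):
        best = (cost(x, y), x, y)

print(best)
```

Vertex enumeration is only practical for tiny problems; real LPs use the simplex method or an interior-point solver, but the optimal-vertex principle is the same.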
This document provides an overview of game theory concepts including its assumptions, classifications, elements, significance, and limitations. It also describes methods for solving different types of games such as the prisoner's dilemma, 2-person zero-sum games, and pure strategy games. Game theory analyzes strategic decision making among interdependent parties and can provide insights into situations involving conflict or competition between rational opponents.
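A pure-strategy two-person zero-sum game like those described here is solved by checking whether the row player's maximin equals the column player's minimax (a saddle point). The payoff matrix below is a made-up example, not one from the document.

```python
def saddle_point(payoff):
    """Return (row, col, value) if the zero-sum game has a pure-strategy
    saddle point, else None. Payoffs are gains to the row player."""
    row_mins = [min(row) for row in payoff]
    col_maxs = [max(col) for col in zip(*payoff)]
    maximin = max(row_mins)   # row player's guaranteed floor
    minimax = min(col_maxs)   # column player's guaranteed ceiling
    if maximin != minimax:
        return None           # no pure-strategy equilibrium: mix strategies
    r = row_mins.index(maximin)
    c = col_maxs.index(minimax)
    return r, c, maximin

# Hypothetical payoff matrix (row player's gains).
game = [
    [4, 1, 3],
    [2, 0, 1],
    [5, 2, 4],
]
print(saddle_point(game))
```

When no saddle point exists (as in matching pennies), the value of the game is obtained from mixed strategies, typically via linear programming.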
The document discusses three examples of project managers and their responsibilities on different projects:
1) Construction of a retail development with 26 units and a supermarket. Responsible for coordinating contractors to ensure on-time and on-budget completion.
2) Directing trials of a new analgesic drug. Responsible for designing experiments and ensuring proper scientific and legal procedures are followed.
3) Introducing multimedia resources at a teacher training college in New Delhi. Responsible for purchasing and developing resources as well as encouraging acceptance by lecturers and students.
This document discusses various concepts related to operations scheduling. It defines operations scheduling and describes how it involves assigning jobs to work centers and machines, determining start and completion times, allocating resources, and establishing time sequences. It outlines objectives like meeting delivery dates and minimizing costs/inventory. Performance measures used in scheduling like job flow time, makespan, past due jobs and utilization are also defined. Finally, it discusses sequencing jobs at single and multiple workstations using different priority rules.
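The single-workstation priority rules mentioned above are easy to compare in code; the shortest-processing-time (SPT) rule is known to minimize average flow time, while the makespan on one workstation is the same for any order. The job times below are illustrative.

```python
def schedule_metrics(sequence, times):
    """Average flow time and makespan for one workstation
    (all jobs assumed available at time zero)."""
    clock, flows = 0, []
    for job in sequence:
        clock += times[job]
        flows.append(clock)  # flow time = completion time here
    return sum(flows) / len(flows), clock

times = {"J1": 6, "J2": 2, "J3": 8, "J4": 3, "J5": 5}
fcfs = ["J1", "J2", "J3", "J4", "J5"]       # first come, first served
spt = sorted(times, key=times.get)          # shortest processing time first
print(schedule_metrics(fcfs, times))
print(schedule_metrics(spt, times))
```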
2) Two questions about assigning jobs to mechanics to maximize return and solving a transportation problem to minimize costs.
3) Two questions involving determining optimal job sequences to minimize time and performing a critical path analysis on a project with activity times.
4) Two questions about determining optimum order levels and production quantities to minimize inventory and shortage costs.
5) A question about determining performance measures for a tax consulting firm with Poisson arrivals and exponential service times.
6) Two questions involving solving a two-player zero-sum game using dominance rules.
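The performance measures referred to in the queuing question above (Poisson arrivals with exponential service times form the standard M/M/1 model) follow from textbook formulas. A minimal sketch, with illustrative arrival and service rates rather than any figures from the exam:

```python
def mm1_measures(lam, mu):
    """Basic M/M/1 performance measures for arrival rate lam and
    service rate mu (requires lam < mu for a stable queue)."""
    rho = lam / mu                 # server utilization
    L = rho / (1 - rho)            # mean number in system
    Lq = rho ** 2 / (1 - rho)      # mean number waiting in queue
    W = 1 / (mu - lam)             # mean time in system
    Wq = rho / (mu - lam)          # mean waiting time in queue
    return {"rho": rho, "L": L, "Lq": Lq, "W": W, "Wq": Wq}

# Hypothetical rates: 8 clients/hour arriving, 10 clients/hour served.
m = mm1_measures(lam=8, mu=10)
print(m["rho"], m["L"], m["W"])  # 0.8 utilization, 4 in system, 0.5 hr in system
```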
1) A project is a temporary endeavor to create a unique product or service, while operations are ongoing activities.
2) Progressive elaboration is the process where a project concept becomes more detailed and focused over time as planning and research activities continue.
3) Project scope defines only the required work to complete the project.
Sequencing problems in Operations Research (Abu Bashar)
The document discusses sequencing problems and various sequencing rules used to optimize outputs when assigning jobs to machines. It describes Johnson's rule for minimizing completion time when scheduling jobs on two workstations. Johnson's rule involves scheduling the job with the shortest processing time first at the workstation where it finishes earliest. It provides an example of applying Johnson's rule to schedule five motor repair jobs at the Morris Machine Company across two workstations. Finally, it discusses Johnson's three machine rule for sequencing jobs across three machines.
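The two-workstation rule described here can be sketched in a few lines: repeatedly pick the job with the overall shortest processing time, scheduling it as early as possible if that time is at the first workstation and as late as possible if it is at the second. The repair-job times below are illustrative, not the Morris Machine Company figures from the document:

```python
def johnsons_rule(jobs):
    """Sequence jobs on two workstations to minimize makespan (Johnson's rule).

    jobs: dict mapping job name -> (time_on_station_1, time_on_station_2).
    Returns the job sequence as a list of names.
    """
    front, back = [], []
    remaining = dict(jobs)
    while remaining:
        # Find the job with the overall shortest processing time.
        job = min(remaining, key=lambda j: min(remaining[j]))
        t1, t2 = remaining.pop(job)
        if t1 <= t2:
            front.append(job)    # shortest time at station 1: schedule early
        else:
            back.insert(0, job)  # shortest time at station 2: schedule late
    return front + back

# Hypothetical repair jobs: (hours on workstation 1, hours on workstation 2).
jobs = {"M1": (12, 22), "M2": (4, 5), "M3": (5, 3), "M4": (15, 16), "M5": (10, 8)}
print(johnsons_rule(jobs))  # ['M2', 'M1', 'M4', 'M5', 'M3']
```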
This document provides lecture notes on operations research (OR). It begins with introductions to key OR terminology and concepts. It then outlines the 7 step methodology typically used in OR studies. A brief history of OR is given, noting its origins in the UK during World War II to aid military operations. Basic OR concepts are defined, including decision variables, constraints, objective functions, and optimal solutions. An example problem involving two mines is presented to illustrate these concepts.
This document provides an overview of game theory concepts including its assumptions, classifications, elements, significance, and limitations. It also describes methods for solving different types of games such as the prisoner's dilemma, 2-person zero-sum games, and pure strategy games. Game theory analyzes strategic decision making among interdependent parties and can provide insights into situations involving conflict or competition between rational opponents.
The document discusses three examples of project managers and their responsibilities on different projects:
1) Construction of a retail development with 26 units and a supermarket. Responsible for coordinating contractors to ensure on-time and on-budget completion.
2) Directing trials of a new analgesic drug. Responsible for designing experiments and ensuring proper scientific and legal procedures are followed.
3) Introducing multimedia resources at a teacher training college in New Delhi. Responsible for purchasing and developing resources as well as encouraging acceptance by lecturers and students.
This document discusses various concepts related to operations scheduling. It defines operations scheduling and describes how it involves assigning jobs to work centers and machines, determining start and completion times, allocating resources, and establishing time sequences. It outlines objectives like meeting delivery dates and minimizing costs/inventory. Performance measures used in scheduling like job flow time, makespan, past due jobs and utilization are also defined. Finally, it discusses sequencing jobs at single and multiple workstations using different priority rules.
This presentation is an attempt to introduce Game Theory in one session. It's suitable for undergraduates. In practice, it's best used as a taster since only a portion of the material can be covered in an hour - topics can be chosen according to the interests of the class.
The main reference source used was 'Games, Theory and Applications' by L.C.Thomas. Further notes available at: http://bit.ly/nW6ULD
This document outlines the key steps and components of the research process for a study titled "A Study on Pragmatic Approaches and Quality Initiatives for Enhancing Teachers’ Caliber in Post Graduate Institutes offering MBA Programme under Bangalore University". The research methodology section defines different types of research and the scientific research process. It also provides details on key aspects of research design including objectives, hypotheses, sampling, data collection and analysis. The document concludes by mentioning the final steps of report writing and research reporting.
This document discusses project management techniques CPM and PERT. It begins by defining a project and project management. It then discusses network planning methods including CPM and PERT. The four steps to managing a project with these methods are described: describing the project, diagramming the network, estimating time of completion, and monitoring progress. Key concepts like activities, precedence relationships, and events are also defined. The document goes on to provide details on CPM and PERT, including estimating time, determining critical paths, and differences between the two methods.
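The critical-path step in CPM amounts to a forward pass (earliest start/finish times) followed by a backward pass (latest start/finish times); activities with zero slack form the critical path. A minimal sketch on a hypothetical activity network (names and durations are illustrative, not from the document):

```python
def critical_path(activities):
    """activities: dict name -> (duration, [predecessor names]).
    Returns (project duration, list of critical activities)."""
    es, ef, done = {}, {}, set()
    # Forward pass: earliest start/finish, processed in dependency order.
    while len(done) < len(activities):
        for name, (dur, preds) in activities.items():
            if name in done or any(p not in done for p in preds):
                continue
            es[name] = max((ef[p] for p in preds), default=0)
            ef[name] = es[name] + dur
            done.add(name)
    duration = max(ef.values())
    # Backward pass: latest start/finish, successors before predecessors.
    ls, lf = {}, {}
    for name in sorted(activities, key=lambda n: -ef[n]):
        succs = [s for s, (_, ps) in activities.items() if name in ps]
        lf[name] = min((ls[s] for s in succs), default=duration)
        ls[name] = lf[name] - activities[name][0]
    # Critical activities have zero slack (earliest start == latest start).
    critical = [n for n in activities if es[n] == ls[n]]
    return duration, critical

acts = {"A": (3, []), "B": (2, []), "C": (4, ["A"]),
        "D": (5, ["A", "B"]), "E": (2, ["C", "D"])}
print(critical_path(acts))  # (10, ['A', 'D', 'E'])
```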
This document provides an introduction to next generation sequencing (NGS) technologies. It begins with an outline of topics to be covered, including the evolution of NGS technologies, their descriptions and comparisons, bioinformatics challenges of NGS data analysis, and some aspects of NGS data analysis workflows and tools. The document then delves into explanations of specific NGS platforms, their performance characteristics, and the sequencing processes. It discusses the large computational infrastructure and data management needs of NGS, as well as quality control, preprocessing of NGS data, and popular analysis tools and workflows.
This document discusses operations scheduling. It begins by introducing operations scheduling and explaining that it involves assigning jobs, resources, and sequencing operations while accounting for deviations. It then discusses key performance measures for schedules such as job flow time, makespan, past due jobs, work-in-process inventory, total inventory, and utilization. The document proceeds to list objectives and functions of operations scheduling such as efficient resource use, on-time delivery, and minimizing costs and inventory. Finally, it briefly outlines types of scheduling like forward and backward, and methods like Johnson's algorithm and the index method.
The document provides an overview of the smartphone industry. It discusses the history and evolution of smartphones from basic phones to advanced devices. Some key points:
- Smartphones now have more computing power, connectivity, and ability to install apps compared to basic phones. Major platforms include Android, iOS, Windows, Blackberry and others.
- The global smartphone market reached $150 billion in 2014, doubling from 2009. Android and iOS dominate global sales. Asia Pacific is a major market, led by China and India.
- The industry faces competition between OS platforms, hardware manufacturers, and content providers. Barriers to entry are high due to technology and manufacturing requirements.
The presentation discusses mobile technology and how it is changing how people access the internet, with 80% predicted to access online from mobile devices by 2015; it covers different types of mobile devices like smartphones, tablets, and eReaders and benefits like portability, accessibility, and apps; and includes discussions around 4G vs WiFi networks, iOS and Android app markets, mobile optimized websites, and cloud computing.
Mobile phones were first developed in the 1940s as two-way radios for the military. Dr. Martin Cooper invented the first practical mobile phone in 1973. The first mobile phones available to the public were introduced in 1983 by Motorola and used analog technology. Throughout the 1990s, digital 2G technology improved mobile phones. Today, most phones use 3G digital technology, which allows data like emails and messages in addition to calls. Mobile phones connect to cellular networks provided by operators to make calls and access the internet. They have increasingly taken on additional functions like cameras, music players, and internet access.
A Study on Process Improvement in the Assembly Line of Switch Manufacturing (ijceronline)
The paper is about process improvement in the assembly line at a switch manufacturing company, focusing on three areas: process flow, time study, and rework minimization. These improvements are made using cause-and-effect diagrams, the critical path method, and root cause analysis. The analysis will help reduce the amount of rework that occurs during manufacturing of modular switches in the assembly line process.
Schumann, a modeling framework for supply chain management under uncertainty (SSA KPI)
This document presents a modeling framework for optimizing supply chain planning under uncertainty in product demand and in component supply costs and delivery times. A two-stage scenario analysis approach based on partial recourse is used, where supply chain decisions can be implemented for initial time periods without anticipating future scenarios. Schemes are presented for modeling balance equations and multi-period linking constraints, and a dual variable splitting scheme models implementable time-period variables via a redundant circular linking representation. The framework is intended for large-scale manufacturing, assembly, and distribution supply chain problems but has broader applications.
Optimization of Injection Molding Process: literature review (Alex Larsh)
The document discusses optimization of the injection molding process through various techniques including computer simulation, artificial neural networks (ANN), and genetic algorithms (GA). It presents a combined ANN/GA method to model and predict part quality and to optimize process conditions. A novel computer simulation–stochastic data envelopment analysis (CS-SDEA) algorithm is also proposed to optimize the layout of an injection molding process. Results show the CS-SDEA algorithm performs comparably to ANN and GA methods in optimizing layout alternatives.
This document discusses optimization techniques that can be applied to synchronous generators. It provides an overview of various optimization algorithms including ant colony optimization, artificial bee colony algorithm, genetic algorithm, and particle swarm optimization. These algorithms are surveyed as potential methods for modeling synchronous generators through parameter estimation and for solving problems like optimal power flow. The document also provides context on why optimization is important for synchronous generators and power systems in areas like accurate modeling, parameter estimation, and addressing challenges like uncertainty.
This document summarizes a study that used simulation to improve the layout of a job shop manufacturing facility using group technology concepts. 34 parts processed on 6 machines were analyzed. The parts were divided into 4 families using direct clustering. A new layout was designed with similar machines grouped together using the CRAFT algorithm. This resulted in a minimum material handling cost layout. The initial layout cost was 1738.75 units per period, while the optimized layout using group technology concepts and CRAFT algorithm cost 1071.25 units per period, a significant cost reduction.
A complex garment assembly line balancing using … (Adnan Hameed)
This document summarizes a research article that proposes using simulation-based optimization to balance a complex garment assembly line with stochastic task times. The study was conducted at a garment facility to minimize cycle time for a trouser assembly line with 69 workstations. Discrete event simulation was developed using Arena software to model the current line. OptQuest optimization tool was then used to find local and global optimal solutions. Results showed throughput increased by 30% for local optimal and 55% for global optimal balancing, while cycle time reduced by 23% and 36% respectively. The research aims to improve production efficiency for garment assembly lines using simulation-based optimization.
Analysis of impact of process changes on cluster tool performance using an … (IAEME Publication)
The document discusses two integrated models for analyzing cluster tool performance when process parameters and times are changed. The first is a network model that evaluates total lot processing time given a sequence of activities. It relates processing time to process parameters and times. The second is an integrated simulation model for when the wafer sequence is determined by a scheduling rule. Both models can quantify the impact of changes to process parameters and characteristics on lot processing time. Examples are provided to illustrate insights gained about cluster tool behavior from the integrated models.
Job Shop Layout Design Using Group Technology (IJMER)
This document summarizes a study that uses simulation to improve the performance of a job shop layout by reconfiguring the machines. 34 job elements that are processed on 6 machines were analyzed. The jobs were clustered into 4 part families using direct clustering. Similar machines were then grouped together. Computerized Relative Allocation of Facilities Technique (CRAFT) with computer graphics was used to design a new layout. The initial layout had a total material handling cost of 1738.75 units per period. The optimized layout designed using CRAFT reduced this cost to 1071.25 units, a significant improvement without additional investment.
Quasi-Static Evaluation of a Modular and Reconfigurable Manufacturing Cell (Hillary Green)
This document presents a novel modular and reconfigurable manufacturing cell (MRMC) system that aims to provide flexible manufacturing capabilities. Some key points:
- The MRMC system consists of modular manipulation hardware and software that can be quickly configured and reconfigured for different assembly and packaging applications.
- It uses a unique interconnect design to allow mechanical and electrical connection between modules. Distributed intelligence and self-locating software enables automatic configuration.
- Analytical evaluation of precision shows the MRMC maintains necessary accuracy and repeatability for tasks like pick-and-place despite reconfiguration.
- The goal is to offer a low-cost, low-risk solution for prototyping and low-volume manufacturing through
NUMERICAL STUDIES OF TRAPEZOIDAL PROTOTYPE AUDITORY MEMBRANE (PAM) (IJCSEA Journal)
In this research, we numerically developed a Prototype Auditory Membrane (PAM) for a fully implantable and self-contained artificial cochlea. The cochlea is one of the most important organs for hearing in humans and animals. The prototype and implant material of the PAM is polyvinylidene fluoride (PVDF; Kureha, Japan), fabricated using MEMS and thin-film technologies. An important characteristic of the PAM is that it not only converts the acoustic wave into an electric signal but also provides frequency selectivity. The thickness, Young's modulus, and density of the PAM are 40 μm, 4 GPa, and 1.79 × 10^3 kg/m^3, respectively. The shape of the PAM is trapezoidal, with the width varying linearly from 2.0 to 4.0 mm over a length of 30 mm. The PAM model is built on the commercial CFD software Fluent 6.3.26 and Gambit 2.4.6; the geometry consists of one-sided blocks of quadrilateral elements for the 2D model and tetrahedral elements for the 3D model, respectively. The flow is set as laminar and an unsteady, time-dependent calculation is carried out. The results show that the frequency selectivity of the membrane is detected on the membrane surface.
The measured mile/baseline method has been widely accepted for quantifying labor productivity loss, but it has not been successfully used for engineering productivity because of the challenges in measuring engineering productivity and in determining a productivity benchmark. This paper presents a series of procedures based on the measured mile/baseline method to quantify engineering productivity loss from a project-specific perspective. A case study on the piping discipline in a large-scale process plant project is used to illustrate the calculation. The paper also includes a proposed approach, the two mile method, to quantify engineering productivity loss using data from similar work of different complexity.
The document discusses additive manufacturing (AM) processes and applications in construction. It provides an overview of AM, including common processes like material extrusion and powder bed fusion. Examples of AM construction projects are described, such as a printed office building and hotel. Key challenges in applying AM to large-scale construction include developing suitable feedstock materials that can be extruded while maintaining appropriate properties. Cementitious materials are most commonly used and studies have found optimized mixes include cement, fly ash, silica fume and additives to achieve desired strength and printability.
The document proposes a modified version of the Manufacturing Cost Deployment (MCD) method called Project Cost Deployment (PCD) for analyzing engineer-to-order (ETO) production systems. The PCD introduces two key modifications: 1) replacing the concept of production stations with manual assembly macro-activities, and 2) introducing a new structure for classifying and analyzing losses specific to manual assembly tasks. The validity of the PCD approach is demonstrated through a real-world industrial application to a train wagon manufacturer. The results show that PCD can identify hidden losses, quantify wastes economically, and estimate the impacts of potential lean improvements in terms of efficiency and effectiveness.
Survey on deep learning applied to predictive maintenance (IJECEIAES)
Prognosis health monitoring (PHM) plays an increasingly important role in the management of machines and manufactured products in today’s industry, and deep learning plays an important part by establishing the optimal predictive maintenance policy. However, traditional learning methods such as unsupervised and supervised learning with standard architectures face numerous problems when exploiting existing data. Therefore, in this essay, we review the significant improvements in deep learning made by researchers over the last 3 years in solving these difficulties. We note that researchers are striving to achieve optimal performance in estimating the remaining useful life (RUL) of machine health by optimizing each step from data to predictive diagnostics. Specifically, we outline the challenges at each level with the type of improvement that has been made, and we feel that this is an opportunity to try to select a state-of-the-art architecture that incorporates these changes so each researcher can compare with his or her model. In addition, post-RUL reasoning and the use of distributed computing with cloud technology is presented, which will potentially improve the classification accuracy in maintenance activities. Deep learning will undoubtedly prove to have a major impact in upgrading companies at the lowest cost in the new industrial revolution, Industry 4.0.
4 cired2013 distributed energy resources (Dutch Power)
This document summarizes Session 4 of the CIRED Congress 2013 on distributed energy resources and energy efficiency. It describes the four blocks of papers presented in the session, covering topics like DG/DER planning and integration, operation and control, customer-side developments, and DG/DER technologies. For each block, it provides brief summaries of some of the selected papers to be presented, including their relevance, writing quality, importance, and whether they are worth reading.
Infrared Monitoring of Aluminium Milling Processes for Reduction of Environme... (IRJESJOURNAL)
Abstract: In modern manufacturing contexts, process monitoring is an important tool aimed at ensuring quality standard fulfilment whilst maximising throughput. In this work, a monitoring system comprising an infrared (IR) camera was employed for tool state identification and surface roughness assessment, with the objective of reducing the environmental impacts of a milling process. Two data processing techniques, based on statistical parameters and polynomial fitting, were applied to the temperature signal acquired from the IR camera during milling operations in order to extract significant features. These features were input to two different neural-network-based procedures, pattern recognition and fitting, for decision-making support on tool condition and surface roughness evaluation respectively. These capabilities are discussed in terms of reducing waste products and energy consumption whilst further improving productivity.
TAG Manufacturing Kick Off Meeting, The Future of Manufacturing (Melanie Brandt)
The document summarizes research being conducted at the Manufacturing Research Center (MARC) at Georgia Tech. MARC focuses on developing new manufacturing technologies in areas like design, machining, rapid prototyping, and factory information systems. Specific projects mentioned include developing nano-lubricants to improve grinding efficiency, using lasers for hard turning and coating applications, and applying thin-film wireless sensors to monitor machining processes. MARC aims to improve productivity, lower production costs, develop new materials and processes, and transfer technologies to industry.
IRJET - Integrated Optimization of Multi-Period Supply Chains and Commonality ... (IRJET Journal)
This document discusses integrating optimization of multi-period supply chains with commonality decisions at the modular level across product families. It presents a mathematical model developed using integer linear programming to minimize total supply chain cost. The model determines the level of commonality, selected technology, and any common module inventory at each period. The results show that the optimal commonality decision depends on factors like the cost ratio of high-end to variant modules, quantity discount rates, and inventory costs. At low cost ratios, inventory dominates the decision, while at high ratios the discount rate is more influential.
This document presents a mathematical model for analyzing a generic single channel, multi-phase production line. The model aims to minimize system costs by reducing idle machine times and work-in-process inventory levels between machines. The model accounts for machine cycle times and calculates the times at which products enter and exit each machine in the production line. It assumes deterministic arrival rates and develops equations to determine the optimal level of service to minimize the total expected costs of providing service and of waiting for service.
1. OPERATIONS RESEARCH(MEE437)
SLOT C1
FINAL REPORT
ON PROJECT BASED LEARNING
DISASSEMBLY SEQUENCING PROBLEM
A CASE STUDY OF A CELL PHONE
SUBMITTED BY: D. VIKRANTH REDDY (12BME0026), C1 SLOT
SUBMITTED TO: PROF. JEEVA P.A
2. Disassembly Sequencing Problem: A Case Study of a Cell Phone
ABSTRACT:
(All of the disassembly is performed by three workers, with different times for different
parts.)
Selection of an optimal disassembly sequence is essential for the efficient processing of a
product at the end of its life. Disassembly sequences are listings of disassembly actions (such
as the separation of an assembly into two or more subassemblies, or removing one or more
connections between components). Disassembly takes place in remanufacturing, recycling,
and disposal with a disassembly line being the best choice for automation. In this paper, the
disassembly sequencing problem is solved for a cell phone case on a disassembly line,
seeking a sequence which is feasible, minimizes the number of workstations (and hence idle
times), provides for the early removal of high-demand/high-value parts and of hazardous
parts, prioritizes parts whose removal gives access to the greatest number of still-installed
parts, and groups together for removal parts that share identical removal directions. Since
finding the optimal sequence is computationally intensive due to factorial growth, a heuristic
method that accounts for various disassembly-specific concerns is used. Using the
experimentally determined precedence relationships and task times of a real-world cell
phone, a sequencing table and a sequencing solution are generated. Finally, Design for
Disassembly (DFD) improvements are recommended with respect to environmentally
conscious manufacturing.
3. INTRODUCTION
Following the industrial revolution, manufacturing started to influence everyday life with
new products rapidly entering the marketplace. Since then, products have been routinely
disposed of in landfills at the end of their lives. Until recently, there was no major concern
over landfill growth or hazardous materials entering the environment, nor was there a
realization of the value of used parts, components and materials. In the last decade, however,
environmentally conscious manufacturing (ECM) and product recovery have become an
obligation for many companies due to new government regulations and consumer interest.
Consequently, companies have grown increasingly interested in how to efficiently
disassemble the products they manufacture. They have invested time, money and engineering
expertise in determining how customers return end-of-life (EOL) products (reverse logistics)
and how to make the product recovery process profitable, or at least conducted at minimum
cost.
Within this body of research, numerous approaches for efficiently performing disassembly
have been proposed. The most promising of these recommendations is the creation of
disassembly lines, similar in concept to assembly lines, but with several significant
differences. Small products, especially those with hazardous and/or valuable parts, are
especially suited to the disassembly line due to their ease of collection, portability, and
typically larger production numbers when compared to large products which are often more
convenient to disassemble in place rather than relocating them on a line. Larger products are
also prone to repair and maintenance as well as modification, making successful and
consistently efficient disassembly on a line more of a challenge.
In the past decade, the use of cellular phones has grown significantly. Due to rapid changes in
usage, technology (e.g., analog to digital), and features (e.g., color, web access), dozens of
new models regularly enter the market. Consumers have responded by willingly replacing
their old phones with newer technologies, often with their current phone not yet at the end of
its useful life and still fully functional. These unwanted cell phones eventually end up in
landfills and typically contain numerous hazardous parts including: mercury, cadmium, lead,
gallium arsenide and beryllium. These materials, if improperly disposed of, can pose a
significant threat to the environment. In this study, we use the sequencing technique of
operations research to solve the problem and extract the results.
4. Operations research, or operational research in British usage, is a discipline that deals
with the application of advanced analytical methods to help make better decisions. It is often
considered to be a sub-field of mathematics. The terms management science and decision
science are sometimes used as synonyms.
Employing techniques from other mathematical sciences, such as mathematical modeling,
statistical analysis, and mathematical optimization, operations research arrives at optimal or
near-optimal solutions to complex decision-making problems. Because of its emphasis on
human-technology interaction and because of its focus on practical applications, operations
research has overlap with other disciplines, notably industrial engineering and operations
management, and draws on psychology and organization science. Operations research is
often concerned with determining the maximum (of profit, performance, or yield) or
minimum (of loss, risk, or cost) of some real-world objective. Originating in military efforts
before World War II, its techniques have grown to concern problems in a variety of
industries.
SEQUENCING PROBLEM:
A sequencing problem consists of finding the optimal order, or sequence, in which 'n' jobs
have to be processed on a definite number of facilities so that the total elapsed time is
minimized. It is related to waiting-line theory and is applicable when the facilities are fixed
but the order of servicing may be controlled. The scheduling of service, or the sequencing of
jobs, is done to minimize the relevant costs and time.
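As a toy illustration of why sequencing matters (the job data here are hypothetical, not from the report), consider three jobs that each pass through two machines in order; the total elapsed time depends on the processing order:

```python
def elapsed_time(order, m1, m2):
    """Total elapsed time (makespan) for jobs processed first on
    machine 1 and then on machine 2, in the given order."""
    t1 = t2 = 0
    for j in order:
        t1 += m1[j]                # machine 1 works continuously
        t2 = max(t1, t2) + m2[j]   # machine 2 waits for the job and for itself
    return t2

m1 = [4, 1, 5]   # machine-1 times for jobs 0, 1, 2
m2 = [2, 6, 3]   # machine-2 times for jobs 0, 1, 2

t_bad = elapsed_time([0, 1, 2], m1, m2)   # naive order: 15 time units
t_good = elapsed_time([1, 2, 0], m1, m2)  # better order: 12 time units
```

Simply reordering the same three jobs cuts the elapsed time from 15 to 12 units, which is exactly the effect the sequencing algorithm below exploits.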
SEQUENCING PROBLEM ALGORITHM:
Even though we can still confine our attention to permutation schedules when seeking to
minimize the makespan, the three-machine problem is already too hard for a general optimal
algorithm. There are many special cases for which simple procedures have been proposed;
almost all are situations in which the middle machine turns out to be a non-bottleneck stage:
no job ever has to wait for another at machine 2. However, experiments have shown that only
one of these situations is at all likely to arise in practice (unless the shop has a special
structure, such as constant processing times in one of the stages). The following condition
was found to hold in almost half the randomly generated test problems.
Define a two-machine problem with p'j1 = pj1 + pj2 and p'j2 = pj2 + pj3, and solve it using
Johnson's two-machine algorithm. Suppose this produces a makespan of M, while the same
job sequence results in a makespan of M' for the original three-machine problem. If
M' = M - Σj pj2, then the sequence is optimal.
Incidentally, the artificial two-machine problem defined above gives reasonable schedules
even when they are not optimal.
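The reduction above can be sketched in Python. The function names and the small four-job instance are illustrative, not from the report: Johnson's two-machine rule is applied to the artificial times p'j1 = pj1 + pj2 and p'j2 = pj2 + pj3, and the optimality certificate M' = M - Σj pj2 is then checked against the true three-machine makespan.

```python
def johnson_sequence(a, b):
    """Johnson's rule for a two-machine flow shop: the job with the
    smallest remaining time goes early if that time is on machine 1,
    late if it is on machine 2 (ties broken by lower job index)."""
    jobs = set(range(len(a)))
    front, back = [], []
    while jobs:
        j = min(jobs, key=lambda j: (min(a[j], b[j]), j))
        if a[j] <= b[j]:
            front.append(j)
        else:
            back.insert(0, j)
        jobs.remove(j)
    return front + back

def makespan(seq, times):
    """Makespan of a permutation schedule; times[j] lists job j's
    processing times on each machine in order."""
    m = len(times[0])
    done = [0] * m   # completion time of the latest job on each machine
    for j in seq:
        for k in range(m):
            start = max(done[k], done[k - 1] if k else 0)
            done[k] = start + times[j][k]
    return done[-1]

# A small three-machine instance (illustrative): times[j] = (pj1, pj2, pj3)
times = [(3, 2, 4), (5, 1, 3), (2, 4, 6), (4, 3, 2)]
a = [p1 + p2 for p1, p2, _ in times]    # p'j1 = pj1 + pj2
b = [p2 + p3 for _, p2, p3 in times]    # p'j2 = pj2 + pj3
seq = johnson_sequence(a, b)            # [0, 2, 3, 1]
M = makespan(seq, list(zip(a, b)))      # artificial two-machine makespan: 30
M3 = makespan(seq, times)               # actual three-machine makespan: 20
sum_p2 = sum(p2 for _, p2, _ in times)  # 10
# M3 == M - sum_p2 holds here, so the sequence is certified optimal
# for this particular instance.
```

For this instance the certificate holds, so Johnson's sequence on the artificial problem is also optimal for the original three-machine problem.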
6. LITERATURE REVIEW:
Brennan et al. provide an overview of disassembly, while Gungor and Gupta present a survey
of environmentally conscious manufacturing and product recovery. Disassembly optimization
using goal programming and considering financial and environmental factors has also been
demonstrated. Veerakamolmal and Gupta analyze design efficiency for the disassembly of
electronic products by building an index for measuring and comparing efficiency. Moore et
al. use Petri nets for products having complex AND/OR precedence relationships. Gungor
and Gupta study disassembly in processes that include possible task failures and demonstrate
how various factors, including component value and hazardous parts content, can be
accommodated to balance a paced disassembly line. Lambert provides a timely and thorough
survey of disassembly sequencing. Metaheuristic techniques have been applied to the
disassembly line balancing problem, while deterministic algorithms have been developed in
MATLAB.
Here, data for a Samsung mobile phone, collected from a journal, is given below. Three
employees work on the disassembly of the Samsung mobile. There are twenty-five different
parts, and the times taken by the three employees to dismantle them are given below. Using
this table and the sequencing technique of operations research, we determine the Total
Estimated Time (TET) of all three workers and their individual idle times. This helps us place
workers accordingly in the batching process.
Problem assumptions include the following:
• A single product type is to be disassembled on a disassembly line,
• The supply of the end-of-life product is infinite,
• The exact quantity of each part available in the product is known and constant,
• A disassembly task cannot be divided between two workstations,
• Each part has an assumed associated resale value which includes its market value and
recycled material value,
• Disassembly tasks are to be assigned to a sequence of workstations without violating
precedence relationships among the tasks, and
• Complete disassembly is performed on the product.
7. MODEL DESCRIPTION
DATA TAKEN FROM A JOURNAL
The 2001 model year Samsung SCH-3500 is selected for analysis. Collected data on the
SCH-3500 is listed in the table below.
Part-wise disassembly (25 parts)      Time (min) by  Time (min) by  Time (min) by
                                      Worker 1       Worker 2       Worker 3
1.  Remove antenna                        3              4              5
2.  Remove battery                        2              5              8
3.  Discard antenna guide path            3              7              4
4.  Remove bolt type1A                   10              7              4
5.  Remove bolt type1B                   10              7              6
6.  Remove bolt type21                   15              7             11
7.  Remove bolt type22                   15              7             12
8.  Remove bolt type23                   15              7             13
9.  Remove bolt type24                    2              7             10
10. Remove clip                           2              6              3
11. Remove rubber seal                    2              6              3
12. Remove speaker                        2              6              4
13. Disconnect white cable                2              7              8
14. Disconnect red/blue cable             2              4              8
15. Disconnect orange cable               2              2              3
16. Remove metal top                      2              5              1
17. Remove front cover                    2              6              6
18. Remove back cover                     3              7              2
19. Remove circuit board                 18              7             10
20. Remove plastic screen                 5              4              4
21. Remove keyboard                       5              3              2
22. Discard LCD                           5              5              3
23. Remove sub-keyboard                  15              6              3
24. Remove internal IC board              2              3              4
25. Remove microphone                     2              4              4
8. CALCULATION:
Adding Column 1 and Column 2 gives X, and adding Column 2 and Column 3 gives Y:
Part-wise disassembly (25 parts)      X (C1+C2)   Y (C2+C3)
1.  Remove antenna                        7           9
2.  Remove battery                        7          13
3.  Discard antenna guide path           10          11
4.  Remove bolt type1A                   17          11
5.  Remove bolt type1B                   17          13
6.  Remove bolt type21                   22          18
7.  Remove bolt type22                   22          19
8.  Remove bolt type23                   22          20
9.  Remove bolt type24                    9          17
10. Remove clip                           8           9
11. Remove rubber seal                    8           9
12. Remove speaker                        8          10
13. Disconnect white cable                9          15
14. Disconnect red/blue cable             6          12
15. Disconnect orange cable               4           5
16. Remove metal top                      7           6
17. Remove front cover                    8          12
18. Remove back cover                    10           9
19. Remove circuit board                 25          17
20. Remove plastic screen                 9           8
21. Remove keyboard                       8           5
22. Discard LCD                          10           8
23. Remove sub-keyboard                  21           9
24. Remove internal IC board              5           7
25. Remove microphone                     6           8
According to the sequencing rule, the order is as shown below (from the X end to the Y end):
X 15 24 14 25 2 1 17 12 10 11 9 13 3 8 7 6 19 5 4 18 23 20 22 16 21 Y
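This ordering can be reproduced with Johnson's rule applied to the X and Y columns. The sketch below is a hypothetical re-implementation in Python, not the report's program; parts tied on the minimum value may be ordered slightly differently in the middle than in the listing above, but the untied positions (including the first and last parts) must agree.

```python
# X = Worker 1 + Worker 2 time, Y = Worker 2 + Worker 3 time, per part (1-25),
# transcribed from the table above.
xy = {
    1: (7, 9),   2: (7, 13),  3: (10, 11), 4: (17, 11),  5: (17, 13),
    6: (22, 18), 7: (22, 19), 8: (22, 20), 9: (9, 17),  10: (8, 9),
    11: (8, 9),  12: (8, 10), 13: (9, 15), 14: (6, 12), 15: (4, 5),
    16: (7, 6),  17: (8, 12), 18: (10, 9), 19: (25, 17), 20: (9, 8),
    21: (8, 5),  22: (10, 8), 23: (21, 9), 24: (5, 7),  25: (6, 8),
}

def johnson(xy):
    """Johnson's rule: parts with small X go early, parts with small Y
    go late; ties are broken here by the lower part number."""
    remaining = set(xy)
    front, back = [], []
    while remaining:
        p = min(remaining, key=lambda p: (min(xy[p]), p))
        x, y = xy[p]
        if x <= y:
            front.append(p)      # small X: schedule toward the start
        else:
            back.insert(0, p)    # small Y: schedule toward the end
        remaining.remove(p)
    return front + back

seq = johnson(xy)
# seq begins 15, 24, 14, 25, ... and ends ..., 16, 21, as in the report.
```

Running this confirms that part 15 (orange cable) is removed first and part 21 (keyboard) last, matching the sequence stated above.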
10. SOFTWARE DEVELOPMENT
A MATLAB program was developed for assigning the cell phone disassembly tasks to a
minimum number of workstations while preserving precedence and addressing the function
weights. MATLAB is a high-level technical computing language and interactive environment
for algorithm development, data visualization, data analysis, and numerical computation.
The program requires the following inputs:
• Precedence relationships,
• Task times,
• Resale value,
• Hazardous binary value,
• Direction of disassembly,
• Number of predecessors to each task,
and provides these outputs:
• The individual tasks assigned to each workstation and their sequence,
• Number of workstations,
• Idle time at each workstation.
RESULTS AND DISCUSSION:
TOTAL ESTIMATED TIME = 154 min
IDLE TIME, WORKER 1 = 154 - 145 = 9 min
IDLE TIME, WORKER 2 = 2 + 1 + 3 + 3 + 4 + 2 = 15 min
IDLE TIME, WORKER 3 = 4 + 3 + 1 + 1 + 2 + 2 = 13 min
Thus, from the numerical results extracted above using the sequencing method, the total
estimated time and the individual idle times of the employees are obtained. We can conclude
that worker 1 dismantles the parts most quickly and effectively; worker 3 and worker 2 rank
next after worker 1 in terms of idle time.
IDLE TIMES: WORKER 1 < WORKER 3 < WORKER 2
11. Though not seen with the SCH-3500 cell phone data, an important weakness of the developed
algorithm is its inability to consistently balance the workstations, specifically in the last
workstation. This can be attributed to the use of the Next-Fit rule (developed for the
bin-packing problem) in assigning tasks to workstations, but it can be addressed by trial and
error, increasing or decreasing the cycle time c in increments of 1.
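A minimal sketch of the Next-Fit assignment rule in Python (the function name, the task times, and the cycle time c = 10 are illustrative assumptions, not the report's data). It also shows the weakness discussed above: the last station is often left poorly balanced.

```python
def next_fit_stations(seq, task_time, c):
    """Assign tasks (in the given sequence) to workstations with the
    Next-Fit rule: open a new station whenever the next task would
    exceed the cycle time c. Returns (stations, idle_times)."""
    stations, load = [[]], 0
    for t in seq:
        if load + task_time[t] > c:   # task does not fit: open a new station
            stations.append([])
            load = 0
        stations[-1].append(t)
        load += task_time[t]
    idle = [c - sum(task_time[t] for t in s) for s in stations]
    return stations, idle

# Hypothetical task times and cycle time, for illustration only
times = {1: 4, 2: 5, 3: 3, 4: 6, 5: 2}
stations, idle = next_fit_stations([1, 2, 3, 4, 5], times, c=10)
# stations -> [[1, 2], [3, 4], [5]]; idle -> [1, 1, 8]
```

Note the last station holds a single short task and is idle for 8 of the 10 time units, which is why the text suggests nudging c up or down by 1 and re-running the assignment.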
While this heuristic approach to the disassembly sequencing problem cannot assure an
optimum solution, it is able to rapidly generate near-optimum solutions and its result is much
faster than searching the N! permutations required to ensure the optimum sequence.
The SCH-3500 cell phone has already been designed and manufactured for the market.
However, if new production versions of the SCH-3500 are made, one could make some
improvements in the design for ease of disassembly. One recommended improvement would
be a redesign of the antenna guide path. The current antenna guide path design does not allow
access to one of the bolts required for disassembly. Thus, until the antenna and its guide path
are disassembled, one cannot obtain access to any other internal components, limiting options
and mandating the early removal of this component.
CONCLUSION
In this paper, a disassembly case study of an existing, contemporary cell phone was
described. A weighted algorithm was developed and programmed in MATLAB that provided
a feasible part removal sequence while attempting to find the sequence having the minimum
number of workstations (minimum idle time), as well as providing for the early removal of
high demand/value parts, of parts whose removal provides access to the greatest number of
still-installed parts, and of hazardous parts, and for removing parts with identical removal
directions adjacent to each other.
Although a sub-optimum heuristic, this method quickly generated a feasible solution to this
complex, real-world problem example, while the experimentally determined data provides a
new evaluation tool for future disassembly algorithm development.