The document discusses various techniques for reverse engineering products, including teardown analysis and benchmarking. It describes the process of reverse engineering as examining how other designers have combined parts to meet customer needs. The key steps are listed as: 1) examining design issues and limitations, 2) disassembling and analyzing parts, 3) creating a bill of materials. Benchmarking competitors allows learning from their solutions and establishing best practices. Measurement and specification are important for quantifying customer needs and benchmarking performance.
This document provides an overview manual for Owens Corning's global Fitness For Use process. The FFU process is designed to ensure that Owens Corning's products and services can meet critical customer and supplier criteria. The manual describes the five stages of the FFU process: 1) Identifying customer and supplier base, 2) Determining customer and supplier criteria, 3) Assessing ability to meet criteria, 4) Definition of products/processes, and 5) Implementation. It provides definitions of key terms and explains the benefits of the standardized FFU methodology for communication, understanding linkages, and improving customer satisfaction. Appendices include forms, statistics definitions, and a process deployment chart.
The document describes the process of designing, developing, testing, and maintaining a configuration model in Oracle Configurator. It includes steps for creating the configuration model structure and rules, importing BOM and inventory data, developing the UI, testing, publishing the model, and refreshing imported data. Key aspects covered are the model structure, logical rules, importing BOM models and data, UI templates, publishing models, and refreshing imported BOM data.
The three-day course, "Introduction to CMMI", introduces participants to the fundamental concepts of the CMMI model. The course assists companies in integrating best practices from proven discipline-specific process improvement models, including systems engineering, software engineering, integrated product and process development and supplier sourcing.
The course is composed of lectures and class exercises with ample opportunity for participant questions and discussions. After attending the course, participants will be able to describe the components of CMMI, discuss the process areas in CMMI, and locate relevant information in the model.
The workshop will help the participants to:
Understand the CMMI framework
Understand the detailed requirements of the process areas in the CMMI V1.3
Make valid judgments regarding the organization's implementation of process areas
Identify issues that should be addressed in performing process improvements using the CMMI V1.3
The document summarizes the findings from a benchmarking study of supply chain management processes conducted as part of the SMArTMAN SME project. The study involved comparing processes at eight benchmarking partners across different industries in Europe. Two key findings were identified: 1) A "best practice" supply chain process was developed by merging the best elements of each partner's process. This represents superior performance compared to individual processes. 2) Descriptions of specific best practices observed, such as the use of shared IT databases, make-or-buy decision teams, supplier progress reports, and long-term supplier agreements. The goal of the benchmarking was to help industrial partners learn from others' best practices.
The document discusses proposed changes to the CMMI model for high maturity practices. It describes an approach used to solicit feedback on draft proposed changes from SEI authorized individuals. Feedback was collected using "ATLAS" (Ask The Lead AppraiserS) and potential changes were rated. Several specific proposed changes to the OPP, QPM, CAR and OID process areas are presented, including revising practices, goals and adding new practices. The changes aim to better align the model with SEI training courses and appraisal methods for high maturity organizations.
Application Performance Monitoring (APM) tools monitor software applications from the end user perspective by monitoring availability and performance, and detecting early signs of issues. The goals of APM include proactively determining environment health ahead of customer issues, having cross-application data to quickly pinpoint root causes of outages, and determining capacity nearing for critical applications. Potential vendors were evaluated based on use cases through a proof of concept, with the findings being that all shortlisted candidates would improve monitoring capabilities and setup has been validated to be easier than expected, with pricing similarly for a 3-year term.
This document describes processes for product verification and validation. It discusses assembling product components, evaluating the assembled components, packaging and delivering the product. It also covers establishing verification and validation environments and procedures, performing verification and validation activities, and analyzing the results. The goal is to ensure products are built correctly (verification) and that the right products are being built (validation).
This document discusses business process testing. It begins by outlining the objectives of understanding and promoting business process testing. It then defines business components as reusable units that perform specific tasks in a business process. It explains that components can be keyword-driven or scripted. The document also discusses defining requirements, creating a test plan in the test lab, running test scripts and viewing results. Key benefits highlighted include reducing testing time by 50% and increasing reusability. Automating business processes provides earlier quality process automation and ensures documentation and maintenance.
This document provides an overview manual for Owens Corning's global Fitness For Use process. The FFU process is designed to ensure that Owens Corning's products and services can meet critical customer and supplier criteria. The manual describes the five stages of the FFU process: 1) Identifying customer and supplier base, 2) Determining customer and supplier criteria, 3) Assessing ability to meet criteria, 4) Definition of products/processes, and 5) Implementation. It provides definitions of key terms and explains the benefits of the standardized FFU methodology for communication, understanding linkages, and improving customer satisfaction. Appendices include forms, statistics definitions, and a process deployment chart.
The document describes the process of designing, developing, testing, and maintaining a configuration model in Oracle Configurator. It includes steps for creating the configuration model structure and rules, importing BOM and inventory data, developing the UI, testing, publishing the model, and refreshing imported data. Key aspects covered are the model structure, logical rules, importing BOM models and data, UI templates, publishing models, and refreshing imported BOM data.
The three-day course, "Introduction to CMMI", introduces participants to the fundamental concepts of the CMMI model. The course assists companies in integrating best practices from proven discipline-specific process improvement models, including systems engineering, software engineering, integrated product and process development and supplier sourcing.
The course is composed of lectures and class exercises with ample opportunity for participant questions and discussions. After attending the course, participants will be able to describe the components of CMMI, discuss the process areas in CMMI, and locate relevant information in the model.
The workshop will help the participants to:
Understand the CMMI framework
Understand the detailed requirements of the process areas in the CMMI V1.3
Make valid judgments regarding the organization's implementation of process areas
Identify issues that should be addressed in performing process improvements using the CMMI V1.3
The document summarizes the findings from a benchmarking study of supply chain management processes conducted as part of the SMArTMAN SME project. The study involved comparing processes at eight benchmarking partners across different industries in Europe. Two key findings were identified: 1) A "best practice" supply chain process was developed by merging the best elements of each partner's process. This represents superior performance compared to individual processes. 2) Descriptions of specific best practices observed, such as the use of shared IT databases, make-or-buy decision teams, supplier progress reports, and long-term supplier agreements. The goal of the benchmarking was to help industrial partners learn from others' best practices.
The document discusses proposed changes to the CMMI model for high maturity practices. It describes an approach used to solicit feedback on draft proposed changes from SEI authorized individuals. Feedback was collected using "ATLAS" (Ask The Lead AppraiserS) and potential changes were rated. Several specific proposed changes to the OPP, QPM, CAR and OID process areas are presented, including revising practices, goals and adding new practices. The changes aim to better align the model with SEI training courses and appraisal methods for high maturity organizations.
Application Performance Monitoring (APM) tools monitor software applications from the end user perspective by monitoring availability and performance, and detecting early signs of issues. The goals of APM include proactively determining environment health ahead of customer issues, having cross-application data to quickly pinpoint root causes of outages, and determining capacity nearing for critical applications. Potential vendors were evaluated based on use cases through a proof of concept, with the findings being that all shortlisted candidates would improve monitoring capabilities and setup has been validated to be easier than expected, with pricing similarly for a 3-year term.
This document describes processes for product verification and validation. It discusses assembling product components, evaluating the assembled components, packaging and delivering the product. It also covers establishing verification and validation environments and procedures, performing verification and validation activities, and analyzing the results. The goal is to ensure products are built correctly (verification) and that the right products are being built (validation).
This document discusses business process testing. It begins by outlining the objectives of understanding and promoting business process testing. It then defines business components as reusable units that perform specific tasks in a business process. It explains that components can be keyword-driven or scripted. The document also discusses defining requirements, creating a test plan in the test lab, running test scripts and viewing results. Key benefits highlighted include reducing testing time by 50% and increasing reusability. Automating business processes provides earlier quality process automation and ensures documentation and maintenance.
The document discusses five core quality tools: APQP (Advanced Product Quality Planning), FMEA (Failure Modes and Effects Analysis), PPAP (Production Part Approval Process), MSA (Measurement Systems Analysis), and SPC (Statistical Process Control). It provides a brief overview of each tool, noting that APQP is used to develop products that satisfy customers, FMEA ensures potential problems are considered, PPAP ensures products meet specifications, MSA assesses measurement systems, and SPC enables process control and improvement. The document emphasizes that these five tools are considered core tools for quality management.
Partnering is a long term commitment between two or more organization for the purpose of achieving specific business goals and objectives by maximizing the effectiveness of each participant’s resources.
this is a Manufacturing Planning and Scheduling Techniques assignment.
Case Study
To complete this assignment, select one of the following engineering manufacturing businesses relevant to your engineering speciality (electrical and electronics) to refer to in your answers throughout this assignment:
1. Alloy Wheel manufacturer and distributer
2. Communication Cable manufacturer and distributer
Alternatively, you can select your product or employer business as a case study and refer to in your answers.
Business process testing provides an organized methodology for testing applications using reusable components. It features roles for business process experts, component experts, testing engineers, and QA testers. The methodology includes defining components, implementing them in QuickTest Professional, scheduling component and process tests in Quality Center, running tests, and analyzing results.
The document describes the key aspects of managing requirements according to the Capability Maturity Model Integration (CMMI). It discusses the specific goals, specific practices, generic goals, and generic practices for the Requirements Management process area. The specific goals cover obtaining and maintaining an understanding of requirements, managing changes to requirements, and ensuring traceability between requirements. The generic practices provide guidance on establishing policy, planning, resourcing, monitoring, and improving the requirements management process.
The document discusses several software quality models:
- McCall's 1977 model identified quality factors like maintainability, flexibility, and testability from the user's perspective. Each factor has criteria and metrics.
- Boehm's 1978 model has high-level, intermediate, and primitive characteristics contributing to overall quality. Intermediate factors include portability, reliability, and usability.
- Gilb's 1988 model emphasizes defining attributes important to users and required quality levels. Attributes have sub-attributes to aid measurement.
Specifications provide clear requirements for materials, products, and services. They help control risks, obtain value for money, and maximize success. Specifications should be performance-based to promote competition. Estimations involve calculating approximate costs before work based on drawings, specifications, rates, and factors like location, labor, and materials. Rough estimates decide feasibility while detailed estimates provide costs for approval and contractor payments.
The document discusses testing throughout the software lifecycle. It describes different software development models like the waterfall model and iterative development models. It also discusses different levels of testing like component, integration, system, and acceptance testing. Additionally, it outlines different types of testing like functional, non-functional, structural, and regression testing. Finally, it briefly discusses maintenance testing for modifications, migrations, and system retirements.
The document discusses using operational and functional analysis techniques from systems engineering to effectively capture requirements prior to project bidding. It begins with an introduction on the importance of fully capturing requirements upfront.
It then provides an overview of the requirements capture process using these techniques, which involves analyzing operational scenarios, stakeholders, and functional requirements. A case study on developing a mission computer for an aircraft is presented to illustrate applying these techniques. Key activities in operational analysis like identifying scenarios, stakeholders, and requirements flow are described.
This document discusses Business Process Testing (BPT) methodology using QuickTest Professional (QTP). It covers topics like the different types of components in BPT, how to create scripted components, parameterization, creating test scripts by pulling together components in a workflow, and running regression testing. The benefits of BPT include reusability of components, quicker test creation, reduced maintenance compared to traditional automation, and effective data-driven testing.
The document discusses specifications, which are explicit sets of requirements for materials, products, or services. It describes different types of specifications like formal, program, functional, and document specifications. It also outlines how specifications are developed by various organizations, their common uses in engineering and business, guidance for writing good specifications, and considerations for process capabilities during production.
The document provides information on various quality models and standards including Six Sigma, Total Quality Management (TQM), ISO 9001. It discusses the goals, methodology, and evolution of Six Sigma. It explains the key principles and structure of TQM and ISO 9001. It also provides a case study on how Toyota has implemented TQM based on principles of customer focus, continuous improvement, and total participation.
This document discusses benchmarking, which is a strategic analysis tool used to identify areas for improvement through best practices. There are three types of benchmarking: internal, which compares a company's current performance to its past performance; competitor, which compares a company's performance to its industry rivals; and process/activity benchmarking, which compares a company's business processes to best practices of other companies. The document also outlines advantages and disadvantages of benchmarking, as well as a 7-step process for effective benchmarking.
This document outlines an assignment for a quality management course. It includes 6 questions about topics related to quality management. The questions address benchmarking and quality function deployment, key drivers for developing quality culture, statistical distributions used to model reliability, design reviews, audit reporting, and short notes on quality assurance, Juran's trilogy, quality standards, and types of quality costs. Students are instructed to provide concise answers with keywords and specific examples for partial credit, and answers for 10-mark questions should be approximately 400 words.
The document describes the software quality assurance process used by a company. It involves initial project planning, requirements analysis, development, testing of individual modules by developers and testers, integration testing, testing for compatibility, load, and system testing, and finally release after test report approval. Testing of existing vendor products includes peer reviews, validation, data-driven, load, compatibility, and usability testing. Testing new systems developed from scratch includes requirements, test strategy, traceability, cases, risks, tools, resources, schedule, deliverables, defect tracking, and approval processes.
Fitri haryati (testing throughout the software life cycle)Fitri Haryati
1. The document discusses software development models and test types throughout the software life cycle. It describes various development models like the V-model, iterative life cycles, rapid application development, and agile development.
2. It also covers different types of testing like functional testing based on requirements or business processes, non-functional testing of characteristics like reliability and usability, and structural testing of a system's architecture. Regression testing and confirmation testing are mentioned as well.
3. The author is Fitri Haryati, a student at the Department of Information Systems, Faculty of Science and Technology at UIN Suska Riau in Indonesia. References included the CMMI and ISO/IEC 12207 standards.
The document discusses SAP Quality Management (QM) module. It provides an overview of key QM features like quality planning, inspection, and control. The quality inspection process in SAP includes inspection lot creation, results recording, defects recording, and usage decision. Inspection lots are created to group products for inspection. Inspection results and any defects found are recorded. A usage decision is then made to accept or reject the lot. Quality certificates and notifications can also be generated within the QM module.
The Validation Master Plan (VMP) describes all validation requirements for the production facility. It covers validation aspects for all production areas, storage, utilities, and staff facilities. The VMP ensures processes will consistently produce products that meet specifications. It describes the principles of validation and organization of qualification activities and equipment.
The V-Model is a software development lifecycle model where development and testing occur in sequential and inverse phases resembling the shape of a V. It is called the V-Model because the process looks like the letter V, with validation on the left side and verification on the right. The key phases include requirements analysis, design, coding, unit testing, integration testing, system testing, and acceptance testing. The V-Model allows testing to start early and to be conducted at each phase, in parallel with development. It is simple to understand but works best for smaller, well-defined projects where requirements are stable.
The document discusses product design and development. It describes the six phases of the product development process: product planning, concept development, system-level design, detail design, testing and refinement, and production ramp-up. Key aspects of each phase are identified. The document also discusses product verification, validation, testing, and the roles involved in product development teams. Product development involves a range of technical, marketing, and financial activities, while product design focuses specifically on meeting technical requirements.
This document summarizes a chapter about product specifications from a textbook on product design and development. It discusses the nature and purpose of specifications, the process for setting target and final specifications, and guidelines for establishing metrics and assigning values. Target specifications are set based on customer needs and benchmarks, while final specifications are refined based on the selected concept and testing. The key aspects covered are:
- Specifications represent an agreement on what the team will achieve to satisfy customer needs
- Target specs are goals and final specs reflect feasibility testing and trade-offs
- Metrics should sufficiently address customer needs and be practical to measure
- Benchmarking, models, and trade-offs inform refining specs
The document discusses five core quality tools: APQP (Advanced Product Quality Planning), FMEA (Failure Modes and Effects Analysis), PPAP (Production Part Approval Process), MSA (Measurement Systems Analysis), and SPC (Statistical Process Control). It provides a brief overview of each tool, noting that APQP is used to develop products that satisfy customers, FMEA ensures potential problems are considered, PPAP ensures products meet specifications, MSA assesses measurement systems, and SPC enables process control and improvement. The document emphasizes that these five tools are considered core tools for quality management.
Partnering is a long term commitment between two or more organization for the purpose of achieving specific business goals and objectives by maximizing the effectiveness of each participant’s resources.
this is a Manufacturing Planning and Scheduling Techniques assignment.
Case Study
To complete this assignment, select one of the following engineering manufacturing businesses relevant to your engineering speciality (electrical and electronics) to refer to in your answers throughout this assignment:
1. Alloy Wheel manufacturer and distributer
2. Communication Cable manufacturer and distributer
Alternatively, you can select your product or employer business as a case study and refer to in your answers.
Business process testing provides an organized methodology for testing applications using reusable components. It features roles for business process experts, component experts, testing engineers, and QA testers. The methodology includes defining components, implementing them in QuickTest Professional, scheduling component and process tests in Quality Center, running tests, and analyzing results.
The document describes the key aspects of managing requirements according to the Capability Maturity Model Integration (CMMI). It discusses the specific goals, specific practices, generic goals, and generic practices for the Requirements Management process area. The specific goals cover obtaining and maintaining an understanding of requirements, managing changes to requirements, and ensuring traceability between requirements. The generic practices provide guidance on establishing policy, planning, resourcing, monitoring, and improving the requirements management process.
The document discusses several software quality models:
- McCall's 1977 model identified quality factors like maintainability, flexibility, and testability from the user's perspective. Each factor has criteria and metrics.
- Boehm's 1978 model has high-level, intermediate, and primitive characteristics contributing to overall quality. Intermediate factors include portability, reliability, and usability.
- Gilb's 1988 model emphasizes defining attributes important to users and required quality levels. Attributes have sub-attributes to aid measurement.
Specifications provide clear requirements for materials, products, and services. They help control risks, obtain value for money, and maximize success. Specifications should be performance-based to promote competition. Estimations involve calculating approximate costs before work based on drawings, specifications, rates, and factors like location, labor, and materials. Rough estimates decide feasibility while detailed estimates provide costs for approval and contractor payments.
The document discusses testing throughout the software lifecycle. It describes different software development models like the waterfall model and iterative development models. It also discusses different levels of testing like component, integration, system, and acceptance testing. Additionally, it outlines different types of testing like functional, non-functional, structural, and regression testing. Finally, it briefly discusses maintenance testing for modifications, migrations, and system retirements.
The document discusses using operational and functional analysis techniques from systems engineering to effectively capture requirements prior to project bidding. It begins with an introduction on the importance of fully capturing requirements upfront.
It then provides an overview of the requirements capture process using these techniques, which involves analyzing operational scenarios, stakeholders, and functional requirements. A case study on developing a mission computer for an aircraft is presented to illustrate applying these techniques. Key activities in operational analysis like identifying scenarios, stakeholders, and requirements flow are described.
This document discusses Business Process Testing (BPT) methodology using QuickTest Professional (QTP). It covers topics like the different types of components in BPT, how to create scripted components, parameterization, creating test scripts by pulling together components in a workflow, and running regression testing. The benefits of BPT include reusability of components, quicker test creation, reduced maintenance compared to traditional automation, and effective data-driven testing.
The document discusses specifications, which are explicit sets of requirements for materials, products, or services. It describes different types of specifications like formal, program, functional, and document specifications. It also outlines how specifications are developed by various organizations, their common uses in engineering and business, guidance for writing good specifications, and considerations for process capabilities during production.
The document provides information on various quality models and standards including Six Sigma, Total Quality Management (TQM), ISO 9001. It discusses the goals, methodology, and evolution of Six Sigma. It explains the key principles and structure of TQM and ISO 9001. It also provides a case study on how Toyota has implemented TQM based on principles of customer focus, continuous improvement, and total participation.
This document discusses benchmarking, which is a strategic analysis tool used to identify areas for improvement through best practices. There are three types of benchmarking: internal, which compares a company's current performance to its past performance; competitor, which compares a company's performance to its industry rivals; and process/activity benchmarking, which compares a company's business processes to best practices of other companies. The document also outlines advantages and disadvantages of benchmarking, as well as a 7-step process for effective benchmarking.
This document outlines an assignment for a quality management course. It includes 6 questions about topics related to quality management. The questions address benchmarking and quality function deployment, key drivers for developing quality culture, statistical distributions used to model reliability, design reviews, audit reporting, and short notes on quality assurance, Juran's trilogy, quality standards, and types of quality costs. Students are instructed to provide concise answers with keywords and specific examples for partial credit, and answers for 10-mark questions should be approximately 400 words.
The document describes the software quality assurance process used by a company. It involves initial project planning, requirements analysis, development, testing of individual modules by developers and testers, integration testing, testing for compatibility, load, and system testing, and finally release after test report approval. Testing of existing vendor products includes peer reviews, validation, data-driven, load, compatibility, and usability testing. Testing new systems developed from scratch includes requirements, test strategy, traceability, cases, risks, tools, resources, schedule, deliverables, defect tracking, and approval processes.
Fitri haryati (testing throughout the software life cycle)Fitri Haryati
1. The document discusses software development models and test types throughout the software life cycle. It describes various development models like the V-model, iterative life cycles, rapid application development, and agile development.
2. It also covers different types of testing like functional testing based on requirements or business processes, non-functional testing of characteristics like reliability and usability, and structural testing of a system's architecture. Regression testing and confirmation testing are mentioned as well.
3. The author is Fitri Haryati, a student at the Department of Information Systems, Faculty of Science and Technology at UIN Suska Riau in Indonesia. References included the CMMI and ISO/IEC 12207 standards.
The document discusses SAP Quality Management (QM) module. It provides an overview of key QM features like quality planning, inspection, and control. The quality inspection process in SAP includes inspection lot creation, results recording, defects recording, and usage decision. Inspection lots are created to group products for inspection. Inspection results and any defects found are recorded. A usage decision is then made to accept or reject the lot. Quality certificates and notifications can also be generated within the QM module.
The Validation Master Plan (VMP) describes all validation requirements for the production facility. It covers validation aspects for all production areas, storage, utilities, and staff facilities. The VMP ensures processes will consistently produce products that meet specifications. It describes the principles of validation and organization of qualification activities and equipment.
The V-Model is a software development lifecycle model where development and testing occur in sequential and inverse phases resembling the shape of a V. It is called the V-Model because the process looks like the letter V, with validation on the left side and verification on the right. The key phases include requirements analysis, design, coding, unit testing, integration testing, system testing, and acceptance testing. The V-Model allows testing to start early and to be conducted at each phase, in parallel with development. It is simple to understand but works best for smaller, well-defined projects where requirements are stable.
The document discusses product design and development. It describes the six phases of the product development process: product planning, concept development, system-level design, detail design, testing and refinement, and production ramp-up. Key aspects of each phase are identified. The document also discusses product verification, validation, testing, and the roles involved in product development teams. Product development involves a range of technical, marketing, and financial activities, while product design focuses specifically on meeting technical requirements.
This document summarizes a chapter about product specifications from a textbook on product design and development. It discusses the nature and purpose of specifications, the process for setting target and final specifications, and guidelines for establishing metrics and assigning values. Target specifications are set based on customer needs and benchmarks, while final specifications are refined based on the selected concept and testing. The key aspects covered are:
- Specifications represent an agreement on what the team will achieve to satisfy customer needs
- Target specs are goals and final specs reflect feasibility testing and trade-offs
- Metrics should sufficiently address customer needs and be practical to measure
- Benchmarking, models, and trade-offs inform refining specs
PRODUCT BRIEF DEVELOPMENT TOOLS Quality Function Dep.docxbriancrawford30935
PRODUCT BRIEF
DEVELOPMENT
TOOLS
Quality Function Deployment
In a few words: The voice of the customer translated into the voice of the engineer.
To design a product well, a design teams needs to know what it is
they are designing, and what the end-users will expect from it.
Quality Function Deployment is a systematic approach to design
based on a close awareness of customer desires, coupled with the
integration of corporate functional groups. It consists in
translating customer desires (for example, the ease of writing for
a pen) into design characteristics (pen ink viscosity, pressure on
ball-point) for each stage of the product development (Rosenthal,
1992).
Ultimately the goal of QFD is to translate
often subjective quality criteria into objective
ones that can be quantified and measured and
which can then be used to design and
manufacture the product. It is a complimentary
method for determining how and where
priorities are to be assigned in product
development. The intent is to employ
objective procedures in increasing detail
throughout the development of the product.
(Reilly, 1999)
Quality Function Deployment was developed
by Yoji Akao in Japan in 1966. By 1972 the
power of the approach had been well
demonstrated at the Mitsubishi Heavy
Industries Kobe Shipyard (Sullivan, 1986) and
in 1978 the first book on the subject was
published in Japanese and then later translated
into English in 1994 (Mizuno and Akao,
1994).
In Akao’s words, QFD "is a method for developing a design quality aimed at satisfying the
consumer and then translating the consumer's demand into design targets and major quality
assurance points to be used throughout the production phase. ... [QFD] is a way to assure the
design quality while the product is still in the design stage." As a very important side benefit he
points out that, when appropriately applied, QFD has demonstrated the reduction of development
time by one-half to one-third. (Akao, 1990)
The 3 main goals in implementing QFD are:
1. Prioritize spoken and unspoken customer wants and needs.
2. Translate these needs into technical characteristics and specifications.
3. Build and deliver a quality product or service by focusing everybody toward customer
satisfaction.
Technique useful for:
Derivative First of a kind
Me too with
a twist Next generation
Familiar New
E
st
ab
lis
he
d
N
ew
M
ar
ke
t
Product Concept
Since its introduction, Quality Function Deployment has helped to transform the way many
companies:
• Plan new products
• Design product requirements
• Determine process characteristics
• Control the manufacturing process
• Document already existing product specifications
QFD uses some principles from Concurrent Engineering in that cross-functional teams are
involved in all phases of product development. Each of the four phases in a QFD process uses a
matrix to translate customer requirements from initial plann.
This document discusses various aspects of product design and development. It covers the product development process and typical phases from planning to production ramp-up. It also discusses designing for customers through techniques like quality function deployment and the house of quality. Additionally, it discusses designing for manufacturability and measuring product development performance through various metrics. The goal is to develop products that meet customer needs while being efficient and cost-effective to manufacture and bring to market.
1. Quality Function Deployment (QFD) uses a matrix format called the House of Quality to capture customer requirements and translate them into engineering targets for new product design.
2. The House of Quality contains six major components: customer requirements, technical requirements, a planning matrix, an interrelationship matrix, a technical correlation matrix, and technical priorities/benchmarks and targets.
3. It helps companies determine customer needs, specify them as engineering requirements, identify how well requirements are met compared to competitors, establish connections between customer and technical requirements, and set targets for technical requirements.
The document describes the process for establishing target product specifications and setting final specifications. It involves:
1. Establishing target specifications after identifying customer needs by preparing metrics, collecting benchmarks, and setting ideal and acceptable values.
2. Setting final specifications after selecting a product concept by developing technical and cost models, refining specifications through trade-offs, flowing specifications down to subsystems, and reflecting on results.
Quality Function Deployment (QFD) is a systematic approach to design that focuses on customer needs. It involves translating customer requirements into technical specifications across each stage of product development. The QFD process involves building a House of Quality matrix to prioritize customer needs, technical attributes, and their relationships. It then translates the voice of the customer through subsequent phases of product design, process planning, and production control. Implementing QFD alongside Lean Six Sigma aims to reduce waste and variability to improve quality, efficiency and meet customer requirements.
Embedded Product Development Life Cycle(EDLC)UshaRani289
The document describes the embedded product development life cycle (EDLC) which involves multiple phases from conceptualization to retirement. It begins with identifying a need for a new or upgraded product. This is followed by conceptualization, analysis, design, development and testing, deployment, support, and upgrades. Each phase is described in detail along with its key activities such as feasibility studies, requirements analysis, interface definition, testing plans, product installation, and providing support. The life cycle concludes with retiring the product when a new technology becomes available.
The document provides an overview of fundamentals of software development including definitions of software, characteristics of software, software engineering, layered approach to software engineering, need for software engineering, and common software development life cycle models. It describes system software and application software. It outlines characteristics like understandability, cost, maintainability, modularity, reliability, portability, documentation, reusability, and interoperability. It also defines software engineering, layered approach, and need for software engineering. Finally, it explains popular life cycle models like waterfall, iterative waterfall, prototyping, spiral, and RAD models.
Quality Function Deployment (QFD) Seminar PresentationOrange Slides
Quality Function Deployment (QFD) is a method to translate customer needs into technical requirements for new product development. It was developed in Japan in the 1970s and involves capturing customer needs, prioritizing them, benchmarking competitors, and setting target values. The process results in a comprehensive product specification. Key tools include affinity diagrams, relations diagrams, matrices, and the House of Quality which maps customer and technical requirements. QFD aims to design products that meet customer needs and satisfy them better than competitors.
1. object oriented concepts & principles poonam bora
Here is an object diagram defining the Book object with attributes and operations:
[OBJECT DIAGRAM]
Book: Book
- title: string
- author: string
- pages: int
+ read()
+ turnPage()
+ getTitle(): string
+ getAuthor(): string
This object diagram defines a Book object instantiated from the Book class. The Book object has:
- Private attributes title (string), author (string), and pages (int)
- Public operations read(), turnPage(), getTitle() which returns a string, and getAuthor() which returns a string
The colon (:) separates the object name from the class name. The visibility of each attribute
The document discusses the four phases of the software development lifecycle: inception, elaboration, construction, and transition. It provides details on the objectives and essential activities of each phase. The inception phase focuses on establishing scope and demonstrating architecture. The elaboration phase builds prototypes and baselines requirements, architecture, and plans. Construction integrates components and tests features. Transition deploys the software to end users through activities like beta testing and training. The document also discusses engineering artifact sets for managing development, including management, requirements, design, implementation, and deployment artifacts.
Designing Product for the Customer,House of quality matrix and design for man...Rohit K.
The document discusses design for manufacturing and assembly (DFMA) techniques. It defines design for manufacturing (DFM) as optimizing the manufacturing process to minimize part production costs. Design for assembly (DFA) is defined as optimizing the assembly process to minimize assembly costs. The key differences and similarities between DFM and DFA are explained. Some principles of DFMA include reducing the number of parts, using common features and axes, and utilizing standards. The overall goal of DFMA is to design products that can be easily and cost-effectively manufactured and assembled.
The document discusses the House of Quality tool used in Quality Function Deployment. The House of Quality matrix translates customer requirements into engineering targets for new product design. It has six major components: customer requirements, technical requirements, a planning matrix, an interrelationship matrix, a technical correlation matrix, and a technical priorities/benchmarks section. The matrix helps identify what must be done to the product design to fulfill requirements and meet customer needs.
The document discusses various topics related to project management including reasons for project termination, methods of project visualization, priorities for monitoring, change control procedures, software configuration management, stress management techniques, types of contracts, stages in contract placement, and the review process model. Key points include that reasons for project termination can include lack of resources, incomplete requirements, or obsolete technologies, and the termination process involves project surveys, debriefing meetings, and result publication. Project visualization methods include Gantt charts, slip charts, and timeline charts. Priorities for monitoring include critical path activities, activities with no free float, and high risk activities. The review process model involves planning, preparation, a review meeting, and rework stages.
This document discusses software product lines and component-based software architecture. It defines a software product line as a set of software systems that share common features to satisfy market needs. Product lines allow for significant reuse of common assets across products. The document also discusses challenges like architectural mismatch that can occur when integrating components, and techniques for avoiding, detecting, and repairing mismatches. It frames architecture development as an ongoing business process and discusses how commercial components impact architectural design decisions.
The document describes several software development life cycle models:
1. The classical waterfall model divides the life cycle into sequential phases from requirements to maintenance.
2. The iterative waterfall model adds feedback loops to allow correcting defects in earlier phases.
3. The prototyping model develops prototypes to refine requirements before development.
4. The evolutionary model incrementally develops modules and delivers functioning systems in successive versions.
5. The spiral model represents each phase as a loop through objective setting, risk assessment, development, and review.
The document outlines the product development process and types of products. The product development process consists of 6 phases: planning, concept development, system-level design, design detail, testing and refinement, and production ramp-up. These phases involve specifying market needs, generating and selecting concepts, designing subsystems and components, creating drawings and specifications, building and testing prototypes, and ramping up full-scale production. The document also lists 8 types of products including generic, technology push, platform, process incentive, customized, high risk, quick build, and complex systems.
The document discusses production planning and process planning. It outlines the key stages in production planning which include marketing analysis, feasibility studies, and advanced product planning. It also discusses product planning and value analysis, which aims to systematically identify and eliminate unnecessary costs. The value of a product can be increased by reducing costs or improving functions. Process planning involves preparing instructions for manufacturing a product and its parts, including selecting processes, machines, and equipment. The responsibilities of process planning engineers include interpreting part designs, selecting machining processes, tooling, and operation sequences.
The CBC machine is a common diagnostic tool used by doctors to measure a patient's red blood cell count, white blood cell count and platelet count. The machine uses a small sample of the patient's blood, which is then placed into special tubes and analyzed. The results of the analysis are then displayed on a screen for the doctor to review. The CBC machine is an important tool for diagnosing various conditions, such as anemia, infection and leukemia. It can also help to monitor a patient's response to treatment.
Null Bangalore | Pentesters Approach to AWS IAMDivyanshu
#Abstract:
- Learn more about the real-world methods for auditing AWS IAM (Identity and Access Management) as a pentester. So let us proceed with a brief discussion of IAM as well as some typical misconfigurations and their potential exploits in order to reinforce the understanding of IAM security best practices.
- Gain actionable insights into AWS IAM policies and roles, using hands on approach.
#Prerequisites:
- Basic understanding of AWS services and architecture
- Familiarity with cloud security concepts
- Experience using the AWS Management Console or AWS CLI.
- For hands on lab create account on [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
# Scenario Covered:
- Basics of IAM in AWS
- Implementing IAM Policies with Least Privilege to Manage S3 Bucket
- Objective: Create an S3 bucket with least privilege IAM policy and validate access.
- Steps:
- Create S3 bucket.
- Attach least privilege policy to IAM user.
- Validate access.
- Exploiting IAM PassRole Misconfiguration
-Allows a user to pass a specific IAM role to an AWS service (ec2), typically used for service access delegation. Then exploit PassRole Misconfiguration granting unauthorized access to sensitive resources.
- Objective: Demonstrate how a PassRole misconfiguration can grant unauthorized access.
- Steps:
- Allow user to pass IAM role to EC2.
- Exploit misconfiguration for unauthorized access.
- Access sensitive resources.
- Exploiting IAM AssumeRole Misconfiguration with Overly Permissive Role
- An overly permissive IAM role configuration can lead to privilege escalation by creating a role with administrative privileges and allow a user to assume this role.
- Objective: Show how overly permissive IAM roles can lead to privilege escalation.
- Steps:
- Create role with administrative privileges.
- Allow user to assume the role.
- Perform administrative actions.
- Differentiation between PassRole vs AssumeRole
Try at [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
International Conference on NLP, Artificial Intelligence, Machine Learning an...gerogepatton
International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (NLAIM 2024) offers a premier global platform for exchanging insights and findings in the theory, methodology, and applications of NLP, Artificial Intelligence, Machine Learning, and their applications. The conference seeks substantial contributions across all key domains of NLP, Artificial Intelligence, Machine Learning, and their practical applications, aiming to foster both theoretical advancements and real-world implementations. With a focus on facilitating collaboration between researchers and practitioners from academia and industry, the conference serves as a nexus for sharing the latest developments in the field.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsVictor Morales
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
Software Engineering and Project Management - Introduction, Modeling Concepts...Prakhyath Rai
Introduction, Modeling Concepts and Class Modeling: What is Object orientation? What is OO development? OO Themes; Evidence for usefulness of OO development; OO modeling history. Modeling
as Design technique: Modeling, abstraction, The Three models. Class Modeling: Object and Class Concept, Link and associations concepts, Generalization and Inheritance, A sample class model, Navigation of class models, and UML diagrams
Building the Analysis Models: Requirement Analysis, Analysis Model Approaches, Data modeling Concepts, Object Oriented Analysis, Scenario-Based Modeling, Flow-Oriented Modeling, class Based Modeling, Creating a Behavioral Model.
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...shadow0702a
This document serves as a comprehensive step-by-step guide on how to effectively use PyCharm for remote debugging of the Windows Subsystem for Linux (WSL) on a local Windows machine. It meticulously outlines several critical steps in the process, starting with the crucial task of enabling permissions, followed by the installation and configuration of WSL.
The guide then proceeds to explain how to set up the SSH service within the WSL environment, an integral part of the process. Alongside this, it also provides detailed instructions on how to modify the inbound rules of the Windows firewall to facilitate the process, ensuring that there are no connectivity issues that could potentially hinder the debugging process.
The document further emphasizes on the importance of checking the connection between the Windows and WSL environments, providing instructions on how to ensure that the connection is optimal and ready for remote debugging.
It also offers an in-depth guide on how to configure the WSL interpreter and files within the PyCharm environment. This is essential for ensuring that the debugging process is set up correctly and that the program can be run effectively within the WSL terminal.
Additionally, the document provides guidance on how to set up breakpoints for debugging, a fundamental aspect of the debugging process which allows the developer to stop the execution of their code at certain points and inspect their program at those stages.
Finally, the document concludes by providing a link to a reference blog. This blog offers additional information and guidance on configuring the remote Python interpreter in PyCharm, providing the reader with a well-rounded understanding of the process.
Discover the latest insights on Data Driven Maintenance with our comprehensive webinar presentation. Learn about traditional maintenance challenges, the right approach to utilizing data, and the benefits of adopting a Data Driven Maintenance strategy. Explore real-world examples, industry best practices, and innovative solutions like FMECA and the D3M model. This presentation, led by expert Jules Oudmans, is essential for asset owners looking to optimize their maintenance processes and leverage digital technologies for improved efficiency and performance. Download now to stay ahead in the evolving maintenance landscape.
Embedded machine learning-based road conditions and driving behavior monitoringIJECEIAES
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces the hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system including the required equation to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram which sets the priorities and requirements of the system is presented. The proposed approach allows setup to advance their power stability, especially during power outages. The presented information supports researchers and plant owners to complete the necessary analysis while promoting the deployment of clean energy. The result of a case study that represents a dairy milk farmer supports the theoretical works and highlights its advanced benefits to existing plants. The short return on investment of the proposed approach supports the paper's novelty approach for the sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line which enhances the safety of the electrical network
artificial intelligence and data science contents.pptxGauravCar
What is artificial intelligence? Artificial intelligence is the ability of a computer or computer-controlled robot to perform tasks that are commonly associated with the intellectual processes characteristic of humans, such as the ability to reason.
› ...
Artificial intelligence (AI) | Definitio
1. Unit no-4 Reverse Engineering
Reference: Product design by Kevin Otto and Kristin Wood
Reverse engineering is another name for product dissection.
Reverse engineering is done for the sole purpose of copying a product.
Reverse engineering gives a snapshot of how other designers have combined parts to
meet customer needs. The “teardown” of a product is often a part of product
benchmarking, but without the intent of copying the design
In this context, by reverse engineering, you seek to determine:
1)What are the problems/limitations of the design?
2)What are the great features/functions of the design?
3) Is this product part of a family of similar products?
4)Are any online reviews/customer feedback available?
Product Teardown process
Teardowns are carried out in industry to benchmark work against the competition.
Technology and architecture and cost of competitive products uncovered. Many large
companies have in house staff whose sole job is to reverse engineer the competition.
One must able to analyze competitor product and transform this analysis in to information
that can be used as a part of new design.
Step No-1 List the design issues-Design issues are mainly related to customer needs. Some
basic factors are weight, material, colour /finish, manufacturing process, Geometric
tolerances, cost per part, dimensional measurements
Step-2 Prepare for product teardown
Step-3 Examine the distribution and installation
Step-4 Disassemble, Measure and analyze data by assemblies
Step-5 Form a Bill Of Materials
Methods of product teardown:
Subtract and Operate procedure(SOP)
It is a five step procedure aimed at exposing redundant components in an assembly or
subassembly through identification of the true functionality of each component.
2. Step1: Subtract one component of assembly: Removal of components may occur in any order
.However it may be necessary to remove one or several components in order to remove
desired component
Step2: Operate the system through its full range: This step should test the product through the
range of customer needs. After removing component the product should be thoroughly tested
for each customer need.
Step 3: Analyze the effect: This process is usually carried through visual inspection or with
measuring instruments .
Step4: Deduce the subtraction of missing components:
Step5: Replace the component and repeat the procedure n times.
Components that when removed cause no change in the DOF or other factors of the design
are termed as Type 1 redundant components .Components that cause no change in DOF but
do have effect due to their removal are termed Type 2 redundant components.
Type 1 components are always components for removal
Type 2 components may be removed if another component can be parametrically redesigned
to compensate for the other effects.
Example of SOP
3. Force Flow (Energy flow filed)Diagram
Force flow diagrams represent the transfer of force through the products component. The
components are symbolized as nodes using circle and forces are as arrows connecting the
components in which the force transfer takes place. Force flow diagram focus on component
combination.
Primary intent of force flow diagrams is
1)Identify the functions of a sequence of components(subassemblies)
2)Identify potential avenues for component combination
Construction of force flow model: The motivation behind constructing a force flow diagram
of product is to map the force flow through a product so that the diagram can be analyzed to
help expose opportunities for component combination. Once the model is formulated the first
step in analyzing force flow diagram is to place “R” on the flows that have relative motion
between two components. Once this step is completed, the diagram can be decomposed in to
groups separated by “R”s.
The component comprising “R” groups are candidates for combination if not prohibited by
material or assembly/disassembly issues. Combination between a member of one group and
member outside group requires more complex redesign.
Step1:Identify primary force flows transmitted through product
Step2:Map the force(Energy) flow from the external source through each component of
product until the flow exits to ground.
Step3:Document the result in force flow diagrams
Step4:Analye diagram labeling relative motion between components with R
4. Step5:Decompose the diagram in to groups separated by “R”s. Box these components
Step6:Deduce the sub functions and affected customer needs for each group
Step7:Devlop creative conceptual design to combine components together.
Step8:Repeat for each force flow.
Example of Force flow diagram for stapler
5. Measurement and experimentation:
A products specification should provide a precise and measurable description of what the
product must accomplish .Specifications consist of Metric , denoting the type of measurement
and target value in the form of number or range. The product target values are performance
levels that designer must achieve. The target values are generally determined by examining
the uses the product will be put to or by examining performance of other similar products
which is called benchmarking.
Most of design methods initially concentrate on various aspects of acquiring product
information, and ultimately the data is covered into a set of quantifiable specifications. These
matrices provide clear goal for evaluating product in form of target values developed by
determining state of art in product benchmarks.
The importance of measurement in design is illustrated by noticing how design methods are
concerned with ensuring that both the customer needs and functionality of product are
quantifiable.
The key success in experimental work is to ask continually : What am I looking for? Why am
I measuring this –does the measurement really answer any of my question? What does the
measurement tell me?
Measurement method of product teardown :It majorly consist of four steps given below
Step1:Select a product domain
Step2:Determine the most important sub function
Step3:Destermine the necessary measurements
Step4:Selection of measurement devices
Benchmarking
Product developers must learn from competitors for which you must know your competitors
in business. For this benchmarking is useful and regular activity in product based industries.
It consists of following steps:
Step1:Form a list of design issues
Step2:Form a list of competitive or related products
Step 3:Conduct an information search through various consumer report magazines, Trade
magazines, Patents, market share reports, Libraries, reports from various rating agencies,
Standards
Step4:Tear down multiple products in class
6. Step5:Benchmark by function: Summarize the product comparison by form. fit and function
Step6:Establish best in class competitors by function
Step7:Plot industry trends
Benchmarking of competitors is similar in spirit to the benchmarking of technical solutions
and considers the performance over time of entire portfolio of a company .Corporate
strategies may be deciphered from their performance on business criteria such as market
share in different regions, price points in the market over time, assets over time, buyouts,
inventory costs as fraction of sales, labour cost as fraction of sales.
Examples of Benchmarking
Different Coffee making mills in market which are available at different costs are
benchmarked on four major criteria
Like Noise max capacity, Drip timing and Espresso test
7. Sporting tools of benchmarking:
1)Assembly cost analysis: One key result of a benchmarking activity is a comparative
understanding of the cost structures that different competitors face. In teardown .one can
estimate cost of each part which is function of material, equipment, labour cost.
Example of assembly cost analysis
8. 2)Function form diagram
This diagram basically list the various solutions that are proven on the market for particular
product function. The result of benchmarking should be listed on function form diagram such
as part count, cost and any material information Finally best in class solutions on the market
should be called out such as highest quality and lowest cost models as shown in example
mentioned below
9. Methods to establish product specification:
Specifications for a new product are quantitative measurable criteria that the product should
be designed to satisfy. They are the measurable goals for the design team .Product
specifications should be established early and revised often. These are also called engineering
requirements.
Functional requirements are statements of specific performance of a design i.e. what product
should do. Product function is abstract formulation of the task that is to be accomplished and
is independent of any particular solution that is employed to achieve the desired result.
1)Method of specification sheets
This has following steps
i)Compile specifications-Arrange functional requirements and constraints in to clear order
ii)Determine if each functional requirement is demand or wish
iii)Determine if functional requirement and constraints are logically consistent. Check for
oblivious conflicts.
iv)Quantify wherever possible
v)Determine detailed approach for ultimately testing and verifying the specifications during
product development process.
vi)circulate specifications for comment and amendments
Categories for searching and decomposing specifications:
10. 2)Method of house of quality: Discussed earlier unit
3)Value Analysis:
For any engineering requirement, a target value is determined by simultaneously judging the
cost of attaining that target and customer desire in delivering that target. Value analysis
method is very quantitative and numerical.
We can define value or worth as the difference in the desire of the customer from the cost of
producing it. V=D-C and then we can pick a target value that maximizes this quantity.
But specifying desires of customer in various market situations is challenge.
Value analysis is a useful technique for comparing alternatives and alternative specifications
in reasonably well understood domains, where the customer perceive and state well their
expected and desired needs, and domains where the technology or markets are reasonably
unchanging and can be foreseen.
Trend Analysis:
Trend analysis is important to know changing desires of the customer and changes in market
segmentation. It’s a dynamic process which is driven by various data points from market such
as product sales figures, data related to market share, economic surveys, Technological
breakthroughs, and demography.
Regular tracking of this data and drawing meaningful conclusions from it called trend
analysis. It is very important process in defining future goals of organization and setting
product specifications. It also provides base for various important corporate level decisions
about organization.
11. Product Portfolio and Product Architecture:
Product portfolios are the set of different product offerings that a company provides. Product
portfolio architecture is the system strategy for laying out components and systems on
multiple products to best satisfy current and future market needs.
There are two basic corporate objectives considered in developing a product portfolio
architecture: cost and revenue.
Revenues increase with expanded offerings in a larger portfolio, as company can then make
products more tailored to each customer in the market but cost go up with added complexity
of developing, supporting and manufacturing larger set of different products.
Product Portfolio architectures falls into three basic categories: Fixed unsharing platform,
modular and massively customizable as shown in below figure
1)Fixed unsharing portfolio architecture:
In it each product in portfolio is unique and shares no components or systems with any other
product member in portfolio. It is used for high volume products eg. screw driver set
Single offer example is car model available in one colour only and example of robust type is
electric appliance suitable for 50Hz and 60 Hz frequency simultaneously.
12. 2)Modular architecture:
A modular product family is defined as the set of products supported at any one time by a
platform. e.g Two and four slice toaster
Modular product generation is defined as the architecture for product offerings that share the
same modular components in offerings that succeeded each other through time. E.g.same
capacity engines used in different types of car model.
Consumable platform :Two products using same consumable items e.g. Printer cartridge
Standard platform- A subset of a product system in the standard portfolio platform where a
subset of product system in a portfolio of products is a platform that conforms to an industry
agreed standards e.g. two different software running on same operating system.
Adjustable to purchase-Different market segment may have different requirements for some
of the subsystem in product e.g. computer manufacturers sell to a variety of customers with
different electrical power input requirements. To meet this the power supply function might
be isolated from the rest of the product as a module like SMPS.
3)Mass customization
Fabricate to fit:It is mass customization platform where the customer can special order the
platform at the exact specification desired e.g. In 1990 TOYOTA permitted customer to
specially order vehicle from large array of operation which led to increase in cost and
TOYOTA finally dropped the idea.
Adjustable for use-camera that permits adjusting focus to different distance are adjustable to
use type of architecture.
Types of Product architectures:
Developing product architecture is strategic decision for any class of products. It is where we
begin to take key decisions on how the product will physically operate..
Product architecting ,at basic level, starts the creation of effective layouts of components and
subsystems, where different tasks are completed by subset of product development team.
Creating a product architecture is focused on transforming product function to product form.
1) Integral product architecture-are physical structure where all the subfunctions map to a
single or very small number of physical elements. E.g. Two headed spanners.
2) Modular architecture-Product modules are integral physical product substructures that
have one to one correspondence with a subset of product’s functional model. E.g. Machines
with different assembly components.
13. Comparison of modular and integral architectures:
Pros cons
Modular
architecture
1)Improves device reconfigurability
2)Increase the device variety and speed of
introduction for new devices
3)Improves maintainability and serviceability
of device
4)Decouples development tasks and
manufacturing tasks
1)May make device look too
similar
2)Makes imitation easier to
competitors
3)Reduces device
performance
4)Modular design may be
more expensive than integral
design
Integral
Architecture
1)Harder for competitors to copy design
2)Tighter coupling of team with less interface
problems
3)Increase system performance
4)Possible reduction in system cost
1)Hinders change of design
in production
2) Reduces the variety of
devices that can be produced.
Product Modularity:
Two benefits of modular design are standardization of components and reconfigurability of
devices. We define two major types of modularity: Function based and manufacturing based.
A)Function based modularity:
1)Slot modularity: One basic devices uses several different components to allow it to perform
multiple tasks. This is also associated with the concept of component standardization.
e.g Different models of Power tool set having different operation tools but uses the same
battery to operate.
2)Bus modularity: A device that is equipped with a standard interface that accepts any
combinations of different functioning module. e.g. USB slots in computers
3)section modularity: chained interconnections of modules each equipped with an identical
interface- A device common heating element used in Travel hairdryer cum iron .
4)Mix modularity-Componets combines to make endless combination of products e.g.
Mechano toy sets
B)Manufacturing based modularity:
Different subassemblies are grouped on basis of manufacturing technique and assembly
operations. e.g .American Axle and Jamna Auto together provides Transmission Axle and
leaf springs to various models of different Original Equipment Manufacturers(OEM car
companies)
14. Basic method for modular design:
It is a four step process which has following four steps
Step1:Create a function structure of the product
Step2:Cluster the elements in to modules or chunk
Step3:Create rough geometric layouts
Step4:Define interactions and detail performance characteristics
References:
1)Product design book by Kevin Otto and Kristin woods
2)Engineering Design by George E Dieter
3)Product Design and development by Karl T Ulrich and Steven Eppinger
4) Various Web sources
Note-These notes are prepared for subject of Product DesignDevelopment (PDD) as
per syllabus of University of Pune .
These are for private circulation and not for any commercial use.