This document reviews methods for distributing effort across phases of the software development lifecycle (SDLC). It summarizes effort distribution patterns from models such as COCOMO, RUP, and SLIM. Analysis of a dataset from CSBSG shows differences from COCOMO II, with more effort allocated to planning/requirements and less to design. Statistical analysis found that development type and software size significantly affect the distribution of coding and testing effort, while team size has less impact. The document concludes with guidelines for phase-based effort distribution based on factors such as development type, software size, and team size.
THE UNIFIED APPROACH FOR ORGANIZATIONAL NETWORK VULNERABILITY ASSESSMENT (ijseajournal)
Today's business network infrastructure changes rapidly, with new servers, services, connections, and ports added frequently, sometimes daily, and with an uncontrolled influx of laptops, storage media, and wireless networks. With the growing number of vulnerabilities and exploits, coupled with the continual evolution of IT infrastructure, organizations now require more frequent vulnerability assessments. This paper proposes a new approach for network vulnerability assessment, the Unified Process for Network Vulnerability Assessment (hereafter called unified NVA), derived from the Unified Software Development Process (Unified Process), a popular iterative and incremental software development process framework.
Enhancing the Software Effort Prediction Accuracy using Reduced Number of Cos... (IRJET Journal)
This document presents research on modifying the COCOMO II software cost estimation model to improve prediction accuracy. The researchers reduced the number of cost estimation factors (called cost drivers) from 17 to 13 by adjusting the definitions and impact levels to better reflect current industry situations. They estimated effort for software projects using the modified model and found lower percentage errors compared to the original COCOMO II model, demonstrating improved estimation efficiency. The goal of the research was to analyze cost drivers and their impact on effort estimation in COCOMO II and enhance the model for more accurate predictions.
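As context for the modification described above, here is a minimal sketch of the nominal COCOMO II post-architecture effort equation, using the published COCOMO II.2000 constants. The paper's adjusted 13-driver multiplier set is not reproduced in this summary, so nominal (1.0) multipliers stand in.

    # Sketch of the nominal COCOMO II post-architecture effort equation
    # (COCOMO II.2000 constants A = 2.94, B = 0.91); the modified 13-driver
    # model described above would change the set of effort multipliers (EMs).

    def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
        """Person-months = A * Size^E * product(EMs), E = B + 0.01 * sum(SFs)."""
        A, B = 2.94, 0.91
        E = B + 0.01 * sum(scale_factors)
        product = 1.0
        for em in effort_multipliers:
            product *= em
        return A * ksloc ** E * product

    # Example: a 50 KSLOC project with nominal scale factors and 17 nominal drivers.
    print(cocomo2_effort(50, scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                         effort_multipliers=[1.0] * 17))  # roughly 217 person-months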
FACTORS ON SOFTWARE EFFORT ESTIMATION (ijseajournal)
Software effort estimation is an important process in the system development life cycle, as inaccurate estimates can affect the success of software projects. In the past few decades, various effort prediction models have been proposed by academics and practitioners. Traditional estimation techniques include Lines of Code (LOC), the Function Point Analysis (FPA) method, and Mark II Function Points (Mark II FP), which have proven unsatisfactory for predicting the effort of all types of software. In this study, the author proposes a regression model to predict the effort required to design small and medium scale application software. To develop the model, the author used 60 completed software projects developed by a software company in Macau, extracted factors from the projects, and applied them to a regression model. The resulting model predicts software effort with an accuracy of MMRE = 8%.
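For reference, the MMRE figure quoted above is the mean magnitude of relative error. A minimal sketch, with hypothetical effort values used purely for illustration:

    # MRE_i = |actual_i - predicted_i| / actual_i; MMRE = mean of the MRE_i.
    def mmre(actual, predicted):
        return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

    # Hypothetical effort values (person-hours), not data from the paper.
    actual = [120, 340, 95, 410]
    predicted = [128, 325, 101, 398]
    print(f"MMRE = {mmre(actual, predicted):.1%}")  # about 5% on these numbers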
Conveyor Belt Project Report using MS PROJECT: creating work packages, deliverables, and sub-deliverables and allocating resources to them. Analysis was done and suggestions were made for overall improvement.
The purpose of this report is to provide management with a revised status of the Super Conveyor Belt project. The report is organized by the four phases of the project life cycle: Defining/Initiating, Planning, Executing, and Closing. The first phase, Defining, incorporates high-level activities such as goals, specifications, identifying key tasks, and roles and responsibilities. The second phase, Planning, includes creating schedules, defining budgets, determining available resources and requirements, assessing risks, and staffing the team. The third phase, Executing, involves developing status reports, dealing with change, ensuring quality, and forecasting. All activities associated with Closing are projections, as that phase has not yet occurred; closure activities include training the customer, transferring documents, releasing resources, evaluations, and lessons learned.
This chapter discusses defining the scope and structure of a project. It covers 5 steps: 1) defining the project scope, 2) establishing priorities, 3) creating a work breakdown structure (WBS), 4) integrating the WBS with the organizational structure, and 5) coding the WBS for information systems. The WBS breaks down the project into deliverables, work packages, and schedules. It is integrated with the organizational breakdown structure to assign responsibilities to groups.
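As an illustration of step 5, coding the WBS, here is a short sketch of a hierarchically coded WBS cross-referenced to organizational units; all codes and names are hypothetical, following the common 1 / 1.1 / 1.1.1 numbering convention rather than any scheme from the chapter itself.

    # Hypothetical coded WBS; the lowest-level entries are work packages.
    wbs = {
        "1": "Conveyor Belt System",
        "1.1": "Hardware",                  # deliverable
        "1.1.1": "Motor assembly",          # sub-deliverable
        "1.1.1.1": "Motor mount design",    # work package
        "1.2": "Control Software",
        "1.2.1": "Sensor interface",
    }

    # Each work package code is cross-referenced to the OBS unit responsible
    # for it, giving the cost-account intersection the chapter describes.
    obs_assignment = {"1.1.1.1": "Mechanical Engineering", "1.2.1": "Software Group"}
    for code, name in sorted(wbs.items()):
        owner = obs_assignment.get(code, "-")
        print(f"{code:10} {name:25} {owner}")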
This document summarizes research on the potential benefits of implementing 3D parametric modeling in precast concrete construction. Leading precast concrete companies have invested in developing such modeling software solutions with the goal of improving productivity throughout the precast business process. Initial experiences are beginning to confirm expectations of productivity gains and error reduction. This paper provides benchmarks for quantifying various direct and indirect benefits that have been identified, including estimated economic benefits for a large precast company over four years of adoption. It outlines how 3D modeling can automate routine tasks, apply standardized details, and maintain consistency through parametric relationships to improve engineering productivity.
This document provides an overview of key project management terms and concepts related to the Project Management Professional (PMP) certification. It includes a mind map covering areas such as processes, knowledge areas, project documents, scheduling techniques, communication methods, procurement, and risk management. The document is intended to familiarize readers with important vocabulary and topics in the field of project management.
IRJET - Model Driven Methodology for JAVA (IRJET Journal)
This document proposes a model-driven methodology to automatically convert standard Java applications into real-time Java applications using the Real-Time Specification for Java (RTSJ). It describes applying the principles of model-driven engineering to develop real-time systems in Java. The methodology defines a series of model transformation steps to convert existing time-sharing Java code into code that complies with RTSJ and supports real-time features like scheduling, memory management, and asynchronous event handling. The methodology aims to reduce the cost and errors of developing real-time applications by automating the conversion process using modeling standards.
Process-Centred Functionality View of Software Configuration Management: A Co... (theijes)
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
Sirin et al., A Model Identity Card to Support Simulation Model Development Pro... (goknursirin)
This document proposes a Model Identity Card (MIC) to help classify simulation models and support the simulation model development process in a collaborative multidisciplinary design environment. It aims to reduce inconsistencies, ambiguity, and rework between different domain experts and simulation model providers by establishing a common vocabulary and formalizing the model design phase early in the development process. The MIC would provide concise specifications for simulation models, including input/output parameters, method, and usage to improve knowledge sharing. An industrial case study is used to validate how the MIC and integrated model design phase could be implemented.
This document discusses various aspects of project management including defining a project, characteristics of projects, project life cycles, and estimating project time and costs. It defines a project as a complex, non-routine effort with established objectives, a defined life span, and cross-organizational participation. Successful project management requires understanding an organization's strategy and culture. Estimating project time and costs involves both top-down and bottom-up approaches.
Computer information project planning is one of the most important activities in the modern software development process. Without an objective and realistic software project plan, the development process cannot be managed effectively. This research identifies general measures for the specific goals and specific practices of the Project Planning process area in Capability Maturity Model Integration (CMMI). CMMI was developed by the Software Engineering Institute (SEI) at Carnegie Mellon University in the USA and is a framework for the assessment and improvement of computer information systems. The procedure used to determine the measures is to apply the Goal Question Metric (GQM) approach to the three specific goals and fourteen specific practices of the Project Planning process area in CMMI.
The document discusses project planning measures in the Capability Maturity Model Integration (CMMI). It applies the Goal Question Metric (GQM) approach to identify measures for the three specific goals and fourteen specific practices of the Project Planning process area in CMMI. The paper defines questions and measures related to each specific practice by following the three steps of GQM: defining goals, generating quantifiable questions, and defining measures to answer the questions. The identified measures are intended to help evaluate and control software products and processes.
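To make the three-step GQM derivation concrete, here is a small sketch of a goal-to-questions-to-measures mapping. The goal, question, and measure texts are illustrative, not the paper's actual content.

    # GQM: goals spawn quantifiable questions; measures answer the questions.
    gqm = {
        "Goal: establish realistic project estimates": {
            "How accurate are the size estimates?": [
                "estimated vs. actual size", "size estimation error (%)"],
            "How stable is the project scope?": [
                "number of scope changes", "effort impact per change"],
        },
    }

    for goal, questions in gqm.items():
        print(goal)
        for question, measures in questions.items():
            print("  Q:", question)
            for m in measures:
                print("     M:", m)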
This document discusses concurrent development of formal models and software implementations in an evolutionary development process. It proposes using formal models, such as those written in B or Z, to support iterative software development approaches. The document describes benefits like animation and model-checking that formal models enable through tool support. It then presents two case studies on e-business applications where teams attempted to develop models and implementations concurrently and evolve them together. The case studies aimed to evaluate how well formal modeling techniques could be integrated into an evolutionary development process.
Senior Capstone - Systems Operations Manual (Kevin Kempton)
The CISB 471 team is working on a project that involves redesigning web pages and databases for multiple departments at Mesa State College. The project includes redesigning the Business Department website, creating a new online store interface for the bookstore, developing a new scholarship tracking database for the Financial Aid Office, and building a new survey system for the Business Department. The team utilized the Systems Development Life Cycle approach and created Gantt charts to plan and manage the project schedule.
The document provides details of a computer-controlled conveyor belt project including a work breakdown structure, schedule, and resource plan. It includes instructions for a multi-part exercise to develop the project plan, address schedule and resource constraints, and provide quarterly status reports. Revised estimates are then provided requiring an update to the estimated completion date, cost, and recommendations.
The document outlines steps for defining system capabilities to achieve objectives for specific scenarios, including: 1) partitioning capabilities into classes and connecting them to requirements, 2) defining measures of effectiveness and performance, 3) creating scenarios for each capability and connecting them to a value stream map, 4) assigning costs and defining risks, and 5) making tradeoffs that compare costs, schedule, and technical performance using measures of effectiveness and performance.
Role of Functional Organization in Large Engineering and Construction Programs (Bob Prieto)
Large corporate organizations typically employ some form of matrix organization to ensure a consistent approach in key areas across the organization. The nature and extent of this matrix or functional organization will be driven by:
•common approaches to human resources
•consistent application of legal approvals and reviews of significant actions
•common financial functions related to accounting, cash management, insurance and claims & suits
•common managerial, technical and support functions which accrue benefits from a consistent and coordinated approach
Within a project setting, required resources generally reside at the project level and corporate functional activities extend into the project environment only to the extent required to protect the parent organization, consistent with client requirements and practices.
The situation in large programs, however, is different, and a functional organization more akin to the corporate functional organization is often created within the program team. This program-level functional organization acts much in the same way as the corporate functional organization, but its role and emphasis evolve throughout the program's life.
A typical program management organization will include a functional organization that will provide people, management processes, program-level project control tools, and systems. The program management team will thereby bring enhanced management, quality control, efficiency, and coordination to the entire program.
Abstract: The management of software cost, development effort, and project planning are key aspects of software development. Throughout the sixty-odd years of software development, the industry has gone through at least four generations of programming languages and three major development paradigms, yet the ability to move consistently from idea to product has still not been achieved. In fact, recent studies document that the failure rate for software development has risen to almost 50 percent. There is no magic in managing software development successfully, but a number of issues make it unique. Software development is inherently risky; examples of risks include estimation errors, schedule slips, projects cancelled after numerous slips, high defect rates, systems going sour, business misunderstandings, false feature richness, and staff turnover. XSoft Estimation addresses these risks through accurate measurement: a new estimation methodology based on the COSMIC Full Function Point measure, named eXtreme Software Estimation (XSoft Estimation). Based on experience gained on the original XSoft project development, this paper describes what makes XSoft Estimation work, from sizing to estimation. Keywords: COSMIC function size unit, XSoft Estimation, XSoft Measurement, Cost Estimation.
This document discusses various aspects of project scheduling and risk management for software projects. It covers topics such as defining tasks, critical path analysis, earned value analysis, identifying risks, estimating risk probability and impact, and mitigating risks. The key aspects are determining the schedule using techniques like CPM, tracking progress through earned value analysis, and taking a proactive approach to risk management by identifying, analyzing, and developing plans to address potential risks.
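As a concrete reference for the earned value analysis mentioned above, here is a minimal sketch of the standard indicators computed on hypothetical status-date figures.

    # PV = planned value, EV = earned value, AC = actual cost (hypothetical $).
    PV, EV, AC = 100_000, 80_000, 90_000

    SV = EV - PV            # schedule variance (negative: behind schedule)
    CV = EV - AC            # cost variance (negative: over budget)
    SPI = EV / PV           # schedule performance index
    CPI = EV / AC           # cost performance index

    BAC = 250_000           # budget at completion
    EAC = BAC / CPI         # estimate at completion, assuming current CPI holds

    print(f"SV={SV}, CV={CV}, SPI={SPI:.2f}, CPI={CPI:.2f}, EAC={EAC:,.0f}")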
This document discusses communication challenges that arise during the requirements elicitation process for software projects. It classifies these challenges into four categories based on differences in the perspectives and experiences of users and analysts. The document then introduces the Repertory Grid technique as a way to help address these communication issues and facilitate shared understanding between users and analysts. It provides an example of how Repertory Grid was used in a case study of requirements elicitation for a complex data warehouse project.
So WHY invest in better project management and process improvement? This paper points to the answer. The lessons are very powerful. Cut defects and costs by 60%. Read on ...
The business case for software process improvement may also be the business case for project management - in as much as CMMi process improvement implements the fundamentals of project management. This paper leverages work done by Larry Putnam, Stan Rifkin and Joe Kolinger in applying metrics to understand productivity and cost improvements in technology projects.
IRJET- Code Reuse & Reusability of the Software (IRJET Journal)
This document discusses code reuse and reusability in software development. It defines code reuse as using existing software to create new software. Reuse can reduce costs and improve software quality by reusing code, structures, architectures and other components from one application to another. The document reviews literature on software metrics and reuse programs. It describes an ideal software reuse process with four components: create reusable assets, use assets to develop new products, support the asset library, and manage the overall reuse process and organization.
This document describes a new methodology called Extreme Software Estimation (XSoft Estimation) for accurately estimating software projects. XSoft Estimation uses COSMIC-Full Function Points (FFP) to measure software size and then applies a model of Development Effort = Size * Variable to estimate effort, cost, and schedule. The methodology was tested on 5 projects measuring their size in CFP units and comparing actual development time between expert and skilled teams, different programming languages and layers. The results showed expert teams and some languages/layers took significantly less time than others for the same sized functionality. XSoft Estimation aims to improve on past methods by basing estimates directly on measured functionality using COSMIC FFP.
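A minimal sketch of the stated estimation form, Development Effort = Size * Variable, with size measured in COSMIC function points. The productivity rates below are hypothetical placeholders for the expert and skilled teams; the paper's actual figures are not reproduced here.

    def xsoft_effort(size_cfp, hours_per_cfp):
        # Effort = Size * Variable, with the variable a calibrated rate.
        return size_cfp * hours_per_cfp

    rates = {"expert team": 2.0, "skilled team": 3.5}  # hypothetical hours/CFP
    size = 180  # CFP measured for a project
    for team, rate in rates.items():
        print(f"{team}: {xsoft_effort(size, rate):.0f} person-hours")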
Software metrics success, failures and new directions (Andrws Vieira)
This document summarizes the history and status of software metrics in both academia and industry. It discusses that while academic research on software metrics has grown exponentially, industrial use of metrics has remained focused on simple counts like lines of code and defects. The document argues that traditional regression models used to relate metrics to quality are inadequate, and that capturing uncertainty and combining evidence is needed. It introduces Bayesian belief networks as an approach to building management tools using simple metrics while handling these issues.
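As a toy illustration of the Bayesian reasoning the document advocates, not its actual belief network, here is a single Bayes-rule update of a quality belief from a defect-count observation; all probabilities are hypothetical.

    # Prior belief about product quality and hypothetical likelihoods.
    p_good = 0.7                       # P(quality = good)
    p_high_given_good = 0.1            # P(high defect count | good)
    p_high_given_bad = 0.6             # P(high defect count | bad)

    # Observe a high defect count; apply Bayes' rule.
    evidence = p_high_given_good * p_good + p_high_given_bad * (1 - p_good)
    posterior_good = p_high_given_good * p_good / evidence
    print(f"P(good | high defects) = {posterior_good:.2f}")  # drops from 0.70 to 0.28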
A new model for software cost estimation (ijfcstjournal)
Accurate and realistic estimation has always been considered a great challenge in the software industry. Software Cost Estimation (SCE) is a standard practice used to manage software projects, and the estimate produced in the initial stages of a project underpins the planning of its other activities. In fact, estimation is confronted with a number of uncertainties and barriers, and assessing previous projects is essential to addressing this problem. Several models have been developed for the analysis of software projects. The classical reference method is the COCOMO model; other methods such as Function Point (FP) and Lines of Code (LOC) are also applied, and expert opinion also matters in this regard. In recent years, the growth and combination of highly accurate meta-heuristic algorithms have brought about great achievements in software engineering. Meta-heuristic algorithms, which can analyze data across multiple dimensions and identify the optimum solution among them, are analytical tools for data analysis. In this paper, we use the Harmony Search (HS) algorithm for SCE. The proposed model has been assessed on a collection of 60 standard projects from the NASA60 dataset. The experimental results show that the HS algorithm is a good way to determine the weighted similarity measure factors of software effort and to reduce the MRE error.
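To make the harmony search idea concrete, here is a minimal sketch that calibrates a simple effort model, effort = a * size^b, by minimizing MMRE. The four-project dataset and the HS parameter settings are illustrative stand-ins, not the paper's NASA60 configuration.

    import random

    # Hypothetical (KLOC, person-months) pairs, not NASA60 data.
    data = [(10, 24), (25, 70), (46, 130), (70, 220)]

    def mmre(a, b):
        return sum(abs(act - a * s**b) / act for s, act in data) / len(data)

    # Typical textbook HS parameters: memory size, memory consideration rate,
    # pitch adjustment rate, bandwidth, iterations.
    HMS, HMCR, PAR, BW, ITERS = 10, 0.9, 0.3, 0.05, 2000
    bounds = [(0.5, 5.0), (0.8, 1.5)]  # search ranges for a and b

    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(HMS)]
    for _ in range(ITERS):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if random.random() < HMCR:                 # draw from harmony memory
                x = random.choice(memory)[j]
                if random.random() < PAR:              # pitch adjustment
                    x = min(hi, max(lo, x + random.uniform(-BW, BW)))
            else:                                      # random consideration
                x = random.uniform(lo, hi)
            new.append(x)
        worst = max(range(HMS), key=lambda i: mmre(*memory[i]))
        if mmre(*new) < mmre(*memory[worst]):          # replace worst harmony
            memory[worst] = new

    best = min(memory, key=lambda h: mmre(*h))
    print(f"a={best[0]:.3f}, b={best[1]:.3f}, MMRE={mmre(*best):.3f}")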
COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES (ijseajournal)
Many information technology firms, among other organizations, have been working on how to estimate resources such as funds during software development processes. Software development life cycles require many activities and skills to avoid risks, and the best software estimation technique should be employed. Therefore, this research conducts a comparative study that considers the accuracy, usage, and suitability of existing methods, intended for project managers and project consultants throughout the software project development process. Both algorithmic and non-algorithmic techniques are considered: model-based, composite, and regression techniques underlie COCOMO, COCOMO II, SLIM, and multiple linear regression, respectively, while expertise-based and rule-based approaches are applied in non-algorithmic methods. However, these techniques need some advancement to reduce the errors experienced during the software development process. Therefore, this paper proposes a model for software estimation that can be helpful to information technology firms, researchers, and other organizations that use information technology in processes such as budgeting and decision making.
How Should We Estimate Agile Software Development Projects and What Data Do W... (Glen Alleman)
Estimating techniques for an acquisition program progress from analogy to actual-cost methods as the program matures and more information becomes known. The analogy method is most appropriate early in the program life cycle, when the system is not yet fully defined.
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA... (ijseajournal)
During the software development phase, software artifacts are in inconsistent states: some class artifacts are fully developed, some half developed, some mostly developed, some minimally developed, and some not yet developed. At this stage, allowing too many software requirement changes may delay project delivery and increase the development budget; on the other hand, rejecting too many changes may increase customer dissatisfaction. Software change effort estimation is therefore one of the most challenging and important activities, helping software project managers accept or reject changes during the development phase. This paper extends our previous work on a software requirement change effort estimation model prototype tool for the software development phase. The tool's significant achievements are demonstrated through extensive experimental validation using several case studies. The experimental analysis shows improved estimation accuracy over current change effort estimation models.
Efficient Indicators to Evaluate the Status of Software Development Effort Es... (IJMIT JOURNAL)
Development effort is an undeniable part of project management that considerably influences the success of a project; inaccurate and unreliable effort estimates can easily lead to project failure. Owing to their special characteristics, accurate estimation of effort in software projects is a vital management activity that must be done carefully to avoid unforeseen results. Although numerous effort estimation methods have been proposed in this field, the accuracy of estimates is not satisfying, and attempts to improve the performance of estimation methods continue. Prior research in this area has focused on numerical and quantitative approaches, and few studies investigate the root problems and issues behind inaccurate estimation of software development effort. In this paper, a framework is proposed to evaluate and investigate an organization's situation in terms of effort estimation. The proposed framework includes various indicators that cover the critical issues in the field of software development effort estimation. Since organizations differ in their estimation capabilities and shortcomings, the proposed indicators enable a systematic approach in which the strengths and weaknesses of organizations in effort estimation are discovered.
Perspectives on the adherence to scrum rules in software project management (nooriasukmaningtyas)
Adapting to users' needs, fulfilling their requirements, and delivering products on time and within planned cost are critical matters to which all software project managers (SPMs) give the highest priority, while also considering user satisfaction. Agile methodology is one of the solutions provided by software engineers to involve customers in the system development life cycle (SDLC) and avoid the cost of nonconformance. Yet SPMs still face nonconformance costs and dynamic changes, and the root cause of the issue has not been pinpointed so that a solution can be found. This research aimed to determine whether software developers understand Scrum rules, and how this knowledge gap affects software project success from a project management perspective. It also studied the impact of insufficient knowledge of the topic on project delivery. Data collected through qualitative and quantitative methods from Scrum teams working on health information systems (HIS), educational solutions, and governmental solutions showed deviations in organizational practices; team conflict, competition, and pressure; and declined product quality.
A NEW HYBRID FOR SOFTWARE COST ESTIMATION USING PARTICLE SWARM OPTIMIZATION A... (ieijjournal)
Software Cost Estimation (SCE) is considered one of the most important areas in software engineering, with significant influence on cost and effort processes. Two factors, cost and effort, determine the success or failure of software projects: a project completed within a given time and manpower is successful and yields good profit for project managers. Most SCE techniques use algorithmic models such as COCOMO. The COCOMO model cannot estimate close approximations to the actual cost because it is linear in form, so models should be adapted that can estimate effort factors fairly and accurately alongside the number of Lines of Code (LOC). Metaheuristic algorithms can be good models for SCE owing to their local and global search abilities. In this paper, we use a hybrid of Particle Swarm Optimization (PSO) and Differential Evolution (DE) for SCE. Test results on the NASA60 software dataset show that the Mean Magnitude of Relative Error (MMRE) of the hybrid model is reduced by about 9.55% compared with the COCOMO model.
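A hedged sketch in the same spirit: a basic PSO loop with a DE-style mutation step, calibrating a toy effort model, effort = a * KLOC^b, against MMRE. The dataset, bounds, and constants are illustrative, not the paper's NASA60 configuration.

    import random

    data = [(10, 24), (25, 70), (46, 130), (70, 220)]  # hypothetical (KLOC, PM)
    def cost(p):
        a, b = p
        return sum(abs(act - a * s**b) / act for s, act in data) / len(data)

    LO, HI = [0.5, 0.8], [5.0, 1.5]                    # bounds for a and b
    N, ITERS, W, C1, C2, F = 20, 300, 0.7, 1.5, 1.5, 0.5

    pos = [[random.uniform(LO[j], HI[j]) for j in range(2)] for _ in range(N)]
    vel = [[0.0, 0.0] for _ in range(N)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)

    for _ in range(ITERS):
        for i in range(N):
            for j in range(2):  # standard PSO velocity/position update
                vel[i][j] = (W * vel[i][j]
                             + C1 * random.random() * (pbest[i][j] - pos[i][j])
                             + C2 * random.random() * (gbest[j] - pos[i][j]))
                pos[i][j] = min(HI[j], max(LO[j], pos[i][j] + vel[i][j]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
        # DE-style mutation on three random personal bests refines the search.
        r1, r2, r3 = random.sample(pbest, 3)
        trial = [min(HI[j], max(LO[j], r1[j] + F * (r2[j] - r3[j]))) for j in range(2)]
        if cost(trial) < cost(gbest):
            gbest = trial
        gbest = min(pbest + [gbest], key=cost)

    print(f"a={gbest[0]:.3f}, b={gbest[1]:.3f}, MMRE={cost(gbest):.3f}")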
This document discusses the transition from traditional waterfall software development models to more agile approaches like Scrum and Kanban. It outlines some key limitations of the waterfall model, including unrealistic assumptions about requirements stability and integration challenges. Many software projects adopting waterfall experienced late delivery, changing requirements issues, and customer dissatisfaction. More iterative agile methods like Scrum and Kanban address these issues by emphasizing working software over documentation, incremental delivery, and flexibility. Studies show higher success rates for agile projects compared to waterfall. Large organizations are increasingly adopting agile practices across many teams and projects.
The peer-reviewed International Journal of Engineering Inventions (IJEI) was started with a mission to encourage contributions to research in Science and Technology, and to encourage and motivate researchers in challenging areas of Sciences and Technology.
The document proposes updated definitions for technology, manufacturing, and services readiness levels based on lean product development principles. It argues the current definitions promote a flawed "build-test-fix" approach and presents alternative "Lean TRL", "Lean MRL", and "SRL" definitions grounded in robust design, design for six sigma, and lean principles. The updated levels aim to characterize and validate performance earlier to reduce costly late iterations compared to the conventional approach.
For years, software projects have mostly exceeded budgets, been delivered late, and failed to meet customer satisfaction. In the past, many traditional development models, such as waterfall, spiral, iterative, and prototyping methods, were used to build software systems. In recent years, agile models have been widely used in developing software products, the major reasons being simplicity, the ability to incorporate requirement changes at any time, a lightweight approach, and delivery of a working product early and in short iterations. Whatever development model is used, it remains a challenge for software engineers to accurately estimate the size, effort, and time required to develop a software system. This survey focuses on existing estimation models used in traditional as well as agile software development.
The performance of an algorithm can be improved using a parallel programming approach. In this study, the performance of the bubble sort algorithm was measured on various computer specifications. Experimental results showed that parallel programming can save significant time, performing 61%-65% faster than serial programming.
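The usual way to parallelize bubble sort is odd-even transposition sort, in which each phase's compare-exchange pairs are independent and can be distributed across workers. The toy sketch below only illustrates that phase structure: dispatching single pairs to worker processes costs more than it saves in practice, and real implementations distribute contiguous chunks instead. The 61%-65% figure above is the study's, not this sketch's.

    from multiprocessing import Pool

    def compare_exchange(pair):
        a, b = pair
        return (a, b) if a <= b else (b, a)

    def odd_even_sort(data, workers=4):
        n = len(data)
        with Pool(workers) as pool:
            for phase in range(n):
                start = phase % 2  # even phases pair (0,1),(2,3); odd pair (1,2),(3,4)
                pairs = [(data[i], data[i + 1]) for i in range(start, n - 1, 2)]
                # Pairs within a phase do not overlap, so they may run in parallel.
                for i, (x, y) in zip(range(start, n - 1, 2), pool.map(compare_exchange, pairs)):
                    data[i], data[i + 1] = x, y
        return data

    if __name__ == "__main__":
        print(odd_even_sort([5, 1, 4, 2, 8, 0, 3]))  # [0, 1, 2, 3, 4, 5, 8]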
Shell is using business simulation software to improve its front-end planning processes for oil and gas projects. The software allows for faster modeling and scenario analysis compared to traditional spreadsheet methods. It also facilitates integrative planning across subsurface, surface, and economic domains. The new approach aims to reduce time spent on opportunity evaluations and planning while maintaining understanding of complex projects. Shell managers emphasize that successful adoption requires changes to workflows and thinking, not just the software itself.
STATE-OF-THE-ART IN EMPIRICAL VALIDATION OF SOFTWARE METRICS FOR FAULT PRONEN... (IJCSES Journal)
With the sharp rise in software dependability requirements and failure costs, high quality has been in great demand. However, guaranteeing high quality in software systems, which have grown in size and complexity under the constraints imposed on their development, has become an increasingly difficult, time- and resource-consuming activity. Consequently, it becomes essential to deliver software that has no serious faults. Object-oriented (OO) products, the de facto standard of software development, can with their unique features contain faults that are hard to find, or make the impacts of changes hard to pinpoint. The earlier faults are identified, found, and fixed, the lower the costs and the higher the quality. Software metrics are used to assess product quality, and many OO metrics have been proposed and developed. Furthermore, many empirical studies have validated the relationship between metrics and class fault proneness (FP). The challenge is determining which metrics are related to class FP and what activities are performed. Therefore, this study brings together the state of the art in fault-proneness prediction utilizing CK and size metrics. We conducted a systematic literature review of relevant published empirical validation articles; the results, analysed and presented here, indicate that 29 relevant empirical studies exist and that measures such as complexity, coupling, and size are strongly related to FP.
A Methodology For Large-Scale E-Business Project Management (April Smith)
This document proposes an environment-based methodology for managing large-scale e-business projects. The methodology defines six working environments - development, integration, pre-production, production, demonstration, and software repository - that represent increasing stages of stability for a software product. It describes the tasks and migration processes between environments. The methodology aims to systematically guide e-business project management according to an organization's needs and resources.
A Synergistic Approach To Information Systems Project Management (Joe Osborn)
This document discusses synergism in information systems project management. It presents a model of group synergy in IS project management that considers both qualitative and quantitative factors. Qualitative factors include morale, supportiveness, participation, coordination, integration and commitment. Quantitative output includes a structured and on-task team with synchronized productivity. The model provides a framework to better understand how technical and organizational factors can facilitate or inhibit IS project success. It aims to bring qualitative and quantitative aspects of project management together in a complementary manner to help IS professionals improve systems development.
EFFICIENCY OF SOFTWARE DEVELOPMENT AFTER IMPROVEMENTS IN REQUIREMENTS ENGINEE...ijseajournal
In the past decade, multiple challenges have arisen from the way software is developed [4, 4]. As described by Davenport, the development process needs an overhaul [4, 4]. Different disciplines, such as project management, requirements engineering, coding, and quality assurance, have been investigated intensively in order to improve development productivity. To obtain valid results, the overhaul needs to start by refactoring the right process first. It is often sensible to start with processes that operate at the interface to the customer, because they are perhaps the most critical to an organization's success [3, 270-271]. Software development mainly consists of four sub-processes: requirements engineering, development, quality assurance, and delivery. Requirements engineering and delivery operate at the interface to the customer. Because requirements analysis is so foundational, we select this process as the starting point of a process innovation initiative. We analyse the impact of requirements engineering in KANBAN development processes, with special emphasis on the productivity of the overall development process after a refactoring of requirements engineering.
IRJET- Analysis of Software Cost Estimation TechniquesIRJET Journal
This document analyzes and compares different software cost estimation techniques using machine learning algorithms. It uses the COCOMO and function point estimation models on NASA project datasets to test the performance of the ZeroR and M5Rules classifiers. The M5Rules classifier produced more accurate results with lower mean absolute errors and root mean squared errors compared to COCOMO, function points, and the ZeroR classifier. Therefore, the study suggests using M5Rules techniques to build models for more precise software effort estimation.
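For context, a minimal sketch of the basic COCOMO effort equation E = a * KLOC^b with Boehm's published mode coefficients; the study's modified and machine-learned models are not reproduced here.

# Basic COCOMO: effort in person-months as a function of size in KLOC.
MODES = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kloc, mode="organic"):
    a, b = MODES[mode]
    return a * kloc ** b

print(f"{cocomo_effort(32, 'organic'):.1f} person-months")  # ~91.3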
These days heart diseases, including the risk of heart attacks, are on the rise. Our proposed system uses sensors that detect a person's heart rate using heartbeat sensing, even while the person is at home. The sensor is interfaced to a microcontroller that checks heart rate readings and transmits them over the internet. The user may set both high and low heart rate limits. After these limits are set, the system starts monitoring, and as soon as the patient's heart rate goes above the upper limit, the system sends an alert to the controller, which transmits it over the internet and alerts the doctors as well as the concerned users; the system likewise alerts on abnormally low heart rates. Whenever a user logs on for monitoring, the system also displays the patient's live heart rate. Thus, concerned parties may monitor the heart rate and receive an immediate heart attack alert from anywhere, so the person can be saved in time. This risk will continue to grow if no proper solution is found. Internet of Things (IoT) technology developments allow humans to control a variety of high-tech equipment in daily life, including checking health using a phone, tablet, or laptop. We also focus on safety measures for both driver and vehicle using three types of sensors: a heartbeat sensor, a traffic light sensor, and a level sensor. The heartbeat sensor constantly monitors the driver's heart rate and helps prevent accidents through IoT-based control.
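A minimal sketch of the threshold-based alert logic described above; the sensor interface and network transport are stubbed out, and the limits are assumed defaults, not values from the paper.

# Compare each reading against user-configurable limits and raise alerts.
LOW_BPM, HIGH_BPM = 50, 120  # assumed default limits

def check_heart_rate(bpm):
    if bpm > HIGH_BPM:
        return "ALERT: heart rate above upper limit"
    if bpm < LOW_BPM:
        return "ALERT: heart rate below lower limit"
    return "OK"

for reading in (72, 130, 44):
    print(reading, "->", check_heart_rate(reading))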
ABSTRACT: The success of the cloud computing paradigm is due to its on-demand, self-service, and pay-by-use nature. Public key encryption with keyword search applies only to circumstances in which a keyword ciphertext can be retrieved by one specific user, and it supports only single-keyword matching. In existing searchable encryption schemes, either the communication mode is one-to-one or only single-keyword search is supported. This paper proposes a searchable encryption scheme that is based on attributes and supports multi-keyword search. Searchable encryption is a primitive which not only protects the data privacy of data owners but also enables data users to search over the encrypted data. Most existing searchable encryption schemes are in the single-user setting; there are only a few schemes in the multiple-data-user setting, i.e., encrypted data sharing. Among these, most of the early techniques depend on a trusted third party with interactive search protocols or need cumbersome key management. To remedy these defects, the most recent approaches borrow ideas from attribute-based encryption to enable attribute-based keyword search (ABKS).
This document reviews the behavior of reinforced concrete deep beams. Deep beams are defined as having a shear span to depth ratio of less than 5. The response of deep beams differs from regular beams due to the influence of shear deformations and stresses. Failure modes include flexure, flexural-shear, and diagonal cracking. Previous studies investigated factors affecting shear strength such as concrete strength, reinforcement, and loading conditions. Equations have been proposed to predict shear strength based on test results.
Subcutaneous administration of toluene to rabbits for 6 weeks resulted in significant increases in liver enzyme levels and histopathological changes in the liver tissue. Liver sections from toluene-treated rabbits showed congested central veins, flattening and vacuolation of hepatocytes, and disarrangement of hepatic architecture. In contrast, liver sections from control rabbits appeared normal. Toluene exposure is known to cause oxidative stress and damage cell membranes in the liver through its metabolism.
This document summarizes a research paper that proposes a system to analyze crop phenology (growth stages) using IoT to support parallel agriculture management. The system would use sensors to collect data on soil moisture, temperature, humidity and other parameters. This data would be input to a database. Then, a multiple linear regression model trained on past data would predict the optimal crop and expected yield based on the tested sensor data and parameters. This system aims to help farmers select crops and fertilization practices tailored to their specific fields' conditions.
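A minimal sketch of such a multiple linear regression step using scikit-learn; the features, sample data, and units are illustrative assumptions, not the paper's dataset.

# Fit yield against sensor features, then predict for new field readings.
import numpy as np
from sklearn.linear_model import LinearRegression

# columns: soil moisture (%), temperature (C), humidity (%)
X = np.array([[30, 24, 60], [45, 27, 70], [25, 31, 40], [50, 22, 80]])
y = np.array([2.1, 3.4, 1.5, 3.9])  # yield in tonnes/hectare (synthetic)

model = LinearRegression().fit(X, y)
print(model.predict([[40, 25, 65]]))  # expected yield for a new field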
This document summarizes a study that determined the liberation size of gold ore from the Iperindo-Ilesha deposit in Nigeria and assessed its amenability to froth flotation. Samples of the ore were collected and subjected to sieve analysis to determine particle size fractions. Chemical analysis found that the actual and economic liberation sizes were 45μm and 250μm, respectively. Froth flotation experiments at 45μm particle size and varying collector dosages achieved a maximum gold recovery of 78.93% at 0.3 mol/dm3 collector dosage, with concentrate grade of 115 ppm Au. These parameters will be used for further processing to extract gold from this deposit.
This document presents a proposal for an IOT-based intelligent baby care system with a web application for remote baby monitoring. The system uses sensors to automatically swing a cradle when a baby cries, sound alarms if the baby cries for too long or the mattress is wet, and sends alerts to a web page for parents to monitor the baby's status from anywhere via internet connection. The proposed system aims to help working parents manage childcare remotely using sensors, a Raspberry Pi, web camera, and cloud server to detect the baby's activities and notify parents through a web application on their phone.
This document discusses various sources of water pollution and new techniques being developed for water purification. It begins by outlining how water pollution occurs from industrial wastes like mining and manufacturing, agricultural runoff containing pesticides, and domestic waste. It then examines some specific pollutants in more depth from these sources. New techniques under research for water purification are also mentioned, with the goal of developing more affordable methods. The document aims to analyze the impact of pollutants on water and introduce promising new purification techniques.
This document summarizes a research paper on using big data methodologies with IoT and its applications. It discusses how big data analytics is being used across various fields like engineering, data management, and more. It also discusses how IoT enables the collection of massive amounts of data from sensors and devices. Machine learning techniques are used to analyze this big data from IoT and enable communication between devices. The document provides examples of domains where big data and IoT are being applied, such as healthcare, energy, transportation, and others. It analyzes the similarities and differences in how big data techniques are used across these IoT domains.
The document describes a proposed smart library automation and monitoring system using RFID technology. The system uses RFID tags attached to books and student ID cards. An RFID scanner reads the tags to automate processes like tracking student entry and exit, book check-in/check-out, and inventory management. This allows transactions to occur without manual intervention. The system also includes an Android app for students to search books and check availability. The goals are to streamline library operations, prevent unauthorized access, and help locate misplaced books. Raspberry Pi hardware and a MySQL database are part of the proposed implementation.
This document discusses congestion control techniques for vehicular ad hoc networks (VANETs). It first provides background on VANETs, noting their use of vehicle-to-vehicle communication to share information. Congestion can occur when there is a sudden increase in data from nodes in the network. The document then reviews different existing congestion control schemes, which vary in how they adjust source sending rates and handle transient congestion. It proposes a priority-based congestion control technique using dual queues, one for transit packets and one for locally generated packets. This approach aims to route packets along less congested paths when congestion is detected based on buffer occupancy.
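A minimal sketch of the dual-queue idea, with transit packets given strict priority over locally generated ones; the queue discipline and the buffer-occupancy congestion test are assumptions based on the description above.

from collections import deque

transit_q, local_q = deque(), deque()
BUFFER_LIMIT = 8  # congestion inferred from total buffer occupancy (assumed)

def enqueue(packet, is_transit):
    (transit_q if is_transit else local_q).append(packet)

def dequeue():
    # Transit traffic is forwarded first; local packets go out only when
    # no transit packet is waiting.
    if transit_q:
        return transit_q.popleft()
    return local_q.popleft() if local_q else None

def congested():
    return len(transit_q) + len(local_q) > BUFFER_LIMIT

enqueue("p-local", is_transit=False)
enqueue("p-transit", is_transit=True)
print(dequeue())  # p-transit is forwarded first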
This document summarizes a research paper that proposes applying principles of Vedic mathematics to optimize the design of multipliers, squarers, and cubers. It begins by providing background on multipliers and their importance in electronic systems. It then reviews related work applying Vedic mathematics to multiplier design. The document outlines the methodology for performing multiplication, squaring, and cubing according to Vedic mathematics principles. It presents simulation and synthesis results comparing the proposed Vedic designs to traditional array-based designs, finding improvements in speed, power, and area. The document concludes that Vedic mathematics provides an effective approach for optimizing the design of these fundamental arithmetic components.
Cloud computing is one of the emerging techniques for processing big data. A large collection or large volume of data is known as big data. Processing big data (MRI images and DICOM images) normally takes more time compared with other data. The main tasks involved in handling big data can be solved using the concepts of Hadoop, and enhancing Hadoop helps the user to process large sets of images or data. The Advanced Hadoop Distributed File System (AHDF) and MapReduce are the two main functions used to enhance Hadoop. HDFS is Hadoop's file storage system, used for storing and retrieving data. MapReduce is the combination of two functions, namely map and reduce: map splits the inputs, and reduce integrates the outputs of the map step. Recently, medical fields have experienced problems such as machine failure and fault tolerance while processing results for scanned data. A unique optimized time-scheduling algorithm, called the Advanced Dynamic Handover Reduce Function (ADHRF) algorithm, is introduced in the reduce function. Enhancing Hadoop and the cloud with ADHRF helps to overcome processing risks and to obtain optimized results with less waiting time and a lower error percentage in the output image.
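A minimal pure-Python sketch of the map and reduce steps as described above, using word counting as the task; this illustrates the programming model only and does not use Hadoop itself.

from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # "Map" splits the input into (key, value) pairs.
    for line in lines:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Sorting by key stands in for Hadoop's shuffle/sort step;
    # "reduce" then integrates the values for each key.
    pairs = sorted(pairs, key=itemgetter(0))
    for key, group in groupby(pairs, key=itemgetter(0)):
        yield (key, sum(v for _, v in group))

print(dict(reduce_phase(map_phase(["big data", "big images"]))))
# {'big': 2, 'data': 1, 'images': 1}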
Text mining has become one of the trending techniques incorporated into several research fields, such as computational linguistics, Information Retrieval (IR), and data mining. Natural Language Processing (NLP) methods are used to extract knowledge from text written by people. Text mining parses unstructured data to deliver meaningful information patterns in the shortest possible time. Social networking sites are a great source of communication, as most people use these sites in their daily lives to stay connected with each other. It has become common practice not to write sentences with correct grammar and spelling. This practice may introduce various kinds of ambiguity, lexical, syntactic, and semantic, and because of such unclear data it is hard to find the real information. Accordingly, we are conducting a survey with the aim of examining different text mining techniques used to process textual queries on social media sites. This review aims to describe how studies of social media have used text analysis and text mining techniques to identify the key topics in the data. The study concentrates on text mining work related to Facebook and Twitter, the two dominant social networks in the world. The results of this survey can serve as baselines for future text mining research.
Colorectal cancer (CRC) has the potential to spread within the peritoneal cavity, and this transcoelomic dissemination is termed "peritoneal metastases" (PM). The aim of this article was to summarise the current evidence regarding CRC patients at high risk of PM. Colorectal cancer is the second most common cause of cancer death in the UK. Prompt investigation of suspicious symptoms is important, but there is increasing evidence that screening for the disease can produce significant reductions in mortality. High-quality surgery is of paramount importance in achieving good outcomes, particularly in rectal cancer, but adjuvant radiotherapy and chemotherapy have important parts to play. The treatment of advanced disease is still essentially palliative, although surgery for limited hepatic metastases may be curative in a small proportion of patients.
This document summarizes a research paper on the thermal performance of air conditioners using nanofluids compared to base fluids. Key points:
- Nanofluids, which are liquids containing nanoparticles, can improve heat transfer in heat pipes and cooling systems due to their higher thermal conductivity compared to base fluids.
- The document reviews how factors like nanofluid type, nanoparticle size and concentration affect thermal efficiency and heat transfer limits. It also examines using nanofluids to enhance heat exchange in transmission fluids.
- An experimental setup is described to study heat transfer and friction factors of water-based Al2O3 nanofluids in a horizontal tube under constant heat flux. Temperature, pressure, and flow rate are measured.
Nowadays, the pedal-powered grinding machine is used only for grinding. It requires a lot of effort and is limited to a single application. Another problem with the existing model is that it consumes more time and has lower efficiency. Our aim is to design a human-powered machine which can be used for many purposes, such as pumping, grinding, washing, and cutting; it can carry water to a height of 8 meters and produce 4 amperes of electricity in the most effective way. The system is also useful for health-conscious workout purposes. The purpose of this technical study is to increase the performance and output capacity of the pedal-powered grinding machine.
This document summarizes a research paper that proposes using distributed control of multiple energy storage units (ESUs) to manage voltage and loading in electric distribution networks with renewable energy sources like solar and wind. The distributed control approach coordinates the ESUs to store excess power generated during peak periods and discharge it during peak load periods. Each ESU can provide both active and reactive power to support voltage and manage power flows. The distributed control strategy uses a consensus algorithm to divide the required active power reduction equally among ESUs based on their available capacity. Simulation results are presented to analyze the coordinated control of ESU active and reactive power outputs over time.
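A minimal sketch of a consensus-style iteration in which each ESU repeatedly averages a utilisation ratio with its neighbours until all units agree on a common value, so each contributes in proportion to its capacity; the topology and numbers are illustrative assumptions, not the paper's test system.

# Average-consensus sketch over a line network of four storage units.
ratios = [0.9, 0.2, 0.5, 0.4]                        # initial ratio per ESU
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # communication links

for _ in range(200):  # iterate until (near) agreement
    ratios = [
        sum(ratios[j] for j in [i] + neighbours[i]) / (1 + len(neighbours[i]))
        for i in range(len(ratios))
    ]
print([round(r, 3) for r in ratios])  # all ESUs converge to a common ratio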
The steady increase in non-linear loads on the power supply network, such as AC variable speed drives, DC variable speed drives, UPS, inverters, and SMPS, raises issues about power quality and reliability. In this context, attention has been focused on harmonics. Harmonics overload the power system network, cause reliability problems in equipment and systems, and waste energy. Passive and active harmonic filters are used to mitigate harmonic problems, and the use of both is justified. The difficulty for practicing engineers is selecting and deploying the correct harmonic filters. This paper explains which solutions are suitable when choosing active and passive harmonic filters and also explains the mistakes that need to be avoided.
This paper analyzes a few important power system equipment failures that commonly occur in industrial power distribution systems. If such general problems are not resolved, they may lead to huge production stoppages and unforeseen equipment damage. We can improve the reliability of the power system simply by applying a problem-solving tool to every case study: finding the root cause of the problem, validating the root cause, and eliminating it through corrective measures. This problem-solving approach should be practiced every day to improve power system reliability. This paper will serve as a guide for practicing electrical engineers in finding solutions to the problems they come across in their day-to-day maintenance activities.
This study Examines the Effectiveness of Talent Procurement through the Imple...DharmaBanothu
In a world of high technology and a fast-forward mindset, recruiters are showing interest in E-Recruitment. At present, the HR departments of many companies are choosing E-Recruitment as the best choice for recruitment. E-Recruitment is done through many online platforms such as LinkedIn, Naukri, Instagram, and Facebook. With advancing technology, E-Recruitment has now gone to the next level through the use of Artificial Intelligence. This study discusses the effectiveness of talent acquisition through E-Recruitment across four important and interlinked topics.
Keywords: Talent Management, Talent Acquisition, E-Recruitment, Artificial Intelligence
Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...Transcat
Join us for this solutions-based webinar on the tools and techniques for commissioning and maintaining PV Systems. In this session, we'll review the process of building and maintaining a solar array, starting with installation and commissioning, then reviewing operations and maintenance of the system. This course will review insulation resistance testing, I-V curve testing, earth-bond continuity, ground resistance testing, performance tests, visual inspections, ground and arc fault testing procedures, and power quality analysis.
Fluke Solar Application Specialist Will White is presenting on this engaging topic:
Will has worked in the renewable energy industry since 2005, first as an installer for a small east coast solar integrator before adding sales, design, and project management to his skillset. In 2022, Will joined Fluke as a solar application specialist, where he supports their renewable energy testing equipment like IV-curve tracers, electrical meters, and thermal imaging cameras. Experienced in wind power, solar thermal, energy storage, and all scales of PV, Will has primarily focused on residential and small commercial systems. He is passionate about implementing high-quality, code-compliant installation techniques.
Blood finder application project report (1).pdfKamal Acharya
Blood Finder is an emergency app with which a user can search for blood banks as well as registered blood donors around Mumbai. The application also provides an opportunity for its users to become registered donors; to do so, a user has to submit a donor request from the application itself, and the admin can register the user as a donor after completing some formalities with the organization. A special feature of this application is that the user does not have to register or sign in to search for blood banks and blood donors; this can be done simply by installing the application on a mobile device. The purpose of this application is to save the user's time when searching for blood of the needed blood group during an emergency. It is an Android application developed in Java and XML with connectivity to an SQLite database, and it provides most of the basic functionality required of an emergency application. All details of blood banks and blood donors are stored in the SQLite database. The application lets the user retrieve all information regarding blood banks and blood donors, such as name, number, address, and blood group, rather than searching different websites and wasting precious time. The application is effective and user friendly.
Build the Next Generation of Apps with the Einstein 1 Platform.
Join Philippe Ozil for a workshop session that will guide you through the details of the Einstein 1 platform, the importance of data for building artificial intelligence applications, and the various tools and technologies Salesforce offers to bring you the full benefits of AI.
Generative AI Use cases applications solutions and implementation.pdfmahaffeycheryld
Generative AI solutions encompass a range of capabilities from content creation to complex problem-solving across industries. Implementing generative AI involves identifying specific business needs, developing tailored AI models using techniques like GANs and VAEs, and integrating these models into existing workflows. Data quality and continuous model refinement are crucial for effective implementation. Businesses must also consider ethical implications and ensure transparency in AI decision-making. Generative AI's implementation aims to enhance efficiency, creativity, and innovation by leveraging autonomous generation and sophisticated learning algorithms to meet diverse business challenges.
https://www.leewayhertz.com/generative-ai-use-cases-and-applications/
Supermarket Management System Project Report.pdfKamal Acharya
Supermarket management is a stand-alone J2EE using Eclipse Juno program.
This project contains all the necessary required information about maintaining
the supermarket billing system.
The core idea of this project to minimize the paper work and centralize the
data. Here all the communication is taken in secure manner. That is, in this
application the information will be stored in client itself. For further security the
data base is stored in the back-end oracle and so no intruders can access it.
Review of Effort Distribution in IT Companies
Reetuparna Mukherjee, Taniya Gupta, Mythili Thirugnanam
School of Computer Science and Engineering, VIT University, Vellore
Abstract:
Effort distribution, be it by phase or by activity, is an important aspect of the SDLC, yet it is often overlooked in the process of cost estimation. Poor effort allocation is one of the root causes of rework, owing to early activities being insufficiently resourced. This paper presents various phase effort distribution patterns and their sources of variation. The analysis of these patterns shows some consistency in the effects of software size and team size on code and test phase distribution variations, and some considerable deviations in the design, requirements, and transition phases compared with the recommendations of the COCOMO model. Software size categories, in turn, differ between small-medium and medium-large companies, which use different sizing schemes. Finally, the paper discusses threats to validity and presents general guidelines for directing effort distribution across software development methods, the durations of the SDLC phases, and team strength. Based on these factors, effort distribution can be estimated.

Keywords — COTS, COCOMO, RUP, SLIM, SEER-SEM, ANOVA
INTRODUCTION
Effort distribution methods have been studied and have evolved over the past few decades. The methods employed include knowledge-based modeling, parametric modeling, fuzzy logic, neural networks, dynamic modeling, and case-based reasoning, all aimed at increasing the accuracy of estimation and improving estimation capability with respect to existing development paradigms, e.g. COTS or open-source-based development. However, despite the intensive study of estimation methods, recent studies show that it is still a great challenge for software projects to be completed successfully. Major limitations impede software projects from meeting the success criteria, viz. finishing within budget, on time, and with the specified functionality. Limited progress in fully understanding the characteristics of emergent development paradigms and their implications for cost estimation complicates process-oriented decisions in design imperatives and cost reconciliation. Even for the waterfall development model, there has been restricted research effort towards understanding and analyzing the root causes that lead to variations in effort distribution patterns.

Effort distribution, be it by phase or by activity, is an important aspect of the SDLC, yet it is often overlooked in the process of cost estimation. Bad effort distribution is among the significant causes of rework due to insufficiently resourced activities. Distribution of effort in the software engineering process has been the platform for facilitating more reasonable software project development planning, and it is provided as one of the major functionalities in most planning or estimating methods, for example as a work breakdown structure of the total estimate over different phases or activities with a corresponding percentage of effort associated with each phase. With numerous estimation methods and tools becoming available, users are often left with greater difficulty and confusion in effort distribution because of the large variation in SDLC activities. Many software practitioners rely solely on rules of thumb or expert judgment to handle phase effort planning. Compared with a total estimate, it is more important to develop an accurate phase-based distribution in order to facilitate strategic resource allocation and planning. This paper investigates the general effort distribution profiles of different development types, software sizes, team sizes, business areas, etc.
EXISTING WORK ANALYSIS
Assessments conducted on software development project data collected from various software firms have produced several widely adopted bases for guiding software estimation practices. Norden [14], in his work in the cost estimation field, found the Rayleigh curve to be a good approximation of staffing levels on hardware projects. Though inapplicable to software development projects owing to their slow early build-up and long-tail effects, the Rayleigh distribution has influenced many later models in terms of effort distribution, such as COCOMO [6, 11] and SLIM [15]. Based on studies conducted on effort distribution data, the waterfall activity breakdown quantities in the COCOMO model [6, 11] were presented by Boehm, and the lifecycle effort distribution
structure used by the Rational Unified Process (RUP) [12] was given by Kruchten. The COCOMO 81 model's [11] way of handling phase distribution is to provide effort ratios for every development phase of the SDLC. It also defines 5 levels of project scale, 3 development modes, and phase-sensitive effort multipliers to help derive more accurate allocation decisions according to specific project needs. As per the COCOMO model, as software size increases, the integration and test phase should be given more focus than the code phase. This enables users to obtain more accurate phase-wise estimates by keeping a record of the different effort ratings, and it offers a method to adapt the basic distribution scheme in accordance with the project requirements and situation. However, use of the detailed COCOMO model entails a complex procedure involving forms and tables with restrictive steps, which requires intensive project knowledge. The COCOMO II [6] model combines effort multipliers that are the same for all development phases, owing to the lack of detailed COCOMO calibration data and for simplification of model usage. COCOMO II extends the COCOMO 81 model to include the "plans and requirements" and "transition" phases in its waterfall lifecycle effort distribution scheme, together with an MBASE/RUP phase/activity distribution scheme adopted from [12]. This takes a toll on flexibility in providing effort distribution guidance relevant to project needs, as compared to the COCOMO 81 model.
There exists large variation in the SDLC phases across different estimation methods, resulting in a lack of phase distribution data. This in turn complicates other processes, viz. synthesis of lifecycle concepts, model usage, collection and analysis of data, and cost model calibration.

COCOMO 81 [11] distributes effort among plans and requirements, preliminary design, detailed design (25%), code (33%), and integration & test (25%), as shown in the figure below.

Fig. 1. Effort Distribution in COCOMO 81

Figure 2 shows the COCOMO II [16] waterfall distribution scheme: planning & requirements (7%), preliminary design (17%), detailed design (25%), code (33%), integration & test (25%), and deployment & maintenance (12%).

Fig. 2. Effort Distribution in COCOMO II

The COCOMO II [16] MBASE/RUP distribution scheme allots Inception (6%), Elaboration (24%), Construction (76%), and Transition (12%), as shown in the figure below.

Fig. 3. Effort Distribution in COCOMO II MBASE/RUP

RUP [12] allots 5% to Inception, 20% to Elaboration, 65% to Construction, and 10% to Transition, as shown in the figure below.

Fig. 4. Effort Distribution in RUP

COCOTS [6] distributes effort among Assessment, Tailoring, and Gluecode & Integration; there is no unified distribution guideline. SLIM [16] distributes among Concept Definition,
Requirement & Design, Construct & Test, and Perfective Maintenance. SEER-SEM [17] allots effort among early specification, design, development, and delivery & maintenance.
Heijstek and Chaudron [10] reported empirical data on effort distribution in model-based development and confirmed the similarity between the original RUP hump chart and the reported distribution [12]. Yiftachel et al. [18] proposed an economic model for optimal resource allocation across the various phases of development. Optimal effort distribution across phases based on defect-introduction and defect-slippage considerations has been studied [19, 20], but notable inconsistency remains in the phase distribution results from different researchers.
DATA COLLECTION AND CLEANSING
Data collection: A study conducted by CSBSG sent an electronic questionnaire to a number of organizations. The questionnaire could check for some errors automatically, such as data inconsistency and spelling errors. On receipt of data from an organization, the data quality is checked by a group of experts to confirm whether it can be added to the database. Throughout this process, a designated contact person mediates between the expert group and the organization; all identifying information about the organization is hidden in the data submitted to the expert group, and the contact person responds to any data problem that is found.

Data cleaning: We use only a subset of the data, restricted to the attributes of interest for this study. The data cleaning steps and attributes are summarized below.
Table 1. Attributes covered in the study

METRIC                     UNIT         DESCRIPTION
Size                       SLOC         Total lines of code
Effort                     Person-hour  Summary work effort
Plan phase effort          Person-hour  Work effort of plan phase
Requirements phase effort  Person-hour  Work effort of requirements phase
Design phase effort        Person-hour  Work effort of design phase
Code phase effort          Person-hour  Work effort of code phase
Test phase effort          Person-hour  Work effort of test phase
Transition phase effort    Person-hour  Work effort of transition phase
Development life cycle     Nominal      Waterfall, iterative, rapid
Team size                  Person       Maximum size of the development team
Development type           Nominal      New development, enhancement, redevelopment
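As a rough illustration of the cleaning step, a minimal pandas sketch that keeps the studied attributes and derives per-phase effort percentages; the column names and the two sample rows are assumptions modelled on Table 1, not actual CSBSG records.

import pandas as pd

# Two synthetic project records shaped like the attributes in Table 1.
df = pd.DataFrame({
    "size_sloc":     [12000, 48000],
    "effort_ph":     [2400, 9100],           # summary work effort, person-hours
    "dev_type":      ["new", "enhancement"],
    "team_size":     [6, 14],
    "plan_req_ph":   [380, 1500],
    "design_ph":     [350, 1300],
    "code_ph":       [980, 3600],
    "test_ph":       [500, 1900],
    "transition_ph": [190, 800],
})

# Derive each phase's share of the summary work effort.
for col in ["plan_req_ph", "design_ph", "code_ph", "test_ph", "transition_ph"]:
    df[col.replace("_ph", "_pct")] = 100 * df[col] / df["effort_ph"]

print(df.filter(like="_pct").round(1))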
RESULTS AND ANALYSIS
The table below summarizes the statistical features of the effort distribution percentage for each of the five development phases. The data has been collected from the CSBSG.
Table 2. Summary of phase distribution profile
*Plan&Req: Planning and Requirement; **Trans: Transition
As per [6], "major variations in both waterfall and RUP phase distribution quantities come in the phases outside the core development phases". The collected data shows that the coding and testing phases exhibit greater variation than the other phases; e.g., the project with the smallest design phase effort share is also the one with the greatest coding effort share.
Fig. 5. Comparison of the mean distribution profile of the CSBSG dataset and the COCOMO II waterfall distribution quantities

This comparison helps us examine the differences in individual phase distribution between the CSBSG dataset and the COCOMO II recommendations. The two agree only in the last two phases, while there are significant differences elsewhere. The CSBSG dataset lays greater emphasis on the Planning and Requirements phase, allotting 16.14% of overall effort to this phase as compared to a mere 6% in COCOMO II. Also, the design phase receives only 14.88% of effort in CSBSG, whereas COCOMO II allots 35%. The average distribution for the code phase is 40.36% in CSBSG and 27.8% in COCOMO II.
ANOVA analysis is used to examine how much of the variance in effort distribution percentage for each phase is explained by the class variables, i.e. software size, team size, and development type.
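A minimal sketch of such a test using scipy, checking whether the code-phase percentage differs across development types; the group values below are synthetic, not CSBSG data.

from scipy.stats import f_oneway

# Code-phase effort share (%) per project, grouped by development type.
new_dev       = [42.0, 39.5, 45.1, 41.2]
enhancement   = [30.8, 33.0, 29.4, 31.6]
redevelopment = [38.2, 36.9, 40.0, 37.5]

f_stat, p_value = f_oneway(new_dev, enhancement, redevelopment)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")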
The data summarized in Table 3 shows the degree of variance in individual phase effort distribution explained by each influencing factor. For example, it indicates that FP is a stronger predictor for measuring effective software size and predicting development effort, and that development type is one of the major factors for determining appropriate adjustments for the code and test phases. Team size, however, is a less significant factor in effort distribution.
Table 3. ANOVA Results

Phase                   Development  Software Size  Software Size  Team
Distribution (%)        Type         (LOC Scale)    (FP Scale)     Size
Planning & Requirement  3.96         14.80          13.65          5.50
Design                  2.57         3.67           3.36           0.62
Code                    12.22        7.93           20.56          0.02
Test                    7.47         7.16           14.59          3.05
Transition              2.72         9.36           22.45          1.52
Parametric models such as these are considered the current methods for estimation and benchmarking purposes. Because of the wide scope and variety of software projects, it is infeasible to deploy a single distribution scheme encompassing all projects. Analysis of empirical data helps in understanding the variations in phase effort distribution and their possible causes, and identifying the most frequently occurring effort distribution patterns can yield insightful conclusions leading to improved project planning practices. The following are general guidelines for software estimation and management.
- Phase effort distribution analysis should be performed along with cost estimation.
- FP-based software size and development type must be considered in effort distribution.
- For enhancement-type projects, the percentage of development effort in the code phase decreases by 10.67%, and that for testing increases by 5.4% (see the sketch after this list).
- As software size increases, the distribution focuses more intensively on both coding and testing.
- Team size has a significant effect on the testing phase; hence team size should be decided accordingly.
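A minimal sketch applying the enhancement-project guideline above to a baseline profile, reading the quoted changes as percentage-point shifts; the plan&req, design, and code shares are the CSBSG means quoted earlier, while the test and transition shares are assumed fillers so the baseline sums to 100%.

# Shift the baseline distribution for an enhancement-type project.
baseline = {"plan&req": 16.14, "design": 14.88, "code": 40.36,
            "test": 20.62, "transition": 8.00}  # test/transition assumed

def adjust_for_enhancement(dist):
    adjusted = dict(dist)
    adjusted["code"] -= 10.67  # code share drops per the guideline
    adjusted["test"] += 5.4    # test share rises per the guideline
    return adjusted

for phase, pct in adjust_for_enhancement(baseline).items():
    print(f"{phase:12s} {pct:6.2f}%")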
CONCLUSION
Lack of recognition of process variations and lack of understanding of effort distribution patterns lead to poor effort distribution. This study examines effort distribution profiles to gain an in-depth understanding of the variations in phase effort distribution and their causes. It identifies significant differences from and similarities with COCOMO II and further analyses patterns in phase effort distribution. Guidelines for efficient effort distribution among the phases of the SDLC have been provided for improved project management.
REFERENCES
[1] Boehm, B. and Abts, C.: Software Development Cost Estimation Approaches - A Survey. Annals of Software Engineering, 2000. 10(1-4): p. 177-205.
[2] Jorgensen, M. and Shepperd, M.: A Systematic Review of Software Development Cost Estimation Studies. IEEE Transactions on Software Engineering, 2007. 33(1): p. 33-53.
[3] Basili, V.R.: Software development: a paradigm for the future. Proceedings of the 13th Annual International Computer Software and Applications Conference (1989) 471-485.
[4] A Framework Analysis of the Open Source Software Development Paradigm. Proceedings of the 21st International Conference on Information Systems (ICIS 2000) (2000) 58-69.
[5] The Standish Group: 2004 3rd Quarter Research Report (2004). http://www.standishgroup.com
[6] Boehm, B.W., et al.: Software Cost Estimation with COCOMO II. Prentice Hall, NY (2000)
[7] Reifer, D.: Industry software cost, quality and productivity benchmarks. Software Tech News 7(2) (July 2004)
[8] Kroll, P.: Planning and estimating a RUP project using IBM Rational SUMMIT Ascendant. Technical report, IBM developerWorks (May 2004)
[9] QSM Inc.: The QSM Software Almanac: Application Development Series (IT Metrics Edition). Application Development Data and Research for the Agile Enterprise. Quantitative Software Management Inc., McLean, Virginia, USA (2006)
[10] Heijstek, W. and Chaudron, M.R.V.: Effort distribution in model-based development. 2nd Workshop on Model Size Metrics (2007)
[11] Boehm, B.W.: Software Engineering Economics. Prentice Hall (1981)
[12] Kruchten, P.: The Rational Unified Process: An Introduction. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA (2003)
[13] Wohlin, C.: Distribution Patterns of Effort Estimation. EUROMICRO Conference (2004)
[14] Norden, P.V.: Curve Fitting for a Model of Applied Research and Development Scheduling. IBM J. Research and Development, Vol. 3, No. 2 (1958) 232-248.
[15] Putnam, L. and Myers, W.: Measures for Excellence. Yourdon Press Computing Series (1992)
[16] He, M., et al.: An Investigation of Software Development Productivity in China. ICSP 2008 (2008) 381-394
[17] SEER-SEM. http://www.galorath.com/index.php
[18] Yiftachel, P., et al.: Resource Allocation among Development Phases: An Economic Approach. EDSER '06, May 27, 2006, Shanghai, China (2006) 43-48
[19] Huang, L. and Boehm, B.: Determining how much software assurance is enough? A value-based approach. In: EDSER '05: Proceedings of the Seventh International Workshop on Economics-Driven Software Engineering Research, NY, USA, ACM Press (2005) 1-5
[20] Yiftachel, P., Peled, D., Hadar, I., Goldwasser, D.: Resource allocation among development phases: an economic approach. In: EDSER '06: Proceedings of the 2006 International Workshop on Economics-Driven Software Engineering Research, New York, NY, USA, ACM Press (2006) 43-48.
[21] Jiang, Z., Naudé, P. and Comstock, C.: An investigation on the variation of software development productivity. International Journal of Computer, Information, and Systems Sciences, and Engineering, Vol. 1, No. 2 (2007) 72-81
[22] ISBSG Benchmark Release 8. http://www.isbsg.org
[23] SPR Programming Languages Table 2003, Software Productivity Research. http://www.spr.com
[24] Agrawal, M. and Chari, K.: Software Effort, Quality and Cycle Time: A Study of CMM Level 5 Projects. IEEE Transactions on Software Engineering, Vol. 33, No. 3 (2007) 145-156
[25] Software Measurement Services Ltd.: "'Small project', 'medium-size project' and 'large project': what do these terms mean?" http://www.measuresw.com/library/Papers/Rule/RulesRelativeSizeScale%20v1b.pdf
[26] http://www.csbsg.org