The document proposes a model for estimating the resources needed to develop software using the Objectory process. The model is based on Function Points, which count the types of inputs, outputs, inquiries, etc., to determine the size of the system, and then adjust for technical complexity factors and new environmental factors. The document presents this Use Case Points model, shows how to calculate the unadjusted use case points, the technical complexity factor and the environmental factor, and finally validates the model using data from three projects. In summary, the model provides an early way to estimate resources for Objectory projects based on use case analysis and adjustment factors.
Resource Estimation for Objectory Projects
Gustav Karner
Objective Systems SF AB
Torshamnsgatan 39, Box 1128
164 22 Kista
email: gustav@os.se
September 17, 1993
Abstract
In order to estimate the resources needed to develop a software system with the Objectory process, one would like a model that predicts the total amount of resources early in the development process. The model described in this article helps to make such a prediction.
1. Introduction
Objectory, see Jacobson I., Christerson M., Jonsson P. and Övergaard G. (1992), is a well defined process for developing industrial object oriented applications. The process is built up of four main processes:
• Requirements Analysis
• Robustness Analysis
• Construction
• Testing
It would be of great value if a prediction of the resources needed for these processes could be made for a specific project early in the development process, e.g. after the requirements analysis. With such an early estimate it becomes much easier to plan and predict the rest of the project.
When a project is set up it is very important to know, in advance, how many resources are needed for the project to be completed. This knowledge helps in estimating the cost and the lead time of the project, and in planning the use of the resources. These estimates should of course be available as early as possible in the project.
This paper will first survey a model for making early estimations called Function Points. It will then present a model for making early estimations when developing software with the Objectory process. The model is based on the Function Points.
2. Function Points
Function Points (FP) is a common estimation model, proposed by Albrecht (1979). It is useful when man hours must be estimated in an early phase, when only the specifications are available.
To estimate the size of the system, the model counts the number and complexity of:
1. Inputs: the number of different commands the software will accept.
2. Outputs: how many types of information it can generate.
3. Inquiries: how many different sorts of questions a user can ask the system.
4. Files: how many it can cope with simultaneously.
5. Interfaces: the number of links it can have with other software.
Every one of these items is given a value of simple, average or complex, with a corresponding weight (W_i). The Unadjusted Function Count (UFC) is:

UFC = Σ (i = 1 to 5) n_i * W_i

where n_i is the number of items of variety i, and W_i is the weight of variety i.
These counts are then adjusted with a technical factor, which describes the technical complexity involved in the development and implementation of the system. The TCF is computed as:

TCF = C1 + C2 * Σ (i = 1 to n) F_i

where C1 = 0.65, C2 = 0.01, and the F_i are factors valued from 0 to 5: 0 if a factor is irrelevant and 5 if it is essential.
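To make the calculation concrete, here is a minimal Python sketch of the UFC, TCF and FP formulas above, together with the LOC mapping discussed in this section. The counts and weights in the example are hypothetical; Albrecht's method defines its own weight tables for the simple/average/complex classes, which are not reproduced in this paper.

```python
# Sketch of the Function Point calculation described above.
# The example counts and weights are hypothetical, not Albrecht's tables.

def unadjusted_function_count(items):
    """items: (count, weight) pairs for the five varieties
    (inputs, outputs, inquiries, files, interfaces)."""
    return sum(n * w for n, w in items)

def technical_complexity_factor(factors, c1=0.65, c2=0.01):
    """factors: F_i ratings, each from 0 (irrelevant) to 5 (essential)."""
    return c1 + c2 * sum(factors)

def function_points(items, factors):
    # FP = UFC * TCF
    return unadjusted_function_count(items) * technical_complexity_factor(factors)

# Hypothetical system: 10 inputs (weight 4), 7 outputs (5), 4 inquiries (4),
# 3 files (10), 2 interfaces (7); all technical factors rated 3.
items = [(10, 4), (7, 5), (4, 4), (3, 10), (2, 7)]
factors = [3] * 14
fp = function_points(items, factors)   # UFC = 135, TCF = 1.07, FP = 144.45
loc = fp * 110                         # assuming ~110 lines of COBOL per FP
```

With a historical LOC-per-FP figure such as the 110 lines of COBOL mentioned in this section, the FP value converts directly into a size estimate that can feed a COCOMO-style effort model.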
The Use Case Points (UCP) are the product of these three
factors. The UCPs gives an estimation of the size of the effort
to develop the system which can be mapped to man hours 1 to
complete various phases of Objectory or complete the whole
project.
The Use Case Points can also be mapped to for example the
number of classes or LOC and from that estimate man hours
needed with help from a model like the COCOMO model.
The reason to do so is that the COCOMO model does not
approximate the mapping as linear.
The weights in this article are a first approximation by people
at Objective Systems.
The Function Points are finally:
FP = UFC* TCF
3.1 UUCP - Unadjusted Use Case Point
When analysing the result we must have a statistical value
from previous measures of how many Lines Of Code (LOC)
a function point will need to be constructed. This value is
multiplied with the number of function points of the system
and we get the total amount of LOC needed. Typically one
Function Point is 110 lines of COBOL code. This resulting
value maybe used together with the COCOMO model, see
Boehm (1981), to estimate the resources needed.
To compute the UUCPs judge every actor if it is a simple,
average or complex with help from Table 1 and every use
case with help from Table 2. Only concrete actors and use
cases are counted.
This model has been very popular recently. One of this
model's greatest advantages is that it does not require a
specific way of describing the system, for example a specific
design method. The model has been criticised and some
weakness has been noted, see Symons (1988). Some
disadvantages is that function points cannot be computed
automatically, it is not objective since you have to make
many subjective decisions. This is because every input,
output, inquire, file and interfaces must be valued as simple,
average or complex and the technical factors must be valued
by a human too. Furthermore it does not take into account
any personnel factors.
Complexity
Definition
SIMPLE
An actor is simple if it represents 1
another system with a defined
application programming
interface.
AVERAGE
An actor is average if it is:
2
1. An interaction with another
system through a protocol
2. A human interaction with a
line terminal.
COMPLEX
3. Use Case Points
Our model Use Case Points is inspired by Function Points
but with the benefit of the requirements analysis in the
Objectory process. It starts with measuring the functionality
of the system based on the use case model in a count called
Unadjusted Use Case Point (UUCP). Technical factors involved in developing this functionality are assessed, similar
to the Function Points. The last step in the estimation is
however not from the Function Points and it is a new factor
called Environmental Factor proposed by the author. This
factor seems to be very important according to experienced
Objectory users.
Weight
An actor is complex if it interacts 3
through a graphical user
interface.
Table 1 Weighted actors.
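To make the bookkeeping of section 3.1 concrete, here is a minimal Python sketch (the function name and input format are ours, not part of the model), using the actor weights from Table 1 and the use case weights from Table 2:

```python
# Actor and use case weights as given in Table 1 and Table 2.
ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}

def uucp(actors, use_cases):
    """Unadjusted Use Case Points: sum of n_i * W_i over the complexity
    classes of the (concrete) actors and use cases."""
    total = sum(n * ACTOR_WEIGHTS[c] for c, n in actors.items())
    total += sum(n * USE_CASE_WEIGHTS[c] for c, n in use_cases.items())
    return total

# Project A from section 4.1: 5 average actors and 10 average use cases.
print(uucp({"average": 5}, {"average": 10}))  # 110
```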
Complexity | Definition | Weight
SIMPLE | A use case is simple if it has 3 or fewer transactions, including alternative courses. You should be able to realise the use case with fewer than 5 analysis objects. | 5
AVERAGE | A use case is average if it has 3 to 7 transactions, including alternative courses. You should be able to realise the use case with 5 to 10 analysis objects. | 10
COMPLEX | A use case is complex if it has more than 7 transactions, including alternative courses. The use case should need at least 10 analysis objects to be realised. | 15

Table 2 Weighted use cases.

We sum the weights of the actors and the use cases together
to get the UUCP:

UUCP = ∑ ni * Wi

where ni is the number of items of variety i and Wi is the
weight of i.

If we do not have more information about the implementation,
the project environment and so on, we can use the UUCP for
our estimation. Otherwise we adjust the UUCP to get a better
estimation.

Note: the mapping from UCP to man hours is a first
approximation; within a given interval, from 2 000 to 20 000
man hours, the mapping can be done linearly.

Fi | Factors Contributing to Complexity | Wi
F1 | Distributed systems. | 2
F2 | Application performance objectives, in either response or throughput. | 1
F3 | End user efficiency (on-line). | 1
F4 | Complex internal processing. | 1
F5 | Reusability: the code must be reusable in other applications. | 1
F6 | Installation ease. | 0.5
F7 | Operational ease, usability. | 0.5
F8 | Portability. | 2
F9 | Changeability. | 1
F10 | Concurrency. | 1
F11 | Special security features. | 1
F12 | Provide direct access for third parties. | 1
F13 | Special user training facilities. | 1

Table 3 Factors contributing to complexity.
3.2 TCF - Technical Complexity Factor

The UUCP is weighted with the technical complexity factor
(TCF), which varies depending on how difficult the system
will be to construct. The TCF is nearly the same as for
Function Points; the differences are that we have added some
factors and removed others, and that we have weighted the
factors differently, based on experience from Objectory
projects.

Symons (1988) defines a criterion for a technical factor:

"A system requirement other than those concerned with
information content intrinsic to and affecting the size of the
task, but not arising from the project environment"

The constants and weights were proposed by Albrecht (1979),
but C1 is decreased from 0.65 to 0.6 to fit the number of
factors. The TCF is computed like this:

TCF = C1 + C2 ∑(i=1..13) Fi * Wi

where C1 = 0.6 and C2 = 0.01. Fi is a factor which is rated
on the scale 0, 1, 2, 3, 4, 5: 0 means that it is irrelevant
and 5 means that it is essential. If a factor is neither
important nor irrelevant it has the value 3. If all factors
have the value 3, then TCF ≈ 1. The weights Wi are given in
Table 3.

3.3 EF - Environmental Factor

We weight the UCP with the Environmental Factor (EF), which
helps us estimate how efficient our project is. This factor
is on the same form as the technical factor:

EF = C1 + C2 ∑(i=1..8) Fi * Wi

where C1 = 1.4 and C2 = -0.03, and the factors Fi and weights
Wi are given in Table 4. The constants are early estimations.
They seem to be reasonable, based on interviews with
experienced Objectory users at Objective Systems.
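As a sketch of the two adjustment formulas, assuming the weights exactly as listed in Tables 3 and 4 (the function names are ours):

```python
# Weights W_i taken from Table 3 (technical) and Table 4 (environmental).
TECH_WEIGHTS = [2, 1, 1, 1, 1, 0.5, 0.5, 2, 1, 1, 1, 1, 1]  # F1..F13
ENV_WEIGHTS = [1.5, -1, 0.5, 0.5, 1, 1, -1, 2]              # F1..F8

def tcf(factors):
    """TCF = 0.6 + 0.01 * sum(F_i * W_i), each F_i rated 0..5."""
    return 0.6 + 0.01 * sum(f * w for f, w in zip(factors, TECH_WEIGHTS))

def ef(factors):
    """EF = 1.4 - 0.03 * sum(F_i * W_i), each F_i rated 0..5."""
    return 1.4 - 0.03 * sum(f * w for f, w in zip(factors, ENV_WEIGHTS))

# With every factor at the default value 3, both adjustments come out
# close to 1, as stated in the text.
print(round(tcf([3] * 13), 2))  # 1.02
print(round(ef([3] * 8), 3))    # 0.995
```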
Fi | Factors contributing to efficiency | Wi
F1 | Familiar with Objectory | 1.5
F2 | Part time workers | -1
F3 | Analyst capability | 0.5
F4 | Application experience | 0.5
F5 | Object oriented experience | 1
F6 | Motivation | 1
F7 | Difficult programming language | -1
F8 | Stable requirements | 2

Table 4 Factors contributing to efficiency.

Fi is a factor which is rated on the scale 0, 1, 2, 3, 4, 5:
0 means that it is irrelevant and 5 means that it is
essential. If a factor is neither important nor irrelevant it
has the value 3. If all factors have the value 3, then EF ≈ 1.

3.4 The Result and Analysis

Finally, the Use Case Points are calculated as:

UCP = UUCP * TCF * EF

Based on the calculated UCPs we look at statistics from
earlier projects to see how many resources are needed per
UCP. We then multiply the number of UCPs for our project by
the Mean Resources needed per UCP (MR). We also use the
Standard Deviation of the MR (SDMR) to see how good the
estimations are.

Different MR and SDMR values can be used for different
Objectory processes (for example a specific MR and SDMR for
the robustness analysis) instead of for the whole development.
We can also keep separate statistics for different application
domains, for example one for technical applications and one
for MIS applications, to get a better estimation.

The analysis of the result is approximated as linear. This
seems to be a good enough approximation for most projects.
Intuition tells us that the curve should be exponential, since
large projects typically have a lower productivity in general.

4. Projects Estimated

The model has been applied on a few projects of various sizes.
The results are presented in this chapter.

4.1 The Measurements

Data from 3 projects is used to validate the approach above.
Let us call these projects A, B and C.

Project A

The project developed an information system for operation
support of performance management in telecommunication
networks. It was a quite well defined project with only a few
people, who were newcomers to Objectory but very motivated.

Unadjusted Use Case Points:
Number of Actors: 5 average actors
Number of Use Cases: 10 average use cases
UUCP = 110

Technical Complexity Factor:
All the factors have the default value 3.
TCF = 1

Environmental Factor:
Familiar with the method = 1.
Motivation = 5.
Stable requirements = 4.
The rest of the factors have the default value 3.
EF = 0.975

UCP = UUCP * TCF * EF = 107.25

Resources Used:
Man Hours to complete the project: 2 150 h.
MR project A = 2150 / 107.25 ≈ 20.0
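The arithmetic for Project A chains together as follows; this is a worked check in Python, taking EF = 0.975 as reported above rather than recomputing it from the factor table:

```python
# Worked check of Project A (EF = 0.975 taken from the text as given).
uucp = 5 * 2 + 10 * 10  # 5 average actors (weight 2) + 10 average use cases (weight 10)
tcf, ef = 1.0, 0.975    # all technical factors at the default value 3; EF as reported
ucp = uucp * tcf * ef

man_hours = 2150        # actually spent on the project
mr = man_hours / ucp    # Mean Resources (man hours) needed per UCP

print(round(ucp, 2), round(mr, 1))  # 107.25 20.0
```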
Project B

The project developed a LAN management system. The
requirements were unstable and the developers had no previous
experience of Objectory.

Unadjusted Use Case Points:
Number of Actors: 5 average actors
Number of Use Cases: 50 average use cases
UUCP = 510

Technical Complexity Factor:
All the factors have the default value 3.
TCF = 1

Environmental Factor:
Familiar with the method = 1.
Stable requirements = 1.
The rest of the factors have the default value 3.
EF = 1.175

UCP = UUCP * TCF * EF ≈ 599.25

Resources Used:
Man Hours to complete the project: 12 500 h.
MR project B = 12 500 / 599.25 ≈ 20.9

Project C

The project developed a telecommunication system. The team
did not know much about Objectory.

Unadjusted Use Case Points:
Number of Actors: 5 average actors
Number of Use Cases: 15 average use cases
UUCP = 160

Technical Complexity Factor:
All the factors have the default value 3.
TCF = 1

Environmental Factor:
Familiar with the method = 1.
Application experience = 5.
Object oriented experience = 2.
Difficult programming language = 5.
Stable requirements = 2.
The rest of the factors have the default value 3.
EF = 1.175

UCP = UUCP * TCF * EF ≈ 188.0

Resources Used:
Man Hours to complete the project: 5 400 h.
MR project C = 5400 / 188 ≈ 28.7

4.2 The Result

The result is that there seems to be a correlation between
UCPs and the resources needed to finish a project. In figure
1 the UCPs are plotted against the number of man hours for the
projects mentioned here. We can see from the plot that one
UCP needs 10 000/540 ≈ 20 man hours to be completed.

The linear approximation gives us:

y = α + β(x − x̄)

where

x̄ = (1/n) ∑ xi ≈ 298.2
α* = ȳ ≈ 6.683
β* = ∑(xi − x̄)(yi − ȳ) / ∑(xi − x̄)² ≈ 0.01981

so that

y = 6.683 + 0.01981(x − 298.2) = 0.01981x + 0.7769

The residual sum of squares and the standard deviation are

Q0 = ∑[yi − α* − β*(xi − x̄)]² ≈ 54.64
s = √(Q0 / (n − 2)) ≈ 7.392

The t-distribution, with the probability 90 % that the values
lie within the interval, gives us t_α/2(n − 2) = t_0.05(1) = 6.31.

The interval for α:
d = s / √n ≈ 4.268
Iα = (α* ± t_0.05(1) * d) ≈ 6.683 ± 7.348 ⇒ [−0.665, 14.03]

The interval for β:
d = s / √(∑(xi − x̄)²) ≈ 0.01981
Iβ = (β* ± t_0.05(1) * d) ≈ 0.01981 ± 0.1250 ⇒ [−0.1052, 0.1448]

The intervals are much too wide to tell us anything. That is
because the number of measured projects is too small.
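The least-squares fit can be reproduced from the three (UCP, man hours) pairs of projects A, B and C; a short Python check (variable names ours):

```python
# Least-squares fit of thousands of man hours (y) against Use Case Points (x)
# for projects A, B and C, reproducing x_bar, alpha* and beta* above.
x = [107.25, 599.25, 188.0]  # UCP for projects A, B, C
y = [2.15, 12.5, 5.4]        # thousands of man hours for projects A, B, C

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n  # alpha* is simply the mean of y

# beta* = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
beta = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sum(
    (xi - x_bar) ** 2 for xi in x
)

print(round(x_bar, 1))  # 298.2
print(round(y_bar, 3))  # 6.683
print(round(beta, 5))   # 0.01981
```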
[Figure 1: thousands of man hours plotted against Use Case Points for
projects A, B and C, with an indicated slope of 10 000 man hours per
540 UCP.]

Figure 1 Man hours as a function of UCPs.

5. Conclusions
We do not yet know whether the model for estimating Objectory
projects, based on the use case model and some adjustments
for technical and environmental factors, works. That is
because we have too few projects to measure from so far. The
work will continue with metrics from many more projects,
because we still believe that this model could work as an
estimation method.
6. Further Work

More data from projects needs to be gathered to adjust the
model, the weights and the constants. Objective Systems will
have a database to which everybody who is using Objectory in
a project can send their data based on experience. Anonymity
will be guaranteed if requested. Please send an email to
magnus@os.se with the form in appendix A filled in.
References

Albrecht A. J. (1979). Measuring application development
productivity. Proc. of IBM Applic. Dev. Joint SHARE/GUIDE
Symposium, Monterey, CA, 1979, pp. 83-92.

Boehm B. W. (1981). Software Engineering Economics.
Prentice-Hall, New York, 1981.

Jacobson I., Christerson M., Jonsson P. and Övergaard G.
(1992). Object-Oriented Software Engineering. Addison-Wesley.

Symons C. R. (1988). Function Point Analysis: Difficulties
and improvements. IEEE Transactions on Software Engineering,
Vol. SE-14, No. 1, Jan. 1988.
Appendix A: Metrics Used to Adjust the Estimation Model

Actors

Complexity | Definition | Number of
SIMPLE | An actor is simple if it represents another system with a defined application programming interface. |
AVERAGE | An actor is average if it is: 1. An interaction with another system through a protocol, or 2. A human interaction with a line terminal. |
COMPLEX | An actor is complex if it interacts through a graphical user interface. |

Use Cases

Complexity | Definition | Number of
SIMPLE | A use case is simple if it has 3 or fewer transactions, including alternative courses. You should be able to realise the use case with fewer than 5 analysis objects. |
AVERAGE | A use case is average if it has 3 to 7 transactions, including alternative courses. You should be able to realise the use case with 5 to 10 analysis objects. |
COMPLEX | A use case is complex if it has more than 7 transactions, including alternative courses. The use case should need at least 10 analysis objects to be realised. |

Technical Complexity Factor

(If you cannot fill in these factors for any reason, use the default value 3 for every factor.)

Fi | Factors Contributing to Complexity | 0..5
F1 | Distributed systems. |
F2 | Application performance objectives, in either response or throughput. |
F3 | End user efficiency (on-line). |
F4 | Complex internal processing. |
F5 | Reusability: the code must be reusable in other applications. |
F6 | Installation ease. |
F7 | Operational ease, usability. |
F8 | Portability. |
F9 | Changeability. |
F10 | Concurrency. |
F11 | Special security features. |
F12 | Provide direct access for third parties. |
F13 | Special user training facilities. |

Environmental Factor

(If you cannot fill in these factors for any reason, use the default value 3 for every factor.)

Fi | Factors contributing to efficiency | 0..5
F1 | Familiar with Objectory |
F2 | Part time workers |
F3 | Analyst capability |
F4 | Application experience |
F5 | Object oriented experience |
F6 | Motivation |
F7 | Difficult programming language |
F8 | Stable requirements |

Resources used

Process | Man Hours