The Levels of Conceptual Interoperability Model (LCIM) is used to develop a method and model for enhancing interoperability among mobile apps. The LCIM serves in both a descriptive and a prescriptive form, and it also provides a metric for the degree of conceptual representation shared between interoperating systems. In its descriptive form, the LCIM reduces discrepancies in content-based ratings of mobile apps by suggesting a rating system grounded entirely in interoperability. In its prescriptive form, it informs app development, enabling the production of apps with a high level of interoperability. The LCIM thus provides the abstract backbone for developing and implementing an interoperability framework that supports the exchange of XML-based languages used by modeling and simulation (M&S) systems across the web.
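The descriptive use of the LCIM can be sketched in a few lines. The seven levels below are the standard LCIM levels; the rating function is an illustrative assumption, not the paper's exact rating system: a pair of apps can interoperate at most at the highest level both support.

```python
from enum import IntEnum

class LCIMLevel(IntEnum):
    """The seven levels of the Levels of Conceptual Interoperability Model."""
    NONE = 0        # no interoperability
    TECHNICAL = 1   # a communication channel exists
    SYNTACTIC = 2   # a common data format is agreed on
    SEMANTIC = 3    # the meaning of exchanged data is shared
    PRAGMATIC = 4   # the use and context of the data are understood
    DYNAMIC = 5     # state changes over time are understood
    CONCEPTUAL = 6  # a shared, documented conceptual model exists

def descriptive_rating(app_a: LCIMLevel, app_b: LCIMLevel) -> LCIMLevel:
    """Rate a pair of apps by the highest level both support:
    interoperability is bounded by the weaker participant."""
    return min(app_a, app_b)
```

The `min` reflects the intuition that, e.g., an app exposing only syntactic compatibility caps the pair at the syntactic level regardless of how capable its partner is.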
Graph-Based Algorithm for a User-Aware SaaS Approach: Computing Optimal Distr... (IJERA Editor)
As a tool to exploit economies of scale, Software-as-a-Service (SaaS) cloud models promote multi-tenancy, the notion of sharing instances among a large group of tenants. However, multi-tenancy only satisfies requirements that are common to all tenants, and tenants themselves are often hesitant about sharing. To address this problem, the present paper proposes a user-aware approach for SaaS models using Rich-Variant Components. The main contribution of this approach is a framework, summarized in a graph-based algorithm, that deduces an optimal distribution of instances over an application's tenants. To illustrate and evaluate the framework, the approach is applied to a SaaS application for private school management.
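The distribution idea can be sketched as follows. This is an illustrative simplification, not the paper's algorithm: tenants whose requirement sets are identical can safely share one instance, while tenants with distinct requirements get their own.

```python
# Group tenants into instance-sharing sets: all tenants with the same
# requirement set (the dict value) share a single application instance.
def distribute_instances(tenant_requirements: dict[str, frozenset[str]]) -> list[set[str]]:
    groups: dict[frozenset[str], set[str]] = {}
    for tenant, reqs in tenant_requirements.items():
        groups.setdefault(reqs, set()).add(tenant)
    return list(groups.values())
```

A real user-aware scheme would also weigh partial overlaps and tenants' sharing preferences when deciding which groups to merge.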
Semantic web based software engineering by automated requirements ontology ge... (IJwest)
This paper presents an approach for automated generation of a requirements ontology from UML diagrams in service-oriented architecture (SOA). The goal is to facilitate software engineering processes such as software design, software reuse, and service discovery. The proposed method is based on four conceptual layers. The first layer comprises requirements obtained from stakeholders; the second designs service-oriented diagrams from the data in the first layer and extracts their XMI code. The third layer comprises a requirements ontology and a protocol ontology that semantically describe the behavior of services and the relationships between them. Finally, the fourth layer standardizes the concepts in the ontologies of the previous layer. The generated ontology goes beyond a pure domain ontology because it considers the behavior of services in addition to their hierarchical relationships. Experimental results on a set of UML4SoA diagrams in different scopes demonstrate the improvement of the proposed approach from several points of view, such as completeness of the requirements ontology, automatic generation, and SOA coverage.
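The XMI-extraction step of the second layer can be sketched minimally. The tag and attribute names below are assumptions for illustration; real XMI exports vary by modeling tool and UML version:

```python
# Pull named UML elements from an XMI fragment as candidate ontology concepts.
import xml.etree.ElementTree as ET

def extract_concepts(xmi_text: str) -> list[str]:
    root = ET.fromstring(xmi_text)
    # Collect, in document order, every element carrying a 'name' attribute.
    return [el.attrib["name"] for el in root.iter() if "name" in el.attrib]

sample = '<xmi><packagedElement name="Order"/><packagedElement name="Customer"/></xmi>'
```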
General Methodology for developing UML models from UI (ijwscjournal)
In the recent past, every discipline and every industry has had its own methods of developing products, be it software development, mechanics, construction, psychology, and so on. These demarcations work fine as long as the requirements fall within one discipline. However, if a project extends over several disciplines, interfaces have to be created and coordinated between the methods of these disciplines. Performance is an important quality aspect of web services because of their distributed nature, and predicting the performance of web services during the early stages of software development is significant. In industry, a prototype of such applications is developed during the analysis phase of the Software Development Life Cycle (SDLC), whereas performance models are generated from UML models, and methodologies for predicting performance from UML models are available. Hence, this paper presents a methodology for developing a Use Case model and an Activity model from the User Interface. The methodology is illustrated with a case study on Amazon.com.
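One plausible reading of the UI-to-model step can be sketched as follows. The mapping rule and the screen description are hypothetical illustrations, not the paper's methodology: each actionable UI element on a screen suggests a candidate use case.

```python
# Map actionable widgets (buttons, links, forms) to candidate use-case names.
def derive_use_cases(ui_elements: list[dict]) -> list[str]:
    actionable = {"button", "link", "form"}
    return [e["label"] for e in ui_elements if e["type"] in actionable]

# A toy description of a product page, in the spirit of the Amazon.com case study.
screen = [
    {"type": "button", "label": "Add to Cart"},
    {"type": "link", "label": "Sign In"},
    {"type": "text", "label": "Product description"},  # not actionable
]
```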
Adaptive guidance model based similarity for software process development pro... (ijseajournal)
This paper describes a modeling approach, SAGM (Similarity for Adaptive Guidance Model), that provides adaptive and recursive guidance for software process development. In accordance with developer needs, the approach allows guidance tailored to the profiles of developers. A profile is partially or completely defined from a model of developers, through their roles, their qualifications, and the relationships between the context of the current activity and the model of the defined activities. The approach aims to define a generic profile of the development context and a similarity measure that evaluates the similarities between the profiles created from the model of developers and those of the development team involved in the execution of a software process. The goal is to classify the profiles and deduce the appropriate type of assistance to developers (corrective, constructive, or specific).
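A profile-similarity measure of the kind described can be sketched generically. Cosine similarity over profile attribute vectors is a common choice and stands in here for the paper's (unspecified) measure:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity of two profile attribute vectors, in [0, 1] for
    non-negative attributes; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0
```

Classifying a developer's profile then amounts to comparing it against the reference profiles and picking the closest one.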
Systems variability modeling a textual model mixing class and feature concepts (ijcsit)
System reusability and cost are very important in software product line design. Developers aim to increase system reusability while decreasing the cost and effort of building components from scratch for each software configuration. This can be achieved by developing a software product line (SPL). To handle the SPL engineering process, several approaches with several techniques have been developed. One of these, called the separated approach, requires separating the commonalities and variability of a system's components to allow configuration selection based on user-defined features. Textual notation-based approaches have been used for their formal syntax and semantics to represent system features and implementations. But these approaches are still weak at mixing features (the conceptual level) and classes (the physical level) in a way that guarantees smooth and automatic configuration generation for software releases. The absence of a methodology supporting the mixing process is a real weakness. In this paper, we enhance SPL reusability by introducing some meta-features, classified according to their functionalities. As a first consequence, mixing class and feature concepts is supported in a simple way using class interfaces and inherent features, for a smooth move from the feature model to the class model. As a second consequence, the mixing process is supported by a textual design and implementation methodology that mixes class and feature models by combining their concepts in a single language. The supported configuration generation process is simple, coherent, and complete.
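The feature-to-class mixing can be sketched in miniature. The feature names and the mapping below are invented for illustration; the point is only that a feature selection at the conceptual level determines a class configuration at the physical level:

```python
# Assumed mapping from features to the classes that realize them.
FEATURE_CLASSES = {
    "payment": ["PaymentService"],
    "payment.card": ["CardGateway"],
    "payment.cash": ["CashDesk"],
}

def generate_configuration(selected: list[str]) -> list[str]:
    """Collect the realizing classes for the selected features, in order."""
    classes: list[str] = []
    for feature in selected:
        classes.extend(FEATURE_CLASSES.get(feature, []))
    return classes
```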
TOWARDS CUSTOMIZED SMART GOVERNMENT QUALITY MODEL (ijseajournal)
Smart government is the next generation of e-government, one that touches people closely in their perception of service quality. Although various models exist that can measure conventional quality, there is a lack of models suited to measuring the quality of smart government services; yet to build a smart government, it is crucial to take quality into consideration. This paper aims at a customized quality model for smart government. Building such a quality model will be based on the available software quality models for smart government portals. To achieve the aims of the research, it was critical to analyze and obtain the intersection of the characteristics and sub-characteristics from the key related models (McCall's, Boehm, Dromey, FURPS, and the ISO 9126 quality model). The resulting model will consist of the most appropriate and related quality characteristics and sub-characteristics. The key finding indicates the importance of conducting a practical study to propose a novel model for these purposes.
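The intersection step can be sketched directly. The characteristic lists below are abbreviated examples, not the models' full catalogues:

```python
# Take the quality characteristics common to every source model.
MODELS = {
    "McCall":  {"reliability", "usability", "efficiency", "maintainability"},
    "Boehm":   {"reliability", "usability", "efficiency", "portability"},
    "ISO9126": {"reliability", "usability", "efficiency", "functionality"},
}

def common_characteristics(models: dict[str, set[str]]) -> set[str]:
    sets = iter(models.values())
    result = set(next(sets))
    for s in sets:
        result &= s  # keep only characteristics present in every model
    return result
```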
The objective of this paper is to provide an insightful preview of various agent-oriented methodologies by using an enhanced comparison framework based on criteria such as process-related criteria, steps-and-techniques-related criteria, usability criteria, model- or "concepts"-related criteria, and support-related criteria. The results also incorporate input collected from users of the agent-oriented methodologies through a questionnaire-based survey.
Variability modeling for customizable SaaS applications (ijcsit)
Most current Software-as-a-Service (SaaS) applications are developed as customizable service-oriented applications that serve a large number of tenants (users) with one application instance. The current rapid evolution of SaaS applications increases the demand to study commonality and variability in software product lines that produce customizable SaaS applications. At runtime, customizability is required to meet different tenants' requirements. During the development process, defining and realizing commonality and variability in families of SaaS applications is required to develop reusable, flexible, and customizable SaaS applications at lower cost, in shorter time, and with higher quality. In this paper, the Orthogonal Variability Model (OVM) is used to model variability in a separate model, from which a simple and understandable customization model is generated. Additionally, the Service-oriented architecture Modeling Language (SoaML) is extended to define and realize commonality and variability during the development of SaaS applications.
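An OVM-style variation point can be sketched minimally. The names and the validity rule are illustrative assumptions; OVM itself is a graphical notation, of which this captures only the variant-selection constraint:

```python
from dataclasses import dataclass

@dataclass
class VariationPoint:
    name: str
    variants: set[str]
    mandatory: bool = True   # at least one variant must be chosen
    max_choices: int = 1     # exactly-one (alternative) choice by default

    def valid(self, selection: set[str]) -> bool:
        """Check that a tenant's selection respects this variation point."""
        if not selection <= self.variants:
            return False
        if self.mandatory and not selection:
            return False
        return len(selection) <= self.max_choices
```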
Bio-Inspired Requirements Variability Modeling with Use Case (ijseajournal)
Background. The Feature Model (FM) is the most important technique used to manage variability across products in Software Product Lines (SPLs). Often, SPL requirements variability is handled using a variable use case model, which is a real challenge for current approaches: a large gap between their concepts and those of the real world leads to poor quality, weak FM support, and variability that does not cover all requirements modeling levels. Aims. This paper proposes a bio-inspired use case variability modeling methodology that addresses the above shortcomings.
Method. The methodology is carried out through variable business domain use case meta-modeling, variable application family use case meta-modeling, and variable specific application use case generation.
Results. This methodology has led to integrated solutions to the above challenges: it decreases the gap between computing concepts and real-world ones, and it supports use case variability modeling by introducing version and revision features and related relations. The variability is supported at three meta-levels covering business domain, application family, and specific application requirements.
Conclusion. A comparative evaluation with the closest recent works, against meaningful criteria in the domain, shows the conceptual and practical value of the proposed methodology and points to promising research perspectives.
A metric based approach for measuring the conceptual integrity of software ar... (ijseajournal)
Software architecture evaluation plays an important role in the life cycle of software systems. Conceptual integrity is one of the quality attributes most closely related to software architectural design: it is the underlying theme or vision that unifies all levels of a system's design. In this paper, a method for measuring the conceptual integrity of a software architecture is provided. Conceptual integrity measurement is done in several steps by extracting a graph structure whose nodes are architectural concepts and whose edges are the relationships between them. The constructed graph is then weighted according to the type of relationship among the architectural concepts. Finally, a metric for evaluating conceptual integrity from the refined graph is provided.
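The weighted-graph step can be sketched as follows. Both the relation weights and the metric (average edge weight over all possible concept pairs) are illustrative stand-ins, not the paper's formula:

```python
# Assumed weights per relationship type between architectural concepts.
RELATION_WEIGHTS = {"generalization": 1.0, "composition": 0.8, "dependency": 0.5}

def integrity_score(concepts: list[str], edges: list[tuple[str, str, str]]) -> float:
    """Score = total edge weight / number of possible concept pairs.
    Edges are (source, target, relation_type) triples."""
    n = len(concepts)
    possible_pairs = n * (n - 1) / 2
    if possible_pairs == 0:
        return 0.0
    total = sum(RELATION_WEIGHTS.get(rel, 0.0) for _, _, rel in edges)
    return total / possible_pairs
```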
In this paper we present an approach to model versioning and a model repository in the context of the Living Models view. The idea of Living Models is a step forward from Model-Based Software Development (MBSD) in the sense that there is tight coupling between the various artifacts of the software development process. These artifacts include system models, test models, executable artifacts, etc. We explore the issues of storing (importing/exporting) model elements in the repository, inputs of cross-link information, version management, and system analysis. The modeling environment in which these issues are discussed is heterogeneous, with different model types and different modeling tools used in the development process. An overview of the tool architecture is also presented.
An Adaptive Service Choreography approach based on Ontology-Driven Policy Ref... (dannyijwest)
Business corporations usually require the choreography of services to be dynamic and adaptable. One way of answering this demand is to develop services with dynamic behaviours; however, that is not enough, and their behaviours must be composed dynamically too. Current models such as WS-CDL have a static structure for specifying choreography and are not able to describe the choreography of services in a dynamic fashion. Moreover, there are various types of changes, each of which needs to be handled differently. This work brings adaptability into the service choreography model in response to policy changes. Precisely, this target is met when the choreography model has a dynamic structure and the policy changes can be automatically (or semi-automatically) refined and propagated into the elements of the model. Our proposal describes the choreography model on the basis of a policy-enabled UML state machine whose policies can be refined through an ontology-based process.
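A policy-enabled state machine of the kind described can be sketched generically. The state, event, and policy names are hypothetical; the point is that a transition fires only when its attached policy predicate allows it, so changing the policy changes the choreography without changing the machine:

```python
class Choreography:
    """A tiny state machine whose transitions are guarded by policies."""

    def __init__(self, transitions, policies):
        self.transitions = transitions   # (state, event) -> next state
        self.policies = policies         # (state, event) -> predicate(context)
        self.state = "start"

    def fire(self, event, context):
        key = (self.state, event)
        allow = self.policies.get(key, lambda ctx: True)  # default: permit
        if key in self.transitions and allow(context):
            self.state = self.transitions[key]
        return self.state
```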
In the last few years, user modeling has become an important research area in Human-Computer Interaction. A large amount of research has been conducted in this field, with different approaches to user modeling. In this paper, we provide an overview of the field of user modeling and describe the different user models, namely the GOMS family of models, cognitive architectures, grammar-based models, and application-specific models. We discuss a few examples of user models in each category. This paper also discusses the future challenges of this research area.
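The simplest member of the GOMS family, the Keystroke-Level Model (KLM), makes the idea concrete: expert task time is predicted by summing per-operator times. The values below are the commonly cited averages from Card, Moran and Newell; real users vary around them:

```python
KLM_TIMES = {
    "K": 0.2,   # keystroke (average skilled typist), seconds
    "P": 1.1,   # point at a target with the mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(operators: str) -> float:
    """Predicted task time (seconds) for an operator sequence such as 'MHPK'."""
    return round(sum(KLM_TIMES[op] for op in operators), 2)
```

For example, "mentally prepare, move hand to mouse, point, click a key" is the sequence `MHPK`.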
Facial Feature Recognition Using Biometrics (ijbuiiir1)
Face recognition is one of the few biometric methods that possesses the merits of both high accuracy and low intrusiveness. Biometrics requires no physical interaction on behalf of the user and allows passive identification in one-to-many environments. Passwords and PINs are hard to remember and can be stolen or guessed; cards, tokens, keys, and the like can be misplaced, forgotten, purloined, or duplicated; magnetic cards can become corrupted and unreadable. An individual's biological traits, however, cannot be misplaced, forgotten, stolen, or forged.
Partial Image Retrieval Systems in Luminance and Color Invariants: An Empiri... (ijbuiiir1)
Surface color is one of the most important characteristics in camera-based recognition and classification of objects. On the other hand, the color of an object can differ considerably due to differences in illumination and surface conditions, and such variations impede the use of diverse color features. However, the color of an object can be characterized, regardless of illumination and surface conditions, with a controlling tool known as color invariants. In this research, the estimation procedure for the color invariants of RGB images is analyzed. Object color is an important descriptor for finding a matching object in applications based on image matching and search, such as template-based object search and Content-Based Image Retrieval (CBIR). But it has often been observed that the apparent color of different objects varies significantly due to illumination, surface conditions, and observation (Finlayson et al., 1996).
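A classic, simple color invariant illustrates the idea (as a general example, not necessarily the paper's estimator): normalized rgb chromaticity cancels uniform intensity changes, because scaling (R, G, B) by any factor k leaves the ratios unchanged.

```python
def chromaticity(r: float, g: float, b: float) -> tuple[float, float, float]:
    """Normalized rgb: each channel divided by the total intensity."""
    s = r + g + b
    if s == 0:
        return (0.0, 0.0, 0.0)
    return (r / s, g / s, b / s)
```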
Similar to Enhancing Effective Interoperability Between Mobile Apps Using LCIM Model
General Methodology for developing UML models from UIijwscjournal
In recent past every discipline and every industry have their own methods of developing products. It may be software development, mechanics, construction, psychology and so on. These demarcations work fine as long as the requirements are within one discipline. However, if the project extends over several disciplines, interfaces have to be created and coordinated between the methods of these disciplines. Performance is an important quality aspect of Web Services because of their distributed nature. Predicting the performance of web services during early stages of software development is significant. In Industry, Prototype of these applications is developed during analysis phase of Software Development Life Cycle (SDLC). However, Performance models are generated from UML models. Methodologies for predicting the performance from UML models is available. Hence, In this paper, a methodology for developing Use Case model and Activity model from User Interface is presented. The methodology is illustrated with a case study on Amazon.com.
General Methodology for developing UML models from UIijwscjournal
In recent past every discipline and every industry have their own methods of developing products. It may be software development, mechanics, construction, psychology and so on. These demarcations work fine as long as the requirements are within one discipline. However, if the project extends over several disciplines, interfaces have to be created and coordinated between the methods of these disciplines. Performance is an important quality aspect of Web Services because of their distributed nature. Predicting the performance of web services during early stages of software development is significant. In Industry, Prototype of these applications is developed during analysis phase of Software Development Life Cycle (SDLC). However, Performance models are generated from UML models. Methodologies for predicting the performance from UML models is available. Hence, In this paper, a methodology for developing Use Case model and Activity model from User Interface is presented. The methodology is illustrated with a case study on Amazon.com.
General Methodology for developing UML models from UIijwscjournal
In recent past every discipline and every industry have their own methods of developing products. It may
be software development, mechanics, construction, psychology and so on. These demarcations work fine
as long as the requirements are within one discipline. However, if the project extends over several
disciplines, interfaces have to be created and coordinated between the methods of these disciplines.
Performance is an important quality aspect of Web Services because of their distributed nature.
Predicting the performance of web services during early stages of software development is significant. In
Industry, Prototype of these applications is developed during analysis phase of Software Development Life
Cycle (SDLC). However, Performance models are generated from UML models. Methodologies for
predicting the performance from UML models is available. Hence, In this paper, a methodology for
developing Use Case model and Activity model from User Interface is presented. The methodology is
illustrated with a case study on Amazon.com
Adaptive guidance model based similarity for software process development pro...ijseajournal
This paper describes a modeling approach SAGM (Similarity for Adaptive Guidance Model) that provides
adaptive and recursive guidance for software process development. This approach, in accordance to
developer needs, allows specific tailored guidance regarding the profile of developers. A profile is partially
or completely defined from a model of developers, through their roles, their qualifications, and through the
relationships between the context of the current activity and the model of the defined activities. This
approach aims to define the generic profile of development context and a similarity measure that evaluates
the similarities between the profiles created from the model of developers and those of the development
team involved in the execution of a software process. This is to identify the profiles classification and to
deduce the appropriate type of assistance to developers (that can be corrective, constructive or specific).
Systems variability modeling a textual model mixing class and feature conceptsijcsit
System’s reusability and cost are very important in software product line design area. Developers’ goal is
to increase system reusability and decreasing cost and efforts for building components from scratch for
each software configuration. This can be reached by developing software product line (SPL). To handle
SPL engineering process, several approaches with several techniques were developed. One of these
approaches is called separated approach. It requires separating the commonalities and variability for
system’s components to allow configuration selection based on user defined features. Textual notationbased
approaches have been used for their formal syntax and semantics to represent system features and
implementations. But these approaches are still weak in mixing features (conceptual level) and classes
(physical level) that guarantee smooth and automatic configuration generation for software releases. The
absence of methodology supporting the mixing process is a real weakness. In this paper, we enhanced
SPL’s reusability by introducing some meta-features, classified according to their functionalities. As a first
consequence, mixing class and feature concepts is supported in a simple way using class interfaces and
inherent features for smooth move from feature model to class model. And as a second consequence, the
mixing process is supported by a textual design and implementation methodology, mixing class and feature
models by combining their concepts in a single language. The supported configuration generation process
is simple, coherent, and complete.
TOWARDS CUSTOMIZED SMART GOVERNMENT QUALITY MODELijseajournal
Smart government is the next generation of e-government, one that touches people closely in their
perception of service quality. Although a variety of models exist that can measure general
quality, there is a lack of models suited to measuring the quality of smart
government services. However, to build a smart government, it is crucial to take quality into
consideration. This paper aims to propose a customized quality model for smart government. Building such a quality
model will be based on the available software quality models for smart government portals. To achieve the
aims of the research, it was critical to analyse and obtain the intersection of the variables and sub-variables
from the key related models (McCall's, Boehm, Dromey, FURPS, and the ISO 9126 quality model). The customized model will
consist of the most appropriate and related quality characteristics and sub-characteristics. The key finding
indicates the importance of conducting a practical study to propose a novel model for these purposes.
The objective of this paper is to provide an insightful preview of various
agent-oriented methodologies using an enhanced comparison
framework based on criteria such as process-related criteria, steps- and
techniques-related criteria, usability criteria, model-related or
"concepts"-related criteria, and supportive-related criteria. The results also
incorporate input collected from users of the agent-oriented
methodologies through a questionnaire-based survey.
Variability modeling for customizable saas applicationsijcsit
Most current Software-as-a-Service (SaaS) applications are developed as customizable service-oriented
applications that serve a large number of tenants (users) with one application instance. The current rapid
evolution of SaaS applications increases the demand to study commonality and variability in the software
product lines that produce customizable SaaS applications. At runtime, customizability is required to
meet different tenants' requirements. During the development process, defining and realizing
commonality and variability in families of SaaS applications is required to develop reusable, flexible, and
customizable SaaS applications at lower cost, in shorter time, and with higher quality. In this paper, the
Orthogonal Variability Model (OVM) is used to model variability in a separate model, which is then used to
generate a simple and understandable customization model. Additionally, the Service-oriented architecture
Modeling Language (SoaML) is extended to define and realize commonality and variability during the
development of SaaS applications.
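A minimal sketch of the orthogonal-variability idea (the variation points below are invented, and the paper's SoaML extension is not reproduced): variability lives in a model separate from the base application, and a tenant customization binds exactly one variant per variation point:

```python
from itertools import product

# Hypothetical variation points for a SaaS application, each with its variants.
VARIATION_POINTS = {
    "payment": ["credit_card", "paypal"],
    "report_format": ["pdf", "csv"],
    "language": ["en", "ar"],
}

def all_customizations():
    """Enumerate every tenant customization: one variant chosen per variation point."""
    names = list(VARIATION_POINTS)
    return [dict(zip(names, combo))
            for combo in product(*(VARIATION_POINTS[n] for n in names))]

def is_valid_customization(customization):
    """A customization is valid when each variation point is bound to a known variant."""
    return all(customization.get(vp) in variants
               for vp, variants in VARIATION_POINTS.items())
```

Keeping the variation points in their own structure, apart from any application code, mirrors the "separate model" role OVM plays in the abstract.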
Bio-Inspired Requirements Variability Modeling with use Case ijseajournal
Background. The Feature Model (FM) is the most important technique used to manage variability across products in Software Product Lines (SPLs). Often, SPL requirements variability is modeled using a variable use case model, which poses a real challenge for current approaches: a large gap between their concepts and those of the real world leads to poor quality, weak FM support, and variability that does not cover all requirements modeling levels. Aims. This paper proposes a bio-inspired use case variability modeling methodology that addresses these shortcomings.
Method. The methodology is carried out through variable business domain use case meta modeling,
variable applications family use case meta modeling, and variable specific application use case generating.
Results. This methodology has led to integrated solutions to the above challenges: it decreases the gap
between computing concepts and real-world ones, and it supports use case variability modeling by introducing
version and revision features and their related relations. Variability is supported at three meta levels,
covering business domain, application family, and specific application requirements.
Conclusion. A comparative evaluation against the closest recent works, using meaningful criteria in the
domain, shows the great conceptual and practical value of the proposed methodology and leads to
promising research perspectives.
A metric based approach for measuring the conceptual integrity of software ar...ijseajournal
Software architecture evaluation plays an important role in the life cycle of software systems. Conceptual
integrity is one of the quality attributes most closely related to software architectural
design: it is the underlying theme or vision that unifies all levels of a system's design. In this paper, a
method for measuring the conceptual integrity of software architecture is provided. Conceptual integrity
measurement is done in several steps by extracting a graph whose nodes are architectural
concepts and whose edges are the relationships between them. The constructed graph is then weighted according to
the type of relationship among the architectural concepts. Finally, a metric for evaluating conceptual
integrity from the refined graph is provided.
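The paper's exact metric is not reproduced here, but the graph-extraction and weighting steps it describes can be sketched as follows, with assumed relationship weights and a simple mean-edge-weight score standing in for the refined metric:

```python
# Illustrative only: relationship types, weights, and the score formula are
# assumptions, not the paper's actual metric.
RELATION_WEIGHTS = {"realizes": 1.0, "uses": 0.6, "depends": 0.3}

# Edges of the concept graph: (concept, concept, relationship type).
edges = [
    ("Controller", "Service", "uses"),
    ("Service", "Repository", "uses"),
    ("Service", "DomainModel", "realizes"),
    ("Controller", "Repository", "depends"),
]

def integrity_score(edges):
    """Mean edge weight: architectures held together by stronger relations score higher."""
    if not edges:
        return 0.0
    total = sum(RELATION_WEIGHTS[rel] for _, _, rel in edges)
    return total / len(edges)
```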
In this paper we present an approach to model versioning and a model repository in the context of the Living
Models view. The idea of Living Models is a step forward from Model-Based Software Development
(MBSD) in the sense that there is tight coupling between the various artifacts of the software development process.
These artifacts include system models, test models, executable artifacts, etc. We explore the issues of
storage (import/export) of model elements in the repository, inputs of cross-link information, version
management, and system analysis. These issues are discussed in a
heterogeneous modeling environment, where different model types and different modeling tools are used
in the development process. An overview of the tool architecture is also presented.
An Adaptive Service Choreography approach based on Ontology-Driven Policy Ref...dannyijwest
Business corporations usually require the choreography of services to be dynamic and adaptable. One way
to answer this demand is to develop services with dynamic behaviours. However, that is not
enough: their behaviours must be composed dynamically too.
Current models such as WS-CDL have a static structure for specifying choreography and are not able to
describe the choreography of services in a dynamic fashion. From another view, there are various types
of changes, each of which needs to be handled differently. This work brings adaptability into the service
choreography model in response to policy changes. Precisely, this target is met when the choreography
model has a dynamic structure and policy changes can be automatically (or semi-automatically)
refined and propagated into the elements of the model. Our proposal describes the choreography model on
the basis of a policy-enabled UML state machine in which policies can be refined through an ontology-based
process.
In the last few years user modeling has become an important research area in Human-Computer Interaction, and a large amount of research presenting different approaches to user modeling has been conducted in this field. In this paper, we provide an overview of the field of user modeling and describe
the different user models, namely the GOMS family of models, cognitive architectures, grammar-based models, and application-specific models. We discuss a few examples of user models in each category. This paper also discusses the future challenges of this research area.
Similar to Enhancing Effective Interoperability Between Mobile Apps Using LCIM Model (20)
Facial Feature Recognition Using Biometricsijbuiiir1
Face recognition is one of the few biometric methods that possess the merits of both high accuracy and low intrusiveness. Biometrics require no physical interaction on behalf of the user and allow passive identification in one-to-many environments. Passwords and PINs are hard to remember and can be stolen or guessed; cards, tokens, keys and the like can be misplaced, forgotten, purloined or duplicated; magnetic cards can become corrupted and unreadable. An individual's biological traits, however, cannot be misplaced, forgotten, stolen or forged.
Partial Image Retrieval Systems in Luminance and Color Invariants : An Empiri...ijbuiiir1
The color of a surface is one of the most important characteristics in camera-based object recognition and classification. On the other hand, the color of an object can differ greatly due to differences in illumination and surface conditions, and such variations impede the use of diverse color features. However, the color of an object can be characterized with a controlling tool known as color invariants, without considering factors such as illumination and surface conditions. In this research, the estimation procedure for the color invariants of RGB images is analysed. Object color is an important descriptor for finding the corresponding matching object in applications based on image matching and search, such as template-based object searching and Content-Based Image Retrieval (CBIR). But it has often been observed that the apparent color of different objects varies significantly due to illumination, surface conditions, and observation (Finlayson et al., 1996).
Applying Clustering Techniques for Efficient Text Mining in Twitter Dataijbuiiir1
Knowledge is the ultimate output of decisions on a dataset. The revolution of the Internet has brought global distances closer, within a touch on hand-held electronic devices. Usage of social media sites has increased in the past decades, and one of the most popular social media microblogs is Twitter, which has millions of users around the world. In this paper, Twitter data is analysed through the text contained in hashtags. After preprocessing, clustering algorithms are applied to the text data, and the resulting clusters are compared through various parameters. Visualization techniques are used to portray the results, from which inferences such as time series and topic flow can easily be made. The observed results show that the hierarchical clustering algorithm performs better than the other algorithms.
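A toy sketch of the hierarchical approach the abstract favors (pure Python, with invented tweets; a real pipeline would use TF-IDF weighting and a library implementation): single-linkage agglomerative clustering over bag-of-words cosine similarity:

```python
import math
from collections import Counter

# Invented sample tweets standing in for hashtag-filtered Twitter text.
tweets = [
    "#python data mining with python",
    "#python text mining tutorial",
    "#football match tonight",
    "#football big match highlights",
]

def vectorize(text):
    """Bag-of-words term counts for one document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two count vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def agglomerate(texts, k):
    """Naive single-linkage agglomerative clustering down to k clusters of indices."""
    vecs = [vectorize(t) for t in texts]
    clusters = [[i] for i in range(len(texts))]
    while len(clusters) > k:
        best, pair = -1.0, (0, 1)
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                sim = max(cosine(vecs[a], vecs[b])
                          for a in clusters[i] for b in clusters[j])
                if sim > best:
                    best, pair = sim, (i, j)
        i, j = pair
        clusters[i].extend(clusters[j])  # merge the most similar pair
        del clusters[j]
    return clusters
```

On this data the two #python tweets and the two #football tweets end up in separate clusters.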
A Study on the Cyber-Crime and Cyber Criminals: A Global Problemijbuiiir1
Today, cybercrime has caused a lot of damage to individuals, organizations, and even governments. Cybercrime detection and classification methods have come up with varying levels of success in preventing such attacks and protecting data from them. Several laws and methods have been introduced to prevent cybercrime, and penalties are laid down for the criminals. However, the study shows that many countries still face this problem today, and the United States of America leads with the maximum damage from cybercrime over the years. According to a recent survey, the year 2013 saw monetary damage of nearly 781.84 million U.S. dollars. This paper describes the common areas where cybercrime usually occurs and the different types of cybercrime committed today. The paper also presents studies of e-mail-related crimes, as e-mail is the most common medium through which cybercrime occurs. In addition, some case studies related to cybercrime are also laid down.
Vehicle to Vehicle Communication of Content Downloader in Mobileijbuiiir1
Content downloading is an Internet-based service that is expected to become highly popular in wireless communication, supporting roadside communication. We focus on a content downloading system for both infrastructure-to-vehicle and vehicle-to-vehicle communication. The goal is to improve system throughput by formulating a max-flow problem that includes channel contention and the data transfer paradigm. While transferring files or downloading applications in a roadside environment, there is a possibility of getting disconnected. This study aims to avoid unreliable connections in the roadside environment when using system- or mobile-based Internet connections for content or file downloading, using MILP (Mixed Integer Linear Programming) for the max-flow problem. A bounding-box technique is used to obtain a proper signal from the base station; it helps avoid traffic and obtain a quick response from the server. The main goal of the mobility management service is to trace where subscribers are located, allowing calls, SMS, and other mobile phone services to be delivered to them. First, we analyse the data and select the correct location. This is challenging in vehicular networks, where the transmission speed of the nodes must remain efficient even in areas where buildings and other architectural infrastructure obstruct the radio signal.
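The max-flow formulation at the heart of the approach can be illustrated with a standard Edmonds-Karp computation on a toy roadside topology (this is not the paper's MILP model, and the node names are invented):

```python
from collections import deque, defaultdict

# Toy topology: a roadside unit "rsu" pushes content chunks through two
# relay vehicles "a" and "b" toward a downloading vehicle "dst".
capacity = {"rsu": {"a": 3, "b": 2}, "a": {"dst": 2}, "b": {"dst": 2}}

def max_flow(cap, source, sink):
    """Edmonds-Karp: repeatedly augment along shortest residual paths."""
    residual = defaultdict(lambda: defaultdict(int))
    for u in cap:
        for v, c in cap[u].items():
            residual[u][v] += c
    flow = 0
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Walk back from the sink to find the bottleneck, then augment.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
```

Here the rsu can offer 5 units but each relay can deliver only 2, so the achievable throughput is 4 — the kind of contention-limited bound the MILP formulation captures.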
SPOC: A Secure and Private-pressuring Opportunity Computing Framework for Mob...ijbuiiir1
Today there is an abundant increase in the development of information technology, which has put a mini-computer with a touch screen into people's palms (e.g., smartphones and tablets). In parallel, rich enhancements in wireless body sensor units make it possible for medical treatment to be delivered usefully and comfortably via smartphones over 2G and 3G networks, bringing treatment within reach of the common person at low cost. With these, healthcare authorities can treat patients (medical users) remotely, whether they are at home, at work, at school, or anywhere else. This type of treatment is called m-Healthcare (mobile healthcare). However, the m-Healthcare service has many security and data privacy problems to overcome. Here we present SPOC, a secure and privacy-preserving opportunistic computing framework for mobile healthcare emergencies. Using a smartphone and SPOC, resources such as computing power and energy can be gathered opportunistically to process the intensive Personal Health Information (PHI) of a medical user in a critical situation with minimal privacy disclosure. We also introduce an efficient user-centric privacy access control in the SPOC framework, based on attribute-based access control and a new privacy-preserving scalar product computation (PPSPC) technique, which allows a medical user (patient) to participate in opportunistic computing when transmitting PHI data. A detailed security analysis shows that the proposed SPOC framework can efficiently achieve user-centric privacy access control in m-Healthcare emergencies.
In this paper we introduce privacy-preserving support for mobile healthcare using a message digest, employing the MD5 algorithm. This is efficient and minimizes memory consumption: the large amount of PHI data of the medical user (patient) is reduced to a fixed size, which, compared to AES, also increases the speed at which the data is sent to the trusted authority (TA) without delay, so that the professionals at the healthcare center receive the user's most recent PHI data in time to save lives. The algorithm provides tight security when transmitting the patient's PHI to the TA. Performance evaluations with extensive simulations demonstrate the effectiveness of the message digest in providing highly reliable PHI processing and transmission while reducing privacy disclosure during a mobile healthcare emergency.
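The fixed-size-digest property the abstract relies on is easy to demonstrate with Python's standard hashlib (the PHI payloads below are invented; note also that MD5 is no longer considered collision-resistant, so a modern design would prefer SHA-256):

```python
import hashlib

# Whatever the size of the PHI record, an MD5 digest is always a fixed
# 128 bits (32 hex characters), so the integrity payload never grows
# with the data -- the memory/bandwidth saving the abstract describes.
small_phi = b"heart_rate=72"
large_phi = b"ecg_samples=" + b"0.91," * 10000

small_digest = hashlib.md5(small_phi).hexdigest()
large_digest = hashlib.md5(large_phi).hexdigest()
```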
A Survey on Implementation of Discrete Wavelet Transform for Image Denoisingijbuiiir1
Image denoising is a well-studied problem in the field of image processing. Images are often received in defective condition due to poor scanning and transmitting devices, which creates problems for subsequent processes that must read and understand them. Removing noise from the original signal is still a challenging problem for researchers, because noise removal introduces artifacts and causes blurring of the images. Several algorithms have been published, and each approach has its assumptions, advantages, and limitations. This paper deals with using features derived from the discrete wavelet transform, as used in digital image texture analysis, to denoise an image even in the presence of a very high ratio of noise. Image denoising is framed as a regression problem between the noise and the signal; wavelets appear to be a suitable tool for this task because they allow analysis of images at various levels of resolution.
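A one-dimensional sketch of the underlying idea (the paper works on 2-D images, and the sample signal and threshold here are invented): one Haar transform level splits a signal into averages and detail coefficients, small detail coefficients — which mostly carry noise — are shrunk toward zero, and the signal is rebuilt:

```python
def haar_forward(signal):
    """One Haar level: pairwise averages (coarse) and differences (detail)."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def soft_threshold(coeffs, t):
    """Shrink each coefficient toward zero by t; zero out anything below t."""
    return [0.0 if abs(c) <= t else (c - t if c > 0 else c + t) for c in coeffs]

def haar_inverse(avg, det):
    """Reconstruct the signal from averages and (possibly thresholded) details."""
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out
```

Reconstructing without thresholding recovers the signal exactly; thresholding the small detail coefficient smooths the noisy jump while keeping the large-scale structure.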
A Study on migrated Students and their Well - being using Amartya Sens Functi...ijbuiiir1
This paper deals with the multidimensional analysis of well-being from the theoretical point of view suggested by Dr. Amartya Sen. Sen's Functioning multidimensional approach is broadly recognized as one of the most satisfying approaches to well-being. Sen's Capability and Functioning approaches have found relatively many pragmatic applications, mainly for their strong informational and methodological requirements. An attempt has been made to realize a multidimensional assessment of Sen's concept of well-being with the use of fuzzy set theory. The methodology is applied to the evaluative space of functionings, with an experimental application to migrated students studying in Chennai, Tamil Nadu.
Methodologies on user Behavior Analysis and Future Request Prediction in Web ...ijbuiiir1
Web usage mining is a kind of web mining that provides knowledge about user navigation behavior and extracts interesting patterns from the web. It refers to the automatic discovery and analysis of patterns in clickstream and associated data collected as a consequence of user interactions with web resources on one or more web sites. Identifying the needs and interests of users is useful for upgrading web resources, and web site developers can update their sites according to this attention. This paper discusses the different methodologies used in previous research for discovering user behavior and predicting future requests.
Innovative Analytic and Holistic Combined Face Recognition and Verification M...ijbuiiir1
Automatic recognition and verification of human faces is a significant problem in the development and application of Human-Computer Interaction (HCI). In addition, the demand for reliable personal identification in computerized access control has resulted in an increased interest in biometrics to replace passwords and identification (ID) cards. Over the last couple of years, face recognition researchers have been developing new techniques fuelled by advances in computer vision, computer design, sensors, and fast-emerging face recognition systems. In this paper, a face recognition and verification system has been designed that is robust to variations of illumination, pose, and facial expression but very sensitive to variations in the features of the face. This design takes into account the holistic or global as well as the analytic or geometric features of the human face. The global structure of the face is analysed by Principal Component Analysis, while the local structure is computed from geometric features of the face such as the eyes, nose, and mouth. The extracted local features are trained and later tested using an Artificial Neural Network (ANN). This combined approach of the global and local structure of the face image proved very effective in the system we designed, which achieves a correct recognition rate of over 90%.
Deployment of Intelligent Transport Systems Based on User Mobility to be Endo...ijbuiiir1
The emerging increase in vehicles and very high traffic demands improved Intelligent Transport Systems (ITS). The available ITSs do not meet all the requirements of the present-day situation in providing safe travel and avoidance of congestion, in spite of their limitations on the road. Intelligent Transport Systems require more research and the implementation of better solutions on the traffic network, with increased mobility and more rapid acquisition of data by sensor network technology. In this paper a review is made of the present ITS areas where research is required, so that implementing reality mining can enhance the behavior of ITS. This will breed a leap forward in the improvement of safety and convenience of personal and commercial travel, and in turn guarantee an ultimate drop in fatalities in society.
Stock Prediction Using Artificial Neural Networksijbuiiir1
Accurate prediction of stock price movements is a highly challenging and significant topic for investors. Investors need to understand that stock price data is the most essential information; it is highly volatile, non-linear, and non-parametric, and is affected by many uncertainties and interrelated economic and political factors across the globe. Artificial Neural Networks (ANN) have been found to be an efficient tool for modeling stock prices, and quite a large number of studies have been done on them. In this paper, ANN modeling of the prices of selected stocks on the BSE is attempted in order to predict closing prices. The network developed consists of an input layer, one hidden layer, and an output layer, with the inputs being opening price, high, low, closing price, and volume. Mean Absolute Percentage Error, Mean Absolute Deviation, and Root Mean Square Error are used as indicators of network performance. This paper is organized as follows. In the first section, the adaptability of ANNs to stock prediction is discussed. In section two, we justify the use of ANNs in forecasting stock prices. Section three gives a literature review of the applications of ANNs to predicting stock prices. Section four gives an overview of artificial neural networks. Section five presents the methodology adopted. Section six gives the simulation and performance analysis. The last section concludes with future directions of the study.
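The three performance indicators named in the abstract can be computed directly; the sketch below uses invented actual/predicted closing prices purely for illustration:

```python
import math

def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def mad(actual, predicted):
    """Mean Absolute Deviation."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Square Error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Toy closing prices (illustrative values, not BSE data).
actual = [100.0, 102.0, 101.0, 105.0]
predicted = [99.0, 103.0, 100.0, 104.0]
```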
Indian Language Text Representation and Categorization Using Supervised Learn...ijbuiiir1
India is the home of different languages, owing to its cultural and geographical diversity. The official and regional languages of India play an important role in communication among the people living in the country. The Constitution of India makes provision for each Indian state to choose its own official language for official communication at the state level; as of May 2008, the eighth schedule lists 22 official languages. The amount of textual data in various Indian regional languages available in electronic form is constantly increasing, so classification of text documents based on language is essential. The objective of this work is the representation and categorization of Indian-language text documents using text mining techniques. A South Indian language corpus, comprising Kannada, Tamil, and Telugu, has been created. Several text mining techniques for text categorization, such as the naive Bayes classifier, the k-Nearest-Neighbor classifier, and decision trees, have been used. Not much work has been done on text categorization in Indian languages, and it is challenging because Indian languages are very rich in morphology. In this paper an attempt has been made to categorize Indian-language text using text mining algorithms.
Highly Secured Online Voting System (OVS) Over Networkijbuiiir1
Internet voting systems have gained popularity and have been used for government elections and referendums in the United Kingdom, Estonia, and Switzerland, as well as municipal elections in Canada and party primary elections in the United States. A voting system can involve transmission of ballots and votes via private computer networks or the Internet. Electronic voting technology can speed the counting of ballots and can provide improved accessibility for disabled voters. The aim of this paper is to allow people who hold Indian citizenship, are above 18 years of age, and of any sex to cast their vote online without going to any physical polling station. An Election Commission Officer verifies whether registered users and candidates are authentic before they participate in online voting. This online voting system is highly secured, and its design is simple, easy to use, and reliable. The proposed software has been developed and tested to work over Ethernet and allows online voting. It also creates and manages voting and election details: all users must log in with a user name and password and click on their preferred candidates to register a vote. This will increase the voting percentage in India, and by applying high security it will reduce false votes.
Software Developers Performance relationship with Cognitive Load Using Statis...ijbuiiir1
The success of software development is attributed more to intellectual capital than to the physical assets of the organization. Human resources are the most challenging resource in the software development industry when it comes to meeting customer requirements and delivering projects to the client on time. The software industry requires multi-skilled, dynamic performers to meet these challenges. Skills, domain knowledge, and developer performance are considered the key potential factors for the success of project delivery, and developer performance is influenced by cognitive factors and their measures. This study aims to relate developers' performance in the software industry to their cognitive workload. Statistical measures such as correlation, regression, variance, and standard deviation are calculated for developer performance against cognitive load. Real-time observations were made in the development sector of around 250 employees and 15 projects, along with the corresponding cognitive load factors, such as physical ability, mental ability, temporal ability, effort, frustration, and performance, in web, database, and multimedia applications. This paper provides a measurable analysis of the developers' development process on their assigned tasks using a statistical approach.
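Of the statistical measures the abstract lists, correlation is the simplest to sketch; below is a minimal Pearson correlation over invented cognitive-load and performance scores (illustrative data only, not the study's observations):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores for four developers: higher load, lower performance.
cognitive_load = [3.0, 5.0, 7.0, 9.0]
performance = [8.0, 7.0, 5.0, 4.0]
```

On this toy data the coefficient is strongly negative, the kind of relationship the study sets out to quantify on real observations.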
Wireless Health Monitoring System Using ZigBeeijbuiiir1
Recent developments in off-the-shelf wireless embedded computing boards and the increasing need for efficient health monitoring systems, fueled by the increasing number of patients, has prompted R&D professionals to explore better health monitoring systems that are both mobile and cheap. This work investigates the feasibility of using the ZigBee embedded technology in health-related monitoring applications. Selected vital signs of patients are acquired using sensor nodes and readings are transmitted wirelessly using devices that utilize the ZigBee communications protocols. A prototype system has been developed and tested with encouraging results
Image Compression Using Discrete Cosine Transform & Discrete Wavelet Transformijbuiiir1
This research paper presents a proposed method for the compression of medical images using a hybrid compression technique (DWT, DCT, and Huffman coding). The objective of this hybrid scheme is to achieve higher compression rates by first applying DWT and DCT to the individual RGB components. The image is then quantized to calculate a probability index for each unique quantity, so that a unique binary code can be found for each unique symbol during encoding. Finally, Huffman compression is applied. Results show that coding performance can be significantly improved by the hybrid DWT, DCT, and Huffman coding algorithm.
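The final Huffman stage of such a pipeline can be sketched on its own (the DWT/DCT and quantization stages are omitted, and the symbols here are characters rather than quantized coefficients):

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a prefix code from symbol frequencies: rarer symbols get longer codes."""
    freq = Counter(data)
    if len(freq) == 1:  # degenerate case: a single distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (count, tie-breaker, partial code table for this subtree).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (n1 + n2, tie, merged))
        tie += 1
    return heap[0][2]

def encode(data, codes):
    """Concatenate the per-symbol code words into one bit string."""
    return "".join(codes[s] for s in data)
```

For "aaaabbc" the most frequent symbol gets a one-bit code, so the seven symbols compress to ten bits instead of a fixed-length fourteen.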
Agile development methodologies are very promising in the software industry. Agile development techniques are realistic in understanding the fact that requirements in a business environment change constantly. Agile development processes optimize the opportunity provided by cloud computing by releasing software iteratively and getting user feedback more frequently. This research work is a study of agile methods and cloud computing; it analyzes agile management and development methods and their benefits with cloud computing. Combining an agile development methodology with cloud computing brings the best of both worlds: a business strategy whose outcomes optimize profitability, revenue, and customer satisfaction by organizing around customer segments, fostering customer-satisfying behaviors, and implementing customer-centric processes.
SOA is becoming important for business process management and enterprises. SOA is now widely used by enterprises as it provides a seamless environment, flexibility, and interoperability, but security must also be considered, because the basic SOA framework does not possess any security of its own; it depends on the respective proprietor [1]. In recent times much research work has been done on SOA security. Researchers have proposed various frameworks and models, such as FIX [2] and SAVT [3], which try hard but cannot achieve any landmark as they are based on XML schema. This proposed novel work contains an inbuilt security module based on PKI. At the same time, the model keeps flexibility and interoperability intact, as the security module is embedded after analyzing the nature of WSDL, UDDI, SOAP, and XML; these protocols are also compatible with PKI. The proposed model was implemented in the ASP.NET environment, and the experimental results were compared with other security methods, such as data-mining-based web security and automata-based web security.
Internet Scheduling of Bluetooth Scatter Nets By Inter Piconet Schedulingijbuiiir1
Bluetooth includes the concept of devices participating in multiple "piconets" interconnected via bridge devices, thereby forming a "scatternet". This paper presents a scheme for Bluetooth scatternet operation that adapts to varying traffic patterns. According to the traffic information of all the masters to which the bridge is connected, the bridge switches to the masters with high traffic loads, increasing the utilization of the bridge. In addition, load-adaptive inter-piconet scheduling can reduce the number of failed unsniffs and the overhead wasted on bridge switches, further increasing overall system performance.
Hierarchical Digital Twin of a Naval Power SystemKerry Sado
A hierarchical digital twin of a Naval DC power system has been developed and experimentally verified. Similar to other state-of-the-art digital twins, this technology creates a digital replica of the physical system executed in real-time or faster, which can modify hardware controls. However, its advantage stems from distributing computational efforts by utilizing a hierarchical structure composed of lower-level digital twin blocks and a higher-level system digital twin. Each digital twin block is associated with a physical subsystem of the hardware and communicates with a singular system digital twin, which creates a system-level response. By extracting information from each level of the hierarchy, power system controls of the hardware were reconfigured autonomously. This hierarchical digital twin development offers several advantages over other digital twins, particularly in the field of naval power systems. The hierarchical structure allows for greater computational efficiency and scalability while the ability to autonomously reconfigure hardware controls offers increased flexibility and responsiveness. The hierarchical decomposition and models utilized were well aligned with the physical twin, as indicated by the maximum deviations between the developed digital twin hierarchy and the hardware.
We have compiled the most important slides from each speaker's presentation. This year’s compilation, available for free, captures the key insights and contributions shared during the DfMAy 2024 conference.
TOP 10 B TECH COLLEGES IN JAIPUR 2024.pptxnikitacareer3
Looking for the best engineering colleges in Jaipur for 2024?
Check out our list of the top 10 B.Tech colleges to help you make the right choice for your future career!
1) MNIT
2) MANIPAL UNIV
3) LNMIIT
4) NIMS UNIV
5) JECRC
6) VIVEKANANDA GLOBAL UNIV
7) BIT JAIPUR
8) APEX UNIV
9) AMITY UNIV.
10) JNU
TO KNOW MORE ABOUT COLLEGES, FEES AND PLACEMENT, WATCH THE FULL VIDEO GIVEN BELOW ON "TOP 10 B TECH COLLEGES IN JAIPUR"
https://www.youtube.com/watch?v=vSNje0MBh7g
VISIT CAREER MANTRA PORTAL TO KNOW MORE ABOUT COLLEGES/UNIVERSITITES in Jaipur:
https://careermantra.net/colleges/3378/Jaipur/b-tech
Get all the information you need to plan your next steps in your medical career with Career Mantra!
https://careermantra.net/
A review on techniques and modelling methodologies used for checking electrom...nooriasukmaningtyas
The proper function of the integrated circuit (IC) in an inhibiting electromagnetic environment has always been a serious concern throughout the decades of revolution in the world of electronics, from disjunct devices to today’s integrated circuit technology, where billions of transistors are combined on a single chip. The automotive industry and smart vehicles in particular, are confronting design issues such as being prone to electromagnetic interference (EMI). Electronic control devices calculate incorrect outputs because of EMI and sensors give misleading values which can prove fatal in case of automotives. In this paper, the authors have non exhaustively tried to review research work concerned with the investigation of EMI in ICs and prediction of this EMI using various modelling methodologies and measurement setups.
NUMERICAL SIMULATIONS OF HEAT AND MASS TRANSFER IN CONDENSING HEAT EXCHANGERS...ssuser7dcef0
Power plants release a large amount of water vapor into the
atmosphere through the stack. The flue gas can be a potential
source for obtaining much needed cooling water for a power
plant. If a power plant could recover and reuse a portion of this
moisture, it could reduce its total cooling water intake
requirement. One of the most practical way to recover water
from flue gas is to use a condensing heat exchanger. The power
plant could also recover latent heat due to condensation as well
as sensible heat due to lowering the flue gas exit temperature.
Additionally, harmful acids released from the stack can be
reduced in a condensing heat exchanger by acid condensation. reduced in a condensing heat exchanger by acid condensation.
Condensation of vapors in flue gas is a complicated
phenomenon since heat and mass transfer of water vapor and
various acids simultaneously occur in the presence of noncondensable
gases such as nitrogen and oxygen. Design of a
condenser depends on the knowledge and understanding of the
heat and mass transfer processes. A computer program for
numerical simulations of water (H2O) and sulfuric acid (H2SO4)
condensation in a flue gas condensing heat exchanger was
developed using MATLAB. Governing equations based on
mass and energy balances for the system were derived to
predict variables such as flue gas exit temperature, cooling
water outlet temperature, mole fraction and condensation rates
of water and sulfuric acid vapors. The equations were solved
using an iterative solution technique with calculations of heat
and mass transfer coefficients and physical properties.
Literature Review Basics and Understanding Reference Management.pptxDr Ramhari Poudyal
Three-day training on academic research focuses on analytical tools at United Technical College, supported by the University Grant Commission, Nepal. 24-26 May 2024
ACEP Magazine edition 4th launched on 05.06.2024Rahul
This document provides information about the third edition of the magazine "Sthapatya" published by the Association of Civil Engineers (Practicing) Aurangabad. It includes messages from current and past presidents of ACEP, memories and photos from past ACEP events, information on life time achievement awards given by ACEP, and a technical article on concrete maintenance, repairs and strengthening. The document highlights activities of ACEP and provides a technical educational article for members.
Integrated Intelligent Research (IIR) International Journal of Web Technology
Volume: 03 Issue: 01 June 2014 Page No. 1-4
ISSN: 2278-2389
Enhancing Effective Interoperability Between Mobile
Apps Using LCIM Model
C. Shanthi 1, Josphine 2
1 Dept of BCA & IT, Vels University, Chennai-43
2 Dept of MCA, Dr. M.G.R University, Chennai
Email: Shan_c08@yahoo.co.in, josejbr@yahoo.com
Abstract— The Levels of Conceptual Interoperability Model (LCIM) is used to develop a method and model for enhancing interoperability among mobile apps. The LCIM is used in both a descriptive and a prescriptive form, and it also provides a metric of the degree of conceptual representation that exists between interoperating systems. In its descriptive form, the LCIM is used to reduce discrepancies in rating mobile apps based on content, by suggesting a rating system based entirely on interoperability. In its prescriptive form, it provides guidance for app development, which allows producing apps with a high level of interoperability. The LCIM provides the abstract backbone for developing and implementing an interoperability framework that supports the exchange of XML-based languages used by M&S systems across the web.
Index Terms— levels of conceptual interoperability model, descriptive form, prescriptive form, mobile computing, mobile apps, interoperability rating, software engineering
I. INTRODUCTION
The LCIM model not only enhances interoperability but also provides a better way to speed up mobile apps. Mobile apps are at present rated based on content; there are no definite approaches to rate mobile apps on technical terms. Currently, many apps that are technically sound are rejected because of the content they provide, so the rating mechanism needs to change, thereby relieving developers. This paper suggests that rating mobile apps based on interoperability would get to the bottom of the problem by applying the LCIM model. Mobile devices continue to grow in the market with improvements in mobile infrastructure, device affordability, and cultural acceptance. However, the prevailing trend in mobile applications is an increasing focus on applications that are less interoperable. On analyzing the existing interoperability models, it was found that the LCIM model used in the systems engineering domain can be adapted to achieve better and more effective interoperability of mobile apps. This paper introduces a general model dealing with various levels of conceptual interoperability that goes beyond the technical reference models for interoperable solutions.

The LCIM is used in both descriptive and prescriptive forms. In the descriptive form, the LCIM can be used to describe the levels and properties of interoperability and to rate apps based on interoperability. In the prescriptive role, the model prescribes the methods and requirements that must be satisfied during the engineering phases of app development to attain a preferred level of interoperability. The Levels of Conceptual Interoperability Model was evaluated to see at what level it can support the different necessary artifacts. The LCIM started as a model to clarify the needs to be specified and to assure interoperation on different levels. Interoperation is only possible, in a meaningful way, if the technical structures are associated with the conceptual ideas. As such, the LCIM is a framework describing a spectrum of how this is technically implemented. It requires that data (entities, property concepts), how they are used (process concepts, methods), and the constraints are described by artifacts following the ideas of system architecture and modeling as known from the domain of systems engineering.
II. CONCEPTUAL MODELING AND
INTEROPERABILITY
Zeigler proposed a model for understanding a system from within another, observing system. He proposed three main properties that must be supported: perception, meta model, and mapping.

Perception
The observing system must be able to sense the behavior of the system being observed.

Meta model
The observing system must have a meta model of the observed system.

Mapping
The observed properties, processes, and constraints must be mappable onto perceptions and meta models.
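The three properties above can be expressed as a small sketch. The class and method names below are illustrative assumptions for this paper's context, not taken from Zeigler's work: an observing system perceives the behavior of an observed system, holds a meta model of it, and maps observations onto that meta model.

```python
# Minimal sketch of Zeigler's three properties: perception, meta model,
# and mapping. All names here are hypothetical illustrations.

class ObservedSystem:
    """The system being observed; exposes its behavior as events."""
    def __init__(self):
        self.events = []

    def act(self, event):
        self.events.append(event)

class ObservingSystem:
    def __init__(self):
        # Meta model: a simplified description of the observed system,
        # here just a mapping from event names to abstract states.
        self.metamodel = {"login": "session_open", "logout": "session_closed"}
        self.perceived_states = []

    def perceive(self, observed):
        """Perception: sense the behavior of the observed system."""
        return list(observed.events)

    def map_observations(self, events):
        """Mapping: map observed behavior onto the meta model."""
        self.perceived_states = [self.metamodel.get(e, "unknown") for e in events]
        return self.perceived_states

app = ObservedSystem()
app.act("login")
app.act("logout")
observer = ObservingSystem()
states = observer.map_observations(observer.perceive(app))
print(states)  # ['session_open', 'session_closed']
```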
These three properties need to be clear in order for apps to interoperate efficiently, and conceptual models are the best way to express these premises clearly. The LCIM is applied to form an idea of its potential and limitations in current simulation interoperability approaches, in particular the High Level Architecture (HLA) and Base Object Models (BOM). It emphasizes relevant, rigorous engineering methods and principles and replaces ad-hoc approaches. A conceptual model is an intangible and simplified representation of a system for some specific purpose, expressed by languages, figures, tables, or other suitable artifacts. Robinson defines a conceptual model as "a non-software specific description of the simulation model that is to be developed, describing the objectives, inputs, outputs, content, assumptions, and simplifications of the model." [1]

Following Robinson's proposal, conceptual models and conceptual modeling:
- Reduce ambiguity, incompleteness, inconsistency, and mistakes in the description of requirements;
- Facilitate communication between stakeholders in modeling and simulation processes, such as users, architects, analyzers, domain experts, modelers, and developers;
- Form the basis and starting point for successive phases (analysis, design, implementation, and application);
- Facilitate the Verification, Validation, and Accreditation (VV&A) of models and systems; and
- Promote the reusability, interoperability, and composability of simulation resources.
We now observe that conceptualization promotes composability. Composable systems are highly self-contained, which means that a system can be deployed independently. A module will work together with other components, but the dependent components are replaceable. Stateless means that components handle requests from other components independently, and the requests are not connected to any previous requests. For apps to interoperate efficiently, they must be self-contained and stateless, which can be effectively achieved through conceptual modeling. The LCIM was chosen as a suitable model for enhancing interoperability among mobile apps because it fundamentally depends on conceptual modeling and makes apps composable.
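The self-contained, stateless behavior described above can be illustrated with a small sketch; the function names and the currency example are hypothetical assumptions, not part of the paper. A stateless component answers each request purely from its own inputs, so a compliant replacement is interchangeable; the stateful counter-example shows why coupling requests to history breaks that.

```python
# Sketch of a stateless component versus a stateful one. Each request
# to the stateless function is served independently of all others.

def currency_convert(amount, rate):
    """Stateless: the result depends only on the arguments."""
    return round(amount * rate, 2)

class StatefulConverter:
    """Counter-example: remembers the last rate, so each request is
    coupled to earlier requests and the component is not stateless."""
    def __init__(self):
        self.last_rate = None

    def convert(self, amount, rate=None):
        if rate is not None:
            self.last_rate = rate
        return round(amount * self.last_rate, 2)

# The stateless call gives the same answer regardless of call order:
print(currency_convert(10, 0.9))  # 9.0

# The stateful one depends on call history and so is not replaceable
# mid-session without transferring its hidden state:
c = StatefulConverter()
c.convert(10, 0.9)
print(c.convert(10))  # reuses the remembered rate
```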
III. LEVELS OF CONCEPTUAL
INTEROPERABILITY MODEL
The LCIM was initially proposed by Tolk and Muguira [2]. After continuous evolution, its latest version is illustrated in Figure 1. The seven levels, from "no interoperability" to "conceptual interoperability", are notated L0 to L6; their implications are listed in Table 1. Bottom-up refers to moving from L0 to L6, top-down the reverse. Beyond the simulation interoperability community, the LCIM is used by scientists of various disciplines to deal with problems in their own communities, e.g. systems biologists and ontology researchers.
Figure 1. The Levels of Conceptual Interoperability Model
The Levels of Conceptual Interoperability Model (LCIM) was evaluated as to what extent it can support the different necessary artifacts. The results are based on research that tried to consolidate the different features highlighted in various publications on the LCIM into an overview and to obtain new insights. It is first used as an outline for conceptual modeling. Second, its applicability as a descriptive as well as a prescriptive model is evaluated. Finally, it is used to examine current simulation interoperability standards, in particular the High Level Architecture (HLA) and Base Object Models (BOM).
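The seven levels can be captured directly in code. The sketch below (an illustrative assumption, not from the paper) encodes L0 to L6 as an ordered enumeration, using the level names that appear in Table 3, so that the bottom-up maturity ordering can be checked programmatically.

```python
from enum import IntEnum

# The seven LCIM levels as an ordered enumeration; IntEnum makes the
# bottom-up ordering (L0 < L1 < ... < L6) directly comparable.
class LCIMLevel(IntEnum):
    NO_INTEROPERABILITY = 0   # L0
    TECHNICAL = 1             # L1
    SYNTACTIC = 2             # L2
    SEMANTIC = 3              # L3
    PRAGMATIC = 4             # L4
    DYNAMIC = 5               # L5
    CONCEPTUAL = 6            # L6

def meets(app_level, required):
    """Maturity-model check: a required level is met only if the app's
    level is at least as high, which IntEnum ordering gives for free."""
    return app_level >= required

print(meets(LCIMLevel.SEMANTIC, LCIMLevel.SYNTACTIC))   # True
print(meets(LCIMLevel.TECHNICAL, LCIMLevel.PRAGMATIC))  # False
```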
IV. INITIATING A SYSTEMS ENGINEERING APPROACH FOR APP INTEROPERATION
Systems engineering defines many principles that support the design of systems and the documentation of their functionality and interfaces in a way that ensures independently designed systems can interoperate with each other. In this section, we introduce the ideas of conceptual modeling and capture the resulting artifacts, using the LCIM as the guiding frame, in support of app interoperation.
V. DESCRIPTIVE AND PRESCRIPTIVE ROLES
Systems engineering distinguishes between explaining (descriptive) and mandating (prescriptive) models. The LCIM can serve in both functions [3]: in the descriptive role, the LCIM describes the levels and properties of interoperability existing in a given app. In the prescriptive role, the LCIM prescribes the methods and requirements that must be satisfied during the engineering of an app to achieve a desired level of interoperability. Referring to the definitions of descriptive linguistics [4] and prescriptive linguistics [16], in addition to the definitions of descriptive and prescriptive models by the Department of Defense (DoD) [5], the descriptive and prescriptive roles of the LCIM are defined and their relationship clarified.
Descriptive Role
Developers brand their applications according to the Google Play rating system, which consists of four levels: Everyone, Low maturity, Medium maturity, and High maturity. Apps get discarded because of the content they provide, even when they offer users considerable technical comfort. The descriptive form of the LCIM allows us to rate them based on interoperability instead. Following the DoD's definition of a descriptive model [5], the descriptive function of the LCIM is used to depict or analyze the ability, properties, characteristics, and levels of conceptual interoperability of an existing system or system of systems. Whereas in the prescriptive role the LCIM serves as documentation and a maturity model, the goal of the descriptive role is to describe how existing systems interoperate and what level of conceptual interoperability can be reached by the user's specific approaches, without prescription. In the descriptive role, the LCIM is used to show the gaps that need to be closed.

The characteristics of the descriptive role are:
- Real systems or systems of systems have been implemented or already exist;
- The specific technical approaches, implementations, and documentation are known;
- The LCIM is used as a maturity model: the lower levels must have been satisfied before higher levels are reached;
- The levels are mutually supportive bottom-up: the lower level is the basis of the higher, and the higher needs the implementation of the lower levels.

The process of the descriptive role is to describe the documentation, specific approach, and implementation of interoperating systems from L0 to L6 according to Table 2, and to evaluate the levels they have reached.
Table 2. Descriptive Role of LCIM
Levels | Description of interoperability at this level
L6 | Interoperating apps are completely aware of each other's information, processes, contexts, and modeling assumptions.
L5 | Interoperating apps can also reorient how information is produced and used, based on understood changes in meaning due to context changing over time.
L4 | Interoperating apps are aware of the context (system states and processes) and meaning of the information being exchanged.
L3 | Interoperating apps exchange a set of terms that they can semantically parse.
L2 | Apps have an established protocol to exchange the right forms of data in the right order, but the meaning of data elements is not recognized.
L1 | Apps can exchange data via technical connection(s).
L0 | No interoperability.
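As an illustration of the descriptive role, the level descriptions above can be turned into a small rating function. The capability flags and their mapping to levels below are hypothetical assumptions for the sketch, not part of the LCIM itself; the function returns the highest level an app reaches, honoring the maturity-model rule that a gap at a lower level blocks all higher ones.

```python
# Descriptive-role sketch: rate an app by walking the LCIM levels
# bottom-up and stopping at the first unmet capability. The capability
# names are illustrative assumptions, not defined by the LCIM.

LEVEL_CHECKS = [
    ("can_connect", 1),              # L1: technical connection exists
    ("shares_protocol", 2),          # L2: agreed data formats / protocol
    ("shares_vocabulary", 3),        # L3: common, parseable terms
    ("shares_context", 4),           # L4: context of exchanged data known
    ("adapts_to_context", 5),        # L5: adapts as context changes
    ("shares_conceptual_model", 6),  # L6: shared conceptual model
]

def rate_app(capabilities):
    """Return the LCIM level (0-6) of an app described by a set of
    capability flags; lower levels must hold for higher ones to count."""
    level = 0
    for flag, lvl in LEVEL_CHECKS:
        if flag in capabilities:
            level = lvl
        else:
            break  # maturity model: a gap blocks all higher levels
    return level

print(rate_app({"can_connect", "shares_protocol"}))  # 2
print(rate_app({"shares_protocol"}))                 # 0: no L1 link
```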
Prescriptive Role
Mobile apps need to be modeled and implemented in a systematic way so as to boost interoperation among them. The prescriptive form of the LCIM guides us to build an app in a structured way, which allows apps to reach a high level of interoperability. It also prescribes the approaches and requirements that must be satisfied to accommodate a target degree of conceptual representation between systems.

The characteristics of the prescriptive role are the following:
- Real systems or systems of systems do not yet exist in the phase of theoretical modeling;
- As an interoperability guidance model, the LCIM is used to suggest the required conditions and potential engineering approaches to reach a certain interoperability level;
- Top-down mappings are required from formulation to implementation. If problems are dealt with at the conceptual level without any real systems, the lower levels are unreachable until real systems are implemented;
- The levels are mutually supportive bottom-up, and mapped and refined top-down. Higher levels need the implementation of the lower, so there is much work to do in the mapping from the higher to the lower;
- Reaching the higher level alone is not adequate; full interoperability across all levels is needed, in particular for the prescriptive role of the LCIM;
- As techniques advance, the approaches suggested by the prescriptive role are always changing and evolving, and have only relative stability within a certain period.

The contents of the prescriptive role at each level are illustrated in Table 3, in which Ontology, the Unified Modeling Language (UML), Model Driven Architecture (MDA), and the Discrete Event System Specification (DEVS) are the important approaches for developing conceptual interoperability and composability. The prescriptive role of the LCIM is used to facilitate the transformation from conceptual modeling to systems implementation, focusing on the interoperation aspects. The LCIM is a guide, or checklist, for how to reach the target; in this case, it builds logically top-down.
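The checklist reading of the prescriptive role can be sketched as a small generator: given a target level, it lists every requirement from L1 up to that level, reflecting the rule that higher levels presuppose the lower ones. The requirement strings are condensed from Table 3; the function itself is an illustrative assumption.

```python
# Prescriptive-role sketch: for a target LCIM level, emit the full
# bottom-up checklist of requirements, since reaching a level
# presupposes satisfying every level below it (condensed from Table 3).

REQUIREMENTS = {
    1: "Technical: capability to produce/consume data with external systems",
    2: "Syntactic: agreed-to protocol (e.g. XML, SOAP)",
    3: "Semantic: agreed set of terms (e.g. common reference model)",
    4: "Pragmatic: shared meaning of terms and anticipated context",
    5: "Dynamic: produce/consume definitions of meaning and context",
    6: "Conceptual: shared understanding of the conceptual model",
}

def checklist(target_level):
    """All requirements that must be satisfied, bottom-up,
    to reach the given target level (1-6)."""
    return [REQUIREMENTS[lvl] for lvl in range(1, target_level + 1)]

for item in checklist(3):
    print(item)
# Prints the L1, L2, and L3 requirements in bottom-up order.
```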
Comparison and Relationship between Descriptive and Prescriptive Roles
The descriptive and prescriptive roles apply the LCIM from different viewpoints. To reach semantically lossless interoperation, all levels are needed in both cases. Description and prescription have differences and similarities, as illustrated in Table 4. Their combination can facilitate better solutions to interoperation and composition problems, and there are two kinds of combination. In the first, prescription is used initially, with an analysis of the interoperation objectives, costs, and conditions; the descriptive role is then used to estimate the interoperability of that process. The iterative cycle of prescription, description, prescription again, and description again can be used to reach the solution. The other kind of combination is reuse-based: some systems or subsystems already exist or have been implemented and need to be integrated to reach a certain interoperation goal. Here the iterative cycle of description and evaluation, prescription, objective execution, description, and evaluation can be used. In both cases, the LCIM can be seen as the two sides of a coin: one is what exists or has been done, the other is what needs to be done. The LCIM is a good theory or model for evaluating levels of interoperability.
VI. CONCLUSION
This paper has described a framework that merges the various specification schemes and applies a methodology to define the structural variances between components. Mobile apps face many challenges in their interoperation, and limited resources significantly impede the improvement of interoperability between mobile apps. With the LCIM, apps can be rated based on interoperability. Apps from different vendors can interoperate efficiently and effectively when the LCIM is followed during implementation. When vendors permit users to interoperate apps with third-party modules, users can manage their apps more efficiently instead of downloading many apps. This helps users select apps from different vendors rather than sticking to the apps of a single vendor.
REFERENCES
[1] S. Robinson, "Conceptual modelling for simulation part I: Definition and requirements," Journal of the Operational Research Society, 2008, 59:278-290.
[2] A. Tolk and J. A. Muguira, "The Levels of Conceptual Interoperability Model (LCIM)," Proceedings IEEE Fall Simulation Interoperability Workshop, IEEE CS Press, 2003.
[3] http://kiwigis.blogspot.in/2011/07/app-interoperability-conundrum.html; http://blogs.sybase.com/carolynfitton/2011/10/application-interoperability-and-the-mobile-enterprise.
[4] L. Corral, A. Sillitti, and G. Succi, "Preparing Mobile Software Development Processes to Meet Mission-Critical Requirements."
[5] F. T. Balagtas-Fernandez, "Model-Driven Development of Mobile Applications."
[6] R. T. Khalil and A. F. Khalifeh, "System Design and Software Engineering Challenges in Building an Android Application: a Case Study."
[7] R. K. Balan, J. P. Sousa, and M. Satyanarayanan, "Meeting the software engineering challenges of adaptive mobile applications."
[8] A. Tolk and J. A. Muguira, "The levels of conceptual interoperability model," Fall Simulation Interoperability Workshop, Orlando, Florida: Simulation Interoperability Standards Organization, 2003.
[9] A. Tolk, C. Turnitsa, and S. Diallo, "Implied ontological representation within the levels of conceptual interoperability model," Intelligent Decision Technologies, 2008, 2:3-19.
[10] Wikipedia, "Descriptive linguistics," retrieved 2008-07-29. http://en.wikipedia.org/wiki/Descriptive_linguistics.
[11] "DoD modeling and simulation (M&S) glossary," DoD 5000.59-M, United States: Defense Modeling and Simulation Office, 1998. https://www.dmso.mil.
[12] R. Hema, C. Shanthi, R. Madhavarajan, and A. Muthukumaravel, "Enhancing Interoperability Among Mobile Apps Using LCIM Model," IJETAE (ISSN 2250-2459), Volume 3, Issue 5, May 2013. www.ijetae.com.
Table 3. Prescriptive Role of LCIM

Levels | Prescription of requirements to reach this level | Common reference engineering approaches
L6 (Conceptual) | A shared understanding of the conceptual model of a system (exposing its information, processes, states, and operations). | DoDAF; Military Mission to Means Framework; Platform Independent Models of the Model Driven Architecture; SysML
L5 (Dynamic) | The means of producing and consuming the definitions of meaning and context is required. | Ontology for Services; UML artifacts; DEVS; complete UML; BOM
L4 (Pragmatic) | A method for sharing meaning of terms and methods for anticipating context are required. | Taxonomies; Ontology; UML artifacts, in particular sequence diagrams; DEVS; OWL; MDA
L3 (Semantic) | Agreement between all systems on a set of terms that grammatically satisfies the syntactic level solution requirements is required. | Common Reference Model; Dictionaries; Glossaries; Protocol Data Units; RPR FOM
L2 (Syntactic) | An agreed-to protocol that can be supported by the technical level solution is required. | XML; HLA OMT; Interface Description Language; CORBA; SOAP
L1 (Technical) | Capability to produce and consume data in exchange with systems external to self is required. | Network connection standards such as HTTP, TCP/IP, etc.
L0 (No) | NA | NA
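Table 3 names XML among the L2 (syntactic) approaches. As a closing illustration, the sketch below shows a purely syntactic exchange: two apps agree on an XML structure, so the receiver can parse every field, while the meaning of those fields (the L3, semantic level) remains outside the agreement. The element names and values are invented for the example.

```python
import xml.etree.ElementTree as ET

# L2 (syntactic) sketch: sender and receiver share only an agreed XML
# structure. The receiver can parse every field, but nothing at this
# level tells it what "value" means; that is the L3 (semantic) step.

message = "<reading><sensor>t1</sensor><value>21.5</value></reading>"

root = ET.fromstring(message)  # syntactic agreement: the form parses
fields = {child.tag: child.text for child in root}
print(fields)  # {'sensor': 't1', 'value': '21.5'}

# Semantic gap: is 21.5 Celsius, Fahrenheit, or a voltage? The agreed
# protocol alone cannot say; L3 requires a shared vocabulary.
```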