This paper presents the principles underlying the construction of an intelligent information system for estimating the results of the dynamic interaction of orbital systems with space debris. It describes a knowledge-database model based on these principles, which synthesizes theoretical and practical information in the field of estimating the high-speed interaction of objects.
Construction Management (CM) has to deal with a variety of uncertainties related to time, cost, quality, and safety, to name a few. Such uncertainties make the entire construction process highly unpredictable. It therefore falls under the purview of artificial neural networks (ANNs), which can effectively interpret hazy information to arrive at meaningful conclusions. This paper reviews the application of ANNs in construction activities related to the prediction of costs, risk and safety, tender bids, and labor and equipment productivity. The review suggests that ANNs have been highly beneficial in correctly interpreting inadequate input information. Most investigators used the feed-forward back-propagation type of network; however, where a single ANN architecture proved insufficient, hybrid modeling in association with other machine learning tools, such as genetic programming and support vector machines, proved useful. It is also clear that the authenticity of the data and the experience of the modeler are important in obtaining good results.
Funding agencies such as the U.S. National Science Foundation (NSF), the U.S. National Institutes of Health (NIH), and the Transportation Research Board (TRB) of The National Academies make their online grant databases publicly available, documenting a variety of information on grants funded over the past few decades. In this paper, based on a quantitative analysis of the TRB's Research In Progress (RIP) online database, we explore the feasibility of automatically estimating the appropriate funding level given the textual description of a transportation research project. We use statistical Text Mining (TM) and Machine Learning (ML) technologies to build this model from more than 14,000 records of TRB RIP research grants. Several Natural Language Processing (NLP) based text representation models, such as Latent Dirichlet Allocation (LDA), Latent Semantic Indexing (LSI), and Doc2Vec, are used to vectorize the project descriptions and generate semantic vectors. Each of these representations is then used to train supervised regression models such as Random Forest (RF) regression. Of the three latent-feature generation models, we found that LDA gives the lowest Mean Absolute Error (MAE) when using 300 feature dimensions and an RF regression model. However, the correlation coefficients indicate that it is not very feasible to accurately predict the funding level directly from the unstructured project abstract, given the large variations in source agencies, subject areas, and funding levels. By using separate prediction models for different types of funding agencies, funding levels were better correlated with the project abstract.
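The idea of mapping abstract text to a funding estimate can be illustrated with a deliberately simplified sketch: bag-of-words vectors and a nearest-neighbor average stand in for the paper's LDA/LSI/Doc2Vec features and Random Forest regression, and all records below are invented.

```python
# Illustrative sketch only, NOT the paper's pipeline: abstracts are turned
# into bag-of-words vectors, and a funding level for a new abstract is the
# average over its k most similar training abstracts (cosine similarity).
import math
from collections import Counter

def vectorize(text, vocab):
    counts = Counter(text.lower().split())
    return [counts[w] for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def predict_funding(abstract, training, k=2):
    vocab = sorted({w for text, _ in training for w in text.lower().split()})
    target = vectorize(abstract, vocab)
    scored = sorted(training,
                    key=lambda item: cosine(vectorize(item[0], vocab), target),
                    reverse=True)
    nearest = scored[:k]
    return sum(level for _, level in nearest) / len(nearest)

# Hypothetical training records: (abstract text, funding level in USD).
training = [
    ("bridge deck corrosion sensing", 300_000),
    ("bridge girder fatigue monitoring", 340_000),
    ("transit ridership survey methods", 90_000),
    ("transit fare survey analysis", 110_000),
]
estimate = predict_funding("bridge corrosion monitoring study", training)
```

The query lands near the two bridge projects, so the estimate averages their funding levels; a real system would replace the bag-of-words step with the semantic vectors described above.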
The final cost of public school building projects, like that of other construction projects, is unknown to the owner until account closure. Artificial Neural Networks (ANN) are used in an attempt to predict, before work starts, the final cost of two-storey (12-classroom) school projects awarded under the lowest-bid system. A database of 65 school project records completed between 2007 and 2012 is used to develop and verify the ANN model. Based on expert opinion, nine of eleven parameters are considered to have the most significant impact on the magnitude of the final cost, and they are therefore used as model inputs, while the model output is the final cost (FC). These parameters are: accepted bid price, average bid price, estimated cost, contractor rank, supervising engineer experience, project location, number of bidders, year of contracting, and contractual duration. It was found that the ANN can predict the final cost of school projects with a very good degree of accuracy, with a coefficient of correlation (R) of 91% and an average accuracy percentage of 99.98%.
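The core mechanics of such a cost model can be sketched with the simplest possible special case of a feed-forward network: a single linear neuron trained by gradient descent to map one normalized input (accepted bid price) to a final cost. The data below are synthetic, not the 65 real school-project records, and a real model would use all nine inputs and hidden layers.

```python
# Minimal stand-in for the ANN cost model: one linear neuron trained by
# gradient descent on mean squared error. Inputs and outputs are assumed
# normalized to roughly [0, 1], a common preprocessing step for ANN models.
def train_neuron(xs, ys, lr=0.1, epochs=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of mean squared error with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

# Synthetic records: normalized bid price -> normalized final cost,
# following final_cost = 1.1 * bid (final costs slightly exceed the bid).
bids = [0.2, 0.4, 0.5, 0.7, 0.9]
finals = [1.1 * x for x in bids]
w, b = train_neuron(bids, finals)
```

After training, the learned weight recovers the underlying 1.1 markup; the full nine-input network in the paper learns an analogous, but non-linear, mapping.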
Comparison of Cost Estimation Methods using Hybrid Artificial Intelligence on...IJERA Editor
Cost estimating at the schematic design stage, as the basis of project evaluation, engineering design, and cost management, plays an important role in project decisions under a limited definition of scope, constraints on available information and time, and the presence of uncertainties. The purpose of this study is to compare the performance of cost estimation models built with two different hybrid artificial intelligence approaches: regression analysis with an adaptive neuro-fuzzy inference system (RANFIS) and case-based reasoning with a genetic algorithm (CBR-GA). The models were developed from the same 50 low-cost apartment project datasets in Indonesia. Tested on another five data points, the models were shown to perform very well in terms of accuracy. The CBR-GA model was found to be the best performer, but suffered from the disadvantage of needing 15 cost drivers, compared with only 4 required by RANFIS for on-par performance.
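The retrieval step at the heart of case-based reasoning can be sketched as follows: a new project's cost is estimated from the most similar stored case, measured by a weighted distance over a few cost drivers. The drivers, weights, and figures below are invented; in CBR-GA the weights would be tuned by a genetic algorithm rather than fixed by hand.

```python
# Illustrative CBR retrieval step (hypothetical drivers and cases): find the
# stored project case closest to the query under a weighted Euclidean
# distance over cost drivers, and reuse its cost as the estimate.
import math

def retrieve(query, cases, weights):
    def d(case):
        return math.sqrt(sum(w * (case["drivers"][k] - query[k]) ** 2
                             for k, w in weights.items()))
    return min(cases, key=d)

cases = [
    {"drivers": {"floor_area": 1200, "storeys": 4}, "cost": 2.1e6},
    {"drivers": {"floor_area": 2500, "storeys": 8}, "cost": 4.8e6},
    {"drivers": {"floor_area": 1800, "storeys": 5}, "cost": 3.0e6},
]
weights = {"floor_area": 1.0, "storeys": 100.0}
best = retrieve({"floor_area": 1900, "storeys": 5}, cases, weights)
```

The query is closest to the third case, so its cost is reused; a full CBR system would then adapt that cost to the differences between the query and the retrieved case.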
Forecasting number of vulnerabilities using long short-term neural memory net...IJECEIAES
Cyber-attacks are launched by exploiting existing vulnerabilities in software, hardware, systems, and/or networks. Machine learning algorithms can be used to forecast the number of post-release vulnerabilities. Traditional neural networks work as a black box, so it is unclear how past data points are used to infer subsequent ones. The long short-term memory network (LSTM), a variant of the recurrent neural network, addresses this limitation by introducing recurrent loops that retain past data points and use them in future calculations. Building on a previous finding, we further improve the results by developing a time-series-based sequential model using a long short-term memory neural network. Specifically, this study developed a supervised machine learning model based on non-linear sequential time-series forecasting with an LSTM neural network to predict the number of vulnerabilities for the three vendors with the highest number of vulnerabilities published in the National Vulnerability Database (NVD), namely Microsoft, IBM, and Oracle. Our proposed model outperforms the existing models, with a prediction root mean squared error (RMSE) as low as 0.072.
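The gating mechanism that lets an LSTM retain information across time steps can be shown with a minimal scalar cell. The weights below are small fixed constants chosen for illustration, not trained values, and a real LSTM uses separate weight matrices per gate.

```python
# A toy LSTM cell in pure Python, illustrating the forget/input/output
# gates and the cell state that carries information across time steps.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w=0.5, u=0.3, bias=0.0):
    # All four gates share the same scalar weights in this toy version.
    f = sigmoid(w * x + u * h_prev + bias)      # forget gate
    i = sigmoid(w * x + u * h_prev + bias)      # input gate
    o = sigmoid(w * x + u * h_prev + bias)      # output gate
    g = math.tanh(w * x + u * h_prev + bias)    # candidate cell state
    c = f * c_prev + i * g                      # new cell state
    h = o * math.tanh(c)                        # new hidden state
    return h, c

# Run a short sequence, e.g. monthly vulnerability counts scaled to [0, 1].
h, c = 0.0, 0.0
for x in [0.1, 0.4, 0.7, 0.9]:
    h, c = lstm_step(x, h, c)
```

Because the cell state `c` is updated additively through the forget gate rather than rewritten at every step, information from early inputs can persist, which is the property the forecasting model above exploits.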
Data Science Solutions by Materials Scientists: The Early Case StudiesTony Fast
Improvements in algorithms, technology, and computation are directly impacting the landscape of information use in materials science. The 3 V’s of Big Data (volume, velocity, and variety) are becoming ever more apparent within all sectors of the field. Novel approaches will be required to confront the emerging data deluge and extract the richest knowledge from simulated and empirical information in complex evolving 3-D spaces. Microstructure Informatics (μInformatics) is an emerging suite of signal processing techniques, advanced statistical tools, and data science methods tailored specifically for this new frontier. μInformatics curates and transforms large collections of materials science information using efficient workflows to extract knowledge of bi-directional structure-property/processing connections for most material classes.
In this talk, a few early case studies in data-driven methods for solving materials science problems will be presented. Emerging spatial statistics tools will be explored that enable an objective comparison of static and evolving 3-D material volumes from molecular dynamics simulation, micro-CT, and Scanning Electron Microscopy. These statistics also provide a foundation for improved bottom-up homogenization relationships in fuel cell materials. Lastly, applications of the Materials Knowledge System, a data-driven meta-model for creating top-down localization relationships, will be discussed for phase field and finite element model information.
DEVELOPMENT OF A CONCEPTUAL MODEL OF ADAPTIVE ACCESS RIGHTS MANAGEMENT WITH U...IAEME Publication
The paper describes a conceptual model of adaptive control of cyber protection of an informatization object (IO). Petri nets were used as the mathematical apparatus for solving the problem of adaptive control of user access rights. A simulation model is proposed, and the simulation is performed in the PIPE v4.3.0 package. The possibility of automating the procedures for adjusting the user profile to minimize or neutralize cyber threats in informatization objects is shown. A model of the distribution of user tasks in the computer networks of the IO is proposed. Unlike existing models, it is based on the mathematical apparatus of Petri nets and contains variables that reduce the size of the state space. An access control method (ACM) is extended; the additions address the reconciliation of the access rights requested by a task with the requirements of the security policy, and the degree of consistency between tasks and access to IO nodes. The adjustment of rules and security metrics for new or redistributed tasks is described in Petri net notation.
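The Petri net firing rule on which such models rest is simple enough to state in a few lines of code: a transition is enabled when every input place holds at least as many tokens as its arc weight, and firing consumes input tokens and produces output tokens. The two-place request/grant net below is a made-up example, not the authors' access-control model.

```python
# A minimal Petri net interpreter: markings are token counts per place,
# and a transition maps input-place tokens to output-place tokens.
def enabled(marking, transition):
    return all(marking[p] >= w for p, w in transition["in"].items())

def fire(marking, transition):
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, w in transition["in"].items():
        m[p] -= w          # consume tokens from input places
    for p, w in transition["out"].items():
        m[p] = m.get(p, 0) + w   # produce tokens in output places
    return m

# Hypothetical places: pending access requests and granted rights.
marking = {"pending": 2, "granted": 0}
grant = {"in": {"pending": 1}, "out": {"granted": 1}}
marking = fire(marking, grant)
```

Reachable markings of such a net form the state space whose size the authors' extra variables aim to reduce.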
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...Sarvesh Kumar
This work applies differential equations (DE) and the computational technique of a genetic algorithm with a neural network (GANN), implemented in C#. Diagrams and flow charts are the main vehicle for an accessible treatment of these two concepts, together with an indication of their present and future applications; this, along with the computational approaches in C#, is the new initiative taken in this paper. Observations made during the operation and development of the above algorithm in C#, under the given boundary-value conditions of the DE, are also noted. Fitness-function and genetic operations were carried out for the behavioural transmission of chromosomes.
Activity Context Modeling in Context-AwareEditor IJCATR
The explosion of mobile devices has fuelled the advancement of pervasive computing to provide personal assistance in this information-driven world. Pervasive computing takes advantage of context-aware computing to track, use, and adapt to contextual information. The context that has attracted the attention of many researchers is the activity context. Six major techniques are used to model activity context: key-value, logic-based, ontology-based, object-oriented, mark-up schemes, and graphical. This paper analyses these techniques in detail, describing how each is implemented and reviewing its pros and cons. The paper ends with a hybrid modeling method that fits a heterogeneous environment while considering the entire modeling process through the data acquisition and utilization stages. The modeling stages of activity context are data sensation, data abstraction, and reasoning and planning. The work revealed that mark-up schemes and object-oriented techniques are best applicable at the data sensation stage. Key-value and object-oriented techniques fairly support the data abstraction stage, whereas the logic-based and ontology-based techniques are ideal for the reasoning and planning stage. In a distributed system, mark-up schemes are very useful for data communication over a network, and the graphical technique should be used when saving context data into a database.
Data mining is used to manage the large amounts of information stored in data warehouses and databases and to discover the required knowledge. Many data mining techniques have been proposed, such as association rules, decision trees, neural networks, and clustering, and the area has attracted attention for many years. Among the available data mining strategies, clustering of the dataset is one of the most effective: it groups the dataset into a number of clusters based on predefined criteria and can reveal relationships between the different attributes of the data.
In the k-means clustering algorithm, features are selected based on their relevance for predicting the data, and the Euclidean distance between each cluster centroid and the data objects outside the cluster is computed when clustering the data points. In this work, the authors enhance the Euclidean distance formula to increase cluster quality.
The problem of accuracy, and of redundant dissimilar points within clusters, remains in the improved k-means, so a new enhanced approach is proposed that uses a similarity function to check the similarity level of a point before including it in a cluster.
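One way to read the proposed enhancement is standard k-means with an extra similarity check that keeps a point out of every cluster when it is too far from the nearest centroid. The sketch below implements that reading; the threshold and data are illustrative choices, not taken from the paper.

```python
# k-means with a similarity (distance) threshold: points farther than the
# threshold from every centroid are treated as outliers instead of being
# forced into the nearest cluster.
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans_with_threshold(points, centroids, threshold, iters=10):
    clusters, outliers = [], []
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        outliers = []
        for p in points:
            d, idx = min((dist(p, c), i) for i, c in enumerate(centroids))
            if d <= threshold:          # similarity check before inclusion
                clusters[idx].append(p)
            else:
                outliers.append(p)
        centroids = [
            tuple(sum(coords) / len(cluster) for coords in zip(*cluster))
            if cluster else centroids[i]
            for i, cluster in enumerate(clusters)
        ]
    return centroids, clusters, outliers

points = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9), (50, 50)]
centroids, clusters, outliers = kmeans_with_threshold(
    points, centroids=[(0, 0), (10, 10)], threshold=5.0)
```

The two tight groups form clusters while the distant point is excluded, which is the behavior the similarity check is meant to provide.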
Predictive Data Mining with Normalized Adaptive Training Method for Neural Ne...IJERDJOURNAL
Abstract: Predictive data mining is an upcoming and fast-growing field that offers a competitive edge to organizations. In recent decades, researchers have developed new techniques and intelligent algorithms for predictive data mining. In this paper, we propose a novel training algorithm for optimizing neural networks for prediction purposes and use it to develop prediction models. Models developed in the MATLAB Neural Network Toolbox have been tested on insurance datasets taken from a live data warehouse. A comparative study of the proposed algorithm with other popular first- and second-order algorithms is presented to judge the predictive accuracy of the suggested technique. Various graphs are presented to analyse the convergence behaviour of the different algorithms towards the point of minimum error.
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...Ricardo Viana Vargas
This paper aims to discuss the use of Artificial Neural Networks (ANN) to model aspects of the project budget for which traditional algorithms and formulas are not available or not easy to apply. Neural networks use a process analogous to the human brain: a training phase takes place with existing data, after which the trained neural network becomes an “expert” in the category of information it has been given to analyse. This “expert” can then provide projections for new situations based on adaptive learning (STERGIOU & CIGANOS, 1996).
The article also presents a fictitious example of the use of neural networks to determine the cost of project management activities based on the complexity, location, budget, duration and number of relevant stakeholders. The example is based on data from 500 projects and is used to predict the project management cost of a given project.
Modelling and simulation methodology for unidirectional composite laminates i...Olben Falcó Salcines
A reliable virtual testing framework for unidirectionally laminated composites is presented that allows the
prediction of failure loads and modes of general in-plane coupons with great realism. This is a toolset based on
finite element analysis that relies on a cohesive-frictional constitutive formulation coupled with the kinematics
of penalty-based contact surfaces, on sophisticated three-dimensional continuum damage models, and overall on
a modelling approach based on mesh structuring and crack-band erosion to capture the appropriate crack paths
in unidirectional fibre reinforced plies. An extensive and rigorous validation of the overall approach is presented,
demonstrating that the virtual testing laboratory is robust and can be reliably used for composite materials
screening, design and certification.
Geo-spatial information for managing ambiguity...IAESIJEECS
An inherent challenge arising in any dataset containing information about space and/or time is uncertainty due to various sources of imprecision. Incorporating the effect of this uncertainty is fundamental when assessing the reliability (confidence) of any query result over the underlying data. To deal with uncertainty, solutions have been proposed independently in the geo-science and data-science research communities. This interdisciplinary tutorial bridges the gap between the two communities by giving a comprehensive overview of the different challenges involved in managing uncertain geo-spatial data, by surveying solutions from both research communities, and by identifying similarities, synergies, and open research problems.
Spiking ink drop spread clustering algorithm and its memristor crossbar conce...IJECEIAES
In this study, a new clustering algorithm that combines neural network and fuzzy logic properties is proposed, based on spiking neural network and ink drop spread (IDS) concepts. The proposed structure is a single-layer artificial neural network with leaky integrate-and-fire (LIF) neurons, and it implements the IDS algorithm as a fuzzy concept. Each training datum causes the corresponding input neuron and its neighboring neurons to fire. A synchronous time-coding algorithm is used to manage the firing times of input and output neurons. For a given input, one or several output neurons of the network fire; the network's confidence in an output is defined as the relative delay of its firing time with respect to the synchronous pulse. A memristor crossbar-based hardware design is introduced for implementing the proposed algorithm. The simulation results corroborate that the proposed algorithm can be used as a neuro-fuzzy clustering and vector quantization algorithm.
Similar to Development Principles of Knowledge Database of Intelligent System for Estimation of Dynamical Interaction of Orbital Systems with Space Debris
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...Ricardo Viana Vargas
This paper aims to discuss the use of Artificial Neural Networks (ANNs) to model aspects of the project budget where traditional algorithms and formulas are not available or not easy to apply. Neural networks use a process analogous to the human brain: a training phase takes place with existing data, and subsequently the trained neural network becomes an “expert” in the category of information it has been given to analyse. This “expert” can then be used to provide projections for new situations based on adaptive learning (STERGIOU & SIGANOS, 1996).
The article also presents a fictitious example of the use of neural networks to determine the cost of project management activities based on the complexity, location, budget, duration and number of relevant stakeholders. The example is based on data from 500 projects and is used to predict the project management cost of a given project.
Modelling and simulation methodology for unidirectional composite laminates i...Olben Falcó Salcines
A reliable virtual testing framework for unidirectionally laminated composites is presented that allows the prediction of failure loads and modes of general in-plane coupons with great realism. It is a toolset based on finite element analysis that relies on a cohesive-frictional constitutive formulation coupled with the kinematics of penalty-based contact surfaces, on sophisticated three-dimensional continuum damage models, and, overall, on a modelling approach based on mesh structuring and crack-band erosion to capture the appropriate crack paths in unidirectional fibre-reinforced plies. An extensive and rigorous validation of the overall approach is presented, demonstrating that the virtual testing laboratory is robust and can be reliably used for composite materials screening, design and certification.
Geo-spatial information for managing ambiguityIAESIJEECS
An inherent challenge arising in any dataset containing information about space and time is uncertainty due to different sources of imprecision. Incorporating the effect of this uncertainty is fundamental when assessing the reliability (confidence) of any query result derived from the underlying data. To deal with uncertainty, solutions have been proposed independently in the geo-science and the data science research communities. This interdisciplinary tutorial bridges the gap between the two communities by giving a comprehensive overview of the different challenges involved in managing uncertain geo-spatial data, by surveying solutions from both research communities, and by identifying similarities, synergies and open research problems.
Spiking ink drop spread clustering algorithm and its memristor crossbar conce...IJECEIAES
In this study, a new clustering algorithm that combines neural network and fuzzy logic properties is proposed, based on spiking neural network and ink drop spread (IDS) concepts. The proposed structure is a single-layer artificial neural network with leaky integrate-and-fire (LIF) neurons, and it implements the IDS algorithm as a fuzzy concept. Each training data point fires the corresponding input neuron and its neighboring neurons. A synchronous time-coding algorithm manages the firing times of the input and output neurons. For a given input, one or several output neurons of the network fire; the network's confidence in an output is defined as the relative delay of its firing time with respect to the synchronous pulse. A memristor crossbar-based hardware design is introduced for implementing the proposed algorithm. The simulation results confirm that the proposed algorithm can be used as a neuro-fuzzy clustering and vector quantization algorithm.
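The LIF neuron at the core of such a network can be sketched in a few lines (a generic discrete-time model; the paper's parameters, neighborhood firing, and time-coding scheme are not reproduced here):

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    # Leaky integrate-and-fire: the membrane potential decays by `leak`
    # each time step, accumulates the input current, and emits a spike
    # (then resets to zero) when it reaches the threshold.
    v, spikes = 0.0, []
    for t, current in enumerate(inputs):
        v = v * leak + current
        if v >= threshold:
            spikes.append(t)   # record the firing time
            v = 0.0            # reset after the spike
    return spikes
```

A constant sub-threshold input needs several steps to integrate up to a spike, whereas a supra-threshold input fires immediately; the relative firing delay is exactly what the confidence measure above exploits.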
Identification of Delamination Size and Location of Composite Laminate from Time Domain Data of Magnetostrictive Sensor and Actuator Using Artificial Neural Network
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academicians, Field Engineers, Scholars and Students of related fields of Engineering and Technology.
COMPARATIVE PERFORMANCE ANALYSIS OF RNSC AND MCL ALGORITHMS ON POWER-LAW DIST...acijjournal
Cluster analysis of graph-related problems is an important issue nowadays. Different types of graph clustering techniques have appeared in the field, but most of them are vulnerable in terms of effectiveness and fragmentation of output in real-world applications across diverse systems. In this paper, we provide a comparative behavioural analysis of the RNSC (Restricted Neighbourhood Search Clustering) and MCL (Markov Clustering) algorithms on power-law distribution graphs. RNSC is a graph clustering technique using stochastic local search; it tries to achieve optimal-cost clustering by assigning cost functions to the set of clusterings of a graph. The algorithm was implemented by A. D. King only for undirected and unweighted random graphs. MCL, another popular graph clustering algorithm, is based on a stochastic flow simulation model for weighted graphs. There are plentiful applications of power-law, or scale-free, graphs in nature and society. Scale-free topology is stochastic, i.e. nodes are connected in a random manner. Complex network topologies such as the World Wide Web, the web of human sexual contacts, or the chemical network of a cell follow a power-law distribution to represent different real-life systems. This paper uses real large-scale power-law distribution graphs to conduct a performance analysis of RNSC behaviour compared with the Markov clustering (MCL) algorithm. Extensive experimental results on several synthetic and real power-law distribution datasets reveal the effectiveness of our approach to comparative performance measurement of these algorithms on the basis of clustering cost, cluster size, modularity index of the clustering results and normalized mutual information (NMI).
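The NMI measure used for comparing clustering results can be computed directly from two label assignments; a compact sketch of the standard definition, I(A;B) normalized by the geometric mean of the entropies:

```python
import math
from collections import Counter

def nmi(labels_a, labels_b):
    # Normalized mutual information between two clusterings:
    # I(A; B) / sqrt(H(A) * H(B)), ranging from 0 to 1.
    n = len(labels_a)
    ca, cb = Counter(labels_a), Counter(labels_b)
    cab = Counter(zip(labels_a, labels_b))
    mi = sum(
        (nij / n) * math.log((nij * n) / (ca[a] * cb[b]))
        for (a, b), nij in cab.items()
    )
    ha = -sum((c / n) * math.log(c / n) for c in ca.values())
    hb = -sum((c / n) * math.log(c / n) for c in cb.values())
    return mi / math.sqrt(ha * hb) if ha and hb else 0.0
```

Two clusterings that agree up to a relabelling score 1.0, while statistically independent ones score 0.0, which is why NMI is a convenient label-invariant comparison metric here.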
Recent changes to the existing power grid are expected to influence the way energy is provided and consumed by customers. Advanced Metering Infrastructure (AMI) is a tool for incorporating these changes to modernize the electricity grid. Growing energy needs are forcing government agencies and utility companies to move towards AMI systems as part of larger smart grid initiatives. The smart grid promises to enable a more reliable, sustainable, and efficient power grid by taking advantage of information and communication technologies. However, this information-based power grid can reveal sensitive private information from the user's perspective due to its ability to gather highly granular power consumption data, which has limited consumer acceptance and the proliferation of the smart grid. Hence, it is crucial to design a mechanism to prevent the leakage of such sensitive consumer usage information in the smart grid. Among the different solutions for preserving consumer privacy in Smart Grid Networks (SGNs), private data aggregation techniques have received tremendous focus from security researchers. Existing privacy-preserving aggregation mechanisms in SGNs utilize cryptographic techniques, specifically the homomorphic properties of public-key cryptosystems. Such homomorphic approaches are bandwidth-intensive (due to the large output blocks they generate) and, in most cases, computationally complex. In this paper, we present a novel and efficient CDMA-based approach to achieve privacy-preserving aggregation in SGNs by utilizing random perturbation of the power consumption data and with limited use of traditional cryptography. We evaluate and validate the efficiency and performance of our proposed privacy-preserving data aggregation scheme through extensive statistical analyses and simulations.
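As an illustration of perturbation-based aggregation in general (a simplified masking sketch, not the paper's CDMA construction), meters can add random masks that cancel only in the aggregate, so the utility learns the total without seeing individual readings:

```python
import random

def make_masks(n, rng):
    # Cancelling masks: n random values that sum to zero, so they
    # vanish when the aggregator adds all the masked reports.
    masks = [rng.uniform(-10, 10) for _ in range(n - 1)]
    masks.append(-sum(masks))
    return masks

def aggregate(readings, rng=None):
    rng = rng or random.Random(42)
    masks = make_masks(len(readings), rng)
    reports = [r + m for r, m in zip(readings, masks)]  # what each meter sends
    return sum(reports)                                 # the masks cancel here

readings = [3.2, 4.1, 2.7, 5.0]   # hypothetical per-meter consumption values
total = aggregate(readings)
```

Each individual report is perturbed and reveals little on its own, yet the sum recovers the exact aggregate, which is the property such schemes aim for without heavyweight homomorphic encryption.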
Application of Multiple Kernel Support Vector Regression for Weld Bead Geomet...IJECEIAES
Modelling and prediction of weld bead geometry is an important issue in the robotic GMAW process. The process is a highly non-linear, coupled multivariable system, and the relationship between process parameters and weld bead geometry cannot be defined by an explicit mathematical expression. Therefore, supervised learning algorithms can be useful for this purpose. The support vector machine is a very successful approach to supervised learning, and a higher degree of accuracy and generalization capability can be obtained by using the multiple kernel learning framework, a great advantage given the high prediction accuracy required for weld bead geometry. In this paper, a novel approach for modelling and prediction of weld bead geometry based on multiple kernel support vector regression is proposed, which benefits from a high degree of accuracy and generalization capability. The model can be used for proper selection of welding parameters in order to obtain a desired weld bead geometry in the robotic GMAW process.
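In its simplest fixed-weight form, multiple kernel learning replaces a single kernel with a convex combination of base kernels; a generic sketch (the RBF and polynomial kernels and the weights below are illustrative, not the paper's choices):

```python
import math

def rbf(x, y, gamma=0.5):
    # Gaussian (RBF) base kernel.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def poly(x, y, degree=2):
    # Inhomogeneous polynomial base kernel.
    return (1 + sum(a * b for a, b in zip(x, y))) ** degree

def combined_kernel(x, y, weights=(0.6, 0.4)):
    # Convex combination of base kernels; a valid kernel since
    # positive-weighted sums of kernels remain positive semidefinite.
    return weights[0] * rbf(x, y) + weights[1] * poly(x, y)

def gram(points, kernel):
    # Gram matrix an SVR solver would consume in place of a single kernel.
    return [[kernel(p, q) for q in points] for p in points]
```

An SVR trained on `gram(points, combined_kernel)` then searches a richer hypothesis space than either base kernel alone; full MKL additionally learns the weights from data.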
Multi-task learning using non-linear autoregressive models and recurrent neur...IJECEIAES
Tide level forecasting plays an important role in environmental management and development. Current tide level forecasting methods are usually built for single-task problems: a model trained on tide level data at an individual location is only used to forecast the tide level at that same location, not at other locations. This study proposes a new method for tide level prediction at multiple locations simultaneously. The method combines the nonlinear autoregressive moving average with exogenous inputs (NARMAX) model and recurrent neural networks (RNNs), and incorporates them into a multi-task learning (MTL) framework. Experiments are designed and performed to compare single-task learning (STL) and MTL with and without non-linear autoregressive models. Three RNN variants, namely long short-term memory (LSTM), gated recurrent unit (GRU) and bidirectional LSTM (BiLSTM), are employed together with the non-linear autoregressive models. A case study on tide level forecasting at many different geographical locations (5 to 11 locations) is conducted. Experimental results demonstrate that the proposed architectures outperform classical single-task prediction methods.
Analytical Modelling of Power Efficient Reliable Operation of Data Fusion in ...IJECEIAES
Although Wireless Sensor Networks (WSNs) are included in the majority of research propositions for smart city planning, they are still beset by significant issues. A closer look at the problems in WSNs shows that the energy parameter is the origin of most of the other problems in resource-constrained sensors, and it also significantly reduces the reliability of standard sensing operations in adverse environments. Therefore, this manuscript presents a novel analytical model meant to establish a good balance between energy efficiency over multi-path data forwarding and reliable operation with improved network performance. The complete process is applied during the data fusion stage to ensure data quality as well. A simulation study carried out on a benchmark test-bed of MEMSIC nodes shows that the proposed system offers good energy conservation during data fusion and also ensures more reliable operation than existing systems.
Securing Cloud Computing Through IT GovernanceITIIIndustries
Lack of alignment between information technology (IT) and the business is a problem facing many organizations. Most organizations today fundamentally depend on IT. When IT and the business are aligned in an organization, IT delivers what the business needs and the business is able to deliver what the market needs. IT has become a strategic function for most organizations, and it is imperative that IT and business are aligned. IT governance is one of the most powerful ways to achieve IT-to-business alignment. Furthermore, as the use of cloud computing for delivering IT functions becomes pervasive, organizations using cloud computing must effectively apply IT governance to it. While cloud computing presents tremendous opportunities, it comes with risks as well. Information security is one of the top risks in cloud computing. Thus, IT governance must be applied to cloud computing information security to help manage the associated risks. This study advances knowledge by extending IT governance to cloud computing and information security governance.
Information Technology in Industry(ITII) - November Issue 2018ITIIIndustries
IT in Industry publishes original research articles, review articles, and extended versions of conference papers. Articles resulting from research of a theoretical and/or practical nature performed by academics and/or industry practitioners are welcome. IT in Industry aims to become a leading IT journal with a high impact factor.
Design of an IT Capstone Subject - Cloud RoboticsITIIIndustries
This paper describes the curriculum of the three-year IT undergraduate program at La Trobe University and the faculty requirements in designing a capstone subject, followed by the ACM's recommended IT curriculum covering the five pillars of the IT discipline. Cloud robotics, a broad multidisciplinary research area requiring expertise in all five pillars together with mechatronics, is an ideal candidate to offer capstone experiences to IT students. Therefore, in this paper we propose a long-term master project in developing a cloud robotics testbed, with many capstone sub-projects spanning the five IT pillars, to meet the objectives of the capstone experience. This paper also describes the design and implementation of the testbed and proposes potential capstone projects for students with different interests.
Dimensionality Reduction and Feature Selection Methods for Script Identificat...ITIIIndustries
The goal of this research is to explore the effects of dimensionality reduction and feature selection on the problem of script identification from images of printed documents. The k-adjacent segment is ideal for this use due to its ability to capture visual patterns. We have used principal component analysis to reduce our feature matrix to a handier size that can be trained easily, and experimented with varying combinations of dimensions of the super feature set. A modular neural network approach was used to classify seven languages: Arabic, Chinese, English, Japanese, Tamil, Thai and Korean.
Image Matting via LLE/iLLE Manifold LearningITIIIndustries
Accurately extracting foreground objects, i.e. isolating the foreground in images and video, is called image matting and has wide applications in digital photography. The problem is severely ill-posed in the sense that, at each pixel, one must estimate the foreground and background colours and the so-called alpha value from only the pixel information. Most recent work in natural image matting relies on local smoothness assumptions about foreground and background colours, on which a cost function is established. In this paper, we propose an extension to the class of affinity-based matting techniques that incorporates local manifold structural information to produce a smoother matte, based on the so-called improved Locally Linear Embedding. We illustrate the new algorithm on the standard benchmark images, where very comparable results have been obtained.
Annotating Retina Fundus Images for Teaching and Learning Diabetic Retinopath...ITIIIndustries
With the improvement of the IT industry, more and more computer software is being introduced into teaching and learning; in this paper, we discuss the development process of one such application. Diabetic Retinopathy is a common complication for diabetic patients and may cause sight loss if not treated early. There are several stages of this disease, and fundus imagery is required to identify the stage and severity. Due to the lack of a proper dataset of fundus images with proper annotation, it is very difficult to perform research on this topic. Moreover, medical students often face difficulty identifying the disease later in their practice, as they may not have seen samples of all of the stages of Diabetic Retinopathy. To mitigate the problem, we have collected fundus images from different geographic areas of Bangladesh and designed annotation software to store information about the patient, the infection level, and the locations of infections in the images. Since it is sometimes difficult to select all the appropriate pixels of an infected region, we have introduced a K-nearest-neighbor (KNN) based technique to accurately select the region of interest (ROI). Once an expert (ophthalmologist) has annotated the images, the software can be used by students for learning.
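The KNN step can be illustrated with a generic nearest-neighbour vote over labelled pixels (the feature vector of row, column and intensity, and the labels below, are hypothetical, not the paper's actual features):

```python
from collections import Counter

def knn_classify(point, labelled, k=3):
    # Majority vote among the k nearest labelled pixels; squared
    # Euclidean distance suffices since only the ranking matters.
    nearest = sorted(
        labelled,
        key=lambda item: sum((p - q) ** 2 for p, q in zip(point, item[0])),
    )[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical annotated pixels: (row, col, intensity) -> region label.
samples = [((2, 2, 200), "infected"), ((3, 2, 190), "infected"),
           ((9, 9, 40), "healthy"), ((10, 8, 50), "healthy")]
```

Classifying each candidate pixel this way lets the tool grow an ROI from a few expert-marked seeds instead of requiring every pixel to be selected by hand.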
A Framework for Traffic Planning and Forecasting using Micro-Simulation Calib...ITIIIndustries
This paper presents the application of micro-simulation to traffic planning and forecasting, and proposes a new framework to model complex traffic conditions by calibrating and adjusting the traffic parameters of a micro-simulation model. Using the open-source micro-simulator package TRANSIMS, animated and numerical results were produced and analysed, and the framework for traffic model calibration was evaluated for its usefulness and practicality. Finally, we discuss future applications, such as providing end users with real-time traffic information through Intelligent Transport System (ITS) integration.
Investigating Tertiary Students’ Perceptions on Internet SecurityITIIIndustries
Internet security threats have grown from simple viruses to various forms of computer hacking, scams, impersonation, cyber bullying, and spyware. The Internet has great influence on most people, and one can spend endless hours on internet activities. In particular, youth engage in more online activities than any other age group. Excessive internet usage is an emerging threat with negative impacts on these youth; hence it is vital to investigate their online behavior. This work studies tertiary students' risk awareness and provides findings that allow us to understand their knowledge of risks and their behavior in online activities. It reveals several important online issues amongst tertiary students: first, a lack of online security awareness; second, a lack of awareness and information about the dangers of rootkits, internet cookies and spyware; third, female students are more unflinching than male students when commenting on social networking sites; fourth, students are cautious only when obvious security warnings are present; and finally, their usage of internet hotspots is common without a full understanding of the associated dangers. These findings enable us to recommend internet security habits and safety practices that students should adopt when engaging in online activities. A more holistic approach was considered, aiming to minimize future risks and dangers in students' online activities.
Blind Image Watermarking Based on Chaotic MapsITIIIndustries
Security of a watermark refers to its resistance to unauthorized detecting and decoding, while watermark robustness refers to the watermark’s resistance against common processing. Many watermarking schemes emphasize robustness more than security. However, a robust watermark is not enough to accomplish protection because the range of hostile attacks is not limited to common processing and distortions. In this paper, we give consideration to watermark security. To achieve this, we employ chaotic maps due to their extreme sensitivity to the initial values. If one fails to provide these values, the watermark will be wrongly extracted. While the chaotic maps provide perfect watermarking security, the proposed scheme is also intended to achieve robustness.
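The key sensitivity of chaotic maps can be illustrated with the logistic map driving a watermark-bit permutation (a generic sketch; the paper's specific maps and embedding scheme are not reproduced here):

```python
def logistic_sequence(x0, n, r=3.99):
    # Logistic map in its chaotic regime: tiny changes in the seed x0
    # (the secret key) yield a completely different sequence.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def permute(bits, key):
    # Scramble watermark bits with a permutation derived from the map.
    seq = logistic_sequence(key, len(bits))
    order = sorted(range(len(bits)), key=seq.__getitem__)
    return [bits[i] for i in order], order

def unpermute(scrambled, order):
    # Invert the permutation; requires the same key-derived order.
    out = [0] * len(scrambled)
    for pos, i in enumerate(order):
        out[i] = scrambled[pos]
    return out
```

Without the exact key, the regenerated sequence, and hence the permutation, differs entirely, so the watermark cannot be correctly extracted even if the embedding locations are known.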
Programmatic detection of spatial behaviour in an agent-based modelITIIIndustries
The automated detection of aspects of spatial behaviour in an agent-based model is necessary for model testing and analysis. In this paper we compare four predictors of herding behaviour in a model of a grazing herbivore. We find that (a) the mean number of neighbours, adjusted to account for population variation, and (b) the mean Hamming distance between rows of the two-dimensional environment can be used to detect herding. Visual inspection of the model behaviour revealed that herding occurs when the herbivore mobility reaches a threshold level. Using this threshold we identify limits for these predictors to use in the program code. These results apply only to one set of parameters and environment size; future research will involve a wider parameter space.
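The Hamming-distance predictor can be sketched as follows, assuming (as an illustration) that the distance is taken between consecutive rows of a binary occupancy grid:

```python
def mean_row_hamming(grid):
    # Mean Hamming distance between consecutive rows of the 2-D
    # environment; low values indicate vertically clumped occupancy,
    # which is what a herd looks like on the grid.
    dists = [
        sum(a != b for a, b in zip(r1, r2))
        for r1, r2 in zip(grid, grid[1:])
    ]
    return sum(dists) / len(dists)

# Illustrative occupancy grids: scattered agents vs. a clumped herd.
dispersed = [[1, 0, 1, 0],
             [0, 1, 0, 1],
             [1, 0, 1, 0]]
herded = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 0]]
```

A program-code limit as described above would then simply compare `mean_row_hamming(grid)` against a calibrated threshold to flag herding automatically.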
A Smart Fuzzing Approach for Integer Overflow DetectionITIIIndustries
Fuzzing is one of the most commonly used methods to detect software vulnerabilities, a major cause of information security incidents. Although it has the advantages of simple design and few false reports, its efficiency is usually poor. In this paper we present a smart fuzzing approach for integer overflow detection and a tool, SwordFuzzer, which implements this approach. Unlike standard fuzzing techniques, which randomly change parts of the input file with no information about the underlying syntactic structure of the file, SwordFuzzer uses online dynamic taint analysis to identify which bytes in the input file are used in security-sensitive operations and then focuses on mutating those bytes. Thus, the generated inputs are more likely to trigger potential vulnerabilities. We evaluated SwordFuzzer with an example program and a number of real-world applications. The experimental results show that SwordFuzzer can accurately locate the key bytes of the input file and dramatically improve the effectiveness of fuzzing in detecting real-world vulnerabilities.
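The contrast with blind mutation can be sketched generically: given the byte offsets reported by taint analysis, only those bytes are mutated (the file format and the offsets below are hypothetical illustrations, not SwordFuzzer's actual output):

```python
import random

def mutate(data, tainted_offsets, rng=None):
    # Blind fuzzing would flip bytes anywhere in the file; taint-guided
    # fuzzing mutates only the offsets that the taint analysis marked
    # as flowing into security-sensitive operations.
    rng = rng or random.Random(7)
    out = bytearray(data)
    for off in tainted_offsets:
        out[off] = rng.randrange(256)
    return bytes(out)

sample = b"RIFF\x10\x00\x00\x00payload"
tainted = [4, 5, 6, 7]   # hypothetical length field flagged by taint analysis
fuzzed = mutate(sample, tainted)
```

Concentrating mutations on a length field like this is far more likely to trigger an integer overflow in size computations than uniformly random byte flips across the whole file.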
The banking experience for many people today is fundamentally an application of technology to be able to carry out their financial tasks. While the need to visit a bank branch remains essential for a number of activities, increasingly the need to support mobile usage is becoming the central focus of many bank strategies. The core banking systems that process financial transactions must remain highly available and able to support large volumes of activity. These systems represent a long term investment for banks and when the need arises to modernize these large systems, the transformation initiative is often very expensive and of high risk. We present in this paper our experiences in bank modernization and transformation, and outline the strategies for rolling out these large programs. As banking institutions embark upon transformation programs to upgrade their banking channels and core banking systems, it is hoped that the insights presented here are useful as a framework to support these initiatives.
Detecting Fraud Using Transaction Frequency DataITIIIndustries
Despite all attempts to prevent fraud, it continues to be a major threat to industry and government. In this paper, we present a fraud detection method which detects irregular frequency of transaction usage in an Enterprise Resource Planning (ERP) system. We discuss the design, development and empirical evaluation of outlier detection and distance measuring techniques to detect frequency-based anomalies within an individual user’s profile, relative to other similar users. Primarily, we propose three automated techniques: a univariate method, called Boxplot which is based on the sample’s median; and two multivariate methods which use Euclidean distance, for detecting transaction frequency anomalies within each transaction profile. The two multivariate approaches detect potentially fraudulent activities by identifying: (1) users where the Euclidean distance between their transaction-type set is above a certain threshold and (2) users/data points that lie far apart from other users/clusters or represent a small cluster size, using k-means clustering. The proposed methodology allows an auditor to investigate the transaction frequency anomalies and adjust the different parameters, such as the outlier threshold and the Euclidean distance threshold values to tune the number of alerts. The novelty of the proposed technique lies in its ability to automatically trigger alerts from transaction profiles, based on transaction usage performed over a period of time. Experiments were conducted using a real dataset obtained from the production client of a large organization using SAP R/3 (presently the most predominant ERP system), to run its business. The results of this empirical research demonstrate the effectiveness of the proposed approach.
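The univariate Boxplot test can be sketched with the standard Tukey rule, flagging counts beyond 1.5 * IQR from the quartiles (a generic sketch of the idea; the paper's auditor-adjustable thresholds correspond to the `k` parameter):

```python
import statistics

def boxplot_outliers(values, k=1.5):
    # Tukey boxplot rule: compute the quartiles around the median and
    # flag any value beyond k * IQR from Q1 or Q3 as an anomaly.
    qs = statistics.quantiles(values, n=4)   # [Q1, median, Q3]
    q1, q3 = qs[0], qs[2]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical per-user transaction counts for one transaction type.
counts = [12, 14, 13, 15, 12, 14, 13, 95]
```

A user executing 95 transactions of a type that peers run about a dozen times would surface as an alert; raising `k` reduces the number of alerts, as the tuning described above allows.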
Mapping the Cybernetic Principles of Viable System Model to Enterprise Servic...ITIIIndustries
This paper describes the results of a theoretical mapping of the cybernetic principles of the Viable System Model (VSM) to an Enterprise Service Bus (ESB) model, with the aim to identify the management principles for the integration of services at all levels in the enterprise. This enrichment directly contributes to the viability of service-oriented systems and the justification of Business/IT alignment within enterprise. The model was identified to be suitable for further adaption in the industrial setting planned within Australian governmental departments.
Using Watershed Transform for Vision-based Two-Hand Occlusion in an Interacti...ITIIIndustries
To achieve natural interaction in an augmented reality environment, we have suggested using markerless vision-based two-handed gestures, with an outstretched hand and a pointing hand used as the virtual object registration plane and the pointing device, respectively. However, two-handed interaction always causes mutual occlusion, which jeopardizes hand gesture recognition. In this paper, we present a solution for two-hand occlusion using the watershed transform. The main idea is to start from a two-hand occlusion image in binary format, then form a grey-scale image based on the distance of each non-object pixel to the nearest object pixel. The watershed algorithm is applied to the negation of the grey-scale image to form watershed lines which separate the two hands. Fingertips are then identified, and each hand is recognized based on the number of fingertips on it: the outstretched hand is assumed to contain five fingertips and the pointing hand fewer than five. An example of applying our result to hand and virtual object interaction is shown at the end of the paper.
Speech Feature Extraction and Data VisualisationITIIIndustries
This paper presents a signal processing approach to analyse and identify accent-discriminative features of four groups of English as a second language (ESL) speakers: Chinese, Indian, Japanese, and Korean. The features used for speech recognition include pitch, stress, formant frequencies, the Mel frequency coefficients, log frequency coefficients, and the intensity and duration of spoken vowels. The paper presents our study using the Matlab Speech Analysis Toolbox and highlights how data processing can be automated and results visualised. The proposed algorithm achieved an average success rate of 57.3% in identifying vowels spoken by the four non-native English speaker groups.
Bayesian-Network-Based Algorithm Selection with High Level Representation Fee...ITIIIndustries
A real-world intelligent system consists of three basic modules: environment recognition, prediction (or estimation), and behavior planning. To obtain high quality results in these modules, high speed processing and real time adaptability on a case by case basis are required. In the environment recognition module many different algorithms and algorithm networks exist with varying performance. Thus, a mechanism that selects the best possible algorithm is required. To solve this problem we are using an algorithm selection approach to the problem of natural image understanding. This selection mechanism is based on machine learning; a bottom-up algorithm selection from real-world image features and a top-down algorithm selection using information obtained from a high level symbolic world description and algorithm suitability. The algorithm selection method iterates for each input image until the high-level description cannot be improved anymore. In this paper we present a method of iterative composition of the high level description. This step by step approach allows us to select the best result for each region of the image by evaluating all the intermediary representations and finally keep only the best one.
Instance Selection and Optimization of Neural NetworksITIIIndustries
Credit scoring is an important tool in financial institutions, used in credit-granting decisions. Credit applications are scored by credit scoring models; those with high scores are treated as “good”, while those with low scores are regarded as “bad”. As data mining techniques develop, automatic credit scoring systems are warmly welcomed for their high efficiency and objective judgments. Many machine learning algorithms have been applied to training credit scoring models, and the ANN is one of them with good performance. This paper presents a higher-accuracy credit scoring model based on MLP neural networks trained with the back propagation algorithm. Our work focuses on enhancing credit scoring models in three aspects: optimizing the data distribution in datasets using a new method called Average Random Choosing; comparing the effects of different numbers of training, validation and test instances; and finding the most suitable number of hidden units. Another contribution of this paper is summarizing the tendency of model scoring accuracy as the number of hidden units increases. The experimental results show that our methods can achieve high credit scoring accuracy with imbalanced datasets. Thus, credit-granting decisions can be made by data mining methods using MLP neural networks.
Signature Forgery and the Forger – An Assessment of Influence on Handwritten ... (ITIIIndustries)
Signatures are widely used as a form of personal authentication. Despite their ubiquity in deployment, individual signatures are relatively easy to forge, especially when only the static ‘pictorial’ outcome of the signature is considered at verification time. In this study, we explore opinions on signature usage for verification purposes, and how individuals rate a particular third-party signature in terms of its ease of forgery and their own ability to forge it. We examine responses with respect to an individual’s experience of the forgeability/complexity of their own signature. Our study shows that past experience generally has no effect on perceived signature complexity or on an individual’s perceived effectiveness in forging a signature themselves. In assessing forgeability, most subjects cite overall signature complexity and distinguishing features in reaching this decision. Furthermore, our research indicates that individuals typically vary their signature according to the scenario but generally put little effort into its production.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... (Ramesh Iyer)
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation, however, takes hard work: it requires vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at every stage.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... (UiPathCommunity)
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
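As background for the session (this is not UiPath's implementation), the core active-learning idea, uncertainty sampling, fits in a few lines: given the model's confidence that each unlabeled document is, say, an invoice, route only the documents the model is least sure about to a human for labeling.

```python
def pick_for_labeling(confidences, k=2):
    """Uncertainty sampling: return the k document ids whose predicted
    probability is closest to 0.5, i.e. where the model is least sure
    and a human label is most informative."""
    return sorted(confidences, key=lambda doc: abs(confidences[doc] - 0.5))[:k]
```

Labeling only these hardest documents, retraining, and repeating is what lets active learning reach a target accuracy with far fewer annotations than labeling everything up front.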
Speakers:
👨‍🏫 Andras Palfi, Senior Product Manager, UiPath
👩‍🏫 Lenka Dulovicova, Product Program Manager, UiPath
The New Frontiers of AI in RPA with UiPath Autopilot™ (UiPathCommunity)
In this free online event, organized by the Italian UiPath Community, you can explore the new features of Autopilot, the tool that brings Artificial Intelligence into the development and use of automations.
📕 Together we will look at examples of Autopilot in use across several tools of the UiPath Suite:
Autopilot for Studio Web
Autopilot for Studio
Autopilot for Apps
Clipboard AI
GenAI applied to Document Understanding
👨‍🏫👨‍💻 Speakers:
Stefano Negro, UiPath MVPx3, RPA Tech Lead @ BSP Consultant
Flavio Martinelli, UiPath MVP 2023, Technical Account Manager @UiPath
Andrei Tasca, RPA Solutions Team Lead @NTT Data
PHP Frameworks: I want to break free (IPC Berlin 2024) (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk aims to encourage a more independent approach to using PHP frameworks and a shift toward more flexible, future-proof PHP development.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdf (Peter Spielvogel)
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Transcript: Selling digital books in 2024: Insights from industry leaders - T... (BookNet Canada)
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. You’ll take away practical tips and strategies for successful relationship building that leads to closing the deal.
Welcome to ViralQR, your best QR code generator (ViralQR)
Welcome to ViralQR, the best QR code generator available on the market!
At ViralQR, we design static and dynamic QR codes. Our mission is to make business operations easier and customer engagement more powerful through QR technology. Whether you run a small business or a large enterprise, our easy-to-use platform offers options that can be tailored to your company's branding and marketing strategies.
Our Vision
We are here to make creating QR codes easy and smooth, enhancing customer interaction and streamlining business operations. We firmly believe in the power of QR codes to transform how businesses interact with their customers, and we are committed to making that technology accessible and usable far and wide.
Our Achievements
Since our inception, we have served many clients across various industries, providing QR codes for marketing, service delivery, and feedback collection. Our platform has been recognized for its ease of use and strong feature set, helping businesses create QR codes.
Our Services
At ViralQR, we offer a comprehensive suite of services to meet your needs:
Static QR codes: Create static QR codes for free. These codes can store key information such as URLs, vCards, plain text, emails and SMS, Wi-Fi credentials, and Bitcoin addresses.
Dynamic QR codes: These subscription-based codes offer advanced features. They can link directly to PDF files, images, micro-landing pages, social accounts, review forms, business pages, and applications, and they can be branded with CTAs, frames, patterns, colors, and logos to strengthen your branding.
Pricing and Packages
Additionally, ViralQR offers a 14-day free trial, an excellent opportunity for new users to get a feel for the platform. From there, one can easily subscribe and experience the full range of dynamic QR code features. Subscription plans are priced flexibly so that businesses of every size can afford to benefit from our service.
Why choose us?
ViralQR serves marketing, advertising, catering, retail, and similar industries. QR codes can be placed on flyers, packaging, merchandise, and banners, or used in place of cash and cards in a restaurant or coffee shop. Integrating QR codes into your business improves customer engagement and streamlines operations.
Comprehensive Analytics
ViralQR subscribers receive detailed analytics and tracking tools that give a clear view of QR code performance. Our analytics dashboard shows aggregate and unique views, along with detailed information about each impression, including time, device, browser, and estimated location by city and country.
Thank you for choosing ViralQR; we offer nothing but the best in QR code services for businesses of every kind!
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to market, combined with traditionally slow and manual security checks, has created gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chains and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing toolchains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of existing vulnerabilities, preventing the introduction of security issues into the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
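The talk's DBOM format is not specified in this abstract; purely as an illustrative sketch (the field names and JSON structure are assumptions), capturing a minimal deployment bill of materials could look like:

```python
import hashlib
import json

def deployment_bom(artifacts):
    """Record one entry per deployed artifact (name, version, content
    digest) so the exact bits running in an environment are auditable."""
    entries = []
    for name, version, content in artifacts:
        entries.append({
            "name": name,
            "version": version,
            "sha256": hashlib.sha256(content).hexdigest(),
        })
    return json.dumps({"artifacts": entries}, indent=2)
```

A record like this, captured at deploy time, is what lets a team answer "which environments are running the vulnerable build?" without re-scanning everything.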
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms, and is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Key Trends Shaping the Future of Infrastructure.pdf (Cheryl Hung)
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
This keynote covers the key trends across hardware, cloud, and open source: how these areas are likely to mature and develop over the short and long term, and how organisations can position themselves to adapt and thrive.