The document describes research applying the k-nearest neighbor (KNN) algorithm to user log data to determine which standard operating procedure (SOP) documents are most relevant to a user's preferences. The researchers collected implicit feedback data on users' searches of 11 SOP documents over 5 weeks, then applied the KNN algorithm to calculate the distance between each user's data and each SOP document and classify the document as suitable or unsuitable for that user's profile. The classifications split in a 3:2 ratio of suitable to unsuitable, and the authors report that the KNN approach accurately identified SOPs matching 80% of user profiles based on their implicit feedback behavior.
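The classification step described above can be sketched as a plain distance-vote KNN. The features (per-SOP open count and dwell time) and the labels below are invented for illustration; the study's actual feature set is not given in this summary.

```python
import math

def euclidean(a, b):
    # Straight-line distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(query, examples, k=3):
    """examples: list of (feature_vector, label). Returns the majority label
    among the k nearest neighbours of `query`."""
    nearest = sorted(examples, key=lambda ex: euclidean(query, ex[0]))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Hypothetical implicit-feedback features per SOP: (open count, dwell seconds)
history = [
    ((9, 40), "suitable"), ((7, 35), "suitable"), ((8, 50), "suitable"),
    ((1, 5), "unsuitable"), ((2, 8), "unsuitable"),
]
print(knn_classify((8, 38), history))  # → suitable
```

A new SOP's log profile is voted on by the k most similar labelled profiles, which is the "distance to each SOP document" idea in the summary.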
Mixed integer nonlinear programming (MINLP)-based bandwidth utility function ... (IJECEIAES)
Internet use has grown rapidly in this era of globalization, and demand for it has become effectively unlimited. Utility functions, one of the measurements of internet usage, are usually associated with users' level of satisfaction with the information services they use. Three internet pricing schemes are considered, namely flat-fee, usage-based, and two-part tariff schemes, applied to the Bandwidth Diminished with Increasing Bandwidth utility function with monitoring and marginal costs. The pricing schemes are solved with LINGO 13.0 as nonlinear optimization problems to obtain the optimal solution. Compared with the flat-fee scheme, the optimal solution is obtained with either the usage-based or the two-part tariff pricing model for each service offered, and usage-based pricing is the better way for a provider to offer the network. The results show that by applying the two-part tariff scheme, providers can maximize revenue for both homogeneous and heterogeneous consumers.
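The kind of nonlinear pricing problem that LINGO solves can be illustrated with a toy usage-based variant. The exponential utility form and every constant below are illustrative stand-ins, not the paper's Bandwidth Diminished with Increasing Bandwidth function or its parameters.

```python
import math

# Toy diminishing-returns utility: U(x) = u_max * (1 - exp(-a*x)).
U = lambda x, u_max=10.0, a=0.5: u_max * (1 - math.exp(-a * x))

def best_plan(marginal=0.2, monitoring=0.5):
    """Grid search over unit price p and bandwidth x for a usage-based scheme.
    A user participates only if consumer surplus U(x) - p*x is non-negative;
    the provider's profit is (p - marginal)*x - monitoring."""
    best = (0.0, None)  # (profit, (price, bandwidth))
    for p10 in range(1, 51):            # price per unit: 0.1 .. 5.0
        p = p10 / 10
        for x10 in range(1, 101):       # bandwidth: 0.1 .. 10.0
            x = x10 / 10
            if U(x) - p * x < 0:        # user would reject this plan
                continue
            profit = (p - marginal) * x - monitoring
            if profit > best[0]:
                best = (profit, (p, x))
    return best

profit, plan = best_plan()
print(round(profit, 2), plan)
```

A real MINLP solver searches the continuous (and integer-constrained) space directly; the grid here only makes the participation constraint and cost terms concrete.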
Correlating objective factors with video (IJCNCJournal)
To succeed in providing services, their quality should meet users' satisfaction. This motivates studying the relationship between service quality and the quality users actually perceive, commonly referred to as quality of experience (QoE). However, most existing QoE studies of video-on-demand or IPTV services analyze only the influence of network behavior on video quality. This paper focuses on P2P video streaming services, which are becoming a significant portion of Internet traffic, and pays attention to how users' perception changes with the adjustment of objective factors as well as network behavior. We propose using the mean opinion score and peak signal-to-noise ratio methods as QoE evaluations to consider the effect of the chunk loss ratio, the group-of-pictures size, and the chunk size. The experimental results provide a convincing reference for building the complete relationship between objective factors and QoE. We believe this assessment will contribute to future research on service quality evaluation mechanisms based on users' satisfaction.
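The PSNR half of that evaluation reduces to a short formula over pixel differences. A minimal sketch for 8-bit sample sequences (the sample values are made up):

```python
import math

def psnr(original, degraded, max_val=255):
    """Peak signal-to-noise ratio (dB) between two equal-length
    pixel sequences: 10 * log10(max_val^2 / MSE)."""
    mse = sum((o - d) ** 2 for o, d in zip(original, degraded)) / len(original)
    if mse == 0:
        return float("inf")  # identical signals: infinite PSNR
    return 10 * math.log10(max_val ** 2 / mse)

ref = [50, 100, 150, 200]
deg = [52, 98, 149, 205]
print(round(psnr(ref, deg), 2))  # → 38.84
```

Higher chunk loss or a larger GOP typically raises the MSE of the decoded frames, which is exactly what drags the PSNR (and with it the MOS) down.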
Collaborative Filtering Approach For QoS Prediction (Editor IJMTER)
Many researchers propose that not only functional but also non-functional properties, known as quality of service (QoS), should be taken into consideration when consumers select services. Consumers need to predict the quality of web services they have not yet used before selecting one, and this prediction is usually based on other consumers' experiences. Recognizing that consumers' QoS experiences differ, this paper proposes a collaborative filtering based approach to mining similarity and making predictions from consumers' experiences. Experimental results demonstrate that this approach significantly improves the effectiveness of QoS prediction for web services.
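A common way to realize this is Pearson-similarity collaborative filtering over observed QoS values; the sketch below is one such baseline, not necessarily the paper's exact formulation, and the response times are invented.

```python
import math

def pearson(a, b):
    """Pearson correlation over the services both consumers observed."""
    common = [(a[s], b[s]) for s in a if s in b]
    n = len(common)
    ma = sum(x for x, _ in common) / n
    mb = sum(y for _, y in common) / n
    cov = sum((x - ma) * (y - mb) for x, y in common)
    da = math.sqrt(sum((x - ma) ** 2 for x, _ in common))
    db = math.sqrt(sum((y - mb) ** 2 for _, y in common))
    return cov / (da * db) if da and db else 0.0

def predict(target, others, service):
    """Similarity-weighted average of other consumers' observed QoS."""
    num = den = 0.0
    for other in others:
        if service in other:
            w = pearson(target, other)
            num += w * other[service]
            den += abs(w)
    return num / den if den else None

# Hypothetical response times (ms) observed per consumer for services s1..s4
u1 = {"s1": 120, "s2": 300, "s3": 90}
u2 = {"s1": 110, "s2": 280, "s3": 95, "s4": 200}
u3 = {"s1": 400, "s2": 420, "s3": 410, "s4": 430}
print(round(predict(u1, [u2, u3], "s4"), 1))
```

Consumers whose past QoS observations correlate strongly with the target consumer's contribute more weight to the prediction for the unused service.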
A METHOD FOR WEBSITE USABILITY EVALUATION: A COMPARATIVE ANALYSIS (IJwest)
ABSTRACT
Graphical user interface design in the software development process focuses on maximizing usability and the user's experience, in order to make interaction easy, flexible, and efficient for users. In this paper, we propose an approach for evaluating the usability satisfaction degree of a web-based system. The proposed method is carried out in two phases and implemented on an airline website as a case study. In the first phase, a number of users perform a website usability test; in the second phase, the results are translated into charts for a final evaluation of the web-based system. The results were satisfactory: the weaknesses and gaps in the website are identified, and recommended solutions to avoid them are drawn. The authenticity of the results has been confirmed by comparing them with user opinions acquired from a questionnaire, which supports the precision with which the website is rated.
Sentiment analysis of online licensing service quality in the energy and mine... (CSITiaesprime)
The Ministry of Energy and Mineral Resources of the Republic of Indonesia regularly assesses public satisfaction with its online licensing services. Users rated their satisfaction at 3.42 on a scale of 4, below the organization's average of 3.53. Evaluating public service performance is crucial for quality improvement. Previous research relied solely on survey data to assess public satisfaction; this study goes further by analyzing free-text user feedback from an online licensing application to identify negative aspects of the service that need improvement. The dataset spans September 2019 to February 2023, with 24,112 entries. The classification method was chosen for the highest accuracy among decision tree, random forest, naive Bayes, stochastic gradient descent, logistic regression (LR), and k-nearest neighbor. The text data was converted into numerical form using CountVectorizer and term frequency-inverse document frequency (TF-IDF) techniques, with unigrams and bigrams for dividing sentences into word segments. LR with bigram CountVectorizer ranked highest, at 89% average precision, F1-score, and recall, compared to the other five classification methods. The sentiment polarity was 36.2% negative. Negative sentiment revealed public expectations that the ministry improve the top three aspects: system, mechanism, and procedure; infrastructure and facilities; and service specification product types.
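The winning feature representation, bigram counts in the style of CountVectorizer, can be sketched in a few lines of plain Python; the example documents are invented, and a real pipeline would feed these vectors into the LR classifier.

```python
from collections import Counter
from itertools import chain

def bigrams(text):
    # Consecutive word pairs, lowercased
    toks = text.lower().split()
    return [" ".join(p) for p in zip(toks, toks[1:])]

def count_vectorize(docs):
    """Bigram bag-of-words: one count vector per document over a shared,
    alphabetically sorted vocabulary."""
    vocab = sorted(set(chain.from_iterable(bigrams(d) for d in docs)))
    index = {g: i for i, g in enumerate(vocab)}
    vectors = []
    for d in docs:
        v = [0] * len(vocab)
        for g in bigrams(d):
            v[index[g]] += 1
        vectors.append(v)
    return vocab, vectors

docs = ["service very slow", "very slow response", "fast approval process"]
vocab, X = count_vectorize(docs)
print(vocab)
print(X)
```

Bigrams such as "very slow" keep short negations and qualifiers together, which is often why they beat unigrams on complaint-style feedback.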
Utility Function-based Pricing Strategies in Maximizing the Information Servi... (IJECEIAES)
Previous research focused only on maximizing revenue in pricing strategies for information goods, disregarding marginal and monitoring costs. This paper adds marginal and monitoring costs to the pricing strategies to maintain maximal revenue while accounting for the costs incurred in adopting the strategies. Well-known utility functions are applied to also consider the consumer's satisfaction with the service offered. The results show that including the costs incurred in setting up the strategies can increase the provider's profit compared with neglecting them. It is also shown that the Cobb-Douglas utility function can better help the provider optimize revenue than the quasi-linear and perfect substitutes utility functions.
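As a reminder of what the Cobb-Douglas form buys, here is its textbook two-good version with the closed-form utility-maximizing bundle under a budget; the exponent, budget, and prices are illustrative, not taken from the paper.

```python
# Cobb-Douglas utility for two service quantities x1, x2: U = x1^a * x2^(1-a)
def cobb_douglas(x1, x2, alpha=0.6):
    return x1 ** alpha * x2 ** (1 - alpha)

def optimal_bundle(B, p1, p2, alpha=0.6):
    """Textbook result: the consumer spends share alpha of budget B on good 1
    and share (1 - alpha) on good 2."""
    return (alpha * B / p1, (1 - alpha) * B / p2)

x1, x2 = optimal_bundle(B=100, p1=2, p2=5)
print((x1, x2), round(cobb_douglas(x1, x2), 2))
```

That predictable demand split is what lets a provider back out revenue-maximizing prices analytically, an option the quasi-linear and perfect substitutes forms do not offer as cleanly.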
Software size estimation at early stages of project development holds great significance in meeting the competitive demands of the software industry. Software size is one of the most interesting internal attributes and has been used in several effort/cost models as a predictor of the effort and cost needed to design and implement the software. With the whole world moving toward the object-oriented paradigm, it is essential to use an accurate methodology for measuring the size of object-oriented projects. The class point approach is used to quantify classes, which are the logical building blocks of the object-oriented paradigm. In this paper, we propose a class point based approach for software size estimation of On-Line Analytical Processing (OLAP) systems. OLAP is an approach to swiftly answering decision support queries based on a multidimensional view of data, and materialized views can significantly reduce the execution time of such queries. We perform a case study based on the TPC-H benchmark, which is representative of OLAP systems. We use a greedy approach to determine a good set of views to materialize; after finding the number of views, the class point approach is used to estimate the size of the OLAP system. The results of our approach are validated.
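Greedy view selection of the kind mentioned above can be sketched with a toy cost model: each query is answered by the smallest materialized view that covers it, falling back to the base table, and each round picks the view that cuts total query cost the most. The view sizes and query-to-view mappings below are made up.

```python
BASE = 1000  # cost of answering any query from the base table
sizes = {"A": 500, "B": 200, "C": 100}          # candidate view sizes
answerable = {                                   # which views answer which query
    "q1": ["A", "B"], "q2": ["A"], "q3": ["B", "C"], "q4": ["C"],
}

def total_cost(materialised):
    """Each query costs the size of its cheapest available view."""
    cost = 0
    for q, views in answerable.items():
        options = [sizes[v] for v in views if v in materialised] + [BASE]
        cost += min(options)
    return cost

def greedy(k):
    """Pick k views; each round take the view minimising the resulting cost."""
    chosen = set()
    for _ in range(k):
        best = min((v for v in sizes if v not in chosen),
                   key=lambda v: total_cost(chosen | {v}))
        chosen.add(best)
    return chosen

picked = greedy(2)
print(picked, total_cost(picked))
```

The greedy choice is not globally optimal in general, but for monotone benefit functions of this shape it carries the classic constant-factor guarantee, which is why it is the standard heuristic for view materialization.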
UI/UX integrated holistic monitoring of PAUD using the TCSD method (journalBEEI)
User interface (UI)/user experience (UX) design is one stage in system development that produces interactive and attractive web-based application layouts that are easy for users to understand and use. In this research, a case study was conducted on early childhood education at PAUD Kuntum Mekar. A UI/UX model for holistic integrative PAUD monitoring is designed as a solution to help parents and teachers. The UI/UX is designed with the task centered system design (TCSD) approach, whose stages are: 1) identification of the scope of use, 2) user centered requirement analysis, 3) design and scenario, and 4) walkthrough evaluation; the system is tested with user satisfaction and heuristic usability methods. The aim of this study is to show that UI/UX design with TCSD can provide valid data on the needs of each actor, based on task assignment and the storyboard design of the developed system. The result of this research is a UI/UX model design for a holistic integrative PAUD monitoring application.
Dashboard settings design in SVARA using user-centred design method (TELKOMNIKA JOURNAL)
SVARA is the first social media audio application in Indonesia, developed by PT. Zamrud Teknologi Khatulistiwa. At present, the application has no settings feature for displaying content or other basic settings on the user's side, so users have no role in managing the appearance of the dashboard according to their preferences. Settings are handled entirely by administrators using scripts and must call APIs through plain PHP scripts, which is cumbersome. To give users a role in managing their view, a dashboard settings feature needs to be added to the application. In this paper, the researchers propose designing this dashboard feature using the User-Centered Design (UCD) method. The design results show that this method correlates positively with support for user involvement in the application development process.
ESTIMATING THE EFFORT OF MOBILE APPLICATION DEVELOPMENT (csandit)
The rise of mobile technologies such as smartphones and tablets connected to mobile networks is changing old habits and creating new ways for society to access information and interact with computer systems. Traditional information systems are therefore undergoing a process of adaptation to this new computing context. It is important to note, however, that the characteristics of this new context are different: there are new features and hence new possibilities, as well as restrictions that did not exist before. Systems developed for this environment consequently have different requirements and characteristics than traditional information systems. For this reason, the current knowledge about planning and building systems needs to be reassessed for this new environment, and one area in particular that demands such adaptation is software estimation. Estimation processes are generally based on characteristics of the systems, trying to quantify the complexity of implementing them. Hence, the main objective of this paper is to present an effort estimation model for mobile applications and to discuss the applicability of traditional estimation models for developing systems in the context of mobile computing.
E-commerce online review for detecting influencing factors users perception (journalBEEI)
To date, online shopping using e-commerce services has become a trend. The emergence of e-commerce truly helps people shop more effectively and efficiently. However, some problems remain in e-commerce, especially from the user perspective. This research aims to explore user review data, particularly the factors that influence user perception of e-commerce applications, to classify the reviews, and to identify potential solutions to the problems found in e-commerce applications. Data is collected using web scraping techniques and classified using a suitable machine learning method, i.e., a support vector machine (SVM). Text association and fishbone analysis are performed on the classified user review data. The results of this study show that user satisfaction problems can be captured, and that various services can be generated as potential solutions to customers' experienced problems or users' perception problems. A detailed discussion of these findings is available in this article.
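A minimal linear SVM of the kind used for such review classification can be trained with Pegasos-style sub-gradient descent on the hinge loss. This is a sketch, not the paper's implementation; the two-dimensional features (counts of positive and negative words per review) and labels are invented.

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Minimal linear SVM (no bias term), Pegasos-style stochastic
    sub-gradient descent on the regularised hinge loss; labels in {-1, +1}."""
    random.seed(0)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for i in random.sample(range(len(X)), len(X)):  # shuffled pass
            t += 1
            eta = 1.0 / (lam * t)                        # decaying step size
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1 - eta * lam) * wj for wj in w]       # regularisation shrink
            if margin < 1:                               # hinge sub-gradient
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def classify(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Hypothetical review features: [positive-word count, negative-word count]
X = [[3, 0], [2, 1], [4, 1], [0, 3], [1, 4], [0, 2]]
y = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(X, y)
print([classify(w, x) for x in X])
```

In practice the features would be a high-dimensional scraped-text vectorization rather than two hand counts, but the training loop is the same.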
Knowledge management system SOP using semantic networks connected with person... (TELKOMNIKA JOURNAL)
Standard Operating Procedure (SOP) documents can manage business processes and employee performance in an organization. This study aims to develop a system that publishes SOP documents and automatically distributes them to employees. The relationship between SOP documents and employees is analyzed so that documents can be allocated directly to employees according to their position. This knowledge analysis is done using semantic network analysis: semantic networks are used to analyze the components of knowledge and the relationships between SOP documents and employees. The analysis found 25 SOP documents comprising 6 types of central nodes with 156 child nodes, and 7 types of relations containing 207 relations. The SOP knowledge management system is connected to the personnel information system (SIPEG), making it easier for users to find, accommodate, and manage the knowledge contained in SOP documents. The system is implemented in PHP with the CodeIgniter framework, a REST API, and a MySQL database.
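A semantic network of that shape can be held as a typed edge list of (source, relation, target) triples. The node and relation names below are invented examples, not the study's actual 6 node types or 7 relation types.

```python
# Each triple links an SOP document to a unit, role, or another SOP.
edges = [
    ("SOP-01 Leave Request", "applies_to", "Staff"),
    ("SOP-01 Leave Request", "owned_by", "HR Unit"),
    ("SOP-02 Procurement", "applies_to", "Finance Unit"),
    ("SOP-02 Procurement", "references", "SOP-01 Leave Request"),
]

def related(node):
    """All (relation, neighbour) pairs attached to `node`, in either direction."""
    out = [(r, t) for s, r, t in edges if s == node]
    inc = [(r, s) for s, r, t in edges if t == node]
    return out + inc

print(related("SOP-01 Leave Request"))
```

Allocating an SOP to an employee then amounts to walking the `applies_to` edges from the document to the position recorded in the personnel system.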
Improving collaborative filtering using lexicon-based sentiment analysis (IJECEIAES)
Since data is increasingly available on the Internet, efforts are needed to develop and improve recommender systems that produce lists of likely favorite items. In this paper, we extend our work to enhance the accuracy of Arabic collaborative filtering by applying sentiment analysis to user reviews; we also address major problems of the current work by applying effective techniques to handle the scalability and sparsity problems. The proposed approach consists of two phases: a sentiment analysis phase and a recommendation phase. The sentiment analysis phase estimates sentiment scores using a lexicon built for the Arabic dataset. Item-based and singular value decomposition-based collaborative filtering are used in the second phase. Overall, our approach improves the experimental results by reducing the mean absolute error and root mean squared error on a large Arabic dataset of 63,000 book reviews.
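The item-based half of the second phase can be sketched with cosine similarity between rating columns. The ratings below are a toy matrix; treating each entry as a star rating already blended with a lexicon sentiment score (e.g. 0.7*stars + 0.3*sentiment) is my illustrative assumption, the blend weights are not from the paper, and the SVD variant is omitted.

```python
import math

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

# Rows = users, columns = books; 0 means "not rated".
R = [
    [5, 4, 0],
    [4, 5, 1],
    [1, 2, 5],
]

def predict_item_based(user, item):
    """Weighted average of the user's other ratings, weighted by
    item-item cosine similarity computed over rating columns."""
    col = lambda j: [R[u][j] for u in range(len(R))]
    num = den = 0.0
    for j in range(len(R[0])):
        if j == item or R[user][j] == 0:
            continue
        w = cosine(col(item), col(j))
        num += w * R[user][j]
        den += abs(w)
    return num / den if den else None

print(round(predict_item_based(0, 2), 2))
```

Folding sentiment into the ratings before computing similarities is what lets review text sharpen the neighborhoods even when star ratings are sparse.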
Context Based Classification of Reviews Using Association Rule Mining, Fuzzy ... (journalBEEI)
The Internet has facilitated the growth of recommendation systems owing to the ease of sharing customer experiences online. Summarizing and streamlining online textual reviews is a challenging task. In this paper, we propose a new framework called the fuzzy based contextual recommendation system. To classify customer reviews, we extract information from the reviews based on the context given by users. We use text mining techniques to tag each review and extract its context, then find the relationships between contexts from an ontological database. We incorporate a fuzzy based semantic analyzer to find the relationship between a review and a context when it is not found in the database. Sentence based classification predicts the relevant reviews, whereas the fuzzy based context method predicts the relevant instances among those reviews. Textual analysis is carried out with a combination of association rules and ontology mining, and the relationship between reviews and their contexts is compared using the fuzzy rule based semantic analyzer.
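The core idea behind a fuzzy semantic analyzer, graded rather than crisp membership, can be sketched with triangular membership functions. The linguistic terms and breakpoints below are invented for illustration, not the framework's actual rules.

```python
def triangular(x, a, b, c):
    """Membership rises linearly on [a, b], falls on [b, c], 0 outside."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def relevance(similarity):
    """Map a 0..1 review-context similarity to fuzzy linguistic terms."""
    return {
        "low":    triangular(similarity, -0.5, 0.0, 0.5),
        "medium": triangular(similarity, 0.0, 0.5, 1.0),
        "high":   triangular(similarity, 0.5, 1.0, 1.5),
    }

print(relevance(0.7))
```

A fuzzy rule base then combines these memberships (e.g. "if relevance is high and context matches, keep the instance") instead of forcing a hard relevant/irrelevant cut.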
COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES (ijseajournal)
Many information technology firms, among other organizations, have been working on how to estimate resources such as funding during software development processes. Software development life cycles require many activities and skills to avoid risks, so the best software estimation technique should be employed. In this research, a comparative study was conducted that considers the accuracy, usage, and suitability of existing methods; it will be useful to project managers and project consultants throughout the software project development process. Both algorithmic and non-algorithmic techniques are considered: the model, composite, and regression techniques underlying COCOMO, COCOMO II, SLIM, and linear multiple regression, respectively, as well as expertise-based and rule-based non-algorithmic methods. However, these techniques need some advancement to reduce the errors experienced during the software development process. This paper therefore proposes a model of software estimation techniques that can help information technology firms, researchers, and other organizations that use information technology in processes such as budgeting and decision-making.
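For reference, the Basic COCOMO equations that the compared model-based techniques descend from are simple power laws with published constants per project class:

```python
# Basic COCOMO: effort (person-months) = a * KLOC^b,
# schedule (months) = c * effort^d, with the standard published constants.
MODES = {
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def cocomo(kloc, mode="organic"):
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b          # person-months
    schedule = c * effort ** d      # calendar months
    return round(effort, 1), round(schedule, 1)

print(cocomo(32, "organic"))  # → (91.3, 13.9)
```

COCOMO II refines this with scale factors and effort multipliers, but the basic form already shows why size estimates (here KLOC) dominate the prediction.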
Amazon products reviews classification based on machine learning, deep learni... (TELKOMNIKA JOURNAL)
In recent times, the trend of online shopping through e-commerce stores and websites has grown to a huge extent. Whenever a product is purchased on an e-commerce platform, people leave reviews about it. These reviews are very helpful to store owners and product manufacturers for improving their work processes as well as product quality. An automated system is proposed in this work that operates on two datasets, D1 and D2, obtained from Amazon. After certain preprocessing steps, N-gram and word embedding-based features are extracted using term frequency-inverse document frequency (TF-IDF) and bag of words (BoW), and global vectors (GloVe) and Word2vec, respectively. Four machine learning (ML) models, support vector machine (SVM), random forest (RF), logistic regression (LR), and multinomial Naïve Bayes (MNB); two deep learning (DL) models, convolutional neural network (CNN) and long short-term memory (LSTM); and standalone bidirectional encoder representations from transformers (BERT) are used to classify reviews as either positive or negative. The results obtained by the standard ML and DL models and BERT are evaluated using certain performance evaluation measures. BERT turns out to be the best-performing model on D1, with an accuracy of 90% on features derived from word embedding models, while the CNN provides the best accuracy, 97%, on word embedding features for D2. The proposed model shows better overall performance on D2 than on D1.
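The TF-IDF weighting named above can be sketched in its classic form (tf = term count / document length, idf = log(N / df)); library implementations add smoothing variants, and the example documents are invented.

```python
import math
from collections import Counter

def tfidf(docs):
    """Classic tf-idf weights per document: rare terms score higher than
    terms that appear in many documents."""
    n = len(docs)
    tokenised = [d.lower().split() for d in docs]
    df = Counter(t for doc in tokenised for t in set(doc))  # document freq.
    weights = []
    for doc in tokenised:
        tf = Counter(doc)
        weights.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = ["good product good price", "bad product", "good delivery"]
W = tfidf(docs)
print(W[0])
```

Note how "price" outweighs "product" in the first review despite identical counts there: "product" occurs in two of the three documents, so its idf is lower. That discounting is what the classifiers above exploit.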
Design, simulation, and analysis of microstrip patch antenna for wireless app... (TELKOMNIKA JOURNAL)
In this study, a microstrip patch antenna operating at 3.6 GHz was designed and its performance evaluated. Rogers RT/Duroid 5880, with a dielectric permittivity of 2.2 and a thickness of 0.3451 mm, is used as the substrate material and serves as the base of the examined antenna. The computer simulation technology (CST) studio suite is used to model the proposed antenna design. The aims of this study were a wider bandwidth, a lower voltage standing wave ratio (VSWR), and a lower return loss, with the main goal being higher gain, directivity, and efficiency. After simulation, the return loss, gain, directivity, bandwidth, and efficiency of the presented antenna are found to be -17.626 dB, 9.671 dBi, 9.924 dBi, 0.2 GHz, and 97.45%, respectively. The simulation also revealed a side-lobe level of -28.8 dB, much better than those of earlier works. This makes the antenna a solid contender for wireless technology and more robust communication.
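The initial patch dimensions for such a design are usually obtained from the standard transmission-line-model equations (width from the permittivity, length from the effective permittivity minus the fringing-length correction). These equations are textbook material, not given in the abstract, and the simulated design may deviate from them.

```python
import math

C = 3e8  # speed of light, m/s

def patch_dimensions(f0, eps_r, h):
    """Transmission-line-model patch width and length (metres) for
    resonance frequency f0, substrate permittivity eps_r, height h."""
    W = C / (2 * f0) * math.sqrt(2 / (eps_r + 1))
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / W) ** -0.5
    # Fringing-field length extension (Hammerstad formula)
    dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / \
         ((eps_eff - 0.258) * (W / h + 0.8))
    L = C / (2 * f0 * math.sqrt(eps_eff)) - 2 * dL
    return W, L

W, L = patch_dimensions(3.6e9, 2.2, 0.3451e-3)
print(round(W * 1000, 2), "mm x", round(L * 1000, 2), "mm")
```

Plugging in the stated 3.6 GHz, permittivity 2.2, and 0.3451 mm substrate gives a patch of roughly 33 mm by 28 mm, a plausible starting point before CST optimization.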
Similar to K-Nearest neighbor algorithm on implicit feedback to determine SOP
A METHOD FOR WEBSITE USABILITY EVALUATION: A COMPARATIVE ANALYSISIJwest
ABSTRACT
Graphical user interfaces design in software development process focuses on maximizing usability and the user's experience, in order to make the interaction for users easy, flexible and efficient. In this paper, we propose an approach for evaluating the usability satisfaction degree of a web-based system. The proposed method has been accomplished in two phases and implemented on an airlines website as a case study. In the first phase, a website usability test is implemented by a number of users, and then the results obtained are translated into charts for a final web-based system evaluation in the second phase. The results achieved
were satisfactory, since the places where the weaknesses and gaps in the website are identified and recommended solutions to avoid them are drawn. The authenticity of the results have been confirmed by comparing them with user opinions acquired from a questionnaire, which proves the precision in which the website is rated.
Sentiment analysis of online licensing service quality in the energy and mine...CSITiaesprime
The Ministry of Energy and Mineral Resources of the Republic of Indonesia regularly assessed public satisfaction with its online licensing services. User rated their satisfaction at 3.42 on a scale of 4, below the organization's average of 3.53. Evaluating public service performance is crucial for quality improvement. Previous research relied solely on survey data to assess public satisfaction. This study goes further by analyzing user feedback in text form from an online licensing application to identify negative aspects of the service that need enhancement. The dataset spanned September 2019 to February 2023, with 24,112 entries. The choice of classification methods on the highest accuracy values among decision tree, random forest, naive bayes, stochastic gradient descent, logistic regression (LR), and k-nearest neighbor. The text data was converted into numerical form using CountVectorizer and term frequency-inverse document frequency (TF-IDF) techniques, along with unigrams and bigrams for dividing sentences into word segments. LR bigram CountVectorizer ranked highest with 89% for average precision, F1-score, and recall, compared to the other five classification methods. The sentiment analysis polarity level was 36.2% negative. Negative sentiment revealed expectations from the public to the ministry to improve the top three aspects: system, mechanism, and procedure; infrastructure and facilities; and service specification product types.
Utility Function-based Pricing Strategies in Maximizing the Information Servi...IJECEIAES
Previous research only focus on maximizing revenue for pricing strategies for information good with regardless the marginal and monitoring costs. This paper aims to focus on the addition of marginal and monitoring costs into the pricing strategies to maintain the maximal revenue while introduce the costs incurred in adopting the strategies. The well-known utility functions applied to also consider the consumer’s satisfaction towards the service offered. The results show that the addition costs incurred for setting up the strategies can also increase the profit for the providers rather than neglecting the costs. It is also showed that the Cobb-Douglas utility functions used can enhance the notion of provider to optimize the revenue compared to quasi linear and perfect substitutes.
Software size estimation at early stages of project development holds great significance to meet
the competitive demands of software industry. Software size represents one of the most
interesting internal attributes which has been used in several effort/cost models as a predictor
of effort and cost needed to design and implement the software. The whole world is focusing
towards object oriented paradigm thus it is essential to use an accurate methodology for
measuring the size of object oriented projects. The class point approach is used to quantify
classes which are the logical building blocks in object oriented paradigm. In this paper, we
propose a class point based approach for software size estimation of On-Line Analytical
Processing (OLAP) systems. OLAP is an approach to swiftly answer decision support queries
based on multidimensional view of data. Materialized views can significantly reduce the
execution time for decision support queries. We perform a case study based on the TPC-H
benchmark which is a representative of OLAP System. We have used a Greedy based approach
to determine a good set of views to be materialized. After finding the number of views, the class
point approach is used to estimate the size of an OLAP System The results of our approach are
validated.
UI/UX integrated holistic monitoring of PAUD using the TCSD method (journalBEEI)
User interface (UI)/user experience (UX) design is one stage of system development that produces interactive and attractive web-based application layouts that are easy for users to understand and use. In this research, a case study was conducted on early childhood education at PAUD Kuntum Mekar. The design of a UI/UX model for holistic integrative PAUD monitoring is one solution to help parents and teachers. The UI/UX was designed with the task centered system design (TCSD) approach, which proceeds through four stages: 1) identifying the scope of use, 2) user-centered requirement analysis, 3) design and scenario, and 4) walkthrough evaluation; the system was tested with user satisfaction and heuristic usability methods. The aim of this study is to show that UI/UX design with TCSD can provide valid data on the needs of each actor, based on the task assignments and the storyboard design of the developed system. The result of this research is a UI/UX model design for an integrative holistic PAUD monitoring application.
Dashboard settings design in SVARA using user-centred design method (TELKOMNIKA JOURNAL)
SVARA is the first social-media audio application in Indonesia, developed by PT. Zamrud Teknologi Khatulistiwa. At present, the application has no settings feature for displaying content or other basic options on the user's side, so users have no role in managing the appearance of the dashboard according to their preferences. Settings are made entirely by administrators using scripts and must call APIs with plain PHP scripts, which is very cumbersome. To give users a role in managing their view, a dashboard settings feature needs to be added to the application. In this paper, the researchers propose designing this dashboard feature using the user-centered design (UCD) method. The design results show that this method correlates positively with support for user involvement in the application development process.
ESTIMATING THE EFFORT OF MOBILE APPLICATION DEVELOPMENT (csandit)
The rise of the use of mobile technologies in the world, such as smartphones and tablets,
connected to mobile networks is changing old habits and creating new ways for the society to
access information and interact with computer systems. Thus, traditional information systems
are undergoing a process of adaptation to this new computing context. However, it is important
to note that the characteristics of this new context are different. There are new features and,
thereafter, new possibilities, as well as restrictions that did not exist before. As a result, the systems developed for this environment have different requirements and characteristics from those of traditional information systems. For this reason, there is a need to reassess the current
knowledge about the processes of planning and building for the development of systems in this
new environment. One area in particular that demands such adaptation is software estimation.
The estimation processes, in general, are based on characteristics of the systems, trying to quantify the complexity of implementing them. Hence, the main objective of this paper is to present an effort estimation model for mobile applications, as well as to discuss the applicability of traditional estimation models for the purpose of developing systems in the context of mobile computing.
E-commerce online review for detecting influencing factors users perception (journalBEEI)
To date, online shopping using e-commerce services has become a trend. The emergence of e-commerce truly helps people to shop more effectively and efficiently. However, there are still some problems encountered in e-commerce, especially from the user perspective. This research aims to explore user review data, particularly the factors that influence user perception of e-commerce applications, classify them, and identify potential solutions to the problems found in e-commerce applications. Data are grabbed using web scraping techniques and classified using a suitable machine learning method, i.e., a support vector machine (SVM). Text association and fishbone analyses are performed on the classified user review data. The results of this study show that the user satisfaction problem can be captured. Furthermore, various services can be derived as potential solutions to the problems experienced by customers or perceived by application users. A detailed discussion of these findings is available in this article.
Knowledge management system SOP using semantic networks connected with person... (TELKOMNIKA JOURNAL)
Document Standard Operating Procedures (SOP) can manage business processes and employee performance in an organization. This study aims to develop a system that can publish SOP documents and distribute them automatically to employees. An analysis of the relationship between SOP documents and employees is carried out so that documents can be allocated directly to employees according to their positions. The knowledge relationships between SOP documents and employees are analyzed using semantic network analysis, which models the components of knowledge and the relations between SOP documents and employees. The analysis found 25 SOP documents consisting of 6 types of central nodes with 156 child nodes, and 7 types of relations comprising 207 relations. The SOP knowledge management system is connected to the personnel information system (SIPEG), making it easier for users to find, organize, and manage the knowledge contained in SOP documents. The system is implemented using the PHP programming language with the CodeIgniter framework, a REST API, and a MySQL database.
Improving collaborative filtering using lexicon-based sentiment analysis (IJECEIAES)
Since data is increasingly available on the Internet, efforts are needed to develop and improve recommender systems that produce lists of likely favorite items. In this paper, we expand our previous work to enhance the accuracy of Arabic collaborative filtering by applying sentiment analysis to user reviews; we also address major problems of the current work by applying effective techniques to handle the scalability and sparsity problems. The proposed approach consists of two phases: sentiment analysis and recommendation. The sentiment analysis phase estimates sentiment scores using a special lexicon for the Arabic dataset. Item-based and singular value decomposition-based collaborative filtering are used in the second phase. Overall, our proposed approach improves on earlier results by reducing the mean absolute and root mean squared errors on a large Arabic dataset consisting of 63,000 book reviews.
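A minimal sketch of the recommendation phase described above, assuming made-up ratings, made-up lexicon sentiment scores, and a simple linear blend of the two (the paper's actual weighting scheme and its SVD variant are not reproduced here):

```python
import numpy as np

R = np.array([[5, 3, 0],                 # explicit ratings, 0 = unrated
              [4, 0, 4],
              [1, 5, 5]], dtype=float)
S = np.array([[0.9, 0.4, 0.0],           # lexicon sentiment in [0, 1] (assumed)
              [0.8, 0.0, 0.7],
              [0.1, 0.9, 1.0]])
alpha = 0.7                              # assumed weight of the explicit rating
# blend sentiment into the rated entries only; 5 * S maps sentiment to the
# rating scale
A = np.where(R > 0, alpha * R + (1 - alpha) * 5 * S, 0.0)

def cosine(a, b):
    n = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / n if n else 0.0

def predict(u, i):
    """Item-based prediction: similarity-weighted mean of user u's other items."""
    num = den = 0.0
    for j in range(A.shape[1]):
        if j != i and A[u, j] > 0:
            w = cosine(A[:, i], A[:, j])
            num += w * A[u, j]
            den += abs(w)
    return num / den if den else 0.0
```

Calling `predict(0, 2)` fills user 0's missing third rating from the two items that user did rate, weighted by item-item similarity on the sentiment-adjusted matrix.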
Context Based Classification of Reviews Using Association Rule Mining, Fuzzy ... (journalBEEI)
The Internet has facilitated the growth of recommendation systems owing to the ease of sharing customer experiences online. Summarizing and streamlining online textual reviews is a challenging task. In this paper, we propose a new framework called the fuzzy-based contextual recommendation system. To classify customer reviews, we extract information from the reviews based on the context given by users. We use text mining techniques to tag each review and extract its context, and then find the relationships between contexts from an ontological database. When a relationship between a review and a context is not found there, we employ a fuzzy-based semantic analyzer to infer it. Sentence-based classification predicts the relevant reviews, whereas the fuzzy-based context method predicts the relevant instances among those reviews. Textual analysis is carried out with a combination of association rules and ontology mining, and the relationships between reviews and their contexts are compared using the fuzzy-rule-based semantic analyzer.
COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES (ijseajournal)
Many information technology firms, among other organizations, have been working on how to estimate resources such as funds during software development processes. Software development life cycles involve many activities and skills to avoid risks, so the best software estimation technique should be employed. Therefore, in this research a comparative study was conducted that considers the accuracy, usage, and suitability of existing methods; it will be useful for project managers and project consultants throughout the software project development process. Both algorithmic and non-algorithmic techniques are covered: model, composite, and regression techniques underlie COCOMO, COCOMO II, SLIM, and linear multiple regression, respectively, while expertise-based and rule-based approaches make up the non-algorithmic methods. However, these techniques need further refinement to reduce the errors experienced during the software development process. Accordingly, this paper proposes a model for software estimation that can help information technology firms, researchers, and other organizations that use information technology in processes such as budgeting and decision-making.
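For concreteness, one member of the algorithmic family such studies compare, basic COCOMO, fits in a few lines; the constants are Boehm's published organic-mode values, and the 10 KLOC input is purely illustrative:

```python
def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO, organic mode: effort in person-months and schedule in
    months, estimated from size (KLOC) alone."""
    effort = a * kloc ** b          # person-months
    schedule = c * effort ** d      # calendar months
    return effort, schedule

effort, months = cocomo_basic(10)   # roughly 27 person-months over ~9 months
```

COCOMO II and SLIM refine this size-only view with cost drivers and scale factors, which is precisely the kind of accuracy trade-off the comparative study examines.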
Similar to K-Nearest neighbor algorithm on implicit feedback to determine SOP
Amazon products reviews classification based on machine learning, deep learni... (TELKOMNIKA JOURNAL)
In recent times, the trend of online shopping through e-commerce stores and websites has grown to a huge extent. Whenever a product is purchased on an e-commerce platform, people leave their reviews about the product. These reviews are very helpful to store owners and product manufacturers for improving their work processes as well as product quality. An automated system is proposed in this work that operates on two datasets, D1 and D2, obtained from Amazon. After certain preprocessing steps, N-gram and word embedding-based features are extracted using term frequency-inverse document frequency (TF-IDF), bag of words (BoW), and global vectors (GloVe) and Word2vec, respectively. Four machine learning (ML) models, support vector machine (SVM), random forest (RF), logistic regression (LR), and multinomial Naïve Bayes (MNB); two deep learning (DL) models, convolutional neural network (CNN) and long short-term memory (LSTM); and standalone bidirectional encoder representations from transformers (BERT) are used to classify reviews as either positive or negative. The results obtained by the standard ML and DL models and BERT are evaluated using certain performance measures. BERT turns out to be the best-performing model on D1, with an accuracy of 90% on features derived by word embedding models, while the CNN provides the best accuracy of 97% on word embedding features in the case of D2. The proposed model shows better overall performance on D2 than on D1.
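Of the feature extractors named above, TF-IDF is easy to sketch from scratch; the toy tokenized corpus below is an assumption for illustration, not the Amazon data:

```python
import math

# toy review corpus (already tokenized); illustrative only
docs = [["good", "battery", "good", "price"],
        ["bad", "battery", "poor", "quality"],
        ["good", "quality", "fast", "delivery"]]

def tfidf(docs):
    """Return one {term: tf * idf} dict per document."""
    n = len(docs)
    df = {}                                    # document frequency per term
    for d in docs:
        for t in set(d):
            df[t] = df.get(t, 0) + 1
    vectors = []
    for d in docs:
        vec = {}
        for t in set(d):
            tf = d.count(t) / len(d)           # term frequency in this doc
            vec[t] = tf * math.log(n / df[t])  # times inverse document frequency
        vectors.append(vec)
    return vectors

vecs = tfidf(docs)
```

Terms unique to one document ("bad", "price") get the largest idf weight, while a term in every document would score zero; classifiers such as the SVM or LR above then consume these weighted vectors.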
Design, simulation, and analysis of microstrip patch antenna for wireless app... (TELKOMNIKA JOURNAL)
In this study, a microstrip patch antenna that works at 3.6 GHz was built and tested to evaluate its performance. Rogers RT/Duroid 5880 has been used as the substrate material, with a dielectric permittivity of 2.2 and a thickness of 0.3451 mm; it serves as the base for the examined antenna. The computer simulation technology (CST) studio suite is utilized to model the recommended antenna design. The goals of this study were a wider bandwidth, a lower voltage standing wave ratio (VSWR), and a lower return loss, but the main goal was higher gain, directivity, and efficiency. After simulation, the return loss, gain, directivity, bandwidth, and efficiency of the presented antenna are found to be -17.626 dB, 9.671 dBi, 9.924 dBi, 0.2 GHz, and 97.45%, respectively. Moreover, the simulation revealed a side-lobe level of -28.8 dB, much better than those of earlier works. Thus, the antenna is a solid contender for wireless technology and more robust communication.
Design and simulation an optimal enhanced PI controller for congestion avoida... (TELKOMNIKA JOURNAL)
In this paper, the snake optimization algorithm (SOA) is used to find the optimal gains of an enhanced controller for the congestion problem in computer networks. M-file and Simulink platforms are adopted to evaluate the response of the active queue management (AQM) system, and a comparison with two classical controllers is made; all controller gains are tuned using the SOA method, and the fitness function chosen to monitor system performance is the integral time absolute error (ITAE). Transient analysis and robustness analysis are used to show the proposed controller's performance. Two robustness tests are applied to the AQM system: one varies the queue size over different periods, and the other changes the number of transmission control protocol (TCP) sessions by ±20% from the original value. The simulation results reflect stable and robust behavior, with the best performance clearly shown in achieving the desired queue size without any noise or transmission problems.
Improving the detection of intrusion in vehicular ad-hoc networks with modifi... (TELKOMNIKA JOURNAL)
Vehicular ad-hoc networks (VANETs) are wireless-equipped vehicles that form networks along the road. The security of this network has been a major challenge. The identity-based cryptosystem (IBC) previously used to secure the networks suffers from membership authentication security features. This paper focuses on improving the detection of intruders in VANETs with a modified identity-based cryptosystem (MIBC). The MIBC is developed using a non-singular elliptic curve with Lagrange interpolation. The public key of vehicles and roadside units on the network are derived from number plates and location identification numbers, respectively. Pseudo-identities are used to mask the real identity of users to preserve their privacy. The membership authentication mechanism ensures that only valid and authenticated members of the network are allowed to join the network. The performance of the MIBC is evaluated using intrusion detection ratio (IDR) and computation time (CT) and then validated with the existing IBC. The result obtained shows that the MIBC recorded an IDR of 99.3% against 94.3% obtained for the existing identity-based cryptosystem (EIBC) for 140 unregistered vehicles attempting to intrude on the network. The MIBC shows lower CT values of 1.17 ms against 1.70 ms for EIBC. The MIBC can be used to improve the security of VANETs.
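The Lagrange-interpolation ingredient of the MIBC can be illustrated in isolation; the prime, polynomial, and share values below are toy assumptions, and the non-singular elliptic curve and pseudo-identity machinery are omitted:

```python
P = 2**31 - 1   # illustrative prime modulus, not a parameter from the paper

def lagrange_at_zero(points, p=P):
    """Recover f(0) from t points (x, f(x)) of a degree-(t-1) polynomial mod p."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % p        # numerator of the basis poly at 0
                den = den * (xi - xj) % p    # its denominator
        # pow(den, -1, p) is the modular inverse (Python 3.8+)
        secret = (secret + yi * num * pow(den, -1, p)) % p
    return secret

# shares of the toy polynomial f(x) = 123 + 45x + 6x^2:
# f(1) = 174, f(2) = 237, f(3) = 312
recovered = lagrange_at_zero([(1, 174), (2, 237), (3, 312)])  # -> 123
```

Any t valid shares reconstruct the constant term, which is the property membership-authentication schemes of this kind rely on: fewer than t colluding parties learn nothing about f(0).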
Conceptual model of internet banking adoption with perceived risk and trust f... (TELKOMNIKA JOURNAL)
Understanding the primary factors of internet banking (IB) acceptance is critical for both banks and users; nevertheless, our knowledge of the role of users' perceived risk and trust in IB adoption is limited. As a result, we develop a conceptual model by incorporating perceived risk and trust into the technology acceptance model (TAM) theory for IB. Prior research emphasized that the most essential component in explaining IB adoption behavior is the behavioral intention to use IB. TAM is helpful for figuring out how the elements that affect IB adoption are connected to one another. Given the previous literature on IB and the use of such technology in Iraq, one has to choose a theoretical foundation that can explain the acceptance of IB from the customer's perspective; the conceptual model was therefore constructed using the TAM as a foundation. Furthermore, perceived risk and trust were added to the TAM dimensions as external factors. The key objective of this work was to extend the TAM into a conceptual model for IB adoption and to gather sufficient theoretical support from the existing literature for the essential elements and their relationships, in order to unearth new insights about the factors responsible for IB adoption.
Efficient combined fuzzy logic and LMS algorithm for smart antenna (TELKOMNIKA JOURNAL)
Smart antennas are broadly used in wireless communication. The least mean square (LMS) algorithm is a procedure for controlling the smart antenna pattern to accommodate specified requirements, such as steering the beam toward the desired signal while placing deep nulls in the directions of unwanted signals. The conventional LMS (C-LMS) has some drawbacks, such as slow convergence speed and high steady-state fluctuation error. To overcome these shortcomings, the present paper adopts an adaptive fuzzy control step size least mean square (FC-LMS) algorithm to adjust the step size. Computer simulation results illustrate that the given model has a fast convergence rate as well as a low steady-state mean square error.
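A minimal sketch of the baseline C-LMS update, with a simple decaying step size standing in for the paper's fuzzy step-size controller (the fuzzy rules themselves and the array geometry are not reproduced; the 3-tap system and all constants are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2])   # unknown optimal weights (toy system)
w = np.zeros(3)                       # adaptive weights, start at zero
mu = 0.05                             # initial step size

for _ in range(2000):
    x = rng.standard_normal(3)        # input snapshot
    d = w_true @ x                    # desired (reference) signal, noiseless
    e = d - w @ x                     # a-priori error
    w += 2 * mu * e * x               # classic LMS weight update
    # crude stand-in for fuzzy step control: shrink mu as adaptation proceeds,
    # trading the large early steps (fast convergence) for small late steps
    # (low steady-state fluctuation)
    mu = max(0.005, 0.999 * mu)
```

The large initial `mu` drives fast convergence and the small final `mu` keeps steady-state misadjustment low, which is exactly the trade-off the fuzzy controller in the paper automates from the error signal.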
Design and implementation of a LoRa-based system for warning of forest fire (TELKOMNIKA JOURNAL)
This paper presents the design and implementation of a forest fire monitoring and warning system based on long range (LoRa) technology, a novel ultra-low power consumption and long-range wireless communication technology for remote sensing applications. The proposed system includes a wireless sensor network that records environmental parameters such as temperature, humidity, wind speed, and carbon dioxide (CO2) concentration in the air, as well as taking infrared photos. The data collected at each sensor node will be transmitted to the gateway via LoRa wireless transmission. Data will be collected, processed, and uploaded to a cloud database at the gateway. An Android smartphone application that allows anyone to easily view the recorded data has been developed. When a fire is detected, the system will sound a siren and send a warning message to the responsible personnel, instructing them to take appropriate action. Experiments in Tram Chim Park, Vietnam, have been conducted to verify and evaluate the operation of the system.
Wavelet-based sensing technique in cognitive radio network (TELKOMNIKA JOURNAL)
Cognitive radio is a smart radio that can change its transmitter parameters based on interaction with the environment in which it operates. The demand for frequency spectrum is growing due to big data issues, as many Internet of Things (IoT) devices are in the network. Previous research found that most of the frequency spectrum was occupied, but some bands, called spectrum holes, were not used. Energy detection is one of the most frequently used spectrum sensing methods, since it is easy to use and does not require any prior knowledge of the licensed users' signals. However, this technique is incapable of detecting at low signal-to-noise ratio (SNR) levels. Therefore, wavelet-based sensing is proposed to overcome this issue and detect spectrum holes. The main objective of this work is to evaluate the performance of wavelet-based sensing and compare it with the energy detection technique. The findings show that the percentage of detection in wavelet-based sensing is 83% higher than the energy detection performance. This result indicates that wavelet-based sensing has higher precision in detection and that interference towards the primary user can be decreased.
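The energy-detection baseline that wavelet-based sensing is compared against can be sketched as follows; the sample count, SNR values, and 3-sigma threshold rule are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000                 # samples per sensing window (assumed)
noise_var = 1.0          # calibrated noise power
# decision threshold: noise floor plus ~3 standard deviations of the
# average-energy estimate (an assumed calibration rule)
threshold = noise_var * (1 + 3 * np.sqrt(2 / N))

def detect(snr_db, present):
    """True if the window's average energy exceeds the threshold."""
    amp = np.sqrt(noise_var * 10 ** (snr_db / 10))
    n = np.arange(N)
    s = amp * np.sin(2 * np.pi * 0.1 * n) if present else np.zeros(N)
    y = s + rng.normal(0.0, np.sqrt(noise_var), N)
    return bool(np.mean(y ** 2) > threshold)
```

At 5 dB SNR the signal lifts the average energy far above the threshold and detection succeeds, but as the SNR drops the signal's energy contribution sinks into the estimator's own fluctuation, which is the low-SNR failure mode the abstract describes.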
A novel compact dual-band bandstop filter with enhanced rejection bands (TELKOMNIKA JOURNAL)
In this paper, we present the design of a new wide dual-band bandstop filter (DBBSF) using nonuniform transmission lines. The method used to design this filter is to replace conventional uniform transmission lines with nonuniform lines governed by a truncated Fourier series. Based on how impedances are profiled in the proposed DBBSF structure, the fractional bandwidths of the two 10 dB-down rejection bands are widened to 39.72% and 52.63%, respectively, and the physical size has been reduced compared to that of the filter with the uniform transmission lines. The results of the electromagnetic (EM) simulation support the obtained analytical response and show an improved frequency behavior.
Deep learning approach to DDoS attack with imbalanced data at the application... (TELKOMNIKA JOURNAL)
A distributed denial of service (DDoS) attack is one where one or more computers attack or target a server by flooding it with internet traffic, so that the server cannot be accessed by legitimate users. Such an attack causes enormous losses for a company because it can reduce user trust and damage the company's reputation, losing customers due to downtime. One of the application-layer services accessible to users is the web-based lightweight directory access protocol (LDAP) service, which provides safe and easy access to directory applications. We used a deep learning approach to detect DDoS attacks on the CICDDoS 2019 dataset, collected on a complex computer network at the application layer, to get fast and accurate results when dealing with imbalanced data. Based on the results obtained, DDoS attack detection using a deep learning approach on imbalanced data performs better for binary classes when implemented with the synthetic minority oversampling technique (SMOTE). On the other hand, the proposed deep learning approach performs better for detecting DDoS attacks in the multiclass setting when implemented with the adaptive synthetic (ADASYN) method.
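A minimal, hand-rolled sketch of the SMOTE idea used above for the binary case: synthesize minority samples on the segment between a point and one of its nearest minority-class neighbours (imbalanced-learn's production implementation differs in details; the 2-D minority set below is a toy assumption):

```python
import numpy as np

def smote(X, n_new, k=3, seed=0):
    """Create n_new synthetic minority samples by interpolating from a
    random minority point towards one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X))                 # pick a random minority sample
        d = np.linalg.norm(X - X[i], axis=1)
        nn = np.argsort(d)[1:k + 1]              # k nearest, excluding itself
        j = rng.choice(nn)
        gap = rng.random()                       # random position on the segment
        out.append(X[i] + gap * (X[j] - X[i]))
    return np.array(out)

minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synthetic = smote(minority, n_new=4)             # new points inside the hull
```

Because every synthetic point is a convex combination of two real minority samples, the oversampled class fills out its own region of feature space instead of merely duplicating points, which is what helps a classifier on imbalanced attack traffic.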
The appearance of uncertainties and disturbances often affects the characteristics of both linear and nonlinear systems. Moreover, the stabilization process may deteriorate, incurring a catastrophic effect on system performance. As such, this manuscript addresses the concept of the matching condition for systems that suffer from mismatched uncertainties and exogenous disturbances. The perturbation acting on the system at hand is assumed to be known and unbounded. To reach this outcome, uncertainties and their classifications are reviewed thoroughly. The structural matching condition is proposed and stated in Proposition 1. Two types of mathematical expressions are presented to distinguish the system with matched uncertainty from the system with mismatched uncertainty. Lastly, two-dimensional numerical examples are provided to exercise the proposed proposition. The outcome shows that the matching condition has the ability to transform the system into a design-friendly model for asymptotic stabilization.
Implementation of FinFET technology based low power 4×4 Wallace tree multipli... (TELKOMNIKA JOURNAL)
Many systems, including digital signal processors, finite impulse response (FIR) filters, application-specific integrated circuits, and microprocessors, use multipliers. The demand for low-power multipliers is gradually rising in the current technological trend. In this study, we describe a 4×4 Wallace multiplier based on a carry select adder (CSA) that uses less power and has a better power delay product than existing multipliers. The HSPICE tool at 16 nm technology is used to simulate the results. In comparison to the traditional CSA-based multiplier, which has a power consumption of 1.7 µW and a power delay product (PDP) of 57.3 fJ, the results demonstrate that the Wallace multiplier design employing a CSA with first zero finding (FZF) logic has the lowest power consumption of 1.4 µW and a PDP of 27.5 fJ.
Evaluation of the weighted-overlap add model with massive MIMO in a 5G system (TELKOMNIKA JOURNAL)
The flaw in 5G orthogonal frequency division multiplexing (OFDM) becomes apparent in high-speed situations: because the Doppler effect causes frequency shifts, the orthogonality of OFDM subcarriers is broken, lowering both bit error rate (BER) and throughput. As part of this research, we use a novel design that combines massive multiple input multiple output (MIMO) and weighted overlap and add (WOLA) to improve the performance of 5G systems. To determine which design is superior, throughput and BER are calculated for both the proposed design and OFDM. The results show a massive improvement in performance over the conventional system and significant gains with massive MIMO, including the best throughput and BER. Compared to the conventional system, the improved system achieves a throughput around 22% higher and the best performance in terms of BER, with around 25% fewer errors than OFDM.
Reflector antenna design in different frequencies using frequency selective s... (TELKOMNIKA JOURNAL)
In this study, the aim is to obtain, from a single antenna, two different asymmetric radiation patterns at two different frequencies: one from an antenna shaped like the cross-section of a parabolic reflector (a fan blade type antenna) and one from an antenna with a cosecant-squared radiation characteristic. For this purpose, a fan blade type antenna is designed first, and then its reflective surface is completed to the shape of the reflective surface of the cosecant-squared antenna using a frequency selective surface designed to provide the characteristics suitable for the purpose. The designed frequency selective surface provides transmission as close to perfect as possible at the 4 GHz operating frequency, while acting as a band-stop filter, and thus a reflective surface, for electromagnetic waves at the 5 GHz operating frequency. Thanks to this frequency selective surface used as a reflective surface in the antenna, a fan blade type radiation characteristic is obtained at the 4 GHz operating frequency, while a cosecant-squared radiation characteristic is obtained at the 5 GHz operating frequency.
Reagentless iron detection in water based on unclad fiber optical sensor (TELKOMNIKA JOURNAL)
A simple and low-cost fiber-based optical sensor for iron detection is demonstrated in this paper. The sensor head consists of an unclad optical fiber with an unclad length of 1 cm in a straight structure. The results obtained show a linear relationship between the output light intensity and iron concentration, illustrating the functionality of this iron optical sensor. Based on the experimental results, the sensitivity and linearity achieved are 0.0328/ppm and 0.9824, respectively, at the wavelength of 690 nm. At the same wavelength, other performance parameters are also studied: the resolution and limit of detection (LOD) are found to be 0.3049 ppm and 0.0755 ppm, respectively. This iron sensor is advantageous in that it does not require any reagent for detection, making the implementation of iron sensing simpler and more cost-effective.
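The reported sensitivity (slope) and linearity come from a straight-line calibration fit, which can be sketched as follows; the intensity readings below are made-up numbers chosen to give a slope near the paper's 0.0328/ppm, not measured data:

```python
import numpy as np

conc = np.array([0.0, 1.0, 2.0, 3.0, 4.0])                  # iron, ppm
intensity = np.array([1.000, 0.968, 0.935, 0.901, 0.869])   # output, a.u. (assumed)

# least-squares line: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc, intensity, 1)
r = np.corrcoef(conc, intensity)[0, 1]   # Pearson correlation (linearity)

sensitivity = abs(slope)                 # a.u. per ppm; ~0.033 for these numbers
```

The magnitude of the slope plays the role of the sensitivity figure, and `r` (or `r**2`) quantifies how linear the response is; a resolution or LOD estimate would additionally need the noise level of the intensity readout.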
Impact of CuS counter electrode calcination temperature on quantum dot sensit... (TELKOMNIKA JOURNAL)
In place of the commercial Pt electrode used in quantum sensitized solar cells, the low-cost CuS cathode is created using electrophoresis. High resolution scanning electron microscopy and X-ray diffraction were used to analyze the structure and morphology of structural cubic samples with diameters ranging from 40 nm to 200 nm. The conversion efficiency of solar cells is significantly impacted by the calcination temperatures of cathodes at 100 °C, 120 °C, 150 °C, and 180 °C under vacuum. The fluorine doped tin oxide (FTO)/CuS cathode electrode reached a maximum efficiency of 3.89% when it was calcined at 120 °C. Compared to other temperature combinations, CuS nanoparticles crystallize at 120 °C, which lowers resistance while increasing electron lifetime.
A progressive learning for structural tolerance online sequential extreme lea... (TELKOMNIKA JOURNAL)
This article discusses progressive learning for the structural tolerance online sequential extreme learning machine (PSTOS-ELM). PSTOS-ELM can preserve robust accuracy while updating on new data and new-class data in online training situations. The robust accuracy arises from using the householder block exact QR decomposition recursive least squares (HBQRD-RLS) in PSTOS-ELM. This method is suitable for applications with streaming data that often encounter new class data. Our experiments compare the accuracy and accuracy robustness of PSTOS-ELM during data updates with those of the batch extreme learning machine (ELM) and the structural tolerance online sequential extreme learning machine (STOS-ELM), both of which must retrain on the data when new class data appear. The experimental results show that PSTOS-ELM has accuracy and robustness comparable to ELM and STOS-ELM while also being able to update on new class data immediately.
Electroencephalography-based brain-computer interface using neural networks (TELKOMNIKA JOURNAL)
This study aimed to develop a brain-computer interface that can control an electric wheelchair using electroencephalography (EEG) signals. First, we used the Mind Wave Mobile 2 device to capture raw EEG signals from the surface of the scalp. The signals were transformed into the frequency domain using fast Fourier transform (FFT) and filtered to monitor changes in attention and relaxation. Next, we performed time and frequency domain analyses to identify features for five eye gestures: opened, closed, blink per second, double blink, and lookup. The base state was the opened-eyes gesture, and we compared the features of the remaining four action gestures to the base state to identify potential gestures. We then built a multilayer neural network to classify these features into five signals that control the wheelchair’s movement. Finally, we designed an experimental wheelchair system to test the effectiveness of the proposed approach. The results demonstrate that the EEG classification was highly accurate and computationally efficient. Moreover, the average performance of the brain-controlled wheelchair system was over 75% across different individuals, which suggests the feasibility of this approach.
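The FFT-based feature step described above can be sketched with a synthetic one-second signal standing in for raw EEG; the sampling rate and band edges are common conventions assumed here, not the paper's exact pipeline:

```python
import numpy as np

fs = 256                                   # sampling rate in Hz (assumed)
t = np.arange(fs) / fs                     # one second of samples
# synthetic "EEG": strong 10 Hz (alpha, relaxation-related) component plus a
# weak 20 Hz (beta, attention-related) component
sig = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

spec = np.abs(np.fft.rfft(sig)) ** 2       # power spectrum of the window
freqs = np.fft.rfftfreq(len(sig), 1 / fs)  # frequency of each FFT bin

def band_power(lo, hi):
    """Total spectral power between lo (inclusive) and hi (exclusive) Hz."""
    return float(spec[(freqs >= lo) & (freqs < hi)].sum())

alpha = band_power(8, 13)                  # dominated by the 10 Hz component
beta = band_power(13, 30)
```

Band powers computed per window like this (together with blink-related time-domain features) are the kind of feature vector a multilayer network can map to the five control signals.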
Adaptive segmentation algorithm based on level set model in medical imagingTELKOMNIKA JOURNAL
For image segmentation, level set models are frequently employed. It offer best solution to overcome the main limitations of deformable parametric models. However, the challenge when applying those models in medical images stills deal with removing blurs in image edges which directly affects the edge indicator function, leads to not adaptively segmenting images and causes a wrong analysis of pathologies wich prevents to conclude a correct diagnosis. To overcome such issues, an effective process is suggested by simultaneously modelling and solving systems’ two-dimensional partial differential equations (PDE). The first PDE equation allows restoration using Euler’s equation similar to an anisotropic smoothing based on a regularized Perona and Malik filter that eliminates noise while preserving edge information in accordance with detected contours in the second equation that segments the image based on the first equation solutions. This approach allows developing a new algorithm which overcome the studied model drawbacks. Results of the proposed method give clear segments that can be applied to any application. Experiments on many medical images in particular blurry images with high information losses, demonstrate that the developed approach produces superior segmentation results in terms of quantity and quality compared to other models already presented in previeous works.
Automatic channel selection using shuffled frog leaping algorithm for EEG bas...TELKOMNIKA JOURNAL
Drug addiction is a complex neurobiological disorder that necessitates comprehensive treatment of both the body and mind. It is categorized as a brain disorder due to its impact on the brain. Various methods such as electroencephalography (EEG), functional magnetic resonance imaging (FMRI), and magnetoencephalography (MEG) can capture brain activities and structures. EEG signals provide valuable insights into neurological disorders, including drug addiction. Accurate classification of drug addiction from EEG signals relies on appropriate features and channel selection. Choosing the right EEG channels is essential to reduce computational costs and mitigate the risk of overfitting associated with using all available channels. To address the challenge of optimal channel selection in addiction detection from EEG signals, this work employs the shuffled frog leaping algorithm (SFLA). SFLA facilitates the selection of appropriate channels, leading to improved accuracy. Wavelet features extracted from the selected input channel signals are then analyzed using various machine learning classifiers to detect addiction. Experimental results indicate that after selecting features from the appropriate channels, classification accuracy significantly increased across all classifiers. Particularly, the multi-layer perceptron (MLP) classifier combined with SFLA demonstrated a remarkable accuracy improvement of 15.78% while reducing time complexity.
Quality defects in TMT Bars, Possible causes and Potential Solutions.PrashantGoswami42
Maintaining high-quality standards in the production of TMT bars is crucial for ensuring structural integrity in construction. Addressing common defects through careful monitoring, standardized processes, and advanced technology can significantly improve the quality of TMT bars. Continuous training and adherence to quality control measures will also play a pivotal role in minimizing these defects.
Democratizing Fuzzing at Scale by Abhishek Aryaabh.arya
Presented at NUS: Fuzzing and Software Security Summer School 2024
This keynote talks about the democratization of fuzzing at scale, highlighting the collaboration between open source communities, academia, and industry to advance the field of fuzzing. It delves into the history of fuzzing, the development of scalable fuzzing platforms, and the empowerment of community-driven research. The talk will further discuss recent advancements leveraging AI/ML and offer insights into the future evolution of the fuzzing landscape.
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...Amil Baba Dawood bangali
Contact with Dawood Bhai Just call on +92322-6382012 and we'll help you. We'll solve all your problems within 12 to 24 hours and with 101% guarantee and with astrology systematic. If you want to take any personal or professional advice then also you can call us on +92322-6382012 , ONLINE LOVE PROBLEM & Other all types of Daily Life Problem's.Then CALL or WHATSAPP us on +92322-6382012 and Get all these problems solutions here by Amil Baba DAWOOD BANGALI
#vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore#blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #blackmagicforlove #blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #Amilbabainuk #amilbabainspain #amilbabaindubai #Amilbabainnorway #amilbabainkrachi #amilbabainlahore #amilbabaingujranwalan #amilbabainislamabad
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
Automobile Management System Project Report.pdfKamal Acharya
The proposed project is developed to manage the automobile in the automobile dealer company. The main module in this project is login, automobile management, customer management, sales, complaints and reports. The first module is the login. The automobile showroom owner should login to the project for usage. The username and password are verified and if it is correct, next form opens. If the username and password are not correct, it shows the error message.
When a customer search for a automobile, if the automobile is available, they will be taken to a page that shows the details of the automobile including automobile name, automobile ID, quantity, price etc. “Automobile Management System” is useful for maintaining automobiles, customers effectively and hence helps for establishing good relation between customer and automobile organization. It contains various customized modules for effectively maintaining automobiles and stock information accurately and safely.
When the automobile is sold to the customer, stock will be reduced automatically. When a new purchase is made, stock will be increased automatically. While selecting automobiles for sale, the proposed software will automatically check for total number of available stock of that particular item, if the total stock of that particular item is less than 5, software will notify the user to purchase the particular item.
Also when the user tries to sale items which are not in stock, the system will prompt the user that the stock is not enough. Customers of this system can search for a automobile; can purchase a automobile easily by selecting fast. On the other hand the stock of automobiles can be maintained perfectly by the automobile shop manager overcoming the drawbacks of existing system.
Vaccine management system project report documentation..pdfKamal Acharya
The Division of Vaccine and Immunization is facing increasing difficulty monitoring vaccines and other commodities distribution once they have been distributed from the national stores. With the introduction of new vaccines, more challenges have been anticipated with this additions posing serious threat to the already over strained vaccine supply chain system in Kenya.
Saudi Arabia stands as a titan in the global energy landscape, renowned for its abundant oil and gas resources. It's the largest exporter of petroleum and holds some of the world's most significant reserves. Let's delve into the top 10 oil and gas projects shaping Saudi Arabia's energy future in 2024.
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Dr.Costas Sachpazis
Terzaghi's soil bearing capacity theory, developed by Karl Terzaghi, is a fundamental principle in geotechnical engineering used to determine the bearing capacity of shallow foundations. This theory provides a method to calculate the ultimate bearing capacity of soil, which is the maximum load per unit area that the soil can support without undergoing shear failure. The Calculation HTML Code included.
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdffxintegritypublishin
Advancements in technology unveil a myriad of electrical and electronic breakthroughs geared towards efficiently harnessing limited resources to meet human energy demands. The optimization of hybrid solar PV panels and pumped hydro energy supply systems plays a pivotal role in utilizing natural resources effectively. This initiative not only benefits humanity but also fosters environmental sustainability. The study investigated the design optimization of these hybrid systems, focusing on understanding solar radiation patterns, identifying geographical influences on solar radiation, formulating a mathematical model for system optimization, and determining the optimal configuration of PV panels and pumped hydro storage. Through a comparative analysis approach and eight weeks of data collection, the study addressed key research questions related to solar radiation patterns and optimal system design. The findings highlighted regions with heightened solar radiation levels, showcasing substantial potential for power generation and emphasizing the system's efficiency. Optimizing system design significantly boosted power generation, promoted renewable energy utilization, and enhanced energy storage capacity. The study underscored the benefits of optimizing hybrid solar PV panels and pumped hydro energy supply systems for sustainable energy usage. Optimizing the design of solar PV panels and pumped hydro energy supply systems as examined across diverse climatic conditions in a developing country, not only enhances power generation but also improves the integration of renewable energy sources and boosts energy storage capacities, particularly beneficial for less economically prosperous regions. Additionally, the study provides valuable insights for advancing energy research in economically viable areas. 
Recommendations included conducting site-specific assessments, utilizing advanced modeling tools, implementing regular maintenance protocols, and enhancing communication among system components.
Courier management system project report.pdfKamal Acharya
It is now-a-days very important for the people to send or receive articles like imported furniture, electronic items, gifts, business goods and the like. People depend vastly on different transport systems which mostly use the manual way of receiving and delivering the articles. There is no way to track the articles till they are received and there is no way to let the customer know what happened in transit, once he booked some articles. In such a situation, we need a system which completely computerizes the cargo activities including time to time tracking of the articles sent. This need is fulfilled by Courier Management System software which is online software for the cargo management people that enables them to receive the goods from a source and send them to a required destination and track their status from time to time.
Final project report on grocery store management system..pdfKamal Acharya
In today’s fast-changing business environment, it’s extremely important to be able to respond to client needs in the most effective and timely manner. If your customers wish to see your business online and have instant access to your products or services.
Online Grocery Store is an e-commerce website, which retails various grocery products. This project allows viewing various products available enables registered users to purchase desired products instantly using Paytm, UPI payment processor (Instant Pay) and also can place order by using Cash on Delivery (Pay Later) option. This project provides an easy access to Administrators and Managers to view orders placed using Pay Later and Instant Pay options.
In order to develop an e-commerce website, a number of Technologies must be studied and understood. These include multi-tiered architecture, server and client-side scripting techniques, implementation technologies, programming language (such as PHP, HTML, CSS, JavaScript) and MySQL relational databases. This is a project with the objective to develop a basic website where a consumer is provided with a shopping cart website and also to know about the technologies used to develop such a website.
This document will discuss each of the underlying technologies to create and implement an e- commerce website.
ISSN: 1693-6930
TELKOMNIKA Vol. 17, No. 3, June 2019: 1425-1431
classification to help users find SOPs matching their preferences, by classifying data into a list of suggestions or predictions for query completion in search [17]. This research therefore uses domain usage log data based on SOP document searches, reflecting user preferences in the SOP system application.
Usage log data from SOP document searches performed by employees is usually interpreted as implicit feedback: the more often an employee searches for an SOP document, the closer that document is to the employee's preference. Applying the k-nearest neighbor algorithm to worker activity captured as implicit feedback can therefore identify the most relevant SOPs for users. Bayesian algorithms based on reservation logs show significant results when context information is added, as implemented in restaurant ratings [18] and timestamp-based e-commerce ranking [19]. Monitoring user click behavior likewise demonstrates the effectiveness of approaches that reflect user preferences, indicating potential for wide application such as e-commerce purchase prediction [20]. Utilizing user interest for recommendations, for example applying a user location vector to determine tourist destinations, yields 65-100% effectiveness for prediction as well as ranking [21].
K-nearest neighbor (KNN) is an efficient algorithm, used for example to identify bioluminescent proteins [22]. KNN has also been used to identify relevant news content recommendations based on each user's implicit feedback. Applying KNN to classify fish freshness from fish color, to judge whether the fish is fit for consumption, produces 91.36% classification accuracy [23]. KNN also shows high enough accuracy to predict the next day's load merely from the estimated temperature. In other work, KNN yields an accuracy of 94.95%, and the modified k-nearest neighbor produces 99.51% accuracy, in classifying data [24] and instruments [25]. Based on this previous research, the present study determines the standard operating procedure (SOP) relevant to a user from implicit feedback, utilizing user preferences so that SOP recommendations follow the consumption of SOP content in the system, based on extracted data and monitoring of user search behavior. By applying the KNN algorithm, this research identifies the most relevant SOPs for each user.
2. Methods
2.1. Implicit Feedback
Implicit feedback is feedback issued automatically by the system rather than deliberately by the user. This feedback contains only positive values from one class; for example, product likes, clicks, and bookmarks can each be a single-class positive value. Feedback such as purchases and login access is collected automatically by the system and is easier to gather than feedback the user gives to an item intentionally [26, 27]. Implicit feedback data is information that can be used without explicit user feedback, and it can capture not only interactions between users and content but also other available information [28]. A recommendation system with implicit feedback requires settings to map the feedback to the user preference level.
2.2. K-Nearest Neighbor
The KNN algorithm is one of the methods for classification analysis, although in the last few decades it has also been used for prediction [28]. KNN is a learning algorithm; a typical KNN method classifies the query sample with a majority voting strategy. For each query sample, KNN finds its nearest neighbors in the dataset and then assigns it to the class mostly owned by those neighbors. KNN classifies objects based on the similarity of their data to other data [29]. The KNN approach finds a case by calculating the proximity between the new case and old cases based on matching the weights of existing features. Figure 1 shows the steps for calculating the KNN algorithm:
a. Weighting: determine the characteristics in the dataset as the sum of the implicit feedback from the user's personalization.
b. Calculate the amount of implicit feedback within 5 weeks, multiplying by 80% if the access matches the profile and by 20% if it does not.
K-Nearest neighbor algorithm on implicit feedback... (Muhammad Yusril Helmi Setyawan)
c. Calculate the KNN algorithm by determining the parameter value k: look at the amount of data in the table, then take the average to determine the value of k.
d. Calculate the distance of each training data point to the data label, as described in (1):

dist(x_1, x_2) = \sqrt{\sum_{i=1}^{n} (x_{1i} - x_{2i})^2}   (1)

where x_1 and x_2 are two samples, and x_{1i} and x_{2i} are their variable values.
e. Determine the data label that has the minimum distance.
f. Assign the training data to the data label from step d.
g. Repeat steps d through f until the number of each class is k.
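Steps a-g above can be sketched in Python. The weekly feedback vectors, labels, and the value of k below are illustrative assumptions, not data from the paper's tables:

```python
import math
from collections import Counter

def knn_classify(query, training, k):
    """Classify `query` by majority vote among its k nearest
    labelled vectors (Euclidean distance), per steps c-g."""
    # Step d: distance from the query to every labelled vector.
    dists = [(math.dist(query, vec), label) for vec, label in training]
    # Steps e-g: keep the k smallest distances and vote on their labels.
    dists.sort(key=lambda pair: pair[0])
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 5-week implicit-feedback vectors, one per SOP document.
training = [
    ((0.8, 0.8, 0.8, 0.8, 0.8), "suitable"),
    ((0.8, 0.8, 0.0, 0.8, 0.8), "suitable"),
    ((0.2, 0.2, 0.2, 0.0, 0.0), "unsuitable"),
    ((0.0, 0.2, 0.0, 0.2, 0.0), "unsuitable"),
    ((0.8, 0.0, 0.8, 0.8, 0.0), "suitable"),
]
print(knn_classify((0.8, 0.8, 0.8, 0.0, 0.8), training, k=3))  # suitable
```

With k=3, the query's three closest vectors carry two "suitable" labels and one "unsuitable", so the majority vote returns "suitable".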
[Flowchart omitted: Start, SOP document learning, implicit feedback from usage history, one-week check, k value determination, KNN measurement, determining the suitable SOP, organizing the SOP, End.]

Figure 1. K-Nearest neighbor algorithm
Given a test document, the system finds the k nearest neighbors among the classified training documents and obtains the test document's category from the class distribution of those neighbors. Distance can be used to measure similarity between the neighbors and the test document for weighting, giving a better classification effect [30]. This research is based on implicit feedback data from the usage log, namely SOP document searches. First, the amount of implicit feedback each user has for each SOP is calculated; then the distance between the user and each SOP document is measured to determine the relevant SOPs. Figure 2 shows the general process flow of applying the KNN method to determine the relevant SOP.
Figure 2. K-Nearest neighbor process
3. Results and Analysis
The KNN calculation searches for the k objects in the learning data closest (most similar) to the object in the new or test data. Here, the similarity between documents is calculated from implicit feedback data in the form of usage logs of standard operating procedure document searches. This study uses 11 SOP documents from the Library of Politeknik Pos Indonesia; an SOP is a document containing guidelines or a set of rules made to facilitate employees' work according to their primary tasks and functions in their respective institutions. This study aims to establish the SOP document that best suits user preferences through user usage history.
3.1. Input Identification
The data used in this study comprise 11 SOP documents. For input identification, 10 existing SOP documents go through a preprocessing stage to calculate a rating based on implicit feedback: if a document is accessed by a matching profile, the access is weighted 80%; if not, it is weighted 20%. This minimizes the influence of user accesses that do not match the preferences in the profile. It does not, however, rule out the possibility of a recommended SOP being incompatible with the profile, because the method relies on the user's preference as expressed in implicit feedback. Based on the number of SOP accesses within 5 weeks, at least 5 accesses, the dataset benchmark is derived from the weighted value of those accesses, which is 3. To label the dataset, if the total implicit feedback is greater than or equal to 2.4 the label "suitable" is given; below 2.4 the label is "unsuitable". The value 2.4 is obtained by multiplying 3 by 80%, so 2.4 is used as the benchmark to label SOPs in the dataset as matching the user's profile. The existing SOP documents are represented as SOP38, SOP39, ..., SOPn, and the query document is denoted SOPq. Table 1 shows the preprocessing phase, determining the value of each SOP for one user by observing the history of SOP usage for 5 weeks.
Table 1. The Dataset Based on the Rating of the Implicit Feedback
No  SOP ID  Week 1  Week 2  Week 3  Week 4  Week 5  Explanation
1   SOP48   0.8     0.8     0       0       0       Unsuitable
2   SOP39   0       0.2     0.2     0       0.2     Suitable
3   SOP40   0.4     0.2     0       0.4     0.2     Suitable
4   SOP41   0.2     0.2     0       0.2     0       Suitable
5   SOP38   0.2     0.2     0       0       0       Unsuitable
6   SOP43   0.2     0       0       0       0       Unsuitable
7   SOP44   0.8     0       0       0       0       Unsuitable
8   SOP45   0       1.6     0.8     0.8     0.8     Suitable
9   SOP46   0.8     0       0.8     1.6     0       Suitable
10  SOP47   0       0.8     0.8     0.8     1.6     Suitable
11  SOP42   0.2     0.4     0       0.8     0.2     ?
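The weighting and labelling rule described above (0.8 per profile-matching access, 0.2 otherwise, against the 2.4 benchmark) can be sketched as follows; the access counts below are hypothetical, not taken from Table 1:

```python
MATCH_WEIGHT = 0.8      # access by a profile that matches the SOP's field
OTHER_WEIGHT = 0.2      # access by a non-matching profile
BENCHMARK = 2.4         # 3 x 80%, as derived in the text

def weekly_rating(matching, other):
    """Weighted implicit-feedback value for one week of accesses."""
    return matching * MATCH_WEIGHT + other * OTHER_WEIGHT

def label_sop(weeks):
    """Label an SOP from (matching, other) access counts over the weeks."""
    total = sum(weekly_rating(m, o) for m, o in weeks)
    return "suitable" if total >= BENCHMARK else "unsuitable"

# Hypothetical 5-week history: (matching accesses, other accesses) per week.
print(label_sop([(1, 0), (2, 0), (0, 1), (1, 0), (0, 0)]))  # suitable (3.4)
print(label_sop([(0, 1), (1, 0), (0, 0), (0, 0), (0, 1)]))  # unsuitable (1.2)
```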
3.2. Process Identification
After preprocessing, the next step is to determine the classification parameter: looking at the number of datasets and calculating the difference, the dataset above gives k=5. After determining the parameter value, the Euclidean distance of each training document to document Q is calculated, as described in (2) to (6).
dist(x_1, x_2) = \sqrt{\sum_{i=1}^{n} (x_{1i} - x_{2i})^2}   (2)

dist = \sqrt{(0.2 - 0.8)^2 + (0.4 - 0.8)^2 + (0 - 0)^2 + (0 - 0)^2 + (0.2 - 0)^2}   (3)

dist = \sqrt{0.36 + 0.16 + 0 + 0 + 0.04}   (4)

dist = \sqrt{0.56}   (5)

dist = 0.75   (6)
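The sample calculation in (3)-(6) can be checked numerically; the two vectors below are copied term by term from (3):

```python
import math

x1 = [0.2, 0.4, 0.0, 0.0, 0.2]  # query-document terms as written in (3)
x2 = [0.8, 0.8, 0.0, 0.0, 0.0]  # training-document terms as written in (3)

# Equation (2): Euclidean distance between the two samples.
dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(x1, x2)))
print(round(dist, 2))  # 0.75, i.e. sqrt(0.56) as in (5)-(6)
```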
Furthermore, after calculating the Euclidean distance between each object and document Q, the values are ranked for each document, yielding the Euclidean distances and the overall ranking of documents in Table 2.
3.3. Identification of Output
The output identification classifies the document based on ranking with a parameter of 5; this value is taken because 10 datasets are used, so the parameter is half the number of datasets. In the majority category results, suitable values are shown by the documents with IDs D2, D4, and D6, while unsuitable values are shown by the documents with IDs D5 and D3, giving a ratio of 3:2 in favor of the suitable label. Using the majority category, the KNN calculation thus yields the classification result from the 5 nearest data points: document Q is a document corresponding to the user in the field of service, as the majority of the 5 parameters with the best distance values indicate the suitable label.
Figure 3 shows the process to be built in the system to determine the applicable standard operating procedure document. The administrator inputs the SOP document data; the SOP documents are then accessed by users, whose search behavior is monitored as implicit feedback stored in the search log. Next, the KNN algorithm is calculated to determine the suitability of SOP documents to user preferences. The user then receives recommendations matching those preferences.
Table 2. Euclidean Distance and Ranking
No  Name      Distance  Ranking
1   D11, D1   0.75      4
2   D11, D2   0.35      8
3   D11, D3   0.49      6
4   D11, D4   0.35      8
5   D11, D5   0.28      10
6   D11, D6   0.45      7
7   D11, D7   0.75      4
8   D11, D8   1.77      3
9   D11, D9   1.94      1
10  D11, D10  1.85      2
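Combining the five smallest distances from Table 2 with the labels the text assigns to them in section 3.3 (D2, D4, D6 suitable; D3, D5 unsuitable) reproduces the majority vote; a small sketch:

```python
from collections import Counter

# The five nearest neighbours of document Q (D11) from Table 2,
# with the suitability labels stated in section 3.3.
nearest = [
    ("D5", 0.28, "unsuitable"),
    ("D2", 0.35, "suitable"),
    ("D4", 0.35, "suitable"),
    ("D6", 0.45, "suitable"),
    ("D3", 0.49, "unsuitable"),
]

votes = Counter(label for _, _, label in nearest)
majority = votes.most_common(1)[0][0]
print(votes["suitable"], votes["unsuitable"], majority)  # 3 2 suitable
```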
Figure 3. System illustration
3.4. Result
The experiment uses the KNN method with k=5 to compute the similarity between the test data and each labeled data point. This test determines the classification of the standard operating procedure document using implicit feedback from the user's usage log, with greater weight if the selected document corresponds to the field in the user's profile. Based on the test results, 5 SOP documents were obtained with a 3:2 ratio of suitable to unsuitable: ranks 1, 4, and 5 carry the suitable label, while ranks 2 and 3 are unsuitable.
Document Q is therefore a document that matches the user, judged by comparing the parameters with the relevant values obtained by the KNN method from the nearest neighbors. Figure 4 shows the percentage ranking of the test SOP document across the 5 classification parameters; for ranks 1 through 5, the graph shows that the majority of labels for the test data are suitable for the service field.
Figure 4. Percentage ranking of document
4. Conclusion
The present study presents an approach to determining SOP documents using the KNN algorithm for document classification, utilizing implicit feedback from search log data, with accesses matching the profile weighted 80% and non-matching accesses 20%. KNN is one of the most popular classifiers, easy to use and efficient enough, finding the value of k that represents the number of neighbors used as the class for classification. Previous research used the KNN algorithm to perform web text analysis and automatically improve accuracy, while this study covers not only document classification but also prediction of user rates.
5. Discussion
This research established SOP documents only for the case study of library SOPs, with SOP determination within the scope of one division, and utilized implicit feedback only in the form of search logs. To make SOP document determination show more accurate results, it should be applied in a broad SOP management system and should utilize implicit feedback with parameters beyond the search log, such as user behavior and other usage logs, together with a performance evaluation. It is then expected that SOP document retrieval can show more accurate results.
References
[1] HK Kourdali, L Sherry. A comparison of two takeoffs and climb out flap retraction standard operating procedures. Integrated Communications Navigation and Surveillance (ICNS). 2016: 4A2-1.
[2] Simulation of time-on-procedure (ToP) for evaluating airline procedures. 2017 Integrated Communications, Navigation and Surveillance Conference (ICNS). 2017. [Online]. Available: https://doi.org/10.1109/icnsurv.2017.8011892.
[3] N Wangskarn, J Siritantitam, N Meesri, P Chiravirakul. Flowty-flow: A web application for preparation and distribution of standard operating procedures. Student Project Conference (ICT-ISPC), 2016 Fifth ICT International. 2016: 21-24.
[4] Z Jing, L Boang, Y Hao. Fault diagnosis strategy for startup process based on standard operating procedures. 2013 25th Chinese Control and Decision Conference (CCDC), Guiyang. 2013: 4221-4226.
[5] Simulation of time-on-procedure (ToP) for evaluating airline procedures. 2017 Integrated Communications, Navigation and Surveillance Conference (ICNS). IEEE. 2017. [Online]. Available: https://doi.org/10.1109/icnsurv.2017.8011892.
[6] S Hasegawa. How ICT changes quality assurance in graduate education and research? From knowledge transfe]ll improvement. 2015 International Conference on Information & Communication Technology and Systems (ICTS), Surabaya. 2015: 1-2.
[7] The Republic of Indonesia Government, Minister of Administrative and Bureaucratic Reform. Guidelines for the preparation of operational standards for government administrative procedures. 2012: 63.
[8] The Republic of Indonesia Government. Regulation of Minister of Administrative and Bureaucratic Reform Number 15 concerning Guidelines for Service Standards. 2014: 14.
[9] MY Helmi Setyawan, RM Awangga, SR Efendi. Comparison of Multinomial Naive Bayes Algorithm
And Logistic Regression For Intent Classification In Chatbot. 2018 International Conference on
Applied Engineering (ICAE). Batam, 2018: 1-5.
[10] NH Harani, AA Arman, RM Awangga. Improving togaf adm 9.1 migration planning phase by ITIL v3
service transition. Journal of Physics: Conference Series. 2018; 1007(1): 012036.
[11] SW Byun, SM Lee, SP Lee, KYKim, C Kee-Seong. A recommendation system based on the object
of the interest. Advanced Communication Technology (ICACT), 2016 18th International Conference.
2016: 689–691.
[12] P Patil, S Gore. Study of recommendation system for yoga and raga for personalized health based
on the constitution (Prakriti). International Journal of Computer Applications; 2016; 136(4): 25–27.
[13] Y Zhang, C Yang, Z Niu. A research of job recommendation system based on collaborative filtering.
Computational Intelligence and Design (ISCID), 2014 Seventh International Symposium. 2014:
533–538.
[14] K Lin, Y Chen, X Li, Q Wu, Z Xu. Friend recommendation algorithm based on location-based social
networks. Software Engineering and Service Science (ICSESS), 2016 7th
IEEE International
Conference. 2016: 233–236.
[15] J Wang, L Yu, H Zhang. Brand recommendation leveraging heterogeneous implicit feedbacks.
Computer Science and Engineering (APWC on CSE), 2015 2nd Asia-Pacific World Congress. 2015:
1–6.
[16] IN Yulita, S Purwani, R Rosadi, RM Awangga. A quantization of deep belief networks for long short-term memory in sleep stage detection. Advanced Informatics, Concepts, Theory, and Applications (ICAICTA), 2017 International Conference. IEEE, 2017: 1–5.
[17] D Ajantha, J Vijay, R Sridhar. A user-location vector based approach for personalized tourism and
travel recommendation. Big Data Analytics and Computational Intelligence (ICBDAC), 2017
International Conference. 2017: 440–446.
[18] RM Awangga, M Yusril, H Setyawan. Ontology design of influential people identification using
centrality. Journal of Physics: Conference Series. 2018; 1007(1): 012012.
[19] S Gupta, U Ojha, VS Dixit. Personalized web recommendations analyzing sequential behavior using implicit data streams: A survey. 2017 8th International Conference on Computing, Communication, and Networking Technologies (ICCCNT). 2017: 1–7.
[20] P Mathew, B Kuriakose, V Hegde. Book recommendation system through content-based and collaborative filtering method. Data Mining and Advanced Computing (SAPIENCE), International Conference. 2016: 47–52.
[21] G Wu, V Swaminathan, S Mitra, R Kumar. Digital content recommendation system using implicit
feedback data. 2017 IEEE International Conference on Big Data (Big Data). 2017: 2766–2771.
[22] R Jia, R Li, M Yu, S Wang. E-commerce purchase prediction approach by user behavior data.
Computer, Information and Telecommunication Systems (CITS), 2017 International Conference.
2017: 1–5.
[23] W-T Kuo, Y-C Wang, RT-H Tsai, JY-J Hsu. Contextual restaurant recommendation utilizing implicit feedback. Wireless and Optical Communication Conference (WOCC), 2015 24th. 2015: 170–174.
[24] C Chen, D Wang, Y Ding. User actions and timestamp based personalized recommendation for
e-commerce system. Computer and Information Technology (CIT), 2016 IEEE International
Conference. 2016: 183–189.
[25] J Hu. BLKnn: A k-nearest neighbors method for predicting bioluminescent proteins. Computational Intelligence in Bioinformatics and Computational Biology, 2014 IEEE Conference. 2014: 1–6.
[26] H Al-Shehri, A Al-Qarni, L Al-Saati, A Batoaq, H Badukhen, S Alrashed, J Alhiyafi, SO Olatunji. Student performance prediction using support vector machine and k-nearest neighbor. Electrical and Computer Engineering (CCECE), 2017 IEEE 30th Canadian Conference. 2017: 1–4.
[27] NMS Iswari, Wella, Ranny. Fish freshness classification method based on fish image using
k-nearest neighbor. 2017 4th International Conference on New Media Studies (CONMEDIA). 2017:
87–91.
[28] R Zhang, Y Xu, Z Y Dong, W Kong, KP Wong. A composite k-nearest neighbor model for
day-ahead load forecasting with limited temperature forecasts. Power and Energy Society General
Meeting (PESGM). 2016: 1–5.
[29] I Gazalba, Mustakim, NGI Reza. Comparative analysis of k-nearest neighbor and modified
k-nearest neighbor algorithm for data classification. 2017 2nd International conferences on
Information Technology, Information Systems and Electrical Engineering (ICITISEE). 2017:
294–298.
[30] C-Y Lin, L-C Wang, K-H Tsai. Hybrid real-time matrix factorization for implicit feedback recommendation systems. IEEE Access. 2018; 6: 21369–21380.