The often problematic faced by Muslims arelack of understanding in calculating Zakahand determining the feasibility of compliant recipients based on Islamic Shari'a. This study aimed to establish Zakah management system to support calculation process based on Al Qaradhawi method, helping Board of Zakahin distributing Zakah funds to mustahik. The algorithm used for the classification of Zakahrecipients is Naive Bayes. The classification was combination of discreteand continuous data which is conducted by experiments using feasible and unfeasible data as a novelty approach.The results have shown thatthe Naive Bayes method could solve the problem with 85% of average.
Medical informatics growth can be observed now days. Advancement in different medical fields
discovers the various critical diseases and provides the guidelines for their cure. This has been possible
only because of well heeled medical databases as well as automation of data analysis process. Towards
this analysis process lots of learning and intelligence is required, the data mining techniques provides the
basis for that and various data mining techniques are available like Decision tree Induction, Rule Based
Classification or mining, Support vector machine, Stochastic classification, Logistic regression, Naïve
bayes, Artificial Neural Network & Fuzzy Logic, Genetic Algorithms. This paper provides the basic of
data mining with their effective techniques availability in medical sciences & reveals the efforts done on
medical databases using data mining techniques for human disease diagnosis.
A Secure Decision Making Process in Health Care System Using Naive Bayes Clas...IJTET Journal
Abstract— A Secure decision support estimation in health care system preserves the seclusion of the patient data, the decision estimation and the server side clinical support system. In order to preserve privacy paillier homomorphic encryption technique is used with naive bayes classification method. Hence the server involved in the diagnosis process is not able to know the diagnosis results of the patients because the patient’s data always continue to exist in an encrypted form even during the recognition process in health care system. To validate the performance, evaluate the method of classification using naive bayes classifier on medical datasets and the accuracy of the results is much better. Secure decision support estimation is a computerized medical diagnosis process for enhancing the health related decisions to improve patient’s health and also provide the more accurate decision in the diagnosis process.
AN ALGORITHM FOR PREDICTIVE DATA MINING APPROACH IN MEDICAL DIAGNOSISijcsit
The Healthcare industry contains big and complex data that may be required in order to discover fascinating pattern of diseases & makes effective decisions with the help of different machine learning techniques. Advanced data mining techniques are used to discover knowledge in database and for medical
research. This paper has analyzed prediction systems for Diabetes, Kidney and Liver disease using more
number of input attributes. The data mining classification techniques, namely Support Vector Machine(SVM) and Random Forest (RF) are analyzed on Diabetes, Kidney and Liver disease database. The performance of these techniques is compared, based on precision, recall, accuracy, f_measure as well
as time. As a result of study the proposed algorithm is designed using SVM and RF algorithm and the experimental result shows the accuracy of 99.35%, 99.37 and 99.14 on diabetes, kidney and liver disease respectively.
Improving the performance of Intrusion detection systemsyasmen essam
Intrusion detection systems (IDS) are widely studied by
researchers nowadays due to the dramatic growth in
network-based technologies. Policy violations and
unauthorized access is in turn increasing which makes
intrusion detection systems of great importance. Existing
approaches to improve intrusion detection systems focus on feature selection or reduction since some features are
irrelevant or redundant which when removed improve the
accuracy as well as the learning time.
Agent Oriented Patient Scheduling System: A Concurrent Metatem Based ApproachIJERA Editor
The Problem of Patient Scheduling[6] is a major issue in a Medical Healthcare System[4]. In India, Healthcare
is an 80 billion dollar Industry and is growing at an average rate of 17% annually. However, quality healthcare
is still out of reach for many. Each year thousands of fatalities arise simply due to the fact that patient could not
be provided with proper medical facilities at the right time. A software agent may be a member of a Multi-Agent
System[2][5] (MAS) which is collectively performing a range of complex and intelligent tasks. Using
Concurrent Metatem[3], a Multi-Agent Language, we have attempted to model a patient scheduling
system[4][6] that can help hospitals collaborate among them through a Liaison-Agent, in order to provide
patients with the best care possible. Patients should no longer have to be turned down when hospitals are packed
to capacity; instead, they could simply be shifted to another hospital. Hospitals and even doctors are assigned to
patients after an automated process of matching patient needs with doctor expertise and hospital infrastructureleading
to reduced waiting-time while maximizing efficiency and potentially saving lives.
Medical informatics growth can be observed now days. Advancement in different medical fields
discovers the various critical diseases and provides the guidelines for their cure. This has been possible
only because of well heeled medical databases as well as automation of data analysis process. Towards
this analysis process lots of learning and intelligence is required, the data mining techniques provides the
basis for that and various data mining techniques are available like Decision tree Induction, Rule Based
Classification or mining, Support vector machine, Stochastic classification, Logistic regression, Naïve
bayes, Artificial Neural Network & Fuzzy Logic, Genetic Algorithms. This paper provides the basic of
data mining with their effective techniques availability in medical sciences & reveals the efforts done on
medical databases using data mining techniques for human disease diagnosis.
A Secure Decision Making Process in Health Care System Using Naive Bayes Clas...IJTET Journal
Abstract— A Secure decision support estimation in health care system preserves the seclusion of the patient data, the decision estimation and the server side clinical support system. In order to preserve privacy paillier homomorphic encryption technique is used with naive bayes classification method. Hence the server involved in the diagnosis process is not able to know the diagnosis results of the patients because the patient’s data always continue to exist in an encrypted form even during the recognition process in health care system. To validate the performance, evaluate the method of classification using naive bayes classifier on medical datasets and the accuracy of the results is much better. Secure decision support estimation is a computerized medical diagnosis process for enhancing the health related decisions to improve patient’s health and also provide the more accurate decision in the diagnosis process.
AN ALGORITHM FOR PREDICTIVE DATA MINING APPROACH IN MEDICAL DIAGNOSISijcsit
The Healthcare industry contains big and complex data that may be required in order to discover fascinating pattern of diseases & makes effective decisions with the help of different machine learning techniques. Advanced data mining techniques are used to discover knowledge in database and for medical
research. This paper has analyzed prediction systems for Diabetes, Kidney and Liver disease using more
number of input attributes. The data mining classification techniques, namely Support Vector Machine(SVM) and Random Forest (RF) are analyzed on Diabetes, Kidney and Liver disease database. The performance of these techniques is compared, based on precision, recall, accuracy, f_measure as well
as time. As a result of study the proposed algorithm is designed using SVM and RF algorithm and the experimental result shows the accuracy of 99.35%, 99.37 and 99.14 on diabetes, kidney and liver disease respectively.
Improving the performance of Intrusion detection systemsyasmen essam
Intrusion detection systems (IDS) are widely studied by
researchers nowadays due to the dramatic growth in
network-based technologies. Policy violations and
unauthorized access is in turn increasing which makes
intrusion detection systems of great importance. Existing
approaches to improve intrusion detection systems focus on feature selection or reduction since some features are
irrelevant or redundant which when removed improve the
accuracy as well as the learning time.
Agent Oriented Patient Scheduling System: A Concurrent Metatem Based ApproachIJERA Editor
The Problem of Patient Scheduling[6] is a major issue in a Medical Healthcare System[4]. In India, Healthcare
is an 80 billion dollar Industry and is growing at an average rate of 17% annually. However, quality healthcare
is still out of reach for many. Each year thousands of fatalities arise simply due to the fact that patient could not
be provided with proper medical facilities at the right time. A software agent may be a member of a Multi-Agent
System[2][5] (MAS) which is collectively performing a range of complex and intelligent tasks. Using
Concurrent Metatem[3], a Multi-Agent Language, we have attempted to model a patient scheduling
system[4][6] that can help hospitals collaborate among them through a Liaison-Agent, in order to provide
patients with the best care possible. Patients should no longer have to be turned down when hospitals are packed
to capacity; instead, they could simply be shifted to another hospital. Hospitals and even doctors are assigned to
patients after an automated process of matching patient needs with doctor expertise and hospital infrastructureleading
to reduced waiting-time while maximizing efficiency and potentially saving lives.
Trust Enhanced Role Based Access Control Using Genetic Algorithm IJECEIAES
Improvements in technological innovations have become a boon for business organizations, firms, institutions, etc. System applications are being developed for organizations whether small-scale or large-scale. Taking into consideration the hierarchical nature of large organizations, security is an important factor which needs to be taken into account. For any healthcare organization, maintaining the confidentiality and integrity of the patients’ records is of utmost importance while ensuring that they are only available to the authorized personnel. The paper discusses the technique of Role-Based Access Control (RBAC) and its different aspects. The paper also suggests a trust enhanced model of RBAC implemented with selection and mutation only ‘Genetic Algorithm’. A practical scenario involving healthcare organization has also been considered. A model has been developed to consider the policies of different health departments and how it affects the permissions of a particular role. The purpose of the algorithm is to allocate tasks for every employee in an automated manner and ensures that they are not over-burdened with the work assigned. In addition, the trust records of the employees ensure that malicious users do not gain access to confidential patient data.
Optimization of network traffic anomaly detection using machine learning IJECEIAES
In this paper, to optimize the process of detecting cyber-attacks, we choose to propose 2 main optimization solutions: Optimizing the detection method and optimizing features. Both of these two optimization solutions are to ensure the aim is to increase accuracy and reduce the time for analysis and detection. Accordingly, for the detection method, we recommend using the Random Forest supervised classification algorithm. The experimental results in section 4.1 have proven that our proposal that use the Random Forest algorithm for abnormal behavior detection is completely correct because the results of this algorithm are much better than some other detection algorithms on all measures. For the feature optimization solution, we propose to use some data dimensional reduction techniques such as information gain, principal component analysis, and correlation coefficient method. The results of the research proposed in our paper have proven that to optimize the cyberattack detection process, it is not necessary to use advanced algorithms with complex and cumbersome computational requirements, it must depend on the monitoring data for selecting the reasonable feature extraction and optimization algorithm as well as the appropriate attack classification and detection algorithms.
DEVELOPING A CASE-BASED RETRIEVAL SYSTEM FOR SUPPORTING IT CUSTOMERSIJCSEA Journal
The case-based reasoning (CBR) approach is a modern approach. It is adopted for designing knowledgebased expertise systems. The aforementioned approach depends much on the stored experiences. These experiences serve as cases that can be employed for solving new problems. That is done through retrieving similar cases from the system and utilizing their solutions. The latter approach aims at solving problems through reviewing, processing and applying their experiences. The present study aimed to shed a light on a CBR application. That is done to develop a new system for assisting Internet users and solving the problems faced by those users. In addition, the present study focuses on the cases retrieval stage. It aimed at designing and building an experienced inquiry system for solving any problem that internet users might face when using a case-based reasoning (CBR) dialogue. That shall enable internet users to solve the problems faced when using the Internet. The system that was developed by the researcher operates through displaying the similar cases through a dialogue. That is done through using a well-developed algorithm and reviewing the relevant previous studies. It was found that the success rate of the proposed system is high.
DEVELOPING A CASE-BASED RETRIEVAL SYSTEM FOR SUPPORTING IT CUSTOMERSIJCSEA Journal
The case-based reasoning (CBR) approach is a modern approach. It is adopted for designing knowledgebased expertise systems. The aforementioned approach depends much on the stored experiences. These experiences serve as cases that can be employed for solving new problems. That is done through retrieving similar cases from the system and utilizing their solutions. The latter approach aims at solving problems through reviewing, processing and applying their experiences. The present study aimed to shed a light on a CBR application. That is done to develop a new system for assisting Internet users and solving the problems faced by those users. In addition, the present study focuses on the cases retrieval stage. It aimed at designing and building an experienced inquiry system for solving any problem that internet users might face when using a case-based reasoning (CBR) dialogue. That shall enable internet users to solve the problems faced when using the Internet. The system that was developed by the researcher operates through displaying the similar cases through a dialogue. That is done through using a well-developed algorithm and reviewing the relevant previous studies. It was found that the success rate of the proposed system is high.
Expert System to Determine the Priority Scale of Case in Laboratory of Forens...AM Publications
To know finished priority scale of case in Laboratory of Forensic, in this thesis we developed an expert system with forward and backward chaining methods rule based, used some criteria of case. Priority scale of case is done to sort data and determine the stage of case, which will be used, for examination. This process needs an evaluation in a number on aspect in accordance with the criteria and characteristics of the case. The capabilities of this system accommodates input data cases which user needs, priority scale and stages according to the standard operating procedure in Indonesian National Police especially in Laboratory of Forensic branch in Semarang and gives the visual picture like groove determining of the priority scale also the best solution from the several alternatives used forward and backward chaining methods rule based. The result showed a good and ideal decision for solved case by quickly recommendation and main priority of case with the result of compatibility level was 100%.
Improving collaborative filtering using lexicon-based sentiment analysisIJECEIAES
Since data is available increasingly on the Internet, efforts are needed to develop and improve recommender systems to produce a list of possible favorite items. In this paper, we expand our work to enhance the accuracy of Arabic collaborative filtering by applying sentiment analysis to user reviews, we also addressed major problems of the current work by applying effective techniques to handle the scalability and sparsity problems. The proposed approach consists of two phases: the sentiment analysis and the recommendation phase. The sentiment analysis phase estimates sentiment scores using a special lexicon for the Arabic dataset. The item-based and singular value decomposition-based collaborative filtering are used in the second phase. Overall, our proposed approach improves the experiments’ results by reducing average of mean absolute and root mean squared errors using a large Arabic dataset consisting of 63,000 book reviews.
Empirical evaluation of continuous auditing system use: a systematic reviewIJECEIAES
For more than two decades, the concept of continuous auditing (CA) has been introduced and many large firms had taken the initiatives to apply the CA system in supporting their audit functions. Despite the benefits that the CA system is offers, it is still not widely use and the number of people using the CA system is still considered low. This research focuses on the published papers on the use of CA system within the context of auditing system addressing the quality of system implementation. This paper analyzed primary studies collected using the pre-determined search strings on nine online databases. As a result, a total of 60 articles were carefully selected to undergo further analysis based on empirical evidence of CA system use. The articles were analyzed qualitatively using ATLAS.ti 7 and the elements for the CA system use are extracted from the selected papers. A total of four elements were identified contributing to the use of CA in practice. Those elements are the participant quality, system quality, information quality and products and services quality. This study answers five research objectives to understand the current studies on CA and to determine future research works on CA.
Subscription fraud analytics using classificationSomdeep Sen
A fictitious telecom company called Bad Idea came up with a strange rate plan called Praxis Plan where the callers are allowed to make only one call in the Morning (9AM-Noon), Afternoon (Noon-4PM), Evening(4PM-9PM) and Night (9PM-Midnight); i.e. four calls per day. Despite the popularity of the plan, Bad Idea was a target of Subscription Fraud by a gang of fraudsters consisting of three people: Sally, Virginia and Vince. They finally terminated their services. Bad Idea has their call logs spanning over one and half months.
The analytics team of the company has been provided two data sets: Black-List Subscriber Call-Logs & Audit Log. The Black-List Subscriber Call-Logs data set includes the calling patterns of the three fraudsters i.e. Sally, Virginia and Vince. After every 5 days the company undertakes an audit to see whether these Fraudsters have joined their network. The company reviews the list of subscribers who have made calls to the same people as these three fraudsters and in the same time frame. This has been provided in the Audit Log.
Test Data: http://bit.ly/1du9cRs
Training Data: http://bit.ly/1du9AQ1
A novel population-based local search for nurse rostering problem IJECEIAES
Population-based approaches regularly are better than single based (local search) approaches in exploring the search space. However, the drawback of population-based approaches is in exploiting the search space. Several hybrid approaches have proven their efficiency through different domains of optimization problems by incorporating and integrating the strength of population and local search approaches. Meanwhile, hybrid methods have a drawback of increasing the parameter tuning. Recently, population-based local search was proposed for a university course-timetabling problem with fewer parameters than existing approaches, the proposed approach proves its effectiveness. The proposed approach employs two operators to intensify and diversify the search space. The first operator is applied to a single solution, while the second is applied for all solutions. This paper aims to investigate the performance of population-based local search for the nurse rostering problem. The INRC2010 database with a dataset composed of 69 instances is used to test the performance of PB-LS. A comparison was made between the performance of PB-LS and other existing approaches in the literature. Results show good performances of proposed approach compared to other approaches, where population-based local search provided best results in 55 cases over 69 instances used in experiments.
Enhancing feature selection with a novel hybrid approach incorporating geneti...IJECEIAES
Computing advances in data storage are leading to rapid growth in large-scale datasets. Using all features increases temporal/spatial complexity and negatively influences performance. Feature selection is a fundamental stage in data preprocessing, removing redundant and irrelevant features to minimize the number of features and enhance the performance of classification accuracy. Numerous optimization algorithms were employed to handle feature selection (FS) problems, and they outperform conventional FS techniques. However, there is no metaheuristic FS method that outperforms other optimization algorithms in many datasets. This motivated our study to incorporate the advantages of various optimization techniques to obtain a powerful technique that outperforms other methods in many datasets from different domains. In this article, a novel combined method GASI is developed using swarm intelligence (SI) based feature selection techniques and genetic algorithms (GA) that uses a multi-objective fitness function to seek the optimal subset of features. To assess the performance of the proposed approach, seven datasets have been collected from the UCI repository and exploited to test the newly established feature selection technique. The experimental results demonstrate that the suggested method GASI outperforms many powerful SI-based feature selection techniques studied. GASI obtains a better average fitness value and improves classification performance.
Implementation of Banker’s Algorithm Using Dynamic Modified Approachrahulmonikasharma
Banker’s algorithm referred to as resource allocation and deadlock avoidance algorithm that checks for the safety by simulating the allocation of predetermined maximum possible of resources and makes the system into s-state by checking the possible deadlock conditions for all other pending processes. It needs to know how much of each resource a process could possibly request. Number of processes are static in algorithm, but in most of system processes varies dynamically and no additional process will be started while it is in execution. The number of resources are not allow to go down while it is in execution. In this research an approach for Dynamic Banker's algorithm is proposed which allows the number of resources to be changed at runtime that prevents the system to fall in unsafe state. It also give details about all the resources and processes that which one require resources and in what quantity. This also allocates the resource automatically to the stopped process for the execution and will always give the appropriate safe sequence for the given processes.
Trust Enhanced Role Based Access Control Using Genetic Algorithm IJECEIAES
Improvements in technological innovations have become a boon for business organizations, firms, institutions, etc. System applications are being developed for organizations whether small-scale or large-scale. Taking into consideration the hierarchical nature of large organizations, security is an important factor which needs to be taken into account. For any healthcare organization, maintaining the confidentiality and integrity of the patients’ records is of utmost importance while ensuring that they are only available to the authorized personnel. The paper discusses the technique of Role-Based Access Control (RBAC) and its different aspects. The paper also suggests a trust enhanced model of RBAC implemented with selection and mutation only ‘Genetic Algorithm’. A practical scenario involving healthcare organization has also been considered. A model has been developed to consider the policies of different health departments and how it affects the permissions of a particular role. The purpose of the algorithm is to allocate tasks for every employee in an automated manner and ensures that they are not over-burdened with the work assigned. In addition, the trust records of the employees ensure that malicious users do not gain access to confidential patient data.
Optimization of network traffic anomaly detection using machine learning IJECEIAES
In this paper, to optimize the process of detecting cyber-attacks, we choose to propose 2 main optimization solutions: Optimizing the detection method and optimizing features. Both of these two optimization solutions are to ensure the aim is to increase accuracy and reduce the time for analysis and detection. Accordingly, for the detection method, we recommend using the Random Forest supervised classification algorithm. The experimental results in section 4.1 have proven that our proposal that use the Random Forest algorithm for abnormal behavior detection is completely correct because the results of this algorithm are much better than some other detection algorithms on all measures. For the feature optimization solution, we propose to use some data dimensional reduction techniques such as information gain, principal component analysis, and correlation coefficient method. The results of the research proposed in our paper have proven that to optimize the cyberattack detection process, it is not necessary to use advanced algorithms with complex and cumbersome computational requirements, it must depend on the monitoring data for selecting the reasonable feature extraction and optimization algorithm as well as the appropriate attack classification and detection algorithms.
DEVELOPING A CASE-BASED RETRIEVAL SYSTEM FOR SUPPORTING IT CUSTOMERSIJCSEA Journal
The case-based reasoning (CBR) approach is a modern approach. It is adopted for designing knowledgebased expertise systems. The aforementioned approach depends much on the stored experiences. These experiences serve as cases that can be employed for solving new problems. That is done through retrieving similar cases from the system and utilizing their solutions. The latter approach aims at solving problems through reviewing, processing and applying their experiences. The present study aimed to shed a light on a CBR application. That is done to develop a new system for assisting Internet users and solving the problems faced by those users. In addition, the present study focuses on the cases retrieval stage. It aimed at designing and building an experienced inquiry system for solving any problem that internet users might face when using a case-based reasoning (CBR) dialogue. That shall enable internet users to solve the problems faced when using the Internet. The system that was developed by the researcher operates through displaying the similar cases through a dialogue. That is done through using a well-developed algorithm and reviewing the relevant previous studies. It was found that the success rate of the proposed system is high.
DEVELOPING A CASE-BASED RETRIEVAL SYSTEM FOR SUPPORTING IT CUSTOMERSIJCSEA Journal
The case-based reasoning (CBR) approach is a modern approach. It is adopted for designing knowledgebased expertise systems. The aforementioned approach depends much on the stored experiences. These experiences serve as cases that can be employed for solving new problems. That is done through retrieving similar cases from the system and utilizing their solutions. The latter approach aims at solving problems through reviewing, processing and applying their experiences. The present study aimed to shed a light on a CBR application. That is done to develop a new system for assisting Internet users and solving the problems faced by those users. In addition, the present study focuses on the cases retrieval stage. It aimed at designing and building an experienced inquiry system for solving any problem that internet users might face when using a case-based reasoning (CBR) dialogue. That shall enable internet users to solve the problems faced when using the Internet. The system that was developed by the researcher operates through displaying the similar cases through a dialogue. That is done through using a well-developed algorithm and reviewing the relevant previous studies. It was found that the success rate of the proposed system is high.
Expert System to Determine the Priority Scale of Case in Laboratory of Forens...AM Publications
To know finished priority scale of case in Laboratory of Forensic, in this thesis we developed an expert system with forward and backward chaining methods rule based, used some criteria of case. Priority scale of case is done to sort data and determine the stage of case, which will be used, for examination. This process needs an evaluation in a number on aspect in accordance with the criteria and characteristics of the case. The capabilities of this system accommodates input data cases which user needs, priority scale and stages according to the standard operating procedure in Indonesian National Police especially in Laboratory of Forensic branch in Semarang and gives the visual picture like groove determining of the priority scale also the best solution from the several alternatives used forward and backward chaining methods rule based. The result showed a good and ideal decision for solved case by quickly recommendation and main priority of case with the result of compatibility level was 100%.
Improving collaborative filtering using lexicon-based sentiment analysisIJECEIAES
Since data is available increasingly on the Internet, efforts are needed to develop and improve recommender systems to produce a list of possible favorite items. In this paper, we expand our work to enhance the accuracy of Arabic collaborative filtering by applying sentiment analysis to user reviews, we also addressed major problems of the current work by applying effective techniques to handle the scalability and sparsity problems. The proposed approach consists of two phases: the sentiment analysis and the recommendation phase. The sentiment analysis phase estimates sentiment scores using a special lexicon for the Arabic dataset. The item-based and singular value decomposition-based collaborative filtering are used in the second phase. Overall, our proposed approach improves the experiments’ results by reducing average of mean absolute and root mean squared errors using a large Arabic dataset consisting of 63,000 book reviews.
Empirical evaluation of continuous auditing system use: a systematic reviewIJECEIAES
For more than two decades, the concept of continuous auditing (CA) has been introduced and many large firms had taken the initiatives to apply the CA system in supporting their audit functions. Despite the benefits that the CA system is offers, it is still not widely use and the number of people using the CA system is still considered low. This research focuses on the published papers on the use of CA system within the context of auditing system addressing the quality of system implementation. This paper analyzed primary studies collected using the pre-determined search strings on nine online databases. As a result, a total of 60 articles were carefully selected to undergo further analysis based on empirical evidence of CA system use. The articles were analyzed qualitatively using ATLAS.ti 7 and the elements for the CA system use are extracted from the selected papers. A total of four elements were identified contributing to the use of CA in practice. Those elements are the participant quality, system quality, information quality and products and services quality. This study answers five research objectives to understand the current studies on CA and to determine future research works on CA.
Subscription fraud analytics using classificationSomdeep Sen
A fictitious telecom company called Bad Idea came up with a strange rate plan called Praxis Plan where the callers are allowed to make only one call in the Morning (9AM-Noon), Afternoon (Noon-4PM), Evening(4PM-9PM) and Night (9PM-Midnight); i.e. four calls per day. Despite the popularity of the plan, Bad Idea was a target of Subscription Fraud by a gang of fraudsters consisting of three people: Sally, Virginia and Vince. They finally terminated their services. Bad Idea has their call logs spanning over one and half months.
The analytics team of the company has been provided two data sets: Black-List Subscriber Call-Logs & Audit Log. The Black-List Subscriber Call-Logs data set includes the calling patterns of the three fraudsters i.e. Sally, Virginia and Vince. After every 5 days the company undertakes an audit to see whether these Fraudsters have joined their network. The company reviews the list of subscribers who have made calls to the same people as these three fraudsters and in the same time frame. This has been provided in the Audit Log.
Test Data: http://bit.ly/1du9cRs
Training Data: http://bit.ly/1du9AQ1
A novel population-based local search for nurse rostering problem IJECEIAES
Population-based approaches regularly are better than single based (local search) approaches in exploring the search space. However, the drawback of population-based approaches is in exploiting the search space. Several hybrid approaches have proven their efficiency through different domains of optimization problems by incorporating and integrating the strength of population and local search approaches. Meanwhile, hybrid methods have a drawback of increasing the parameter tuning. Recently, population-based local search was proposed for a university course-timetabling problem with fewer parameters than existing approaches, the proposed approach proves its effectiveness. The proposed approach employs two operators to intensify and diversify the search space. The first operator is applied to a single solution, while the second is applied for all solutions. This paper aims to investigate the performance of population-based local search for the nurse rostering problem. The INRC2010 database with a dataset composed of 69 instances is used to test the performance of PB-LS. A comparison was made between the performance of PB-LS and other existing approaches in the literature. Results show good performances of proposed approach compared to other approaches, where population-based local search provided best results in 55 cases over 69 instances used in experiments.
Enhancing feature selection with a novel hybrid approach incorporating geneti...IJECEIAES
Computing advances in data storage are leading to rapid growth in large-scale datasets. Using all features increases temporal/spatial complexity and negatively influences performance. Feature selection is a fundamental stage in data preprocessing, removing redundant and irrelevant features to minimize the number of features and enhance the performance of classification accuracy. Numerous optimization algorithms were employed to handle feature selection (FS) problems, and they outperform conventional FS techniques. However, there is no metaheuristic FS method that outperforms other optimization algorithms in many datasets. This motivated our study to incorporate the advantages of various optimization techniques to obtain a powerful technique that outperforms other methods in many datasets from different domains. In this article, a novel combined method GASI is developed using swarm intelligence (SI) based feature selection techniques and genetic algorithms (GA) that uses a multi-objective fitness function to seek the optimal subset of features. To assess the performance of the proposed approach, seven datasets have been collected from the UCI repository and exploited to test the newly established feature selection technique. The experimental results demonstrate that the suggested method GASI outperforms many powerful SI-based feature selection techniques studied. GASI obtains a better average fitness value and improves classification performance.
Implementation of Banker’s Algorithm Using Dynamic Modified Approachrahulmonikasharma
Banker’s algorithm referred to as resource allocation and deadlock avoidance algorithm that checks for the safety by simulating the allocation of predetermined maximum possible of resources and makes the system into s-state by checking the possible deadlock conditions for all other pending processes. It needs to know how much of each resource a process could possibly request. Number of processes are static in algorithm, but in most of system processes varies dynamically and no additional process will be started while it is in execution. The number of resources are not allow to go down while it is in execution. In this research an approach for Dynamic Banker's algorithm is proposed which allows the number of resources to be changed at runtime that prevents the system to fall in unsafe state. It also give details about all the resources and processes that which one require resources and in what quantity. This also allocates the resource automatically to the stopped process for the execution and will always give the appropriate safe sequence for the given processes.
Implementation of Banker’s Algorithm Using Dynamic Modified Approachrahulmonikasharma
Banker’s algorithm referred to as resource allocation and deadlock avoidance algorithm that checks for the safety by simulating the allocation of predetermined maximum possible of resources and makes the system into s-state by checking the possible deadlock conditions for all other pending processes. It needs to know how much of each resource a process could possibly request. Number of processes are static in algorithm, but in most of system processes varies dynamically and no additional process will be started while it is in execution. The number of resources are not allow to go down while it is in execution. In this research an approach for Dynamic Banker's algorithm is proposed which allows the number of resources to be changed at runtime that prevents the system to fall in unsafe state. It also give details about all the resources and processes that which one require resources and in what quantity. This also allocates the resource automatically to the stopped process for the execution and will always give the appropriate safe sequence for the given processes.
The process of determining eligibility for someone is usually given with the same value. The determination of eligibility is determined based on several criteria. These criteria are the ability to Academic Potential Test (APT) and Grade Point Average (GPA). The Tsukamoto fuzzy system is the model used in this paper. Each input variable is divided into two membership functions. There are nine rules of Tsukamoto's model applied. The system also provides following changes of parameters if the parameter values are to be changed. The result of the difference of the employability is given to them. The greater the APT score, the more the value of the feasibility of the work obtained. The more GPA values gained, the greater the value of the workplace eligibility.
Fuzzy Association Rule Mining based Model to Predict Students’ Performance IJECEIAES
The major intention of higher education institutions is to supply quality education to its students. One approach to get maximum level of quality in higher education system is by discovering knowledge for prediction regarding the internal assessment and end semester examination. The projected work intends to approach this objective by taking the advantage of fuzzy inference technique to classify student scores data according to the level of their performance. In this paper, student’s performance is evaluated using fuzzy association rule mining that describes Prediction of performance of the students at the end of the semester, on the basis of previous database like Attendance, Midsem Marks, Previous semester marks and Previous Academic Records were collected from the student’s previous database, to identify those students which needed individual attention to decrease fail ration and taking suitable action for the next semester examination.
Similar to Zakah Management System using Approach Classification (20)
Amazon products reviews classification based on machine learning, deep learni...TELKOMNIKA JOURNAL
In recent times, the trend of online shopping through e-commerce stores and websites has grown to a huge extent. Whenever a product is purchased on an e-commerce platform, people leave their reviews about the product. These reviews are very helpful for the store owners and the product’s manufacturers for the betterment of their work process as well as product quality. An automated system is proposed in this work that operates on two datasets D1 and D2 obtained from Amazon. After certain preprocessing steps, N-gram and word embedding-based features are extracted using term frequency-inverse document frequency (TF-IDF), bag of words (BoW) and global vectors (GloVe), and Word2vec, respectively. Four machine learning (ML) models support vector machines (SVM), logistic regression (RF), logistic regression (LR), multinomial Naïve Bayes (MNB), two deep learning (DL) models convolutional neural network (CNN), long-short term memory (LSTM), and standalone bidirectional encoder representations (BERT) are used to classify reviews as either positive or negative. The results obtained by the standard ML, DL models and BERT are evaluated using certain performance evaluation measures. BERT turns out to be the best-performing model in the case of D1 with an accuracy of 90% on features derived by word embedding models while the CNN provides the best accuracy of 97% upon word embedding features in the case of D2. The proposed model shows better overall performance on D2 as compared to D1.
Design, simulation, and analysis of microstrip patch antenna for wireless app...TELKOMNIKA JOURNAL
In this study, a microstrip patch antenna that works at 3.6 GHz was built and tested to see how well it works. In this work, Rogers RT/Duroid 5880 has been used as the substrate material, with a dielectric permittivity of 2.2 and a thickness of 0.3451 mm; it serves as the base for the examined antenna. The computer simulation technology (CST) studio suite is utilized to show the recommended antenna design. The goal of this study was to get a more extensive transmission capacity, a lower voltage standing wave ratio (VSWR), and a lower return loss, but the main goal was to get a higher gain, directivity, and efficiency. After simulation, the return loss, gain, directivity, bandwidth, and efficiency of the supplied antenna are found to be -17.626 dB, 9.671 dBi, 9.924 dBi, 0.2 GHz, and 97.45%, respectively. Besides, the recreation uncovered that the transfer speed side-lobe level at phi was much better than those of the earlier works, at -28.8 dB, respectively. Thus, it makes a solid contender for remote innovation and more robust communication.
Design and simulation an optimal enhanced PI controller for congestion avoida...TELKOMNIKA JOURNAL
In this paper, snake optimization algorithm (SOA) is used to find the optimal gains of an enhanced controller for controlling congestion problem in computer networks. M-file and Simulink platform is adopted to evaluate the response of the active queue management (AQM) system, a comparison with two classical controllers is done, all tuned gains of controllers are obtained using SOA method and the fitness function chose to monitor the system performance is the integral time absolute error (ITAE). Transient analysis and robust analysis is used to show the proposed controller performance, two robustness tests are applied to the AQM system, one is done by varying the size of queue value in different period and the other test is done by changing the number of transmission control protocol (TCP) sessions with a value of ± 20% from its original value. The simulation results reflect a stable and robust behavior and best performance is appeared clearly to achieve the desired queue size without any noise or any transmission problems.
Improving the detection of intrusion in vehicular ad-hoc networks with modifi...TELKOMNIKA JOURNAL
Vehicular ad-hoc networks (VANETs) are wireless-equipped vehicles that form networks along the road. The security of this network has been a major challenge. The identity-based cryptosystem (IBC) previously used to secure the networks suffers from membership authentication security features. This paper focuses on improving the detection of intruders in VANETs with a modified identity-based cryptosystem (MIBC). The MIBC is developed using a non-singular elliptic curve with Lagrange interpolation. The public key of vehicles and roadside units on the network are derived from number plates and location identification numbers, respectively. Pseudo-identities are used to mask the real identity of users to preserve their privacy. The membership authentication mechanism ensures that only valid and authenticated members of the network are allowed to join the network. The performance of the MIBC is evaluated using intrusion detection ratio (IDR) and computation time (CT) and then validated with the existing IBC. The result obtained shows that the MIBC recorded an IDR of 99.3% against 94.3% obtained for the existing identity-based cryptosystem (EIBC) for 140 unregistered vehicles attempting to intrude on the network. The MIBC shows lower CT values of 1.17 ms against 1.70 ms for EIBC. The MIBC can be used to improve the security of VANETs.
Conceptual model of internet banking adoption with perceived risk and trust f...TELKOMNIKA JOURNAL
Understanding the primary factors of internet banking (IB) acceptance is critical for both banks and users; nevertheless, our knowledge of the role of users’ perceived risk and trust in IB adoption is limited. As a result, we develop a conceptual model by incorporating perceived risk and trust into the technology acceptance model (TAM) theory toward the IB. The proper research emphasized that the most essential component in explaining IB adoption behavior is behavioral intention to use IB adoption. TAM is helpful for figuring out how elements that affect IB adoption are connected to one another. According to previous literature on IB and the use of such technology in Iraq, one has to choose a theoretical foundation that may justify the acceptance of IB from the customer’s perspective. The conceptual model was therefore constructed using the TAM as a foundation. Furthermore, perceived risk and trust were added to the TAM dimensions as external factors. The key objective of this work was to extend the TAM to construct a conceptual model for IB adoption and to get sufficient theoretical support from the existing literature for the essential elements and their relationships in order to unearth new insights about factors responsible for IB adoption.
Efficient combined fuzzy logic and LMS algorithm for smart antennaTELKOMNIKA JOURNAL
The smart antennas are broadly used in wireless communication. The least mean square (LMS) algorithm is a procedure that is concerned in controlling the smart antenna pattern to accommodate specified requirements such as steering the beam toward the desired signal, in addition to placing the deep nulls in the direction of unwanted signals. The conventional LMS (C-LMS) has some drawbacks like slow convergence speed besides high steady state fluctuation error. To overcome these shortcomings, the present paper adopts an adaptive fuzzy control step size least mean square (FC-LMS) algorithm to adjust its step size. Computer simulation outcomes illustrate that the given model has fast convergence rate as well as low mean square error steady state.
Design and implementation of a LoRa-based system for warning of forest fireTELKOMNIKA JOURNAL
This paper presents the design and implementation of a forest fire monitoring and warning system based on long range (LoRa) technology, a novel ultra-low power consumption and long-range wireless communication technology for remote sensing applications. The proposed system includes a wireless sensor network that records environmental parameters such as temperature, humidity, wind speed, and carbon dioxide (CO2) concentration in the air, as well as taking infrared photos.The data collected at each sensor node will be transmitted to the gateway via LoRa wireless transmission. Data will be collected, processed, and uploaded to a cloud database at the gateway. An Android smartphone application that allows anyone to easily view the recorded data has been developed. When a fire is detected, the system will sound a siren and send a warning message to the responsible personnel, instructing them to take appropriate action. Experiments in Tram Chim Park, Vietnam, have been conducted to verify and evaluate the operation of the system.
Wavelet-based sensing technique in cognitive radio networkTELKOMNIKA JOURNAL
Cognitive radio is a smart radio that can change its transmitter parameter based on interaction with the environment in which it operates. The demand for frequency spectrum is growing due to a big data issue as many Internet of Things (IoT) devices are in the network. Based on previous research, most frequency spectrum was used, but some spectrums were not used, called spectrum hole. Energy detection is one of the spectrum sensing methods that has been frequently used since it is easy to use and does not require license users to have any prior signal understanding. But this technique is incapable of detecting at low signal-to-noise ratio (SNR) levels. Therefore, the wavelet-based sensing is proposed to overcome this issue and detect spectrum holes. The main objective of this work is to evaluate the performance of wavelet-based sensing and compare it with the energy detection technique. The findings show that the percentage of detection in wavelet-based sensing is 83% higher than energy detection performance. This result indicates that the wavelet-based sensing has higher precision in detection and the interference towards primary user can be decreased.
A novel compact dual-band bandstop filter with enhanced rejection bandsTELKOMNIKA JOURNAL
In this paper, we present the design of a new wide dual-band bandstop filter (DBBSF) using nonuniform transmission lines. The method used to design this filter is to replace conventional uniform transmission lines with nonuniform lines governed by a truncated Fourier series. Based on how impedances are profiled in the proposed DBBSF structure, the fractional bandwidths of the two 10 dB-down rejection bands are widened to 39.72% and 52.63%, respectively, and the physical size has been reduced compared to that of the filter with the uniform transmission lines. The results of the electromagnetic (EM) simulation support the obtained analytical response and show an improved frequency behavior.
Deep learning approach to DDoS attack with imbalanced data at the application...TELKOMNIKA JOURNAL
A distributed denial of service (DDoS) attack is where one or more computers attack or target a server computer, by flooding internet traffic to the server. As a result, the server cannot be accessed by legitimate users. A result of this attack causes enormous losses for a company because it can reduce the level of user trust, and reduce the company’s reputation to lose customers due to downtime. One of the services at the application layer that can be accessed by users is a web-based lightweight directory access protocol (LDAP) service that can provide safe and easy services to access directory applications. We used a deep learning approach to detect DDoS attacks on the CICDDoS 2019 dataset on a complex computer network at the application layer to get fast and accurate results for dealing with unbalanced data. Based on the results obtained, it is observed that DDoS attack detection using a deep learning approach on imbalanced data performs better when implemented using synthetic minority oversampling technique (SMOTE) method for binary classes. On the other hand, the proposed deep learning approach performs better for detecting DDoS attacks in multiclass when implemented using the adaptive synthetic (ADASYN) method.
The appearance of uncertainties and disturbances often effects the characteristics of either linear or nonlinear systems. Plus, the stabilization process may be deteriorated thus incurring a catastrophic effect to the system performance. As such, this manuscript addresses the concept of matching condition for the systems that are suffering from miss-match uncertainties and exogeneous disturbances. The perturbation towards the system at hand is assumed to be known and unbounded. To reach this outcome, uncertainties and their classifications are reviewed thoroughly. The structural matching condition is proposed and tabulated in the proposition 1. Two types of mathematical expressions are presented to distinguish the system with matched uncertainty and the system with miss-matched uncertainty. Lastly, two-dimensional numerical expressions are provided to practice the proposed proposition. The outcome shows that matching condition has the ability to change the system to a design-friendly model for asymptotic stabilization.
Implementation of FinFET technology based low power 4×4 Wallace tree multipli...TELKOMNIKA JOURNAL
Many systems, including digital signal processors, finite impulse response (FIR) filters, application-specific integrated circuits, and microprocessors, use multipliers. The demand for low power multipliers is gradually rising day by day in the current technological trend. In this study, we describe a 4×4 Wallace multiplier based on a carry select adder (CSA) that uses less power and has a better power delay product than existing multipliers. HSPICE tool at 16 nm technology is used to simulate the results. In comparison to the traditional CSA-based multiplier, which has a power consumption of 1.7 µW and power delay product (PDP) of 57.3 fJ, the results demonstrate that the Wallace multiplier design employing CSA with first zero finding logic (FZF) logic has the lowest power consumption of 1.4 µW and PDP of 27.5 fJ.
Evaluation of the weighted-overlap add model with massive MIMO in a 5G systemTELKOMNIKA JOURNAL
The flaw in 5G orthogonal frequency division multiplexing (OFDM) becomes apparent in high-speed situations. Because the doppler effect causes frequency shifts, the orthogonality of OFDM subcarriers is broken, lowering both their bit error rate (BER) and throughput output. As part of this research, we use a novel design that combines massive multiple input multiple output (MIMO) and weighted overlap and add (WOLA) to improve the performance of 5G systems. To determine which design is superior, throughput and BER are calculated for both the proposed design and OFDM. The results of the improved system show a massive improvement in performance ver the conventional system and significant improvements with massive MIMO, including the best throughput and BER. When compared to conventional systems, the improved system has a throughput that is around 22% higher and the best performance in terms of BER, but it still has around 25% less error than OFDM.
Reflector antenna design in different frequencies using frequency selective s...TELKOMNIKA JOURNAL
In this study, it is aimed to obtain two different asymmetric radiation patterns obtained from antennas in the shape of the cross-section of a parabolic reflector (fan blade type antennas) and antennas with cosecant-square radiation characteristics at two different frequencies from a single antenna. For this purpose, firstly, a fan blade type antenna design will be made, and then the reflective surface of this antenna will be completed to the shape of the reflective surface of the antenna with the cosecant-square radiation characteristic with the frequency selective surface designed to provide the characteristics suitable for the purpose. The frequency selective surface designed and it provides the perfect transmission as possible at 4 GHz operating frequency, while it will act as a band-quenching filter for electromagnetic waves at 5 GHz operating frequency and will be a reflective surface. Thanks to this frequency selective surface to be used as a reflective surface in the antenna, a fan blade type radiation characteristic at 4 GHz operating frequency will be obtained, while a cosecant-square radiation characteristic at 5 GHz operating frequency will be obtained.
Reagentless iron detection in water based on unclad fiber optical sensorTELKOMNIKA JOURNAL
A simple and low-cost fiber based optical sensor for iron detection is demonstrated in this paper. The sensor head consist of an unclad optical fiber with the unclad length of 1 cm and it has a straight structure. Results obtained shows a linear relationship between the output light intensity and iron concentration, illustrating the functionality of this iron optical sensor. Based on the experimental results, the sensitivity and linearity are achieved at 0.0328/ppm and 0.9824 respectively at the wavelength of 690 nm. With the same wavelength, other performance parameters are also studied. Resolution and limit of detection (LOD) are found to be 0.3049 ppm and 0.0755 ppm correspondingly. This iron sensor is advantageous in that it does not require any reagent for detection, enabling it to be simpler and cost-effective in the implementation of the iron sensing.
Impact of CuS counter electrode calcination temperature on quantum dot sensit...TELKOMNIKA JOURNAL
In place of the commercial Pt electrode used in quantum sensitized solar cells, the low-cost CuS cathode is created using electrophoresis. High resolution scanning electron microscopy and X-ray diffraction were used to analyze the structure and morphology of structural cubic samples with diameters ranging from 40 nm to 200 nm. The conversion efficiency of solar cells is significantly impacted by the calcination temperatures of cathodes at 100 °C, 120 °C, 150 °C, and 180 °C under vacuum. The fluorine doped tin oxide (FTO)/CuS cathode electrode reached a maximum efficiency of 3.89% when it was calcined at 120 °C. Compared to other temperature combinations, CuS nanoparticles crystallize at 120 °C, which lowers resistance while increasing electron lifetime.
In place of the commercial Pt electrode used in quantum sensitized solar cells, the low-cost CuS cathode is created using electrophoresis. High resolution scanning electron microscopy and X-ray diffraction were used to analyze the structure and morphology of structural cubic samples with diameters ranging from 40 nm to 200 nm. The conversion efficiency of solar cells is significantly impacted by the calcination temperatures of cathodes at 100 °C, 120 °C, 150 °C, and 180 °C under vacuum. The fluorine doped tin oxide (FTO)/CuS cathode electrode reached a maximum efficiency of 3.89% when it was calcined at 120 °C. Compared to other temperature combinations, CuS nanoparticles crystallize at 120 °C, which lowers resistance while increasing electron lifetime.
A progressive learning for structural tolerance online sequential extreme lea...TELKOMNIKA JOURNAL
This article discusses the progressive learning for structural tolerance online sequential extreme learning machine (PSTOS-ELM). PSTOS-ELM can save robust accuracy while updating the new data and the new class data on the online training situation. The robustness accuracy arises from using the householder block exact QR decomposition recursive least squares (HBQRD-RLS) of the PSTOS-ELM. This method is suitable for applications that have data streaming and often have new class data. Our experiment compares the PSTOS-ELM accuracy and accuracy robustness while data is updating with the batch-extreme learning machine (ELM) and structural tolerance online sequential extreme learning machine (STOS-ELM) that both must retrain the data in a new class data case. The experimental results show that PSTOS-ELM has accuracy and robustness comparable to ELM and STOS-ELM while also can update new class data immediately.
Electroencephalography-based brain-computer interface using neural networksTELKOMNIKA JOURNAL
This study aimed to develop a brain-computer interface that can control an electric wheelchair using electroencephalography (EEG) signals. First, we used the Mind Wave Mobile 2 device to capture raw EEG signals from the surface of the scalp. The signals were transformed into the frequency domain using fast Fourier transform (FFT) and filtered to monitor changes in attention and relaxation. Next, we performed time and frequency domain analyses to identify features for five eye gestures: opened, closed, blink per second, double blink, and lookup. The base state was the opened-eyes gesture, and we compared the features of the remaining four action gestures to the base state to identify potential gestures. We then built a multilayer neural network to classify these features into five signals that control the wheelchair’s movement. Finally, we designed an experimental wheelchair system to test the effectiveness of the proposed approach. The results demonstrate that the EEG classification was highly accurate and computationally efficient. Moreover, the average performance of the brain-controlled wheelchair system was over 75% across different individuals, which suggests the feasibility of this approach.
Adaptive segmentation algorithm based on level set model in medical imagingTELKOMNIKA JOURNAL
For image segmentation, level set models are frequently employed. It offer best solution to overcome the main limitations of deformable parametric models. However, the challenge when applying those models in medical images stills deal with removing blurs in image edges which directly affects the edge indicator function, leads to not adaptively segmenting images and causes a wrong analysis of pathologies wich prevents to conclude a correct diagnosis. To overcome such issues, an effective process is suggested by simultaneously modelling and solving systems’ two-dimensional partial differential equations (PDE). The first PDE equation allows restoration using Euler’s equation similar to an anisotropic smoothing based on a regularized Perona and Malik filter that eliminates noise while preserving edge information in accordance with detected contours in the second equation that segments the image based on the first equation solutions. This approach allows developing a new algorithm which overcome the studied model drawbacks. Results of the proposed method give clear segments that can be applied to any application. Experiments on many medical images in particular blurry images with high information losses, demonstrate that the developed approach produces superior segmentation results in terms of quantity and quality compared to other models already presented in previeous works.
Automatic channel selection using shuffled frog leaping algorithm for EEG bas...TELKOMNIKA JOURNAL
Drug addiction is a complex neurobiological disorder that necessitates comprehensive treatment of both the body and mind. It is categorized as a brain disorder due to its impact on the brain. Various methods such as electroencephalography (EEG), functional magnetic resonance imaging (FMRI), and magnetoencephalography (MEG) can capture brain activities and structures. EEG signals provide valuable insights into neurological disorders, including drug addiction. Accurate classification of drug addiction from EEG signals relies on appropriate features and channel selection. Choosing the right EEG channels is essential to reduce computational costs and mitigate the risk of overfitting associated with using all available channels. To address the challenge of optimal channel selection in addiction detection from EEG signals, this work employs the shuffled frog leaping algorithm (SFLA). SFLA facilitates the selection of appropriate channels, leading to improved accuracy. Wavelet features extracted from the selected input channel signals are then analyzed using various machine learning classifiers to detect addiction. Experimental results indicate that after selecting features from the appropriate channels, classification accuracy significantly increased across all classifiers. Particularly, the multi-layer perceptron (MLP) classifier combined with SFLA demonstrated a remarkable accuracy improvement of 15.78% while reducing time complexity.
Cosmetic shop management system project report.pdfKamal Acharya
Buying new cosmetic products is difficult. It can even be scary for those who have sensitive skin and are prone to skin trouble. The information needed to alleviate this problem is on the back of each product, but it's thought to interpret those ingredient lists unless you have a background in chemistry.
Instead of buying and hoping for the best, we can use data science to help us predict which products may be good fits for us. It includes various function programs to do the above mentioned tasks.
Data file handling has been effectively used in the program.
The automated cosmetic shop management system should deal with the automation of general workflow and administration process of the shop. The main processes of the system focus on customer's request where the system is able to search the most appropriate products and deliver it to the customers. It should help the employees to quickly identify the list of cosmetic product that have reached the minimum quantity and also keep a track of expired date for each cosmetic product. It should help the employees to find the rack number in which the product is placed.It is also Faster and more efficient way.
Explore the innovative world of trenchless pipe repair with our comprehensive guide, "The Benefits and Techniques of Trenchless Pipe Repair." This document delves into the modern methods of repairing underground pipes without the need for extensive excavation, highlighting the numerous advantages and the latest techniques used in the industry.
Learn about the cost savings, reduced environmental impact, and minimal disruption associated with trenchless technology. Discover detailed explanations of popular techniques such as pipe bursting, cured-in-place pipe (CIPP) lining, and directional drilling. Understand how these methods can be applied to various types of infrastructure, from residential plumbing to large-scale municipal systems.
Ideal for homeowners, contractors, engineers, and anyone interested in modern plumbing solutions, this guide provides valuable insights into why trenchless pipe repair is becoming the preferred choice for pipe rehabilitation. Stay informed about the latest advancements and best practices in the field.
Saudi Arabia stands as a titan in the global energy landscape, renowned for its abundant oil and gas resources. It's the largest exporter of petroleum and holds some of the world's most significant reserves. Let's delve into the top 10 oil and gas projects shaping Saudi Arabia's energy future in 2024.
About
Indigenized remote control interface card suitable for MAFI system CCR equipment. Compatible for IDM8000 CCR. Backplane mounted serial and TCP/Ethernet communication module for CCR remote access. IDM 8000 CCR remote control on serial and TCP protocol.
• Remote control: Parallel or serial interface.
• Compatible with MAFI CCR system.
• Compatible with IDM8000 CCR.
• Compatible with Backplane mount serial communication.
• Compatible with commercial and Defence aviation CCR system.
• Remote control system for accessing CCR and allied system over serial or TCP.
• Indigenized local Support/presence in India.
• Easy in configuration using DIP switches.
Technical Specifications
Indigenized remote control interface card suitable for MAFI system CCR equipment. Compatible for IDM8000 CCR. Backplane mounted serial and TCP/Ethernet communication module for CCR remote access. IDM 8000 CCR remote control on serial and TCP protocol.
Key Features
Indigenized remote control interface card suitable for MAFI system CCR equipment. Compatible for IDM8000 CCR. Backplane mounted serial and TCP/Ethernet communication module for CCR remote access. IDM 8000 CCR remote control on serial and TCP protocol.
• Remote control: Parallel or serial interface
• Compatible with MAFI CCR system
• Copatiable with IDM8000 CCR
• Compatible with Backplane mount serial communication.
• Compatible with commercial and Defence aviation CCR system.
• Remote control system for accessing CCR and allied system over serial or TCP.
• Indigenized local Support/presence in India.
Application
• Remote control: Parallel or serial interface.
• Compatible with MAFI CCR system.
• Compatible with IDM8000 CCR.
• Compatible with Backplane mount serial communication.
• Compatible with commercial and Defence aviation CCR system.
• Remote control system for accessing CCR and allied system over serial or TCP.
• Indigenized local Support/presence in India.
• Easy in configuration using DIP switches.
2. TELKOMNIKA ISSN: 1693-6930
Zakah Management System using Approach Classification (Zulfajri Basri Hasanuddin)
1853
Zakah consists of two types: Maal Zakah is obligatory Zakah issued by Muslims who
have reached enough nishab and haul. Tithe is the obligatory soul Zakah for all Muslims ranging
from birth and during the life of the world, which is paid each month of Ramadan.
2.2. Al Qaradhawi Method
The Al Qaradhawi calculating method is a method of calculating Zakah based on Yusuf
al Qaradawi’s premise perspective where it used theistinbat method. Etymologically, istinbat
means the discovery, excavation, expenditure (of origin), meaningful laws, regulations and
powers.
2.3. Naive Bayes Classifier
Naive Bayes is a machine learning method that uses probability calculations. This
algorithm utilizes the methods of probability and statisticwhich are stated by the British scientist
Thomas Bayes, that predict the probability in the future based on past experience. Naive Bayes
method is shown to have an accuracy and a very high speed when it is applied to a database
with a large data. The basis of the Naive Bayes theorem thatis used in the programming is the
Bayes formula:
The flow of Naive Bayes method can be seen in Figure 1,
Figure 1. The Naive Bayes Method Flow
Start
Read Training Data
Is it numeric data
Quantity and
Probability
Mean of each
Parameter
Probability Table Standard Deviation for
each Parameter
Mean Standard
Deviation Table
Solution
Stop
No Yes
3. ISSN: 1693-6930
TELKOMNIKA Vol. 15, No. 4, December 2017 : 1852 – 1857
1854
Description:
a. Read the training data.
b. Calculate the amount and probability, but if the numerical data:
Find the mean and standard deviation of each parameter whichis numeric data.
The equation used to calculate the average arithmetic value (mean) can be seen as follows:
Or
Where :
μ: arithmetic average (mean)
xi: samples value to -i
n: number of samples
And equations to calculate the standard deviation (standard deviation) can be seen as follows:
Where :
σ: standard deviation
xi: value of x to -i
μ: average count
n: number of samples
Find probabilistic value by counting the number of the corresponding data from the
same category divided by the amount of data in the category.
Getting the values in the mean table, standard deviation and probability.
The solution is then generated.
3. Results and Analysis
3.1. Program Testing
This test was conducted to determine how to use the program, the speed of the
program and the ability of the program to perform the classification. The program test was
conducted using the confusion matrix method in which the author attempted four times and then
calculate the accuracy and error.
3.2. Program Usage
On the use of the program will be shown how to run Zakah Application. Musakki party
can open the website link of Zakah applications then perform the calculation process, as for the
types of Zakah contained in the system, namely Gold and silver Zakah, money Zakah and
securities, commercial Zakah (enterprise), agriculture Zakah, charity farm, income Zakah, and
rikaz Zakah (artifacts). User calculate the Zakah on owned property by input the amount of
assets being owned by type and then the system will calculate, if the nishab referring to the gold
price, the system will automatically calculate based on the prevailing price of Antam’s gold in
Indonesia, moreover, the system displays the calculation results of each types of Zakah and
sum the total shown in the results that can be seen in the following figure:
4. TELKOMNIKA ISSN: 1693-6930
Zakah Management System using Approach Classification (Zulfajri Basri Hasanuddin)
1855
Figure 2. The calculation of maal Zakah
3.3 Classification Prospective Zakah Recipients
Distribution of Zakah funds by Board of Zakah Al Markaz, Makassar, South Sulawesi is
done after determining mustahik using the classification process. Classification is made using
discreteand continuous data which is conducted by experiments using feasible and unfeasible
data. Examples of data that we use in the experiment are described in the Table 1:
Table 1. Total Practice and Test Data
Practice Data Test Data
Feasible Unfeasible Feasible Unfeasible
200 150 25 25
In the fourth experiment, the authors tested the 350 practice data consisting of 200
feasible data and 150 unfeasible data. Tested with 50 pieces of test data. With details of 25
feasible and 25 unfeasible data. To determine a mustahik candidate is classified in the feasible
or unfeasible class. Then the 7 features that will be used are as follows:
a. Income
b. Amenability
c. Ownership of Assets.
d. Surface area.
e. Wall Types.
f. Floor Types.
g. Last Education.
Board of Zakah admin performs the data entry of mustahik candidate on the form of test
data that is displayed as shown in Figure 3.
5. ISSN: 1693-6930
TELKOMNIKA Vol. 15, No. 4, December 2017 : 1852 – 1857
1856
Figure 3. Mustahik Candidate Data Input Form
After the data of candidate mustahiq was input as in Figure 3, the program will make the
process of classification using Naive Bayes method, then the system will show the calculation
results of classification which resulted in a decision support system that states a candidate
mustahik eligible to receive Zakah and entered in the mustahik category as shown in the
Figure 4.
Figure 4. The results of the mustahik prospective classification
6. TELKOMNIKA ISSN: 1693-6930
Zakah Management System using Approach Classification (Zulfajri Basri Hasanuddin)
1857
The use of the Naïve Bayes classifier method to classified the feasible and unfeasible of
theZakahreceiver is an effective and productive way. Due to the ease in determining of mustahik
candidates as is done by computerization with the terms and conditions in making decisions.
The author has conducted four experiments as shown in Table 2:
Table 2. Number of Practice and Test Data 4 Times Trial
Trial Practice Data Test Data
Percentage
Feasible Unfeasible Feasible Unfeasible
1 45 70 25 25 84%
2 85 70 25 25 84.5%
3 100 100 25 25 86%
4 150 110 25 25 88%
Where the results of the final percentage of the accuracy of the program's success is
the result of percentage of accuracy:
𝐴𝑐𝑐𝑢𝑟𝑎𝑐𝑦 =
84 + 84 + 86 + 88
4
∗ 100
=
342
4
∗ 100
= 85%
Accuracy or precision of the program that we have created with the use of Naive Bayes
methods experience the difference at every change of practice data that author used. Thus, it is
known that mustahik classification program that we made is very dependent and influenced by
the used data set.
4. Conclusion
Based on the results of the study as described above, it can be concluded that this
application can help musakki for calculating maal Zakah and tithes, using Naive Bayes as a
method of classification and can facilitate the Board of Zakah in determining whether a resident
is feasible or not to receive Zakah based on Islam Shari'a. The built program has shown the
final percentage of the accuracy is at average 85%. It is hoped that the development of program
can be combine with GIS in order to easily identify the mustahik address and region.
References
[1] Ahmada Adzrin Raja Raja Ahmad Marzuki Amiruddin Othmanb, Sufiyudin Muhammad Salleh.
Assessing the Satisfaction Level towards Recipients of Zakat Zakat Management. International
Accounting and Business Conference, Science Direct. 2015.
[2] Fenty EM A, Khodijah Hulliyah, Mayang Ekafitri. Applying Mobile Application Development Life Cycle
in the Development of Zakat Maal Mobile Web Application Using JQuery Mobile Framework. IEEE
Conference Publications. 2014: 89-92.
[3] Heit Kotter, Henning, Tim A. Majchrzak, Benjamin Ruland and Till Weber, "Evaluating Frameworks to
create Mobile Web Apps". the International 9
th
Konferensi dan Teknologi Web Information Systems,
2013
[4] Liping Fan, Xing Huang, Liu Yi. Fault Diagnosis for Fuel Cell based on Naive Bayesian
Classification. TELKOMNIKA. 2013; 11(12): pp. 7664 ~ 7670 e-ISSN: 2087-278X.
[5] Muhammad Syukri Salleh1. Organizational And definitional reconfiguration Of Zakat
Management. International Journal of Education and Research. 2014; 2(5).
[6] Rizuan Abdul Kadir Mohd Zulkifli Zainal Abidin, Juliana Anis Ramli, Khairul Nizam Surbaini, Factors
Influencing A Business Towards Zakat Payment In Malaysia. International Journal of Science
Commerce and Humanities. 2014; 2(3).
[7] Ram Saad Al Jaffri, Norazita Marina Abdul Aziz, Norfaiezah Sawandi. Islamic accountability
framework in the Zakat funds management. International Conference on Accounting Studies, ICAS.
2014.