A SMART WIZARD SYSTEM SUITABLE FOR USE WITH INTERNET MOBILE DEVICES TO ADJUST...ijsptm
The privacy of personal information is an important issue affecting the confidence of internet users. The
widespread adoption of online social networks and access to these platforms using mobile devices has
encouraged developers to make the systems and interfaces acceptable to users who seek privacy. The aim
of this study is to test a wizard that allows users to control the sharing of personal information with others.
We also assess the concerns of users regarding such sharing, such as whether to hide personal data in
current online social network accounts. Survey results showed the wizard worked very well and that
females concealed more personal information than did males. In addition, most users who were concerned
about misuse of personal information hid those items. The results can be used to upgrade current privacy
systems or to design new systems that work on mobile internet devices. The system can also be used to save
time when setting personal privacy settings and makes users more aware of items that will be shared with
others.
Ethical and social issues in management information systems for BBA hons pro...Tonmoy zahid Rishad
Understanding Ethical and Social Issues Related to Systems
In the past 10 years, we have witnessed, arguably, one of the most ethically challenging periods for U.S. and global business. In today’s new legal environment, managers who violate the law and are convicted will most likely spend time in prison. Ethics refers to the principles of right and wrong that individuals, acting as free moral agents, use to make choices to guide their behaviors. When using information systems, it is essential to ask, “What is the ethical and socially responsible course of action?”
A Model for Thinking about Ethical, Social and Political Issues
Ethical, social, and political issues are closely linked. The ethical dilemma you may face as a manager of information systems typically is reflected in social and political debate.
Ethical And Social Issues in MIS - Management Information SystemFaHaD .H. NooR
Information ethics has been defined as "the branch of ethics that focuses on the relationship between the creation, organization, dissemination, and use of information, and the ethical standards and moral codes governing human conduct in society".[1] The term information ethics was first coined by Robert Hauptman and used in the book Ethical Challenges in Librarianship. It examines the morality that comes from information as a resource, a product, or a target.[2] It provides a critical framework for considering moral issues concerning informational privacy, moral agency (e.g. whether artificial agents may be moral), new environmental issues (especially how agents should behave in the infosphere), and problems arising from the life-cycle (creation, collection, recording, distribution, processing, etc.) of information (especially ownership and copyright, the digital divide, and digital rights). It is vital that librarians, archivists, and other information professionals understand the importance of disseminating accurate information and of acting responsibly when handling it.[3]
Information ethics has evolved to relate to a range of fields such as computer ethics,[4] medical ethics, journalism[5] and the philosophy of information.
Dilemmas regarding the life of information are becoming increasingly important in a society that is defined as "the information society". The rapid expansion of technology has brought information ethics to the forefront of ethical considerations. Information transmission and literacy are essential concerns in establishing an ethical foundation that promotes fair, equitable, and responsible practices. Information ethics broadly examines issues related to ownership, access, privacy, security, and community. It is also concerned with relational issues such as "the relationship between information and the good of society, the relationship between information providers and the consumers of information".[6]
Information technology affects common issues such as copyright protection, intellectual freedom, accountability, privacy, and security. Many of these issues are difficult or impossible to resolve due to fundamental tensions between Western moral philosophies (based on rules, democracy, individual rights, and personal freedoms) and the traditional Eastern cultures (based on relationships, hierarchy, collective responsibilities, and social harmony).[7] The multi-faceted dispute between Google and the government of the People's Republic of China reflects some of these fundamental tensions.
Authorization mechanism for multiparty data sharing in social networkeSAT Publishing House
IJRET: International Journal of Research in Engineering and Technology is an international, peer-reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of Engineering and Technology.
Policy resolution of shared data in online social networks IJECEIAES
Online social networks have become practically a go-to source for divulging information, social exchanges and finding new friends. The popularity of such sites is so profound that they are widely used by people belonging to different age groups and various regions. Widespread use of such sites has given rise to privacy and security issues. This paper proposes a set of rules to be incorporated to safeguard the privacy policies of related users while sharing information and other forms of media online. The proposed access control network takes into account the content sensitivity and the confidence level of the accessor to resolve the conflicting privacy policies of the co-owners.
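The conflict-resolution idea above can be illustrated with a minimal Python sketch. This is not the paper's actual model: the threshold rule, the 0-to-1 sensitivity/trust scales, and the function name are all assumptions for illustration.

```python
# Sketch of a multiparty privacy-conflict resolver (illustrative only):
# each co-owner votes allow/deny; a denial stands unless the accessor's
# trust (confidence level) outweighs the item's content sensitivity.

def resolve_access(sensitivity: float, accessor_trust: float,
                   owner_votes: list) -> bool:
    """Grant access if every co-owner allows it, or if the accessor's
    trust exceeds the content sensitivity despite some denials."""
    if all(owner_votes):
        return True
    return accessor_trust > sensitivity

print(resolve_access(0.8, 0.5, [True, False]))  # False: sensitive item, low trust
print(resolve_access(0.3, 0.9, [True, False]))  # True: highly trusted accessor
```

A real system would also weight votes by each co-owner's stake in the item (e.g. who is tagged in a photo), but the core trade-off between sensitivity and accessor confidence is as above.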
Comprehensive Social Media Security Analysis & XKeyscore Espionage TechnologyCSCJournals
Social networks offer many services to users for sharing activities, events and ideas. Many attacks can happen to social networking websites because of the trust users place in them. This paper discusses cyber threats: we study the types of cyber threats, classify them and give some suggestions to protect social networking websites from a variety of attacks. Moreover, we give some anti-threat strategies along with future trends.
The advancement of Information Technology has hastened the ability to disseminate information across the globe. In particular, recent trends in ‘Social Networking’ have led to a spike in personally sensitive information being published on the World Wide Web. While such socially active websites are creative tools for expressing one’s personality, they also entail serious privacy concerns. Thus, social networking websites could be termed a double-edged sword. It is important for the law to keep abreast of these developments in technology. The purpose of this paper is to demonstrate the limits of extending existing laws to battle privacy intrusions on the Internet, especially in the context of social networking. It is suggested that privacy-specific legislation is the most appropriate means of protecting online privacy. In doing so, it is important to maintain a balance with the competing right to free expression, the failure of which may hinder the reaping of the benefits offered by Internet technology.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
How do social scientists use link data (11 june2010)Han Woo PARK
The author would like to thank Bernie Horgan, Rob Ackland, Jeong-Soo Seo, and Yeon-ok Lee for their helpful comments on an earlier draft. Part of this research was carried out during the author’s stay at the Oxford Internet Institute. During the preparation of the final manuscript, this research was supported by the WCU project granted by the South Korean Government. This paper was presented at the 2010 International Communication Association conference held in Singapore. http://www.icahdq.org/conferences/2010/
Structural Health Monitoring and Strengthening Of BridgesEditor IJCATR
This paper presents one bridge that was either rehabilitated or strengthened by using FRP composites. The resulting structure was then tested for the effect of using FRP composites for rehabilitation and strengthening. In this paper, the basics of Structural Health Monitoring (SHM) are covered, along with the future need for SHM in the Indian scenario. The use of FRP composites in the rehabilitation and strengthening of structures is becoming increasingly popular and is opening new possibilities in the construction and rehabilitation of structures.
Integration of Bayesian Theory and Association Rule Mining in Predicting User...Editor IJCATR
Bayesian theory and association rule mining are artificial intelligence techniques that have been used in various computing fields, especially in machine learning. The Internet has been considered an easy ground for vices like radicalization because of its diverse nature and ease of information access. These vices could be managed using recommender system methods, which are used to deliver users’ preference data based on their previous interests and in relation to the community around the user. Recommender systems are divided into two broad categories: collaborative systems, which consider users who share the same preferences as the user in question, and content-based recommender systems, which tend to recommend websites similar to those already liked by the user. Recent research and information from security organs indicate that online radicalization has been growing at an alarming rate. This paper reviews in depth what has been carried out in recommender systems and looks at how these methods could be combined to form a strong system to monitor and manage the online menace resulting from radicalization. The relationship between different websites, and the trend arising from continuous access to these websites, forms the basis for probabilistic reasoning in understanding users’ behavior. Association rule mining has been widely used in recommender systems for profiling and generating users’ preferences. To add probabilistic reasoning, considering the magnitude of the Internet and more so of social media, Bayesian theory is incorporated. The combination of these two techniques provides better analysis of the results, thereby adding reliability and knowledge to the results.
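The pairing of the two techniques can be sketched in a few lines of Python. The browsing sessions, site names, and probability values below are purely illustrative assumptions, not data from the paper; the sketch only shows how rule confidence (a conditional probability) can feed a Bayesian update of a belief about a user.

```python
# Toy browsing sessions: each set is the site categories visited together.
sessions = [
    {"news", "forum_x"}, {"news", "forum_x", "video"},
    {"news", "video"}, {"forum_x", "video"}, {"news", "forum_x"},
]

def confidence(antecedent, consequent):
    """Association-rule confidence, i.e. P(consequent | antecedent)."""
    ante = [s for s in sessions if antecedent in s]
    both = [s for s in ante if consequent in s]
    return len(both) / len(ante)

def posterior(prior, likelihood, evidence):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# The rule "news -> forum_x" holds in 3 of the 4 news sessions:
conf = confidence("news", "forum_x")
print(conf)  # 0.75

# Assumed profile probabilities (illustrative): prior belief that a user
# matches a profile, likelihood of the observed visit under that profile,
# and the overall visit rate, combined via Bayes' rule.
print(posterior(prior=0.2, likelihood=0.9, evidence=0.75))
```

Here the mined rules supply the conditional probabilities, and the Bayesian step turns a stream of observed accesses into a revised, probabilistic user profile.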
WAP, HTTP and HTML5 Web Socket Architecture Analysis in Contemporary Mobile A...Editor IJCATR
Accessing current and accurate information anywhere and at anytime is becoming a growing interest nowadays. Wireless
Application Protocol (WAP) is an application protocol that creates an opportunity to access information of any interest from WAP
servers using mobile phones. WAP is an enabling technology based on the Internet client server architecture model, for developing
client application for handheld devices or other wireless terminal which usually have less powerful CPU’s, less memory, very
restricted power consumption, smaller and variant displays, phone keypads etc. This paper analyses the features of WAP in relation to
the well established HyperText Transfer Protocol (HTTP) technology, the web socket API innovations introduced in HTML5, the
recent improvements in mobile devices processing capacity by connecting to cloud services and how application can be developed on
them using modern tools. The features that are more adapted to client development of micro-devices are used for the technology
application test.
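The key difference between HTTP and the HTML5 web socket is that a WebSocket starts life as an HTTP request and is then upgraded to a persistent, full-duplex channel. The upgrade handshake from RFC 6455 can be reproduced in a few lines of standard-library Python: the server proves it understood the upgrade by hashing the client's key with a fixed GUID.

```python
import base64
import hashlib

# RFC 6455: the server concatenates the client's Sec-WebSocket-Key with
# this fixed GUID, SHA-1 hashes it, and returns the base64 digest as
# the Sec-WebSocket-Accept header, completing the HTTP -> WebSocket upgrade.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def websocket_accept(client_key: str) -> str:
    digest = hashlib.sha1((client_key + WS_GUID).encode()).digest()
    return base64.b64encode(digest).decode()

# Example key/accept pair taken from RFC 6455 itself:
print(websocket_accept("dGhlIHNhbXBsZSBub25jZQ=="))
# s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

After this handshake the connection no longer follows HTTP's request/response cycle, which is what makes web sockets attractive for the resource-constrained, latency-sensitive mobile clients the paper discusses.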
Impacts of Object Oriented Programming on Web Application DevelopmentEditor IJCATR
Development of web application nowadays can hardly survive without object oriented approach except for the purpose of just
information display. The complexity of application development and the need for content organization has raised the need for web
application developers to embrace object oriented programming approach. This paper exposes the impact of object oriented programming
on web application development. The exposition was done through a detailed study and analysis of information from secondary sources.
The internet was usefully employed to access journal articles for both national and international sources. Our study enables web
developers and designers to understand web application features, tools and methodologies for developing web application. It also keeps
researchers and scholars abreast of the boost which OOP has brought into Web Applications development
A Survey of Existing Mechanisms in Energy-Aware Routing In MANETsEditor IJCATR
A mobile ad hoc network (MANET) is a distributed and Self-organized network. In MANET, network topology
frequently changes because of high mobility nodes. Mobility of nodes and battery energy depletion are two major factors that cause loss
of the discovered routes. battery power depletion causes the nodes to die and loss of the obtained paths and thus affects the network
connectivity. Therefore, a routing protocol for energy efficiency should consider all the aspects to manage the energy consumption in
the network. so introducing an energy aware routing protocol, is one of the most important issues in MANET. This paper reviews some
energy aware routing protocols. The main purpose energy aware protocols are efficiently use of energy, reducing energy consumption
and increasing the network lifetime
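One family of energy-aware route-selection metrics the survey touches on can be sketched concisely: prefer the route whose weakest node has the most residual battery, so no single depleted node kills the path. The node IDs and energy values below are invented for illustration.

```python
# Min-max residual-energy route selection (a common energy-aware metric):
# among candidate routes, pick the one whose most-depleted node still has
# the largest remaining battery energy.
def best_route(routes, energy):
    """routes: list of node-id lists; energy: node-id -> joules remaining."""
    return max(routes, key=lambda r: min(energy[n] for n in r))

energy = {"A": 5.0, "B": 1.2, "C": 4.0, "D": 3.5}
routes = [["A", "B", "D"], ["A", "C", "D"]]
print(best_route(routes, energy))  # ['A', 'C', 'D'] - avoids depleted node B
```

A shortest-path protocol would happily route through B until it dies; the min-max rule instead spreads load away from nearly exhausted nodes, extending the network lifetime.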
Spam Detection in Social Networks Using Correlation Based Feature Subset Sele...Editor IJCATR
Bayesian classifier works efficiently on some fields, and badly on some. The performance of Bayesian Classifier suffers in
fields that involve correlated features. Feature selection is beneficial in reducing dimensionality, removing irrelevant data,
incrementing learning accuracy, and improving result comprehensibility. But, the recent increase of dimensionality of data place a hard
challenge to many existing feature selection methods with respect to efficiency and effectiveness. In this paper, Bayesian Classifier
with Correlation Based Feature Selection is introduced which can key out relevant features as well as redundancy among relevant
features without pair wise correlation analysis. The efficiency and effectiveness of our method is presented through broad.
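The intuition behind correlation-based feature selection can be shown with Hall's merit heuristic, which rewards subsets whose features correlate with the class but not with each other. This sketch uses plain Pearson correlation and invented toy data; the paper's actual method works without explicit pairwise analysis, so treat this only as an illustration of the underlying criterion.

```python
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def merit(subset, features, target):
    """CFS merit: k*r_cf / sqrt(k + k*(k-1)*r_ff), i.e. high average
    feature-class correlation, penalized by feature-feature redundancy."""
    k = len(subset)
    r_cf = sum(abs(pearson(features[f], target)) for f in subset) / k
    pairs = [(a, b) for i, a in enumerate(subset) for b in subset[i + 1:]]
    r_ff = (sum(abs(pearson(features[a], features[b])) for a, b in pairs)
            / len(pairs)) if pairs else 0.0
    return k * r_cf / sqrt(k + k * (k - 1) * r_ff)

# Toy data: f1 predicts the class perfectly; f2 is largely redundant with f1.
target = [0, 1, 0, 1, 1, 0]
features = {"f1": [0, 1, 0, 1, 1, 0], "f2": [0, 1, 0, 1, 0, 0]}
print(merit(["f1"], features, target))        # ~1.0
print(merit(["f1", "f2"], features, target))  # lower: redundancy is penalized
```

Adding the redundant feature f2 lowers the subset's merit even though f2 is itself correlated with the class, which is exactly the behavior that protects a naive Bayesian classifier from correlated inputs.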
Data Mining: Investment risk in the bankEditor IJCATR
This paper will discuss the technology and methods behind data mining, how data mining works, how it helps to improve
national security, and how sustainable the technology is. Sustainability, with regard to data mining, refers to the impact on the quality of
life. Quality of life refers to the preservation of human rights and the ability to feel secure. The ethics and the fallbacks regarding privacy
will also be discussed in depth, including the benefits that accompany these fallbacks, and whether they outweigh the cons. Both technical
and ethical articles will be used to highlight and discuss the potential, good and bad, and the controversy of data mining. Applications
of data mining to security will also be proposed.
Data mining methods are expanding rapidly allowing for the mass collection of information. This mass amount of information is then
used by many government agencies to identify threats, gain intelligence, and obtain a better understanding of enemy networks. However,
the ability to collect this information from any computer draws into question whether or not data mining leads to a violation of the
average citizen’s privacy and has created a debate as to if data mining is ethically plausible
Combining Neural Network and Firefly Algorithm to Predict Stock Price in Tehr...Editor IJCATR
In the present research, prediction of stock price index in Tehran stock exchange by using neural
networks and firefly algorithm in chaotic behavior of price index stock exchange are studied. Two data sets
are selected for neural network input. Various breaks of index and macro economic factors are considered
as independent variables. Also, firefly algorithm is used to [redict price index in next week. The results of
research show that combining neural networks and firefly optimization algorithm has better performance
than neural network to predict the price index. In addition, acceptable value of error-sequre means for
network error in test data show that there are chaotic mevements in behaviour of price index.
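The firefly algorithm mentioned above is a swarm optimizer in which dimmer fireflies move toward brighter (lower-cost) ones, with attraction fading over distance. A minimal sketch follows, minimizing the sphere function; every parameter value here is an illustrative default, not a setting from the paper.

```python
import math
import random

# Minimal firefly algorithm (Yang's scheme) minimizing f(x) = sum(x_i^2).
def firefly_minimize(f, dim=2, n=15, iters=60,
                     beta0=1.0, gamma=0.01, alpha=0.2, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        alpha *= 0.97  # gradually damp the random walk
        for i in range(n):
            for j in range(n):
                if f(pop[j]) < f(pop[i]):  # j is "brighter" (lower cost)
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attraction fades with distance
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
    return min(pop, key=f)

def sphere(x):
    return sum(v * v for v in x)

best = firefly_minimize(sphere)
print(best, sphere(best))  # a point close to the optimum at [0, 0]
```

In the hybrid described by the paper, the cost function would be the network's prediction error rather than the sphere function, so the fireflies search over network weights (or hyperparameters) instead of points in the plane.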
Development of Web-based Job Fair Information SystemEditor IJCATR
The development of information technology should be ordered to improve the services including job fair information system
services. This work aimed to develop of web-based job fair information system. The methods used in this work consists of collecting
data method, and software development method. Collecting data method using observations, interview, and literature study. Software
development method using waterfall model comprising the steps of requirements, specfification and design, implementation, testing,
deployment, and maintenance. The results of this work is software web-based information system provided a job information,
registration, and test schedule information.
LEACH is a hierarchical protocol in which most nodes transmit to cluster heads, and the cluster heads aggregate and compress the data and forward it to the base station (sink). In LEACH, a TDMA-based MAC protocol is integrated with clustering and a simple “routing” protocol. The goal of LEACH is to lower the energy consumption required to create and maintain clusters, or to use the energy of the nodes in such a manner as to improve the lifetime of a wireless sensor network. In this paper we present an overview of the different protocol changes made to LEACH to improve network lifetime, throughput, network coverage area, etc.
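The rotation of the cluster-head role in LEACH follows a simple randomized threshold: with desired head fraction p, a node that has not yet served in the current epoch elects itself with probability T(n) = p / (1 - p * (r mod 1/p)) in round r. A small sketch (node counts and the seed are illustrative):

```python
import random

def is_cluster_head(p, r, was_head_this_epoch, rng):
    """LEACH election threshold T(n) = p / (1 - p * (r mod 1/p)); nodes
    that already served as head this epoch sit out until the epoch ends."""
    if was_head_this_epoch:
        return False
    epoch = round(1 / p)               # epoch length in rounds
    t = p / (1 - p * (r % epoch))      # threshold rises as the epoch ages
    return rng.random() < t

rng = random.Random(42)
p = 0.05  # desired fraction of cluster heads per round
heads = sum(is_cluster_head(p, 0, False, rng) for _ in range(1000))
print(heads)  # close to p * 1000 = 50 self-elected heads in round 0
```

Because the threshold reaches 1 in the last round of each epoch, every surviving node is guaranteed a turn as head, which is how LEACH spreads the expensive aggregation-and-forwarding work evenly and extends network lifetime.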
An Improved Energy Efficient Wireless Sensor Networks Through Clustering In C...Editor IJCATR
One of the major reason for performance degradation in Wireless sensor network is the overhead due to control packet and
packet delivery degradation. Clustering in cross layer network operation is an efficient way manage control packet overhead and which
ultimately improve the lifetime of a network. All these overheads are crucial in a scalable networks. But the clustering always suffer
from the cluster head failure which need to be solved effectively in a large network. As the focus is to improve the average lifetime of
sensor network the cluster head is selected based on the battery life of nodes. The cross-layer operation model optimize the overheads
in multiple layer and ultimately the use of clustering will reduce the major overheads identified and their by the energy consumption
and throughput of wireless sensor network is improved. The proposed model operates on two layers of network ie., Network Layer
and Transport Layer and Clustering is applied in the network layer . The simulation result shows that the integration of two layers
reduces the energy consumption and increases the throughput of the wireless sensor networks.
Understanding Working Memory for Improving LearningEditor IJCATR
A web-based working memory (WM) test system is a management system website that allows students to test their ability
and skills in remembering visual patterns. It also enables you to record and store the data for individuals. The system is developed
using HTML, PHP and MySQL as a database system to manage and store the data. The system targets several users: children and
adults who suffer from attention deficit or learning problems. The main objectives for developing the website are to educate the
community on the benefits of performing the working memory test of the activity of the brain and improvements in social skills and
improving poor academic and professional performance, especially in maths and reading comprehension. This study implements a set
of tasks, testing 59 adults aged 18-24 years of age at King Abdul-Aziz University for testing and measuring WM and cognitive
abilities. Results showed tests depended on the age entry by the user. The implications of the test results will help people know their
WM level before ascertaining the appropriate suggestions, and to make the test suit our society.
Applications of Nano Electrical Machines used in Ball mills for Nano and Pyro...Editor IJCATR
Nano fillers play a vital role in increasing the performance of different types of motors. In recent years, nanotechnology has brought tremendous improvement to the manufacture of high-performance electronic devices and circuits, and of electrical apparatus and equipment. In this paper, a wide literature survey was carried out on the field of nano dielectrics and nano-coated motors. A comparison of motors coated with different nano fillers was made to show which motor had superior performance characteristics. The survey covers previous research on applying nanotechnology to coat the enamel used in motors with nano fillers. Ball mills use three-phase induction motors for their mechanical operations, and they are used to manufacture the nano powders employed in both nanotechnology and pyrotechnics. Industries should be well equipped with safety devices and should follow safety norms to avoid fire accidents. A pyrotechnics research centre is located in Sivakasi to motivate engineers and the public to take an interest in pyro industries and to train personnel in safety measures for working with the nano pyro powders used in these industries. The powders used here are always in the nano range, but workers are often unaware of this, so this paper aims to inform the people working in the nano-pyro industries of Sivakasi. Sivakasi is an industrial city in South India with more than 15,000 nano-pyro-based industries, so this paper will educate the engineers, managers, and other people associated with these industries.
Applications of Nano Technology in Pyro Industries located in Sivakasi Editor IJCATR
This paper deals with the application of nanotechnology in the pyro industries located in Sivakasi. It also gives information about the geography and history of the city, the location of its industries, its pyro-powder manufacturing companies, the application of SEM, and the ball mills present there. Sivakasi is one of the largest industrial centres in India, and it is called "Kutti Japan" because of the enormous number of industries in the heart of the city. It is also one of the important holy places in the country. The city has a hot climate throughout the day, although in recent years, due to El Niño, climatic conditions have changed and there is now abundant rainfall even in the summer season. This paper is useful for students studying the pyro industries present in Sivakasi and the applications of nanotechnology within them. Hereafter, the pyro industries using nanotechnology to improve the quality and performance of fireworks in Sivakasi may be called "nano pyro industries", and the city may be called a "Nano Pyro Industrial City". Pyro industries developed in the city because of its low annual rainfall and dry climate throughout the year; the land in Sivakasi is also called "sulphur land" because of the heavy temperatures. Nowadays this trend is changing with the climatic shifts caused by El Niño: in 2015, the summer season in Sivakasi turned rainy owing to unstable winds from the Bay of Bengal.
Comparisons of QoS in VoIP over WIMAX by Varying the Voice codes and Buffer sizeEditor IJCATR
Voice over Internet Protocol (VoIP) is a voice communication system based on voice packets transmitted over an IP network, providing real-time voice communication across networks using Internet protocols. Quality of Service (QoS) mechanisms are applied to guarantee that voice packets are delivered with reduced delay or drop according to their assigned priority. In this paper, simulation models are presented to investigate the effect of VoIP codecs and buffer size on quality of service, using OPNET Modeler version 14.5. The performance of the proposed configurations is analyzed and compared in terms of VoIP quality of service. The final simulation results show that VoIP performs best under the G.729 voice encoder scheme with a buffer size of 256 Kb over a WiMAX network.
Duplicate Code Detection using Control StatementsEditor IJCATR
Code clone detection is an important area of research, as reusability is a key factor in software evolution. Duplicate code degrades the design and structure of software and software qualities such as readability, changeability, and maintainability. Code clones also increase maintenance cost, since incorrect changes in copied code may lead to more errors. In this paper we address structural code-similarity detection and propose new methods to detect structural clones using the structure of control statements, i.e., the order of control statements used in the source code. We consider two orderings of control structures: (i) the sequence of control statements as it appears in the source, and (ii) the execution flow of control statements.
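The first ordering, the control-statement sequence as it appears in the source, can be illustrated with a small sketch. The paper does not specify a language or tooling; here we assume Python source as input and use its `ast` module plus a simple sequence-similarity ratio as stand-ins for the paper's method:

```python
import ast
from difflib import SequenceMatcher

# Control-statement node types we track, mapped to readable labels.
CONTROL = {ast.If: "if", ast.For: "for", ast.While: "while", ast.Try: "try"}

def control_sequence(src):
    """Return the control statements of `src` in source-appearance order."""
    tree = ast.parse(src)
    nodes = [n for n in ast.walk(tree) if type(n) in CONTROL]
    nodes.sort(key=lambda n: (n.lineno, n.col_offset))  # restore textual order
    return [CONTROL[type(n)] for n in nodes]

def clone_score(a_src, b_src):
    """Similarity of two fragments based only on their control-statement sequences."""
    return SequenceMatcher(None, control_sequence(a_src),
                           control_sequence(b_src)).ratio()
```

Two fragments with identical control structure but different identifiers score 1.0, which is exactly the kind of structural clone the abstract describes.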
An Access Control Model for Collaborative Management of Shared Data in OSNSIJMER
Assessing the Knowledge on Internet Addiction and Cybersecurity: A Literature...AJHSSR Journal
ABSTRACT : The Internet of Things (IoT) is a significant research topic with many challenges, and it affects many areas of our lives, including healthcare. The purpose of this paper is to examine the current state of knowledge on Internet addiction and online privacy and security issues, with a focus on identifying gaps in the literature, quantifying the research, and highlighting areas in need of further work. The paper also aims to provide guidance for creating insightful and helpful systematic literature review articles. We present a thorough review of the different security and privacy risks that threaten the well-being of OSN users in general, and children in particular. We also present an overview of existing solutions that can provide better protection, security, and privacy for OSN users' identities and lives. In addition, we provide a comprehensive survey of how recent and ongoing advances in technology have motivated the development of affordable healthcare gadgets and connected health services using IoT. The COVID-19 pandemic has also led to new cyber-security threats and privacy issues.
KEYWORDS : Internet of Things, Literature Review, Internet Addiction, Cybersecurity
Collusion-resistant multiparty data sharing in social networksIJECEIAES
The number of users on online social networks (OSNs) has grown tremendously over the past few years, with sites like Facebook amassing over a billion users. With the popularity of OSNs, an increase in privacy risk from the large volume of sensitive and private data is inevitable. While there are many access-control features for an individual user, most OSNs still lack concrete mechanisms to preserve the privacy of data shared between multiple users. The proposed method uses metrics such as identity leakage (IL) and strength of interaction (SoI) to fine-tune the scenarios that use privacy risk and sharing loss to identify and resolve conflicts. In addition to conflict resolution, bot detection is performed to mitigate collusion attacks. The final decision to share the data item is then made based on whether it passes the threshold conditions for the above metrics.
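The threshold-based sharing decision can be illustrated with a toy sketch. The abstract does not specify how identity leakage (IL) and strength of interaction (SoI) are aggregated, so the formula, weights, and function names below are purely hypothetical:

```python
def share_decision(identity_leakage, interaction_strength, risk_threshold=0.5):
    """Hypothetical aggregation of the two metrics named in the abstract.

    identity_leakage     (IL)  in [0, 1]: how much of the subject's identity
                                          the item would expose.
    interaction_strength (SoI) in [0, 1]: tie strength between sharer and viewer;
                                          strong ties discount the risk.
    The item is shared only when the discounted risk stays below the threshold.
    This weighting is illustrative, not the paper's actual model.
    """
    effective_risk = identity_leakage * (1.0 - interaction_strength)
    return effective_risk < risk_threshold
```

For example, a high-leakage item shared with a very close contact passes (risk is discounted by the strong tie), while the same item shared with a stranger is blocked.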
APPLYING THE TECHNOLOGY ACCEPTANCE MODEL TO UNDERSTAND SOCIAL NETWORKING ijcsit
This study examines individuals' participation intentions and behaviour on Social Networking Sites (SNSs). For this purpose, the Technology Acceptance Model (TAM) is utilized and extended through the addition of a "perceived social capital" construct, aiming to increase its explanatory power and predictive ability in this context. Data collected from a survey of 1100 participants, distilled to 657 usable sets, was analysed to assess the predictive power of the proposed model via structural equation modelling. The proposed model explains 56% of the variance in "Participation Intentions" and 55% of the variance in "Participation Behaviour". The contribution of behavioural intention to the model's explanatory power was the highest amongst the constructs (able to explain 28% of usage behaviour), while "Attitude" explains around 11% of SNS usage behaviour. The findings also show that the "Perceived Social Capital" construct has a notable impact on usage behaviour; this impact comes indirectly through its direct effect on "Attitude" and "Perceived Usefulness". The contribution of "Perceived Social Capital" to the model's explanatory power was the third highest amongst the constructs, explaining around 9% of SNS usage behaviour on its own.
Cyber Ethics An Introduction by Paul A. Adekunte | Matthew N. O. Sadiku | Jan...ijtsrd
Cyber ethics is the study of ethics relating to computers, covering user behavior and what computers are programmed to do, and how this affects individuals and society. It is the branch of philosophy that deals with what is considered right or wrong. Since the advent of computers, various governments have enacted regulations, and organizations have defined policies, about cyber ethics. Cyber ethics, also known as "internet ethics," is a branch of applied ethics that examines the moral, legal, and social issues, i.e. the ethical questions, brought about by the emergence of digital technologies and global virtual environments. Issues arising with the introduction of the internet include filtering, accuracy, security, censorship, conflicts over privacy, property, accessibility, and others. This paper elucidates cyber ethics and its impact on users and on society. Paul A. Adekunte | Matthew N. O. Sadiku | Janet O. Sadiku "Cyber Ethics: An Introduction" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-8 | Issue-1, February 2024, URL: https://www.ijtsrd.com/papers/ijtsrd63513.pdf Paper URL: https://www.ijtsrd.com/computer-science/computer-security/63513/cyber-ethics-an-introduction/paul-a-adekunte
Running head POLICIES FOR MANAGING PRIVACY1POLICIES FOR M.docxjeanettehully
Running head: POLICIES FOR MANAGING PRIVACY
Online Policies for Enabling Financial Companies to Manage Privacy Issues
Name: Sunil Kumar Parisa
Date:03/29/2020
University of the Cumberlands
ABSTRACT
Financial companies are under constant threat from cyber-attacks, which are growing by the day. The companies usually implement measures that focus primarily on deploying technologies to suppress the attacks; they do not consider user policies as essential elements that help curb the vulnerabilities. The policies put in place have a low level of enforceability, which lowers their impact. This research project will determine the relationship between policy enforceability and the vulnerabilities posed to a system by internal and external users.
INTRODUCTION
Business companies in the financial sector have the responsibility of ensuring that the data belonging to their customers are fully protected. Cyber-crimes are on the rise, and the approaches employed today are not entirely effective: technological tools and measures are not sufficient on their own. They should be complemented by behavioral standards that suppress the vulnerabilities in all the IT domains (Vincent, Higgs & Pinsker, 2015). Enforceable policies will ensure an integration of behavioral and technological measures for promoting data security and privacy.
LITERATURE REVIEW
Financial companies usually emphasize policies that guide the collection and storage of customer data, as well as access to the data by internal and external users. These policies are relevant as they promote best practices at both levels. The companies believe these are the areas that need closer monitoring and evaluation. However, the policies put in place are not always enforceable, and a lack of enforceability creates a situation where the desired outcomes are not realized (Yeganeh, 2019). This explains why data breaches are still experienced even after such policies are formulated and implemented.
RESEARCH METHOD
To investigate the relationship between the enforceability of policies and the vulnerabilities that business organizations are exposed to, a case study method will be used. It is an essential tool for determining a causal relationship (White & McBurney, 2012). It will also provide insights to inform recommendations for business organizations in the financial sector. Credible data that are free of confounding variables must be collected and analyzed, and inferences drawn. Two data collection procedures will be utilized, as follows.
i. Semi-structured interviews will be conducted to collect diverse data on the design and implementation of user and online policies. The interviewees will offer data that expound on the security and privacy positions of the systems.
ii. Independent observations will be made to inform the behaviors of the users, both internally and externally. The observation ...
Text Mining in Digital Libraries using OKAPI BM25 ModelEditor IJCATR
The emergence of the internet has made vast amounts of information available and easily accessible online. As a result, most libraries have digitized their content in order to remain relevant to their users and to keep pace with the advancement of the internet. However, these digital libraries have been criticized for using inefficient information-retrieval models that do not perform relevance ranking on retrieved results. This paper proposes the use of the Okapi BM25 model in text mining as a means of improving relevance ranking in digital libraries. Okapi BM25 was selected because it is a probability-based relevance-ranking algorithm. A case study was conducted, and the model design was based on information-retrieval processes. The performance of the Boolean, vector space, and Okapi BM25 models was compared for data retrieval; relevant ranked documents were retrieved and displayed on the OPAC framework search page. The results revealed that Okapi BM25 outperformed both the Boolean model and the vector space model. This paper therefore proposes using the Okapi BM25 model to weight terms according to their relative frequencies in a document, so as to improve the performance of text mining in digital libraries.
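The Okapi BM25 ranking function itself is compact enough to sketch directly. The minimal implementation below uses the common smoothed-IDF variant with the usual defaults k1 = 1.5 and b = 0.75, and represents documents as token lists; it is an illustration of the formula, not the paper's code:

```python
import math

def bm25_score(query_terms, doc, corpus, k1=1.5, b=0.75):
    """Okapi BM25 relevance score of `doc` (a token list) for the query.

    corpus: list of token lists, used for document frequencies and
    average document length. k1 and b are the common default parameters.
    """
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)          # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1)   # smoothed IDF
        tf = doc.count(term)                               # term frequency
        score += idf * tf * (k1 + 1) / (
            tf + k1 * (1 - b + b * len(doc) / avgdl))      # length-normalized TF
    return score
```

Documents that repeat a query term score higher than those that mention it once, while documents lacking the term score zero, which is exactly the relevance ranking the Boolean model cannot provide.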
Green Computing, eco trends, climate change, e-waste and eco-friendlyEditor IJCATR
This study focused on the practice of using computing resources more efficiently while maintaining or increasing overall performance. Sustainable IT services require the integration of green computing practices such as power management, virtualization, improving cooling technology, recycling, electronic waste disposal, and optimization of the IT infrastructure to meet sustainability requirements. Studies have shown that costs of power utilized by IT departments can approach 50% of the overall energy costs for an organization. While there is an expectation that green IT should lower costs and the firm’s impact on the environment, there has been far less attention directed at understanding the strategic benefits of sustainable IT services in terms of the creation of customer value, business value and societal value. This paper provides a review of the literature on sustainable IT, key areas of focus, and identifies a core set of principles to guide sustainable IT service design.
Policies for Green Computing and E-Waste in NigeriaEditor IJCATR
Computers today are an integral part of individuals' lives all around the world, but unfortunately these devices are toxic to the environment, given the materials used, their limited battery life, and technological obsolescence. Individuals are concerned about the hazardous materials ever present in computers, even if the importance they attach to various attributes differs, and a more environment-friendly attitude can be fostered through exposure to educational materials. In this paper, we delineate the problem of e-waste in Nigeria, highlight a series of measures and the advantages they herald for the country, and propose a series of action steps for developing these areas further. It is possible for Nigeria to obtain an immediate economic stimulus and job creation while moving quickly to abide by the requirements of climate-change legislation and energy-efficiency directives. The costs of implementing energy-efficiency and renewable-energy measures are minimal, as they are not cash expenditures but rather investments paid back by future, continuous energy savings.
Performance Evaluation of VANETs for Evaluating Node Stability in Dynamic Sce...Editor IJCATR
Vehicular ad hoc networks (VANETs) are a promising research area that enables interconnection among moving vehicles, and between vehicles and road-side units (RSUs). In VANETs, mobile vehicles can be organized into groups to promote interconnection links, and the group arrangement, in terms of size and geographical extent, has a serious influence on the quality of communication. VANETs are a subclass of mobile ad hoc networks with more complex mobility patterns; because of this mobility, the topology changes very frequently, which raises a number of technical challenges, including the stability of the network. There is therefore a need for a group configuration that yields a more stable, realistic network. This paper investigates various simulation scenarios in which clusters are generated using the k-means algorithm and their number is varied to find the most stable configuration in a realistic road scenario.
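The k-means clustering step the paper relies on can be sketched as follows for 2-D vehicle positions. This plain-Python version (random initial centroids, fixed iteration count) only illustrates the technique, not the authors' simulation setup:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on 2-D points: assign each vehicle position to the
    nearest centroid, then move each centroid to its cluster's mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # pick k distinct points as seeds
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                        # assignment step
            i = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2
                                            + (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        centroids = [                           # update step (keep empty ones)
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters
```

On two well-separated platoons of vehicles, the algorithm recovers the platoons regardless of which points seed the centroids; varying `k` as the paper does then lets one compare the stability of different cluster counts.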
Optimum Location of DG Units Considering Operation ConditionsEditor IJCATR
The optimal sizing and placement of Distributed Generation units (DG) are becoming very attractive to researchers these days. In this paper a two stage approach has been used for allocation and sizing of DGs in distribution system with time varying load model. The strategic placement of DGs can help in reducing energy losses and improving voltage profile. The proposed work discusses time varying loads that can be useful for selecting the location and optimizing DG operation. The method has the potential to be used for integrating the available DGs by identifying the best locations in a power system. The proposed method has been demonstrated on 9-bus test system.
Analysis of Comparison of Fuzzy Knn, C4.5 Algorithm, and Naïve Bayes Classifi...Editor IJCATR
Early detection of diabetes mellitus (DM) can prevent or inhibit complications. There are several laboratory tests that must be done to detect DM, and the results of these tests are converted into training data. The training data used in this study were generated from the UCI Pima database, with 6 attributes used to classify diabetes as positive or negative. Of the various classification methods in common use, three were compared in this study on one identical case: fuzzy KNN, the C4.5 algorithm, and the Naïve Bayes Classifier (NBC). The objective of this study was to create software to classify DM using the tested methods and to compare the three methods on accuracy, precision, and recall. The results showed that the best method was fuzzy KNN, with average and maximum accuracy reaching 96% and 98%, respectively. In second place, NBC had average and maximum accuracy of 87.5% and 90%, respectively. Lastly, the C4.5 algorithm had average and maximum accuracy of 79.5% and 86%, respectively.
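The three comparison criteria (accuracy, precision, and recall) follow directly from the confusion-matrix counts. A minimal sketch, assuming binary labels with 1 meaning diabetic:

```python
def evaluate(y_true, y_pred, positive=1):
    """Accuracy, precision, and recall for a binary classifier.

    precision = TP / (TP + FP): how many predicted positives are real.
    recall    = TP / (TP + FN): how many real positives are found.
    """
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)
    fp = sum(t != positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    acc = sum(t == p for t, p in pairs) / len(pairs)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return acc, prec, rec
```

Reporting all three matters here because the Pima dataset is class-imbalanced, so accuracy alone can flatter a classifier that under-detects the positive (diabetic) class.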
Web Scraping for Estimating new Record from Source SiteEditor IJCATR
Research in the field of competitive intelligence and research in the field of web scraping have a mutually symbiotic relationship. In today's information age, websites serve as a main data source. This research focuses on how to get data from websites and how to slow down the intensity of the downloads. One problem is that source websites are autonomous, so the structure of their content is vulnerable to change at any time; another is that the Snort intrusion detection system installed on the server may detect the crawler bot. The researchers therefore propose using the Mining Data Records (MDR) method together with exponential smoothing, so that the crawler is adaptive to changes in the content structure and fetches automatically, following the pattern of news occurrences. In the tests, with a threshold of 0.3 for MDR and a similarity threshold score of 0.65 for STM, recall and precision values produce an average f-measure of 92.6%. The exponential smoothing estimate with alpha = 0.5 produces an MAE of 18.2 duplicate data records, slowing the fixed download/fetch schedule from 21.8 down to 3.6 duplicate data records in the average time between news occurrences.
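Single exponential smoothing, as used here to adapt the fetch schedule, is the recurrence s_t = alpha * x_t + (1 - alpha) * s_(t-1). A minimal sketch (illustrative only, with the paper's alpha = 0.5 as the default):

```python
def exp_smooth(series, alpha=0.5):
    """Single exponential smoothing of a numeric series.

    s_0 = x_0; s_t = alpha * x_t + (1 - alpha) * s_(t-1).
    Returns the full smoothed series; the last value is the one-step
    forecast used to schedule the next fetch.
    """
    s = series[0]
    out = [s]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out
```

Applied to the observed intervals between new articles, the final smoothed value forecasts the next interval, so the crawler fetches roughly when new records are expected instead of on a fixed schedule.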
Evaluating Semantic Similarity between Biomedical Concepts/Classes through S...Editor IJCATR
Most existing semantic similarity measures that use ontology structure as their primary source can measure semantic similarity between concepts/classes using a single ontology. Ontology-based semantic similarity techniques, including structure-based techniques (the Path Length measure, Wu and Palmer's measure, and Leacock and Chodorow's measure), information-content-based techniques (Resnik's measure and Lin's measure), and biomedical-domain ontology techniques (Al-Mubaid and Nguyen's measure, SemDist), were evaluated relative to human experts' ratings and compared on sets of concepts using the ICD-10 "V1.0" terminology within the UMLS. The experimental results validate the efficiency of the SemDist technique in a single ontology, and demonstrate that SemDist, compared with the existing techniques, gives the best overall correlation with experts' ratings.
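Of the structure-based measures compared, Wu and Palmer's is easy to sketch: sim(a, b) = 2 * depth(LCS(a, b)) / (depth(a) + depth(b)), where LCS is the lowest common subsumer in the taxonomy. The toy parent-pointer taxonomy in the test below is our own illustration, not ICD-10 data:

```python
def ancestors(node, parent):
    """Chain from node up to the root (node itself included)."""
    chain = [node]
    while node in parent:
        node = parent[node]
        chain.append(node)
    return chain

def depth(node, parent):
    """1-based depth of node (root has depth 1)."""
    return len(ancestors(node, parent))

def wu_palmer(a, b, parent):
    """Wu & Palmer similarity: 2*depth(LCS) / (depth(a) + depth(b)).

    The first ancestor of `a` that also subsumes `b` is the lowest
    common subsumer, since ancestors() walks upward from the leaf.
    """
    common = set(ancestors(b, parent))
    lcs = next(n for n in ancestors(a, parent) if n in common)
    return 2 * depth(lcs, parent) / (depth(a, parent) + depth(b, parent))
```

Concepts sharing a deep common subsumer score close to 1, while concepts that only meet near the root score low, which is the intuition all the structure-based measures in the survey build on.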
Semantic Similarity Measures between Terms in the Biomedical Domain within f...Editor IJCATR
Techniques and tests are tools used to define how to measure the goodness of an ontology or its resources. Measuring the similarity between biomedical classes/concepts is an important task for biomedical information extraction and knowledge discovery, and most semantic similarity techniques can be adapted for use in the biomedical domain (UMLS). Many experiments have been conducted to check the applicability of these measures. In this paper, we investigate measuring the semantic similarity between two terms within a single ontology or across multiple ontologies, using ICD-10 "V1.0" as the primary source, and compare our results to human experts' scores using the correlation coefficient.
A Strategy for Improving the Performance of Small Files in Openstack Swift Editor IJCATR
Adding an aggregate storage module is an effective way to improve the storage access performance of small files in OpenStack Swift. Because Swift incurs heavy disk operations when querying metadata, its transfer performance for large numbers of small files is low. In this paper, we propose an aggregated storage strategy (ASS) and implement it in Swift. ASS comprises two parts: merge storage and index storage. In the first stage, ASS arranges the write-request queue in chronological order and then stores objects in volumes; these volumes are the large files actually stored in Swift. In the second stage, the object-to-volume mapping information is stored in a key-value store. The experimental results show that ASS can effectively improve Swift's small-file transfer performance.
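The merge-and-index idea behind ASS can be sketched with a toy in-memory version: small objects are appended to one large volume, and each object is located later through a key-to-(offset, length) index. The class below is our own illustration, with an in-memory buffer and a plain dict standing in for Swift's volume files and the key-value store:

```python
import io

class VolumeStore:
    """Toy aggregate store: append small objects into one 'volume' buffer
    and keep a key -> (offset, length) index for retrieval."""

    def __init__(self):
        self.volume = io.BytesIO()   # stands in for one large volume file
        self.index = {}              # stands in for the key-value store

    def put(self, key, data):
        """Append `data` to the end of the volume and record its location."""
        offset = self.volume.seek(0, io.SEEK_END)
        self.volume.write(data)
        self.index[key] = (offset, len(data))

    def get(self, key):
        """One index lookup plus one ranged read, instead of per-file metadata I/O."""
        offset, length = self.index[key]
        self.volume.seek(offset)
        return self.volume.read(length)
```

The point of the design is visible even in the toy: retrieving a small object costs one dictionary lookup and one ranged read of the big volume, rather than a separate metadata query and file open per object.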
Integrated System for Vehicle Clearance and RegistrationEditor IJCATR
Efficient management and control of a government's cash resources rely on government banking arrangements. Nigeria, like many low-income countries, employed fragmented systems in handling government receipts and payments. In 2016, Nigeria implemented a unified structure, as recommended by the IMF, in which all government funds are collected in one account, which would reduce borrowing costs, extend credit, and improve the government's fiscal policy, among other benefits. This situation motivated us to embark on this research to design and implement an integrated system for vehicle clearance and registration. The system complies with the new Treasury Single Account policy to enable proper interaction and collaboration among the five agencies (NCS, FRSC, SBIR, VIO and NPF) saddled with vehicular administration and activities in Nigeria. Since the system is web-based, the Object-Oriented Hypermedia Design Methodology (OOHDM) is used, with tools such as PHP, JavaScript, CSS, HTML, AJAX, and other web development technologies. The result is a web-based system that gives proper information about a vehicle, from the exact date of importation to registration and licence renewal. Vehicle owner information, customs duty information, plate-number registration details, etc., can be efficiently retrieved from the system by any of the agencies without contacting the others at any point in time. The number plate will also no longer be the only means of vehicle identification, as is presently the case in Nigeria: on payment of duty, the unified system automatically generates and assigns a Unique Vehicle Identification Pin Number (UVIPN) to the vehicle, and the UVIPN is linked to the various agencies in the management information system.
Assessment of the Efficiency of Customer Order Management System: A Case Stu...Editor IJCATR
The Supermarket Management System deals with the automation of buying and selling of good and services. It includes both sales and purchase of items. The project Supermarket Management System is to be developed with the objective of making the system reliable, easier, fast, and more informative.
Energy-Aware Routing in Wireless Sensor Network Using Modified Bi-Directional A*Editor IJCATR
Energy is a key component in the Wireless Sensor Network (WSN)[1]. The system will not be able to run according to its function without the availability of adequate power units. One of the characteristics of wireless sensor network is Limitation energy[2]. A lot of research has been done to develop strategies to overcome this problem. One of them is clustering technique. The popular clustering technique is Low Energy Adaptive Clustering Hierarchy (LEACH)[3]. In LEACH, clustering techniques are used to determine Cluster Head (CH), which will then be assigned to forward packets to Base Station (BS). In this research, we propose other clustering techniques, which utilize the Social Network Analysis approach theory of Betweeness Centrality (BC) which will then be implemented in the Setup phase. While in the Steady-State phase, one of the heuristic searching algorithms, Modified Bi-Directional A* (MBDA *) is implemented. The experiment was performed deploy 100 nodes statically in the 100x100 area, with one Base Station at coordinates (50,50). To find out the reliability of the system, the experiment to do in 5000 rounds. The performance of the designed routing protocol strategy will be tested based on network lifetime, throughput, and residual energy. The results show that BC-MBDA * is better than LEACH. This is influenced by the ways of working LEACH in determining the CH that is dynamic, which is always changing in every data transmission process. This will result in the use of energy, because they always doing any computation to determine CH in every transmission process. In contrast to BC-MBDA *, CH is statically determined, so it can decrease energy usage.
Security in Software Defined Networks (SDN): Challenges and Research Opportun...Editor IJCATR
In networks, the rapidly changing traffic patterns of search engines, Internet of Things (IoT) devices, Big Data and data centers has thrown up new challenges for legacy; existing networks; and prompted the need for a more intelligent and innovative way to dynamically manage traffic and allocate limited network resources. Software Defined Network (SDN) which decouples the control plane from the data plane through network vitalizations aims to address these challenges. This paper has explored the SDN architecture and its implementation with the OpenFlow protocol. It has also assessed some of its benefits over traditional network architectures, security concerns and how it can be addressed in future research and related works in emerging economies such as Nigeria.
Measure the Similarity of Complaint Document Using Cosine Similarity Based on...Editor IJCATR
Report handling on "LAPOR!" (Laporan, Aspirasi dan Pengaduan Online Rakyat) system depending on the system administrator who manually reads every incoming report [3]. Read manually can lead to errors in handling complaints [4] if the data flow is huge and grows rapidly, it needs at least three days to prepare a confirmation and it sensitive to inconsistencies [3]. In this study, the authors propose a model that can measure the identities of the Query (Incoming) with Document (Archive). The authors employed Class-Based Indexing term weighting scheme, and Cosine Similarities to analyse document similarities. CoSimTFIDF, CoSimTFICF and CoSimTFIDFICF values used in classification as feature for K-Nearest Neighbour (K-NN) classifier. The optimum result evaluation is pre-processing employ 75% of training data ratio and 25% of test data with CoSimTFIDF feature. It deliver a high accuracy 84%. The k = 5 value obtain high accuracy 84.12%
Hangul Recognition Using Support Vector MachineEditor IJCATR
The recognition of Hangul Image is more difficult compared with that of Latin. It could be recognized from the structural arrangement. Hangul is arranged from two dimensions while Latin is only from the left to the right. The current research creates a system to convert Hangul image into Latin text in order to use it as a learning material on reading Hangul. In general, image recognition system is divided into three steps. The first step is preprocessing, which includes binarization, segmentation through connected component-labeling method, and thinning with Zhang Suen to decrease some pattern information. The second is receiving the feature from every single image, whose identification process is done through chain code method. The third is recognizing the process using Support Vector Machine (SVM) with some kernels. It works through letter image and Hangul word recognition. It consists of 34 letters, each of which has 15 different patterns. The whole patterns are 510, divided into 3 data scenarios. The highest result achieved is 94,7% using SVM kernel polynomial and radial basis function. The level of recognition result is influenced by many trained data. Whilst the recognition process of Hangul word applies to the type 2 Hangul word with 6 different patterns. The difference of these patterns appears from the change of the font type. The chosen fonts for data training are such as Batang, Dotum, Gaeul, Gulim, Malgun Gothic. Arial Unicode MS is used to test the data. The lowest accuracy is achieved through the use of SVM kernel radial basis function, which is 69%. The same result, 72 %, is given by the SVM kernel linear and polynomial.
Application of 3D Printing in EducationEditor IJCATR
This paper provides a review of literature concerning the application of 3D printing in the education system. The review identifies that 3D Printing is being applied across the Educational levels [1] as well as in Libraries, Laboratories, and Distance education systems. The review also finds that 3D Printing is being used to teach both students and trainers about 3D Printing and to develop 3D Printing skills.
Survey on Energy-Efficient Routing Algorithms for Underwater Wireless Sensor ...Editor IJCATR
In underwater environment, for retrieval of information the routing mechanism is used. In routing mechanism there are three to four types of nodes are used, one is sink node which is deployed on the water surface and can collect the information, courier/super/AUV or dolphin powerful nodes are deployed in the middle of the water for forwarding the packets, ordinary nodes are also forwarder nodes which can be deployed from bottom to surface of the water and source nodes are deployed at the seabed which can extract the valuable information from the bottom of the sea. In underwater environment the battery power of the nodes is limited and that power can be enhanced through better selection of the routing algorithm. This paper focuses the energy-efficient routing algorithms for their routing mechanisms to prolong the battery power of the nodes. This paper also focuses the performance analysis of the energy-efficient algorithms under which we can examine the better performance of the route selection mechanism which can prolong the battery power of the node
Comparative analysis on Void Node Removal Routing algorithms for Underwater W...Editor IJCATR
The designing of routing algorithms faces many challenges in underwater environment like: propagation delay, acoustic channel behaviour, limited bandwidth, high bit error rate, limited battery power, underwater pressure, node mobility, localization 3D deployment, and underwater obstacles (voids). This paper focuses the underwater voids which affects the overall performance of the entire network. The majority of the researchers have used the better approaches for removal of voids through alternate path selection mechanism but still research needs improvement. This paper also focuses the architecture and its operation through merits and demerits of the existing algorithms. This research article further focuses the analytical method of the performance analysis of existing algorithms through which we found the better approach for removal of voids
Decay Property for Solutions to Plate Type Equations with Variable CoefficientsEditor IJCATR
In this paper we consider the initial value problem for a plate type equation with variable coefficients and memory in
1 n R n ), which is of regularity-loss property. By using spectrally resolution, we study the pointwise estimates in the spectral
space of the fundamental solution to the corresponding linear problem. Appealing to this pointwise estimates, we obtain the global
existence and the decay estimates of solutions to the semilinear problem by employing the fixed point theorem
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptxR&R Consult
CFD analysis is incredibly effective at solving mysteries and improving the performance of complex systems!
Here's a great example: At a large natural gas-fired power plant, where they use waste heat to generate steam and energy, they were puzzled that their boiler wasn't producing as much steam as expected.
R&R and Tetra Engineering Group Inc. were asked to solve the issue with reduced steam production.
An inspection had shown that a significant amount of hot flue gas was bypassing the boiler tubes, where the heat was supposed to be transferred.
R&R Consult conducted a CFD analysis, which revealed that 6.3% of the flue gas was bypassing the boiler tubes without transferring heat. The analysis also showed that the flue gas was instead being directed along the sides of the boiler and between the modules that were supposed to capture the heat. This was the cause of the reduced performance.
Based on our results, Tetra Engineering installed covering plates to reduce the bypass flow. This improved the boiler's performance and increased electricity production.
It is always satisfying when we can help solve complex challenges like this. Do your systems also need a check-up or optimization? Give us a call!
Work done in cooperation with James Malloy and David Moelling from Tetra Engineering.
More examples of our work https://www.r-r-consult.dk/en/cases-en/
Final project report on grocery store management system..pdfKamal Acharya
In today’s fast-changing business environment, it’s extremely important to be able to respond to client needs in the most effective and timely manner. If your customers wish to see your business online and have instant access to your products or services.
Online Grocery Store is an e-commerce website, which retails various grocery products. This project allows viewing various products available enables registered users to purchase desired products instantly using Paytm, UPI payment processor (Instant Pay) and also can place order by using Cash on Delivery (Pay Later) option. This project provides an easy access to Administrators and Managers to view orders placed using Pay Later and Instant Pay options.
In order to develop an e-commerce website, a number of Technologies must be studied and understood. These include multi-tiered architecture, server and client-side scripting techniques, implementation technologies, programming language (such as PHP, HTML, CSS, JavaScript) and MySQL relational databases. This is a project with the objective to develop a basic website where a consumer is provided with a shopping cart website and also to know about the technologies used to develop such a website.
This document will discuss each of the underlying technologies to create and implement an e- commerce website.
Immunizing Image Classifiers Against Localized Adversary Attacksgerogepatton
This paper addresses the vulnerability of deep learning models, particularly convolutional neural networks
(CNN)s, to adversarial attacks and presents a proactive training technique designed to counter them. We
introduce a novel volumization algorithm, which transforms 2D images into 3D volumetric representations.
When combined with 3D convolution and deep curriculum learning optimization (CLO), itsignificantly improves
the immunity of models against localized universal attacks by up to 40%. We evaluate our proposed approach
using contemporary CNN architectures and the modified Canadian Institute for Advanced Research (CIFAR-10
and CIFAR-100) and ImageNet Large Scale Visual Recognition Challenge (ILSVRC12) datasets, showcasing
accuracy improvements over previous techniques. The results indicate that the combination of the volumetric
input and curriculum learning holds significant promise for mitigating adversarial attacks without necessitating
adversary training.
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)MdTanvirMahtab2
This presentation is about the working procedure of Shahjalal Fertilizer Company Limited (SFCL). A Govt. owned Company of Bangladesh Chemical Industries Corporation under Ministry of Industries.
Hierarchical Digital Twin of a Naval Power SystemKerry Sado
A hierarchical digital twin of a Naval DC power system has been developed and experimentally verified. Similar to other state-of-the-art digital twins, this technology creates a digital replica of the physical system executed in real-time or faster, which can modify hardware controls. However, its advantage stems from distributing computational efforts by utilizing a hierarchical structure composed of lower-level digital twin blocks and a higher-level system digital twin. Each digital twin block is associated with a physical subsystem of the hardware and communicates with a singular system digital twin, which creates a system-level response. By extracting information from each level of the hierarchy, power system controls of the hardware were reconfigured autonomously. This hierarchical digital twin development offers several advantages over other digital twins, particularly in the field of naval power systems. The hierarchical structure allows for greater computational efficiency and scalability while the ability to autonomously reconfigure hardware controls offers increased flexibility and responsiveness. The hierarchical decomposition and models utilized were well aligned with the physical twin, as indicated by the maximum deviations between the developed digital twin hierarchy and the hardware.
Welcome to WIPAC Monthly the magazine brought to you by the LinkedIn Group Water Industry Process Automation & Control.
In this month's edition, along with this month's industry news to celebrate the 13 years since the group was created we have articles including
A case study of the used of Advanced Process Control at the Wastewater Treatment works at Lleida in Spain
A look back on an article on smart wastewater networks in order to see how the industry has measured up in the interim around the adoption of Digital Transformation in the Water Industry.
Integration of Bayesian Theory and Association Rule Mining in Predicting User’s Browsing Activities – Survey Paper
International Journal of Computer Applications Technology and Research, Volume 4, Issue 10, pp. 743–749, 2015, ISSN: 2319–8656, www.ijcat.com
Geoffrey Gitonga, Wilson Cheruiyot, Waweru Mwangi
Department of Computing, School of Computing and Information Technology, Jomo Kenyatta University of Agriculture & Technology, Nairobi, Kenya
Abstract: Bayesian theory and association rule mining are artificial intelligence techniques that have been used in various computing fields, especially in machine learning. The internet is considered an easy breeding ground for vices such as radicalization because of its diverse nature and ease of information access. These vices could be managed using recommender system methods, which deliver users' preference data based on their previous interests and on the community around the user. Recommender systems fall into two broad categories: collaborative systems, which consider users who share the same preferences as the user in question, and content-based systems, which recommend websites similar to those the user already likes. Recent research and information from security organs indicate that online radicalization has been growing at an alarming rate. This paper reviews in depth what has been done in recommender systems and examines how these methods could be combined to form a strong system for monitoring and managing the online menace that results from radicalization. The relationship between different websites, and the trend formed by continuous access to them, provides the basis for probabilistic reasoning about user behavior. Association rule mining has been widely used in recommender systems for profiling and generating users' preferences. To add probabilistic reasoning, given the magnitude of the internet and of social media in particular, Bayesian theory is incorporated. Combining these two techniques provides better analysis of the results, thereby adding reliability and knowledge to them.
Keywords: Bayesian; mining; theory; association; intelligence; browsing
1. INTRODUCTION
The use of the internet has grown tremendously over the years, accelerated by the ease of internet access. This has led to a growing dependence on the internet for chores like business and work, and for social activities like sharing information, files and opinions using applications such as Facebook, Twitter and Friendster. As a result of this enormous development of the internet and web-based applications, new vices related to internet usage, such as radicalization, have been introduced that were not considered in the internet usage and accessibility campaign. Online radicalization is a process by which internet users are brainwashed into accepting extreme religious, social or political ideologies. Radicalization has in turn led to other internet vices such as cybercrime, child trafficking and even hacking.
Due to internet development and the increase in the number of users, recommender systems for web applications were introduced to anticipate users' preferences in terms of content and information, based on their personalized content. This relies on the user's interaction with the system: the information is analyzed over time to determine preferences. Preferences can also be determined from similarities discovered between the preferences of different user groups (community opinion), where the trend serves as the basis for recommendation. Recommender systems are mainly based on two categories: collaborative and content-based filtering methods.
Using this idea, internet menaces could also be curbed by understanding users' online activities. Radicalization does not happen overnight, so with accumulated information about a user's browsing activities, a system would be able to identify and relate the trend that leads to radicalization. The relationships between websites and user behavior are analyzed in order to identify trends in internet usage. The importance of a website to a user is considered based on time spent and frequency of access, in order to determine the probability of radicalization using Bayesian theory and association rule mining methods.
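As a minimal sketch of the probabilistic reasoning described above, Bayes' theorem can update the probability that a user fits an at-risk profile given frequent access to a flagged website. All probabilities and variable names below are invented for illustration; they are not figures from the surveyed literature.

```python
# Hypothetical illustration of Bayes' theorem applied to browsing evidence.
# Every probability here is an invented example value.

def bayes_update(prior, likelihood, evidence):
    """Posterior P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# P(at_risk): assumed prior probability a user fits an at-risk profile
prior = 0.02
# P(frequent_access | at_risk): flagged sites accessed often by at-risk users
likelihood = 0.60
# P(frequent_access): overall rate of frequent access to flagged sites
evidence = 0.05

posterior = bayes_update(prior, likelihood, evidence)
print(round(posterior, 2))  # -> 0.24
```

Even with a small prior, strong evidence (frequent access that is rare overall) raises the posterior substantially, which is the intuition behind using accumulated browsing trends rather than single visits.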
The following sections review what has been done by other scholars in this field, concentrating on recommender system techniques and how they could be applied to curbing the menace associated with the use of web applications.
2. THEORETICAL REVIEW
Users' activities on websites are guided by principles of ethics. Users tend to misuse online sites knowingly, ending up radicalized as a result of continuous use of some specific sites. Ethics has been defined in various ways by different scholars. According to Mohd & Sembok [7], ethics are moral values that help guide human behavior, actions, and choices. They can be described in two ways, normative and prescriptive. In the normative description, ethics are well-based standards of right and wrong that suggest what humans ought to do,
usually in terms of rights, obligations, benefits to society, fairness, and specific virtues. Ethics also involves values related to the virtues of honesty, compassion, and loyalty. In the prescriptive description, ethics refers to the study and nurturing of personal ethical standards, as well as community ethics, in terms of behaviour, feelings, laws, and social habits and norms, which can deviate from more universal ethical standards [7].
Hilty [5] also defines ethics in two ways: one as philosophical reflection, and the other as more practical governing principles, i.e. the principles of moral conduct governing an individual or a group.
From the above definitions it can be concluded that ethics are standards of right and wrong as perceived by human beings in the use of technology. What is perceived as wrong in the human eye remains wrong as long as the person in question perceives it so; it does not have to be codified in government laws or standards to be seen as such.
In relation to technology, and more so the internet, many issues arise as a result of technological development. These can be good or bad depending on the situation and the use of information. Social sites like Myspace, Facebook, Friendster and Twitter enable users to learn something new. In Friendster, for example, students can share comments, pictures and ideas, which can be beneficial or negative depending on the perception of the user; negative influences include addiction to pornography, radicalization and even exposure to fraud [12]. Ethical issues in Information Technology (IT) involve three "rights":

Right to know: the extent to which information can be accessed from IT infrastructure.

Property right: the right to have IT facilities protected; for example, users can protect their computers from viruses.

Right to privacy: the right to the user's privacy; every technology user should be responsible for protecting their information, such as passwords, from being used by other people.

As much as these rights guarantee freedom of information access and protection, some people, using skills acquired in a bid to enforce these rights, violate them. They therefore end up in fraud, radicalization or even hacking.
Another challenge facing the industry, especially in academic institutions, is the heterogeneous society that makes it impossible to come up with umbrella technology ethical standards that can be used across the board. In higher learning institutions there are many people of different age groups, classes, genders and affiliations. People of different occupations, age groups and educational backgrounds, living in different countries and representing a variety of cultures, use resources available through the global computer network. This complicates the problem of developing universal standards of behaviour and a system of ethical norms that could be widely recognized on the World Wide Web, even though many believe such a system is really needed today, as mentioned by Gordon & Johnson [4]. This heterogeneous society also brings along vices that are shared among the users, affecting them in the long run.
According to Hilty [5], much has been done in terms of enforcing IT ethics, including the adoption of the "ACM Code of Ethics and Professional Conduct" by the ACM Council in 1992, and IFIP's introduction of SIG 9.2.2, its Special Interest Group on a Framework for Ethics of Computing. He also notes that there have been many challenges for IT ethics for over a decade, for example in privacy protection, data security, trust in complex systems, online communication, freedom of speech, intellectual property and sustainable development.
As much as technology was introduced into learning institutions over a decade ago, concentration has been on the application of technology and less on ethics and its effects. As mentioned by Mohd & Sembok [7], technology ethics has been negatively affected mainly because of a lack of awareness of information security issues, the rapidly evolving complexity of systems that are difficult to implement and operate, the ease with which one can access and reach information and communication technology, the anonymity provided by these technologies, and the transnational nature of communication networks. Cybercrime and online radicalization are global ethical issues that have not been fully resolved despite government regulations and policies. Even where awareness is growing and where legislation may be adequate, the capacity to use information security technologies and related procedures, and to protect against, detect and respond effectively to cybercrime and online radicalization, is low.
Globally, internet communications cut across territorial boundaries, creating a new realm of human activity and undermining the feasibility and legitimacy of applying laws based on geographic boundaries. The new boundaries, manifested in monitor screens, firewalls, passwords, intrusion detection, and virus busters, have created new personalities, groups, organizations, and other new forms of social, economic, and political groupings in the cyber world of bits. Traditional border-based law-making and law-enforcing authorities find this new environment of cyber boundaries very challenging.
The following are some of the main malpractice issues that have not been adequately addressed by the measures currently in place, as mentioned by Mohd & Sembok [7]:

Cybercrime: Research conducted by the Computer Security Institute (CSI, 2003) in the Asia-Pacific region indicates that the threat from computer crime and information security breaches persists, resulting in financial losses of 50–60% between 2002 and 2004.

Pornography: The internet brings great benefits, but also enormous risks to children. Among the worst forms of cybercrime is the global trade in child pornography. International criminal investigations have been trying to handle this situation, but with a very small margin of success.

Piracy: In early 2002, International Planning and Research Corporation (IPR) completed an analysis of software piracy for the year 2001 as part of an ongoing study for the Business Software Alliance (BSA) and its member companies. The world piracy rate increased in two consecutive years, 2000 and 2001. The 2001 piracy rate of 40% is a marked increase from 37% in 2000. Both years were up from the low set in
1999 at 36%. Since then, the rate of piracy has continued to increase over the years.
Neumann & Stevens [10] also mention online radicalization as one of the biggest challenges that security institutions have not been able to handle. They define radicalization as "the process (or processes) whereby individuals or groups come to approve of and (ultimately) participate in the use of violence for political aims". Most policymakers and scholars have only a general idea that the internet is the main breeding ground for radicalization, and only the most cursory idea of how it works.

The online platform therefore forms an ideal, cheap ground for extremist organizations to sell their ideals. According to Neumann & Stevens [10], the internet can be used in three main ways by extremists:

To illustrate and communicate their ideological messages and/or narratives.

To provide a risk-free platform for potential recruits to find like-minded people, and to form a network of the like-minded.

To create a new social platform in which unacceptable views and behavior are normalized that would otherwise have been rejected. "Surrounded by other radicals, the internet becomes a virtual 'echo chamber' in which the most extreme ideas and suggestions receive the most encouragement and support."
Olumoye [11] suggested the following measures that could curb the negative effects of technology:

The society must ensure each person is accountable for everything he or she does, no matter how inexplicable his or her action may appear.

Since there are growing complexities of ethical and social issues that revolve around multiple breaches, it becomes imperative for educators and computer professional bodies to develop curricula on ethical and professional codes of conduct in the information society.

There is a need to lay emphasis on information systems security controls.

The government should develop comprehensive laws and legislation to create awareness of compliance requirements that affect information systems professionals.

Law enforcement agents should be more sophisticated in their computer crime investigations. This can be enhanced with the use of computer forensics, a formal investigative technique used in evaluating digital information for judicial review.
The use of intelligence techniques has of late taken centre stage, especially in web-based applications, for identifying and categorizing ideas based on users' preferences. According to Mukhopadhyay, Vij, & Wanaskar [8], recommender systems are tools used to filter, sort and order items and data based on the opinions of users or a community, helping users determine content of interest from an otherwise overwhelming set of choices. Two algorithms became very popular: collaborative filtering and content-based filtering.
Content-based recommender systems work with individual profiling of users from the very beginning. A profile holds biographic data about a user and preferences based on items the user has rated. In the recommendation process, the system compares items the user has already rated positively with items that were not rated and looks for similarities, mainly using tags or keywords. In this case the profiles of other users are not essential.
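The tag-based comparison described above can be sketched with a simple set-overlap (Jaccard) similarity. The site names and tags below are invented for illustration; real systems would use richer feature vectors.

```python
# Sketch of content-based matching via tag overlap (Jaccard similarity).
# Site names and tag sets are hypothetical example data.

def jaccard(a, b):
    """Similarity of two tag sets: |A & B| / |A | B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# One positively rated site, and two unrated candidates
liked = {"site_a": {"news", "politics", "opinion"}}
candidates = {
    "site_b": {"news", "politics", "sport"},
    "site_c": {"cooking", "travel"},
}

# Rank unrated candidates by similarity to the liked item's tags
ranked = sorted(candidates,
                key=lambda s: jaccard(liked["site_a"], candidates[s]),
                reverse=True)
print(ranked)  # -> ['site_b', 'site_c']
```

Only the target user's own ratings and the items' tags are consulted, which is exactly why other users' profiles are not needed in the content-based approach.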
The collaborative filtering method involves searching for and locating users in a community who share item preferences based on their common online browsing habits. If two or more users have similar or closely similar rated items, they are considered to have similar preferences. A user gets recommendations for items that he or she has not rated before, but that were already positively rated by users in his or her neighbourhood [8].
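A minimal user-based collaborative filtering sketch of this neighbourhood idea follows. The users, sites, ratings and the simple "shared positive ratings" similarity are all invented for illustration; production systems typically use correlation or cosine similarity over large rating matrices.

```python
# Minimal user-based collaborative filtering sketch; all data is invented.

ratings = {
    "alice": {"site_a": 5, "site_b": 4},
    "bob":   {"site_a": 5, "site_b": 4, "site_c": 5},
    "carol": {"site_d": 2},
}

def shared_agreement(u, v):
    """Count items both users rated positively (>= 4) - a toy similarity."""
    common = set(ratings[u]) & set(ratings[v])
    return sum(1 for i in common if ratings[u][i] >= 4 and ratings[v][i] >= 4)

def recommend(user):
    """Suggest items liked by the most similar neighbour but unseen by user."""
    neighbours = [v for v in ratings if v != user]
    best = max(neighbours, key=lambda v: shared_agreement(user, v))
    return [i for i, r in ratings[best].items()
            if r >= 4 and i not in ratings[user]]

print(recommend("alice"))  # -> ['site_c']
```

Alice and Bob agree on two positively rated sites, so Bob's remaining liked item is recommended to Alice; Carol shares nothing and contributes no recommendation.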
Mukhopadhyay, Vij, & Wanaskar [8] identified the following
challenges affecting recommender systems;
1. Cold-start: This occurs when providing recommendations
to new users. The system is not able to recommend
preferences to new users as their profiles are almost empty.
Cold-start also affects new items that have not been captured
by the system and haven’t been rated before.
2. Trust: Users with a short history cannot be given the same
consideration as those with a rich profile history, so the
issue of trust arises when evaluating a particular customer's ratings.
3. Scalability: The number of users continues to grow
tremendously and therefore, the system needs more resources
for processing information and forming recommendations.
The majority of resources are utilized during the determination of
users' preferences.
4. Sparsity: This refers to the problem of lack of information.
In online shops that have a huge amount of users and items
there are some users who rate just a few items.
5. Privacy: In order to receive the most accurate and correct
recommendation, the system must acquire as much information
as possible about the user, including demographic
data and data about the location of a particular user.
Naturally, the question of reliability, security and
confidentiality of the given information arises. Many online
shops offer effective protection of privacy of the users by
utilizing specialized algorithms and programs.
This idea could be extended to profiling users' interests,
thereby placing the system in a position to determine and predict the kind of
person a user is based on his/her online activities. These
systems are developed using intelligence techniques such as
association rule mining, Bayesian theory, cluster analysis and
reinforcement learning, among other techniques.
For example, as stated by Mukhopadhyay,
Vij, & Wanaskar [8], an upgraded association rule mining
method can be used in recommender systems because it is
scalable and gives high precision in results determination.
Using this method, the weight of the web page is given in
binary form to pages that are visited to find whether the page
is present or not. This method assumes that if the web page is
visited by the user, that page is considered important
specifically to that user. However, not all the pages visited by
the user are of interest. Some users may visit a page looking
for useful information but it may not have what the user is
looking for. Therefore, factors like time spent by the user and
visiting frequency of the page are considered during web page
4. International Journal of Computer Applications Technology and Research
Volume 4– Issue 10, 743 - 749, 2015, ISSN: 2319–8656
www.ijcat.com 746
calculation. This idea could also be used in identifying web
pages of interest to the user in order to understand users'
browsing behaviour, thereby predicting their personalities.
This could then be used to curb vices such as cybercrime
and radicalization, among others.
Another technique highlighted by different scholars in
recommender systems and data mining, especially in
probability theory and statistics, is Bayes' theorem, also
referred to as Bayes' rule. It is a result of importance in
the mathematical manipulation of conditional probabilities.
Bayes rule can be derived from more basic axioms of
probability, specifically conditional probability. This has also
been extended to machine learning.
According to Tipping [13] when applied, the probabilities
involved in Bayes' theorem may have any of a number
of probability interpretations. In one of these interpretations,
the theorem is used directly as part of a particular approach
to statistical inference. In particular, with the Bayesian
interpretation of probability, the theorem expresses how a
subjective degree of belief should rationally change to account
for evidence: this is Bayesian inference, which is fundamental
to Bayesian statistics. However, Bayes' theorem has
applications in a wide range of calculations involving
probabilities, not just in Bayesian inference.
Mathematically, Bayes' theorem gives the relationship
between the probabilities of A and B, P(A) and P(B), and
the conditional
probabilities of A given B and B given A, P(A|B) and P(B|A).
In its most common form, it is:

P(A|B) = P(B|A) P(A) / P(B)

The meaning of this statement depends on the interpretation
of probability ascribed to its terms [13].
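As a minimal numeric illustration of Bayes' theorem, P(A|B) = P(B|A)P(A)/P(B), the denominator P(B) can be expanded with the law of total probability. The event probabilities below are made-up numbers for illustration only:

```python
# Numeric illustration of Bayes' theorem.
# A = "user belongs to a group of interest", B = "user visits a flagged page".

def bayes(p_a, p_b_given_a, p_b_given_not_a):
    """Posterior P(A|B), with P(B) computed via the law of total probability."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: 1% prior, 80% hit rate, 5% false-positive rate.
posterior = bayes(p_a=0.01, p_b_given_a=0.8, p_b_given_not_a=0.05)
print(round(posterior, 4))  # → 0.1391
```

Even with a strong likelihood ratio, the small prior keeps the posterior modest, which is exactly the kind of correction Bayesian reasoning contributes to profiling.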
Neal [9] provides the following steps in the implementation of the
Bayesian method:
1. Formulation of knowledge about the situation
probabilistically.
2. Gathering of data.
3. Computation of the posterior probability distribution for
the parameters, given the observed data.
4. Use of the identified posterior distribution to:
i. Reach scientific conclusions,
properly accounting for uncertainty.
ii. Make predictions by averaging over
the posterior distribution.
iii. Make decisions so as to minimize
posterior expected loss.
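These steps can be walked through with a conjugate beta-binomial model, a standard textbook case; the prior choice and the observed counts below are hypothetical:

```python
# A minimal walk through the Bayesian steps with a beta-binomial model:
# a Beta(a, b) prior over an unknown rate, binomial data, conjugate posterior.

def posterior_params(a, b, successes, failures):
    """Conjugate update: Beta(a, b) prior + binomial data -> Beta posterior."""
    return a + successes, b + failures

def posterior_mean(a, b):
    """Point prediction: the mean of the Beta posterior."""
    return a / (a + b)

# Step 1: prior belief Beta(1, 1), i.e. uniform.  Step 2: observe 7 of 10 events.
a, b = posterior_params(1, 1, successes=7, failures=3)
# Step 3: the posterior is Beta(8, 4).  Step 4: use it for prediction.
print(posterior_mean(a, b))  # → 0.6666..., i.e. 8/12
```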
The posterior distribution for the model parameters given the
observed data is found by combining the prior distribution
with the likelihood for the parameters given the data. This is
done using Bayes' Rule:

P(parameters | data) = P(parameters) P(data | parameters) / P(data)
Another form of Bayes' Theorem that is generally
encountered when looking at two competing statements or
hypotheses is:

P(A|B) = P(B|A) P(A) / [P(B|A) P(A) + P(B|¬A) P(¬A)]

For proposition A and evidence or background B:
P(A), the prior probability, is the initial degree of
belief in A.
P(¬A) is the corresponding initial degree of
belief against A: P(¬A) = 1 − P(A).
P(B|A), the conditional probability or
likelihood, is the degree of belief in B, given
that the proposition A is true.
P(B|¬A), the conditional probability or
likelihood, is the degree of belief in B, given
that the proposition A is false.
P(A|B), the posterior probability, is the
probability of A after taking B into
account for and against A.
Tipping [13] has pointed out the main advantage and
disadvantage of the Bayesian method. The greatest advantage
of a Bayesian approach is that there is an automatic
preference for simple models that sufficiently explain the data
without unnecessary complexity. This property holds even if
the prior P(W) is completely uninformative. The practical
disadvantage of the Bayesian approach is that it requires the
modeller to perform integrations over variables, and many of
these computations are analytically intractable. As a result,
much contemporary research in Bayesian approaches to
machine learning relies on, or is directly concerned with,
approximation techniques.
The following are other artificial intelligence techniques as
identified by Wu et al. [14]
a) Decision tree learning
This method uses a decision tree as a predictive
technique mapping observations about an item to conclusions
about the item's target value. This technique is among the
most widely used predictive modelling techniques in statistics, data
mining and machine learning. The approach depicts a tree-like
structure with leaves representing class labels and
branches representing conjunctions of features that lead to those class labels.
In decision analysis, a decision tree can be used to visually
and explicitly represent decisions and decision making.
Decision tree learning is a method commonly used in data
mining. The goal is to create a model that predicts the value of
a target variable based on several input variables.
Data comes in records of the form:

(x, Y) = (x1, x2, x3, ..., xk, Y)

The dependent variable, Y, is the target variable that we are
trying to understand, classify or generalize. The vector x is
composed of the input variables, x1, x2, x3, etc., that are used
for that task.
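The core step of learning such a tree is choosing which input variable to split on. A minimal sketch of ID3-style attribute selection by information gain follows; the toy records and attribute names are made up for illustration:

```python
# Decision-tree learning sketch: pick the split attribute that most reduces
# the entropy of the target variable Y (information gain, as in ID3).
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(records, attr, target="Y"):
    """Entropy reduction of the target after splitting on one attribute."""
    base = entropy([r[target] for r in records])
    remainder = 0.0
    for value in {r[attr] for r in records}:
        subset = [r[target] for r in records if r[attr] == value]
        remainder += len(subset) / len(records) * entropy(subset)
    return base - remainder

data = [
    {"x1": "long",  "x2": "video", "Y": "interested"},
    {"x1": "long",  "x2": "text",  "Y": "interested"},
    {"x1": "short", "x2": "video", "Y": "not"},
    {"x1": "short", "x2": "text",  "Y": "not"},
]
print(information_gain(data, "x1"))  # → 1.0: x1 separates the labels perfectly
print(information_gain(data, "x2"))  # → 0.0: x2 is uninformative
```

A greedy learner would split on x1 here, which is exactly the locally-optimal choice discussed in the limitations below.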
This method, however, has the following limitations:
Practical decision-tree learning algorithms are based
on heuristics such as the greedy algorithm where
locally-optimal decisions are made at each node. Such
algorithms cannot guarantee to return the globally-
optimal decision tree.
Decision-tree learners can create over-complex trees
that do not generalize well from the training data
(overfitting).
There are concepts that are hard to learn because
decision trees do not express them easily, such
as parity (the evenness or oddness of the number of
bits with value one within a given set of bits; it is
determined by the value of all the bits, can be
calculated via an XOR sum of the bits, yielding 0 for
even parity and 1 for odd parity, and its dependence
on every bit allows its use in error-detection
schemes) or multiplexer (a device that
selects one of several analog or digital input signals
and forwards the selected input into a single
line) problems. In such cases, the decision tree
becomes prohibitively large.
For data including categorical variables with different
numbers of levels, information gain in decision trees is
biased in favour of those attributes with more levels.
b) Association rule mining
This is a popular and well researched method for discovering
interesting relations between variables in large databases. It is
intended to identify strong rules discovered in databases using
different measures of interestingness. Based on the concept of
strong rules, Agrawal & Srikant [2], introduced association
rules for discovering regularities between products in large-
scale transaction data recorded by point-of-sale (POS)
systems in supermarkets. For example, the
rule {onions, potatoes} => {hamburger meat} found
in the sales data of a supermarket would indicate that if a
customer buys onions and potatoes together, he or she is
likely to also buy hamburger meat. Such information can be
used as the basis for decisions about marketing activities such
as promotional pricing or product placements.
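The two standard measures behind such rules, support and confidence, can be sketched directly on the supermarket example; the transaction baskets below are hypothetical:

```python
# Association-rule sketch: support and confidence for the rule
# {onions, potatoes} => {hamburger}, computed over toy transactions.

def support(transactions, itemset):
    """Fraction of transactions containing every item in itemset."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Estimate of P(consequent | antecedent) from the transactions."""
    return support(transactions, antecedent | consequent) / support(transactions, antecedent)

baskets = [
    {"onions", "potatoes", "hamburger"},
    {"onions", "potatoes", "hamburger", "beer"},
    {"onions", "potatoes"},
    {"milk", "bread"},
]
print(support(baskets, {"onions", "potatoes"}))                    # → 0.75
print(confidence(baskets, {"onions", "potatoes"}, {"hamburger"}))  # → 0.666...
```

A rule is kept as "strong" when both values exceed chosen thresholds; searching many candidate itemsets this way is what creates the false-association risk noted below.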
As stated by Mukhopadhyay, Vij, & Wanaskar [8], the
association rule mining technique can easily be used in
recommendation systems: it is scalable and gives high
precision, but in its basic form it assigns only a binary weight
to the pages that are visited, i.e. whether the page was visited
or not. A user may visit a page that turns out not to have useful
information for him or her, so factors like the time spent by the user
and the visiting frequency of the page should also be considered.
In the weighted association rule mining method, the
weight of the page is therefore included.
One limitation of the standard approach to discovering
associations is that by searching massive numbers of possible
associations to look for collections of items that appear to be
associated, there is a large risk of finding many false
associations. These are collections of items that co-occur with
unexpected frequency in the data, but only do so by chance.
c) Cluster analysis
Cluster analysis, also known as clustering, is the task of grouping a set of
objects so that objects in the same cluster are more similar
to each other than to those in other clusters. It is a main task
of exploratory data mining, and a common technique
for statistical data analysis, used in many fields,
including machine learning.
Popular notions of clusters include groups with
small distances among the cluster members, dense areas of the
data space, intervals or particular statistical distributions.
Cluster analysis as such is not an automatic task, but an
iterative process of knowledge discovery or interactive
multi-objective optimization that involves trial and error. It will
often be necessary to modify data pre-processing and model
parameters until the result achieves the desired properties.
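The iterative assign-then-update loop described above can be sketched with one-dimensional k-means (Lloyd's algorithm); the data points and starting centroids are made-up values:

```python
# Clustering sketch: one-dimensional k-means with k = 2, showing the
# repeated cycle of assigning points to the nearest centroid and then
# recomputing each centroid as its cluster mean.

def kmeans_1d(points, centroids, iterations=10):
    """Lloyd's algorithm on scalars; returns the final centroids, sorted."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

data = [1.0, 2.0, 0.0, 9.0, 10.0, 8.0]
print(kmeans_1d(data, centroids=[0.0, 10.0]))  # → [1.0, 9.0]
```

As the text notes, the outcome depends on the starting centroids and pre-processing, which is why clustering is treated as an iterative, trial-and-error process rather than an automatic one.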
d) Reinforcement learning
As emphasized by Peters et al. [6], reinforcement learning is
an area of machine learning inspired by behaviorist psychology,
concerned with how software agents ought to take actions in
an environment so as to maximize some notion of
cumulative reward. Reinforcement learning is mostly applied
in dynamic programming and machine learning.
Reinforcement learning is suitable to problems which include
a long-term versus short-term reward trade-off.
Reinforcement learning has been applied successfully to
problems such as robot control, elevator scheduling,
telecommunications, backgammon and checkers.
Reinforcement learning is considered powerful because of
two components: The use of samples to optimize performance
and the use of function approximation to deal with large
environments. Peters et al. [6] also identify situations where
reinforcement learning can also be used successfully. These
situations are:
A model of the environment is known, but an
analytic solution is not available;
Only a simulation model of the environment is
given (the subject of simulation-based
optimization);
The only way to collect information about the
environment is by interacting with it.
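The third situation, learning purely by interaction, can be sketched with tabular Q-learning on a tiny chain environment. The environment, reward scheme and all parameters below are illustrative assumptions, not taken from the paper:

```python
# Reinforcement-learning sketch: tabular Q-learning on a deterministic
# chain of four states, where only entering the last state yields reward.
# This illustrates the long-term vs short-term trade-off: every step but
# the last earns nothing, yet moving right is the optimal policy.
import random

N_STATES, GOAL = 4, 3
ACTIONS = [-1, +1]  # move left or right along the chain

def step(state, action):
    """Deterministic transition; reward 1.0 only on entering the goal state."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    return nxt, (1.0 if nxt == GOAL else 0.0)

def q_learning(episodes=200, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Learn action values by epsilon-greedy interaction with the chain."""
    random.seed(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        s = 0
        while s != GOAL:
            if random.random() < eps:                      # explore
                a = random.randrange(2)
            else:                                          # exploit estimate
                a = max((0, 1), key=lambda i: q[s][i])
            nxt, r = step(s, ACTIONS[a])
            q[s][a] += alpha * (r + gamma * max(q[nxt]) - q[s][a])
            s = nxt
    return q

q = q_learning()
# The greedy policy should prefer moving right (action index 1) in every
# non-goal state once the reward has propagated back along the chain.
print([max((0, 1), key=lambda i: q[s][i]) for s in range(GOAL)])
```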
The main limitation of this technique is that, although
finite-time performance bounds have appeared for many algorithms in
recent years, these bounds are expected to be rather loose, and
thus more work is needed to better understand the relative
advantages, as well as the limitations, of these algorithms.
The researcher intends to use the Bayesian theory combined
with association rule learning to generate the proposed
algorithm. The aspects of recommender systems, using
content-based approach will also be considered in predicting
what sites are most likely to be visited by the user. This will
ensure that the limitations experienced by each method are
mitigated, thereby increasing the dependability of the
algorithm. In the Bayesian technique, much contemporary
research in machine learning relies on, or is directly
concerned with, approximation techniques. Since both methods
are based on probabilities, combining them increases the
chances of acquiring correct results.
3. DISCUSSION
Machine learning/artificial intelligence is a relatively new area
in information technology. A variety of techniques have been
introduced each with its benefits and shortcomings based on
different applications. More research has been carried out on
integration and enhancement of these techniques to form
hybrids in a bid to make them much better and effective.
Inspired by Forsati & Rahbar [3], the weighted association
rule mining method considers the following criteria:
1. Page duration
2. Frequency of access
3. Indegree – in this case it refers to the
average number of objects in a page
These three criteria are used to generate the average weight of
the web page. The more the weight, the more important the
page is considered by the user.
The association rule method is used in capturing the time and
webpage access frequency, which form the basis of
importance estimation. Generally, a user is assumed to spend
more time on more useful web pages than on less useful ones.
However, this alone cannot be sufficient as other factors like
size of the page and number of objects in a page might equally
be important. The formula of duration is given in Equation (1)
below. Frequency is the number of times that a web page is
accessed by a user. The general assumption is that web pages
with a higher frequency are of stronger interest to users.
However, when calculating the frequency of a web page, it is
important to consider the in-degree of that page, in this
research the number of objects in a page (e.g. images, videos,
flash objects). The frequency is calculated using the
formula in Equation (2) below [8].
Bayesian aspect can then be used to capture and predict user’s
behaviour based on the usage of the social media like Twitter
and Facebook. This therefore concentrates on the probabilistic
assumptions specifically on social media. These could be
measured through comments posted, likes or tags identifying
the trends and already known facts in regard to social media
usage and radicalization statistics. In particular, this is used to
generate the maximum a posteriori (MAP) probability as to
whether the user is radicalized or not, based on his/her usage
statistics of social media sites. This also forms part of
the equation generated for association rule mining, as Equation
(3). Therefore, time spent by a user on a web page, frequency
of visiting and probability estimation based on social media
usage and information, are then used as three crucial aspects
in measuring the user’s interest on the web page and
relationships among different web pages using Equation (3)
as indicated below.
Duration = Time spent / Size (bits)    Equation (1)
Frequency = Number of visits * Indegree    Equation (2)
Probability = MAP    Equation (3)
Weight = Duration * Frequency * Probability    Equation (4)
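Equations (1) to (4) combine into a single page-weight score, sketched below. The time, page size, visit count, in-degree and MAP value are all hypothetical numbers chosen for illustration:

```python
# Sketch of the combined page-weight score from Equations (1)-(4):
# weight = duration * frequency * probability.

def duration(time_spent_s, size_bits):
    """Equation (1): time spent normalised by page size in bits."""
    return time_spent_s / size_bits

def frequency(visits, indegree):
    """Equation (2): visit count scaled by the page's in-degree (objects)."""
    return visits * indegree

def weight(time_spent_s, size_bits, visits, indegree, map_probability):
    """Equation (4): combined interest weight for a web page; the MAP
    probability is Equation (3), from the Bayesian social-media component."""
    return duration(time_spent_s, size_bits) * frequency(visits, indegree) * map_probability

# 120 s spent on a 2,000,000-bit page, visited 5 times, 4 embedded objects,
# MAP estimate of 0.3 for the user from social-media usage.
print(weight(120, 2_000_000, 5, 4, 0.3))  # → 0.00036
```

Larger weights indicate pages of stronger interest to the user, which is the quantity the proposed algorithm monitors across websites.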
4. CONCLUSION
The internet can no longer be ignored and therefore, monitoring
measures efficient in controlling users' activities online need
to be put in place. As much as more research has been
carried out in enhancing artificial intelligence techniques that
are used in profiling and recommender systems, more is still
required. In this survey, association rule mining has been
enhanced to weighted association rule mining considering the
average weight of the web page. However, not all web pages
should be treated the same way. Social media web pages have
different kind of information and should be treated as such,
differently from other web pages. Therefore, as much as
weighted association rule mining method would be
appropriate for other informational web pages, Bayesian
theory would be appropriate for social media sites. The
combination of these two algorithms would ultimately form
an enhanced algorithm with accurate and reliable results.
5. REFERENCES
[1]. Agrawal, R., & Srikant, R. (1994). Fast algorithms
for mining association rules. San Jose, CA 95120:
IBM Almaden Research Center 650 Harry Road.
[2]. Forsati, R. M., & Rahbar, A. (2009). An efficient
algorithm for web recommendation systems.
AICCSA 2009. IEEE/ACS International Conference
on Computer Systems and Applications, 579-586.
[3]. Gordon, L., & Johnson, D. G. (2004). Ethical,
psychological and societal problems of the
application of technologies in education. UNESCO
Institute for Information Technologies in
Education.
[4]. Hilty, L. M. (2002, December). The role of ethics in
the field of information and communication
technologies. Discussion paper prepared for the
technology working group of the Swiss Academy of
Engineering Sciences (SATW).
[5]. Peters, J., Vijayakumar, S., & Schaal, S. (2003).
Reinforcement learning for humanoid robotics. Third
IEEE-RAS International Conference on Humanoid
Robots, 29-30.
[6]. Mohd, T., & Sembok, T. (2003). Ethics of
information communication technology (ICT).
Universiti Kebangsaan Malaysia, for
the regional meeting on ethics of science and
technology.
[7]. Mukhopadhyay, D., Vij, S. R., & Wanaskar, U. H.
(2013). A hybrid web recommendation system
based on the improved association rule mining
algorithm.
[8]. Neal, R. M. (2004). Bayesian methods for machine
learning. University of Toronto.
[9]. Neumann, R. P., & Stevens, T. (2009). Countering
online radicalization. A strategy for action; the
international centre for the study of radicalization
and political violence (ICSR).
[10].Olumoye, M. Y. (2013, November 11). Ethics
and social impact of information systems in our
society: Analysis and recommendations.
International Journal of Science and Research (IJSR),
2, 2319-7064.
[11].Rikowski, R. (2009). Using information technology,
teaching ethical issues in IT (7th ed.). London South
Bank University: McGraw-Hill.
[12].Tipping, M. E. (2006). Bayesian Inference: an
introduction to principles and practice in machine
learning. Cambridge, UK: Microsoft Research.
[13].Wu, X., Kumar, V., Quinlan, J. R., Ghosh, J., Yang,
Q., Motoda, H., . . . Zhou, Z.-H. (2007). Top 10
algorithms in data mining. London: Springer-Verlag
Limited.