In this technical age there are many ways an attacker can illegitimately gain access to people’s sensitive information. One of them is phishing: the practice of misleading people into entering their sensitive information on fraudulent websites that look like the real ones. The phisher’s aim is to steal personal information, bank details, and so on. It is becoming riskier every day to enter personal information on websites, for fear that a site might be a phishing attack set up to steal sensitive information. That is why phishing-website detection is necessary, to alert the user and block the website. Automated detection of phishing attacks is needed, and machine learning is one of the most efficient techniques for it, as it removes the drawbacks of existing approaches. An efficient machine learning model combined with a content-based approach proves very effective at detecting phishing websites.
Our proposed system uses a hybrid approach that combines a machine-learning-based method with a content-based method. URL-based features are extracted and passed to the machine learning model, while in the content-based approach the TF-IDF algorithm detects a phishing website using the top keywords of the web page. This hybrid approach is used to achieve a highly efficient result. Finally, our system notifies and alerts the user about whether the website is phishing or legitimate.
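As a sketch of the content-based half, TF-IDF scores for a page's keywords can be computed directly from term and document frequencies. The helper below is a minimal, self-contained illustration; the tokenised page and background corpus are made up for the example and are not part of the proposed system:

```python
import math
from collections import Counter

def tf_idf_top_keywords(page_tokens, corpus, k=5):
    """Rank a page's tokens by TF-IDF against a small corpus
    and return the top-k keywords."""
    tf = Counter(page_tokens)
    n_docs = len(corpus)
    scores = {}
    for term, count in tf.items():
        df = sum(1 for doc in corpus if term in doc)      # document frequency
        idf = math.log((1 + n_docs) / (1 + df)) + 1       # smoothed IDF
        scores[term] = (count / len(page_tokens)) * idf   # TF * IDF
    return [t for t, _ in sorted(scores.items(), key=lambda x: -x[1])[:k]]

# Toy example: keywords of a suspected login page vs. two background pages
page = "verify your paypal account password login verify".split()
corpus = ["welcome to our news site".split(),
          "latest sports news and scores".split()]
print(tf_idf_top_keywords(page, corpus, k=3))
```

The top-ranked keywords ("verify", brand names, "password") are exactly the kind of signal the content-based side can compare against known legitimate pages.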
Phishing Website Detection by Machine Learning Techniques Presentation.pdf (VaralakshmiKC)
Phishing is a form of social engineering where attackers deceive people into revealing sensitive information or installing malware such as ransomware. Phishing (pronounced: fishing) is an attack that attempts to steal your money, or your identity, by getting you to reveal personal information -- such as credit card numbers, bank information, or passwords -- on websites that pretend to be legitimate.
Build your first blockchain application with Amazon Managed Blockchain - SVC2... (Amazon Web Services)
Learn how to set up a blockchain network and deploy your first application using Amazon Managed Blockchain. In this hands-on workshop, attendees build a blockchain network for a nonprofit organization to enable it to distribute funds without an intermediary, ensuring immutable transactions and full transparency to a donor about how the donation is being used. In addition, donors can view all of the donations received by the organization and how these donations have been spent. You must have an active AWS account to participate in this workshop.
Phishing is a social engineering technique whose main aim is to target user information such as user IDs, passwords, and credit card details, resulting in financial loss to the user. Detecting phishing is a challenging problem because it relates to human vulnerabilities. This paper proposes detecting phishing websites using different machine learning approaches, evaluating different classification models that predict malicious and benign websites. Experiments are performed on a dataset consisting of malicious and benign sites, and the results show that the proposed algorithms have high detection accuracy. Nakkala Srinivas Mudiraj, "Detecting Phishing using Machine Learning", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-4, June 2019, URL: https://www.ijtsrd.com/papers/ijtsrd23755.pdf
Paper URL: https://www.ijtsrd.com/computer-science/computer-security/23755/detecting-phishing-using-machine-learning/nakkala-srinivas-mudiraj
Cloud security is critical to data security and application resilience against cyber attacks. This talk looks at security best practices that need to be practised.
This talk was presented at AWS Community Day Bengaluru 2019 by Amar Prusty, Cloud-Data Center Consultant Architect, DXC Technology
As organizations struggle to mature their security and IT service profiles across expanding numbers of endpoints, they are increasingly turning to the proactive management capabilities of endpoint detection and response platforms.
To provide organizations with a clear example of how to identify the most effective EDRP solutions, leading IT analyst firm Enterprise Management Associates (EMA) has conducted independent and objective research on the features and capabilities of two of the leading solution suites in this market: Tanium Core and 1E Tachyon.
Driver drowsiness detection is a car safety technology which helps prevent accidents caused by the driver getting drowsy. Various studies have suggested that around 20% of all road accidents are fatigue-related, rising to as much as 50% on certain roads.
Automatic Attendance system using Facial Recognition (Nikyaa7)
It is a biometric-based app, which is gradually evolving into the universal biometric solution, with virtually zero effort from the user end when compared with other biometric options.
LEAF DISEASE DETECTION USING IMAGE PROCESSING AND SUPPORT VECTOR MACHINE (SVM) (Journal For Research)
The study on leaf disease detection can be a helpful aspect of monitoring large areas of crop fields, but it is important to detect the disease as early as possible. This paper gives a method to detect leaf disease by calculating RGB and HSV values. First, the image is blurred in order to reduce noise. Then the image is converted from RGB to HSV form, after which colour thresholding is done. After thresholding, foreground/background detection is performed; background detection leads to feature extraction from the leaf. Then the k-means algorithm is applied to form the clusters. The system is a software-based solution for detecting the disease with which a leaf is infected, following a sequence of steps using image processing and a support vector machine. Detection of leaf disease can be useful for improving the quality and production of agricultural products.
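The HSV colour-thresholding step described above can be sketched in a few lines: convert each pixel from RGB to HSV and count those falling inside a hue band. The brown/yellow band limits and the toy pixel values below are illustrative assumptions, not values from the paper:

```python
import colorsys

def diseased_pixel_ratio(pixels, hue_lo=0.08, hue_hi=0.20):
    """Fraction of pixels whose hue falls in a (hypothetical) brown/yellow
    band -- a stand-in for the paper's HSV colour-thresholding step."""
    hits = 0
    for r, g, b in pixels:
        # Normalise 0-255 channels to 0-1 and convert RGB -> HSV
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if hue_lo <= h <= hue_hi and s > 0.3:   # inside band and saturated
            hits += 1
    return hits / len(pixels)

# Toy "image": mostly healthy green with a few brown spots
healthy = [(40, 160, 40)] * 90      # green pixels
spots   = [(150, 100, 30)] * 10     # brownish pixels
print(diseased_pixel_ratio(healthy + spots))  # → 0.1
```

On a real image the pixel list would come from the blurred photo, and the thresholded mask would feed the feature-extraction and k-means stages.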
Mushrooms are one of the fungus-type foods with some of the most potent nutrients among plants.
Mushrooms have major medicinal advantages, such as killing cancer cells.
PDMLP: PHISHING DETECTION USING MULTILAYER PERCEPTRON (IJNSA Journal)
A phishing website is a significant problem on the internet. It is one of the cyber-attack types where attackers try to obtain sensitive information such as usernames and passwords or credit card information. The recent growth in deploying phishing-URL detection systems on many websites has resulted in a massive amount of data available for predicting phishing websites. In this paper, we propose a new method to develop a phishing detection system called phishing detection based on a multilayer perceptron (PDMLP), which is used on two types of datasets. The performance of these mechanisms is evaluated in terms of accuracy, precision, recall, and F-measure. Results showed that PDMLP provides better performance than the KNN, SVM, C4.5 decision tree, RF, and RoF classifiers.
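As a rough illustration of the perceptron family PDMLP builds on, here is a forward pass through a single-hidden-layer MLP with sigmoid activations. The two input features and all the weights are hand-picked for the example; they are not trained values from the paper:

```python
import math

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer perceptron with sigmoid units.
    Returns the estimated probability that the input URL is phishing."""
    # Hidden layer: one sigmoid unit per row of W1
    h = [1 / (1 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
         for row, b in zip(W1, b1)]
    # Output layer: single sigmoid unit
    z = sum(w * hi for w, hi in zip(W2, h)) + b2
    return 1 / (1 + math.exp(-z))

# Two illustrative features: scaled URL length and scaled dot count
W1 = [[4.0, 4.0], [-3.0, 5.0]]   # hidden-layer weights (2 units, made up)
b1 = [-3.0, -2.0]
W2 = [3.0, 3.0]                  # output-layer weights
b2 = -3.0
print(mlp_forward([0.9, 0.8], W1, b1, W2, b2))  # long, dotted URL: high score
print(mlp_forward([0.1, 0.1], W1, b1, W2, b2))  # short, clean URL: low score
```

A real PDMLP-style system would learn these weights by backpropagation over a labelled URL dataset rather than fixing them by hand.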
Malicious-URL Detection using Logistic Regression Technique (Dr. Amarjeet Singh)
Over the last few years, the Web has seen massive growth in the number and kinds of web services. Web facilities such as online banking, gaming, and social networking have evolved rapidly, as has people's reliance on them to perform daily tasks. As a result, a large amount of information is uploaded to the Web daily. As these web services drive new opportunities for people to interact, they also create new opportunities for criminals. URLs are launch pads for web attacks: a user with malicious intent can steal the identity of a legitimate person by sending a malicious URL. Malicious URLs are a keystone of illegitimate Internet activities, and the dangers of these sites have created a mandate for defences that protect end-users from visiting them. The proposed approach classifies URLs automatically using a machine-learning algorithm called logistic regression, which is used for binary classification. The classifier achieves 97% accuracy by learning from phishing URLs.
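The approach above can be sketched as plain gradient-descent logistic regression over a few lexical URL features. Everything here (the feature choices, the toy URLs, the labels) is a hypothetical illustration, not the paper's actual feature set or data:

```python
import math

def url_features(url):
    """Tiny illustrative feature vector: scaled URL length, digit count,
    dot count, and presence of '@' -- all common lexical phishing cues."""
    return [len(url) / 100, sum(c.isdigit() for c in url) / 10,
            url.count(".") / 5, 1.0 if "@" in url else 0.0]

def train_logreg(xs, ys, lr=0.5, epochs=500):
    """Per-sample gradient descent on the logistic (cross-entropy) loss."""
    w, b = [0.0] * len(xs[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            g = p - y                                  # gradient of the loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, url):
    x = url_features(url)
    p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
    return p >= 0.5    # True -> classified as phishing

# Toy training set (made-up URLs; 1 = phishing, 0 = benign)
urls = ["http://paypal.com.secure-login.example.ru/a1b2c3@verify",
        "http://192.168.0.1.account-update.example.tk/login123456",
        "https://www.wikipedia.org", "https://github.com"]
labels = [1, 1, 0, 0]
w, b = train_logreg([url_features(u) for u in urls], labels)
```

On this tiny separable set the classifier recovers the labels; a real system would train on thousands of labelled URLs and hold out a test split to estimate accuracy.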
A COMPARATIVE ANALYSIS OF DIFFERENT FEATURE SET ON THE PERFORMANCE OF DIFFERE... (ijaia)
Reducing the risk posed by phishers and other cybercriminals in cyberspace requires a robust and automatic means of detecting phishing websites, since the culprits constantly come up with new techniques for achieving their goals, almost on a daily basis. Phishers are constantly evolving the methods they use to lure users into revealing their sensitive information. Many methods have been proposed in the past for phishing detection, but the quest for a better solution is still on. This research covers the development of a phishing-website model based on different algorithms with different sets of features, in order to investigate the most significant features in the dataset.
A Deep Learning Technique for Web Phishing Detection Combined URL Features an... (IJCNCJournal)
The most popular way to deceive online users nowadays is phishing. Consequently, to increase cybersecurity, more efficient web-page phishing detection mechanisms are needed. In this paper, we propose an approach that relies on website images and URLs to treat phishing-website recognition as a classification challenge. Our model uses webpage URLs and images to detect a phishing attack, applying convolutional neural networks (CNNs) to extract the most important features of website images and URLs and then classify them into benign and phishing pages. The accuracy rate in the experimental results was 99.67%, proving the effectiveness of the proposed model in detecting a web phishing attack.
PUMMP: PHISHING URL DETECTION USING MACHINE LEARNING WITH MONOMORPHIC AND POL... (IJCNCJournal)
Phishing scams are increasing drastically, affecting Internet users by compromising personal credentials. This paper proposes a novel feature utilization method for phishing URL detection called the polymorphic property of features. In the initial stage, the URL-related features (46 features) were extracted. Later, a subset of features (19 out of 46) with the polymorphic property was identified, and they were extracted from different parts of the URL (the domain and the path). After extracting the features, various machine learning classification algorithms were applied to build models using monomorphic treatment of features, polymorphic treatment of features, and both monomorphic and polymorphic treatment of features. By the polymorphic property of features, we mean that the same feature provides different interpretations when considered in different parts of the URL. The machine learning models were built on two different datasets, and a comparison of the models derived from the two datasets reveals that the model built with both monomorphic and polymorphic treatment of features yielded higher accuracy in phishing URL detection than existing works.
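The polymorphic idea -- computing one lexical feature separately on the domain and on the path -- can be illustrated with a digit-ratio feature. The feature choice and the example URL below are assumptions for illustration, not taken from the paper's 46-feature set:

```python
from urllib.parse import urlparse

def digit_ratio(s):
    """Share of digit characters in a string (0 when the string is empty)."""
    return sum(c.isdigit() for c in s) / len(s) if s else 0.0

def polymorphic_digit_features(url):
    """The same lexical feature -- digit ratio -- computed separately on the
    domain and on the path, mirroring the polymorphic treatment: one feature,
    two interpretations depending on which URL part it is measured in."""
    parts = urlparse(url)
    return {"digit_ratio_domain": digit_ratio(parts.netloc),
            "digit_ratio_path": digit_ratio(parts.path)}

feats = polymorphic_digit_features("http://login-update123.example.com/a/b9")
print(feats)
```

A monomorphic treatment would compute one digit ratio over the whole URL; the polymorphic treatment keeps the two values as separate columns for the classifier.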
Understanding the Impact and Challenges of Corona Crisis on Education Sector... (vivatechijri)
In the second week of March 2020, the governments of all states in the country suddenly declared the shutting down of all colleges and schools for a temporary period as an immediate measure to stop the spread of the novel coronavirus pandemic. Almost a month has now passed, with no certainty about when they will reopen. Due to a pandemic like this, alarm bells have started sounding in the field of education, where a huge impact can be seen on the teaching and learning process as well as on the entire education sector. A disruption like this has actually given today's educators time to really think about the sector. Through the present research article, the author highlights the possible impact of the coronavirus on the education sector, the future challenges it brings, and possible suggestions.
LEADERSHIP ONLY CAN LEAD THE ORGANIZATION TOWARDS IMPROVEMENT AND DEVELOPMENT (vivatechijri)
This paper explains how leadership alone is responsible for sustainable improvement and growth, and how only it can lead the organization towards improvement and overall development. Leadership and its effectiveness are discussed in this research work, along with how leadership is a distinct path to organizational success, different from traditional management, creating a true work culture and goodwill for the organization in the social scene. Leadership alone is responsible for bringing positive or negative change in the organization; if the leadership does not show concern for the organization, the organization cannot be led in the right direction towards improvement and development.
The assignment problem is a critical problem in mathematics that is further explored in the real physical world. In this paper we try to implement a replacement method to solve assignment problems, with an algorithm and solution steps. Using the new method and computing with two existing methods, we analyse a numerical example and compare the optimal solutions between the new method and the two current methods. The proposed method may serve as a standardized technique that is simple to use for solving assignment problems.
Structural and Morphological Studies of Nano Composite Polymer Gel Electroly... (vivatechijri)
In today's society, we stand before a change in energy scarcity. As our civilization grows, many countries in the developing world seek the standard of living that has been exclusive to a few nations, so there arises a need to develop technology that is compatible with the resources provided by nature, in order to bring sustainable development to all classes of society. To overcome the prevailing challenge of a huge energy crisis in the near future, there is an urgent need for the development of electrical vehicles or hybrid electrical vehicles with low CO2 emissions using renewable energy sources. In view of the above, electrochemical capacitors can fulfil these requirements to some extent. Preparation of nano composite polymer gel electrolyte is the best option to overcome these problems. When fillers are added or dispersed in the polymer gel electrolyte, the amorphous or porous nature of the electrolyte increases, which enhances the liquid-absorbing quality of the polymer and helps remove drawbacks of polymer gel electrolytes such as leakage and poor mechanical and thermal stability. In this work, dispersion of SiO2 nano filler is carried out in [PVdF (HFP)-PC-Mg (ClO4)2] for the synthesis of the nano composite PGE [PVdF (HFP)-PC-MgClO4-SiO2]. Optimization and characterization were carried out using various techniques.
Theoretical study of two dimensional Nano sheet for gas sensing application (vivatechijri)
This study focuses on various two-dimensional materials for sensing various gases, taking a theoretical view towards new research in gas sensing applications. In this paper we review various two-dimensional sheets, such as graphene, boron nitride nanosheets, and MXene, and their application in sensing various gases present in the atmosphere.
METHODS FOR DETECTION OF COMMON ADULTERANTS IN FOOD (vivatechijri)
Food is essential for living. Food adulteration deceives consumers and can endanger their health. The purpose of this document is to list methods for detecting the common food adulterants found in India. An adulterant is a substance found in other substances such as food, cosmetics, pharmaceuticals, fuels, or other chemicals that compromises the safety or effectiveness of that substance. The addition of adulterants is called adulteration. The most common reason for adulteration is the use by manufacturers of undeclared materials that are cheaper than the correct, declared ones. Adulterants can be harmful, can reduce the effectiveness of the product, or can be harmless.
The novel idea of being an entrepreneur is a key for everyone to get into the hustle, but developing an idea from its core requires a systematic plan, time management, time investment, and, most importantly, client attention. The time required for development may vary with the idea and the strength of the team. Leadership, to build a team and manage it through the peak of development, is the main quality needed. Innovation and techniques to clear the hurdles are another aspect of business development and client retention.
Innovation to support well-being has long been a focus of numerous disciplines, including computer science, psychology, and human-computer interaction. However, the meaning of well-being is not always clear, and this has implications for how we design and evaluate technologies that intend to foster it. Here, we discuss current definitions of well-being and how it relates to, and is sometimes a result of, self-transcendence. We then focus on how technologies can support well-being through experiences of self-transcendence, ending with possible future directions.
An Alternative to Hard Drives in the Coming Future: DNA-BASED DATA STORAGE (vivatechijri)
Demand for data storage is growing exponentially, but the capacity of existing storage media is not keeping up; there emerges a requirement for a storage medium with high capacity, high storage density, and the ability to withstand extreme environmental conditions. According to research in 2018, every minute Google conducted 3.88 million searches, while people posted 49,000 photos on Instagram, sent 159,362,760 e-mails, tweeted 473,000 times, and watched 4.33 million videos on YouTube. In 2020, an estimated 1.7 megabytes of data was created per second per person globally, which translates to about 418 zettabytes in a single year. The magnetic or optical data-storage systems that currently hold this volume of 0s and 1s typically cannot last for more than a century, and running data centres takes vast amounts of energy. In short, we are close to having a substantial data-storage problem that will only become more severe over time. Deoxyribonucleic acid (DNA) can potentially be used for these purposes, because it is not much different from the traditional method used in a computer. DNA's information density is notable: 215 petabytes, or 215 million gigabytes, of data can be stored in just one gram of DNA. First we can encode all data at a molecular level and then store it in a medium that will last for a long time and not become outdated like floppy disks. Thanks to improved techniques for reading and writing DNA, a rapid increase is being observed in the amount of data that can be stored in DNA.
The usage of chatbots has increased tremendously over the past few years. A conversational interface is an interface the user can interact with by means of a conversation; the conversation can occur by speech but also by text input. When a chatty interface uses text, it is also described as a chatbot or a conversational medium. In this study, the user experience factors of these so-called chatbots were investigated. The prime objective is "to identify the state of the art in chatbot usability and applied human-computer interaction methodologies, and to research how to assess chatbot usability". Two sorts of chatbots were formulated, one with and one without personalisation factors. The design of this research is a two-by-two factorial design. The independent variables are the two chatbots (unpersonalised versus personalised) and the specific task or goal the user is able to accomplish with the chatbot in the financial field (a simple versus a complex task). The results show that there was no noteworthy interaction effect between personalisation and task on the user experience of chatbots. A significant difference was found between the two tasks with regard to the user experience of chatbots; however, this variation was not due to personalisation.
The smart glasses technology (SGT) of wearable computing aims to bring computing devices into today's world. Smart glasses are wearable computer glasses that add information alongside what the wearer sees; they are also able to change their optical properties at runtime. SGT is one of the modern computing categories that amalgamates humans and machines with the help of information and communication technology. Smart glasses mainly consist of an optical head-mounted display, or embedded wireless glasses with a transparent heads-up display or augmented reality (AR) overlay. In recent years they have been used in medical and gaming applications, and also in the education sector. This report focuses on smart glasses, a category of wearable computing that is currently very popular in the media and expected to be a big market in the coming years. It evaluates the differences between smart glasses and other smart devices, introduces many possible applications from different companies for different audiences, and gives an overview of the smart glasses that are available now and those that will become available over the next few years.
Future Applications of Smart IoT Devices (vivatechijri)
With the Internet of Things (IoT) gradually emerging as the next stage in the evolution of the Internet, it becomes critical to recognize the various potential areas for the application of IoT, and the research challenges associated with these applications, ranging from smart cities to healthcare services, smart agriculture, logistics, and retail. IoT is expected to permeate virtually all aspects of our day-to-day life. Although current IoT-enabling technologies have greatly improved in recent years, there are still various issues that require attention. Since the IoT concept arises from heterogeneous technologies, many research challenges will emerge; accordingly, IoT is opening new dimensions of research. This paper presents the ongoing development of IoT technologies and examines future applications.
Cross Platform Development Using Flutter (vivatechijri)
Today, the development of cross-platform mobile applications is in a state of compromise. Developers are unwilling to choose between building the same app many times for many operating systems and accepting a lowest-common-denominator solution that trades native speed and accuracy for portability. Flutter is an open-source SDK for creating high-performance, high-fidelity mobile apps for iOS and Android. A few significant features of Flutter are just-in-time (JIT) compilation and ahead-of-time (AOT) compilation into native (system-dependent) machine code, so that the resulting binary can execute natively. Flutter's hot-reload functionality helps us quickly and easily experiment, build UIs, add features, and fix bugs; hot reload works by injecting updated source code files into the running Dart Virtual Machine (VM). With Flutter, we believe we have a solution that gives us the best of both worlds: hardware-accelerated graphics and UI, powered by native ARM code, targeting both popular mobile operating systems.
The Internet, today, has become an important part of our lives. The World Wide Web, once a small and inaccessible data-storage service, is now large and valuable. Current activities, partially or completely integrated with the physical world, can be raised to a higher standard. All activities related to our daily life are mapped and linked to other business in the digital world. The world has seen great strides in the Internet and in 3D stereoscopic displays, and the time has come to unite the two to bring a new level of experience to users. 3D Internet is a concept that is yet to be realized and requires browsers to be equipped with in-depth visualization and artificial intelligence; when these capabilities are included, the concept may become the reality discussed in this paper. In this paper we discuss the features, possible setup methods, applications, and advantages and disadvantages of the 3D Internet. With this paper we aim to provide a clear view of the 3D Internet and its potential benefits, weighed against the investment needed to adopt it.
Recommender systems (RS) have emerged as a significant research interest that aims to assist users in finding items online by providing suggestions that closely match their interests. A recommender system is an information-filtering technology through which items are presented on internet sites according to the interests of users; it is implemented in applications such as movies, music, venues, books, research articles, tourism, and social media in general. Recommender-systems research is usually based on comparisons of predictive accuracy: the higher the evaluation scores, the better the recommender. One of the leading approaches has been the use of recommendation systems to proactively recommend scholarly papers to individual researchers. In today's world time is valuable, and researchers do not have much time to spend searching for the right articles in their research domain. Recommender systems are designed to suggest to users the items that best fit their needs and preferences, and they typically produce a list of recommendations in one of two ways: through collaborative or content-based filtering. Additionally, both publicly available and private descriptive metadata are used; the scope of the recommendation is therefore limited to documents which are either publicly available or for which copyright permission has been granted. Recommendation systems support users and developers of various computer and software systems in overcoming information overload and performing information discovery tasks and approximate computation, among others.
The study LiFi (Light Fidelity) demonstrates about how can we use this technology as a medium of communication similar to Wifi . This is the latest technology proposed by Harold Haas in 2011. It explains about the process of transmitting data with the help of illumination of an Led bulb and about its speed intensity to transmit data. Basically in this paper, author will discuss about the technology and also explain that how we can replace from WiFi to LiFi . WiFi generally used for wireless coverage within the buildings while LiFi is capable for high intensity wireless data coverage in limited areas with no obstacles .This research paper represents introduction of the Lifi technology,performance,modulation and challenges. This research paper can be used as a reference and knowledge to develop some of LiFitechnology.
Social media platform and Our right to privacyvivatechijri
The advancement of Information Technology has hastened the ability to disseminate information across the globe. In particular, the recent trends in ‘Social Networking’ have led to a spark in personally sensitive information being published on the World Wide Web. While such socially active websites are creative tools for expressing one’s personality it also entails serious privacy concerns. Thus, Social Networking websites could be termed a double edged sword. It is important for the law to keep abreast of these developments in technology. The purpose of this paper is to demonstrate the limits of extending existing laws to battle privacy intrusions in the Internet especially in the context of social networking. It is suggested that privacy specific legislation is the most appropriate means of protecting online privacy. In doing so it is important to maintain a balance between the competing right of expression, the failure of which may hinder the reaping of benefits offered by Internet technology
THE USABILITY METRICS FOR USER EXPERIENCEvivatechijri
THE USABILITY METRICS FOR USER EXPERIENCE was innovatively created by Google engineers and it is ready for production in record time. The success of Google is to attributed the efficient search algorithm, and also to the underlying commodity hardware. As Google run number of application then Google’s goal became to build a vast storage network out of inexpensive commodity hardware. So Google create its own file system, named as THE USABILITY METRICS FOR USER EXPERIENCE that is GFS. THE USABILITY METRICS FOR USER EXPERIENCE is one of the largest file system in operation. Generally THE USABILITY METRICS FOR USER EXPERIENCE is a scalable distributed file system of large distributed data intensive apps. In the design phase of THE USABILITY METRICS FOR USER EXPERIENCE, in which the given stress includes component failures , files are huge and files are mutated by appending data. The entire file system is organized hierarchically in directories and identified by pathnames. The architecture comprises of multiple chunk servers, multiple clients and a single master. Files are divided into chunks, and that is the key design parameter. THE USABILITY METRICS FOR USER EXPERIENCE also uses leases and mutation order in their design to achieve atomicity and consistency. As of there fault tolerance, THE USABILITY METRICS FOR USER EXPERIENCE is highly available, replicas of chunk servers and master exists.
Google File System was innovatively created by Google engineers and it is ready for production in record time. The success of Google is to attributed the efficient search algorithm, and also to the underlying commodity hardware. As Google run number of application then Google’s goal became to build a vast storage network out of inexpensive commodity hardware. So Google create its own file system, named as Google File System that is GFS. Google File system is one of the largest file system in operation. Generally Google File System is a scalable distributed file system of large distributed data intensive apps. In the design phase of Google file system, in which the given stress includes component failures , files are huge and files are mutated by appending data. The entire file system is organized hierarchically in directories and identified by pathnames. The architecture comprises of multiple chunk servers, multiple clients and a single master. Files are divided into chunks, and that is the key design parameter. Google File System also uses leases and mutation order in their design to achieve atomicity and consistency. As of there fault tolerance, Google file system is highly available, replicas of chunk servers and master exists.
A Study of Tokenization of Real Estate Using Blockchain Technologyvivatechijri
Real estate is by far one of the most trusted investments that people have preferred, being a lucrative investment it provides a steady source of income in the form of lease and rents. Although there are numerous advantages, one of the key downsides of real estate investments is lack of liquidity. Thus, even though global real estate investments amount to about twice the size of investments in stock markets, the number of investors in the real estate market is significantly lower. Block chain technology has real potential in addressing the issues of liquidity and transparency, opening the market to even retail investors. Owing to the functionality and flexibility of creating Security Tokens, which are backed by real-world assets, real estate can be made liquid with the help of Special Purpose Vehicles. Tokens of ERC 777 standard, which represent fractional ownership of the real estate can be purchased by an investor and these tokens can also be listed on secondary exchanges. The robustness of Smart Contracts can enable the efficient transfer of tokens and seamless distribution of earnings amongst the investors. This work describes Ethereum blockchainbased solutions to make the existing Real Estate investment system much more efficient.
Water scarcity is the lack of fresh water resources to meet the standard water demand. There are two type of water scarcity. One is physical. The other is economic water scarcity.
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...Amil Baba Dawood bangali
Contact with Dawood Bhai Just call on +92322-6382012 and we'll help you. We'll solve all your problems within 12 to 24 hours and with 101% guarantee and with astrology systematic. If you want to take any personal or professional advice then also you can call us on +92322-6382012 , ONLINE LOVE PROBLEM & Other all types of Daily Life Problem's.Then CALL or WHATSAPP us on +92322-6382012 and Get all these problems solutions here by Amil Baba DAWOOD BANGALI
#vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore#blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #blackmagicforlove #blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #Amilbabainuk #amilbabainspain #amilbabaindubai #Amilbabainnorway #amilbabainkrachi #amilbabainlahore #amilbabaingujranwalan #amilbabainislamabad
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Dr.Costas Sachpazis
Terzaghi's soil bearing capacity theory, developed by Karl Terzaghi, is a fundamental principle in geotechnical engineering used to determine the bearing capacity of shallow foundations. This theory provides a method to calculate the ultimate bearing capacity of soil, which is the maximum load per unit area that the soil can support without undergoing shear failure. The Calculation HTML Code included.
Final project report on grocery store management system..pdfKamal Acharya
In today’s fast-changing business environment, it’s extremely important to be able to respond to client needs in the most effective and timely manner. If your customers wish to see your business online and have instant access to your products or services.
Online Grocery Store is an e-commerce website, which retails various grocery products. This project allows viewing various products available enables registered users to purchase desired products instantly using Paytm, UPI payment processor (Instant Pay) and also can place order by using Cash on Delivery (Pay Later) option. This project provides an easy access to Administrators and Managers to view orders placed using Pay Later and Instant Pay options.
In order to develop an e-commerce website, a number of Technologies must be studied and understood. These include multi-tiered architecture, server and client-side scripting techniques, implementation technologies, programming language (such as PHP, HTML, CSS, JavaScript) and MySQL relational databases. This is a project with the objective to develop a basic website where a consumer is provided with a shopping cart website and also to know about the technologies used to develop such a website.
This document will discuss each of the underlying technologies to create and implement an e- commerce website.
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)MdTanvirMahtab2
This presentation is about the working procedure of Shahjalal Fertilizer Company Limited (SFCL). A Govt. owned Company of Bangladesh Chemical Industries Corporation under Ministry of Industries.
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdffxintegritypublishin
Advancements in technology unveil a myriad of electrical and electronic breakthroughs geared towards efficiently harnessing limited resources to meet human energy demands. The optimization of hybrid solar PV panels and pumped hydro energy supply systems plays a pivotal role in utilizing natural resources effectively. This initiative not only benefits humanity but also fosters environmental sustainability. The study investigated the design optimization of these hybrid systems, focusing on understanding solar radiation patterns, identifying geographical influences on solar radiation, formulating a mathematical model for system optimization, and determining the optimal configuration of PV panels and pumped hydro storage. Through a comparative analysis approach and eight weeks of data collection, the study addressed key research questions related to solar radiation patterns and optimal system design. The findings highlighted regions with heightened solar radiation levels, showcasing substantial potential for power generation and emphasizing the system's efficiency. Optimizing system design significantly boosted power generation, promoted renewable energy utilization, and enhanced energy storage capacity. The study underscored the benefits of optimizing hybrid solar PV panels and pumped hydro energy supply systems for sustainable energy usage. Optimizing the design of solar PV panels and pumped hydro energy supply systems as examined across diverse climatic conditions in a developing country, not only enhances power generation but also improves the integration of renewable energy sources and boosts energy storage capacities, particularly beneficial for less economically prosperous regions. Additionally, the study provides valuable insights for advancing energy research in economically viable areas. 
Recommendations included conducting site-specific assessments, utilizing advanced modeling tools, implementing regular maintenance protocols, and enhancing communication among system components.
Student information management system project report ii.pdfKamal Acharya
Our project explains about the student management. This project mainly explains the various actions related to student details. This project shows some ease in adding, editing and deleting the student details. It also provides a less time consuming process for viewing, adding, editing and deleting the marks of the students.
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
Hierarchical Digital Twin of a Naval Power SystemKerry Sado
A hierarchical digital twin of a Naval DC power system has been developed and experimentally verified. Similar to other state-of-the-art digital twins, this technology creates a digital replica of the physical system executed in real-time or faster, which can modify hardware controls. However, its advantage stems from distributing computational efforts by utilizing a hierarchical structure composed of lower-level digital twin blocks and a higher-level system digital twin. Each digital twin block is associated with a physical subsystem of the hardware and communicates with a singular system digital twin, which creates a system-level response. By extracting information from each level of the hierarchy, power system controls of the hardware were reconfigured autonomously. This hierarchical digital twin development offers several advantages over other digital twins, particularly in the field of naval power systems. The hierarchical structure allows for greater computational efficiency and scalability while the ability to autonomously reconfigure hardware controls offers increased flexibility and responsiveness. The hierarchical decomposition and models utilized were well aligned with the physical twin, as indicated by the maximum deviations between the developed digital twin hierarchy and the hardware.
A Hybrid Approach For Phishing Website Detection Using Machine Learning.
VIVA-Tech International Journal for Research and Innovation, Volume 1, Issue 4 (2021)
ISSN (Online): 2581-7280, Article No. X, PP XX-XX
VIVA Institute of Technology
9th National Conference on Role of Engineers in Nation Building – 2021 (NCRENB-2021)
www.viva-technology.org/New/IJRI
Harsh Kansagara¹, Vandan Raval², Faiz Shaikh³, Prof. Saniket Kudoo⁴
¹ Department of Computer Engineering, Mumbai University, India. Email: 17301081harsh@viva-technology.org
² Department of Computer Engineering, Mumbai University, India. Email: 17301044vandan@viva-technology.org
³ Department of Computer Engineering, Mumbai University, India. Email: 17305018faiz@viva-technology.org
⁴ Department of Computer Engineering, Mumbai University, India. Email: saniketkudoo@viva-technology.org
Abstract: In this technical age there are many ways an attacker can illegitimately gain access to people’s sensitive information. One of these is phishing: the practice of misleading people into submitting their sensitive information to fraudulent websites that look like the real ones. The phisher’s aim is to steal personal information, bank details, and other credentials. It is becoming increasingly risky to enter personal information on websites, for fear that the site is a phishing page that will steal that information. Phishing website detection is therefore necessary to alert the user and block the website. Automated detection of phishing attacks is needed, and machine learning is one of the more efficient techniques for it, as it removes the drawbacks of existing approaches. An efficient machine learning model combined with a content-based approach proves very effective at detecting phishing websites.
Our proposed system uses a hybrid approach that combines a machine learning based method and a content based method. URL-based features are extracted and passed to a machine learning model, while in the content based approach the TF-IDF algorithm detects a phishing website using the top keywords of the web page. This hybrid approach is used to achieve a highly efficient result. Finally, the system notifies and alerts the user whether the website is phishing or legitimate.
Keywords- Content-based approach, Machine learning, Phishing detection, Random Forest, TF-IDF.
I. INTRODUCTION
In this technical era, people are interconnected through the internet by means of electronic devices such as computers and laptops. One of the major cyber threats today is phishing, where attackers use illegitimate websites to obtain victims’ credentials and misuse them. Most phishing attacks work by creating a phony version of the real site’s web interface to win the user’s trust. Even for cautious users it is sometimes difficult to detect a phishing attack, because the attackers gain trust by using the same user interface, an almost identical URL, and cloned websites.
Machine learning gives a system the ability to learn and improve from experience automatically, without being explicitly programmed. In our algorithm, learning begins with a training dataset, looking for patterns in the data in order to make better decisions in the future. The main benefit of the algorithm is that it allows the system to learn and decide automatically whether a website is phishing or legitimate. In machine learning, a larger amount of data significantly increases the accuracy of the model, so the phishing website predictor will be trained on a larger dataset for better accuracy.
Random forest is a machine learning algorithm used for both classification and regression, though it is mostly used for classification problems. A forest is made up of many trees, and more trees mean a more robust forest. Random forest is an ensemble method built from many trees, so it performs better than a single
decision tree. Ensembles improve performance using a divide-and-conquer approach: the main principle of ensemble methods is that a group of weak learners can together form a strong learner. Hence, we will use the Random Forest algorithm for classification based on URL features.
There are various types of phishing attacks, used by attackers across different domains for different purposes. Since phishing is such a widespread problem in the cyber-security domain, phishing website detection is a necessity.
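The weak-learners-into-a-strong-learner principle above can be sketched as a toy majority vote in Python. The three "stump" classifiers and their thresholds below are invented purely for illustration; they are not the features or model of the proposed system:

```python
from collections import Counter

# Three deliberately weak "stump" classifiers, each looking at one crude
# URL signal. Names and thresholds are illustrative only.
def stump_length(url):      # unusually long URLs are suspicious
    return "phishing" if len(url) > 54 else "legitimate"

def stump_at_symbol(url):   # an '@' in a URL hides the real destination
    return "phishing" if "@" in url else "legitimate"

def stump_dots(url):        # many dots suggest deceptive subdomains
    return "phishing" if url.count(".") > 3 else "legitimate"

def majority_vote(url):
    """The ensemble's answer is the majority label of the weak learners."""
    votes = [stump_length(url), stump_at_symbol(url), stump_dots(url)]
    return Counter(votes).most_common(1)[0][0]

print(majority_vote("http://paypal.com.secure-login.example.tk/@verify"))  # phishing
print(majority_vote("https://example.com/"))  # legitimate
```

Note that in the first example the length stump alone votes "legitimate" but is outvoted: the ensemble is stronger than any single weak learner, which is the point of the divide-and-conquer principle.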
II. RELATED WORKS
Peng Yang, Guangzhen Zhao and Peng Zeng [1] have proposed a multidimensional-feature phishing detection approach based on deep learning. They use the character sequence features of the given URL for quick classification by deep learning, then combine URL statistical features, webpage code features, webpage text features, and the quick deep-learning classification result into multidimensional features. The deep learning algorithm used for training and testing was a CNN-LSTM model. The classifier used in the multidimensional-feature algorithm is the XGBoost (eXtreme Gradient Boosting) ensemble learning algorithm, which has high classification accuracy. The approach can limit detection time by setting a threshold. They tested on a dataset containing millions of phishing and legitimate URLs, and the experimental outcomes show that by reasonably adjusting the threshold, detection efficiency can be improved.
Mohammed Hazim Alkawaz, Stephanie Joanne Steven and Asif Iqbal Hajamydeen [2] have designed software demonstrating an extensive level of functionality and monitoring features. The system offers many features in comparison with other software, such as capturing blacklisted URLs directly from the browser to verify the validity of a website, notifying the user through a pop-up when they try to access a blacklisted website, and also notifying them by email. The system thus alerts users whenever they try to access a blacklisted website.
Huaping Yuan, Xu Chen, Yukun Li, Zhenguo Yang and Wenyin Liu [3] propose extracting features from URLs and webpage links to detect phishing websites and their targets. In addition to the basic features of a given URL, such as suspicious characters, length and number of dots, a feature matrix is built from these basic features of the links in the URL’s webpage. They then extract statistical features from each column of the feature matrix, such as mean, median and variance. Lexical features are also extracted from the given URL and from the links and content of the webpage, such as the title and textual content. A number of ML models have been explored for phishing detection, among which the Deep Forest model shows competitive performance, achieving a true positive rate of 98.3% and a false alarm rate of 2.6%. In particular, they designed an effective strategy based on search operators via search engines to find the phishing targets, which attains an accuracy of 93.98%.
F.C. Dalgic, A.S. Bozkir and M. Aydos [4] have proposed a vision-based brand prediction approach that aims to build a scalable and robust phishing web page detection and recognition system. They propose the use of several MPEG-7 and MPEG-7-like compact color descriptors. Along with being invariant to input image size, they also recommend extracting and using color-based information via a coarse-to-fine multi-level spatial patch pyramid. The recommended approach presents a lightweight schema offering competitive accuracy and high feature extraction and inference speed, and can be used as a browser plugin or a mobile phishing protector.
Srushti Patil and Sudhir Dhage [5] have reviewed various anti-phishing approaches. All methods are discussed to give a clear picture of existing techniques, their limitations and possible improvements. They then analyze the freely available anti-phishing tools in use, describe the most important steps in building an efficient anti-phishing model with the help of an architecture diagram, and finally compare models across all five types of approaches based on the number of features used, accuracy, and dataset size.
S. Parekh, D. Parikh, S. Kotak and S. Sankhe [6] have created a solution that detects phishing websites through URL analysis using the Random Forest algorithm. Their model has three major phases, Parsing, Heuristic Classification of data, and Performance Analysis, and each phase uses a different algorithm or technique to process the data for better results. They also used RStudio for better analysis. The dataset, taken from PhishTank, was split 70/30: 70% of the data was used for training and 30% for testing. They achieved 95% accuracy, and thus Random Forest was chosen for classification.
Jian Mao, Wenqian Tian, Pei Li, Tao Wei and Zhenkai Liang [7] have proposed a robust phishing detection approach, Phishing-Alarm, based on the CSS features of web pages. They identify CSS features using different methods, along with algorithms to efficiently evaluate page similarity. They prototyped Phishing-Alarm as an extension to the Google Chrome browser and demonstrated its effectiveness in evaluations using real-world phishing samples.
T. Nathezhtha, D. Sangeetha and V. Vaidehi [8] have proposed an attack detection method named Web Crawler based Phishing Attack Detector (WC-PAD), a three-step method. It takes web traffic and the URL as input features and, based on feature classification, classifies a website as phishing or non-phishing. First, DNS-blacklist-based detection is performed; next, web-crawler-based detection; and finally, heuristics-based detection. The trial analysis of the proposed WC-PAD was done with datasets containing actual phishing cases. The implementation results give 98.9% accuracy for the proposed approach in both phishing and zero-day phishing attack detection.
Altyeb Altaher [9] has proposed a hybrid approach for classifying websites as phishing, legitimate or suspicious, combining the K-nearest neighbors (KNN) algorithm with the support vector machine (SVM) algorithm. The approach combines the effectiveness and simplicity of KNN with SVM, gaining the advantages of both while avoiding the drawbacks each has when used separately. The experimental results give an accuracy of 90.04% for the proposed KNN-with-SVM approach.
S. Haruta, H. Asahina and I. Sasase [10] have proposed a visual-similarity-based phishing detection scheme using CSS and images. They focus on the fact that attackers often steal a legitimate website’s CSS to mimic it, and they regard websites that have hyperlinks from other websites as legitimate. By detecting a website that has a similar appearance to a legitimate website or that plagiarizes its CSS, they detect the phishing website and its target simultaneously. Moreover, by using CSS they alleviate the shortcoming of locally different images, which causes false negatives in the conventional scheme. Computer simulation shows their scheme achieves about 80% detection accuracy and finds 72.1% of well-known phishing targets.
O. Christou, N. Pitropakis, P. Papadopoulos, S. McKeown and W. Buchanan [11] have proposed a system in which malicious domain name datasets are used to make predictions about their nature using the Random Forest and SVM algorithms. It uses an automated filtration system that provides a list of possibly malicious URLs, narrowing the list down and reducing the human input required to spot them. Machine learning techniques use features extracted from the URLs and their DNS data to analyse and detect whether they are malicious or legitimate.
S. Roopak, A. P. Vijayaraghavan and T. Thomas [12] have extracted relevant rules based on webpage source code and Secure Socket Layer (SSL) features from a training dataset using the Repeated Incremental Pruning to Produce Error Reduction (RIPPER) algorithm, and then check for the presence of these rules in a test dataset. Their experimental results show that this approach can identify phishing websites with an accuracy of 0.92.
H. Yuan, X. Chen, Y. Li, Z. Yang and W. Liu [13] propose extracting features from URLs and webpage links to detect phishing websites and their targets. They extract statistical features from each column of the feature matrix, such as mean, median and variance. Lexical features are also extracted from the given URL, and the content and links of its webpage, such as the title and textual content, are extracted as well. A number of machine learning models were explored for phishing detection, among which the Deep Forest model showed competitive performance, achieving a true positive rate of 98.3% and a false alarm rate of 2.6%. They designed an effective strategy based on search operators to find phishing URLs, which achieved an accuracy of 93.98%.
M. Sameen, K. Han and S. O. Hwang [14] have designed an ensemble machine-learning-based detection system called PhishHaven, which identifies both AI-generated and human-generated phishing URLs. Their technique uses lexical analysis for feature extraction. To further enhance the lexical analysis, they introduced three classification procedures. First, URL HTML encoding helps classify URLs on the fly and proactively, in contrast with some existing methods. Second, a URL-hit approach deals with tiny URLs, an open problem yet to be solved. Lastly, the final classification of URLs is made by an unbiased voting mechanism in PhishHaven, which aims to avoid misclassification when the number of votes is equal. Using a benchmark dataset of 100,000 phishing and normal URLs, they achieved 98.00% accuracy, outperforming existing systems.
H. Chapla, R. Kotak and M. Joiser [15] designed a framework for phishing detection using the URL and fuzzy logic as the classifier. They used MATLAB to code a program that extracts features from an entered URL. From the available feature sets, they extracted features based on the website URL, using 11 features for the classification of phishing websites. The fuzzy classifier implemented in MATLAB achieved an accuracy of 91.46%.
III. METHODOLOGY
Phishing websites can easily trick us into submitting private or financial information into their systems, causing huge personal and financial losses. The aim of our project is to correctly classify websites as phishing or legitimate.
Our system will use the Random Forest algorithm and the TF-IDF approach for detecting phishing websites. Some websites can easily be classified as phishing just by looking at them, while others need deeper analysis. We will carefully extract certain features from URLs to be used as training and testing inputs. These inputs will be divided into four arrays: training input, training output, testing input, and testing output. The training input array will be used to train our Random Forest classifier; we will then use the testing input to test the classifier.
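The URL feature extraction step can be sketched as follows. The paper does not enumerate its exact feature set, so the six features below (length, dot count, '@', hyphenated host, IP-as-host, HTTPS) are an illustrative subset drawn from common phishing-detection features:

```python
import re
from urllib.parse import urlparse

def extract_url_features(url):
    """Turn a raw URL into a numeric feature vector for the classifier.

    Illustrative subset only; the paper's actual feature list is not given.
    """
    parsed = urlparse(url)
    host = parsed.netloc
    return [
        len(url),                                   # overall URL length
        url.count("."),                             # number of dots
        1 if "@" in url else 0,                     # '@' hides the real host
        1 if "-" in host else 0,                    # hyphenated hostname
        1 if re.fullmatch(r"[\d.]+", host) else 0,  # raw IP address as host
        1 if parsed.scheme == "https" else 0,       # HTTPS used
    ]

print(extract_url_features("http://192.168.0.1/login"))  # [24, 3, 0, 0, 1, 0]
```

Vectors like this one would form the training-input and testing-input arrays described above.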
TF-IDF (Term Frequency-Inverse Document Frequency) is a numerical statistic that evaluates the significance of a particular word in a document. It is an information retrieval technique that weighs a term’s frequency (TF) against its inverse document frequency (IDF). Each word or term has its own TF and IDF scores, and the product of a term’s TF and IDF scores is called the TF-IDF weight of that term. We will calculate the TF-IDF scores of the words appearing in the webpage; the words with the highest TF-IDF scores will be passed as a query to the search engine for results.
Figure 1: System Flow Diagram
IV. RESULT
This chapter presents the partial implementation and screenshots of the results. So far, we have implemented
the content-based side of our project. It works as follows: when a URL is fed into the program, it scrapes the
webpage and extracts the domain name of the URL, the title of the webpage, and the three most frequently
used words on the page. Combined, these form a query. The query is fed into the search engine, which returns
the top 10 URLs. A query is then computed for each of these 10 URLs and compared with the original query
of the URL given to the program. If a query matches, the webpage is considered legitimate; otherwise it is
flagged as phishing.
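The content-based check described above can be sketched as follows. This is a simplified sketch under stated assumptions: the scraping and search-engine steps are stubbed out with hypothetical page data, and the query comparison is reduced to a set equality, whereas the real system would scrape live pages and query a real search engine.

```python
# Sketch of the content-based check: build a query from the domain name,
# page title, and top words, then compare it against queries built the same
# way for search-result pages. Page texts here are hypothetical stand-ins
# for scraped content.
from collections import Counter
from urllib.parse import urlparse

def build_query(url, title, page_text, top_n=3):
    """Combine domain, title, and the top_n most frequent words into a query."""
    domain = urlparse(url).netloc
    top_words = [w for w, _ in Counter(page_text.lower().split()).most_common(top_n)]
    return {domain, title.lower(), *top_words}

# Hypothetical suspect page (note the look-alike domain with a digit "1")
suspect_query = build_query(
    "http://examp1e-bank.com/login", "Example Bank Login",
    "login account login bank verify account")

# Hypothetical query built from one of the search-engine results
result_query = build_query(
    "https://www.examplebank.com/", "Example Bank Login",
    "bank login account services account login")

# If no result query matches the suspect's query, flag the page as phishing
is_phishing = suspect_query != result_query
verdict = "phishing" if is_phishing else "legitimate"
```

Here the two queries share the title and top words but differ on the domain, so the look-alike page is flagged; a production system would compare against all 10 retrieved results and could use a softer similarity measure than exact equality.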
V. CONCLUSION
This system aims to provide a safe environment for browsing websites by detecting phishing websites and
keeping the user safe; otherwise, a user might end up handing his credentials to phishers, which can lead to
huge losses. The project implements phishing website detection using machine learning through a hybrid
approach: features of the website are extracted, and the user is notified whether the website is legitimate or
phishing. The system aims to achieve high detection performance.
Acknowledgements
We would like to express a deep sense of gratitude towards our guide, Prof. Saniket Kudoo, Computer
Engineering Department, for his constant encouragement and valuable suggestions. The work we present is
possible because of his timely guidance. We would like to thank the panel of examiners, Prof. Sunita Naik,
Dr. Tatwadarshi P. N., Prof. Dnyaneshwar Bhabad, and Prof. Vinit Raut, for the time and effort they put into
evaluating our work and for their valuable suggestions from time to time. We would like to thank the Project
Head of the Computer Engineering Department, Prof. Janhavi Sangoi, for her support and coordination, and
the Head of the Computer Engineering Department, Prof. Ashwini Save, for her support and coordination. We
are also grateful to the teaching and non-teaching staff of the Computer Engineering Department who lent
their helping hands in providing continuous support.