Web Defacement and Intrusion Monitoring Tool:
WDIMT
Mfundo Masango∗, Francois Mouton†, Palesa Antony ‡ and Bokang Mangoale§
Command, Control and Information Warfare
Defence, Peace, Safety and Security
Council for Scientific and Industrial Research
Pretoria, South Africa
∗Email: gmasango@csir.co.za
†Email: moutonf@gmail.com
‡Email: pantony@csir.co.za
§Email: bmangoale@csir.co.za
Abstract—Websites have become a major channel for distributing information; their use has driven a significant rise in the amount of information circulated on the Internet. Businesses create websites that display the services they render or information about a particular product, and use the Internet to expand business opportunities or advertise their services on a global scale. This applies not only to businesses: other entities such as celebrities, socialites, bloggers and vloggers use the Internet to expand personal or business opportunities too. These entities make use of websites that are hosted by a web host, with the contents of the website stored on a web server. However, not all websites undergo penetration testing, which leaves them vulnerable. Penetration testing is a costly exercise that many companies or website owners find they cannot afford. Web defacement remains one of the most common attacks on websites; these attacks aim to alter the content of web pages or to render the website inactive. This paper proposes a Web Defacement and Intrusion Monitoring Tool, which could be a possible solution for the rapid identification of altered or deleted web pages. The proposed tool has web defacement detection capabilities that may also be used for intrusion detection, and can regenerate the original content of a website after the website has been defaced.
Keywords—Commands, Intrusion Detection, Self-Healing, Web
Defacement, Web Monitoring.
I. INTRODUCTION
The amount of information available on the Internet is
vast. Web pages on the Internet support the dissemination
of vast amounts of information, and render this information
accessible on a global scale through a web browser. A web
page is a document that is displayed in a web browser. This
web page is usually part of a website, which is a collection
of interconnected web pages. A website is hosted on a web
server, which is a computer that hosts a website on the Internet
[1].
The information and content presented on web pages attract a wide audience who are free to use the information
for both benign and malicious purposes. Attackers target
websites for a wide range of malicious purposes, including
but not limited to defacement, personal financial gain, content
manipulation and extraction of protected information [2], [3].
Website defacement is still one of the most common attacks
in cyber space [4]. The defacement of a website occurs when the content of the website, or the visual appearance of a web page, is altered. Such alterations can render the website inactive. Commonly targeted websites include religious and government sites, where hackers express resentment of political or religious views by defacing the website to publicise their own opinions [5]. Figure 1 gives a visual representation of the reported number of websites defaced from 2009 to 2017 [6].
This paper proposes a Web Defacement and Intrusion
Monitoring Tool (WDIMT) that is able to detect defacement,
authorize changes directly and conduct regeneration of a
website’s original content after internal penetration testing has
been conducted. This will allow for the rapid identification
of web pages that have been altered or deleted. Furthermore, the tool is able to rapidly remove any unidentified or intruder files. The WDIMT makes use of the Linux terminal to execute commands that access the tool's capabilities, and provides a web page with a graphical user interface that gives a visual representation of the files currently being monitored. The tool is not only used for monitoring the defacement of a website; it also provides a possible solution to the amount of time it takes to recover the original contents of a web page. Furthermore, the tool has intrusion detection capabilities that allow it to identify any intruder files that have been inserted into a website.
The paper is structured as follows. Section II gives background on tools and techniques proposed in prior research that informed the development of the WDIMT. Section III discusses the structure of the WDIMT, which includes the system architecture, system requirements, system features, system flow diagram and system usage. Section IV discusses the analysis process of the WDIMT. Section V discusses the advantages and limitations of the WDIMT and proposes some scenarios that illustrate its usage. Section VI concludes the paper and discusses potential future work.
2017 International Conference on Cyberworlds
978-0-7695-6215-5/17 $31.00 © 2017 IEEE
DOI 10.1109/CW.2017.55
Figure 1. Defaced websites from the year 2009–2017 [6]
II. BACKGROUND
This section discusses related work which includes “Web
injection attacks”, “Implementing a web browser with web de-
facement detection technique”, “Detection of web defacement
by means of Genetic Programming”, “A Web Vulnerability
Scanner”, and “SQL-injection vulnerability scanning tool for
automatic creation of SQL-injection attacks”.
Medvet et al. [7] observe that “most Web sites lack a systematic surveillance of their integrity and the detection of web defacements is often dependent on occasional checks by administrators or feedback from users”. Different website
defacement tools have been proposed using techniques such as
reaction time sensors and cardinality sensors [7], [8].
A. Web Injection Attacks
Morgan [9] discusses code injection and specifically cross
site scripting. Code injection refers to the malicious insertion
of processing logic to change the behaviour of a website to the
benefit of an attacker. Furthermore, Morgan identified cross site scripting as a very common code injection attack in which malicious scripts are injected into a legitimate website or web application. Mitigating against code insertion is also discussed in the article, which proposes stages for possibly preventing code injection/insertion attacks.
B. Implementing A Web Browser With Web Defacement De-
tection Techniques
Kanti et al. [10] discuss a prototype web browser that can be used to check for the defacement of a website, and also propose a recovery mechanism for defaced pages using a checksum-based approach. The algorithm proposed in the study was used for defacement detection and was implemented in a prototype web browser with inbuilt defacement detection techniques. The web browser would periodically recalculate the checksum of each web page and compare it against the saved hash code or checksum for that page. Upon comparing their algorithm against existing integrity-based web defacement detection methods, they found their algorithm detected approximately 45 times more anomalies than the other methods.
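The core of such a checksum-based detection technique can be sketched as follows. This is an illustrative Python sketch of the general approach, not Kanti et al.'s implementation; the choice of SHA-256 as the hash function is an assumption.

```python
import hashlib

def page_defaced(current_html: bytes, saved_checksum: str) -> bool:
    """Recompute the page's checksum and compare it against the saved
    baseline; a mismatch signals possible defacement."""
    return hashlib.sha256(current_html).hexdigest() != saved_checksum

# Baseline checksum recorded while the page was known to be clean.
baseline = hashlib.sha256(b"<h1>Welcome</h1>").hexdigest()
assert page_defaced(b"<h1>Welcome</h1>", baseline) is False
assert page_defaced(b"<h1>HACKED</h1>", baseline) is True
```

Run periodically, such a check only detects that a page changed; deciding whether the change was authorised is the harder problem the surveyed tools address in different ways.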
C. Detection of Web Defacement By Means of Genetic Pro-
gramming
Medvet et al. [7] discuss how Genetic Programming (GP) was used to establish an evolutionary paradigm for the automatic generation of algorithms for detecting web defacement. GP would build an algorithm based on a sequence of readings of the remote page to be monitored and on a sample set of attacks. A framework proposed prior to that paper was used to prove the concept of a GP-oriented approach to defacement detection: a specialised tool for monitoring a collection of web pages that are typically remote, hosted by different organisations, and whose content, appearance and degree of dynamism were not known beforehand. Medvet et al. conducted tests by executing a learning phase for each web page to construct a profile that would then be used in the monitoring phase. During monitoring, when a reading failed to match the profile created during the learning phase, the tool would raise alerts and send notifications to the administrator.
D. SQL-Injection Vulnerability Scanning Tool For Automatic
Creation of SQL-Injection Attacks
Alostad et al. [11] discuss the development of a web scanning tool with enhanced features that is able to conduct efficient penetration testing on Hypertext Preprocessor (PHP) based websites to detect SQL injection vulnerabilities. The MySQL1Injector tool, as proposed by Alostad et al., automates and simplifies the penetration testing process
(as much as possible). The proposed tool utilises different attacking patterns, vectors and modes for shaping each attack. The tool conducts efficient penetration testing on PHP-based websites in order to detect SQL injection vulnerabilities and subsequently notifies the web developers of each vulnerability that needs to be fixed. The tool houses a total of 10 attacking patterns; if one pattern fails to expose a vulnerability, the others may still succeed in tricking the web server if it is vulnerable. The tool is operated remotely through a client-side interface. The MySQL1Injector tool gives web developers who are not trained in penetration testing an opportunity to conduct penetration testing on their web database servers.
E. A Web Vulnerability Scanner
Kals et al. [12] discuss vulnerabilities that web applications are exposed to as a result of generic input validation problems, identifying SQL injection and cross site scripting (XSS). Furthermore, they demonstrate how easy it is for attackers to automatically discover and exploit application-level vulnerabilities in a large number of web applications. Kals et al. propose a web vulnerability scanner that automatically analyses web sites with the aim of finding exploitable SQL and XSS vulnerabilities. The proposed tool, SecuBat, has three main components, namely a crawling component, an attack component and an analysis module. The crawling component is seeded with a root web address and steps down the link tree, collecting all pages and included web forms. The attack component scans each page for the presence of web forms, extracting the action address and the method (i.e., GET or POST) used to submit the form content. The analysis module parses and interprets the server response after an attack has been launched, using attack-specific criteria and keywords to calculate a confidence value for deciding whether the attack was successful. The tool was tested with both injection attacks by means of a case study.
The proposed WDIMT makes use of the concepts discussed above, which include creating tools that may prevent and detect possible code injections, developing algorithms that incorporate some form of checksum or hashing method to detect web defacement, identifying any web pages that may have been removed by unauthorised personnel, and regenerating the original content of a website that has been defaced. The proposed WDIMT comprises functionalities used by the tools discussed above, such as rapid identification and notification of a defaced web page or website, regeneration of original content, and rapid identification of intruder files and unauthorised changes made to web pages. However, the WDIMT is more client-side dependent, as a user has full control and view of each web page belonging to them. The tool also provides a visual representation of any new files that were added without the user's consent, deleting them automatically, which may minimise the injection of unknown/unauthorised scripts. Manipulated files or web pages are also visually represented, allowing the user to identify affected files more efficiently. The response time for detection and re-uploading of original content is rapid, which may also serve as a preventative measure against a denial-of-service attack on a website. Any file or web page deleted as the result of an attack will be automatically regenerated.
III. WEB DEFACEMENT AND INTRUSION MONITORING
TOOL
The Web Defacement and Intrusion Monitoring Tool (WDIMT) is a tool for website defacement detection. It makes use of a website for displaying the current defacement status of a monitored website, monitors each website's web pages individually, and automatically regenerates defaced or deleted pages once the script is executed in the Linux terminal. The WDIMT can detect any defacement anomalies that have been made to a web page or an entire site.
A. WDIMT System Architecture
The system architecture of the WDIMT is structured into three layers, as seen in Figure 2. The presentation layer represents the different graphical user interfaces used for displaying the user's information and the execution of commands in the Linux terminal; the execution of the WDIMT commands is not limited to a Linux terminal. The business layer is where most of the communication between the database and the presentation layer occurs; this layer controls the communication channels between the presentation layer and the data access layer. The data access layer represents the database, which stores user information and the hash of each web page. How the different layers communicate with each other is displayed in Figure 2.
B. WDIMT Requirements
Using the WDIMT currently requires a user to have a Windows machine and a Linux machine. Both machines and the database were connected to the same network; this was done to allow immediate communication among the two machines and the database. The Linux machine is used to execute the commands, and the Windows machine is used to access the WDIMT web page for registering and viewing a user's data. The communication among the machines and the database is illustrated in Figure 3. A user has the option of using either a desktop computer or a laptop, or mixing the devices used.
The requirements presented below are the bare minimum of the system specifications used when the WDIMT was being developed. The WDIMT requirements are subject to change depending on a user's machine. A user may have system specifications that exceed or fall below the minimum requirements without affecting the performance and functionality of the WDIMT.
The following lists illustrate the minimum requirements for each machine:
Linux Machine:
• Linux Ubuntu v14.04 or later.
• Hypertext Preprocessor (PHP).
• 80GB Hard drive.
• 2GB RAM.
• Intel Core i7 Extreme QUAD CORE 965 3.2GHz.
Windows Machine:
Figure 2. System architecture of the WDIMT.
• Windows 7 or later.
• 80GB Hard drive.
• 2GB RAM.
• Intel Core i7 Extreme QUAD CORE 965 3.2GHz.
Figure 3. Connection between the Windows and Linux Machine.
C. WDIMT Features
The WDIMT makes use of the Linux terminal for executing the script that hashes an entire site's web pages; the hashes are stored in a database. The database is hosted off-site, away from the website's web server. Every time the script is executed in the Linux terminal, the hash of each web page is recalculated individually and compared against the stored hashes. The script is appended with various commands that initialise, verify, force and delete. A user is able to use the WDIMT's website to view the status of each web page, which is represented by a colour schema on the website: green indicating that a web page has not been altered in any form, red indicating that a web page has been removed/deleted, and yellow indicating that a web page has been altered, as seen in Figure 4.
Figure 4. The home page of WDIMT.
1) Initialise: The initialise command is executed with a ‘-i’. This command reads the full folder of a website, hashing each web page found; the stored hash is used at a later stage by the verify command. A copy of each web page is also made and stored in the database that is hosted off-site. The stored copies are retrieved when the original content of a web page is reloaded.
This command sets the status of a file to green and sets the reloading of the original content to a default value where the original content will not be reloaded. The reloading value may be altered by a user on the WDIMT web page.
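The behaviour of the initialise command can be sketched as follows. This is an illustrative Python sketch, not the actual WDIMT PHP script; the `pages` table layout, the use of SQLite as a stand-in for the off-site database, and the choice of SHA-256 are all assumptions.

```python
import hashlib
import sqlite3
from pathlib import Path

def initialise(site_root: str, db_path: str = "wdimt.db") -> int:
    """Walk a website's folder, hash every web page and store a
    baseline copy of each file; returns the number of pages stored.
    (Illustrative stand-in for the off-site WDIMT database.)"""
    con = sqlite3.connect(db_path)
    con.execute(
        """CREATE TABLE IF NOT EXISTS pages (
               path TEXT PRIMARY KEY,
               sha256 TEXT NOT NULL,
               content BLOB NOT NULL,
               status TEXT NOT NULL,    -- green / yellow / red
               reload INTEGER NOT NULL  -- 0 = do not reload original content
           )"""
    )
    count = 0
    for page in Path(site_root).rglob("*"):
        if page.is_file():
            data = page.read_bytes()
            digest = hashlib.sha256(data).hexdigest()
            # New files start green, with the reload flag at its default.
            con.execute(
                "INSERT OR REPLACE INTO pages VALUES (?, ?, ?, 'green', 0)",
                (str(page.relative_to(site_root)), digest, data),
            )
            count += 1
    con.commit()
    con.close()
    return count
```

In this sketch, the stored copy (`content`) is what a later verify or force pass would use to re-upload a page's original content.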
2) Verify: The verify command is executed with a ‘-v’. This command verifies the hashes of the web pages and checks whether any form of defacement has been detected. If the hash of any web page has changed, the user is able to identify the defaced web page both on the Linux terminal (see Figure 11) and on the WDIMT web page (see Figure 5). If a user would like to have the original content of the identified web page reloaded, the user will have to click on the “More Info” link. This link redirects the user to a web page with the functionality of changing a flag on the identified web page. The flag gives a user the option of having the identified web page reloaded with its original content, as seen in Figure 6. After a user has selected that the original content of the web page be reloaded, the verify command will need to be executed once more in the Linux terminal in order for the status of the file to change from yellow to green, indicating that the content of the web page has been reloaded. This command also has the functionality of identifying any intruder files, which are removed immediately. Any file that has been deleted is recreated, and the file's status is identified by the colour red on the WDIMT's web page.
Figure 5. WDIMT web page displaying a file that has been altered.
Figure 6. WDIMT’s web page with flag options.
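The verify behaviour described above can be sketched as follows. This is an illustrative Python sketch, not the actual WDIMT PHP script; it assumes a baseline table `pages(path, sha256, content, status, reload)` holding the hash, an off-site copy, a colour status and the user's reload flag for each page, which is an assumption about the real schema.

```python
import hashlib
import sqlite3
from pathlib import Path

def verify(site_root: str, db_path: str = "wdimt.db") -> dict:
    """Illustrative verify pass: altered pages are flagged yellow
    (and restored only if the user set the reload flag), deleted
    pages are flagged red and recreated from the stored copy, and
    files with no baseline record are removed as intruder files."""
    con = sqlite3.connect(db_path)
    root = Path(site_root)
    report = {"altered": [], "deleted": [], "intruders": []}
    known = set()
    rows = con.execute("SELECT path, sha256, content, reload FROM pages").fetchall()
    for path, digest, content, reload_flag in rows:
        known.add(path)
        target = root / path
        if not target.exists():
            # Deleted page: recreate it and mark its status red.
            target.parent.mkdir(parents=True, exist_ok=True)
            target.write_bytes(content)
            con.execute("UPDATE pages SET status = 'red' WHERE path = ?", (path,))
            report["deleted"].append(path)
        elif hashlib.sha256(target.read_bytes()).hexdigest() != digest:
            if reload_flag:
                # User opted in on the web page: re-upload original content.
                target.write_bytes(content)
                con.execute(
                    "UPDATE pages SET status = 'green', reload = 0 WHERE path = ?",
                    (path,),
                )
            else:
                con.execute("UPDATE pages SET status = 'yellow' WHERE path = ?", (path,))
            report["altered"].append(path)
    for f in root.rglob("*"):
        if f.is_file() and str(f.relative_to(root)) not in known:
            f.unlink()  # intruder file: removed immediately
            report["intruders"].append(str(f.relative_to(root)))
    con.commit()
    con.close()
    return report
```

The two-step restore for altered pages (flag first, then run verify again) mirrors the flag-option workflow shown in Figure 6.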
3) Force: The force command is executed with a ‘-f’. This command overrides the verify command, as it reloads the original content of every web page. Hash comparison is used to identify any altered web page, and the comparison process is executed within this command. Any intruder files that have been identified are removed immediately. Files that have been deleted are regenerated, and all file statuses are set to green. The command does not require a user to have the reload option set, as the command bypasses any option a user may have selected.
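The force behaviour can be sketched as an unconditional restore. As before, this is an illustrative Python sketch, not the actual PHP script, and the baseline table `pages(path, sha256, content, status, reload)` is an assumed schema.

```python
import sqlite3
from pathlib import Path

def force(site_root: str, db_path: str = "wdimt.db") -> None:
    """Illustrative force pass: restore every monitored page from its
    stored baseline copy, ignoring any reload flag, remove files with
    no baseline record, and reset all statuses to green."""
    con = sqlite3.connect(db_path)
    root = Path(site_root)
    known = set()
    for path, content in con.execute("SELECT path, content FROM pages").fetchall():
        known.add(path)
        target = root / path
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_bytes(content)  # overwrite with the original content
    for f in root.rglob("*"):
        if f.is_file() and str(f.relative_to(root)) not in known:
            f.unlink()  # intruder file: removed immediately
    con.execute("UPDATE pages SET status = 'green', reload = 0")
    con.commit()
    con.close()
```

Compared with the verify sketch, the only design difference is that no per-page flag is consulted: every page comes back in its original state.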
4) Delete: The delete command is executed with a ‘-d’. This command deletes all the web pages that belong to a particular website that a user has specified. It identifies the website's full folder path that the user has provided and deletes all the content that has been stored, including the copies and web page hashes that were generated when the initialise command was first executed. This clears all instances in the database. When a user then accesses the WDIMT's web page, no files will be displayed to them, as all instances have been deleted.
D. WDIMT Flow Diagram
The commands follow a sequential flow, with initialise being the first step required, as the user will be registered and will provide a path to the web pages that need to be monitored. Once the files have been successfully hashed, copied and stored, the user is able to view the files they have uploaded on the WDIMT web page. The verify command may be executed once a user has files uploaded. This command checks each individual web page for defacement; once defacement has been detected, the web page's original content may be re-uploaded. A user may prefer to delete their content from the WDIMT; the delete command deletes all the web pages that belong to a user. Furthermore, the force command executes the delete command followed by the initialise command. The flow diagram of the commands is seen in Figure 7.
Figure 7. Flow diagram of command execution in the WDIMT.
E. WDIMT Usage
A user is required to use the Linux terminal, where the user provides the full path to the location of a website's web pages to be monitored. The Linux terminal is also used for executing the WDIMT script, which is run with additional commands for initialising, verifying, forcing and deleting the web pages belonging to the user.
After successful registration on the WDIMT website, a user needs to use a Linux terminal to execute the script with the initialise command, providing the full path to the location of the website's web pages, as seen in Figure 8.
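The terminal invocations take the following general shape; the script name `wdimt.php` and the example path are placeholders (the actual invocation is the one shown in Figure 8):

```shell
php wdimt.php -i /var/www/html/mysite   # initialise: hash and copy every web page
php wdimt.php -v /var/www/html/mysite   # verify: detect altered, deleted and intruder files
php wdimt.php -f /var/www/html/mysite   # force: reload every page's original content
php wdimt.php -d /var/www/html/mysite   # delete: remove all stored hashes and copies
```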
IV. WDIMT ANALYSIS
This section analyses the data generated when the WDIMT's PHP script is executed in the Linux terminal. The script is executed with the different commands discussed in the previous section. The following subsections identify the different outputs generated when the script is executed with the verify command.
Figure 8. Linux terminal commands.
A. WDIMT Script
The WDIMT's PHP script that is executed in the Linux terminal has a number of functions. Most of these functions are executed once the verify command has been instantiated. The script is executed with the different options discussed in the previous section. The options invoke different methods and functions within the script; one function, which hashes all the files and web pages, is shown in Figure 9.
Figure 9. Code snippet of a function in the WDIMT script.
B. Intruder Files
Files that have been identified as intruder files are removed immediately when the verify command is executed, possibly minimising the chance of intruder files going undetected. A user is able to identify these intruder files on the Linux terminal once the verify command has been executed, as seen in Figure 10.
C. Changed Files
Web pages and files that have been altered by unauthorised personnel are identified on the Linux terminal once the verify command has been executed, as seen in Figure 11.
The altered web pages and files are only re-uploaded with their original content once a user has changed the flag option of the affected web page or file on the WDIMT's web page and executed the verify command again.
Figure 10. Identified intruder files.
Figure 11. Identified changed files.
D. Removed Files
Web pages and files that have been removed by unautho-
rised personnel are identified and regenerated in the respective
directory where the file is located.
On the Linux terminal, the user is able to identify these files and also confirm that they have been regenerated in the original path where each file was located, possibly minimising the chances of the website being unavailable, as seen in Figure 12.
Figure 12. Identified removed files.
V. DISCUSSION
Numerous web defacement monitoring tools have been developed, offering different web defacement detection techniques, some of which are used in real-life scenarios. The WDIMT automates swift detection of defacement, swift notification of defacement, and swift re-uploading of a web page's original content.
The strength of the WDIMT lies in its swift detection of defacement across an entire website's web pages, identifying each defaced web page individually and swiftly notifying a user which web pages have been defaced. The WDIMT rapidly re-uploads the original content of a defaced web page, visually identifies which web pages were defaced, and provides a user with flag options that indicate whether a web page's original content should be re-uploaded or left in its current defaced state. The WDIMT's availability is also a strength, as a user is able to identify the affected web pages on the Linux terminal without accessing the WDIMT website.
The WDIMT has some known limitations. For example, the usage of the Linux terminal for executing the commands associated with the WDIMT requires a user to have a computer with the Linux operating system installed, which limits users that only have the Windows operating system installed on their computers. The Linux terminal commands also do not require a user to authenticate themselves before executing a command; this may raise security concerns, as it is assumed that the person executing the commands is the website's owner. Furthermore, executing the commands requires manual intervention rather than being automated.
Further development of the WDIMT would improve the tool's swift defacement detection, notifications and re-uploading of original content. The following subsections provide scenarios that illustrate use cases of the WDIMT. These scenarios show that the proposed WDIMT has capabilities beyond just monitoring a user's web pages: regeneration of web pages after penetration testing has been conducted, and allowing users to visually identify the affected web pages, giving full control over re-uploading of each web page's original content.
A. Detection of Intruder Files
A company has a website that advertises the services it offers. The website's administrator has access to some of the company's confidential information, which is accessed through SQL commands. An intruder could insert a file that accesses or duplicates the SQL commands, possibly giving the intruder administrative rights.
The intruder's file could be inserted deep into the website's folder structure, where it may not be visible when manually searching through the folders. The tool can be used to identify the intruder file and immediately remove it.
B. Detection of Altered Web Pages
The local government elections are about to commence in the country. Parties involved in the elections could possibly sabotage another party's website, which contains the party's campaign information. A party could possibly hire a hacker to alter the opposition party's website.
The hacker could possibly alter all web pages to give false information about the party. The tool can be used to identify the web pages that have been altered by the hacker and re-upload the original content of the pages that were altered.
C. Detection of Deleted Web Pages
A web developer who was previously employed by a company for web development services is bitter after dismissal. The web developer gained access to the company's website and deleted some web pages, thus making the website inactive.
The inactive website could possibly cost the company an undisclosed amount of money. The tool could be used to identify the deleted web pages, swiftly regenerate them and re-upload each web page's original content.
VI. CONCLUSION
Accessing the Internet has been made easier with the advancement of technology, and the volume of information distributed on the Internet may increase daily. With this information being accessed daily, it is unlikely that everyone making use of the Internet is using it for legal purposes.
The WDIMT is a web defacement detection tool that monitors web pages for defacement. It makes use of the Linux terminal for executing commands and of a web page that visually represents all web pages belonging to a user. Upon execution of commands in the Linux terminal, the WDIMT is able to detect any defacement that may have occurred or is currently in progress. If any defacement is detected, it is visually represented on the WDIMT's web page, which gives a user the option to identify a defaced web page and have its original content re-uploaded. Swift re-uploading of a web page's original content is one strong point of the WDIMT. The proposed tool aligns well with the scenarios identified in the previous section, as the tool's full capacity and all its features are used in fulfilment of the identified scenarios.
Future work will be done on the WDIMT to allow the commands to be executable on a Windows machine. An impact study will be done using data gathered from the usage of the WDIMT; this data may be used for analytic purposes that assist in identifying which web pages are commonly defaced. As a further avenue of research, it would be useful for a user to have a mobile application that offers the same functionality as the WDIMT's web page.
REFERENCES
[1] Lady Ninja86. (2016, Dec.) What is the differ-
ence between webpage, website, web server, and
search engine? Mozilla Developer Network. [Online].
Available: https://developer.mozilla.org/en-US/docs/Learn/Common-
questions/Pages-sites-servers-and-search-engines
[2] T. Perez. (2015) Why websites get hacked. Sucuri Inc.
[Online]. Available: https://blog.sucuri.net/2015/02/why-websites-get-
hacked.html
[3] G. Davanzo, E. Medvet, and A. Bartoli, “Anomaly detection techniques for a web defacement monitoring service,” Expert Systems with Applications, vol. 38, no. 10, pp. 12521–12530, 2011.
[4] J. Lyon. (2014) What are the 5 most common attacks on websites?
Quora. [Online]. Available: https://www.quora.com/What-are-the-5-
most-common-attacks-on-websites
[5] Cybercrime.org.za. (2016) Website defacement definition. ISC
AFRICA. [Online]. Available: https://cybercrime.org.za/website-
defacement
[6] Zone-H. (2017, May) zone-h.org. [Online]. Available: http://www.zone-h.org/stats/ymd
[7] E. Medvet, C. Fillon, and A. Bartoli, “Detection of web defacements by
means of genetic programming,” in Information Assurance and Security,
2007. IAS 2007. Third International Symposium on. IEEE, 2007, pp.
227–234.
[8] A. Bartoli, G. Davanzo, and E. Medvet, “The reaction time to web site
defacements,” IEEE Internet Computing, vol. 13, no. 4, 2009.
[9] D. Morgan, “Web injection attacks,” Network Security, vol. 2006, no. 3,
pp. 8–10, 2006.
[10] T. Kanti, V. Richariya, and V. Richariya, “Implementing a web browser
with web defacement detection techniques,” World of Computer Science
and Information Technology Journal (WCSIT), vol. 1, no. 7, pp. 307–
310, 2011.
[11] A. B. M. Ali, M. S. Abdullah, J. Alostad et al., “Sql-injection vul-
nerability scanning tool for automatic creation of sql-injection attacks,”
Procedia Computer Science, vol. 3, pp. 453–458, 2011.
[12] S. Kals, E. Kirda, C. Kruegel, and N. Jovanovic, “Secubat:
A web vulnerability scanner,” in Proceedings of the 15th
International Conference on World Wide Web, ser. WWW ’06.
New York, NY, USA: ACM, 2006, pp. 247–256. [Online]. Available:
http://doi.acm.org/10.1145/1135777.1135817

I. INTRODUCTION

The amount of information available on the Internet is vast. Web pages support the dissemination of this information and render it accessible on a global scale through a web browser. A web page is a document displayed in a web browser, usually as part of a website: a collection of interconnected web pages. A website is hosted on a web server, a computer that hosts the website on the Internet [1].

The information and content presented on web pages attract a wide audience who are free to use the information for both benign and malicious purposes. Attackers target websites for a wide range of malicious purposes, including but not limited to defacement, personal financial gain, content manipulation and extraction of protected information [2], [3]. Website defacement is still one of the most common attacks in cyber space [4]. A website is defaced when its content or a web page is visually altered; the alteration could also render the website inactive. Commonly targeted websites include religious and government sites, where hackers display resentment of political or religious views, defacing the website to broadcast their own opinion [5]. Figure 1 gives a visual representation of the reported number of websites that were defaced from 2009 to 2017 [6].

This paper proposes a Web Defacement and Intrusion Monitoring Tool (WDIMT) that is able to detect defacement, authorize changes directly and regenerate a website's original content after internal penetration testing has been conducted. This allows for the rapid identification of web pages that have been altered or deleted. Furthermore, the tool is able to rapidly remove any unidentified or intruder files.
The WDIMT makes use of the Linux terminal for the execution of commands that access the tool's capabilities. A web page with a graphical user interface gives a visual representation of the files that are currently being monitored. The tool is not only used for monitoring the defacement of a website; it also provides a possible solution to the amount of time it takes to recover the original contents of a web page. Furthermore, the tool has intrusion detection capabilities which allow it to identify any intruder files that have been inserted into a website.

The paper is structured as follows. Section II gives background on tools and techniques that were proposed prior to the development of the WDIMT. Section III discusses the structure of the WDIMT, which includes the system architecture, system requirements, system features, system flow diagram and the system usage. Section IV discusses the analysis process of the WDIMT. Section V discusses the advantages and limitations of the WDIMT and proposes some scenarios which illustrate its usage. Section VI concludes the paper and discusses potential future work.

2017 International Conference on Cyberworlds, 978-0-7695-6215-5/17 $31.00 © 2017 IEEE, DOI 10.1109/CW.2017.55
Figure 1. Defaced websites from the year 2009–2017 [6]

II. BACKGROUND

This section discusses related work, which includes “Web injection attacks”, “Implementing a web browser with web defacement detection techniques”, “Detection of web defacement by means of Genetic Programming”, “A web vulnerability scanner”, and “SQL-injection vulnerability scanning tool for automatic creation of SQL-injection attacks”. Medvet et al. [7] observe that “Most Web sites lack a systematic surveillance of their integrity and the detection of web defacements is often dependent on occasional checks by administrators or feedback from users”. Different website defacement tools have been proposed using techniques such as reaction time sensors and cardinality sensors [7], [8].

A. Web Injection Attacks

Morgan [9] discusses code injection and specifically cross-site scripting. Code injection refers to the malicious insertion of processing logic to change the behaviour of a website to the benefit of an attacker. Furthermore, Morgan identifies cross-site scripting as a very common code injection attack that executes malicious scripts in a legitimate website or web application. Mitigation is also discussed in the article, which proposes stages for preventing code injection/insertion attacks.

B. Implementing a Web Browser with Web Defacement Detection Techniques

Kanti et al. [10] discuss a prototype web browser which can be used to check the defacement of a website, also proposing a recovery mechanism for defaced pages using a checksum-based approach. The proposed algorithm was implemented in a prototype web browser with inbuilt defacement detection techniques; the browser would periodically recalculate each web page's checksum and compare it against the saved hash code or checksum for that page.
Upon comparing their algorithm against existing integrity-based web defacement detection methods, they found it detected approximately 45 times more anomalies than the other methods.

C. Detection of Web Defacement by Means of Genetic Programming

Medvet et al. [7] discuss how Genetic Programming (GP) was used to establish an evolutionary paradigm for the automatic generation of web defacement detection algorithms. GP would build an algorithm based on a sequence of readings of the remote page to be monitored and on a sample set of attacks. A framework proposed prior to their paper was used to prove the concept of a GP-oriented approach to defacement detection: a specialised tool for monitoring a collection of web pages that are typically remote, hosted by different organisations, and whose content, appearance and degree of dynamism were not known beforehand. Medvet et al. conducted tests by executing a learning phase for each web page to construct a profile for use in the monitoring phase. During monitoring, when a reading failed to match the profile created during the learning phase, the tool would raise alerts and send notifications to the administrator.

D. SQL-Injection Vulnerability Scanning Tool for Automatic Creation of SQL-Injection Attacks

Alostad et al. [11] discuss the development of a web scanning tool with enhanced features, able to conduct efficient penetration testing on Hypertext Preprocessor (PHP) based websites to detect SQL injection vulnerabilities. The MySQL1Injector tool, as proposed by Alostad et al., automates and simplifies the penetration testing process as much as possible. The proposed tool utilises different attacking patterns, vectors and modes for shaping each attack, and subsequently notifies web developers of each vulnerability that needs to be fixed. The tool houses a total of 10 attacking patterns; if one pattern fails to expose a vulnerability in a vulnerable web server, another may succeed. The tool is operated remotely through a client-side interface, giving web developers who are not trained in penetration testing an opportunity to conduct penetration testing on their web database servers.

E. A Web Vulnerability Scanner

Kals et al. [12] discuss vulnerabilities that web applications are exposed to as a result of generic input validation problems, identifying SQL injection and cross-site scripting (XSS). Furthermore, they demonstrate how easy it is for attackers to automatically discover and exploit application-level vulnerabilities in a large number of web applications. Kals et al. propose a web vulnerability scanner, SecuBat, that automatically analyses web sites with the aim of finding exploitable SQL injection and XSS vulnerabilities. SecuBat has three main components: a crawling component, an attack component and an analysis module. The crawling component is seeded with a root web address and steps down the link tree, collecting all pages and included web forms. The attack component scans each page for the presence of web forms, extracting the action address and the method (i.e., GET or POST) used to submit the form content. The analysis module parses and interprets the server response after an attack has been launched, using attack-specific criteria and keywords to calculate a confidence value for deciding whether the attack was successful.
The tool was tested with both types of injection attack by means of a case study.

The proposed WDIMT makes use of the concepts discussed above, which include creating tools that may prevent and detect possible code injections, developing algorithms that incorporate some form of checksum or hashing to detect web defacement, identifying any web pages that may have been removed by unauthorised personnel, and regenerating the original content of a website that has been defaced. The WDIMT comprises functionalities used by the tools discussed above, such as rapid identification and notification of a defaced web page or website, regeneration of original content, and rapid identification of intruder files and unauthorised changes made to web pages. However, the WDIMT is more client-side oriented, as a user has full control and view of each web page belonging to them. The tool also provides a visual representation of any new files that were added without the user's consent, deleting them automatically, which may minimise the injection of unknown or unauthorised scripts. Manipulated files or web pages are also visually represented, allowing the user to identify affected files more efficiently. The response time for detection and re-uploading of original content is rapid, which may also serve as a preventative measure against a denial-of-service attack on a website. Any file or web page deleted as the result of an attack will be automatically regenerated.

III. WEB DEFACEMENT AND INTRUSION MONITORING TOOL

The Web Defacement and Intrusion Monitoring Tool (WDIMT) is used for website defacement detection. It makes use of a website for displaying the current defacement status of a monitored site, monitors each website's web pages individually, and automatically regenerates defaced or deleted pages once the script is executed in the Linux terminal.
The WDIMT can detect any defacement anomalies that have been made to a web page or an entire site.

A. WDIMT System Architecture

The system architecture of the WDIMT is structured into three layers, as seen in Figure 2. The presentation layer represents the different graphical user interfaces used for displaying the user's information and for the execution of commands in the Linux terminal (the execution of WDIMT commands is not limited to a Linux terminal). The business layer is where most of the communication between the database and the presentation layer occurs; this layer controls the communication channels between the presentation layer and the data access layer. The data access layer represents the database, which stores user information and the hash of each web page. How the different layers communicate with each other is displayed in Figure 2.

B. WDIMT Requirements

Using the WDIMT currently requires a user to have a Windows machine and a Linux machine. Both machines and the database were connected to the same network; this was done for immediate communication between the two machines and the database. The Linux machine is used to execute the commands and the Windows machine is used to access the WDIMT web page for registering and viewing a user's data. The communication between the machines and the database is illustrated in Figure 3. A user has the option of using either a desktop computer or a laptop, or mixing the devices used.

The requirements presented below are the bare minimum of the system specifications used when the WDIMT was being developed. The WDIMT requirements are subject to change depending on a user's machine. A user may have system specifications that exceed or fall below the minimum requirements, and this will not affect the performance and functionality of the WDIMT. The minimum requirements for each machine are:

Linux Machine:
• Linux Ubuntu v14.04 or later.
• Hypertext Preprocessor (PHP).
• 80GB hard drive.
• 2GB RAM.
• Intel Core i7 Extreme QUAD CORE 965 3.2GHz.

Windows Machine:
• Windows 7 or later.
• 80GB hard drive.
• 2GB RAM.
• Intel Core i7 Extreme QUAD CORE 965 3.2GHz.

Figure 2. System architecture of the WDIMT.

Figure 3. Connection between the Windows and Linux machine.

C. WDIMT Features

The WDIMT makes use of the Linux terminal for executing the script that hashes an entire site's web pages; the hashes are stored in a database hosted off-site, away from the website's web server. Every time the script is executed, the hash of each web page is recalculated individually and compared against the stored hashes. The script is appended with various commands that initialise, verify, force and delete. A user is able to use the WDIMT's website to view the status of each web page, represented by a colour scheme: green indicating that a web page has not been altered in any form, red indicating that a web page has been removed/deleted, and yellow indicating that a web page has been altered, as seen in Figure 4.

Figure 4. The home page of WDIMT.

1) Initialise: The initialise command is executed with ‘-i’. This command reads the full folder of a website, hashing each web page found; the stored hash is used at a later stage by the verify command. A copy of each web page is also made and stored in the off-site database. The stored copies are retrieved when the original content of a web page is reloaded. This command sets the status of a file to green and sets the reloading of the original content to a default value where the original content will not be reloaded. The reloading value may be altered by a user on the WDIMT web page.

2) Verify: The verify command is executed with ‘-v’. This command verifies the hashes of the web pages and checks whether any form of defacement has been detected.
If the hash of any web page has changed, the user will be able to identify the defaced web page both on the Linux terminal and on the WDIMT web page, as seen in Figure 11. If a user would like to have the original content of the identified web page reloaded, the user must click the “More Info” link. This link redirects the user to a web page with the functionality of changing a flag on the identified web page. The flag gives a user the option of having the identified web page reloaded with its original content, as seen in Figure 6. After a user has selected that the original content of the web page be reloaded, the verify command needs to be executed once more in the Linux terminal in order for the status of the file to change from yellow to green, indicating that the content of the web page has been reloaded. This command also identifies any intruder files, which are removed immediately. Any file that has been deleted is recreated, and the file's status is identified by the colour red on the WDIMT's web page.

Figure 5. WDIMT web page displaying a file that has been altered.

Figure 6. WDIMT's web page with flag options.

3) Force: The force command is executed with ‘-f’. This command overrides the verify command, as it reloads the original content of every web page. The comparison of hashes used to identify altered web pages is executed within this command. Any intruder files that are identified will be removed immediately. Files that have been deleted will be regenerated, and all file statuses will be set to green. The command does not require a user to have the reload option set, as it bypasses any option a user may have selected.

4) Delete: The delete command is executed with ‘-d’. This command deletes all the web pages that belong to a particular website that a user has specified. It identifies the website's full folder path, as provided by the user, and deletes all the content that has been stored, including the copies and web page hashes that were generated when the initialise command was first executed. This clears all instances in the database; when the user accesses the WDIMT's web page, no files will be displayed, as all instances have been deleted.

D. WDIMT Flow Diagram

The commands follow a sequential flow, with initialise being the first required step: the user registers and provides a path to the web pages which need to be monitored. Once the files have been successfully hashed, copied and stored, the user is able to view the uploaded files on the WDIMT web page. The verify command may be executed once a user has files uploaded. This command checks each individual web page for defacement; once defacement has been detected, the web page's original content may be re-uploaded. A user may prefer to delete their content from the WDIMT; the delete command deletes all the web pages that belong to a user. Furthermore, the force command executes the delete command followed by the initialise command. The flow diagram of the commands is seen in Figure 7.

Figure 7. Flow diagram of command execution in the WDIMT.

E. WDIMT Usage

A user is required to use the Linux terminal, providing a full path to the location of the website's web pages to be monitored. The Linux terminal is also used for executing the WDIMT script with the additional commands for initialising, verifying, forcing and deleting web pages belonging to the user. After successful registration on the WDIMT website, a user needs to use a Linux terminal to execute the script with the initialise command, providing the full path to the location of the website's web pages as seen in Figure 8.

IV. WDIMT ANALYSIS

This section analyses the data generated when the WDIMT's PHP script is executed in the Linux terminal. The script is executed with the different commands discussed in the previous section. The following subsections identify the different outputs generated when the script is executed with the verify command.
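As a rough sketch of the verify workflow — re-hash every file under the website's folder, compare against the stored baseline, and classify each file as an intruder, changed, or removed — the core logic might look as follows. This is an illustrative Python rendition with hypothetical names, not the actual WDIMT PHP script:

```python
import hashlib
from pathlib import Path

# Illustrative sketch of the initialise/verify hashing logic.
# Function names are hypothetical; the real WDIMT script is written in PHP.

def hash_tree(root: str) -> dict:
    """Map each file path (relative to root) to the SHA-256 digest of its
    bytes, as the initialise ('-i') step would when building the baseline."""
    hashes = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            rel = path.relative_to(root).as_posix()
            hashes[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return hashes

def verify(root: str, baseline: dict):
    """Compare the current tree against the stored baseline, mirroring the
    verify ('-v') output: intruder files (present but unknown), changed
    files (hash mismatch) and removed files (known but missing)."""
    current = hash_tree(root)
    intruders = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    changed = sorted(f for f in current
                     if f in baseline and current[f] != baseline[f])
    return intruders, changed, removed
```

In the WDIMT, each category then drives an action: intruders are deleted, removed files are regenerated from the off-site copies, and changed files are re-uploaded only once the user has set the reload flag.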
Figure 8. Linux terminal commands.

A. WDIMT Script

The WDIMT's PHP script that is executed in the Linux terminal has a number of functions, most of which are executed once the verify command has been instantiated. The script is executed with the different options discussed in the previous section. The options invoke different methods and functions within the script; one function that hashes all the files and web pages is shown in Figure 9.

Figure 9. Code snippet of a function in the WDIMT script.

B. Intruder Files

Files that have been identified as intruder files are removed immediately when the verify command is executed, possibly minimising any chance of intruder files going undetected. A user is able to identify these intruder files on the Linux terminal once the verify command has been executed, as seen in Figure 10.

C. Changed Files

Web pages and files that have been altered by unauthorised personnel are identified on the Linux terminal once the verify command has been executed, as seen in Figure 11. The altered web pages and files are only re-uploaded with their original content once a user has changed the flag option of the affected web page or file on the WDIMT's web page and executed the verify command again.

Figure 10. Identified intruder files.

Figure 11. Identified changed files.

D. Removed Files

Web pages and files that have been removed by unauthorised personnel are identified and regenerated in the respective directory where each file was located. On the Linux terminal the user is able to identify these files and confirm that they have been regenerated into their original paths, possibly minimising the chances of the website being unavailable, as seen in Figure 12.

Figure 12. Identified removed files.

V. DISCUSSION

Numerous web defacement monitoring tools have been developed, offering different web defacement detection techniques, some of which are used in real-life scenarios. The WDIMT automates swift detection of defacement, swift notification and swift re-uploading of a web page's original content. The strength of the WDIMT lies in its swift detection of defacement across an entire website's web pages, identifying each defaced web page individually and swiftly notifying the user which web pages have been defaced. The WDIMT rapidly re-uploads the original content of a defaced web page, visually identifies which web pages were defaced, and provides the user with flag options indicating whether a web page's original content should be re-uploaded or left in its current defaced state. The WDIMT's availability is also a strength, as a user is able to identify the affected web pages on the Linux terminal without accessing the WDIMT website.

The WDIMT has some known limitations. The commands associated with the WDIMT are executed in the Linux terminal, which requires a user to have a computer with the Linux operating system installed; this limits users that only have the Windows operating system installed on their computers. The Linux terminal commands do not require a user to authenticate themselves before executing a command, which may raise security concerns, as it is assumed that the person executing the commands is the website's owner. Executing the commands also requires manual initiation rather than being automated.

Further development of the WDIMT would improve the tool's swift defacement detection, notification and re-uploading of original content. The following subsections provide scenarios which illustrate use cases of the WDIMT.
These scenarios show that the WDIMT being proposed has capabilities beyond just monitoring of a user’s web pages, regeneration of web pages after penetration testing has been conducted, allowing user’s to visually identify the affected web pages giving full control over reuploading of the web page’s original content. A. Detection of Intruder Files A company has a website that advertises the services they offer. The websites administrator has access to some of the company’s confidential information. The information is accessed through SQL commands. An intruder could put a file that will access or duplicate the SQL commands which could possibly give the intruder administrative rights. The intruder’s file could be inserted deep into the websites folder structure, which may not be visible when manually searching through the folders. The tool can be used to identify the intruder file and immediately remove it. B. Detection of Altered Web Pages The local government elections are about to commence is the country. Parties involved in the elections could possibly sabotage another party’s website which contains the party’s campaign information. A party could possibly render the services of a hacker that will alter the opposition party’s website. The hacker could possibly alter all web pages to give false information about the party. The tool can be used to identify the web pages that have been altered by the hacker and re- upload the original content of the pages that were altered. C. Detection of Deleted Web Pages A web developer whom was previously employed by a company for web development services is bitter after dismissal. The web developer gained access to the company’s website and has deleted some web pages thus making the website inactive. The inactive website could possibly cost the company a undisclosed amount of money. The tool could be used to identify deleted web pages, swiftly regenerate any of them and reupload each web page’s original content. VI. 
CONCLUSION

Accessing the Internet has been made easier by the advancement of technology, and the volume of information distributed on the Internet may increase daily. With this information being accessed daily, it is unlikely that everyone making use of the Internet is using it for legal purposes. The WDIMT is a web defacement detection tool that monitors web pages for defacement. It makes use of the Linux terminal, from which its commands are executed, and a web page that visually represents all web pages belonging to a user. Upon execution of the commands in the Linux terminal, the WDIMT is able to detect any
defacement that may have occurred or is currently in progress. Any detected defacement is visually represented on the WDIMT's web page, which gives a user the option to identify a defaced web page and have the original content of that web page re-uploaded. Swift re-uploading of a web page's original content is one strong point of the WDIMT. The proposed tool aligns well with the scenarios identified in the previous section, as the tool's full capacity and all of its features are used in fulfilment of the identified scenarios.

Future work will be done on the WDIMT to allow the commands to be executable on a Windows machine. An impact study will be conducted using data gathered from the usage of the WDIMT; this data may be used for analytic purposes that assist in identifying which web pages are most commonly defaced. As a further avenue of research, it would be useful for a user to have a mobile application offering the same functionality as the WDIMT's web page.