Software Security Group Project 2016
Students: Matteo Lucchetti, Michele Reale, Florin Tanasache
SECURITY ASSESSMENT
OF MEDIAWIKI WEB
APPLICATION
OWASP ASVS 3.0
Table of contents
1. Introduction……………………………………………………...
1.1 Abstract……………………………………………..
1.2 Students…………………………………………….
2. The MediaWiki web application……………………..........
2.1 Brief introduction…………………………………...
2.2 Level of the web-app……………………………....
3. Log working group meetings………………………............
3.1 21/04 09:00-12:00………………………………….
3.2 05/05 09:00-14:00………………………………….
3.3 07/05 09:00-16:00………………………………….
3.4 12/05 09:00-10:00………………………………….
3.5 19/05 09:00-13:00………………………………….
3.6 26/05 10:00-14:00………………………………….
3.7 28/05 11:00-15:30………………………………….
3.8 29/05 16:00-20:00………………………………….
3.9 30/05 09:30-13:30………………………………….
3.10 31/05 16:00-21:15………………………………..
4. Executive summary…………………………………………….
4.1 Overview of tools………………………………….
4.2 Summary of findings………………………………
4.3 A specific review of RIPS…………………………
4.4 A specific review of Vega…………………………
5. Control of requirements……………………………………...
5.1 V3: Session management system verification
requirements……………………………………………
5.2 V11: HTTP security configuration verification
requirements………………………………….…………
5.3 V16: Files and resources verification
requirements………………………………….…………
6. Conclusion………………………………………………….……
6.1 About the tools……..……………………….……...
6.2 About the requirements of OWASP………………
6.3 About the security level of MediaWiki…………....
1. Introduction
1.1 Abstract
This report documents the development of the group project assigned by Professor Francesco
Parisi Presicce for the class of Security in Software Applications of the Master's Degree in
Computer Science at Sapienza in the academic year 2015/2016. The project consists in
analyzing a web application, in our case MediaWiki, against the OWASP standards. The goal is
to assess the usefulness of these standards and of the tools used, and to judge whether the
web app respects the security requirements. To analyze the application in depth, the class was
divided into groups, and each group was assigned three or more categories. On May 30, 2016,
each group will give a presentation of the assigned points in order to have an overview of the
results obtained.
1.2 Students and categories assigned
Our group is composed of:
❖ Matteo Lucchetti
➢ Matricola: 1496408
➢ Mail: lucchetti.1496408@studenti.uniroma1.it
❖ Michele Reale
➢ Matricola: 1315785
➢ Mail: reale.1315785@studenti.uniroma1.it
❖ Florin Tanasache
➢ Matricola: 1524243
➢ Mail: tanasache.1524243@studenti.uniroma1.it
We must verify the categories V3 (Session management verification), V11 (HTTP security
configuration) and V16 (Files and resources) of the OWASP standards on the MediaWiki
web app.
2. The MediaWiki web application
2.1 Brief introduction
MediaWiki is a free and open-source wiki application. It was originally developed by the
Wikimedia Foundation and runs on many websites, including Wikipedia, Wiktionary and
Wikimedia Commons. It is written in the PHP programming language and uses a backend
database. The peculiarity of MediaWiki, and more generally of all wiki sites, is that it allows its
users to add, edit or delete content through a web browser, enabling the development of a
virtual encyclopedia.
2.2 Level of the web-app
According to the OWASP ASVS guidelines, we decided to adopt the ASVS Level 1
(Opportunistic) specifications. In fact, MediaWiki corresponds neither to Level 2 nor to Level 3
applications (business-critical, mission-critical), since compromising it would not give a
significant gain to any attacker: no valuable resources other than the user accounts could be
stolen ("too much effort, not enough gain"). Hence, an attacker would not employ expensive
and sophisticated techniques and tools to attack MediaWiki, but just simple and cheap
automatic tools to spot the most common vulnerabilities, most of which are well documented
in the OWASP Top 10 and similar checklists. We will use a set of automatic analysis tools to
spot and fix such vulnerabilities; the complete list follows.
3. Log working group meetings
This section presents the calendar of meetings with the respective timetables and the work
done. The hours spent on the project individually are not reported.
3.1 21/04 09:00-12:00
❖ Installation and configuration of MediaWiki in our laptops in three different environments:
Windows 7 VirtualBox, Windows 10, Linux.
❖ Discussion about the ASVS level of MediaWiki.
❖ Distribution of tasks:
Lucchetti 3.1 3.5 3.16 11.1 11.6 16.1 16.4 16.9
Reale 3.2 3.6 3.11 3.17 11.2 11.7 16.2 16.5
Tanasache 3.3 3.7 3.12 3.18 11.5 11.8 16.3 16.8
❖ Installation and first scans with RIPS.
❖ Creation of template for final report.
3.2 05/05 09:00-14:00
❖ Research of tools for specific tasks.
❖ Discussion about requirements of first deadline.
❖ Creation of template for first deadline.
3.3 07/05 09:00-16:00
❖ Discussion about our results.
❖ Scan with main tools.
❖ Compilation of the report.
❖ Conclusion of first report.
3.4 12/05 09:00-10:00
❖ Comparison of the individual work.
❖ Compilation of the report.
3.5 19/05 09:00-13:00
❖ Comparison of the individual work.
❖ Compilation of the report.
3.6 26/05 10:00-14:00
❖ Comparison of the individual work.
❖ Compilation of the report.
3.7 28/05 11:00-15:30
❖ Preparation of the slides for presentation.
3.8 29/05 16:00-20:00
❖ Completion of the slides.
3.9 30/05 09:30-13:30
❖ Presentation of the project.
3.10 31/05 16:00-21:15
❖ Completion of the report.
4. Executive summary
4.1 Overview of tools
To carry out the requested analyses, we used various tools, classified into two categories
according to the generality of their results. The main tools scan the entire web app in a general
way, trying to identify as many vulnerabilities as possible, while the side tools focus on
identifying specific vulnerabilities.
Main tools (used to address all the three ASVS requirements)
Tool name Main purpose Comment
RIPS General-purpose, grep-based static analysis of PHP-based web apps. See Section 4.3
Yasca General-purpose, grep-based, extensible static analysis of source code written in several languages (PHP is supported as well). Too verbose w.r.t. W3C browser compatibility and informational messages. Very few vulnerabilities found, compared to the other tools.
ZAP A penetration-testing tool for Web applications, featuring both static and dynamic tools. Accurate static and dynamic analysis. HTTP request-response reporting. Very long execution time. Report exporting allowed.
Vega Web-app security testing tool, performing both static analysis and proxy-based dynamic analysis. See Section 4.4
Side tools
Tool name ASVS Requirements Hit Comment
Burp Repeater V3: Session management Assesses the degree of randomness of security tokens and nonces.
Cookie Digger V3: Session management Very similar to Burp Sequencer.
JHijack V3: Session management A simple Java fuzzer mainly used for numeric session hijacking and parameter enumeration.
DotDotPwn V16: Path traversal Featured in the Kali Linux pentest platform.
Redirect Checker V16: Path traversal Online tool checking URL redirection.
Please note: all the tools listed have been used, but not all were helpful for our requirements.
In the following, we will consider only those that we believe were most useful.
4.2 Summary of findings
This section shows the results generated by the main tools' scans of MediaWiki.
Scan results of RIPS with verbosity level 4
Vulnerabilities found Comments Useful
Code execution This vulnerability is addressed by other requirements. No
Command execution This vulnerability is addressed by other requirements. No
Protocol injection This vulnerability is addressed by other requirements. No
File disclosure This vulnerability is addressed by other requirements. No
File inclusion We must analyze this vulnerability for V16. Yes
File manipulation This vulnerability is addressed by other requirements. No
SQL injection This vulnerability is addressed by other requirements. No
Cross-site scripting We must analyze this vulnerability for V3. Yes
HTTP response splitting We must analyze this vulnerability for V11. Yes
Session fixation We must analyze this vulnerability for V3. Yes
Possible flow control This vulnerability is addressed by other requirements. No
Reflection injection This vulnerability is addressed by other requirements. No
PHP object injection This vulnerability is addressed by other requirements. No
Scan results of Yasca
Vulnerabilities found Comments Useful
SQL Injection This vulnerability is addressed by other requirements. No
Cross-site scripting We must analyze this vulnerability for V3. Yes
Weak credentials This vulnerability is addressed by other requirements. No
Scan results of ZAP
Vulnerabilities found Comments Useful
Cross-site scripting We must analyze this vulnerability for V3. Yes
Remote OS command injection We must analyze this vulnerability for V16. Yes
SQL injection This vulnerability is addressed by other requirements. No
Application error disclosure We must analyze this vulnerability for V3. Yes
Buffer overflow This vulnerability is addressed by other requirements. No
Directory browsing We must analyze this vulnerability for V16. Yes
Format string error This vulnerability is addressed by other requirements. No
X-Frame-Options header not set We must analyze this vulnerability for V11. Yes
Cross-domain JavaScript source file inclusion We must analyze this vulnerability for V16. Yes
Password autocomplete in browser This vulnerability is addressed by other requirements. No
Private IP disclosure This vulnerability is addressed by other requirements. No
Web browser XSS protection not enabled We must analyze this vulnerability for V11. Yes
X-Content-Type-Options header missing We must analyze this vulnerability for V11. Yes
Scan results of Vega
Vulnerabilities found Comments Useful
Bash “ShellShock” Injection This vulnerability is addressed by other requirements. No
Cleartext Password over HTTP This vulnerability is addressed by other requirements. No
Cross-site scripting We must analyze this vulnerability for V3. Yes
Integer Overflow This vulnerability is addressed by other requirements. No
Page Fingerprint Differential Detected This vulnerability is addressed by other requirements. No
Shell Injection This vulnerability is addressed by other requirements. No
SQL Injection This vulnerability is addressed by other requirements. No
Local filesystem paths found This vulnerability is addressed by other requirements. No
PHP error detected We must analyze this vulnerability for V3. Yes
Possible HTTP PUT file upload We must analyze this vulnerability both for V11 and V16. Yes
Possible Source Code Disclosure This vulnerability is addressed by other requirements. No
Possible XML Injection This vulnerability is addressed by other requirements. No
Directory Listing Detected We must analyze this vulnerability for V16. Yes
Email Addresses Found This vulnerability is addressed by other requirements. No
Form Password Field Autocomplete Enabled This vulnerability is addressed by other requirements. No
4.3 A specific review of RIPS
RIPS is a tool written in PHP to find vulnerabilities in PHP applications using static code
analysis. By tokenizing and parsing all source code files, RIPS is able to transform PHP source
code into a program model and to detect sensitive sinks that can be tainted by user input
during the program flow. Besides the structured output of the vulnerabilities found, RIPS also
offers an integrated code audit framework for further manual analysis.
The RIPS user interface is simple, but there are some nuances. One of these is the verbosity
level: RIPS has 5 verbosity levels, each of which can be chosen to refine the results. After a
complete analysis with all of them, we chose level 4, because it reports all the results of the
preceding levels plus additional information. We are aware that this level produces many false
positives but, for a first scan of the web app, we felt it appropriate to know all the possible
vulnerabilities that this tool can find.
4.4 A specific review of Vega
Vega is quite a powerful tool to perform security analysis of Web servers: it is based on
charging the Web server with an intense, but carefully crafted, load of HTTP requests, matched
against their responses in order to detect possible security flaws. It is possible to choose the
modules (analysis components) to tune the analysis to the analyst's particular needs. We
configured the tool to run all the available modules, and this required a 22-hour-long execution.
The reports are very well detailed, since they show, in a compact and clear way:
❖ the HTTP request that led to the flaw;
❖ the content of the corresponding HTTP response;
❖ the kind of vulnerability;
❖ a discussion about it;
❖ its possible impact;
❖ suggested and practical solutions to fix it;
❖ additional references about this vulnerability;
❖ a quick recap of the spotted vulnerability, together with its severity level and
classification.
Reports are a key feature of Vega, but the software is missing a fundamental one: it is
impossible to export security reports, even though they are directly displayed as HTML pages
inside the tool's GUI. Vega stores the reports in a database file whose content is not accessible
without Vega.
This is a huge drawback of such a good tool, even if Vega can be closed and restarted without
losing any scanning report.
5. Control of requirements
In this section, a detailed analysis is given for each individual requirement. For each of them we
report the author, a generic description of the intent of the requirement, the tools adopted, the
testing flow, the final verdict and, if negative, a possible solution.
Please note: all these requirements are verified on the default version of the web app
downloadable from https://www.mediawiki.org/wiki/Download.
5.1 V3: Session management system verification requirements
3.1 Verify that there is no custom session manager, or that the
custom session manager is resistant against all common session
management attacks.
AUTHOR Matteo Lucchetti
DESCRIPTION The session management mechanism is a fundamental security
component in the majority of web applications. HTTP itself is a
stateless protocol, and session management enables the application
to uniquely identify a given user across a number of different requests
and to handle the data that it accumulates about the state of that
user's interaction with the application. The best practice is to use a
robust, well-known session manager built into a web application
framework. However, if the developers want to implement a custom
session manager, they must check that it is resistant against all
common session management attacks; two of these are session
hijacking and session fixation. The former is the process by which a
malicious user acquires a valid session identifier after it has been
assigned and inherits that individual's permissions; the latter is an
attack that forces a session identifier chosen by the attacker onto a
valid user session.
TOOLS
ADOPTED
RIPS: This tool reported that there are 41 files with the session fixation
vulnerability.
TESTING As the PHP manual says, to implement a custom session manager the
developers must use the session_set_save_handler() function, which
provides a way to override the default session handling mechanism
with new functionality, so that the data can be stored wherever
desired. I found this function in ObjectCacheSessionHandler.php,
confirming that the web app uses a custom session manager. So, I
tested its resistance against the common session management
attacks.
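To make the mechanism concrete, here is a minimal, hypothetical custom handler in the spirit of what session_set_save_handler() expects; the class name and its in-memory storage are our own sketch, not MediaWiki's ObjectCacheSessionHandler:

```php
<?php
// Minimal custom session handler (illustrative only). A real handler
// would persist to a cache or database instead of a PHP array.
class ArraySessionHandler implements SessionHandlerInterface
{
    private array $store = [];

    public function open(string $path, string $name): bool { return true; }
    public function close(): bool { return true; }

    public function read(string $id): string|false
    {
        // Return the serialized session data, or '' for an unknown id.
        return $this->store[$id] ?? '';
    }

    public function write(string $id, string $data): bool
    {
        $this->store[$id] = $data;
        return true;
    }

    public function destroy(string $id): bool
    {
        unset($this->store[$id]);
        return true;
    }

    public function gc(int $max_lifetime): int|false { return 0; }
}

// Override the default session handling mechanism, as the PHP manual describes.
session_set_save_handler(new ArraySessionHandler(), true);
```

Registering the handler is exactly the step that lets an application store sessions wherever it likes; it is also the point where anti-fixation and anti-hijacking checks must be added by hand.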
Session fixation
I carried out this attack with the Google Chrome browser. I
established a legitimate connection with the web server, which issued
a session ID; then I opened a new browser window in incognito mode
(to simulate a new machine for the victim) and I set a cookie with the
session ID through the following client-side script:
document.cookie="wikidb_session=<session ID>";
Then I played the part of the victim and, as predicted, when I tried to
access the web server it blocked me, so the attack failed.
Session hijacking
The session hijacking attack compromises the session token by
stealing or predicting a valid session token to gain unauthorized
access to the web server. The session token could be compromised
in different ways (predictable session token, session sniffing, client-
side attacks, man-in-the-middle attack, etc.). For this test I assumed
that the attacker managed to obtain the session ID through one of
these ways. So, I set a cookie as before with the stolen session ID.
The result is that I was able to access the web server without entering
credentials, so the attack was successful.
RECOMMENDED
SOLUTIONS
The best solution is to rely on the well-known session manager built
into PHP.
VERDICT MediaWiki fails this requirement.
3.2 Verify that sessions are invalidated when the user logs out.
AUTHOR Michele Reale
DESCRIPTION In this case, we check that it's not possible to “reuse” a session token
after the logout operation. If a session can still be used after logging
out, an attacker may grab the session token and use it to impersonate
a user.
TOOLS
ADOPTED
BURP SUITE - REPEATER: I used Burp Suite, particularly the
Repeater tool, to test whether old session tokens are invalidated after
the user logs out.
TESTING BLACK BOX TESTING: One quick way to test this is to log in, get the
session token from the cookie, log out, then manually add the session
cookie with the session token and see if you are still logged in.
Several tests like this confirmed that the session token was not reusable.
CODE ANALYSIS: we inspected the source code to check whether
the session cookies were invalidated or not. We found out several
modules of the application explicitly dealing with it, particularly
User.php.
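For comparison, the behavior we observed can be sketched in plain PHP as follows; this is our own illustrative logout routine, not the actual code in User.php:

```php
<?php
// Sketch of a logout that invalidates the session on both sides.
function logout(): void
{
    $_SESSION = [];                        // drop the session data

    if (ini_get('session.use_cookies')) {
        // Expire the session cookie in the browser as well.
        $p = session_get_cookie_params();
        setcookie(session_name(), '', time() - 42000,
            $p['path'], $p['domain'], $p['secure'], $p['httponly']);
    }

    session_destroy();                     // invalidate the token server-side
}
```

The server-side session_destroy() is the step that makes a grabbed token useless; expiring the cookie alone would not be enough.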
RECOMMENDED
SOLUTIONS
None; MediaWiki passes this requirement.
VERDICT MediaWiki passes this requirement.
3.3 Verify that sessions timeout after a specified period of inactivity.
AUTHOR Florin Tanasache
DESCRIPTION All sessions should implement an idle or inactivity timeout. This
timeout defines the amount of time a session will remain active in case
there is no activity in the session, closing and invalidating the session
upon the defined idle period since the last HTTP request received by
the web application for a given session ID. The idle timeout limits the
chances an attacker has to guess and use a valid session ID from
another user. However, if the attacker is able to hijack a given session,
the idle timeout does not limit the attacker’s actions, as he can
generate activity on the session periodically to keep the session active
for longer periods of time. Session timeout management and
expiration must be enforced server-side. If the client is used to enforce
the session timeout, for example using the session token or other
client parameters to track time references (e.g. number of minutes
since login time), an attacker could manipulate these to extend the
session duration.
TOOLS
ADOPTED
RIPS: With this tool I checked whether there are vulnerabilities in
session management. The results showed that MediaWiki is
vulnerable to XSS attacks and session fixation; these attacks could
compromise a valid session timeout.
BURP SUITE - REPEATER: I used Burp Suite, in particular the
Repeater tool, to manipulate and resend individual requests. After
some individual HTTP requests to MediaWiki, I noticed that its
sessions are implemented with cookies for quick navigation and that
these are deleted when the user logs out. However, I found nothing
about a session timeout, so MediaWiki is vulnerable to attacks like
cross-site scripting and cross-site request forgery.
TESTING BLACK BOX TESTING: Initially, I had to check whether a timeout
exists, for instance by logging in and waiting for the timeout logout to
be triggered. MediaWiki is simply a wiki where users can post articles;
therefore, a short timeout could annoy users who are typing lengthy
articles with unnecessary login requests, so timeouts of an hour or
more can be acceptable. As said above, I logged in as a user and
waited for 2 hours. After that, I noticed that MediaWiki still did not log
out the user after the inactivity period. So, MediaWiki has no session
timeout, or it does not set a session time. However, if I was logged in
and I closed the browser, then when I reopened MediaWiki the logout
had happened automatically, because I had closed the browser.
Obviously, I had not checked the "Mantienimi collegato" ("Keep me
logged in") box when logging in.
RECOMMENDED
SOLUTIONS
In this case, for a secure use of MediaWiki there must be a session
timeout. In fact, session.cookie_lifetime is set to 0, which makes the
session cookie a real session cookie, valid only until the browser is
closed, as we saw above. Use a simple timestamp that denotes the
time of the last activity (i.e. request) and update it with every request:
if (isset($_SESSION['LAST_ACTIVITY']) && (time() - $_SESSION['LAST_ACTIVITY'] > 3600)) {
    // last request was more than 60 minutes ago
    session_unset();    // unset $_SESSION variable for the run-time
    session_destroy();  // destroy session data in storage
}
$_SESSION['LAST_ACTIVITY'] = time(); // update last activity timestamp
Updating the session data with every request also changes the
session file's modification date, so that the session is not removed by
the garbage collector prematurely. In this case we chose an interval
of 60 minutes.
VERDICT MediaWiki fails this requirement.
3.5 Verify that all pages that require authentication have easy and
visible access to logout functionality.
AUTHOR Matteo Lucchetti
DESCRIPTION The goal of the logout functionality is to destroy and make unusable all
the session tokens. It is important for a user to have easy and visible
access to this functionality, to prevent the "reuse" of a session. So, we
must check that the application provides a logout button and that this
button is present and clearly visible on all pages that require
authentication. A logout button that is not clearly visible, or that is
present only on certain pages, poses a security risk, as the user might
forget to use it at the end of his/her session.
TOOLS
ADOPTED
None.
TESTING To analyze this requirement, I carried out a manual check. I logged
onto the site and checked for the presence of the logout link. Given
the size of the web app, I could not check all the pages, but I searched
by areas and by application flow. In this check I found the logout link
on every page displayed.
RECOMMENDED
SOLUTIONS
None; MediaWiki passes this requirement.
VERDICT MediaWiki passes this requirement.
3.6 Verify that the session id is never disclosed in URLs, error
messages, or logs. This includes verifying that the application
does not support URL rewriting of session cookies.
AUTHOR Michele Reale
DESCRIPTION Session IDs in URLs are very easy to intercept, and they are used to
carry out a lot of operations.
If a session ID appears in a URL, users are severely vulnerable even
to the accidental sharing of a page link with other hosts.
Session tokens must be kept inside the HTTP packet header or
payload.
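Assuming a plain PHP session setup (MediaWiki uses its own manager), the configuration that enforces this property can be sketched as:

```php
<?php
// Accept session ids from cookies only, and never rewrite URLs to
// carry the id; both settings keep the session id out of URLs.
ini_set('session.use_only_cookies', '1');
ini_set('session.use_trans_sid', '0');
```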
TOOLS
ADOPTED
BURP SUITE: we performed lots of login/logout operations for
different user accounts, and we captured the HTTP requests through
Burp Suite. We found out that the session id is transmitted inside the
packet payload and not inside the URL.
TESTING See the tools section above: the login/logout operations captured with
Burp Suite showed that the session id travels only inside the packet
payload and never appears in URLs.
RECOMMENDED
SOLUTIONS
None; MediaWiki passes this requirement.
VERDICT MediaWiki passes this requirement.
3.7 Verify that all successful authentication and re-authentication
generates a new session and session id.
AUTHOR Florin Tanasache
DESCRIPTION When an application does not renew its session cookie(s) after a
successful user authentication, it could be possible to find a session
fixation vulnerability and force a user to utilize a cookie known by
the attacker. In that case, an attacker could steal the user session
(session hijacking).
Session fixation vulnerabilities occur when:
● A web application authenticates a user without first invalidating
the existing session ID, thereby continuing to use the session
ID already associated with the user.
● An attacker is able to force a known session ID on a user so
that, once the user authenticates, the attacker has access to
the authenticated session.
In the generic exploit of session fixation vulnerabilities, an attacker
creates a new session on a web application and records the
associated session identifier. The attacker then causes the victim to
authenticate against the server using the same session identifier,
giving the attacker access to the user’s account through the active
session.
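The standard defense against this exploit is to issue a brand-new session id as soon as authentication succeeds; a minimal sketch follows (the function name is ours, and MediaWiki's actual implementation differs):

```php
<?php
// Regenerating the id on login makes a fixated pre-login id worthless.
function onSuccessfulLogin(int $userId): void
{
    session_regenerate_id(true);    // true: also delete the old session data
    $_SESSION['user_id'] = $userId; // bind the fresh session to the user
}
```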
TOOLS
ADOPTED
BURP SUITE - REPEATER: With the help of this tool (used as a
proxy) I verified, through simple black box testing, whether every
successful authentication and re-authentication generates a new
session and session id.
TESTING BLACK BOX TESTING: With the Repeater tool I analyzed the
responses to some GET requests to MediaWiki. First, I logged in with
a userid and password. Burp showed me the corresponding
“wikidb_session” cookie for this authentication. After that, I logged out
of MediaWiki. Afterwards, upon a successful re-authentication to the
application, I noticed that a new cookie had been generated.
Hence, MediaWiki changes the session ID on each new
re-authentication.
RECOMMENDED
SOLUTIONS
None; MediaWiki passes this requirement.
VERDICT MediaWiki passes this requirement.
3.11 Verify that session ids are sufficiently long, random and unique
across the correct active session base.
AUTHOR Michele Reale
DESCRIPTION Session identifiers must be long and random enough that an attacker
cannot guess or brute-force a valid one, and must be unique across
the currently active sessions.
TOOLS
ADOPTED
BURP SUITE: inspecting cookies in HTTP requests.
TESTING First, we inspected some HTTP packets to determine the kind of
session tokens: the first results (32 hexadecimal digits, corresponding
to 128 bits) were confirmed by inspecting the source code file devoted
to generating such tokens.
The algorithm used to generate random tokens harvests randomness
from many sources.
However, another inspection of the source code revealed that the
tokens are generated without considering the possible presence of an
identical token in the database, so this requirement is not completely
fulfilled.
RECOMMENDED
SOLUTIONS
Check for the presence of the freshly-generated token in the database
before assigning it to a user.
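The suggested check can be sketched as follows; generateToken() mirrors the observed token format (32 hex digits, 128 bits), and the $issued set stands in for the session table in the database. Both names are hypothetical:

```php
<?php
// Draw tokens until one is not already present in the store.
function generateToken(): string
{
    return bin2hex(random_bytes(16)); // 32 hex digits = 128 bits
}

function freshUniqueToken(array &$issued): string
{
    do {
        $token = generateToken();      // candidate token
    } while (isset($issued[$token]));  // re-draw on an (unlikely) collision
    $issued[$token] = true;            // reserve it before handing it out
    return $token;
}
```

With 128-bit tokens a collision is astronomically unlikely, so the loop almost never repeats; the check simply turns uniqueness from a probability into a guarantee.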
VERDICT MediaWiki violates this requirement
3.12 Verify that session ids stored in cookies have their path set to an
appropriately restrictive value for the application, and
authentication session tokens additionally set the “HttpOnly” and
“secure” attributes.
AUTHOR Florin Tanasache
DESCRIPTION Cookies are often a key attack vector for malicious users (typically
targeting other users) and the application should always take due
diligence to protect cookies. The importance of secure use of Cookies
cannot be understated, especially within dynamic web applications,
which need to maintain state across a stateless protocol such as
HTTP. To understand the importance of cookies it is imperative to
understand what they are primarily used for. Since HTTP is stateless,
the server cannot determine if a request it receives is part of a current
session or the start of a new session without some type of identifier.
Once the tester has an understanding of how cookies are set, when
they are set, what they are used for, why they are used, and their
importance, they should take a look at what attributes can be set for a
cookie and how to test if they are secure. The following is a list of the
attributes that can be set for each cookie and what they mean. The
next section will focus on how to test for each attribute:
● secure - This attribute tells the browser to only send the cookie
if the request is being sent over a secure channel such as
HTTPS. This will help protect the cookie from being passed
over unencrypted requests. If the application can be accessed
over both HTTP and HTTPS, then there is the potential that the
cookie can be sent in clear text.
● HttpOnly - This attribute is used to help prevent attacks such
as cross-site scripting, since it does not allow the cookie to be
accessed via a client side script such as JavaScript. Note that
not all browsers support this functionality.
● path - The path attribute signifies the URL or path for which the
cookie is valid. If the path attribute is set too loosely, then it
could leave the application vulnerable to attacks by other
applications on the same server. For example, if the path
attribute was set to the web server root "/", then the application
cookies will be sent to every application within the same
domain.
TOOLS
ADOPTED
BURP SUITE - REPEATER: By using this tool as an intercepting
proxy, I analyzed all responses where a cookie is set by the
application (using the Set-Cookie directive) and I inspected the
cookies for the attributes above.
TESTING BLACK BOX TESTING: In the case of MediaWiki I made a GET
request and checked whether the Set-Cookie header in the HTTP
response includes the following attributes and their values.
● Secure attribute: this attribute is not guaranteed. The browser
would therefore agree to pass the cookie over an unencrypted
channel such as HTTP, and an attacker could lead users into
submitting their cookie over an insecure channel.
● HttpOnly attribute: this attribute should always be set, even
though not every browser supports it. It helps protect the
cookie from being accessed by a client-side script; it does not
eliminate cross-site scripting risks, but it does eliminate some
exploitation vectors. We checked whether the ";HttpOnly" tag
had been set, and in this case it was.
● Path attribute: this attribute is specified, but it is set to the
default path (/), so the cookie can be exposed to less secure
applications on the same server. For example, if the application
resides at /myapp/, one should verify that the cookie path is set
to "; path=/myapp/" and NOT "; path=/".
RECOMMENDED
SOLUTIONS
A solution for PHP applications is the use of the following function:
setcookie($name, $value, $expire, $path, $domain, $secure, $httponly)
This function defines a cookie to be sent along with the rest of the
HTTP headers.
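For illustration, a hardened call could look like the following; the /wiki/ path and the concrete values are our assumptions, not MediaWiki defaults:

```php
<?php
// Illustrative hardened session cookie (values are assumptions).
$sessionId = bin2hex(random_bytes(16));

setcookie(
    'wikidb_session', // cookie name observed during our tests
    $sessionId,
    0,                // expire when the browser closes
    '/wiki/',         // restrictive path instead of "/"
    '',               // current domain
    true,             // secure: send over HTTPS only
    true              // HttpOnly: not readable from client-side scripts
);
```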
VERDICT MediaWiki fails this requirement.
3.16 Verify that the application limits the number of active concurrent
sessions.
AUTHOR Matteo Lucchetti
DESCRIPTION When two or more sessions are held at the same time, they are known
as concurrent sessions. It is a common requirement that a web
application not allow a user to have more than one session active at a
time. In other words, after a user logs into an application, he should
not be permitted to open a different browser (or use another
computer) to log in again until his first session has ended. We must
check that multiple concurrent sessions are not allowed, to protect
the privacy of the users.
TOOLS
ADOPTED
None.
TESTING To analyze this requirement, I carried out a manual check. I simply
tried to log in with the same user from different browsers, and the
result is that I obtained different concurrent sessions for the same
user.
RECOMMENDED
SOLUTIONS
We could maintain a DB table of active user sessions, where a
session is considered active if the last user activity took place less
than X (configurable value) minutes ago. Each time a user tries to
authenticate through the login form, we should check how many
sessions for that user are active at the moment and, based upon that
check, decide whether to authenticate him or decline with
some form of response message.
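The check described above can be sketched in code. The following is a minimal, hypothetical Python sketch: `MAX_SESSIONS`, `ACTIVITY_WINDOW` and the in-memory `sessions` dictionary are illustrative stand-ins for the configurable limit and the DB table, not anything MediaWiki provides.

```python
import time

MAX_SESSIONS = 1           # hypothetical configurable limit
ACTIVITY_WINDOW = 15 * 60  # seconds: a session is "active" if seen recently

# hypothetical stand-in for the DB table: user -> {session_id: last_activity}
sessions = {}

def active_sessions(user, now=None):
    """Count sessions whose last activity falls within the window."""
    now = now if now is not None else time.time()
    return sum(1 for t in sessions.get(user, {}).values()
               if now - t < ACTIVITY_WINDOW)

def try_login(user, session_id, now=None):
    """Authenticate only if the user is below the concurrent-session limit."""
    now = now if now is not None else time.time()
    if active_sessions(user, now) >= MAX_SESSIONS:
        return False  # decline with some form of response message
    sessions.setdefault(user, {})[session_id] = now
    return True
```

A real implementation would persist the table and also purge stale rows, but the decision logic is the same.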
VERDICT MediaWiki fails this requirement
3.17 Verify that an active session list is displayed in the account
profile or similar of each user. The user should be able to
terminate any active session.
AUTHOR Michele Reale
DESCRIPTION A session management mechanism allows users to control their
currently active sessions on the website.
Users may forget to log out of their account when using a different
workstation, or their password may be discovered; thus, a session
management dashboard allows users to be aware of such problems
and fix them.
TOOLS
ADOPTED
No tool adopted
TESTING Two separate logins from different hosts were performed with the
same username, and no session list control is available.
Moreover, the logout operation performed on a host won't log out the
other sessions.
RECOMMENDED
SOLUTIONS
A session management interface should be provided to the logged
users.
VERDICT MediaWiki violates this requirement
3.18 Verify the user is prompted with the option to terminate all other
active sessions after a successful change password process.
AUTHOR Florin Tanasache
DESCRIPTION This requirement reflects a good security practice: all sessions
should be invalidated on password change. For instance, a user may be
changing the password because the old one has been compromised. In
that case, invalidating all sessions helps protect the user: once the
password has been changed, the old credentials can no longer be used
to log in, and any session established through them is expired. This
practice is adopted by the majority of web applications.
TOOLS
ADOPTED
No one
TESTING To test this requirement I performed a manual test. I used two
browsers, so that multiple login sessions were established. Then, in
one browser, I changed the account password. After that I tried to
navigate in the second browser and saw that its session had been
terminated and the browser asked me for the credentials. Therefore
MediaWiki satisfies this requirement.
RECOMMENDED
SOLUTIONS
None, because MediaWiki passes this requirement.
VERDICT MediaWiki passes this requirement
5.2 V11: HTTP security configuration verification requirements
11.1 Verify that the application accepts only a defined set of required
HTTP request methods, such as GET and POST are accepted,
and unused methods (e.g. TRACE, PUT, and DELETE) are
explicitly blocked.
AUTHOR Matteo Lucchetti
DESCRIPTION HTTP offers a number of methods that can be used to perform actions
on the web server. While GET and POST are by far the most common
methods that are used to access information provided by a web
server, the Hypertext Transfer Protocol allows several other methods.
Some of these methods can potentially pose a security risk for a web
application, as they allow an attacker to modify the files stored on the
web server and, in some scenarios, steal the credentials of legitimate
users. More specifically, the methods that should be disabled are the
following: PUT, DELETE, CONNECT and TRACE.
TOOLS
ADOPTED
ADVANCED REST CLIENT: I used the tool to send and test different
HTTP requests to MediaWiki.
TESTING I sent different HTTP requests for every method. All the responses
I received had status “200 OK”, so the requests are allowed;
had they been blocked, the status code would have been
“405 Method Not Allowed”.
RECOMMENDED
SOLUTIONS
We could insert the following Apache configuration directives in
httpd.conf:
RewriteEngine On
RewriteCond %{REQUEST_METHOD} !^(GET|POST|HEAD)$
RewriteRule .* - [R=405,L]
This acts as a whitelist that allows only the “safe” methods.
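The same whitelist idea can also be enforced at the application layer. The following is a minimal WSGI-style sketch, assuming the same GET/POST/HEAD whitelist as the directives above; it is an illustration, not MediaWiki code.

```python
ALLOWED_METHODS = {"GET", "POST", "HEAD"}  # assumed whitelist, as above

def method_filter(app):
    """WSGI middleware: reject any request whose method is not whitelisted."""
    def wrapped(environ, start_response):
        if environ.get("REQUEST_METHOD", "").upper() not in ALLOWED_METHODS:
            # answer with 405 and advertise the permitted methods
            start_response("405 Method Not Allowed",
                           [("Allow", ", ".join(sorted(ALLOWED_METHODS)))])
            return [b"405 Method Not Allowed"]
        return app(environ, start_response)
    return wrapped
```

Wrapping the application once (`app = method_filter(app)`) makes the policy independent of the web server configuration.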
VERDICT MediaWiki fails this requirement.
11.2 Verify that every HTTP response contains a content type header
specifying a safe character set (e.g., UTF-8, ISO 8859-1).
AUTHOR Michele Reale
DESCRIPTION HTTP defines a Content-Type header which defines the character
set to be used when reading the HTTP packet payload bytes.
There are several character sets, which define how characters are
encoded in bytes and decoded vice versa, like Unicode Transformation
Format 8 (UTF-8) and ISO 8859-1.
The content of a file is just a sequence of bytes that need to be
decoded to meaningful characters, unless their raw binary content
needs to be accessed. Since different character sets may represent
the same bytes in different characters, it's necessary to specify which
character set was used to generate that file in order to have no
ambiguity.
Using the wrong charset (or not specifying it explicitly) allows attackers
to perform XSS attacks based on exploiting subtle differences
between charsets.
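A check for an explicit charset can be sketched by inspecting the parameters of a Content-Type header value. This is a simplified parser for illustration, not a full RFC-compliant implementation:

```python
def has_explicit_charset(content_type):
    """Return True if a Content-Type header value carries a charset parameter."""
    # parameters follow the media type, separated by ';',
    # e.g. "text/html; charset=UTF-8"
    for param in content_type.split(";")[1:]:
        name, _, value = param.strip().partition("=")
        if name.strip().lower() == "charset" and value.strip():
            return True
    return False
```

Running such a check over every response of a crawl is essentially what Vega automates in the testing step below.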
TOOLS
ADOPTED
ADVANCED REST CLIENT: this tool was used to check the presence
of a charset specification in HTTP response Content-Type header.
VEGA: this tool was used to perform intensive HTTP request-
response analysis to detect potential flaws.
TESTING Vega reported 9 Informational Level HTTP GET responses with no
charset specified, regarding minor-importance resources.
These resources were retrieved by using Advanced Rest Client and
displayed with different charsets to check possible discrepancies that
may have led to an XSS exploitation: these files don't appear to
change with the classic charsets. Most importantly, their
Content-Type header was missing an explicit charset.
Despite the small number and low importance of such resources, the
requirement imposes that no HTTP response may have an implicit
charset, so this requirement is violated.
RECOMMENDED
SOLUTIONS
Such resources can be returned with a specific charset by using the
Apache Server configuration directive AddDefaultCharset utf-8
inside httpd.conf file.
VERDICT MediaWiki violates this requirement.
11.5 Verify that the HTTP headers or any part of the HTTP response do
not expose detailed version information of system components.
AUTHOR Florin Tanasache
DESCRIPTION Knowing the version and type of system components, especially
web servers, allows testers to determine known vulnerabilities and the
appropriate exploits to use during testing.
There are several different vendors and versions of web servers on
the market today. Knowing the type of web server that you are testing
significantly helps in the testing process, and will also change the
course of the test. This information can be derived by sending the web
server specific commands and analyzing the output, as each version
of web server software may respond differently to these commands.
By knowing how each type of web server responds to specific
commands and keeping this information in a web server fingerprint
database, a penetration tester can send these commands to the web
server, analyze the response, and compare it to the database of
known signatures.
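The first step of that fingerprinting process can be sketched as a small parser that extracts product/version tokens from a Server header value. The regular expression is a simplification for illustration only:

```python
import re

def server_tokens(server_header):
    """Extract (product, version) pairs from a Server header value."""
    # tokens look like "Apache/2.4.17" or "PHP/5.5.33"; comments in
    # parentheses such as "(Win32)" do not match this pattern
    return re.findall(r"([A-Za-z][\w.-]*)/(\d[\w.]*)", server_header)
```

Each extracted pair can then be looked up in a vulnerability database for the matching product and version.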
TOOLS
ADOPTED
ADVANCED REST CLIENT: With this tool I checked whether web
server information is exposed in the HTTP headers.
TESTING The simplest and most basic form of identifying a Web server is to look
at the Server field in the HTTP response header. For our experiments I
used the tool above. After an HTTP request-response exchange, I
noticed the following line:
Server: Apache/2.4.17 (Win32) OpenSSL/1.0.2g PHP/5.5.33
From the Server field, I understood that the server is likely Apache,
version 2.4.17, running on a Windows operating system.
RECOMMENDED
SOLUTIONS
The possible solutions are:
- protect the presentation layer web server behind a hardened reverse
proxy.
- obfuscate the presentation layer web server headers.
Apache allows hiding its version information by inserting the
following lines in the httpd.conf file:
ServerTokens Prod
ServerSignature Off
VERDICT MediaWiki fails this requirement
11.6 Verify that all API responses contain X-Content-Type-Options:
nosniff and Content-Disposition:attachment; filename="api.json"
(or other appropriate filename for the content type).
AUTHOR Matteo Lucchetti
DESCRIPTION Some HTTP headers are used to prevent certain vulnerabilities. In
particular, we want to check that every API response contains “X-
Content-Type-Options: nosniff” to prevent MIME sniffing, by which
the browser can be manipulated into interpreting data in a
way that allows an attacker to carry out operations that are not
expected by either the site operator or the user, such as cross-site
scripting. Furthermore, we want to make sure that the Content-
Disposition header identifies the content type with an appropriate filename.
TOOLS
ADOPTED
ADVANCED REST CLIENT: I used the tool to check whether the X-
Content-Type-Options and Content-Disposition headers are set correctly.
TESTING I sent some HTTP requests to MediaWiki. Then, I checked whether,
among the response headers, X-Content-Type-Options is set to
“nosniff”, and I noticed that it is set correctly. Moreover, the script
WebStart.php adds this header to each HTTP response. I also tried
to check the association of Content-Disposition with Content-Type,
but I did not find any download dialogs, only a feature to convert a
web page to PDF and download it. In that case the Content-Disposition
header identifies the content type with an appropriate
filename, but this is not enough to consider the requirement
satisfied.
RECOMMENDED
SOLUTIONS
We could define a single interface point where the Content-
Disposition header is set according to the specific file.
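Such a single interface point can be sketched as a helper through which every API response passes. The function name and signature here are hypothetical, used purely to illustrate the idea:

```python
def api_response_headers(content_type, filename):
    """Build the security-relevant headers for an API response."""
    return {
        "Content-Type": content_type,
        # forbid MIME sniffing on the response body
        "X-Content-Type-Options": "nosniff",
        # force download with an explicit, content-appropriate filename
        "Content-Disposition": 'attachment; filename="{}"'.format(filename),
    }
```

Because every response goes through one function, neither header can be forgotten for an individual endpoint.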
VERDICT We can’t express a verdict.
11.7 Verify that the Content Security Policy V2 (CSP) is in use in a way
that either disables inline JavaScript or provides an integrity
check on inline JavaScript with CSP noncing or hashing.
AUTHOR Michele Reale
DESCRIPTION A Content Security Policy (CSP) is a standard security mechanism for
Web servers which allows controlling (and possibly forbidding
completely) the execution of in-line JavaScript code, that is, code
inside a <script>...</script> tag (as opposed to external
JavaScript code, loaded from the file referenced by the src
attribute).
CSP provides many directives to define a policy suitable for the
specific system's purposes: a very good example is the current GitHub
CSP, visible in the Content-Security-Policy header of its homepage
HTTP response.
See https://glebbahmutov.com/blog/disable-inline-javascript-for-
security/ and
https://www.mediawiki.org/wiki/Requests_for_comment/Content-
Security-Policy for useful examples and recommendations.
TOOLS
ADOPTED
ADVANCED REST CLIENT: used to check the presence of the
Content-Security-Policy header in MediaWiki pages.
GOOGLE CHROME DEVELOPER CONSOLE: used to try to execute
in-line JavaScript code.
TESTING First, the MediaWiki home page was retrieved through Advanced Rest
Client: the HTTP response was missing the CSP header.
Then, a simple script with in-line JavaScript code was launched on the
main page through the Google Chrome Developer Console to check
whether a JavaScript in-line injection is allowed or not:
var el = document.createElement('script');
el.innerText = 'alert("This site has a robust CSP.");'
document.body.appendChild(el);
The result was that a popup message with the specified text appeared,
hence MediaWiki allows in-line JavaScript code by default; thus, the
system violates the requirement.
RECOMMENDED
SOLUTIONS
The MediaWiki community is still discussing and implementing CSP
support: see
https://www.mediawiki.org/wiki/Requests_for_comment/Content-
Security-Policy for further (and very recent) references.
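A nonce-based policy of the kind the requirement asks for can be sketched as follows. The policy string is illustrative only, not MediaWiki's eventual policy:

```python
import secrets

def csp_header():
    """Generate a per-response nonce and the matching CSP header value."""
    nonce = secrets.token_urlsafe(16)  # fresh, unguessable, per response
    # only inline scripts carrying the matching nonce="..." attribute run
    value = "script-src 'nonce-{}'; object-src 'none'".format(nonce)
    return nonce, ("Content-Security-Policy", value)
```

The server emits the header, injects the same nonce into its own legitimate <script> tags, and the browser then refuses any injected inline script that lacks it.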
VERDICT MediaWiki violates this requirement.
11.8 Verify that the X-XSS-Protection:"1; mode=block" header is in
place.
AUTHOR Florin Tanasache
DESCRIPTION This header is used to configure the built-in reflective XSS protection
found in Internet Explorer, Chrome and Safari (Webkit). Valid settings
for the header are 0, which disables the protection, 1 which enables
the protection and 1; mode=block which tells the browser to block the
response if it detects an attack rather than sanitizing the script. A
possible related attack is the so-called “Clickjacking”, a.k.a. “UI redress
attack”: an attacker uses multiple transparent or opaque layers
to trick a user into clicking on a button or link on another page when
they were intending to click on the top-level page. Thus, the
attacker is “hijacking” clicks meant for their page and routing them to
another page, most likely owned by another application, domain, or
both. Using a similar technique, keystrokes can also be hijacked. With
a carefully crafted combination of stylesheets, iframes, and text boxes,
a user can be led to believe they are typing in the password to their
email or bank account, but are instead typing into an invisible frame
controlled by the attacker.
TOOLS
ADOPTED
ADVANCED REST CLIENT: With this tool I tested if the X-XSS-
Protection: 1; mode=block header is in place.
RIPS: I used this tool only to verify whether MediaWiki is vulnerable
to cross-site scripting attacks. The tool reported 1098 files that may
be vulnerable.
TESTING For testing, I used the tool to send some GET requests to MediaWiki.
Then I checked whether the X-XSS-Protection: 1; mode=block header
appears among the response headers. It does not: MediaWiki does not
set this header, so the requirement is not verified.
RECOMMENDED
SOLUTIONS
Since MediaWiki is a PHP web application, we can send the response
header directly from PHP with the following call:
header("X-XSS-Protection: 1; mode=block");
The header can also be enabled by inserting the following line in the
Apache httpd.conf file (or in a .htaccess file):
Header set X-XSS-Protection “1; mode=block”
VERDICT MediaWiki fails this requirement
5.3 V16: Files and resources verification requirements
16.1 Verify that URL redirects and forwards only allow whitelisted
destinations, or show a warning when redirecting to potentially
untrusted content.
AUTHOR Matteo Lucchetti
DESCRIPTION Unvalidated redirects and forwards are possible when a web
application accepts untrusted input that could cause the web
application to redirect the request to a URL contained within untrusted
input. By modifying untrusted URL input to a malicious site, an
attacker may successfully launch a phishing scam and steal user
credentials. Because the server name in the modified link is identical
to the original site, phishing attempts may have a more trustworthy
appearance. Unvalidated redirect and forward attacks can also be
used to maliciously craft a URL that would pass the application’s
access control check and then forward the attacker to privileged
functions that they would normally not be able to access.
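A whitelist check of redirect destinations, as this requirement demands, can be sketched with the standard library. The allowed host is a hypothetical example, not MediaWiki's configuration:

```python
from urllib.parse import urlparse

ALLOWED_HOSTS = {"wiki.example.org"}  # hypothetical whitelist

def safe_redirect_target(url):
    """Allow only same-site relative paths or URLs on whitelisted hosts."""
    parsed = urlparse(url)
    if not parsed.scheme and not parsed.netloc:
        # relative target: accept "/path" but reject scheme-relative "//host"
        return url.startswith("/") and not url.startswith("//")
    return parsed.scheme in ("http", "https") and parsed.netloc in ALLOWED_HOSTS
```

Anything failing the check should either be refused or shown to the user with an explicit warning page.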
TOOLS
ADOPTED
REDIRECT CHECKER: This tool reported that everything is fine.
TESTING Since none of the main tools reported an open redirect vulnerability, I
tried to verify its presence with the side tool above. This tool did not
report any vulnerabilities either. Furthermore, I found the
checkBadRedirects.php file, which performs a check on redirected pages.
RECOMMENDED
SOLUTIONS
None, because MediaWiki passes this requirement.
VERDICT MediaWiki passes this requirement
16.2 Verify that untrusted file data submitted to the application is not
used directly with file I/O commands, particularly to protect
against path traversal, local file include, file mime type, and OS
command injection vulnerabilities.
AUTHOR Michele Reale
DESCRIPTION A Path Traversal attack aims to access files and directories that are
stored outside the web root folder. By browsing the application, the
attacker looks for absolute links to files stored on the web server. By
manipulating variables that reference files with “dot-dot-slash (../)”
sequences and its variations, it may be possible to access arbitrary
files and directories stored on file system, including application source
code, configuration and critical system files, limited by system
operational access control. The attacker uses ”../” sequences to move
up to root directory, thus permitting navigation through the file system.
This attack can be executed with an external malicious code injected
on the path, like the Resource Injection attack. To perform this attack
it’s not necessary to use a specific tool; attackers typically use a
spider/crawler to detect all URLs available. This attack is also known
as “dot-dot-slash”, “directory traversal”, “directory climbing” and
“backtracking”.
TOOLS
ADOPTED
ZAP. The tool raised a lot of warnings about:
❖ disclosure of file paths in error messages (11);
❖ remote OS command injection (1);
❖ JavaScript src file inclusion (22).
Vega. The tool raised a lot of warnings about:
❖ directory listing detection (77);
❖ possible HTTP PUT file upload (2);
❖ PHP error messages possibly containing file paths (326);
❖ bash "ShellShock" injection (58);
❖ shell injection (213);
❖ local file inclusion (49);
❖ XPath injection (81);
❖ local filesystem paths in Web pages (198).
RIPS. The tool raised the following warnings:
❖ command execution (2);
❖ file disclosure (3);
❖ file inclusion (32);
❖ file manipulation (9).
TESTING Both ZAP and Vega proved the presence of a huge amount of
vulnerabilities related to this requirement.
RECOMMENDED
SOLUTIONS
Address all the warnings the tools reported, then repeat thorough
scans again.
VERDICT MediaWiki violates this requirement
16.3 Verify that files obtained from untrusted sources are validated to
be of expected type and scanned by antivirus scanners to
prevent upload of known malicious content.
AUTHOR Florin Tanasache
DESCRIPTION Many application’s business processes allow for the upload and
manipulation of data that is submitted via files. But the business
process must check the files and only allow certain “approved” file
types. Deciding what files are “approved” is determined by the
business logic and is application/system specific. The risk in that by
allowing users to upload files, attackers may submit an unexpected file
type that that could be executed and adversely impact the application
or system through attacks that may deface the web site, perform
remote commands, browse the system files, browse the local
resources, attack other servers, or exploit the local vulnerabilities, just
to name a few.
Vulnerabilities related to the upload of unexpected file types is unique
in that the upload should quickly reject a file if it does not have a
specific extension.
TOOLS
ADOPTED
None
TESTING Starting from MediaWiki version 1.1, uploads are initially disabled by
default, due to security considerations. Uploads can be enabled via a
configuration setting, so there are no problems with untrusted data
validation. In MediaWiki version 1.5 and later, the attribute to be set
resides in LocalSettings.php, and $wgEnableUploads is set as follows:
$wgEnableUploads = true; # Enable uploads
This enables uploads. However, our version of MediaWiki leaves the
line above set to false.
RECOMMENDED
SOLUTIONS
Applications should be developed with mechanisms that only accept and
manipulate “acceptable” files that the rest of the application
functionality is ready to handle and expecting. Some specific examples
include: blacklisting or whitelisting of file extensions, using the
“Content-Type” header, or using a file type recognizer, all to only allow
specified file types into the system.
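The extension-whitelisting approach mentioned above can be sketched as follows; the allowed set is an example for illustration, not MediaWiki's actual $wgFileExtensions configuration:

```python
import os

ALLOWED_EXTENSIONS = {".png", ".jpg", ".gif", ".pdf"}  # example whitelist

def upload_allowed(filename):
    """Accept a file only if its single, final extension is whitelisted."""
    name = os.path.basename(filename)  # drop any directory components
    root, ext = os.path.splitext(name)
    # reject double extensions such as "shell.php.png" outright
    if os.path.splitext(root)[1]:
        return False
    return ext.lower() in ALLOWED_EXTENSIONS
```

In practice this check should be combined with content-based file type recognition, since an extension alone is easy to forge.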
VERDICT MediaWiki passes this requirement
16.4 Verify that untrusted data is not used within inclusion, class
loader, or reflection capabilities to prevent remote/local file
inclusion vulnerabilities.
AUTHOR Matteo Lucchetti
DESCRIPTION File inclusion vulnerability allows an attacker to include a file, usually
through a script on the web server. The vulnerability occurs due to the
use of user-supplied input without proper validation. There are two
types of file inclusion:
● Local File Inclusion (LFI) - is the process of including files,
that are already locally present on the server, through the
exploiting of vulnerable inclusion procedures implemented in
the application.
● Remote File Inclusion (RFI) - is the process of including
remote files through the exploiting of vulnerable inclusion
procedures implemented in the application.
TOOLS
ADOPTED
RIPS: This tool reported that there are 571 files with the file inclusion
vulnerability.
TESTING Since file inclusion occurs when paths passed to "include" statements
are not properly sanitized, in a black-box testing approach we should
look for scripts which take filenames as parameters. RIPS reported
several files containing such scripts.
RECOMMENDED
SOLUTIONS
Three key ways to prevent file inclusion attacks are:
● Never use arbitrary input data in a literal file include request
● Use a filter to thoroughly scrub input parameters against
possible file inclusions
● Build a dynamic whitelist
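The dynamic-whitelist idea can be sketched as a mapping from user-supplied keys to concrete files, so that raw input never reaches an include statement. The page names and paths here are hypothetical:

```python
# user input selects a key; the key, not the input, names the file
PAGE_WHITELIST = {
    "home": "templates/home.php",
    "help": "templates/help.php",
}

def resolve_include(page_param):
    """Map an untrusted request parameter onto a whitelisted include path."""
    try:
        return PAGE_WHITELIST[page_param]
    except KeyError:
        raise ValueError("unknown page: %r" % page_param)
```

With this pattern, a payload like "../../etc/passwd" simply fails the lookup instead of being interpreted as a path.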
VERDICT MediaWiki fails this requirement
16.5 Verify that untrusted data is not used within cross-domain
resource sharing (CORS) to protect against arbitrary remote
content.
AUTHOR Michele Reale
DESCRIPTION Cross-Origin Resource Sharing (CORS) is a protection mechanism
to perform intended cross-site requests in a secure way and to block
unintended ones.
CORS is based on the definition of a set of permitted HTTP verbs for
each listed URL.
TOOLS
ADOPTED
VEGA: the tool reported a successful maliciously-crafted cross-site
request to Mediawiki.
TESTING Vega crafted an HTTP GET request on load.php file that led to a
successful cross-site malicious operation.
RECOMMENDED
SOLUTIONS
MediaWiki supports the definition of CORS policies through a proper
configuration of the flag $wgCrossSiteAJAXdomains in the
configuration files.
Once activated, CORS will use dedicated HTTP header (such as
Access-Control-Allow-Origin) to check the cross-site requests.
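The origin check that such a configuration performs can be sketched as follows. The allowed origins mirror the $wgCrossSiteAJAXdomains idea, but the values and function are hypothetical illustrations:

```python
ALLOWED_ORIGINS = {"https://wiki.example.org"}  # hypothetical configuration

def cors_headers(origin):
    """Echo the Origin back only when it is explicitly allowed."""
    if origin in ALLOWED_ORIGINS:
        return {"Access-Control-Allow-Origin": origin,
                "Vary": "Origin"}  # caches must key on the Origin header
    return {}  # no CORS headers: the browser blocks the cross-site read
```

The key point is that untrusted request data (the Origin header) is compared against a fixed configuration, never echoed back unconditionally.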
VERDICT MediaWiki violates this requirement
16.8 Verify the application code does not execute uploaded data
obtained from untrusted sources.
AUTHOR Florin Tanasache
DESCRIPTION This requirement concerns code execution vulnerabilities. Code
injection vulnerabilities occur where the output or content served from
a Web application can be manipulated in such a way that it triggers
server-side code execution. In some poorly written Web applications
that allow users to modify server-side files it is sometimes possible to
inject code in the scripting language of the application itself.
TOOLS
ADOPTED
RIPS: With this tool I analyzed MediaWiki and noticed that there are
1866 possible sinks for code execution vulnerabilities.
TESTING A simple static analysis test with RIPS. However, as seen for
requirement 16.3, in our MediaWiki version uploads are disabled
by default, due to security considerations.
RECOMMENDED
SOLUTIONS
To protect against this type of attack, you should analyze everything
your application does with files.
VERDICT MediaWiki fails this requirement
16.9 Do not use Flash, Active-X, Silverlight, NACL, client-side Java or
other client side technologies not supported natively via W3C
browser standards.
AUTHOR Matteo Lucchetti
DESCRIPTION The World Wide Web Consortium (W3C) is the main international
standards organization for the World Wide Web. The consortium has
established that there are some client-side technologies that should
not be used, such as Flash, ActiveX, Silverlight, NaCl, etc. The
reason is to prevent the execution of content from untrusted sources.
TOOLS
ADOPTED
None.
TESTING To verify this requirement I have to check that the web app does not
require the use of these technologies. To do that, I performed a
manual check of the code, because I did not find a suitable tool.
The search did not identify any of these technologies, though of
course some human error is possible. In any case, MediaWiki should
not need these technologies, as it is composed of simple web pages,
so I decided to trust my check.
RECOMMENDED
SOLUTIONS
None, because MediaWiki passes the requirement.
VERDICT MediaWiki passes this requirement
6. Conclusion
We conclude this report by giving our verdict on the usefulness of the tools used and on the
quality of the OWASP requirements, and by reviewing the security level of MediaWiki based on
the final verdicts of the requirements.
6.1 About the tools
We used several tools to test the requirements above, and each tool was useful for the tests it
supported. Some tools, like Burp Repeater and Advanced Rest Client, were more important than
others because of their ease of use. Other tools, like Vega, ZAP and RIPS, were used mostly to
confirm certain vulnerabilities found. However, we consider that, to test the security of an
application, a direct discussion of the design choices with the developers is very important.
6.2 About the requirements of OWASP
In this project we analyzed three OWASP requirement sections on MediaWiki. In particular we
studied whether the web app has a good implementation of session management, a good
HTTP security configuration and a good management of files and resources. Each section is
composed of several sub-requirements covering the key points of each analysis, so as to
address the best-known problems. We think these standards are complete enough, but we
believe they require continuous updates in order to cover new vulnerabilities.
6.3 About the security level of MediaWiki
To assess the level of security of MediaWiki, here is the summary table of requirements
with the final verdicts.
# Description Verdict
3.1 Verify that there is no custom session manager, or that the custom
session manager is resistant against all common session
management attacks.
Fail
3.2 Verify that sessions are invalidated when the user logs out. Pass
3.3 Verify that sessions timeout after a specified period of inactivity. Fail
3.5 Verify that all pages that require authentication have easy and
visible access to logout functionality.
Pass
3.6 Verify that the session id is never disclosed in URLs, error
messages, or logs. This includes verifying that the application does
not support URL rewriting of session cookies.
Pass
3.7 Verify that all successful authentication and re-authentication
generates a new session and session id.
Pass
3.11 Verify that session ids are sufficiently long, random and unique
across the correct active session base.
Fail
3.12 Verify that session ids stored in cookies have their path set to an
appropriately restrictive value for the application, and authentication
session tokens additionally set the “HttpOnly” and “secure”
attributes.
Fail
3.16 Verify that the application limits the number of active concurrent
sessions.
Fail
3.17 Verify that an active session list is displayed in the account profile or
similar of each user. The user should be able to terminate any active
session.
Fail
3.18 Verify the user is prompted with the option to terminate all other
active sessions after a successful change password process.
Pass
11.1 Verify that the application accepts only a defined set of required
HTTP request methods, such as GET and POST are accepted, and
unused methods (e.g. TRACE, PUT, and DELETE) are explicitly
blocked.
Fail
11.2 Verify that every HTTP response contains a content type header
specifying a safe character set (e.g., UTF-8, ISO 8859-1).
Fail
11.5 Verify that the HTTP headers or any part of the HTTP response do
not expose detailed version information of system components.
Fail
11.6 Verify that all API responses contain X-Content-Type-Options:
nosniff and Content-Disposition: attachment; filename="api.json" (or
other appropriate filename for the content type).
Don’t know
11.7 Verify that the Content Security Policy V2 (CSP) is in use in a way
that either disables inline JavaScript or provides an integrity check
on inline JavaScript with CSP noncing or hashing.
Fail
11.8 Verify that the X-XSS-Protection: 1; mode=block header is in place. Fail
16.1 Verify that URL redirects and forwards only allow whitelisted
destinations, or show a warning when redirecting to potentially
untrusted content.
Pass
16.2 Verify that untrusted file data submitted to the application is not used
directly with file I/O commands, particularly to protect against path
traversal, local file include, file mime type, and OS command
injection vulnerabilities.
Fail
16.3 Verify that files obtained from untrusted sources are validated to be
of expected type and scanned by antivirus scanners to prevent
upload of known malicious content.
Pass
16.4 Verify that untrusted data is not used within inclusion, class loader,
or reflection capabilities to prevent remote/local file inclusion
vulnerabilities.
Fail
16.5 Verify that untrusted data is not used within cross-domain resource
sharing (CORS) to protect against arbitrary remote content.
Fail
16.8 Verify the application code does not execute uploaded data obtained
from untrusted sources.
Fail
16.9 Do not use Flash, Active-X, Silverlight, NACL, client-side Java or
other client side technologies not supported natively via W3C
browser standards.
Pass
As we can see, MediaWiki fails many of the requirements; in particular, section V11 is not
supported at all. With these results, we cannot say that MediaWiki has a good level of security.
But we must remember that this web application was not developed to be secure; in fact we
have classified it as ASVS level 1.
Moreover, we tested MediaWiki as a local Web service with its default configuration, thus we
assessed the security of its default settings. We proposed several simple solutions for these
security issues based on configuring MediaWiki properly, so these verdicts do not by
themselves express the real security level of MediaWiki.
In the end, we claim that MediaWiki has a low level of security.
HP WebInspectHP WebInspect
HP WebInspect
 
Full Stack Web Development: Vision, Challenges and Future Scope
Full Stack Web Development: Vision, Challenges and Future ScopeFull Stack Web Development: Vision, Challenges and Future Scope
Full Stack Web Development: Vision, Challenges and Future Scope
 
Zap Scanning
Zap ScanningZap Scanning
Zap Scanning
 
Net training in bhubaneswar
Net training in bhubaneswar Net training in bhubaneswar
Net training in bhubaneswar
 
The Development History of PVS-Studio for Linux
The Development History of PVS-Studio for LinuxThe Development History of PVS-Studio for Linux
The Development History of PVS-Studio for Linux
 
Why Johnny Still Can’t Pentest: A Comparative Analysis of Open-source Black-...
Why Johnny Still Can’t Pentest:  A Comparative Analysis of Open-source Black-...Why Johnny Still Can’t Pentest:  A Comparative Analysis of Open-source Black-...
Why Johnny Still Can’t Pentest: A Comparative Analysis of Open-source Black-...
 
AppSec & OWASP Top 10 Primer
AppSec & OWASP Top 10 PrimerAppSec & OWASP Top 10 Primer
AppSec & OWASP Top 10 Primer
 
Ease of full Stack Development
Ease of full Stack DevelopmentEase of full Stack Development
Ease of full Stack Development
 
OWASP Europe Summit Portugal 2008. Web Application Assessments
OWASP Europe Summit Portugal 2008. Web Application AssessmentsOWASP Europe Summit Portugal 2008. Web Application Assessments
OWASP Europe Summit Portugal 2008. Web Application Assessments
 
JCON_15FactorWorkshop.pptx
JCON_15FactorWorkshop.pptxJCON_15FactorWorkshop.pptx
JCON_15FactorWorkshop.pptx
 

Recently uploaded

Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software DevelopersVinodh Ram
 
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...ICS
 
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio, Inc.
 
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsUnveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsAlberto González Trastoy
 
EY_Graph Database Powered Sustainability
EY_Graph Database Powered SustainabilityEY_Graph Database Powered Sustainability
EY_Graph Database Powered SustainabilityNeo4j
 
The Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdfThe Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdfPower Karaoke
 
HR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comHR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comFatema Valibhai
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyFrank van der Linden
 
XpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsXpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsMehedi Hasan Shohan
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...kellynguyen01
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - InfographicHr365.us smith
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptkotipi9215
 
Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...aditisharan08
 
Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)OPEN KNOWLEDGE GmbH
 
What is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need ItWhat is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need ItWave PLM
 
What is Binary Language? Computer Number Systems
What is Binary Language?  Computer Number SystemsWhat is Binary Language?  Computer Number Systems
What is Binary Language? Computer Number SystemsJheuzeDellosa
 
cybersecurity notes for mca students for learning
cybersecurity notes for mca students for learningcybersecurity notes for mca students for learning
cybersecurity notes for mca students for learningVitsRangannavar
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...gurkirankumar98700
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackVICTOR MAESTRE RAMIREZ
 

Recently uploaded (20)

Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software Developers
 
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
 
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
 
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsUnveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
 
EY_Graph Database Powered Sustainability
EY_Graph Database Powered SustainabilityEY_Graph Database Powered Sustainability
EY_Graph Database Powered Sustainability
 
The Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdfThe Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdf
 
HR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comHR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.com
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The Ugly
 
XpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsXpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software Solutions
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - Infographic
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.ppt
 
Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...
 
Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)
 
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
 
What is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need ItWhat is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need It
 
What is Binary Language? Computer Number Systems
What is Binary Language?  Computer Number SystemsWhat is Binary Language?  Computer Number Systems
What is Binary Language? Computer Number Systems
 
cybersecurity notes for mca students for learning
cybersecurity notes for mca students for learningcybersecurity notes for mca students for learning
cybersecurity notes for mca students for learning
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStack
 

security requirements or not. To analyze the application in depth, the class was divided into groups, and each group was assigned three or more categories. On May 30, 2016, each group will present its assigned points in order to give an overview of the results obtained.

1.2 Students and categories assigned
Our group consists of:
❖ Matteo Lucchetti
➢ Student ID (matricola): 1496408
➢ Email: lucchetti.1496408@studenti.uniroma1.it
❖ Michele Reale
➢ Student ID (matricola): 1315785
➢ Email: reale.1315785@studenti.uniroma1.it
❖ Florin Tanasache
➢ Student ID (matricola): 1524243
➢ Email: tanasache.1524243@studenti.uniroma1.it

We must verify the categories V3 (Session management verification), V11 (HTTP security configuration) and V16 (Files and resources requirements) of the OWASP standard on the MediaWiki web app.
2. The MediaWiki web application

2.1 Brief introduction
MediaWiki is a free and open-source wiki application. It was originally developed by the Wikimedia Foundation and runs on many websites, including Wikipedia, Wiktionary and Wikimedia Commons. It is written in PHP and uses a backend database. The peculiarity of MediaWiki, and of wiki sites in general, is that it allows its users to add, edit or delete content through a web browser, enabling the development of a virtual encyclopedia.

2.2 Level of the web-app
According to the OWASP ASVS guidelines, we decided to adopt the ASVS Level 1 (Opportunistic) specifications. MediaWiki does not correspond to Level 2 or Level 3 applications (business-critical, mission-critical), since attacking it would not give a significant gain to any attacker: no valuable resources other than the user accounts could be stolen ("too much effort, not enough gain"). Hence, an attacker would not employ expensive and sophisticated techniques and tools against MediaWiki, but rather simple and cheap automatic tools to spot the most common vulnerabilities, most of which are well documented in the OWASP Top 10 and similar checklists. We used a set of automatic analysis tools to spot and fix such vulnerabilities; the complete list follows in Section 4.1.
3. Log of working group meetings
This section presents the calendar of meetings, with the respective timetables and the work done. Hours spent on the project individually are not reported.

3.1 21/04 09:00-12:00
❖ Installation and configuration of MediaWiki on our laptops in three different environments: Windows 7 (VirtualBox), Windows 10, Linux.
❖ Discussion about the ASVS level of MediaWiki.
❖ Distribution of tasks:
➢ Lucchetti: 3.1, 3.5, 3.16, 11.1, 11.6, 16.1, 16.4, 16.9
➢ Reale: 3.2, 3.6, 3.11, 3.17, 11.2, 11.7, 16.2, 16.5
➢ Tanasache: 3.3, 3.7, 3.12, 3.18, 11.5, 11.8, 16.3, 16.8
❖ Installation and first scans with RIPS.
❖ Creation of the template for the final report.

3.2 05/05 09:00-14:00
❖ Research of tools for specific tasks.
❖ Discussion about the requirements of the first deadline.
❖ Creation of the template for the first deadline.

3.3 07/05 09:00-16:00
❖ Discussion about our results.
❖ Scans with the main tools.
❖ Compilation of the report.
❖ Conclusion of the first report.

3.4 12/05 09:00-10:00
❖ Comparison of the individual work.
❖ Compilation of the report.

3.5 19/05 09:00-13:00
❖ Comparison of the individual work.
❖ Compilation of the report.

3.6 26/05 10:00-14:00
❖ Comparison of the individual work.
❖ Compilation of the report.

3.7 28/05 11:00-15:30
❖ Preparation of the slides for the presentation.

3.8 29/05 16:00-20:00
❖ Completion of the slides.

3.9 30/05 09:30-13:30
❖ Presentation of the project.

3.10 31/05 16:00-21:15
❖ Completion of the report.
4. Executive summary

4.1 Overview of tools
To carry out the requested analyses, we used various tools, classified into two categories according to the generality of their results: the main tools scan the entire web app in a general way, trying to identify as many vulnerabilities as possible, while the side tools target specific vulnerabilities.

Main tools (used to address all three ASVS requirements):
❖ RIPS: general-purpose, grep-based static analysis of PHP-based web apps. See Section 4.3.
❖ Yasca: general-purpose, grep-based, extensible static analysis of source code written in several languages (PHP is supported as well). Too verbose with respect to W3C browser compatibility and informational messages; very few vulnerabilities found compared to the other tools.
❖ ZAP: a penetration-testing tool for web applications, featuring both static and dynamic analysis. Accurate static and dynamic analysis; HTTP request-response reporting; very long execution time; report exporting allowed.
❖ Vega: a web-app security testing tool performing both static analysis and proxy-based dynamic analysis. See Section 4.4.

Side tools:
❖ Burp Repeater (V3: Session management): assesses the degree of randomness of security tokens and nonces.
❖ Cookie Digger (V3: Session management): very similar to Burp Sequencer.
❖ JHijack (V3: Session management): a simple Java fuzzer mainly used for numeric session hijacking and parameter enumeration.
❖ DotDotPwn (V16: Path traversal): featured in the Kali Linux pentest platform.
❖ Redirect Checker (V16: Path traversal): online tool checking URL redirection.

Please note: all the tools listed have been used, but not all were helpful in addressing our requirements. In the following, we consider only those that we believe were most useful.

4.2 Summary of findings
This section shows the results generated by the main tools' scans of MediaWiki.

Scan results of RIPS (verbosity level 4):
❖ Useful for our requirements: file inclusion (V16), cross-site scripting (V3), HTTP response splitting (V11), session fixation (V3).
❖ Addressed by other requirements (not useful here): code execution, command execution, protocol injection, file disclosure, file manipulation, SQL injection, possible flow control, reflection injection, PHP object injection.

Scan results of Yasca:
❖ Useful for our requirements: cross-site scripting (V3).
❖ Addressed by other requirements (not useful here): SQL injection, weak credentials.

Scan results of ZAP:
❖ Useful for our requirements: cross-site scripting (V3), remote OS command injection (V16), application error disclosure (V3), directory browsing (V16), X-Frame-Options header not set (V11), cross-domain JavaScript source file inclusion (V16), web browser XSS protection not enabled (V11), X-Content-Type-Options header missing (V11).
❖ Addressed by other requirements (not useful here): SQL injection, buffer overflow, format string error, password autocomplete in browser, private IP disclosure.

Scan results of Vega:
❖ Useful for our requirements: cross-site scripting (V3), PHP error detected (V3), possible HTTP PUT file upload (V11 and V16), directory listing detected (V16).
❖ Addressed by other requirements (not useful here): Bash "ShellShock" injection, cleartext password over HTTP, integer overflow, page fingerprint differential detected, shell injection, SQL injection, local filesystem paths found, possible source code disclosure, possible XML injection, email addresses found, form password field autocomplete enabled.
4.3 A specific review of RIPS
RIPS is a tool written in PHP that finds vulnerabilities in PHP applications using static code analysis. By tokenizing and parsing all source code files, RIPS transforms PHP source code into a program model and detects sensitive sinks that can be tainted by user input during the program flow. Besides the structured output of the vulnerabilities found, RIPS also offers an integrated code-audit framework for further manual analysis.
The RIPS user interface is simple, but there are some nuances, one of which is the verbosity level. RIPS has five verbosity levels, each of which can be chosen to tune the results. After a complete analysis with all of them, we chose level 4, because it reports all the results of the previous levels plus additional information. We are aware that this level produces many false positives, but for a first scan of the web app we felt it appropriate to know all the possible vulnerabilities this tool can find.

4.4 A specific review of Vega
Vega is quite a powerful tool for the security analysis of web servers: it charges the web server with an intense, but carefully crafted, load of HTTP requests matched against their responses in order to detect possible security flaws. It is possible to choose the modules (analysis components) to tune the analysis to the analyst's particular needs. We configured the tool to run all the available modules, which required a 22-hour-long execution.
The reports are very detailed, since they show, in a compact and clear way:
❖ the HTTP request that led to the flaw;
❖ the content of the corresponding HTTP response;
❖ the kind of vulnerability;
❖ a discussion about it;
❖ its possible impact;
❖ suggested and practical solutions to fix it;
❖ additional references about the vulnerability;
❖ a quick recap of the spotted vulnerability, together with its severity level and classification.
Reports are a key feature of Vega, yet the software is missing a fundamental capability: it is impossible to export security reports, even though they are displayed as HTML pages inside the tool's GUI. Vega stores the reports in a database file whose content is not accessible without Vega. This is a serious drawback of an otherwise good tool, although Vega can be closed and restarted without losing any scan report.
5. Control of requirements
This section gives a detailed analysis of each individual requirement. For each requirement we give the author, a general description of its intent, the tools adopted, the testing flow, the final verdict and, if the verdict is negative, a possible solution.
Please note: all these requirements are verified on the default version of the web app, downloadable from https://www.mediawiki.org/wiki/Download.

5.1 V3: Session management system verification requirements

3.1 Verify that there is no custom session manager, or that the custom session manager is resistant against all common session management attacks.

AUTHOR: Matteo Lucchetti

DESCRIPTION: The session management mechanism is a fundamental security component in the majority of web applications. HTTP itself is a stateless protocol, and session management enables the application to uniquely identify a given user across a number of different requests and to handle the data that it accumulates about the state of that user's interaction with the application. The best practice is to use a robust, well-known session manager built into a web application framework. However, if the developers want to implement a custom session manager, they must check that it is resistant against all common session management attacks, such as session hijacking and session fixation. In the former, a malicious user acquires a valid session identifier after it has been assigned and inherits that individual's permissions; in the latter, the attacker forces a victim's session to use an identifier the attacker already knows, and thereby takes over the valid user session.

TOOLS ADOPTED: RIPS. This tool reported 41 files with the session fixation vulnerability.
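For context on what a custom session manager involves, PHP lets an application replace its default session storage by registering a handler through the standard session_set_save_handler() function. A minimal sketch follows; the in-memory array backend is purely illustrative and is not how MediaWiki actually stores sessions:

```php
<?php
// Illustrative sketch only: a custom session handler registered through
// PHP's standard session_set_save_handler(). The in-memory array backend
// is an assumption for demonstration, not MediaWiki's implementation.
class ArraySessionHandler implements SessionHandlerInterface
{
    private array $store = [];

    public function open(string $path, string $name): bool { return true; }
    public function close(): bool { return true; }
    public function read(string $id): string { return $this->store[$id] ?? ''; }
    public function write(string $id, string $data): bool { $this->store[$id] = $data; return true; }
    public function destroy(string $id): bool { unset($this->store[$id]); return true; }
    public function gc(int $maxLifetime): int { return 0; }
}

// Registering the handler replaces PHP's default (file-based) session storage.
session_set_save_handler(new ArraySessionHandler(), true);
```

A handler like this must itself defend against fixation and hijacking, e.g. by relying on unpredictable session identifiers, which is exactly what requirement 3.1 asks us to verify.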
TESTING: As the PHP manual explains, to implement a custom session manager the developers must use the session_set_save_handler() function, which provides a way to override the default session handling mechanism so that the data can be stored wherever desired. I found this function in ObjectCacheSessionHandler.php, confirming that the web app uses a custom session manager. I therefore tested its resistance against the common session management attacks.

Session fixation
I performed this attack with the Google Chrome browser. I established a legitimate connection with the web server, which issued a session ID; I then opened a new browser window in incognito mode (to simulate the victim's machine) and set a cookie with that session ID through the following client-side script:
document.cookie="wikidb_session=<session ID>";
Playing the part of the victim, I then tried, as the attack predicts, to access the web server, but it blocked me, so the attack failed.

Session hijacking
The session hijacking attack compromises the session token by stealing or predicting a valid session token to gain unauthorized access to the web server. The session token could be compromised in different ways (predictable session token, session sniffing, client-side attacks, man-in-the-middle attack, etc.). For this test I assume that the attacker managed to obtain the session ID through one of these avenues. So I set a cookie, as before, with the stolen session ID. The result is that I was able to access the web server without entering credentials, so the attack was successful.

RECOMMENDED SOLUTIONS: The best solution is to use a well-known session manager built into PHP; regenerating the session identifier after authentication (e.g. with session_regenerate_id()) also limits the usefulness of stolen identifiers.

VERDICT: MediaWiki fails this requirement.

3.2 Verify that sessions are invalidated when the user logs out.

AUTHOR: Michele Reale

DESCRIPTION: In this case, we check that it is not possible to reuse a session token after the logout operation. If a session can still be used after logging out, an attacker may grab the session token and use it to impersonate a user.

TOOLS ADOPTED: Burp Suite (Repeater). I used the Repeater tool to test whether old session tokens are still valid after the user logs out.

TESTING:
BLACK BOX TESTING: One quick way to test this is to log in, get the session token from the cookie, log out, then manually add the session cookie with the old session token and see if you are still logged in.
Several tests like this confirmed that the session token was not reusable.
CODE ANALYSIS: We inspected the source code to check whether the session cookies were invalidated. We found several modules of the application explicitly dealing with this, particularly User.php.
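The black-box check above can also be scripted. The sketch below injects the HTTP transport so the logic can be exercised without a live wiki; the URL and the "Log out" marker string are assumptions about the page content, not MediaWiki constants (the wikidb_session cookie name is the one observed during the fixation test):

```php
<?php
// Sketch of the logout-invalidation check: replay the pre-logout session
// cookie and see whether the server still treats the client as logged in.
// $httpGet is an injected transport (url, headers) -> response body, so the
// decision logic can be tested with stubs instead of a live server.
function sessionReusableAfterLogout(callable $httpGet, string $url, string $cookie): bool
{
    $body = $httpGet($url, ['Cookie' => $cookie]);
    // A logged-in page shows a personal toolbar containing a logout link;
    // its absence suggests the replayed token is no longer valid.
    return strpos($body, 'Log out') !== false;
}

// Stubbed transports standing in for real HTTP requests:
$loggedIn  = fn(string $u, array $h): string => '<li><a href="#">Log out</a></li>';
$loggedOut = fn(string $u, array $h): string => '<li><a href="#">Log in</a></li>';

var_dump(sessionReusableAfterLogout($loggedIn,  'http://localhost/wiki/', 'wikidb_session=deadbeef'));  // token still accepted
var_dump(sessionReusableAfterLogout($loggedOut, 'http://localhost/wiki/', 'wikidb_session=deadbeef'));  // session invalidated
```

With a real transport (for example a curl-based $httpGet) and the cookie captured before logout, this reproduces the Burp Repeater test described above.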
RECOMMENDED SOLUTIONS: None; MediaWiki passes this requirement.

VERDICT: MediaWiki passes this requirement.

3.3 Verify that sessions timeout after a specified period of inactivity.

AUTHOR: Florin Tanasache

DESCRIPTION: All sessions should implement an idle or inactivity timeout. This timeout defines the amount of time a session remains active when there is no activity, closing and invalidating the session once the defined idle period has elapsed since the last HTTP request received by the web application for a given session ID. The idle timeout limits the chances an attacker has to guess and use a valid session ID from another user. However, if the attacker is able to hijack a given session, the idle timeout does not limit his actions, as he can generate activity on the session periodically to keep it active for longer periods of time. Session timeout management and expiration must be enforced server-side. If the client is used to enforce the session timeout, for example by using the session token or other client parameters to track time references (e.g. number of minutes since login time), an attacker could manipulate these to extend the session duration.

TOOLS ADOPTED:
RIPS: With this tool I checked whether there are vulnerabilities in session management. The results showed that MediaWiki is vulnerable to XSS and session fixation attacks, which could compromise a valid session timeout.
Burp Suite (Repeater): I used the Repeater tool to manipulate and resend individual requests. After some individual HTTP requests to MediaWiki, I noticed that its sessions are implemented with cookies for quick navigation, and that these are deleted when the user logs out. However, I found nothing about a session timeout, so MediaWiki is vulnerable to attacks like cross-site scripting and cross-site request forgery.
TESTING
BLACK BOX TESTING: initially, I had to check whether a timeout exists, for instance by logging in and waiting for the timeout logout to be triggered. MediaWiki is a simple wiki where users can post articles and the like; a short timeout could therefore annoy users who are typing lengthy articles with unnecessary login requests, so timeouts of an hour or more can be acceptable. As said above, I logged in as a user and waited for two hours. After that, I noticed that MediaWiki still had not logged the user out after the inactivity period: MediaWiki either has no session timeout or does not set one. However, if I logged in, closed the browser and subsequently reopened MediaWiki, I was logged out automatically because the browser had been closed. Obviously, I did not check the "Mantienimi collegato" ("Keep me logged in") box when logging in.
RECOMMENDED SOLUTIONS
For a secure use of MediaWiki, a session timeout is needed. Currently session.cookie_lifetime is set to 0, which makes the session cookie a real session cookie, valid only until the browser is closed, as we saw above. Use a simple timestamp that records the time of the last activity (i.e. request) and update it with every request:

if (isset($_SESSION['LAST_ACTIVITY']) &&
    (time() - $_SESSION['LAST_ACTIVITY'] > 3600)) {
    // last request was more than 60 minutes ago
    session_unset();     // unset $_SESSION variables for the run-time
    session_destroy();   // destroy session data in storage
}
$_SESSION['LAST_ACTIVITY'] = time(); // update last-activity timestamp

Updating the session data with every request also changes the session file's modification date, so that the session is not removed by the garbage collector prematurely. In this case we chose an interval of 60 minutes.
VERDICT
MediaWiki fails this requirement.

3.5 Verify that all pages that require authentication have easy and visible access to logout functionality.
AUTHOR
Matteo Lucchetti
DESCRIPTION
The goal of the logout functionality is to destroy and make unusable all the session tokens. It is important for a user to have easy and visible access to this functionality, to prevent the "reuse" of a session. So we must check that the application provides a logout button, and that this button is present and well visible on all pages that require authentication. A logout button that is not clearly visible, or that is present only on certain pages, poses a security risk, as the user might forget to use it at the end of his/her session.
TOOLS ADOPTED
None.
TESTING
To analyze this requirement, I carried out a manual check.
I logged into the site and checked for the presence of the logout link. Given the size of the web app I could not check every page, but I searched by area and by application flow. In this check I found the logout link present on each page displayed.
RECOMMENDED SOLUTIONS
None, because MediaWiki passes this requirement.
VERDICT
MediaWiki passes this requirement.

3.6 Verify that the session id is never disclosed in URLs, error messages, or logs. This includes verifying that the application does not support URL rewriting of session cookies.
AUTHOR
Michele Reale
DESCRIPTION
Session IDs in URLs are very easy to intercept, and they are used to carry out a lot of operations. If a session ID appears in a URL, then users are severely vulnerable even to accidentally sharing a page link with other hosts. Session tokens must be kept either inside the HTTP packet header or payload.
TOOLS ADOPTED
BURP SUITE: we performed many login/logout operations for different user accounts, and we captured the HTTP requests through Burp Suite. We found that the session id is transmitted inside the packet payload and not inside the URL.
TESTING
The Burp Suite captures described above served as the test: the session id never appeared in a URL.
RECOMMENDED SOLUTIONS
None, because MediaWiki passes this requirement.
VERDICT
MediaWiki supports this requirement.

3.7 Verify that all successful authentication and re-authentication generates a new session and session id.
AUTHOR
Florin Tanasache
DESCRIPTION
When an application does not renew its session cookie(s) after a successful user authentication, it may contain a session fixation vulnerability, which forces a user to utilize a cookie known by the attacker. In that case, an attacker could steal the user session (session hijacking). Session fixation vulnerabilities occur when:
● A web application authenticates a user without first invalidating the existing session ID, thereby continuing to use the session ID already associated with the user.
● An attacker is able to force a known session ID on a user so that, once the user authenticates, the attacker has access to
the authenticated session.
In the generic exploit of session fixation vulnerabilities, an attacker creates a new session on a web application and records the associated session identifier. The attacker then causes the victim to authenticate against the server using the same session identifier, giving the attacker access to the user's account through the active session.
TOOLS ADOPTED
BURP SUITE - REPEATER: with the help of this tool (used as a proxy) I verified, through simple black box testing, whether every successful authentication and re-authentication generates a new session and session id.
TESTING
BLACK BOX TESTING: with the Repeater tool I analyzed the responses to some GET requests to MediaWiki. First, I logged in with a userid and password; Burp showed me the corresponding "wikidb_session" cookie for this authentication. After that, I logged out of MediaWiki. Then, after a successful re-authentication to the application, I noticed that a new cookie had been generated. Hence, MediaWiki changes the session ID for each new re-authentication.
RECOMMENDED SOLUTIONS
None, because MediaWiki passes the requirement.
VERDICT
MediaWiki passes this requirement.

3.11 Verify that session ids are sufficiently long, random and unique across the correct active session base.
AUTHOR
Michele Reale
DESCRIPTION
Session identifiers must be long and random enough to resist guessing, and unique among the currently active sessions.
TOOLS ADOPTED
BURP SUITE: inspecting cookies in HTTP requests.
TESTING
First, we inspected some HTTP packets to determine the kind of session tokens used: the first results - 32 hexadecimal digits, corresponding to 128 bits - were confirmed by inspecting the source code file devoted to generating such tokens. The algorithm used to generate random tokens harvests randomness from many sources. However, another inspection of the source code revealed that the tokens are generated without considering the possible presence of an identical token in the database, so this requirement is not completely fulfilled.
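The missing check amounts to a collision test at generation time. A minimal sketch (ISSUED stands in for the session records in the database; this is not MediaWiki code):

```python
import secrets

ISSUED = set()  # stand-in for the session-id records in the database

def new_session_id():
    # 32 hex digits = 128 bits, the same token size MediaWiki uses
    while True:
        token = secrets.token_hex(16)
        if token not in ISSUED:  # the uniqueness check the source code omits
            ISSUED.add(token)
            return token

tokens = [new_session_id() for _ in range(1000)]
print(len(tokens), len(set(tokens)))  # 1000 tokens, all unique
```

With 128 random bits a collision is astronomically unlikely, which is presumably why the check was omitted; the requirement nonetheless asks for uniqueness to be enforced explicitly.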
RECOMMENDED SOLUTIONS
Look for the presence of the freshly generated token in the database before assigning it to a user.
VERDICT
MediaWiki violates this requirement.

3.12 Verify that session ids stored in cookies have their path set to an appropriately restrictive value for the application, and authentication session tokens additionally set the "HttpOnly" and "secure" attributes.
AUTHOR
Florin Tanasache
DESCRIPTION
Cookies are often a key attack vector for malicious users (typically targeting other users), and the application should always take due diligence to protect them. The importance of the secure use of cookies cannot be overstated, especially within dynamic web applications, which need to maintain state across a stateless protocol such as HTTP. To understand the importance of cookies, it is imperative to understand what they are primarily used for: since HTTP is stateless, the server cannot determine whether a request it receives is part of a current session or the start of a new session without some type of identifier. Once the tester understands how cookies are set, when they are set, what they are used for and why, he should look at which attributes can be set for a cookie and how to test whether they are secure. The following is the list of attributes that can be set for each cookie and what they mean; the testing section focuses on how to test each attribute:
● secure - This attribute tells the browser to send the cookie only if the request is being sent over a secure channel such as HTTPS. This helps protect the cookie from being passed over unencrypted requests. If the application can be accessed over both HTTP and HTTPS, there is the potential for the cookie to be sent in clear text.
● HttpOnly - This attribute helps prevent attacks such as cross-site scripting, since it does not allow the cookie to be accessed by a client-side script such as JavaScript. Note that not all browsers support this functionality.
● path - The path attribute signifies the URL or path for which the cookie is valid. If the path attribute is set too loosely, it could leave the application vulnerable to attacks by other applications on the same server. For example, if the path attribute were set to the web server root "/", then the application cookies would be sent to every application within the same domain.
TOOLS ADOPTED
BURP SUITE - REPEATER: using this tool as an intercepting proxy, I analyzed all responses where a cookie is set by the application (via the Set-Cookie directive) and inspected the cookie for the attributes above.
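For reference, all three attributes can be emitted in a single Set-Cookie header. The sketch below uses Python's standard http.cookies module with a hypothetical cookie name, value and path (MediaWiki itself sets its cookies from PHP):

```python
from http import cookies

jar = cookies.SimpleCookie()
jar["wikidb_session"] = "0123456789abcdef"      # hypothetical session value
jar["wikidb_session"]["path"] = "/mediawiki/"   # restrictive path, not "/"
jar["wikidb_session"]["secure"] = True          # only send over HTTPS
jar["wikidb_session"]["httponly"] = True        # not readable from JavaScript

header = jar.output(header="Set-Cookie:")
print(header)
```

The printed header carries the Path, Secure and HttpOnly attributes the requirement asks for.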
TESTING
BLACK BOX TESTING: for MediaWiki I made a GET request and checked whether the Set-Cookie header in the HTTP response includes the following attributes and values.
● Secure attribute: this attribute is not guaranteed. The browser will therefore agree to pass the cookie over an unencrypted channel such as HTTP, which could let an attacker lead users into submitting their cookie over an insecure channel.
● HttpOnly attribute: this attribute should always be set, even though not every browser supports it. It helps secure the cookie from being accessed by a client-side script; it does not eliminate cross-site scripting risks, but does eliminate some exploitation vectors. We checked whether the ";HttpOnly" tag had been set, and in this case it had.
● Path attribute: this attribute is specified, but it is set to the default path (/), so the cookie can be exposed to less secure applications on the same server. For example, if the application resides at /myapp/, one should verify that the cookie path is set to "; path=/myapp/" and NOT to "; path=/".
RECOMMENDED SOLUTIONS
A solution for PHP applications is the use of the following function:

setcookie($name, $value, $expire, $path, $domain, $secure, $httponly)

This function defines a cookie to be sent along with the rest of the HTTP headers.
VERDICT
MediaWiki fails this requirement.

3.16 Verify that the application limits the number of active concurrent sessions.
AUTHOR
Matteo Lucchetti
DESCRIPTION
When two or more sessions are held at the same time, they are known as concurrent sessions. It is a common requirement that a web application not allow a user to have more than one session active at a time. In other words, after a user logs into an application, he should not be permitted to open a different type of browser (or use another computer) to log in again until his first session has ended.
We must check that multiple concurrent sessions are not allowed, to protect the privacy of the users.
TOOLS ADOPTED
None.
TESTING
To analyze this requirement, I carried out a manual check. I simply tried to log in with the same user from different browsers, and the result is that I obtained different concurrent sessions for the same user.
RECOMMENDED SOLUTIONS
We could maintain a DB table of active user sessions, where a session is considered active if the last user activity took place less than X (a configurable value) minutes ago. Each time a user tries to authenticate through the login form, we should check how many sessions for that user are currently active and, based on that check, decide whether to authenticate him or decline with some form of response message.
VERDICT
MediaWiki fails this requirement.

3.17 Verify that an active session list is displayed in the account profile or similar of each user. The user should be able to terminate any active session.
AUTHOR
Michele Reale
DESCRIPTION
A session management mechanism allows users to control their currently active sessions on the website. Users may forget to log out of their account when using a different workstation, or their password may be discovered; a session management dashboard allows users to be aware of such problems and fix them.
TOOLS ADOPTED
None.
TESTING
Two separate logins from different hosts were performed with the same username, and no session list control is available. Moreover, the logout operation performed on one host does not log out the other sessions.
RECOMMENDED SOLUTIONS
A session management interface should be provided to logged-in users.
VERDICT
MediaWiki violates this requirement.

3.18 Verify the user is prompted with the option to terminate all other active sessions after a successful change password process.
AUTHOR
Florin Tanasache
DESCRIPTION
This requirement reflects a good security practice: all sessions should be invalidated on password change. For instance, the user is
changing his password because the old one has been compromised. In that case, invalidating all sessions helps protect the user: after he clicks the password reset link his password is changed, which means he cannot log in with the old credentials; the credentials that had logged him into the website are expired. This practice is used by the majority of web applications.
TOOLS ADOPTED
None.
TESTING
To test this requirement I performed manual testing. I used two browsers, so that multiple login sessions were established. Then, in one browser, I changed the account password. After that I tried to navigate in the second browser and saw that the session had been terminated: the browser asked me for credentials. Therefore MediaWiki satisfies this requirement.
RECOMMENDED SOLUTIONS
None, because MediaWiki passes this requirement.
VERDICT
MediaWiki passes this requirement.

5.2 V11: HTTP security configuration verification requirements

11.1 Verify that the application accepts only a defined set of required HTTP request methods, such as GET and POST are accepted, and unused methods (e.g. TRACE, PUT, and DELETE) are explicitly blocked.
AUTHOR
Matteo Lucchetti
DESCRIPTION
HTTP offers a number of methods that can be used to perform actions on the web server. While GET and POST are by far the most common methods used to access information provided by a web server, the Hypertext Transfer Protocol allows several other methods. Some of these methods can potentially pose a security risk for a web application, as they allow an attacker to modify the files stored on the web server and, in some scenarios, steal the credentials of legitimate users. More specifically, the methods that should be disabled are the following: PUT, DELETE, CONNECT and TRACE.
TOOLS ADOPTED
ADVANCED REST CLIENT: I used this tool to send and test different HTTP requests to MediaWiki.
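The expected behaviour - 405 for anything outside the whitelist - can be demonstrated with a self-contained sketch: a toy Python server standing in for the Apache rewrite whitelist recommended for this requirement (this is an illustration, not MediaWiki code):

```python
import http.client
import http.server
import threading

ALLOWED_METHODS = {"GET", "POST", "HEAD"}  # whitelist, as in the Apache rule

class WhitelistHandler(http.server.BaseHTTPRequestHandler):
    def _respond(self):
        if self.command not in ALLOWED_METHODS:
            # explicitly block anything outside the whitelist
            self.send_response(405)  # Method Not Allowed
            self.send_header("Allow", ", ".join(sorted(ALLOWED_METHODS)))
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        if self.command != "HEAD":
            self.wfile.write(b"ok")

    # route every method through the same whitelist check
    do_GET = do_POST = do_HEAD = _respond
    do_PUT = do_DELETE = do_TRACE = do_CONNECT = _respond

    def log_message(self, *args):  # keep the demo output quiet
        pass

def probe(method, port):
    conn = http.client.HTTPConnection("localhost", port)
    conn.request(method, "/")
    status = conn.getresponse().status
    conn.close()
    return status

server = http.server.HTTPServer(("localhost", 0), WhitelistHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

status_get = probe("GET", port)      # expected 200
status_trace = probe("TRACE", port)  # expected 405
server.shutdown()
print(status_get, status_trace)
```

Probing the real application the same way (one request per method, comparing status codes) is exactly what was done with Advanced Rest Client in the testing below.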
TESTING
I sent different HTTP requests for every method. All the responses I received had status "200 OK", so the requests were allowed; if they had been blocked, the status code would have been "405 Method Not Allowed".
RECOMMENDED SOLUTIONS
We could insert the following Apache configuration directives in httpd.conf:

RewriteEngine On
RewriteCond %{REQUEST_METHOD} !^(GET|POST|HEAD)$
RewriteRule .* - [R=405,L]

This is a sort of whitelist that permits only the "secure" methods.
VERDICT
MediaWiki fails this requirement.

11.2 Verify that every HTTP response contains a content type header specifying a safe character set (e.g., UTF-8, ISO 8859-1).
AUTHOR
Michele Reale
DESCRIPTION
HTTP defines a Content-Type header which specifies the character set to be used when reading the HTTP packet payload bytes. There are several character sets, which define how characters are encoded in bytes and decoded back, such as Unicode Transformation Format 8 (UTF-8) and ISO 8859-1. The content of a file is just a sequence of bytes that needs to be decoded into meaningful characters, unless its raw binary content is to be accessed. Since different character sets may map the same bytes to different characters, it is necessary to specify which character set was used to generate a file in order to avoid any ambiguity. Using the wrong charset (or not specifying it explicitly) allows attackers to perform XSS attacks based on exploiting subtle differences between charsets.
TOOLS ADOPTED
ADVANCED REST CLIENT: this tool was used to check the presence of a charset specification in the HTTP response Content-Type header.
VEGA: this tool was used to perform intensive HTTP request-response analysis to detect potential flaws.
TESTING
Vega reported 9 Informational-level HTTP GET responses with no charset specified, regarding minor-importance resources.
These resources were retrieved with Advanced Rest Client and displayed with different charsets to check for discrepancies that might have led to an XSS exploitation: the files do not appear to change under the classic charsets. Most importantly, their Content-Type header was missing an explicit charset. Despite the small number and low importance of such resources, the requirement demands that no HTTP response have an implicit charset, so the requirement is violated.
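The kind of charset-dependent ambiguity at stake can be shown in a couple of lines: the same bytes are harmless-looking text under ASCII but a script tag under UTF-7, a classic charset-sniffing XSS vector (this is an illustration of the risk, not a finding against MediaWiki):

```python
payload = b"+ADw-script+AD4-alert(1)+ADw-/script+AD4-"

# under ASCII the bytes look like inert text no naive HTML filter would flag
as_ascii = payload.decode("ascii")
# under UTF-7 the very same bytes decode to an executable script tag
as_utf7 = payload.decode("utf-7")

print(as_ascii)
print(as_utf7)  # <script>alert(1)</script>
```

Declaring the charset explicitly in the Content-Type header removes this ambiguity, because the browser no longer has to guess the encoding.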
RECOMMENDED SOLUTIONS
Such resources can be returned with a specific charset by using the Apache configuration directive AddDefaultCharset utf-8 inside the httpd.conf file.
VERDICT
MediaWiki violates this requirement.

11.5 Verify that the HTTP headers or any part of the HTTP response do not expose detailed version information of system components.
AUTHOR
Florin Tanasache
DESCRIPTION
Knowing the version and type of system components, in particular web servers, allows testers to determine known vulnerabilities and the appropriate exploits to use during testing. There are several different vendors and versions of web servers on the market today. Knowing the type of web server being tested significantly helps in the testing process, and will also change the course of the test. This information can be derived by sending the web server specific commands and analyzing the output, as each version of web server software may respond differently to these commands. By knowing how each type of web server responds to specific commands, and keeping this information in a web server fingerprint database, a penetration tester can send these commands to the web server, analyze the response, and compare it to the database of known signatures.
TOOLS ADOPTED
ADVANCED REST CLIENT: with this tool I checked whether the web server information is visible in the HTTP headers.
TESTING
The simplest and most basic way to identify a web server is to look at the Server field in the HTTP response header; for this experiment I used the tool above. After an HTTP request-response, I noticed the following line:

Server: Apache/2.4.17 (Win32) OpenSSL/1.0.2g PHP/5.5.33

From the Server field, I understood that the server is likely Apache, version 2.4.17, running on a Windows operating system.
RECOMMENDED SOLUTIONS
The possible solutions are:
- protect the presentation-layer web server behind a hardened reverse proxy;
- obfuscate the presentation-layer web server headers.
Apache allows hiding its own version information by inserting the following lines in the httpd.conf file:

ServerTokens Prod
ServerSignature Off
VERDICT
MediaWiki fails this requirement.

11.6 Verify that all API responses contain X-Content-Type-Options: nosniff and Content-Disposition: attachment; filename="api.json" (or other appropriate filename for the content type).
AUTHOR
Matteo Lucchetti
DESCRIPTION
Some HTTP headers are used to prevent specific vulnerabilities. In particular, we want to check that every API response contains "X-Content-Type-Options: nosniff", which prevents MIME-sniffing: without it, the browser can be manipulated into interpreting data in a way that allows an attacker to carry out operations not expected by either the site operator or the user, such as cross-site scripting. Furthermore, we want to make sure that the Content-Disposition header identifies the content type with an appropriate filename.
TOOLS ADOPTED
ADVANCED REST CLIENT: I used this tool to check whether X-Content-Type-Options and Content-Disposition are set correctly.
TESTING
I sent some HTTP requests to MediaWiki, then checked whether the X-Content-Type-Options response header is set to "nosniff", and noticed that it is set correctly; the script WebStart.php adds this header to each HTTP response. I also tried to check the association of Content-Disposition with the Content-Type, but I found no download dialog windows, only an instruction to convert a web page to PDF and download it. In that case the Content-Disposition identifies the content type with an appropriate filename, but this is not enough to consider the requirement passed.
RECOMMENDED SOLUTIONS
We could define a single interface point where the Content-Disposition header is set according to the specific file.
VERDICT
We cannot express a verdict.

11.7 Verify that the Content Security Policy V2 (CSP) is in use in a way that either disables inline JavaScript or provides an integrity check on inline JavaScript with CSP noncing or hashing.
AUTHOR
Michele Reale
DESCRIPTION
A Content Security Policy (CSP) is a standard security mechanism for web servers which allows controlling (and possibly forbidding completely) the execution of in-line JavaScript code - that is, the code inside a <script>...</script> tag; a script file referenced by the src attribute is instead called external JavaScript source code. CSP provides many directives to define a suitable policy for the
specific system purposes: a very good example is the current GitHub CSP, visible in the Content-Security-Policy header of its homepage HTTP response. See https://glebbahmutov.com/blog/disable-inline-javascript-for-security/ and https://www.mediawiki.org/wiki/Requests_for_comment/Content-Security-Policy for useful examples and recommendations.
TOOLS ADOPTED
ADVANCED REST CLIENT: used to check the presence of the Content-Security-Policy header in MediaWiki pages.
GOOGLE CHROME DEVELOPER CONSOLE: used to try to execute in-line JavaScript code.
TESTING
First, the MediaWiki home page was retrieved through Advanced Rest Client: the HTTP response was missing the CSP header. Then a simple script with in-line JavaScript code was launched on the main page through the Google Chrome Developer Console, to check whether an in-line JavaScript injection is allowed:

var el = document.createElement('script');
el.innerText = 'alert("This site has a robust CSP.");'
document.body.appendChild(el);

The result was that a popup with the specified text appeared; hence MediaWiki allows in-line JavaScript code by default, and the system violates the requirement.
RECOMMENDED SOLUTIONS
The MediaWiki project community is still implementing and discussing CSP support: see https://www.mediawiki.org/wiki/Requests_for_comment/Content-Security-Policy for further (and very recent) references.
VERDICT
MediaWiki violates this requirement.

11.8 Verify that the X-XSS-Protection: "1; mode=block" header is in place.
AUTHOR
Florin Tanasache
DESCRIPTION
This header is used to configure the built-in reflective XSS protection found in Internet Explorer, Chrome and Safari (WebKit). Valid settings for the header are 0, which disables the protection; 1, which enables it; and 1; mode=block, which tells the browser to block the response if it detects an attack, rather than sanitizing the script. A possible attack is the so-called "Clickjacking", a.k.a.
"UI redress attack": the attacker uses multiple transparent or opaque layers to trick a user into clicking a button or link on another page when they intended to click on the top-level page. Thus, the attacker is "hijacking" clicks meant for one page and routing them to
another page, most likely owned by another application, domain, or both. Using a similar technique, keystrokes can also be hijacked: with a carefully crafted combination of stylesheets, iframes and text boxes, a user can be led to believe he is typing the password to his email or bank account, while instead typing into an invisible frame controlled by the attacker.
TOOLS ADOPTED
ADVANCED REST CLIENT: with this tool I tested whether the X-XSS-Protection: 1; mode=block header is in place.
RIPS: I used this tool only to verify whether MediaWiki is vulnerable to Cross-Site Scripting attacks. The tool reported 1098 potentially vulnerable files.
TESTING
For testing, I used the tool to send some GET requests to MediaWiki, then checked whether the X-XSS-Protection: 1; mode=block header was present among the response headers. I noticed that it is not: MediaWiki does not set this header.
RECOMMENDED SOLUTIONS
Since MediaWiki is a PHP web application, we can send the response header from PHP:

header("X-XSS-Protection: 1; mode=block");

The header can also be enabled by inserting the following line in the Apache httpd.conf file:

Header set X-XSS-Protection "1; mode=block"

VERDICT
MediaWiki fails this requirement.

5.3 V16: Files and resources verification requirements

16.1 Verify that URL redirects and forwards only allow whitelisted destinations, or show a warning when redirecting to potentially untrusted content.
AUTHOR
Matteo Lucchetti
DESCRIPTION
Unvalidated redirects and forwards are possible when a web application accepts untrusted input that could cause it to redirect the request to a URL contained within that input. By modifying untrusted URL input to point to a malicious site, an attacker may successfully launch a phishing scam and steal user credentials.
Because the server name in the modified link is identical to the original site, phishing attempts may have a more trustworthy
appearance. Unvalidated redirect and forward attacks can also be used to maliciously craft a URL that would pass the application's access control check and then forward the attacker to privileged functions that he would normally not be able to access.
TOOLS ADOPTED
REDIRECT CHECKER: this tool reported that everything is fine.
TESTING
Since none of the main tools reported an open redirect vulnerability, I tried to verify its presence with the side tool above; this tool did not report any vulnerability either. Furthermore, I found the checkBadRedirects.php file, which performs a check on redirected pages.
RECOMMENDED SOLUTIONS
None, because MediaWiki passes this requirement.
VERDICT
MediaWiki passes this requirement.

16.2 Verify that untrusted file data submitted to the application is not used directly with file I/O commands, particularly to protect against path traversal, local file include, file mime type, and OS command injection vulnerabilities.
AUTHOR
Michele Reale
DESCRIPTION
A Path Traversal attack aims to access files and directories that are stored outside the web root folder. By browsing the application, the attacker looks for absolute links to files stored on the web server. By manipulating variables that reference files with "dot-dot-slash (../)" sequences and their variations, it may be possible to access arbitrary files and directories stored on the file system - including application source code, configuration and critical system files - limited only by operational access control. The attacker uses "../" sequences to move up to the root directory, thus permitting navigation through the file system. This attack can be combined with external malicious code injected into the path, as in a Resource Injection attack. No specific tool is necessary to perform this attack; attackers typically use a spider/crawler to detect all available URLs. The attack is also known as "dot-dot-slash", "directory traversal", "directory climbing" and "backtracking".
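The defence against the "../" trick boils down to normalising the candidate path and checking that it still lies under the intended base directory. A minimal sketch (safe_join is a hypothetical helper, not a MediaWiki function):

```python
import posixpath

def safe_join(base, user_path):
    """Join a user-supplied path to base, refusing traversal outside it."""
    resolved = posixpath.normpath(posixpath.join(base, user_path))
    # after normalisation all "../" sequences are collapsed; anything that
    # escaped the base directory is rejected
    if resolved != base and not resolved.startswith(base + "/"):
        raise ValueError("path traversal attempt: %r" % user_path)
    return resolved

ok = safe_join("/var/www/files", "reports/2016.txt")
print(ok)  # /var/www/files/reports/2016.txt
try:
    safe_join("/var/www/files", "../../etc/passwd")
except ValueError as exc:
    print(exc)
```

A real implementation should additionally resolve symlinks (e.g. with os.path.realpath) before the prefix check, since normpath operates on the string only.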
TOOLS ADOPTED
ZAP. The tool raised many warnings about:
❖ disclosure of file paths in error messages (11);
❖ remote OS command injection (1);
❖ JavaScript src file inclusion (22).
Vega. The tool raised many warnings about:
❖ directory listing detection (77);
❖ possible HTTP PUT file upload (2);
❖ PHP error messages possibly containing file paths (326);
❖ bash "ShellShock" injection (58);
❖ shell injection (213);
❖ local file inclusion (49);
❖ XPath injection (81);
❖ local filesystem paths in Web pages (198).
RIPS. The tool raised the following warnings:
❖ command execution (2);
❖ file disclosure (3);
❖ file inclusion (32);
❖ file manipulation (9).
TESTING
Both ZAP and Vega indicated the presence of a huge number of vulnerabilities related to this requirement.
RECOMMENDED SOLUTIONS
Address all the warnings the tools reported, then repeat thorough scans.
VERDICT
MediaWiki violates this requirement.

16.3 Verify that files obtained from untrusted sources are validated to be of expected type and scanned by antivirus scanners to prevent upload of known malicious content.
AUTHOR
Florin Tanasache
DESCRIPTION
Many applications' business processes allow the upload and manipulation of data submitted via files, but the business process must check the files and only allow certain "approved" file types. Deciding which files are "approved" is determined by the business logic and is application/system specific. The risk is that, by allowing users to upload files, attackers may submit an unexpected file type that could be executed and adversely impact the application or system, through attacks that may deface the web site, perform remote commands, browse system files or local resources, attack other servers, or exploit local vulnerabilities, to name just a few. Vulnerabilities related to the upload of unexpected file types are distinctive in that the upload should quickly reject a file if it does not have an approved extension.
TOOLS ADOPTED
None.
TESTING
Starting from MediaWiki version 1.1, uploads are initially disabled by default, due to security considerations; they can be enabled via a configuration setting. So there are no problems with untrusted data validation.
In MediaWiki version 1.5 and later, the setting resides in LocalSettings.php, where $wgEnableUploads is set as follows:
$wgEnableUploads = true; # Enable uploads

This enables uploads. However, our installation of MediaWiki has this setting set to false.
RECOMMENDED SOLUTIONS
Applications should be developed with mechanisms to accept and manipulate only "acceptable" files, which the rest of the application functionality is ready to handle and expects. Some specific examples include: black- or whitelisting of file extensions, using the "Content-Type" header, or using a file type recognizer, all to allow only specified file types into the system.
VERDICT
MediaWiki passes this requirement.

16.4 Verify that untrusted data is not used within inclusion, class loader, or reflection capabilities to prevent remote/local file inclusion vulnerabilities.
AUTHOR
Matteo Lucchetti
DESCRIPTION
A file inclusion vulnerability allows an attacker to include a file, usually through a script on the web server. The vulnerability occurs due to the use of user-supplied input without proper validation. There are two types of file inclusion:
● Local File Inclusion (LFI) - the inclusion of files already locally present on the server, by exploiting vulnerable inclusion procedures implemented in the application.
● Remote File Inclusion (RFI) - the inclusion of remote files, by exploiting vulnerable inclusion procedures implemented in the application.
TOOLS ADOPTED
RIPS: this tool reported 571 files with the file inclusion vulnerability.
TESTING
Since file inclusion occurs when paths passed to "include" statements are not properly sanitized, in a black-box testing approach we should look for scripts which take filenames as parameters. RIPS reported several files containing such scripts.
RECOMMENDED SOLUTIONS
Three key ways to prevent file inclusion attacks are:
● Never use arbitrary input data in a literal file include request
● Use a filter to thoroughly scrub input parameters against possible file inclusions
● Build a dynamic whitelist

VERDICT
MediaWiki fails this requirement
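The whitelist approach recommended above can be sketched as follows. This is a minimal illustration, not MediaWiki code: the template names, directory and function are hypothetical.

```python
from pathlib import Path

# Hypothetical whitelist: the only page templates the application may include.
ALLOWED_TEMPLATES = {"home": "home.php", "help": "help.php", "about": "about.php"}
TEMPLATE_DIR = Path("/var/www/app/templates")

def resolve_template(user_param: str) -> Path:
    """Map an untrusted request parameter to a whitelisted template path.

    The user-supplied value is only used as a dictionary key, never as a
    filesystem path, so payloads such as '../../etc/passwd' or a remote
    URL can never reach an include statement.
    """
    try:
        filename = ALLOWED_TEMPLATES[user_param]
    except KeyError:
        raise ValueError(f"unknown template: {user_param!r}")
    return TEMPLATE_DIR / filename

print(resolve_template("help"))          # a safe, whitelisted path
# resolve_template("../../etc/passwd")   # would raise ValueError
```

The key design point is that untrusted input selects from a fixed set of server-controlled paths instead of being concatenated into one.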
16.5 Verify that untrusted data is not used within cross-domain resource sharing (CORS) to protect against arbitrary remote content.

AUTHOR
Michele Reale

DESCRIPTION
Cross-Origin Resource Sharing (CORS) is a protection mechanism that performs intended cross-site requests in a secure way and blocks unintended ones. CORS is based on the definition of a set of permitted HTTP verbs for each listed URL.

TOOLS ADOPTED
VEGA: the tool reported a successful maliciously crafted cross-site request to MediaWiki.

TESTING
Vega crafted an HTTP GET request on the load.php file that led to a successful cross-site malicious operation.

RECOMMENDED SOLUTIONS
MediaWiki supports the definition of CORS policies through a proper configuration of the $wgCrossSiteAJAXdomains setting in the configuration files. Once activated, CORS will use dedicated HTTP headers (such as Access-Control-Allow-Origin) to check cross-site requests.

VERDICT
MediaWiki violates this requirement

16.8 Verify the application code does not execute uploaded data obtained from untrusted sources.

AUTHOR
Florin Tanasache

DESCRIPTION
This requirement concerns code execution vulnerabilities. Code injection vulnerabilities occur where the output or content served by a web application can be manipulated in such a way that it triggers server-side code execution. In some poorly written web applications that allow users to modify server-side files, it is sometimes possible to inject code in the scripting language of the application itself.

TOOLS ADOPTED
RIPS: with this tool I analyzed MediaWiki and noticed 1866 possible sinks for code execution vulnerabilities.

TESTING
A simple static analysis test with RIPS. However, as seen in requirement 16.3, in our MediaWiki version uploads are disabled by default for security reasons.

RECOMMENDED SOLUTIONS
To protect against this type of attack, analyze everything your application does with files.
VERDICT
MediaWiki fails this requirement
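As a sketch of the file-validation defenses recommended in 16.3 and 16.8 — a hypothetical upload handler, not MediaWiki's actual code — an upload can be checked against both an extension whitelist and the magic-number signature expected for that extension before it is stored:

```python
import os

# Hypothetical whitelist mapping allowed extensions to their magic-byte prefixes.
ALLOWED_TYPES = {
    ".png": b"\x89PNG\r\n\x1a\n",
    ".jpg": b"\xff\xd8\xff",
    ".gif": b"GIF8",
}

def is_upload_allowed(filename: str, content: bytes) -> bool:
    """Accept an upload only if its extension is whitelisted AND the file
    content starts with the magic bytes expected for that extension, so a
    script renamed to 'shell.png' is still rejected."""
    ext = os.path.splitext(filename)[1].lower()
    magic = ALLOWED_TYPES.get(ext)
    return magic is not None and content.startswith(magic)

print(is_upload_allowed("logo.png", b"\x89PNG\r\n\x1a\n" + b"data"))   # True
print(is_upload_allowed("shell.php", b"<?php system('id'); ?>"))       # False
print(is_upload_allowed("shell.png", b"<?php system('id'); ?>"))       # False
```

On top of such checks, accepted files should be stored outside the web root under server-generated names and served with a fixed Content-Type, so that even a malicious file that slips through is never executed.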
16.9 Do not use Flash, Active-X, Silverlight, NACL, client-side Java or other client side technologies not supported natively via W3C browser standards.

AUTHOR
Matteo Lucchetti

DESCRIPTION
The World Wide Web Consortium (W3C) is the main international standards organization for the World Wide Web. The consortium has established that certain client-side technologies, such as Flash, Active-X, Silverlight and NaCl, should not be used; the reason is to prevent the upload of data from untrusted sources.

TOOLS ADOPTED
None.

TESTING
To verify this requirement I had to check that the web app does not require the use of these technologies. Since I did not find a suitable tool, I checked the code manually. The review did not identify any of these technologies, although a manual check of course leaves room for human error. In any case, MediaWiki should not need these technologies, as it is composed of simple web pages, so I decided to trust my review.

RECOMMENDED SOLUTIONS
None, because MediaWiki passes the requirement.

VERDICT
MediaWiki passes this requirement

6. Conclusion

We conclude this report by giving our verdict on the usefulness of the tools used and on the quality of the OWASP requirements, and by reviewing the security level of MediaWiki based on the final verdicts of the requirements.
6.1 About the tools

We used several tools to test the requirements above, and each proved useful for the tests performed. Some tools, like Burp Repeater and Advanced REST Client, were more important than others because of their ease of use. Other tools, like Vega, ZAP and RIPS, were used mostly to confirm certain vulnerabilities we had found. However, we consider that, to assess the security of an application, a direct discussion with the developers about the design choices is very important.

6.2 About the requirements of OWASP

In this project we analyzed three OWASP requirement categories on MediaWiki. In particular, we studied whether the web app has a good implementation of session management, a good HTTP security configuration and a good management of files and resources. Each category is composed of several sub-requirements covering the key points of the analysis, so as to address the best-known problems. We think these standards are complete enough, but we believe they require continuous updates to cover newly discovered vulnerabilities.

6.3 About the security level of MediaWiki

To assess the security level of MediaWiki, here is the summary table of requirements with the final verdicts.

# Description Verdict
3.1 Verify that there is no custom session manager, or that the custom session manager is resistant against all common session management attacks. Fail
3.2 Verify that sessions are invalidated when the user logs out. Pass
3.3 Verify that sessions timeout after a specified period of inactivity. Fail
3.5 Verify that all pages that require authentication have easy and visible access to logout functionality. Pass
3.6 Verify that the session id is never disclosed in URLs, error messages, or logs. This includes verifying that the application does not support URL rewriting of session cookies. Pass
3.7 Verify that all successful authentication and re-authentication generates a new session and session id. Pass
3.11 Verify that session ids are sufficiently long, random and unique across the correct active session base. Fail
3.12 Verify that session ids stored in cookies have their path set to an appropriately restrictive value for the application, and authentication session tokens additionally set the "HttpOnly" and "secure" attributes. Fail
3.16 Verify that the application limits the number of active concurrent sessions. Fail
3.17 Verify that an active session list is displayed in the account profile or similar of each user. The user should be able to terminate any active session. Fail
3.18 Verify the user is prompted with the option to terminate all other active sessions after a successful change password process. Pass
11.1 Verify that the application accepts only a defined set of required HTTP request methods, such as GET and POST are accepted, and unused methods (e.g. TRACE, PUT, and DELETE) are explicitly blocked. Fail
11.2 Verify that every HTTP response contains a content type header specifying a safe character set (e.g., UTF-8, ISO 8859-1). Fail
11.5 Verify that the HTTP headers or any part of the HTTP response do not expose detailed version information of system components. Fail
11.6 Verify that all API responses contain X-Content-Type-Options: nosniff and Content-Disposition: attachment; filename="api.json" (or other appropriate filename for the content type). Don't know
11.7 Verify that the Content Security Policy V2 (CSP) is in use in a way that either disables inline JavaScript or provides an integrity check on inline JavaScript with CSP noncing or hashing. Fail
11.8 Verify that the X-XSS-Protection: 1; mode=block header is in place. Fail
16.1 Verify that URL redirects and forwards only allow whitelisted destinations, or show a warning when redirecting to potentially untrusted content. Pass
16.2 Verify that untrusted file data submitted to the application is not used directly with file I/O commands, particularly to protect against path traversal, local file include, file mime type, and OS command injection vulnerabilities. Fail
16.3 Verify that files obtained from untrusted sources are validated to be of expected type and scanned by antivirus scanners to prevent upload of known malicious content. Pass
16.4 Verify that untrusted data is not used within inclusion, class loader, or reflection capabilities to prevent remote/local file inclusion vulnerabilities. Fail
16.5 Verify that untrusted data is not used within cross-domain resource sharing (CORS) to protect against arbitrary remote content. Fail
16.8 Verify the application code does not execute uploaded data obtained from untrusted sources. Fail
16.9 Do not use Flash, Active-X, Silverlight, NACL, client-side Java or other client side technologies not supported natively via W3C browser standards. Pass

As the table shows, MediaWiki fails many requirements; in particular, section V11 is not satisfied at all. With these results we cannot say that MediaWiki has a good level of security. But we must remember that this web application was not developed with security as its primary goal; in fact, we classified it as ASVS level 1. Moreover, we tested MediaWiki as a local web service with its default configuration, so we assessed the security of its default settings. We proposed several simple solutions to these security issues based on configuring MediaWiki properly, so these verdicts do not by themselves express the real security level of MediaWiki. In conclusion, we claim that MediaWiki has a low level of security.