Comprehensive Social Media Security Analysis & XKeyscore
Espionage Technology
Adam Ali.Zare Hudaib adamhudaib@gmail.com
Licensed Penetration Tester EC-Council
Certified Ethical Hacker EC-Council
Certified Security Analyst EC-Council
Wireshark Certified Network Analyst (Wireshark University)
CEH, ECSA, LPT, WCNA
Sweden
Abstract
Social networks offer users many services for sharing activities, events, and ideas.
Many attacks can happen to social networking websites because of the trust users place in
them. Cyber threats are discussed in this paper. We study the types of cyber threats,
classify them, and give some suggestions to protect social networking websites from a variety of
attacks. Moreover, we present some anti-threat strategies along with future trends.
Keywords: XKeyscore, Social Media Security, Privacy, TFC, Cyber Threats.
1. INTRODUCTION
In recent years, global online social network (OSN) usage has increased sharply as these
networks become interwoven into people’s everyday lives as places to meet and communicate
with others. Online social networks, such as Facebook, Google+, LinkedIn, Sina Weibo, Twitter,
Tumblr, and VKontakte (VK), have hundreds of millions of daily active users. Facebook, for
example, has more than 1.15 billion monthly active users, 819 million of which are active mobile
Facebook users as of June 2013. Facebook users have been shown to accept friendship
requests from people whom they do not know but who simply have several friends in common. By
accepting these friend requests, users unknowingly disclose their private information to total
strangers. This information could be used maliciously, harming users both in the virtual and in the
real world. These risks escalate when the users are children or teenagers who are by nature
more exposed and vulnerable than adults. As the use of online social networks becomes
progressively more embedded into the everyday lives of users, personal information becomes
easily exposed and abused. Information harvesting, by both the online social network operator
itself and by third-party commercial companies, has recently been identified as a significant
security concern for OSN users. Companies can use the harvested personal information for a
variety of purposes, all of which can jeopardize a user’s privacy. For example, companies can
use collected private information to tailor online ads according to a user’s profile, to gain profitable
insights about their customers, or even to share the user’s private and personal data with the
government. This information may include general data, such as age, gender, and income;
however, in some cases more personal information can be exposed, such as the user’s sexual
orientation or even whether the user consumes addictive substances. These privacy issues become
more alarming when considering the nature of online social networks: information can be
obtained on a network user without even directly accessing that individual’s online profile;
personal details can be inferred solely by collecting data on the user’s friends.
2. SOCIAL NETWORK SITES: METRICS, PRIVACY
2.1 Social Network Sites: Usage Metrics, Network Structure and Privacy
Social Network Sites (SNSs) exhibit wide popularity, high diffusion and an increasing number of
features. Specifically, Facebook, which currently holds a prime position among SNSs, has a
continuously evolving feature set and one billion monthly active users, approximately 81% of
whom are from outside the U.S. and Canada, and 604 million of whom access the site via mobile
devices [1]. Given this diversity, an effective way of understanding Facebook is by exploring
motives for using the service via theoretical frameworks such as Uses and Gratifications (U&G)
[2]. A good understanding of these motives can shed light onto the intricate
mechanisms behind important aspects of SNSs, such as site adoption, participation [4],
information seeking [5], and the privacy of users [2]. Privacy, in particular, is a major concern
since it dictates the usage decisions of many SNS users and as Facebook, specifically, has found
itself under harsh criticism regarding the enactment of highly contentious privacy policies and
privacy-sensitive features [3]. The emergence of social sites also represents a valuable research
resource. Indeed, scholars have highlighted the enormous potential of taking advantage of data
that are generated electronically when people use online services [3]. Furthermore, compared to
the methods and data available to traditional social scientists, online information can be accessed
and analyzed computationally in ways that are both efficient and accurate [3]. In particular, in the
case of Facebook, a rich, robust Application Programming Interface (API) allows researchers to
collect large volumes of data relating to issues such as site feature use and personal network
structure with unprecedented accuracy, granularity and reliability.
Media is consumed for a wide range of purposes and individuals utilize different media channels
to achieve very different ends [6]. U&G is a theoretical framework for studying these motives and
outcomes – fundamentally, the “how” and “why” of media use [2]. A key strength of the approach
is its established and broadly applicable frame of analysis (covering media as diverse as tabloids,
reality TV as entertainment or social connection) with social and psychological antecedents (such
as demographics) and cognitive, attitudinal, or behavioral outcomes (such as usage patterns) [3].
U&G has recently proven valuable in exploring and explaining a wide variety of social media
phenomena including topics as diverse as the motivations for contributing content to an online
community [4], explaining why political candidates are befriended [1], and cataloguing the
psychosocial well-being of teenage girls. U&G studies have explored behavior on most common
forms of social media including content sharing sites (e.g., YouTube), SNSs (e.g., Myspace [1]),
media sharing communities, and blogs [3]. Taken together, this work highlights the importance of
eliciting and understanding users’ motives in social media, as well as the value of employing data
from a natural research instrument, like Facebook, for social studies. Facebook use, in
particular, has most commonly been captured by self-report methods using surveys. Typical questions include time
spent on site and visit frequency [3]. Acknowledging the lack of rigor in such ad-hoc methods, the
Facebook Intensity Scale was introduced to capture the extent to which a user is emotionally
connected to Facebook and the extent to which Facebook is integrated into their daily activities.
The scale has been subsequently adopted in a number of other studies.
The advent of SNSs has greatly facilitated the capture of personal social network data and a wide
range of useful metrics can now be calculated automatically and in real time [2]. Commonly used
metrics include:
• Network Size: The number of nodes in a participant’s egocentric network, i.e., the number of
friends that an individual has. Correlations have been shown between network size and
personality and social capital.
• Network Density: The extent to which nodes in an egocentric network are interconnected –
essentially, how many of an individual’s friends know each other. This is calculated as the
ratio of the number of ties to the number of possible ties.
• Average Degree: Mean number of mutual friends in an egocentric network. Higher values on
this statistic have previously been associated with bonding social capital and higher
socioeconomic status.
• Average Path Length: The average geodesic distance between all pairs of nodes in a network.
• Diameter: The longest geodesic distance within the network, i.e., the maximum distance between
two nodes.
• Network Modularity: A scalar value between −1 and 1 that measures the density of links inside
communities as compared to links between communities.
• Number of Connected Components: The number of distinct clusters within a network. This has
been interpreted as the number of an individual’s social contexts and associated with bridging
social capital and social contagion.
• Average Clustering Coefficient: The clustering coefficient is a measure of the embeddedness
of a node in its neighborhood. The average gives an overall indication of the clustering in the
network, and high values are associated with a “small-world” effect [3].
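These metrics can be computed automatically with standard graph libraries. The following minimal
sketch (not drawn from the cited studies) uses the Python networkx package on a hypothetical toy
egocentric network; the node names and edges are invented purely for illustration.

import networkx as nx

# Hypothetical egocentric network: "ego" plus four friends and the ties among them.
G = nx.Graph()
G.add_edges_from([
    ("ego", "alice"), ("ego", "bob"), ("ego", "carol"), ("ego", "dave"),
    ("alice", "bob"),   # alice and bob know each other
    ("carol", "dave"),  # carol and dave know each other
])

friends = list(G.neighbors("ego"))
H = G.subgraph(friends)                                      # friends-only view (ego removed)

network_size = len(friends)                                  # Network Size
network_density = nx.density(H)                              # Network Density: ties / possible ties
avg_degree = sum(d for _, d in H.degree()) / max(len(H), 1)  # Average Degree (mutual friends)
components = nx.number_connected_components(H)               # Number of Connected Components
avg_clustering = nx.average_clustering(G)                    # Average Clustering Coefficient

# Path-based metrics are only defined on connected graphs.
if nx.is_connected(G):
    avg_path_length = nx.average_shortest_path_length(G)     # Average Path Length
    diameter = nx.diameter(G)                                # Diameter

# Network Modularity of a detected community partition.
communities = nx.algorithms.community.greedy_modularity_communities(G)
modularity = nx.algorithms.community.modularity(G, communities)

print(network_size, network_density, avg_degree, components, modularity)

Density and average degree are computed on the friends-only subgraph, matching the egocentric
definitions above.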
Users often make decisions about whether and how they use a SNS based on the perceived
privacy implications of their actions. However, privacy is a complex concept that has presented
challenges to the social media ecosystem. One key issue is the tradeoff between providing users
with advanced new features that mine their data to provide relevant content but lead to negative
effects in terms of how users perceive their privacy [6]. Attempting to understand this topic
further, boyd [5] argues that in the context of the social web, privacy violations are common
because mediated publics exhibit certain properties that are not present in unmediated publics,
namely persistence, searchability, replicability, and invisible audiences. Researchers studying the
social implications of privacy have concluded that the right to privacy can be considered a social
stratifier that divides users into classes of haves and have-nots, thus creating a privacy divide [3].
The privacy of SNS information is a particularly pertinent topic of study because of research
reporting that users find it challenging to understand the privacy implications of SNSs. For
instance, recent research has shown that the current Facebook privacy controls allow users to
effectively manage threats from outsiders, but are poor at mitigating concerns related to members
of a user’s existing friend network [3]. Similarly, a study on Facebook apps found abundant
misunderstandings and confusion about how apps function and how they manage, use, and
share profile data [4].
Investigating the uses and gratifications of a social network site can provide powerful descriptive
and explanatory insights into the mechanisms that drive users’ behaviors.
Nationality showed a significant effect on the regression model for the first privacy question, with
participants from the USA being less concerned about their privacy on Facebook, possibly due to
the fact that they are more tech savvy and comfortable with this online medium. On the other hand,
nationality did not have a significant effect on the second privacy question, but two of the motives
for use did. Specifically, users that were motivated by communication opportunities with like-
minded people were found to be more likely to report tweaking their privacy settings. From the
factor’s description we know that these people tend to be more enthusiastic about organizing or
joining events and groups. This may be because they feel more comfortable in familiar settings
and therefore have increased suspicion of strangers or companies on Facebook. Furthermore,
since events predominantly take place offline and a popular use of groups is to organize offline
meetings, it may be that these people have greater experience of the implications of Facebook
privacy settings to offline social interaction. The fact that the Content motive was positively
associated with frequently changing privacy settings may be due to the fact that people who
frequently use applications and interactive content on Facebook have taken the time to
understand the privacy implications of installing such dynamic features. Interestingly, the
newsfeed feature, which caused a large backlash with regards to privacy when it was first
introduced [6], does not show a significant effect on users’ perceived privacy. Furthermore, a
substantial discrepancy was observed in the motives of people that report to be concerned about
their privacy on Facebook and those that engage in changing their privacy settings.
2.2 A New Social Media Security Model (SMSM)
Previous security methods have been proposed to reduce the vulnerabilities encountered by most
social networks without altering the usability of the system. These models emphasised user
education and policy implementation. However, these have had very limited success, as users are
notoriously non-compliant with such measures. Some work has been done to explore the
effects of social networks in disseminating information about successful social engineering
attacks. Coronges et al. (2012) developed an experiment where phishing emails were created and
sent to a controlled user population. Their results showed that local leadership within an
organisation appeared to influence security vulnerability but had no influence on security
resilience or phishing prevention. Joshi and Kuo (2011) investigated mathematical formulation
and computational models for security and privacy of social networks. In their empirical study,
they presented several ways an attacker can take advantage of the security vulnerabilities in an
online social network. Some of the vulnerabilities discussed were social phishing attacks and
neighbourhood attacks, a form of privacy attack that gives the attacker sufficient
information about the target. Furthermore, their work drew on recent results in
programming language techniques demonstrating that it is possible to build online services that
have strict privacy policies. Unfortunately, service providers need private data to generate
revenue from their 'free' service, limiting the application of strict security and privacy measures. It
is important to clearly identify the main vulnerabilities of current social media [7].
Key vulnerabilities of social media:
• Single Factor Authentication. Single factor authentication is the conventional security
procedure which requires a username/email and password before a user is granted access to
the system. The security methods involved in authenticating account owners before gaining
access to a system started as a debate in the industry and have now become a major cause of
concern (Altinkemer, 2011). The sophistication of social engineering tools used by
attackers poses a serious threat to the integrity, reputation and lives of users, which may result
in a sudden decline of social media usage (Parwani et al, 2013) [8].
• Profile Cloning. The easiest technique used in stealing the identity of a social network's user is
called profile cloning. With the current structure of profile pages on Facebook, it is
very easy to clone an original user and perform malicious activities with the cloned account
before the victim or the platform provider discovers it. The profile pages of Facebook have no
particular unique separation from one another, and in cases where two users coincidentally
share the same first name and surname as well as the same profile image, the true owner of the
account is left for the viewer to identify (Kumar et al, 2013). There are two types of profile
cloning: existing profile cloning and cross-site profile cloning.
• Phishing Attacks. Phishing can be described as a malicious craft used by social engineers
with the aim of exploiting the vulnerabilities of a system, which is made easy by the
ignorance of end-users (Khonji et al, 2013).
• Watering Hole Malware Attack. This method of malware attack involves the attacker guessing
which websites members of an organisation normally visit and then infecting these websites
with malware in the hope that the computer of a user within the target organisation will be
infected when an infected site is visited. In January 2013 it was discovered that a watering
hole attack had targeted developers working at Facebook [10].
• Multiple Login Sessions. The security vulnerabilities created by allowing multiple login
sessions can result in both data theft and economic consequences (Imafidon & Ikhalia, 2013).
On websites that require users to pay before using their service (e.g., an online
movie store), allowing multiple login sessions on one account to run concurrently
means that the subscriber and other users can share the benefits of one subscription regardless
of their remote locations.
• Weak Password Creation. Passwords can be guessed, cracked, deliberately shared or stolen.
Most social networking sites require users to create passwords of no less than 6 characters during
account registration. The passwords created are usually hashed or encrypted in the database
using MD5 or SHA-1 hash functions (Kioon et al, 2013). Unfortunately, users are led to
believe that when the passwords are encrypted in the database, it is impossible to recover
them if the system is compromised [9]. A sketch contrasting such fast, unsalted hashes with
salted, deliberately slow hashing appears after this list.
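As referenced in the weak-password item above, the hypothetical Python sketch below contrasts the
fast, unsalted MD5/SHA-1 storage described by Kioon et al. with salted, deliberately slow hashing.
The function names, iteration count and example passwords are assumptions for illustration, not
details taken from the cited works.

import hashlib
import os

def md5_store(password: str) -> str:
    # Fast, unsalted hash: identical passwords always produce the same digest,
    # so a leaked table can be attacked with precomputed dictionaries.
    return hashlib.md5(password.encode()).hexdigest()

def pbkdf2_store(password: str, iterations: int = 200_000) -> tuple:
    # Salted, deliberately slow hash: a random per-user salt defeats precomputed
    # tables, and the iteration count slows brute forcing after a database leak.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def pbkdf2_verify(password: str, salt: bytes, digest: bytes, iterations: int = 200_000) -> bool:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations) == digest

# A trivial dictionary attack against the unsalted store recovers weak passwords.
leaked = md5_store("letmein")
for guess in ["password", "123456", "letmein"]:
    if md5_store(guess) == leaked:
        print("cracked:", guess)

The point is not that hashing can be reversed, but that common passwords hashed without salt or
stretching can be recovered by guessing, which is why the belief described above is misplaced.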
The proposed social media security model (SMSM):
• Modification of Password Classification Algorithm: This model proposes that passwords
should be classified according to their various levels of security using more declarative terms
such as ‘very unsafe’, ‘unsafe’, ‘not secure’, ‘a little secure’, ‘secure’ and ‘very secure’.
This can be done by ensuring that passwords created by users consist of combinations
of uppercase letters, lowercase letters, numbers and at least two special characters. Furthermore,
this new security feature will display an instant message to the user using any of the
aforementioned declarative terms based on the character combinations selected (Vijaya et al,
2009) [11]. A sketch of this classification appears after this list.
• Embedding Unique Usernames on Profile Pages: This security feature is proposed to solve
the problem of profile cloning within the site. Many social networking sites only display the first
name and surname of users on their profile pages, which opens up certain security vulnerabilities
when more than one user coincidentally shares the same first name and surname. The
vulnerabilities involved may allow a malicious user to extort money or commit an online crime
under the guise of the victim, which could take time to detect. In addition, embedding unique
usernames on profile pages within the site will make it very difficult for a malicious user to
steal the identity of a genuine user and act on their behalf (Chen et al, 2009).
• Re-authenticating Registered Users During Activation: This method will ensure that social
media accounts must be activated before full access is granted. The approach generates a
25-character code for each user, and the system sends the code to their email address to
confirm ownership of the email used for registration. When the
users attempt to activate the account by clicking the activation link sent to their email address,
they are redirected to another page and are required to provide the 25-character code, the
username and password before full activation is complete. This security mechanism will help
prevent web robots from undermining the validation method of the social network.
Furthermore, it will increase security consciousness in every user subscribing to the service
and protect a user whose email address has been compromised (Fu, 2006) [12].
• Email Based Two Factor Authentication: The technique used to implement two factor
authentication in this model is an email based approach. When a user initiates a login session
and passes the first stage of authentication (traditional username/email and password), the
system sends a randomly generated one-time password token to the user's email address and
redirects the user to another page within the site which requires the one-time password
(Ikhalia & Imafidon, 2013). The user must navigate to the email account
containing the randomly generated password token and supply it before access to the system
is granted. This security model is now in huge demand in the industry, and implementing an
email based two factor authentication method is feasible and cost effective when compared
with the SMS approach. In practice, this new security enhancement will reduce the security
vulnerabilities faced by social media victims of spear phishing, session hijacking, identity and
data theft (Yee, 2004). A sketch of this flow appears after this list.
• Make Private Message Notification 'Private': The importance of this security functionality is to
protect the confidentiality of a user whose email address has been compromised. This model
only allows a notification to be sent to a user's email when a private message is sent from
the social networking site to the user. In other words, the user must navigate to their private
message inbox within the site to read the content of their messages. The necessity of this is to
protect users whose emails have been compromised and users who have lost their email
accounts (Joshi & Kuo, 2011).
• Restrict Unauthorised Access To Users' Profile Information: One of the definitions of social
media given by Dabner (2012) is “a web-based service that allows individuals to
construct a public or semi-public profile within a bounded system”; therefore, only registered
users should have access to the information they share within the site. This proposed security
enhancement will prevent external users from viewing the profile information of
registered users within the site (Ijeh, Preston & Imafidon, 2009). The major benefit of this
security feature is that it reduces the difficulties digital forensic investigators face in tracking
down malicious attackers who hijack users' information for cross-site profile cloning (Ademu &
Imafidon, 2012). When this security model is implemented, an attacker must register in the
system before a malicious activity can be performed, making it easier for the attacker
to be traced, as most social networks store the IP addresses of registered users and update
them when they log in (Joshi & Kuo, 2011) [7].
• Prevent Multiple Login Sessions: The system prevents users from creating multiple sessions
at the same time. The implementation of this security mechanism will ensure that users have
control over their accounts. This new enhancement will also let users know when a cyber
intruder is accessing their account, because a message will be shown to the user if a session
has already been established on the account (Choti et al, 2012).
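The sketch below gives one possible reading of two of the proposed SMSM elements referenced in
the list above: the declarative password-strength labels and the email-based one-time password
(OTP) step. It is a minimal Python illustration, not the authors' implementation; the thresholds,
token length, expiry time and the send_email placeholder are assumptions rather than details
specified by the proposal.

import re
import secrets
import time

def classify_password(password: str) -> str:
    # Map character-class coverage onto the declarative labels proposed above.
    classes = sum([
        bool(re.search(r"[a-z]", password)),              # lowercase letters
        bool(re.search(r"[A-Z]", password)),              # uppercase letters
        bool(re.search(r"[0-9]", password)),              # numbers
        len(re.findall(r"[^A-Za-z0-9]", password)) >= 2,  # at least two special characters
    ])
    if len(password) < 6:
        return "very unsafe"
    labels = {0: "very unsafe", 1: "unsafe", 2: "not secure", 3: "a little secure", 4: "secure"}
    return "very secure" if classes == 4 and len(password) >= 12 else labels[classes]

# Email-based two-factor authentication: after the username/password check passes,
# generate a short-lived token, send it to the registered address (send_email is a
# placeholder callable), and require it before the session is opened.
PENDING = {}  # username -> (token, expiry timestamp)

def start_second_factor(username: str, send_email) -> None:
    token = secrets.token_urlsafe(8)                 # randomly generated one-time password
    PENDING[username] = (token, time.time() + 300)   # valid for five minutes
    send_email(username, f"Your one-time login code: {token}")

def verify_second_factor(username: str, supplied: str) -> bool:
    token, expiry = PENDING.pop(username, (None, 0))  # tokens are single-use
    return token is not None and time.time() < expiry and secrets.compare_digest(token, supplied)

print(classify_password("Pa55word!?"))  # "secure" under these illustrative rules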
2.3 Privacy Issues of Social Networks Based on the Facebook Example
Helen Nissenbaum’s work offers an analytical framework for understanding the commonplace
notion that privacy is ‘‘contextual.’’ Nissenbaum’s account of privacy and information technology
is based on what she takes to be two non-controversial facts. First, there are no areas of life not
governed by context-specific norms of information flow. For example, it is appropriate to tell one’s
doctor all about one’s mother’s medical history, but it is most likely inappropriate to share that
information with casual passersby. Second, people move into and out of a plurality of distinct
contexts every day. Thus, as we travel from family to business to leisure, we will be traveling into
and out of different norms for information sharing. As we move between spheres, we have to alter
our behaviors to correspond with the norms of those spheres, and though there will always be
risks that information appropriately shared in one context becomes inappropriately shared in a
context with different norms, people are generally quite good at navigating this highly nuanced
territory. On the basis of these facts, Nissenbaum suggests that the various norms are of two
fundamental types. The first, which she calls norms of ‘‘appropriateness,’’ deal with ‘‘the type or
nature of information about various individuals that, within a given context, is allowable, expected,
or even demanded to be revealed’’ (2004, 120). In her examples, it is appropriate to share
medical information with doctors, but not appropriate to share religious affiliation with employers,
except in very limited circumstances. The second set are norms of ‘‘distribution,’’ and cover the
‘‘movement, or transfer of information from one party to another or others’’ [17]. She grounds
these norms partly in work by Michael Walzer, according to which ‘‘complex equality, the mark of
justice, is achieved when social goods are distributed according to different standards of
distribution in different spheres and the spheres are relatively autonomous’’ [14]. Information is
such a social good. Thus, in friendship, confidentiality is a default rule: it may be appropriate to
share the details of one’s sex life with a friend, but it is not appropriate for that friend to broadcast
the same information on her radio show. In medical situations, on the other hand, a certain
sharing of information—from a radiologist to a primary care physician, for example—is both
normal and expected [13].
The use of Nissenbaum’s account extends well beyond the public surveillance context to which
she applies it. Evidence from social networking sites, and Facebook in particular, suggests that
contextual integrity will be an apt analytical framework in that context as well. First, and unlike
popular perceptions and perhaps unlike other SNS, almost all of the evidence suggests that
Facebook users primarily use the site to solidify and develop their offline social relationships,
rather than to make new relationships online. Information on Facebook would predictably tend to
mirror the offline tendency to contextual situatedness. Second, evidence about social conflicts in
Facebook suggests that one of the greatest sources of tension is when information that would
remain internal to one context offline flows to other contexts online. This is in large part an
interface issue—there is no easy way for a user to establish and maintain the many separate and
sometimes overlapping social spheres that characterize offline life. If these fluid spheres are
difficult to explicitly create on Facebook, then managing information flows among them is likely to
be very difficult. Indeed, the underlying architecture of Facebook is largely insensitive to the
granularity of offline social contexts and the ways norms differ between them. Instead, the
program assumes that users want to project a single unified image to the world. These contextual
gaps are endemic to SNS, and a substantial source of privacy problems. For example, the
‘status line’ generally gets projected indiscriminately to all friends. One can think of the status line
as an answer to the questions ‘‘How’s it going?’’ or ‘‘What’s up?’’ However, in offline life one
tailors responses to the audience one is with. While Facebook now allows users to create multiple
friend groups and then control which groups can see their status, this requires explicit,
cumbersome effort and falls far short of reflecting the many complex and overlapping spheres of
offline life [15].
Two recent Web phenomena, which pre-date social networking sites, highlight some of the
privacy issues that are relevant to Facebook and other social networking sites. The first is
blogging. The number of people who post their thoughts online in Web logs is staggering, and
growing rapidly.
Information posted to the Internet is potentially visible to all. For most people, such universal
broadcast of information has no parallel offline. In other words, offline personal information is
seldom communicated to a context anywhere near as broad as the entire Internet. As social
networking sites continue to grow and increasingly integrate with the rest of the Internet, they
should be expected to inherit some of these issues. At the same time, the contextual situation for
social networking is made both more complex and more difficult by the import of the offline
concept of ‘‘friend.’’ Information flows on social networking sites are mediated not just by the
global nature of Internet communication, but by the ways that those sites and their users interpret
the meaning of online friendship and the social norms that go with it. Not surprisingly, empirical
research indicates that SNS users (possibly excepting younger teens on Myspace) tend to
construct their identities relationally, becoming and expressing who they are by way of
highlighting whom they associate with (Papacharissi 2009; Livingstone 2008) [16].
The precursors to social networking (blogging and webcams) demonstrated two phenomena—the
risk of collateral privacy damage, and the plasticity of norms—and these phenomena are also
evident in recent changes to Facebook. We will examine the case of collateral damage first,
because its analysis is more straightforward than the plasticity of norms. In both cases, we will
suggest that the difficulties lie in the tensions between the tendency of offline social contexts to
be highly granular and specific, and the tendency of social network sites to mirror the more global
context of the Internet, even as they substantially nuance that context through the emerging
norms of online friendship. This granularity gap needs to be addressed through interface and
design decisions that highlight, rather than obscure, the new ways that information travels online
[17].
Another serious violation of the norms of distribution is that the developers behind the
applications are also largely invisible, obscuring the fact that the information is in fact leaving the
confines of Facebook and not just going to the user’s friends. A college student in Germany can
create a quiz application that matches people with the beer they are most like, running it on his
computer in his dorm. Users accessing this application will likely be completely unaware that their
profile information, and their friends’ profile information, can be accessed by a student in a dorm
room in another country. While a third party application developer, such as the student in the
dorm room in Germany, may be acting completely ethically and only accessing the profile
information needed for the application, there is no guarantee of this, and little vetting process—
anyone can become a Facebook application developer [15].
The invisibility of information flows presents a particular problem because when we do not know
what is being done with our information, we have no ability to contest it. If the architecture and
interface of Facebook essentially hides the amount of information that is shared to third parties,
then there is little that a user can do about that sharing. Indeed, there is little that she can do to
avoid the situation, other than decline to use the third party applications. The choice for users is
binary: install the application and give full access to their own and their friends’ personal
information, or don’t use the application at all. We can imagine designing a number of different
mechanisms that could provide greater transparency (and in some cases, greater control) of
these information flows.
The backlash was in part generated by a sense that the News Feed violated the privacy of users.
In the terms of Nissenbaum’s framework, the analogy would be to the case of public records
being placed online. For public records, she suggests, the privacy problem is one of the scope
of the distribution of the information. It is not that there is any more information publicly available;
it is that it is a lot easier to get to the information. It is no longer necessary to travel to a (possibly
remote) local courthouse. When an activity is easier, more people will do it. Thus, for example,
idle curiosity about ‘who owns that house, and what they paid for it’ becomes a topic that one
might actually pursue online, when the same topic would be too much like work to pursue offline.
As a result, people have less de facto privacy. Facebook similarly increased the efficiency of
accessing information that was already in principle accessible; in so doing, it reduced the de facto
privacy of its users. A more significant difference occurs on the other end, as newsfeeds do not
just automate the process of receiving updates; they automate the process of sending them. Here
a significant difference from holiday letters emerges: the process of sending those letters is
inefficient, in that it involves individually addressing and stuffing envelopes for every update. If the
letter says something that a recipient might find offensive, he gets a card but not the letter. If the
letter is going to someone that a person no longer cares to keep up with, she might not send it,
and might even remove the person from her list. In other words, being offline in this case
introduces a necessary pause between composing the letter and sending it, and this encourages
the list itself being subject to at least a casual vetting procedure. No such procedural
encouragement exists for Facebook updates. Nothing in the process of sending updates
encourages one to even notice who the full audience is, much less to assess who should be in it.
Retention of updates, combined with reminders to that effect, would further encourage users to
view their identities online as constructed and performative, moving Facebook closer to the world
of Jennicam. This encouragement would be magnified even more by a ‘‘Mary has just looked
at all of your updates’’ option; users would increasingly view their Facebook identities as subject
to constant surveillance, and modify them accordingly. If I knew that Mary always looked at all my
updates, I might update with her in mind. More generally, users would be encouraged to have
updates resemble the news wires from which the model was taken. Whether or not this is a
‘‘good’’ model for online friendship on a social networking site could then be discussed by users
who were encouraged to form a mental model that accurately reflected the various flows of
information on the site.
An intellectual property regime that grants strong proprietary rights and few exceptions favors
mass media over independent producers of speech. Even the placement of advertising on portal
sites matters. Recent work comparing Facebook with other SNS suggests that the architectural
features of those sites will tend to facilitate certain kinds of social interactions and not others.
The preceding analysis underscores the important relations between privacy norms and
application and interface design. Two points bear emphasis. First, at the core of the privacy
issues discussed here is a gap between the granularity of offline social contexts and those on
Facebook. One of the greatest strengths of Nissenbaum’s contextual integrity framework is that it
highlights the importance of social contexts to privacy norms. The application of Nissenbaum’s
work to Facebook underscores the extent to which the granularity differences are driving many of
the surface-level privacy issues. Facebook, as we have argued, does not adequately reflect this
insight. As a result, offline contexts are more easily differentiated and kept separate than those on
Facebook. To the extent that Facebook users use the program to develop offline social ties, this
granularity gap will inevitably produce conflicts. Second, this gap shows that part of what is at
stake is the definition of what it means to be a ‘‘friend’’ on Facebook. The switch to the News
Feed shows that the norms for online friendship are in a state of considerable flux. While no-one
would claim that a Facebook friend is the same as an offline friend, the fact that there are many
overlapping cases means that the shared vocabulary sometimes leads us to a misperception of
sameness. In fact, the concepts of a ‘friend’ on Facebook and an offline friend may be diverging
even more, making this difference bigger. The News Feed makes updates a lot like blogging, and
Applications put users in the position of treating their friends’ information as commodities to
exchange for access to whatever the application in question does. If Facebook norms for
friendship move further away from those offline, we would expect that users would model their
online behavior accordingly. While few of them might endorse the sense of complete publicity that
Jennicam embraced, publicity at nearly that level is enabled by the combination of the News Feed
and a user who posts constant updates.
2.4 Social Media, Privacy and Security
Dr Ackland [18] explained that social media use leads to vast digital repositories of searchable
content such as web pages, blog posts, newsgroup posts, Wikipedia entries. Such repositories
can be thought of as information public goods that have economic value to business because
they enable people to find information efficiently – for example, by using Google to search for
answers to technical questions. For governments, the value of social media lies in the potential
for direct dialogue between citizens and government, promoting participatory democracy (that is,
broader participation in the direction and operation of government). However, people are often
not involved in social media for the instrumental reasons of creating information public goods or
strengthening democracy but for social reasons and to express their individual or collective
identity. Much of the time people’s use of social media is expressive (like voting, or cheering at
the football) rather than instrumental and people tend to be intrinsically motivated to participate –
that is, for the inherent enjoyment of the activity – rather than being extrinsically motivated (for
example, by the expectation of external rewards).
Two issues may reduce the value of social media to business and government by adversely
affecting people’s motivation to participate:
1. Game mechanics. This was not covered in detail but refers to the use of rewards, such as badges
on Foursquare, which may encourage some users but deter others; it may therefore have the effect
of selecting particular types of users to participate, and may also change the way in which people
engage with online environments.
2. Privacy. The second issue identified as potentially diminishing people’s motivation to
participate in social media, and the issue of particular relevance in the context of this workshop,
was privacy. Existing research [20] is mostly survey research focusing on a particular
demographic: youth. For example:
• Sonia Livingstone [21] found that younger teenagers use social media for creating ‘identity
through display’ while older teenagers create ‘identity through connection’. Different risks are
associated with each of these behaviours.
• Zeynep Tufecki’s quantitative study of US college students [18] looked at how privacy
concerns affect students’ willingness to disclose information online. This research suggested
that ‘treating privacy within a context merely of rights and violations, is inadequate for studying
the Internet as a social realm’.
Further research is needed. Most existing research has focused on youth in wealthy countries, but
there is a need for research on other demographic groups. For understanding how people’s
participation in Government 2.0 might be affected by privacy and security concerns, for example,
it will be important to include demographic groups more relevant to these initiatives, as youth tend
not to be involved in political activities online [18]. Dr Ackland drew attention to a study of older
Australians and their social media use (funded by an ARC Linkage Grant) being undertaken by
the Australian Demographic and Social Research Institute at The Australian National University in
partnership with National Seniors Australia. He noted that research into privacy and Government
2.0 is generally not based on data and is mainly speculation at this stage. Little is known about
people’s attitudes towards the potential for governments to use their data for purposes other than
policymaking, such as surveillance and law enforcement. He also pointed to developments in
network science and the mining of social graphs (that is, the social connections between people or
organisations). Social media use leaves digital traces of activity that are permanent, searchable
and cross-indexable.
New tools are being developed to support this type of research through data mining, network
visualisation and analysis, including:
• The Virtual Observatory for the Study of Online Networks (VOSON) System
(http://voson.anu.edu.au) developed by Dr Ackland.
• SAS Social Media Analytics.
• NodeXL, which enables social network data such as that gathered by VOSON to be imported
into Excel.
It was noted [23] that there is an important difference between individual privacy and social graph
security. Network scientists increasingly have the ability to mine social graphs and place people
in groups by identifying clusters in social networks and predicting the existence of social
connections. Protecting the social graph is more important – and more difficult – than protecting
personal data.
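As an illustration of the social-graph mining described above, the short Python sketch below
groups people by detecting communities in a friendship graph and scores candidate ties that are
not yet observed, using the networkx library; the graph is a stock toy dataset standing in for a
mined social graph and is not taken from the workshop material.

import networkx as nx

G = nx.karate_club_graph()  # stand-in for a friendship graph mined from a social site

# Place people in groups by identifying clusters (communities) in the network.
communities = nx.algorithms.community.greedy_modularity_communities(G)
for i, group in enumerate(communities):
    print(f"group {i}: {sorted(group)}")

# Predict the existence of social connections: score unobserved pairs by shared neighbours.
candidate_ties = nx.jaccard_coefficient(G)             # (u, v, score) over non-adjacent pairs
top_predictions = sorted(candidate_ties, key=lambda t: t[2], reverse=True)[:5]
print("most likely missing ties:", top_predictions)

Even this crude pipeline suggests why protecting the social graph is harder than protecting
individual records: group membership and likely ties can be inferred without any one person
disclosing them.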
Professor Bronitt [21] drew attention to the work of the Centre of Excellence in Policing and
Security (CEPS), which is a research partnership between The Australian National University,
Griffith University, The University of Queensland and Charles Sturt University. The Centre also
brings together a range of industry partners (refer to the CEPS website for a full list) who are
committed to innovation and evidence-based research in policing and security. The Centre’s goal
is ‘to gain a better understanding of the origins, motivations and dynamics of crime and security
threats. Its objectives are to create significant transformations in crime, crime prevention and
security policies, and to bring about evidence-based reform in the practices of policing and
security to enhance Australia’s social, economic and cultural wellbeing.’ Professor Bronitt
discussed the issues of vulnerability and resilience. In relation to infrastructure, he noted that
much vulnerable infrastructure is in the hands of private companies and it will be necessary to
engage with policymakers and private companies before starting research. Social media can
make you vulnerable but also resilient. For example, in the Brisbane floods social media was
used for self-organising [20]. He saw a key role for ‘foresight forums’, which have not previously
been used much in Australia. Laws and regulations don’t yet cover all aspects of what is now
appearing in social media – e.g., video. He noted that privacy in the modern context clearly extends
into the public realm – physically and virtually – and relates to human dignity. Social media has
no borders but laws are tied to territory, so addressing interjurisdictional issues poses significant
challenges.
2.5 Information Sharing and Privacy on the Facebook
Nobody is literally forced to join an online social network, and most networks we know about
encourage, but do not force users to reveal - for instance - their dates of birth, their cell phone
numbers, or where they currently live. And yet, one cannot help but marvel at the nature, amount,
and detail of the personal information some users provide, and ponder how informed this
information sharing is. Changing cultural trends, familiarity and confidence in digital technologies,
lack of exposure or memory of egregious misuses of personal data by others may all play a role
in this unprecedented phenomenon of information revelation. Yet, online social networks’ security
and access controls are weak by design - to leverage their value as network goods and enhance
their growth by making registration, access, and sharing of information uncomplicated. At the
same time, the costs of mining and storing data continue to decline. Combined, the two features
imply that information provided even on ostensibly private social networks is, effectively, public
data, that could exist for as long as anybody has an incentive to maintain it. Many entities - from
marketers to employers to national and foreign security agencies - may have those incentives.
At the most basic level, an online social network is an Internet community where individuals
interact, often through profiles that (re)present their public persona (and their networks of
connections) to others. Although the concept of computer-based communities dates back to the
early days of computer networks, only after the advent of the commercial Internet did such
communities meet public success. Following the SixDegrees.com experience in 1997, hundreds
of social networks sprang up online (see [24] for an extended discussion), sometimes growing very
rapidly, thereby attracting the attention of both media and academia. In particular, [30] have taken
ethnographic and sociological approaches to the study of online self-representation; [25] have
focused on the value of online social networks as recommender systems; [25] have discussed
information sharing and privacy on online social networks, using FB as a case study; [26] have
demonstrated how information revealed in social networks can be exploited for “social” phishing;
[28] has studied identity-sharing behavior in online social networks.
FB requires a college email account for a participant to be admitted to the online social network
of that college. As discussed in [28], this increases the expectations of validity of the personal
information therein provided, as well as the perception of the online space as a closed, trusted,
and trustworthy community (college-oriented social networking sites are, ostensibly, based “on a
shared real space” [29]). However, there are reasons to believe that FB networks more closely
resemble imagined [30] communities (see also [28]): in most online social networks, security,
access controls, and privacy are weak by design; the easier it is for people to join and to find
points of contact with other users (by providing vast amounts of personal information, and by
perusing equally vast amounts of data provided by others), the higher the utility of the network to
the users themselves, and the higher its commercial value for the network’s owners and
managers. FB, unlike other online networks, offers its members very granular and powerful
control on the privacy (in terms of searchability and visibility) of their personal information. Yet its
privacy default settings are very permeable: at the time of writing, by default participants’ profiles
are searchable by anybody else on the FB network, and actually readable by any member at the
same college and geographical location. Based on research [27] we have some conclusions: a
total of 506 respondents accessed the survey. One-hundred-eleven (21.9%) were not currently
affiliated with the college Institution where we conducted our study, or did not have an email
address within that Institution’s domain. They were not allowed to take the rest of the survey. A
separate set of 32 (8.2%) participants had taken part in a previous pilot survey and were also not
allowed to take the survey. Of the remaining respondents, 318 subjects actually completed the
initial calibration questions. Out of this set, 278 (87.4%) had heard about FB, 40 had not. In this
group, 225 (70.8%) had a profile on FB, 85 (26.7%) never had one, and 8 (2.5%) had an account
but deactivated it. Within those three groups, respectively 209, 81, and 7 participants completed
the whole survey. We focus our analysis on that set - from which we further removed 3
observations from the non-members group, since we had reasons to believe that the responses
had been created by the same individual. This left us with a total of 294 respondents. Age and
student status are correlated with FB membership - but what else is? Well, of course, having
heard of the network is a precondition for membership. Thirty-four participants had never heard of
the FB - nearly half of the staff that took our survey, a little less than 23% of the graduate
students, and a negligible portion of the undergraduate students (1.59%). However, together with
age and student status (with the two obviously being highly correlated), another relevant
distinction between members and non-members may arise from privacy attitudes and privacy
concerns. Before we asked questions about FB, our survey ascertained the privacy attitudes of
participants with a battery of questions modelled after Alan Westin’s studies [26], with a
number of modifications. In particular, in order not to prime the subjects, questions about privacy
attitudes were interspersed with questions about attitudes towards economic policy and the state
of the economy, social issues such as same-sex marriage, or security questions related to the fear
of terrorism. In addition, while all instruments asked the respondent to rank agreement, concern,
worries, or importance on a 7-point Likert scale, the questions ranged from general ones (e.g.,
“How important do you consider the following issues in the public debate?”), to more and more
specific (e.g., “How do you personally value the importance of the following issues for your own
life on a day-to-day basis?”), and personal ones (e.g., “Specifically, how worried would you be if”
[a certain scenario took place]). “Privacy policy” was on average considered a highly important
issue in the public debate by our respondents (mean on the 7-point Likert scale: 5.411, where 1 is
“Not important at all” and 7 is “very important”; sd: 1.393795). In fact, it was regarded a more
important issue in the public debate than the threat of terrorism ( t = 2.4534, Pr>t = 0.0074; the
statistical significance of the perceived superiority was confirmed by a Wilcoxon signed-rank test:
z = 2.184 Pr>|z|= 0.0290) and same sex marriage (t = 10.5089, Pr>t = 0.0000; Wilcoxon signed-
rank test: z = 9.103 Pr>|z|= 0.0000 ); but less important than education policy (mean: 5.93; sd:
1.16) or economic policy (mean: 5.79; sd: 1.21) [24]. The slightly larger mean valuation of the
importance of privacy policy over environmental policy was not significant. (These results are
comparable to those found in previous studies, such as [29].) The same ranking of values (and
comparably statistically significant differences) was found when asking for “How do you
personally value the importance of the following issues for your own life on a day-to-day basis?”
The mean value for the importance of privacy policy was 5.09. For all categories, subjects
assigned slightly (but statistically significantly) more importance to the issue in the public debate
than in their own life on a day-to-day basis (in the privacy policy case, a Wilcoxon signed-rank
test returns z = 3.62 Pr > |z| = 0.0003 when checking the higher valuation of the issue in the
public debate)[24]. Similar results were also found when asking for the respondents’ concern with
a number of issues directly relevant to them: the state of the economy where they live, threats to
their personal privacy, the threat of terrorism, the risks of climate change and global warming.
Respondents were more concerned (with statistically significant differences) about threats to their
personal privacy than about terrorism or global warming, but less concerned than about the state
of the economy. Privacy concerns are not equally distributed across the FB member and non-
member populations: a two-sided t test that the mean Likert value for the “importance” of privacy
policy is higher for non-members (5.67 in the non-members group, 5.30 in the members group)
is significant (t = -2.0431, Pr<t = 0.0210) [24]. Similar statistically significant differences arise
when checking for the level of concern for privacy threats and for worries associated with the
privacy scenarios described above. The test becomes slightly less significant when checking for
member/non-member differences in the assigned importance of privacy policy on a day-to-day
basis. Importantly, in general no comparable statistically significant differences between the
groups can be found in other categories. For example, worries about the global warming scenario
gain a mean Likert valuation of 5.36 in the members sample and 5.4 in the non-members sample.
(A statistically significant difference can be found however for the general threat of terrorism and
for the personal worry over marriage between two people of same sex: higher values in the non-
members group may be explained by their higher mean age.) In order to understand what
motivates even privacy-concerned individuals to share personal information on the Facebook, we
need to study what the network itself is used for. Asking participants this question directly is likely
to generate responses biased by self-selection and fear of stigma. Sure enough, by far, FB
members deny FB is useful to them for dating or self-promotion. Instead, members claim that the
FB is very useful to them for learning about and finding classmates (4.93 mean on a 7-point Likert
scale) and for making it more convenient for people to get in touch with them (4.92) [24], but deny
any usefulness for other activities. Other possible applications of FB - such as dating, finding
people who share one’s interests, getting more people to become one’s friends, showing
information about oneself/advertising oneself - are ranked very low. In fact, for those applications,
the relative majority of participants chooses the minimal Likert point to describe their usefulness
(coded as “not at all” useful). Still, while their mean Likert value remains low, male participants
find FB slightly more useful for dating than female participants do.
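For readers unfamiliar with the paired comparisons reported earlier in this subsection, the sketch
below shows how a paired t-test and a Wilcoxon signed-rank test can be run on 7-point Likert
responses in Python; the data are simulated and do not reproduce the survey results of [24] or [27].

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
privacy = rng.integers(1, 8, size=200)    # simulated 7-point Likert ratings for privacy policy
terrorism = rng.integers(1, 8, size=200)  # simulated ratings for the threat of terrorism

t_stat, p_t = stats.ttest_rel(privacy, terrorism)  # paired (two-sided) t-test
w_stat, p_w = stats.wilcoxon(privacy, terrorism)   # Wilcoxon signed-rank test
print(f"t = {t_stat:.3f}, p = {p_t:.4f}; Wilcoxon W = {w_stat:.1f}, p = {p_w:.4f}")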
Often, survey participants are less privacy conscious than non-participants. For obvious reasons,
this self-selection bias is particularly problematic for survey studies that focus on privacy. Are our
respondents a biased sample of the Institution’s FB population - biased in the sense that they
provide more information than the average FB member? We did not find strong evidence of that.
Since we mined the network before the survey was administered, we were able to compare
information revelation by survey participants and non survey participants. It is true that, on
average, our survey takers provide slightly more information than the average FB member.
However, the differences in general do not pass a Fisher’s exact test for significance, except for
personal address and classes (where non-participants provide statistically significantly less
information) and political views (in which the difference is barely significant). Similarly, around
16% of respondents who expressed the highest concern for the scenario in which someone 5
years from now could know their current sexual orientation, partner’s name, and political
orientation, provide nevertheless all three types of information - although we can observe a
descending share of members that provide that information as their reported concerns increase.
Still, more than 48% of those with the highest concern for that scenario reveal at least their
current sexual orientation; 21% provide at least their partner’s name (although we did not control
for the share of respondents who are currently in relationships); and almost 47% provide at least
their political orientation. How knowledgeable is the average FB member about the network’s
features and their implications in terms of profile visibility? By default, everyone on the Facebook
appears in searches of everyone else, and every profile at a certain Institution can be read by
every member of FB at that Institution. However, the FB provides an extensive privacy policy and
offers very granular control to users to choose what information to reveal to whom. As mentioned
above, relative to a FB member, other users can be friends, friends of friends, non-friend users at the same institution, non-friend users at a different institution, or non-friend users at the same geographical location as the user but at a different university (for example, Harvard vs. MIT). Users can select their profile visibility (who can read their profiles) as well as their profile searchability (who can find a snapshot of their profiles through the search features) by type of user. More granular control is given over contact information, such as phone numbers.
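This per-audience model can be pictured as a simple access-control table. The sketch below is purely illustrative - the class names, audience tiers and fields are our own labels for the controls described above, not Facebook's actual API:

```python
from enum import Enum

class Audience(Enum):
    """Illustrative audience tiers, loosely mirroring the categories described above."""
    FRIENDS = 1
    FRIENDS_OF_FRIENDS = 2
    SAME_INSTITUTION = 3
    OTHER_INSTITUTION = 4
    SAME_LOCATION = 5

# Hypothetical per-field visibility settings and a profile-wide searchability setting.
profile_visibility = {
    "basic_profile": {Audience.FRIENDS, Audience.FRIENDS_OF_FRIENDS, Audience.SAME_INSTITUTION},
    "phone_number": {Audience.FRIENDS},   # contact information gets finer-grained control
}
profile_searchability = set(Audience)      # default: the profile can be found by everyone

def can_view(field, viewer):
    """Return True if the given audience tier may read the given field."""
    return viewer in profile_visibility.get(field, set())

print(can_view("phone_number", Audience.SAME_INSTITUTION))  # False
print(Audience.SAME_INSTITUTION in profile_searchability)   # True (searchable by default)
```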
And yet, among current members, 30% claim not to know whether FB grants any way to manage who can search for and find their profile, or think that they are given no such control. Eighteen percent do not know whether FB grants any way to manage who can actually read their profile, or think that they are given no such control. These numbers are not significantly altered by removing the 13 members who claim never to log in to their account. In fact, even frequency of login does not explain the lack of information for some members. On the other hand, members who claim to log in more than once a day are also more likely to believe that they have “complete” control over who can search their profile. Awareness of one’s ability to control who can see one’s profile is not affected by the frequency of login, but is affected by the frequency of update (a Pearson χ²(12) = 28.9182, Pr = 0.004, shows that the distribution is significant). Note the difference between the
two graphs and, specifically, the distribution by frequency of update for respondents who
answered “Do not know” or “No control” (graph on the right). Twenty-two percent of our sample
do not know what the FB privacy settings are or do not remember if they have ever changed
them. Around 25% do not know what the location settings are. To summarize, the majority of FB
members claim to know about ways to control visibility and searchability of their profiles, but a
significant minority of members are unaware of those tools and options. More specifically, we
asked FB members to discuss how visible and searchable their own profiles were. We focused on
those participants who had claimed never to have changed their privacy settings (that by default
make their profile searchable by everybody on FB and visible to anybody at the same Institution),
or who did not know what those settings were. Almost every such respondent realizes that
anybody at their Institution can search their profile. However, 24% incorrectly do not believe that
anybody on FB can in fact search their profile. Misunderstandings about visibility can also go in
the opposite direction: for instance, 16% of current members believe, incorrectly, that anybody on FB can read their profile. In fact, when asked to guess how many people could search for their profile on FB (respondents could answer by selecting one of the following options from a drop-down box: a few hundred, a few thousand, tens of thousands, hundreds of thousands, millions), the
relative majority of members who did not alter their default settings answered, correctly, “Millions.”
However, more than half actually underestimated the number to tens of thousands or less. In
short, the majority of FB members seem to be aware of the true visibility of their profile - but a
significant minority is vastly underestimating the reach and openness of their own profile. Does
this matter at all? In other words, would these respondents be bothered if they realized that their
profile is more visible than what they believe? The answer is complex. First, when asked whether
the current visibility and searchability of the profile is adequate for the user, or whether he or she
would like to restrict it or expand it, the vast majority of members (77% in the case of
searchability; 68% in the case of visibility) claim to be satisfied with what they have - most of them
do not want more or less visibility or searchability for their profiles (although 13% want less
searchability and 20% want less visibility) than what they (correctly or incorrectly) believe to have.
Secondly, FB members remain wary of who can access their profiles, but claim to manage their
privacy fears by controlling the information they reveal.
While respondents are mildly concerned about who can access their personal information and how it can be used, they are not, in general, concerned about the information itself, mostly because they control that information and, with less emphasis, because they believe they have some control over its access. Respondents are fully aware that a social network is based on information sharing: the strongest motivators they report for providing more information are, in fact, “having fun” and “revealing enough information so that necessary/useful to me and other people to benefit from FaceBook.” However, psychological motivations can also explain why information revelation seems disconnected from the privacy concerns. When asked to express whether they considered
the current public concern for privacy on social network sites such as the FaceBook or MySpace to be appropriate (using a 7-point Likert scale), the majority of respondents agree (from mildly to very much) with the idea that the information other FB members reveal may create privacy risks to those members (that is, the other members; average response on a 7-point Likert scale: 4.92) - even though they tend to be less concerned about their own privacy on FB (average response on a 7-point Likert scale: 3.60; a Student’s t-test shows that this is significantly less than the concern for other members: t = -10.1863, Pr(T < t) = 0.0000; a Wilcoxon matched-pairs test provides a similar result: z = -8.738, Pr > |z| = 0.0000). In fact, 33% of our respondents believe that it is either impossible or quite difficult for individuals not affiliated with a university to access the FB network of that university. “Facebook is for the students,” says a student interviewed in [29].
But considering the number of attacks described in [30] or any recent media report on the usage
of FB by police, employers, and parents, it seems in fact that for a significant fraction of users the
FB is only an imagined community.
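The paired comparison reported above (concern for other members' privacy versus concern for one's own) uses standard tests; the sketch below shows how such a comparison could be run on made-up Likert responses, not on the authors' data:

```python
from scipy import stats

# Hypothetical 7-point Likert responses from the same ten respondents.
concern_for_others = [5, 6, 4, 5, 7, 5, 6, 4, 5, 6]
concern_for_self = [3, 4, 3, 4, 5, 3, 4, 2, 4, 3]

# Paired Student's t-test, analogous to the test cited above.
t_stat, t_p = stats.ttest_rel(concern_for_self, concern_for_others)

# Wilcoxon matched-pairs signed-rank test as a non-parametric check.
w_stat, w_p = stats.wilcoxon(concern_for_self, concern_for_others)

print(f"t = {t_stat:.3f}, p = {t_p:.4f}; Wilcoxon W = {w_stat:.1f}, p = {w_p:.4f}")
```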
In order to gauge the accuracy of the survey responses, we compared the answers given to a
question about revealing certain types of information (specifically, birthday, cell phone, home
phone, current address, schedule of classes, AIM screenname, political views, sexual orientation
and the name of their partner) with the data from the actual (visible) profiles. We found that
77.84% of the answers were exactly accurate: if participants said that they revealed a certain type
of information, that information was in fact present; if they wrote it was not present, in fact it was
not. A little more than 8% revealed more than they said they do (i.e. they claim the information is
not present when in fact it is). A little more than 11% revealed less than they claimed they do. In fact, 1.86% claimed that they provide false information on their profile (information is there that they claim is intentionally false or incomplete), and 0.71% claimed that the information they provide is false or incomplete when in fact no information was present. We could not locate the FB profiles for 13 self-reported members who participated in the survey. Of the participants with CMU email addresses, 2 mentioned in the survey that
they had restricted visibility, searchability, or access to certain contact information, and 3 wrote
that not all CMU users could see their profile.
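The accuracy check described above is simple set arithmetic over pairs of answers; a minimal sketch with invented data (the pairs below are hypothetical, not the survey records):

```python
# One entry per (respondent, field): did they claim to reveal it, and is it on the profile?
observations = [
    # (claimed_revealed, actually_present)
    (True, True),     # accurate: claimed revealed, and it is there
    (False, False),   # accurate: claimed not revealed, and it is absent
    (False, True),    # revealed more than claimed (information present despite the claim)
    (True, False),    # revealed less than claimed (information absent despite the claim)
]

n = len(observations)
accurate = sum(claimed == present for claimed, present in observations)
revealed_more = sum((not claimed) and present for claimed, present in observations)
revealed_less = sum(claimed and (not present) for claimed, present in observations)

print(f"accurate: {accurate / n:.1%}, "
      f"revealed more than claimed: {revealed_more / n:.1%}, "
      f"revealed less than claimed: {revealed_less / n:.1%}")
```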
2.6 Social Media Compliance Policy
Social media are powerful communication tools but they carry significant risks to the reputation of
the University and its members. A prominent risk arises from the blurring of the lines between
personal voice and institutional voice. For the purposes of this policy, social media is defined as
media designed to be disseminated through social interaction, created using highly accessible
and scalable publishing techniques using the internet.
Examples of popular social media sites include but are not limited to:
• LinkedIn
• Twitter
• Facebook
• YouTube
• MySpace
• Flickr
• Yammer
• Yahoo/MSN messenger
• Wikis/Blogs
All members of the University using social media tools, including via personal accounts, must be
aware that the same laws, professional expectations, and guidelines apply at all times.
Any social media announcements not issued by the Corporate Communications team, which may
include (but are not limited to) content, comments and opinions must not give the impression they
are in any way the explicit positioning of the University of Liverpool. Any official University position
and comment must be approved by the University Corporate Communications Director or his or
her representative.
The Social Media Compliance Policy helps members of the University to use social media sites
without compromising their personal security or the security of University information assets.
The University has adopted the following principles, which underpin this policy:
• All information assets must be appropriately handled and managed in accordance with their
classification.
• University information assets should be made available to all who have a legitimate need for
them.
• The integrity of information must be maintained; information must also be accurate, complete,
timely and consistent with other related information and events.
• All members of the University, who have access to information assets, have a responsibility to
handle them appropriately and in accordance with their classification.
• Information asset owners are responsible for ensuring that the University classification
scheme (which is described in the Information Security Policy) is used appropriately [31].
Below are examples of threats and risks associated with the use of social media.
Threats:
• Introduction of viruses and malware to the University network.
• Exposure of the University and its members through a fraudulent or hijacked presence, e.g. unofficial social media accounts.
• Unclear or undefined ownership of content rights for information posted to social media sites (copyright infringement).
• Use of personal accounts to communicate University-owned information assets.
• Excessive use of social media within the University.
Risks:
• Data leakage/theft.
• System downtime.
• Resources required to clean systems.
• Customer backlash/adverse legal actions.
• Exposure of University information assets.
• Reputational damage.
• Targeted phishing attacks on customers or employees.
• Exposure of customer information.
• Privacy violations.
• Network utilisation issues.
• Productivity loss.
• Increased risk of exposure to viruses and malware due to longer duration of sessions.
The University’s Information Security Policy defines the categories which are assigned to
University information assets. Those assets which are classified as Confidential, Strictly
Confidential or Secret must not be posted on social media sites. Postings must not contain
personal information concerning students, employees, or alumni. In addition, members of the
University must be aware of copyright and intellectual property rights of others and of the
University.
Members of the University should be aware of the social media terms and conditions and
ownership of content before submitting. Members should familiarise themselves with key
University policies including:
• University IT Regulations
• Information Asset Classification Policy
• Copyright Policy
• Data Protection Policy
• Acceptable Use of Electronic Resources [31]
Communication via social media sites and tools must protect the University’s institutional voice by
remaining professional in tone and in good taste. Members of the University who use personal
social media accounts must not give the impression that their social media site represents the
explicit positioning of the University of Liverpool. This should be considered when:
• Naming pages or accounts
• Selecting a profile picture or icon
• Selecting content to post [31]
Names, profile images, and posts should all be clearly linked to the particular Faculty, School,
Institute and Professional Service Department. In addition, all University pages must have an associated member who is identified as being the information asset owner and who is responsible for its official affiliation with the University. If you are responsible for representing official social
media accounts on behalf of the University of Liverpool when posting on a social media platform,
clearly acknowledge this.
Members of the University must be aware that the University has the right to request the removal
of content from an official social media account and from a personal account if it is deemed that
the account or its submissions pose a risk to the reputation of the University or to that of one of its
members.
All members of the University are directly responsible and liable for the information they handle.
Members of staff are bound to abide by the University IT regulations by the terms of their
employment. Students are bound to abide by the University IT Regulations when registering as a
student member of the University.
Authorised members of the University may monitor the use and management of information
assets to ensure effective use and to detect unauthorised use of information assets.
2.7 Threats and Anti-threats Strategies for Social Network Sites
Online Social Networks (OSNs) such as Facebook, Twitter, and MySpace play an important role
in our society, which is heavily influenced by the Information Technology (IT) [32]. In spite of their
user-friendly and free-of-cost services, many people still remain reluctant to use such networking sites because of privacy concerns. Many people claim these sites have made the world more public and less private – consequently, a world with weaker moral norms is evolving. Some consider this social change positive because people are more connected to each other. Nowadays almost all people use social networking websites to share their ideas, photos, and videos with their friends and relatives, and even to discuss many aspects of their daily life - not only social issues but also politics, which recently helped to change regimes in countries such as Egypt, Libya, and Tunisia. The first communication in the form of a social network took place in 1971, when the first email was sent from one computer to another connected machine. Data exchange over phone lines followed in 1987, when bulletin board systems and browsers were used for the first time to establish this principle of communication. Although social networking websites offer users many advantages for communicating and exchanging information, as mentioned above, they also have a negative impact: many people spend all their time on such websites and neglect their duties, and some even ruin their lives because of illegal content such as pornography, terrorism, and religious extremism, among others. Social networking websites must address
many security issues in order to work successfully and to be used by people, who unfortunately trust most websites. Social networking websites should provide safe storage for personal data, tools for managing and securing data, and controls on access to the stored data according to defined restrictions. They must also add features that restrict other users' access to individual data. Unfortunately,
most social networks have not yet implemented these measures. There are many attacks on social networks, such as worms, viruses, Trojan horses and phishing websites, as well as malicious programs, software vulnerabilities and other top attack vectors such as SQL injection. Users should be worried about all types of attacks and have the right firewalls and software to protect themselves from being hacked by cyber criminals. Social networking features include:
 Global reach: geographical and spatial barriers are eliminated.
 Interaction: sites give space for active participation by viewers and readers.
 Ease of use: most social networks can be used easily, and they contain images and symbols that make user interaction easier [34].
Social networks can be divided, according to the service provided or the goals behind their creation, into types such as personal networks and cultural networks.
Social networks can also be divided, according to the mode of communication and the services offered, into three types:
 Networks that allow written communication.
 Networks that allow voice communication.
 Networks that allow visual communication [32].
Social networks attract thousands of users who represent potential victims for attackers of the following types (see Figure 1) [34].
FIGURE 1. Threats percentage on social networks (Sophos 2010 Security Threat Report).
There are many procedures that help us protect our privacy as much as possible when using social networks:
 Be careful: do not post sensitive information on your profile page, bulletin boards, instant messaging or any other form of electronic publishing on the internet, so that your identity is protected against theft and other security threats.
 Be skeptical, because social networking websites are full of illegitimate users, hackers and cyber criminals.
 Be wise and think twice before you post anything on social networking websites.
 Be polite: do not publish illegal pictures or videos, do not write abusive messages, and remember that what you publish reflects on you personally and makes you an ambassador to others on the internet.
 Read the privacy policy of all social networks before using them [35].
Cyber threats that users might face can be grouped into two categories.
Privacy concerns demand that user profiles never publish and distribute information over the web. Personal home pages may contain very sensitive data such as birth dates, home addresses, personal mobile numbers and so on. This information can be used by hackers who apply social engineering techniques to exploit such sensitive information and steal money. Generally, there are two types of security issues: one is the security of people; another is the security of the computers people use and the data they store in their systems. Since social networks have enormous numbers of users and store enormous amounts of data, they are natural targets for spammers, phishing and malware attacks. Moreover, online social attacks include identity theft, defamation, stalking, injuries to personal dignity and cyberbullying. Hackers create false profiles to mimic personalities or brands, or to slander a known individual within a network of friends [36].
Anti-threat strategies. Recent work in programming language techniques demonstrates that it
is possible to build online services that guarantee conformance with strict privacy policies. On the
other hand, since service providers need private data to generate revenue, they have a
motivation to do the opposite. Therefore, the research question to be addressed is to what extent
a user can ensure his/her privacy while benefiting from existing online services.
The following strategies are recommended for circumventing threats associated with social websites [33]:
a) Building awareness of information disclosure: users must take care and be very conscious about revealing personal information in their profiles on social websites.
b) Encouraging awareness-raising and educational campaigns: governments should provide and offer educational classes on awareness-raising and security issues.
c) Modifying the existing legislation: existing legislation needs to be updated to address new technologies and new kinds of fraud and attack.
d) Empowering authentication: access control and authentication must be very strong so that cybercrimes committed by hackers, spammers and other cybercriminals can be reduced as much as possible (a minimal illustration of credential handling follows this list).
e) Using the most powerful antivirus tools: users must use the most powerful antivirus tools with regular updates and must keep the appropriate default settings, so that the antivirus tools can work more effectively.
f) Providing suitable security tools: security software providers should offer special tools that enable users to remove their accounts and to manage and control the different privacy and security settings.
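As a small illustration of point (d), strong authentication begins with never storing passwords in clear text. The sketch below, using only Python's standard library, derives and verifies a salted password hash; the parameters are illustrative defaults, not a recommendation taken from the cited sources:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor

def hash_password(password, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 digest to store instead of the password itself."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Recompute the digest and compare it in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```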
2.8 Facebook’s Privacy Policy in the Law
Facebook has apparently decided to delay a proposed new privacy policy after a coalition of
privacy groups asked the Federal Trade Commission on Wednesday to block the changes on the
grounds that they violated a 2011 settlement with the regulatory agency.
A spokeswoman for the F.T.C. confirmed Thursday that the agency had received the letter but
had no further comment.
In a statement published by The Los Angeles Times and Politico on Thursday afternoon,
Facebook said, “We are taking the time to ensure that user comments are reviewed and taken
into consideration to determine whether further updates are necessary and we expect to finalize
the process in the coming week.”
Asked about the delay, a Facebook spokesman said he was unaware of the latest developments.
When it first announced the changes on Aug. 28, Facebook told its 1.2 billion users that the
updates were “to take effect on September 5.” [37]
The changes, while clarifying how Facebook uses some information about its users, also
contained a shift in legal language that appeared to put the burden on users to ask Facebook not
to use their personal data in advertisements. Previously, the company’s terms of use, and its
settlement with the F.T.C., had indicated that it wouldn’t use such personal data without explicit
consent. Facebook’s new terms would also allow it to use the names and photos of teenagers in
advertising, an area of particular concern to privacy advocates because children may be unaware
of the consequences of actions such as liking a brand page.
The original proposal has drawn tens of thousands of comments from Facebook users, most of
them opposed to the changes [37].
Individual differences have been found to greatly impact information disclosure and perceptions of
privacy. The traditional division of users according to their privacy attitudes was proposed by
Hofstede (Hofstede 1996): users were classified into two extreme groups, which were reminiscent
of the classic sociological concepts of individualist and collectivist societies. More recently,
ethnographic studies have explored privacy attitudes of social-networking users, especially of
Facebook ones. After surveying the same group of students twice, once in 2009 and later in
2010, Boyd et al. revealed that users’ confidence in changing their privacy settings strictly
depended on frequency of site use and Internet literacy (Boyd and Hargittai 2010). Also, gender
and cultural differences seem to affect privacy attitudes. Lewis et al. found that women’s profiles
are more likely than men’s to be private and that there is a relationship between music taste and
degree of disclosure of one’s profile (Lewis, Kaufman, and Christakis 2008). Finally, ethnicity
plays a role. Chang et al. studied the privacy attitudes of U.S. Facebook users of different
ethnicities and found that Hispanics tend to share more pictures, Asians more videos, and Afro-
Americans more status updates (Chang et al. 2010). Whether one discloses personal information partly
depends on one’s personality traits. In particular, it has been found to depend on the big five
personality traits and the self-monitoring trait, all of which are discussed next. The five-factor
model of personality, or the big five, consists of a comprehensive and reliable set of personality
concepts (Costa and Mccrae 2005; Goldberg et al. 2006). The idea is that an individual can be
associated with five scores that correspond to five main personality traits. Personality traits
predict a number of real-world behaviors. They, for example, are strong predictors of how
marriages turn out: if one of the partners is high in Neuroticism, then divorce is more likely (Nettle
2007). Research has consistently shown that people’s scores are stable over time (they do not
depend on quirks and accidents of mood) and correlate well with how others close to them (e.g.,
friends) see them (Nettle 2007). The relationship between use of Facebook and the big five
personality traits has been widely studied, yet contrasting results have emerged. To recap, the
trait of Openness has been found to be associated with higher information disclosure, while
Conscientiousness has been associated with cautious disclosure. Contrasting findings have
emerged from the traits of Extraversion, Agreeableness, and Neuroticism. Schrammel et al.
wrote: “personality traits do not seem to have any predictive power on the disclosure of
information in online communities” (Schrammel, K¨offel, and Tscheligi 2009). In the same vein,
Ross et al. concluded that “personality traits were not as influential as expected, and most of the
predictions made from previous findings did not materialise” (Ross et al. 2009). These
researchers, however, conceded that they only considered the big five personality traits (while
other traits might better explain the use of Facebook) and that their samples consisted of only
undergrad students. Current studies might tackle these two limitations by, for example: 1)
considering personality traits other than the big five; and 2) having participants who are not only
undergrad students, who might well be more Internet savvy than older people. Next, we will
describe how we partly tackle the first problem by considering the personality trait of “self-
monitoring”. In Section 5, we will describe how we tackle the second problem by collecting a
representative sample of Facebook users in the United States. Having identified the personality
traits that have been associated with information disclosure in the literature, we now need to
model the process of disclosure itself. By doing so, we will be able to quantify a user’s disposition
to disclose her/his personal information. One measure of information disclosure is the count of
fields (e.g., birthday, hometown, religion) a user shares on her/his profile. The problem of this
approach is that some information fields are more revealing than others, and it is questionable to
assign arbitrary (relevance) weights to those fields. An alternative way is to resort to a
psychometric technique called Item Response Theory (IRT), which has already been used for
modeling information disclosure in social media (Liu and Terzi 2009). Next, we detail what IRT is,
why we chose it, and when and how it works. What IRT is. IRT is a psychometric technique used
to design tests and build evaluation scales for those tests. It extracts patterns from participants’
responses and then creates a mathematical model upon the extracted patterns, and this model
can then be used to obtain an estimate of user attitude (de Bruin and Buchner 2010) (MacIntosh
1998) (Vishwanath 2006).
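For readers unfamiliar with IRT, the two-parameter logistic (2PL) model is the usual starting point: the probability that a user discloses a field is a logistic function of the user's latent disclosure attitude, the field's "difficulty" (how sensitive it is perceived to be) and its discrimination. The sketch below only illustrates the model's shape; the parameter values are invented, not those fitted by Liu and Terzi:

```python
import math

def p_disclose(theta, a, b):
    """2PL IRT model: probability of disclosing a field, given the user's latent
    disclosure attitude theta, the field's discrimination a, and its difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Invented parameters: more sensitive fields have a higher difficulty b, so only
# users with a high disclosure attitude are likely to reveal them.
fields = {
    "birthday": (1.2, -1.0),
    "political_views": (1.0, 0.3),
    "phone_number": (1.5, 1.8),
}

for name, (a, b) in fields.items():
    print(name, round(p_disclose(theta=0.5, a=a, b=b), 2))
```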
Limitations of our data. Critics might rightly put forward one important issue with our data: some
users might have responded untruthfully to personality tests. However, validity tests on the
personality data suggest that users have responded to the personality questions accurately and
honestly, and that is likely because they installed the application motivated primarily by the
prospect of taking and receiving feedback from a high quality personality questionnaire. Critics
might then add that, since our sample consists of self-selected users who are interested in their
personality, these users might be more active than the general Facebook population. We have
looked into this matter and found that our users do not seem to be more active than average - we
have reported that the number of contacts for the average user in our sample is 124 whereas
Facebook reports an average of 130. Limitations of our study. Our study has four main
limitations. First, we have computed exposure scores only upon user-specified fields, while we
should have ideally considered additional elements (e.g., photos, wall comments, fan page
memberships, segmentation of one’s social contacts into friend groups). We were not able to do
so as we did not have full access to user profiles - we could access only the elements that we
have been studying here. So we looked at sharing of profile fields, which is the simplest instance
of sharing, making it an ideal starting point. Yet, even by considering only profile fields, we have
learned that our users are far from behaving in the same way - very different privacy attitudes
emerge (Figure 1). Second, we have considered Facebook users who live in the United States.
Since cultural guidelines clearly exist about what is more and less private and public, one should
best consider that our results are likely to hold for Facebook users in the United States. We would
refrain from generalizing our results to other social-networking platforms (e.g., Twitter) or to any
other society - or even to certain subcultures within the United States or within Facebook. To
partly tackle this limitation, we are currently studying users of countries other than the United States
and platforms other than Facebook (i.e., Twitter). Third, information disclosure and concealment
might be likely confounded by Facebook activity and one thus needs to control for it. We did not
do so because activity information is not readily available from Facebook’s API, and future
studies should propose reasonable proxies for Facebook activity. Fourth, information disclosure
might be confounded by the general desire to complete one’s profile: fields like “interested in” are
quick to fill out, whereas “work information” just takes more effort and is not generally relevant to
Facebook interactions. Yet, this reflects what the “relationship currency” of Facebook is (Nippert-
Eng 2010) - that is, it reflects which fields are used to maintain relationships and which are
instead concealed without causing any harm.
Novelty of Information Disclosure Model. IRT has already been used to model privacy attitudes of
social-networking users (Liu and Terzi 2009). Therefore, our methodology closely followed what
has been proposed before. However, since IRT has never been applied to large-scale
social-networking data, we have run into a number of problems. First, algorithms used for estimating IRT’s parameters turned out to have a greater impact on result accuracy than what one might expect. We found that best results are achieved if scores are computed in ways
different than those proposed in (Liu and Terzi 2009) and similar to those proposed in
psychometrics research (Baker 2001). Second, for a large number of users, errors associated
with their exposure scores are unacceptably high. We had to filter those users out, yet we
worked with a sample of 1,323 users. However, for studies starting off with a smaller initial
sample, this problem should be addressed. Theoretical Implications. Studies that have attempted
to understand how individuals conceptualize privacy offline and online have often resorted to in-
depth interviews. This methodology has enabled scholars to better ground their work on exactly
what people say about privacy in their daily experiences. We have proposed the use of a
complementary methodology: we have studied what people do on Facebook by modeling the
work of disclosure and concealment which their profiles reflect. As a result, we have quantified
the extent to which personality traits affect information disclosure and concealment, and we have
done so upon large-scale Facebook data that has been collected unobtrusively as the users were
on the site. We have found that Openness is correlated, albeit weakly, with the amount of
personal information one discloses. By contrast, after controlling for Extraversion, the self-
monitoring trait’s contribution disappears, suggesting that high self-monitors might present
themselves in likable ways, as the literature would suggest, but they do not tend to share more
private information or to make that information more visible. In addition to differentiating users
based on their personality traits, we have also found important differences among profile fields
and quantified the extent to which not all private fields are equally private: for example, work-
related (job status) information is perceived to be more private than information on whether one is
looking for a partner. Practical Implications. There are two areas in which our findings could be
practically applied in the short term. The first is social media marketing. Marketing research has
previously found that individuals high in Openness tend to be innovators and are more likely to
influence others (Amichai-Hamburger and Vinitzky 2010). Since we have found that these
individuals also tend to have less restrictive privacy settings, social media marketing campaigns
would be able to identify influentials by determining which users have less restrictive privacy
settings. The second area is privacy protection. Our results suggest that, by simply using age and
gender, one could offer a preliminary way of personalizing default privacy settings, which users
could then change. One could also imagine building privacy-protecting tools that inform users
about the extent to which they are exposing information that is generally considered to be
sensitive by the community. This tool could also exploit the independence assumption behind IRT
to parallelize the estimation of the model’s parameters and thus be able to monitor who is
exposing privacy-sensitive information in real time.
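As an illustration of the privacy-protecting tool suggested above, one could flag shared fields whose community-level sensitivity (for instance, the estimated IRT difficulty) exceeds a threshold. The field list, scores and threshold below are invented for the sketch:

```python
# Hypothetical community-level sensitivity scores per field (higher = more private),
# e.g. derived from the IRT difficulty estimates discussed earlier.
FIELD_SENSITIVITY = {
    "birthday": 0.2,
    "political_views": 0.6,
    "phone_number": 0.9,
    "home_address": 0.95,
}

def exposure_warnings(shared_fields, threshold=0.5):
    """Return the shared fields that the community generally treats as sensitive."""
    return sorted(f for f in shared_fields if FIELD_SENSITIVITY.get(f, 0.0) >= threshold)

print(exposure_warnings({"birthday", "phone_number", "home_address"}))
# ['home_address', 'phone_number']
```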
2.9 Social Media Report and the Law
Social media is an increasingly important means of communicating and connecting with
customers. Even if you do not directly engage with social media, you should consider monitoring
activity on social media that relates to your organisation. Not all interaction on social media will
be positive. You should be prepared for active and sometimes critical debate with social media
users. Attempting to suppress it may backfire. Garner support from the social media community
and argue your case on the merits. Care must be taken over social media postings. They could
lead to serious embarrassment or even be actionable. Social media is subject to the same rules
as information published by other means. If you have paid or otherwise arranged for others to
post on your behalf, you should make this clear. Do not try to pass these posts off as genuine
user-generated content. In some cases, you may become liable for user-generated content. You
should consider monitoring user-generated content or, at the very least, putting an effective notice
and take down procedure in place. If your organisation is listed, you should ensure you comply
with relevant disclosure rules for inside information. In most cases, information should be
disclosed via a regulatory information service and not just via social media. Those using social
media to communicate on behalf of your organisation should be given very clear guidelines and
training [39].
For most companies, some engagement with social media is unavoidable. Regardless of whether
or not a company has an “official” social media page or account, it is very likely that it is the
subject of a number of unofficial pages, its employees are interacting with social media and it is
the subject of lively discussion online. For example, our review of the FTSE 100 Companies’ use
of social media also looked at unofficial Facebook pages, i.e. pages that do not appear to have
been authorised by that company. These have typically been set up by employees, ex-
employees, customers or pressure groups and are not always complimentary. A key reason to
become involved in social media is engagement. Social media provides a means for that
organisation to directly interact with its customers and to obtain feedback from them. Handled
properly, this can help to build a brand and create a buzz and awareness to help generate new
marketing leads and attract new customers. Perhaps the most beneficial outcome from this
interaction is social validation – users who start to “organically” advocate your organisation or its
aims and ideals. This “electronic word of mouth” can be a very powerful marketing and
influencing tool. Engagement with users might include commenting on topical issues and the
organisation using social media to put their side of the story across. One example comes from
energy companies who have used their social media presence to comment on rising fuel costs
and discuss the link between costs and wholesale energy prices, as well as providing information
on related topics such as energy efficiency. One exercise most organisations will want to do, even if they do not directly engage with social media, is to listen to social media conversations to find out what users are saying about them. There are a number of off-the-shelf and cloud-based tools that can be used for this purpose. They not only allow monitoring of individual conversations but also monitoring on a macro-level, for example through sentiment analysis. These tools automatically analyse the 500 million tweets sent every day and report on how often a topic is mentioned and whether it is mentioned in a positive or negative context [39].
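Off-the-shelf monitoring tools do this at scale; the toy sketch below only illustrates the underlying idea of counting mentions and crudely scoring sentiment with keyword lists (the keywords and posts are invented):

```python
POSITIVE = {"great", "love", "helpful", "thanks"}
NEGATIVE = {"outage", "scam", "terrible", "rip-off"}

def score(post):
    """Very crude sentiment score: +1 per positive keyword, -1 per negative keyword."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "Thanks for the helpful reply, great service",
    "Another price rise? This is a rip-off",
]
mentions = len(posts)
net_sentiment = sum(score(p) for p in posts)
print(f"{mentions} mentions, net sentiment {net_sentiment:+d}")
```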
Finally, social media is an increasingly important tool for advertising. Some social media platforms are able to build detailed profiles about their users, including their sex, age, location, marital status and connections. This allows advertising to be targeted precisely. One risk of social media is too
much engagement. Some users are more than happy to share their views in very
uncompromising terms. A thick skin is vital. For example, energy companies using social media
accounts to comment on rising fuel costs have received direct and frank feedback from their
customers. The same is true in financial services and many other industry sectors. Attempting to
suppress critical comment may backfire. Instead, you need to engage with customers and argue your case on the merits. This means your social media team needs to be properly resourced. They
will also need to update your social media presence to ensure it remains fresh and manages to
respond to user comments. This engagement with users is a vital part of any social media
strategy, either to build a distinctive brand (as is the case with Tesco Mobile) or to mollify users’
concerns (as is the case with energy companies). Another problem is embarrassing or ill-judged
posts. There are numerous topical examples. For example, Ian Katz, the editor of Newsnight,
sent a Tweet in September referring to “boring snoring rachel reeves” (Shadow Chief Secretary to
the Treasury) following her appearance on the programme. He had intended to send a private
message to a friend. The circumstances in which the Tweet was sent are not clear but there have
certainly been other cases in which late night rants on Twitter have caused significant
embarrassment[39]. Some postings may not just be stupid but also actionable. One example is
the notorious Tweet by Sally Bercow: “Why is Lord McAlpine trending? *innocent face*”. That
Tweet was made when there was significant speculation about the identity of a senior unnamed
politician who had been engaged in child abuse. The Tweet was false and found to be seriously
defamatory. It highlights a number of risks with social media. Firstly, the repetition rule applies
such that those repeating a defamatory allegation made by someone else are treated as if they
had made it themselves. This is relevant when retweeting or reposting content. Secondly, while
the courts provide some leeway for the casual nature of social media and the fact that those who
participate “expect a certain amount of repartee or give and take”, that protection only extends
so far. Civil and criminal liability for social media postings can also arise in a number of other
ways. Paul Chambers was initially convicted for sending a “menacing” message after
Tweeting: “Crap! Robin Hood airport is closed. You’ve got a week and a bit to get your shit
together, otherwise I’m blowing the airport sky high!!”. He was fined £385 and ordered to pay £600 in costs. However, this prosecution was overturned on appeal. As with more traditional
formats, sales and marketing use of social media must be decent, honest and truthful. Most of the
specific social media issues arise out of social validation, i.e. users who “organically” advocate
that organisation or its aims and ideals. As this is such a powerful tool there is a real risk of it
being misused. The Office of Fair Trading has already taken enforcement action against an
operator of a commercial blogging network, Handpicked Media, requiring them to clearly identify
when promotional comments have been paid for. This included publication on website blogs and
 
APPLYING THE TECHNOLOGY ACCEPTANCE MODEL TO UNDERSTAND SOCIAL NETWORKING
APPLYING THE TECHNOLOGY ACCEPTANCE MODEL TO UNDERSTAND SOCIAL NETWORKING APPLYING THE TECHNOLOGY ACCEPTANCE MODEL TO UNDERSTAND SOCIAL NETWORKING
APPLYING THE TECHNOLOGY ACCEPTANCE MODEL TO UNDERSTAND SOCIAL NETWORKING
 
H018144450
H018144450H018144450
H018144450
 
Towards Decision Support and Goal AchievementIdentifying Ac.docx
Towards Decision Support and Goal AchievementIdentifying Ac.docxTowards Decision Support and Goal AchievementIdentifying Ac.docx
Towards Decision Support and Goal AchievementIdentifying Ac.docx
 
A REVIEW ON SOCIOLOGICAL IMPACTS OF SOCIAL NETWORKING
A REVIEW ON SOCIOLOGICAL IMPACTS OF SOCIAL NETWORKINGA REVIEW ON SOCIOLOGICAL IMPACTS OF SOCIAL NETWORKING
A REVIEW ON SOCIOLOGICAL IMPACTS OF SOCIAL NETWORKING
 
Social networking boon or a bane
Social networking boon or a baneSocial networking boon or a bane
Social networking boon or a bane
 
Integration of Bayesian Theory and Association Rule Mining in Predicting User...
Integration of Bayesian Theory and Association Rule Mining in Predicting User...Integration of Bayesian Theory and Association Rule Mining in Predicting User...
Integration of Bayesian Theory and Association Rule Mining in Predicting User...
 
Integration of Bayesian Theory and Association Rule Mining in Predicting User...
Integration of Bayesian Theory and Association Rule Mining in Predicting User...Integration of Bayesian Theory and Association Rule Mining in Predicting User...
Integration of Bayesian Theory and Association Rule Mining in Predicting User...
 
Kt3518501858
Kt3518501858Kt3518501858
Kt3518501858
 
A Survey on the Social Impacts of On-line Social Networking Sites
A Survey on the Social Impacts of On-line Social Networking SitesA Survey on the Social Impacts of On-line Social Networking Sites
A Survey on the Social Impacts of On-line Social Networking Sites
 
Running Head SOCIAL NETWORKS DATA PRIVACY POLICIES1.docx
Running Head SOCIAL NETWORKS DATA PRIVACY POLICIES1.docxRunning Head SOCIAL NETWORKS DATA PRIVACY POLICIES1.docx
Running Head SOCIAL NETWORKS DATA PRIVACY POLICIES1.docx
 
Assessing the Knowledge on Internet Addiction and Cybersecurity: A Literature...
Assessing the Knowledge on Internet Addiction and Cybersecurity: A Literature...Assessing the Knowledge on Internet Addiction and Cybersecurity: A Literature...
Assessing the Knowledge on Internet Addiction and Cybersecurity: A Literature...
 
International Journal of Engineering Research and Development
International Journal of Engineering Research and DevelopmentInternational Journal of Engineering Research and Development
International Journal of Engineering Research and Development
 

Recently uploaded

Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfPatidar M
 
MS4 level being good citizen -imperative- (1) (1).pdf
MS4 level   being good citizen -imperative- (1) (1).pdfMS4 level   being good citizen -imperative- (1) (1).pdf
MS4 level being good citizen -imperative- (1) (1).pdfMr Bounab Samir
 
Scientific Writing :Research Discourse
Scientific  Writing :Research  DiscourseScientific  Writing :Research  Discourse
Scientific Writing :Research DiscourseAnita GoswamiGiri
 
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...Association for Project Management
 
ROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxVanesaIglesias10
 
Q-Factor General Quiz-7th April 2024, Quiz Club NITW
Q-Factor General Quiz-7th April 2024, Quiz Club NITWQ-Factor General Quiz-7th April 2024, Quiz Club NITW
Q-Factor General Quiz-7th April 2024, Quiz Club NITWQuiz Club NITW
 
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITWQ-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITWQuiz Club NITW
 
4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptx4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptxmary850239
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfJemuel Francisco
 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptxmary850239
 
Unraveling Hypertext_ Analyzing Postmodern Elements in Literature.pptx
Unraveling Hypertext_ Analyzing  Postmodern Elements in  Literature.pptxUnraveling Hypertext_ Analyzing  Postmodern Elements in  Literature.pptx
Unraveling Hypertext_ Analyzing Postmodern Elements in Literature.pptxDhatriParmar
 
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvRicaMaeCastro1
 
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxMan or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxDhatriParmar
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxHumphrey A Beña
 
How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17Celine George
 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4JOYLYNSAMANIEGO
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfVanessa Camilleri
 
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...DhatriParmar
 

Recently uploaded (20)

Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdf
 
MS4 level being good citizen -imperative- (1) (1).pdf
MS4 level   being good citizen -imperative- (1) (1).pdfMS4 level   being good citizen -imperative- (1) (1).pdf
MS4 level being good citizen -imperative- (1) (1).pdf
 
Scientific Writing :Research Discourse
Scientific  Writing :Research  DiscourseScientific  Writing :Research  Discourse
Scientific Writing :Research Discourse
 
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
Team Lead Succeed – Helping you and your team achieve high-performance teamwo...
 
ROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptx
 
prashanth updated resume 2024 for Teaching Profession
prashanth updated resume 2024 for Teaching Professionprashanth updated resume 2024 for Teaching Profession
prashanth updated resume 2024 for Teaching Profession
 
Q-Factor General Quiz-7th April 2024, Quiz Club NITW
Q-Factor General Quiz-7th April 2024, Quiz Club NITWQ-Factor General Quiz-7th April 2024, Quiz Club NITW
Q-Factor General Quiz-7th April 2024, Quiz Club NITW
 
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITWQ-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
 
4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptx4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptx
 
Faculty Profile prashantha K EEE dept Sri Sairam college of Engineering
Faculty Profile prashantha K EEE dept Sri Sairam college of EngineeringFaculty Profile prashantha K EEE dept Sri Sairam college of Engineering
Faculty Profile prashantha K EEE dept Sri Sairam college of Engineering
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx
 
Unraveling Hypertext_ Analyzing Postmodern Elements in Literature.pptx
Unraveling Hypertext_ Analyzing  Postmodern Elements in  Literature.pptxUnraveling Hypertext_ Analyzing  Postmodern Elements in  Literature.pptx
Unraveling Hypertext_ Analyzing Postmodern Elements in Literature.pptx
 
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
 
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxMan or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
 
How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17
 
Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4Daily Lesson Plan in Mathematics Quarter 4
Daily Lesson Plan in Mathematics Quarter 4
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdf
 
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
 

Comprehensive Analysis of Social Media Security and Espionage Technology XKeyscore

  • 1. Adam Ali.Zare Hudaib International Journal of Computer Science and Security (IJCSS), Volume (8) : Issue (4) : 2014 97 Comprehensive Social Media Security Analysis & XKeyscore Espionage Technology Adam Ali.Zare Hudaib adamhudaib@gmail.com Licensed Penetration Tester EC-Council Certified Ethical Hacker EC-Council Certified Security Analyst EC-Council Wireshark Certified Network Analyst ( Wireshark University) CEH , ECSA , LPT , WCNA Sweden Abstract Social networks can offer many services to the users for sharing activities events and their ideas. Many attacks can happened to the social networking websites due to trust that have been given by the users. Cyber threats are discussed in this paper. We study the types of cyber threats, classify them and give some suggestions to protect social networking websites of variety of attacks. Moreover, we gave some antithreats strategies with future trends. Keywords: XKeyscore, Social Media Security, Privacy, TFC, Cyber Threats. 1. INTRODUCTION In recent years, global online social network (OSN) usage has increased sharply as these networks become interwoven into people’s everyday lives as places to meet and communicate with others. Online social networks, such as Facebook, Google+, LinkedIn, Sina Weibo, Twitter, Tumblr, and VKontakte (VK), have hundreds of millions of daily active users. Facebook, for example, has more than 1.15 billion monthly active users, 819 million of which are active mobile Facebook users as of June 2013. Facebook users have been shown to accept friendship requests from people whom they do not know but who simply have several friends in common. By accepting these friend requests, users unknowingly disclose their private information to total strangers. This information could be used maliciously, harming users both in the virtual and in the real world. These risks escalate when the users are children or teenagers who are by nature more exposed and vulnerable than adults. As the use of online social networks becomes progressively more embedded into the everyday lives of users, personal information becomes easily exposed and abused. Information harvesting, by both the online social network operator itself and by third-party commercial companies, has recently been identified as a significant security concern for OSN users. Companies can use the harvested personal information for a variety of purposes, all of which can jeopardize a user’s privacy. For example, companies can use collected private information to tailor online ads according to a user’s profile, to gain profitable insights about their customers, or even to share the user’s private and personal data with the government. This information may include general data, such as age, gender, and income; however, in some cases more personal information can be exposed, such as the user’s sexual orientation and even if the user consumed addictive substances. These privacy issues become more alarming when considering the nature of online social networks: information can be obtained on a network user without even directly accessing that individual’s online profile; personal details can be inferred solely by collecting data on the user’s friends.
2. SOCIAL NETWORK SITES: METRICS, PRIVACY
2.1 Social Network Sites: Usage Metrics, Network Structure and Privacy
Social Network Sites (SNSs) exhibit wide popularity, high diffusion and an increasing number of features. Specifically, Facebook, which currently holds a prime position among SNSs, has a continuously evolving feature set and one billion monthly active users, approximately 81% of whom are from outside the U.S. and Canada, and 604 million of whom access the site via mobile devices [1]. Given this diversity, an effective way of understanding Facebook is by exploring motives for using the service via theoretical frameworks such as Uses and Gratifications (U&G) [2]. A good understanding of these motives can shed light onto the intricate mechanisms behind important aspects of SNSs, such as site adoption, participation [4], information seeking [5], and the privacy of users [2]. Privacy, in particular, is a major concern, since it dictates the usage decisions of many SNS users and since Facebook, specifically, has found itself under harsh criticism regarding the enactment of highly contentious privacy policies and privacy-sensitive features [3].

The emergence of social sites also represents a valuable research resource. Indeed, scholars have highlighted the enormous potential of taking advantage of data that are generated electronically when people use online services [3]. Furthermore, compared to the methods and data available to traditional social scientists, online information can be accessed and analyzed computationally in ways that are both efficient and accurate [3]. In particular, in the case of Facebook, a rich, robust Application Programming Interface (API) allows researchers to collect large volumes of data relating to issues such as site feature use and personal network structure with unprecedented accuracy, granularity and reliability.

Media is consumed for a wide range of purposes, and individuals use different media channels to achieve very different ends [6]. U&G is a theoretical framework for studying these motives and outcomes – fundamentally, the "how" and "why" of media use [2]. A key strength of the approach is its established and broadly applicable frame of analysis (covering media as diverse as tabloids and reality TV, used for entertainment or social connection) with social and psychological antecedents (such as demographics) and cognitive, attitudinal, or behavioral outcomes (such as usage patterns) [3]. U&G has recently proven valuable in exploring and explaining a wide variety of social media phenomena, including topics as diverse as the motivations for contributing content to an online community [4], explaining why political candidates are befriended [1], and cataloguing the psychosocial well-being of teenage girls. U&G studies have explored behavior on most common forms of social media, including content sharing sites (e.g., YouTube), SNSs (e.g., Myspace [1]), media sharing communities, and blogs [3]. Taken together, this work highlights the importance of eliciting and understanding users' motives in social media, as well as the value of employing data from a natural research instrument, like Facebook, for social studies. Facebook usage, in particular, has most commonly been captured by self-report methods using surveys. Typical questions include time spent on site and visit frequency [3].
Acknowledging the lack of rigor in such ad-hoc methods, the Facebook Intensity Scale was introduced to capture the extent to which a user is emotionally connected to Facebook and the extent to which Facebook is integrated into their daily activities. The scale has subsequently been adopted in a number of other studies. The advent of SNSs has greatly facilitated the capture of personal social network data, and a wide range of useful metrics can now be calculated automatically and in real time [2]. Commonly used metrics include (a short computational sketch follows the list):
• Network Size: The number of nodes in a participant's egocentric network, i.e., the number of friends that an individual has. Correlations have been shown between network size and personality and social capital.
• Network Density: The extent to which nodes in an egocentric network are interconnected – essentially, how many of an individual's friends know each other. This is calculated as the ratio of the number of ties to the number of possible ties.
• Average Degree: The mean number of mutual friends in an egocentric network. Higher values on this statistic have previously been associated with bonding social capital and higher socioeconomic status.
• Average Path Length: The average geodesic distance between all pairs of nodes in the network.
• Diameter: The longest geodesic distance within the network, i.e., the maximum distance between two nodes.
• Network Modularity: A scalar value between −1 and 1 that measures the density of links inside communities as compared to links between communities.
• Number of Connected Components: The number of distinct clusters within a network. This has been interpreted as the number of an individual's social contexts and associated with bridging social capital and social contagion.
• Average Clustering Coefficient: The clustering coefficient is a measure of the embeddedness of a node in its neighborhood. The average gives an overall indication of the clustering in the network, and high values are associated with a "small-world" effect [3].
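These egocentric metrics are straightforward to compute once a friendship graph has been extracted, for example via an API crawl. The sketch below is a minimal illustration using the open-source NetworkX library on a synthetic graph; the graph and variable names are placeholders rather than data from the studies cited here.

```python
# Minimal sketch: computing the egocentric network metrics listed above
# with NetworkX on a synthetic graph (illustrative only).
import networkx as nx
from networkx.algorithms import community

# Placeholder social graph; real studies would load friendship data instead.
G = nx.watts_strogatz_graph(n=100, k=6, p=0.1, seed=42)

ego = nx.ego_graph(G, 0)                      # egocentric network around node 0
print("Network size:", ego.number_of_nodes())
print("Network density:", nx.density(ego))

degrees = [d for _, d in ego.degree()]
print("Average degree:", sum(degrees) / len(degrees))

if nx.is_connected(ego):                      # path metrics need a connected graph
    print("Average path length:", nx.average_shortest_path_length(ego))
    print("Diameter:", nx.diameter(ego))

print("Connected components:", nx.number_connected_components(ego))
print("Average clustering coefficient:", nx.average_clustering(ego))

# Modularity of a community partition found by a greedy heuristic.
parts = community.greedy_modularity_communities(ego)
print("Modularity:", community.modularity(ego, parts))
```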
Users often make decisions about whether and how they use an SNS based on the perceived privacy implications of their actions. However, privacy is a complex concept that has presented challenges to the social media ecosystem. One key issue is the tradeoff between providing users with advanced new features that mine their data to provide relevant content but lead to negative effects in terms of how users perceive their privacy [6]. Attempting to understand this topic further, boyd [5] argues that in the context of the social web, privacy violations are common because mediated publics exhibit certain properties that are not present in unmediated publics, namely persistence, searchability, replicability, and invisible audiences. Researchers studying the social implications of privacy have concluded that the right to privacy can be considered a social stratifier that divides users into classes of haves and have-nots, thus creating a privacy divide [3].

The privacy of SNS information is a particularly pertinent topic of study because research reports that users find it challenging to understand the privacy implications of SNSs. For instance, recent research has shown that the current Facebook privacy controls allow users to effectively manage threats from outsiders, but are poor at mitigating concerns related to members of a user's existing friend network [3]. Similarly, a study on Facebook apps found abundant misunderstandings and confusion about how apps function and how they manage, use, and share profile data [4].

Investigating the uses and gratifications of a social network site can provide powerful descriptive and explanatory insights into the mechanisms that drive users' behaviors. Nationality showed a significant effect on the regression model for the first privacy question, with participants from the USA being less concerned about their privacy on Facebook, possibly because they are more tech savvy and comfortable with this online medium. On the other hand, nationality did not have a significant effect on the second privacy question, but two of the motives for use did. Specifically, users who were motivated by communication opportunities with like-minded people were found to be more likely to report tweaking their privacy settings. From the factor's description we know that these people tend to be more enthusiastic about organizing or joining events and groups. This may be because they feel more comfortable in familiar settings and therefore have increased suspicion of strangers or companies on Facebook. Furthermore, since events predominantly take place offline and a popular use of groups is to organize offline meetings, it may be that these people have greater experience of the implications of Facebook privacy settings for offline social interaction. The fact that the Content motive was positively associated with frequently changing privacy settings may be because people who frequently use applications and interactive content on Facebook have taken the time to understand the privacy implications of installing such dynamic features. Interestingly, the newsfeed feature, which caused a large backlash with regard to privacy when it was first introduced [6], does not show a significant effect on users' perceived privacy. Furthermore, a substantial discrepancy was observed between the motives of people who report being concerned about their privacy on Facebook and those who actually change their privacy settings.

2.2 A New Social Media Security Model (SMSM)
Previous security methods have been proposed to reduce the vulnerabilities encountered by most social networks without altering the usability of the system. These models emphasised user
education and policy implementation. However, these approaches have had very limited success, as users are notoriously non-compliant with such methodology. Some work has been done to explore the effects of social networks in disseminating information about successful social engineering attacks. Coronges et al. (2012) developed an experiment in which phishing emails were created and sent to a controlled user population. Their results showed that local leadership within an organisation appeared to influence security vulnerability but had no influence on security resilience or phishing prevention. Joshi and Kuo (2011) investigated mathematical formulations and computational models for the security and privacy of social networks. In their empirical study, they presented several ways an attacker can take advantage of the security vulnerabilities in an online social network. Among the vulnerabilities discussed were social phishing attacks and neighbourhood attacks, a form of privacy attack that gives the attacker sufficient information about the target. Furthermore, their work noted that recent advances in programming language techniques demonstrate that it is possible to build online services with strict privacy policies. Unfortunately, service providers need private data to generate revenue from their 'free' service, limiting the application of strict security and privacy measures. It is therefore important to clearly identify the main vulnerabilities of current social media [7].

Key vulnerabilities of social media:
• Single Factor Authentication. Single factor authentication is the conventional security procedure which requires a username/email and password before a user is granted access to the system. The security methods involved in authenticating account owners before they gain access to a system started as a debate in the industry and have now become the greatest cause of concern (Altinkemer, 2011). The sophistication of social engineering tools used by attackers poses a serious threat to the integrity, reputation and lives of users, which may result in a sudden decline of social media usage (Parwani et al., 2013) [8].
• Profile Cloning. The easiest technique used in stealing the identity of a social network's user is called profile cloning. With the current structural outlook of profile pages on Facebook, it is very easy to clone an original user and perform malicious activities with the cloned account before the victim or the platform providers discover it. The profile pages of Facebook have no particular unique separation from one another, and in cases where two users coincidentally share the same first name, surname and profile image, the true owner of the account is left for the viewer to identify (Kumar et al., 2013). There are two types of profile cloning: existing profile cloning and cross-site profile cloning.
• Phishing Attacks. Phishing can be described as a malicious craft used by social engineers with the aim of exploiting the vulnerabilities of a system, made easy through the ignorance of end-users (Khonji et al., 2013).
• Watering Hole Malware Attack. In this method of malware attack, the attacker guesses which websites members of an organization normally visit and then infects those websites with malware, in the hope that the computer of a user within the target organisation will be infected when an infected site is visited.
It was discovered in January 2013 that a watering hole attack had targeted developers working at Facebook [10].
• Multiple Login Sessions. The security vulnerabilities created by allowing multiple login sessions can result in both data theft and economic consequences (Imafidon & Ikhalia, 2013). On websites that require users to pay before using the service (e.g., an online movie store), allowing multiple concurrent login sessions on one account lets the user and other users share the benefits of a single subscription regardless of their remote locations.
• Weak Password Creation. Passwords can be guessed, cracked, deliberately shared or stolen. Most social networking sites require users to create passwords of no fewer than 6 characters during account registration. The passwords created are usually hashed or encrypted in the database using MD5 or SHA-1 hash functions (Kioon et al., 2013). Unfortunately, users are led to believe that when the passwords are hashed in the database, it is impossible to recover them if the system is compromised [9]. (A brief illustration follows this list.)
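That belief is misplaced when a fast, unsalted hash such as MD5 or SHA-1 is used: identical passwords always produce identical digests, so a leaked password table can be attacked with precomputed or dictionary lookups. The snippet below is a small illustration of that point and of why salting (and, better, a deliberately slow key-derivation function) helps; the sample passwords and iteration count are arbitrary, and the code is not taken from any system discussed here.

```python
# Illustration (not from the paper): why unsalted MD5/SHA-1 password storage is weak.
import hashlib
import os

leaked_md5 = hashlib.md5(b"qwerty123").hexdigest()    # what an attacker finds in a dump

# A tiny "dictionary attack": hash common passwords and compare digests.
common_passwords = [b"123456", b"password", b"qwerty123", b"letmein"]
for guess in common_passwords:
    if hashlib.md5(guess).hexdigest() == leaked_md5:
        print("Recovered password:", guess.decode())

# Salting makes precomputed tables useless: identical passwords now hash differently.
salt_a, salt_b = os.urandom(16), os.urandom(16)
digest_a = hashlib.sha256(salt_a + b"qwerty123").hexdigest()
digest_b = hashlib.sha256(salt_b + b"qwerty123").hexdigest()
print(digest_a == digest_b)    # False

# In practice a deliberately slow KDF is preferable to a bare hash.
kdf = hashlib.pbkdf2_hmac("sha256", b"qwerty123", salt_a, 200_000)
```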
The proposed social media security model (SMSM) comprises the following measures:
• Modification of the Password Classification Algorithm: This model proposes that passwords should be classified according to their level of security using more declarative terms such as 'very unsafe', 'unsafe', 'not secure', 'a little secure', 'secure' and 'very secure'. This can be done by ensuring that passwords created by users consist of combinations of uppercase letters, lowercase letters, numbers and at least two special characters. Furthermore, this security feature will display an instant message to the user using one of the aforementioned declarative terms, based on the character combinations selected (Vijaya et al., 2009) [11]. (A minimal sketch of such a classifier appears after this list.)
• Embedding Unique Usernames on Profile Pages: This security feature is proposed to solve the problem of profile cloning within the site. Many social networking sites only display the first name and surname of users on their profile pages, which opens up certain security vulnerabilities when more than one user coincidentally shares the same first name and surname. These vulnerabilities may allow a malicious user to extort money or commit an online crime under the guise of the victim, which could take time to detect. In addition, embedding unique usernames on profile pages within the site will make it very difficult for a malicious user to steal the identity of a genuine user and act on their behalf (Chen et al., 2009).
• Re-authenticating Registered Users During Activation: This method ensures that social media accounts must be activated before full access is granted. The approach generates a 25-character code for each user, and the system sends the code to their email address to confirm ownership of the email used for registration. When users attempt to activate the account by clicking the activation link sent to their email address, they are redirected to another page and are required to provide the 25-character code, the username and the password before full activation is complete. This security mechanism helps prevent web robots from undermining the validation method of the social network. Furthermore, it increases security consciousness in every user subscribing to the service and protects a user whose email address has been compromised (Fu, 2006) [12].
• Email-Based Two-Factor Authentication: The technique used to implement two-factor authentication in this model is an email-based approach. When a user initiates a login session and passes the first stage of authentication (the traditional username/email and password), the system sends a randomly generated one-time password token to the user's email address and redirects the user to another page within the site that requires that one-time password (Ikhalia & Imafidon, 2013). The user must retrieve the randomly generated token from their email and supply it before access to the system is granted. This security model is now in high demand in industry, and implementing an email-based two-factor authentication method is feasible and cost-effective compared with the SMS approach. This enhancement will reduce the security vulnerabilities faced by social media victims of spear phishing, session hijacking, and identity and data theft (Yee, 2004).
• Make Private Message Notification 'Private': The purpose of this security functionality is to protect the confidentiality of a user whose email address has been compromised. This model only allows a notification, rather than the message content, to be sent to a user's email when a private message is sent to that user within the social networking site. In other words, the user must navigate to their private message inbox within the site to read the content of their messages. This protects users whose emails have been compromised and users who have lost access to their email accounts (Joshi & Kuo, 2011).
• Restrict Unauthorised Access to Users' Profile Information: One of the key phrases used to define social media by Dabner (2012) is "a web-based service that allows individuals to construct a public or semi-public profile within a bounded system"; therefore, only registered users should have access to the information shared within the site. This proposed security enhancement prevents external users from viewing the profile information of registered users within the site (Ijeh, Preston & Imafidon, 2009). A major benefit of this feature is that it reduces the difficulties digital forensic investigators face in tracking down malicious attackers who hijack users' information for cross-site profile cloning (Ademu & Imafidon, 2012).
When this security model is implemented, an attacker must register in the system before a malicious activity can be performed, making it easier for the attacker to be traced, as most social networks store the IP addresses of registered users and update them when they log in (Joshi & Kuo, 2011) [7].
• Prevent Multiple Login Sessions: The system prevents users from creating multiple sessions at the same time. The implementation of this security mechanism ensures that users have control over their accounts. This enhancement also lets users know when a cyber intruder is accessing their account, because a message is shown to the user if a session has already been established on the account (Choti et al., 2012).
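The first measure above describes, in effect, a small classification algorithm: check which character classes a candidate password contains and map the result to a declarative label. The sketch below is one hypothetical way to express that rule; the six labels come from the model description, while the function name, scoring and thresholds are illustrative assumptions rather than the authors' specification.

```python
import re

# Illustrative sketch of the SMSM password-classification idea: count which
# character classes are present and map the score to a declarative label.
# The thresholds and weighting are assumptions, not the authors' specification.
LABELS = ["very unsafe", "unsafe", "not secure", "a little secure", "secure", "very secure"]

def classify_password(password: str) -> str:
    checks = [
        len(password) >= 6,                                  # minimum length
        bool(re.search(r"[A-Z]", password)),                 # uppercase letters
        bool(re.search(r"[a-z]", password)),                 # lowercase letters
        bool(re.search(r"[0-9]", password)),                 # digits
        len(re.findall(r"[^A-Za-z0-9]", password)) >= 2,     # at least two special characters
    ]
    return LABELS[sum(checks)]

print(classify_password("abc"))            # falls in the unsafe region
print(classify_password("Str0ng!pass#1"))  # very secure
```

A real deployment would show the label to the user as instant feedback while the password is being typed, as the model suggests.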
2.3 Privacy Issues of Social Networks Based on the Facebook Example
Helen Nissenbaum's work offers an analytical framework for understanding the commonplace notion that privacy is "contextual." Nissenbaum's account of privacy and information technology is based on what she takes to be two non-controversial facts. First, there are no areas of life not governed by context-specific norms of information flow. For example, it is appropriate to tell one's doctor all about one's mother's medical history, but it is most likely inappropriate to share that information with casual passersby. Second, people move into and out of a plurality of distinct contexts every day. Thus, as we travel from family to business to leisure, we will be traveling into and out of different norms for information sharing. As we move between spheres, we have to alter our behaviors to correspond with the norms of those spheres, and though there will always be risks that information appropriately shared in one context becomes inappropriately shared in a context with different norms, people are generally quite good at navigating this highly nuanced territory.

On the basis of these facts, Nissenbaum suggests that the various norms are of two fundamental types. The first, which she calls norms of "appropriateness," deal with "the type or nature of information about various individuals that, within a given context, is allowable, expected, or even demanded to be revealed" (2004, 120). In her examples, it is appropriate to share medical information with doctors, but not appropriate to share religious affiliation with employers, except in very limited circumstances. The second set are norms of "distribution," and cover the "movement, or transfer of information from one party to another or others" [17]. She grounds these norms partly in work by Michael Walzer, according to which "complex equality, the mark of justice, is achieved when social goods are distributed according to different standards of distribution in different spheres and the spheres are relatively autonomous" [14]. Information is such a social good. Thus, in friendship, confidentiality is a default rule: it may be appropriate to share the details of one's sex life with a friend, but it is not appropriate for that friend to broadcast the same information on her radio show. In medical situations, on the other hand, a certain sharing of information—from a radiologist to a primary care physician, for example—is both normal and expected [13].

The use of Nissenbaum's account extends well beyond the public surveillance context to which she applies it. Evidence from social networking sites, and Facebook in particular, suggests that contextual integrity will be an apt analytical framework in that context as well. First, and unlike popular perceptions and perhaps unlike other SNSs, almost all of the evidence suggests that Facebook users primarily use the site to solidify and develop their offline social relationships, rather than to make new relationships online. Information on Facebook would therefore predictably tend to mirror the offline tendency to contextual situatedness. Second, evidence about social conflicts on Facebook suggests that one of the greatest sources of tension is when information that would remain internal to one context offline flows to other contexts online. This is in large part an interface issue—there is no easy way for a user to establish and maintain the many separate and sometimes overlapping social spheres that characterize offline life. If these fluid spheres are difficult to explicitly create on Facebook, then managing information flows among them is likely to be very difficult. Indeed, the underlying architecture of Facebook is largely insensitive to the granularity of offline social contexts and the ways norms differ between them. Instead, the program assumes that users want to project a single unified image to the world. These contextual gaps are endemic to SNSs, and a substantial source of privacy problems. For example, the 'status line' generally gets projected indiscriminately to all friends. One can think of the status line
as an answer to the questions "How's it going?" or "What's up?" However, in offline life one tailors responses to the audience one is with. While Facebook now allows users to create multiple friend groups and then control which groups can see their status, this requires explicit, cumbersome effort and falls far short of reflecting the many complex and overlapping spheres of offline life [15].

Two recent Web phenomena, which pre-date social networking sites, highlight some of the privacy issues that are relevant to Facebook and other social networking sites. The first is blogging. The number of people who post their thoughts online in Web logs is staggering, and growing rapidly. Information posted to the Internet is potentially visible to all. For most people, such universal broadcast of information has no parallel offline. In other words, offline personal information is seldom communicated to a context anywhere near as broad as the entire Internet. As social networking sites continue to grow and increasingly integrate with the rest of the Internet, they should be expected to inherit some of these issues. At the same time, the contextual situation for social networking is made both more complex and more difficult by the import of the offline concept of "friend." Information flows on social networking sites are mediated not just by the global nature of Internet communication, but by the ways that those sites and their users interpret the meaning of online friendship and the social norms that go with it. Not surprisingly, empirical research indicates that SNS users (possibly excepting younger teens on Myspace) tend to construct their identities relationally, becoming and expressing who they are by way of highlighting whom they associate with (Papacharissi 2009; Livingstone 2008) [16].

The precursors to social networking (blogging and webcams) demonstrated two phenomena—the risk of collateral privacy damage, and the plasticity of norms—and these phenomena are also evident in recent changes to Facebook. We will examine the case of collateral damage first, because its analysis is more straightforward than the plasticity of norms. In both cases, we will suggest that the difficulties lie in the tensions between the tendency of offline social contexts to be highly granular and specific, and the tendency of social network sites to mirror the more global context of the Internet, even as they substantially nuance that context through the emerging norms of online friendship. This granularity gap needs to be addressed through interface and design decisions that highlight, rather than obscure, the new ways that information travels online [17].

Another serious violation of the norms of distribution is that the developers behind the applications are also largely invisible, obscuring the fact that the information is in fact leaving the confines of Facebook and not just going to the user's friends. A college student in Germany can create a quiz application that matches people with the beer they are most like, running it on his computer in his dorm. Users accessing this application will likely be completely unaware that their profile information, and their friends' profile information, can be accessed by a student in a dorm room in another country.
While a third-party application developer, such as the student in the dorm room in Germany, may be acting completely ethically and only accessing the profile information needed for the application, there is no guarantee of this, and there is little vetting process: anyone can become a Facebook application developer [15]. The invisibility of information flows presents a particular problem because when we do not know what is being done with our information, we have no ability to contest it. If the architecture and interface of Facebook essentially hide the amount of information that is shared with third parties, then there is little that a user can do about that sharing. Indeed, there is little that she can do to avoid the situation, other than decline to use third-party applications. The choice for users is binary: install the application and give full access to their own and their friends' personal information, or don't use the application at all. We can imagine designing a number of different mechanisms that could provide greater transparency (and in some cases, greater control) of these information flows.
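One such mechanism would be a finer-grained consent model, in which an application declares exactly which profile fields it needs and the platform releases only the fields the user has approved, replacing the all-or-nothing choice described above. The sketch below is purely hypothetical: the data model, field names and method names are invented for illustration and do not describe Facebook's actual platform API.

```python
# Hypothetical sketch of a finer-grained consent check for third-party apps.
# Field names and the data model are illustrative, not a real platform API.
from dataclasses import dataclass, field

@dataclass
class Profile:
    data: dict                                             # e.g. {"name": ..., "birthday": ...}
    approved_scopes: dict = field(default_factory=dict)    # app_id -> set of approved fields

    def grant(self, app_id: str, fields_: set) -> None:
        """User explicitly approves specific fields for one application."""
        self.approved_scopes.setdefault(app_id, set()).update(fields_)

    def release(self, app_id: str, requested: set) -> dict:
        """Return only the intersection of requested and approved fields."""
        allowed = self.approved_scopes.get(app_id, set()) & requested
        return {k: v for k, v in self.data.items() if k in allowed}

profile = Profile({"name": "Alice", "birthday": "1990-01-01", "friends": ["Bob", "Carol"]})
profile.grant("beer_quiz", {"name"})             # user approves only the name
print(profile.release("beer_quiz", {"name", "birthday", "friends"}))
# {'name': 'Alice'} -- the quiz never sees the birthday or the friend list
```

The design choice illustrated here is simply that disclosure is decided per field and per application, rather than once for the whole profile.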
The backlash was in part generated by a sense that the News Feed violated the privacy of users. In terms of Nissenbaum's framework, the analogy would be to the case of public records being placed online. For public records, she suggests, the privacy problem is one of the scope of the distribution of the information. It is not that there is any more information publicly available; it is that it is a lot easier to get to the information. It is no longer necessary to travel to a (possibly remote) local courthouse. When an activity is easier, more people will do it. Thus, for example, idle curiosity about "who owns that house, and what they paid for it" becomes a topic that one might actually pursue online, when the same topic would be too much like work to pursue offline. As a result, people have less de facto privacy. Facebook similarly increased the efficiency of accessing information that was already in principle accessible; in so doing, it reduced the de facto privacy of its users.

A more significant difference occurs on the other end, as newsfeeds do not just automate the process of receiving updates; they automate the process of sending them. Here a significant difference from holiday letters emerges: the process of sending those letters is inefficient, in that it involves individually addressing and stuffing envelopes for every update. If the letter says something that a recipient might find offensive, he gets a card but not the letter. If the letter is going to someone that a person no longer cares to keep up with, she might not send it, and might even remove the person from her list. In other words, being offline in this case introduces a necessary pause between composing the letter and sending it, and this encourages the list itself being subject to at least a casual vetting procedure. No such procedural encouragement exists for Facebook updates. Nothing in the process of sending updates encourages one to even notice who the full audience is, much less to assess who should be in it.

Retention of updates, combined with reminders to that effect, would further encourage users to view their identities online as constructed and performative, moving Facebook closer to the world of Jennicam. This encouragement would be magnified even more by a "Mary has just looked at all of your updates" option; users would increasingly view their Facebook identities as subject to constant surveillance, and modify them accordingly. If I knew that Mary always looked at all my updates, I might update with her in mind. More generally, users would be encouraged to have updates resemble the news wires from which the model was taken. Whether or not this is a "good" model for online friendship on a social networking site could then be discussed by users who were encouraged to form a mental model that accurately reflected the various flows of information on the site. An intellectual property regime that grants strong proprietary rights and few exceptions favors mass media over independent producers of speech. Even the placement of advertising on portal sites matters. Recent work comparing Facebook with other SNSs suggests that the architectural features of those sites will tend to facilitate certain kinds of social interactions and not others. The preceding analysis underscores the important relations between privacy norms and application and interface design.
Two points bear emphasis. First, at the core of the privacy issues discussed here is a gap between the granularity of offline social contexts and those on Facebook. One of the greatest strengths of Nissenbaum's contextual integrity framework is that it highlights the importance of social contexts to privacy norms. The application of Nissenbaum's work to Facebook underscores the extent to which the granularity differences are driving many of the surface-level privacy issues. Facebook, as we have argued, does not adequately reflect this insight. As a result, offline contexts are more easily differentiated and kept separate than those on Facebook. To the extent that Facebook users use the program to develop offline social ties, this granularity gap will inevitably produce conflicts.

Second, this gap shows that part of what is at stake is the definition of what it means to be a "friend" on Facebook. The switch to the News Feed shows that the norms for online friendship are in a state of considerable flux. While no one would claim that a Facebook friend is the same as an offline friend, the fact that there are many overlapping cases means that the shared vocabulary sometimes leads us to a misperception of sameness. In fact, the concepts of a 'friend' on Facebook and an offline friend may be diverging even further, making this difference bigger. The News Feed makes updates a lot like blogging, and Applications put users in the position of treating their friends' information as commodities to exchange for access to whatever the application in question does. If Facebook norms for friendship move further away from those offline, we would expect users to model their online behavior accordingly. While few of them might endorse the sense of complete publicity that
Jennicam embraced, publicity at nearly that level is enabled by the combination of the News Feed and a user who posts constant updates.

2.4 Social Media, Privacy and Security
Dr Ackland [18] explained that social media use leads to vast digital repositories of searchable content such as web pages, blog posts, newsgroup posts and Wikipedia entries. Such repositories can be thought of as information public goods that have economic value to business because they enable people to find information efficiently – for example, by using Google to search for answers to technical questions. For governments, the value of social media lies in the potential for direct dialogue between citizens and government, promoting participatory democracy (that is, broader participation in the direction and operation of government). However, people are often not involved in social media for the instrumental reasons of creating information public goods or strengthening democracy but for social reasons and to express their individual or collective identity. Much of the time people's use of social media is expressive (like voting, or cheering at the football) rather than instrumental, and people tend to be intrinsically motivated to participate – that is, for the inherent enjoyment of the activity – rather than being extrinsically motivated (for example, by the expectation of external rewards).

Two issues were identified that may reduce the value of social media to business and government by adversely affecting people's motivation to participate:
1. Game mechanics. This was not covered in detail but refers to the use of rewards such as badges on foursquare, which may encourage some users but deter others; it may therefore have the effect of selecting particular types of users to participate, and may also change the way in which people engage with online environments.
2. Privacy. The second issue identified as potentially diminishing people's motivation to participate in social media, and the issue of particular relevance in the context of this workshop, was privacy.

Existing research [20] is mostly survey research focusing on a particular demographic: youth. For example:
• Sonia Livingstone [21] found that younger teenagers use social media for creating 'identity through display' while older teenagers create 'identity through connection'. Different risks are associated with each of these behaviours.
• Zeynep Tufekci's quantitative study of US college students [18] looked at how privacy concerns affect students' willingness to disclose information online. This research suggested that 'treating privacy within a context merely of rights and violations is inadequate for studying the Internet as a social realm'.

Further research is needed. Most existing research has focused on youth in wealthy countries, but there is a need for research on other demographic groups. For understanding how people's participation in Government 2.0 might be affected by privacy and security concerns, for example, it will be important to include demographic groups more relevant to these initiatives, as youth tend not to be involved in political activities online [18]. Dr Ackland drew attention to a study of older Australians and their social media use (funded by an ARC Linkage Grant) being undertaken by the Australian Demographic and Social Research Institute at The Australian National University in partnership with National Seniors Australia.
He noted that research into privacy and Government 2.0 is generally not based on data and is mainly speculation at this stage. Little is known about people's attitudes towards the potential for governments to use their data for purposes other than policymaking, such as surveillance and law enforcement. He also discussed developments in network science and the mining of social graphs (that is, the social connections between people or organisations). Social media use leads to digital traces of activity that are permanent, searchable and cross-indexable. New tools are being developed to support this type of research through data mining, network visualisation and analysis, including:
• The Virtual Observatory for the Study of Online Networks (VOSON) System (http://voson.anu.edu.au), developed by Dr Ackland.
• SAS Social Media Analytics.
• NodeXL, which enables social network data such as that gathered by VOSON to be imported into Excel.
It was noted [23] that there is an important difference between individual privacy and social graph security. Network scientists increasingly have the ability to mine social graphs and place people in groups by identifying clusters in social networks and by predicting the existence of social connections. Protecting the social graph is more important – and more difficult – than protecting personal data.
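Both capabilities mentioned here, grouping people by detecting clusters and predicting unobserved connections from shared neighbours, can be illustrated with standard network-analysis tooling. The sketch below uses the NetworkX library on a synthetic graph; it illustrates the general techniques only and is not based on VOSON, NodeXL or any dataset from the studies cited above.

```python
# Illustration: clustering users into groups and predicting missing links.
import networkx as nx
from networkx.algorithms import community

# Placeholder graph with built-in community structure (not real social data).
G = nx.connected_caveman_graph(4, 6)   # 4 tight groups of 6 people

# 1. Place people in groups by detecting communities.
groups = community.greedy_modularity_communities(G)
for i, members in enumerate(groups):
    print(f"group {i}: {sorted(members)}")

# 2. Predict the existence of unobserved connections from shared neighbours.
candidates = sorted(
    nx.jaccard_coefficient(G),         # scores all currently non-adjacent node pairs
    key=lambda triple: triple[2],
    reverse=True,
)
for u, v, score in candidates[:5]:
    print(f"likely tie: {u} - {v} (score {score:.2f})")
```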
Combined, these two features imply that information provided even on ostensibly private social networks is, effectively, public data that could exist for as long as anybody has an incentive to maintain it. Many entities, from marketers to employers to national and foreign security agencies, may have those incentives. At the most basic level, an online social network is an Internet community where individuals interact, often through profiles that (re)present their public persona (and their networks of connections) to others. Although the concept of computer-based communities dates back to the early days of computer networks, only after the advent of the commercial Internet did such communities meet with public success. Following the SixDegrees.com experience in 1997, hundreds of social networks sprang up online (see [24] for an extended discussion), sometimes growing very rapidly and thereby attracting the attention of both media and academia. In particular, [30] have taken ethnographic and sociological approaches to the study of online self-representation; [25] have focused on the value of online social networks as recommender systems; [25] have discussed information sharing and privacy on online social networks, using FB as a case study; [26] have
demonstrated how information revealed in social networks can be exploited for "social" phishing; [28] has studied identity-sharing behavior in online social networks.

FB requires a college email account for a participant to be admitted to the online social network of that college. As discussed in [28], this increases the expectations of validity of the personal information provided there, as well as the perception of the online space as a closed, trusted, and trustworthy community (college-oriented social networking sites are, ostensibly, based "on a shared real space" [29]). However, there are reasons to believe that FB networks more closely resemble imagined [30] communities (see also [28]): in most online social networks, security, access controls, and privacy are weak by design; the easier it is for people to join and to find points of contact with other users (by providing vast amounts of personal information, and by perusing equally vast amounts of data provided by others), the higher the utility of the network to the users themselves, and the higher its commercial value for the network's owners and managers. FB, unlike other online networks, offers its members very granular and powerful control over the privacy (in terms of searchability and visibility) of their personal information. Yet its default privacy settings are very permeable: at the time of writing, by default participants' profiles are searchable by anybody else on the FB network, and actually readable by any member at the same college and geographical location.

Based on the research in [27], we can draw some conclusions. A total of 506 respondents accessed the survey. One hundred and eleven (21.9%) were not currently affiliated with the college Institution where we conducted our study, or did not have an email address within that Institution's domain; they were not allowed to take the rest of the survey. A separate set of 32 (8.2%) participants had taken part in a previous pilot survey and were also not allowed to take the survey. Of the remaining respondents, 318 subjects actually completed the initial calibration questions. Out of this set, 278 (87.4%) had heard about FB and 40 had not. In this group, 225 (70.8%) had a profile on FB, 85 (26.7%) never had one, and 8 (2.5%) had an account but deactivated it. Within those three groups, respectively 209, 81, and 7 participants completed the whole survey. We focus our analysis on that set, from which we further removed 3 observations from the non-members group, since we had reasons to believe that the responses had been created by the same individual. This left us with a total of 294 respondents.

Age and student status are correlated with FB membership, but what else is? Of course, having heard of the network is a precondition for membership. Thirty-four participants had never heard of the FB: nearly half of the staff that took our survey, a little less than 23% of the graduate students, and a negligible portion of the undergraduate students (1.59%). However, together with age and student status (the two obviously being highly correlated), another relevant distinction between members and non-members may arise from privacy attitudes and privacy concerns. Before we asked questions about FB, our survey ascertained the privacy attitudes of participants with a battery of questions modelled after Alan Westin's studies [26], with a number of modifications.
In particular, in order not to prime the subjects, questions about privacy attitudes were interspersed with questions about attitudes towards economic policy and the state of the economy, social issues such as same-sex marriage, or security questions related to the fear of terrorism. In addition, while all instruments asked the respondent to rank agreement, concern, worries, or importance on a 7-point Likert scale, the questions ranged from general ones (e.g., "How important do you consider the following issues in the public debate?"), to more and more specific ones (e.g., "How do you personally value the importance of the following issues for your own life on a day-to-day basis?"), and personal ones (e.g., "Specifically, how worried would you be if" [a certain scenario took place]).

"Privacy policy" was on average considered a highly important issue in the public debate by our respondents (mean on the 7-point Likert scale: 5.411, where 1 is "not important at all" and 7 is "very important"; sd: 1.393795). In fact, it was regarded as a more important issue in the public debate than the threat of terrorism (t = 2.4534, Pr > t = 0.0074; the statistical significance of the perceived superiority was confirmed by a Wilcoxon signed-rank test: z = 2.184, Pr > |z| = 0.0290) and same-sex marriage (t = 10.5089, Pr > t = 0.0000; Wilcoxon signed-rank test: z = 9.103, Pr > |z| = 0.0000), but less important than education policy (mean: 5.93; sd: 1.16) or economic policy (mean: 5.79; sd: 1.21) [24]. The slightly larger mean valuation of the importance of privacy policy over environmental policy was not significant. (These results are comparable to those found in previous studies, such as [29].) The same ranking of values (and comparably statistically significant differences) was found when asking "How do you
personally value the importance of the following issues for your own life on a day-to-day basis?" The mean value for the importance of privacy policy was 5.09. For all categories, subjects assigned slightly (but statistically significantly) more importance to the issue in the public debate than in their own life on a day-to-day basis (in the privacy policy case, a Wilcoxon signed-rank test returns z = 3.62, Pr > |z| = 0.0003 when checking the higher valuation of the issue in the public debate) [24]. Similar results were also found when asking about the respondents' concern with a number of issues directly relevant to them: the state of the economy where they live, threats to their personal privacy, the threat of terrorism, and the risks of climate change and global warming. Respondents were more concerned (with statistically significant differences) about threats to their personal privacy than about terrorism or global warming, but less concerned than about the state of the economy.

Privacy concerns are not equally distributed across the FB member and non-member populations: a two-sided t test that the mean Likert value for the "importance" of privacy policy is higher for non-members (5.67 in the non-members group, 5.30 in the members group) is significant (t = -2.0431, Pr < t = 0.0210) [24]. Similar statistically significant differences arise when checking for the level of concern for privacy threats and for worries associated with the privacy scenarios described above. The test becomes slightly less significant when checking for member/non-member differences in the assigned importance of privacy policy on a day-to-day basis. Importantly, in general no comparable statistically significant differences between the groups can be found in other categories. For example, worries about the global warming scenario receive a mean Likert valuation of 5.36 in the members sample and 5.4 in the non-members sample. (A statistically significant difference can be found, however, for the general threat of terrorism and for the personal worry over marriage between two people of the same sex: higher values in the non-members group may be explained by their higher mean age.)

In order to understand what motivates even privacy-concerned individuals to share personal information on the Facebook, we need to study what the network itself is used for. Asking participants this question directly is likely to generate responses biased by self-selection and fear of stigma. Sure enough, FB members overwhelmingly deny that FB is useful to them for dating or self-promotion. Instead, members claim that the FB is very useful to them for learning about and finding classmates (4.93 mean on a 7-point Likert scale) and for making it more convenient for people to get in touch with them (4.92) [24], but deny any usefulness for other activities. Other possible applications of FB, such as dating, finding people who share one's interests, getting more people to become one's friends, or showing information about oneself/advertising oneself, are ranked very low. In fact, for those applications, the relative majority of participants chooses the minimal Likert point to describe their usefulness (coded as "not at all" useful). Still, while the mean Likert value remains low, male participants find FB slightly more useful for dating than female participants do. Often, survey participants are less privacy conscious than non-participants; a minimal sketch of the kinds of tests used in these comparisons is given below.
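The comparisons above rely on standard paired and two-sample significance tests over Likert ratings. The following is a purely illustrative sketch, not the authors' original analysis: the data are randomly generated stand-ins for the survey responses, and SciPy is assumed as the statistics library.

```python
# Illustrative only: synthetic 7-point Likert data standing in for the survey responses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Paired ratings by the same respondents (e.g. privacy policy vs. threat of terrorism).
privacy_importance = rng.integers(3, 8, size=300)     # Likert scores in 3..7
terrorism_importance = rng.integers(2, 8, size=300)   # Likert scores in 2..7

t_paired = stats.ttest_rel(privacy_importance, terrorism_importance)
signed_rank = stats.wilcoxon(privacy_importance, terrorism_importance)
print("paired t-test:", t_paired)
print("Wilcoxon signed-rank:", signed_rank)

# Independent groups (e.g. FB members vs. non-members) compared with a two-sided t-test.
members = rng.integers(3, 8, size=209)
non_members = rng.integers(4, 8, size=85)
t_two_sample = stats.ttest_ind(members, non_members, equal_var=False)
print("two-sample t-test (members vs. non-members):", t_two_sample)
```

With the real survey data, the synthetic arrays would simply be replaced by the recorded Likert scores for each respondent and group.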
For obvious reasons, such self-selection bias is particularly problematic for survey studies that focus on privacy. Are our respondents a biased sample of the Institution's FB population, biased in the sense that they provide more information than the average FB member? We did not find strong evidence of that. Since we mined the network before the survey was administered, we were able to compare information revelation by survey participants and non-participants. It is true that, on average, our survey takers provide slightly more information than the average FB member. However, the differences in general do not pass a Fisher's exact test for significance, except for personal address and classes (where non-participants provide statistically significantly less information) and political views (where the difference is barely significant). Similarly, around 16% of respondents who expressed the highest concern for the scenario in which someone 5 years from now could know their current sexual orientation, partner's name, and political orientation nevertheless provide all three types of information, although we observe a decreasing share of members providing that information as their reported concerns increase. Still, more than 48% of those with the highest concern for that scenario reveal at least their current sexual orientation; 21% provide at least their partner's name (although we did not control for the share of respondents who are currently in relationships); and almost 47% provide at least their political orientation.

How knowledgeable is the average FB member about the network's features and their implications in terms of profile visibility? By default, everyone on the Facebook appears in searches by everyone else, and every profile at a certain Institution can be read by
every member of FB at that Institution. However, the FB provides an extensive privacy policy and offers users very granular control over what information to reveal to whom. As mentioned above, relative to a given FB member, other users can be friends, friends of friends, non-friend users at the same institution, non-friend users at a different institution, or non-friend users at the same geographical location as the user but at a different university (for example, Harvard vs. MIT). Users can select their profile visibility (who can read their profiles) as well as their profile searchability (who can find a snapshot of their profiles through the search features) by type of user. More granular control is given over contact information, such as phone numbers. And yet, among current members, 30% claim not to know whether FB grants any way to manage who can search for and find their profile, or think that they are given no such control. Eighteen percent do not know whether FB grants any way to manage who can actually read their profile, or think that they are given no such control. These numbers are not significantly altered by removing the 13 members who claim never to log in to their account. In fact, even frequency of login does not explain the lack of information for some members. On the other hand, members who claim to log in more than once a day are also more likely to believe that they have "complete" control over who can search their profile. Awareness of one's ability to control who can see one's profile is not affected by the frequency of login, but is affected by the frequency of update (a Pearson χ²(12) = 28.9182, Pr = 0.004 shows that the distribution is significant; a minimal sketch of such a contingency-table test is given below). Note the difference between the two graphs and, specifically, the distribution by frequency of update for respondents who answered "Do not know" or "No control" (graph on the right). Twenty-two percent of our sample do not know what the FB privacy settings are or do not remember if they have ever changed them. Around 25% do not know what the location settings are. To summarize, the majority of FB members claim to know about ways to control the visibility and searchability of their profiles, but a significant minority of members are unaware of those tools and options.

More specifically, we asked FB members to discuss how visible and searchable their own profiles were. We focused on those participants who had claimed never to have changed their privacy settings (which by default make their profile searchable by everybody on FB and visible to anybody at the same Institution), or who did not know what those settings were. Almost every such respondent realizes that anybody at their Institution can search their profile. However, 24% incorrectly do not believe that anybody on FB can in fact search their profile. Misunderstandings about visibility can also go in the opposite direction: for instance, 16% of current members believe, incorrectly, that anybody on FB can read their profile.
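The awareness-by-update-frequency association mentioned above is a standard Pearson chi-square test on a contingency table. The following is a minimal sketch with invented counts rather than the study's actual cross-tabulation, intended only to show the shape of such a test:

```python
# Illustrative only: a made-up awareness-by-update-frequency contingency table.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: update frequency (never, monthly, weekly, daily);
# columns: reported awareness of visibility controls (full, partial, none/don't know).
table = np.array([
    [12, 20, 18],
    [25, 30, 10],
    [40, 25,  6],
    [55, 18,  4],
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p_value:.4f}")
```

Substituting the study's real cross-tabulation for the invented counts would reproduce the χ² statistic quoted above.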
When asked to guess how many people could search for their profile on FB (respondents could answer by selecting one of the following options from a drop-down box: a few hundred, a few thousand, tens of thousands, hundreds of thousands, millions), the relative majority of members who did not alter their default settings answered, correctly, "Millions." However, more than half actually underestimated the number as tens of thousands or less. In short, the majority of FB members seem to be aware of the true visibility of their profile, but a significant minority vastly underestimate the reach and openness of their own profile.

Does this matter at all? In other words, would these respondents be bothered if they realized that their profile is more visible than they believe? The answer is complex. First, when asked whether the current visibility and searchability of the profile is adequate for them, or whether they would like to restrict or expand it, the vast majority of members (77% in the case of searchability; 68% in the case of visibility) claim to be satisfied with what they have: most of them do not want more or less visibility or searchability for their profiles (although 13% want less searchability and 20% want less visibility) than what they (correctly or incorrectly) believe they have. Secondly, FB members remain wary of who can access their profiles, but claim to manage their privacy fears by controlling the information they reveal. While respondents are mildly concerned about who can access their personal information and how it can be used, they are not, in general, concerned about the information itself, mostly because they control that information and, with less emphasis, because they believe they have some control over access to it. Respondents are fully aware that a social network is based on information sharing: the strongest motivators they report for providing more information are, in fact, "having fun" and "revealing enough information so that necessary/useful to me and other people to benefit from FaceBook." However, psychological motivations can also explain why information revelation seems disconnected from privacy concerns. When asked to express whether they considered
the current public concern for privacy on social network sites such as the FaceBook or MySpace to be appropriate (using a 7-point Likert scale), the majority of respondents in fact agree (from mildly to very much) with the idea that the information other FB members reveal may create privacy risks for those members (that is, the other members; average response on a 7-point Likert scale: 4.92), even though they tend to be less concerned about their own privacy on FB (average response on a 7-point Likert scale: 3.60; a Student's t test shows that this is significantly less than the concern for other members: t = -10.1863, Pr < t = 0.0000; a Wilcoxon matched-pairs test provides a similar result: z = -8.738, Pr < |z| = 0.0000). In fact, 33% of our respondents believe that it is either impossible or quite difficult for individuals not affiliated with a university to access the FB network of that university. "Facebook is for the students," says a student interviewed in [29]. But considering the number of attacks described in [30], or any recent media report on the usage of FB by police, employers, and parents, it seems that for a significant fraction of users the FB is only an imagined community.

In order to gauge the accuracy of the survey responses, we compared the answers given to a question about revealing certain types of information (specifically, birthday, cell phone, home phone, current address, schedule of classes, AIM screenname, political views, sexual orientation and the name of their partner) with the data from the actual (visible) profiles. We found that 77.84% of the answers were exactly accurate: if participants said that they revealed a certain type of information, that information was in fact present; if they wrote that it was not present, it was in fact not. A little more than 8% revealed more than they said they did (i.e. they claimed the information was not present when in fact it was). A little more than 11% revealed less than they claimed they did. In fact, 1.86% claimed that they provide false information on their profile (information is there that they claim is intentionally false or incomplete), and 0.71% have missing false information (they claimed the information they provide is false or incomplete, when in fact there was no information). We could not locate the FB profiles for 13 self-reported members that participated in the survey. Of the participants with CMU email addresses, 2 mentioned in the survey that they had restricted visibility, searchability, or access to certain contact information, and 3 wrote that not all CMU users could see their profile.

2.6 Social Media Compliance Policy
Social media are powerful communication tools but they carry significant risks to the reputation of the University and its members. A prominent risk arises from the blurring of the lines between personal voice and institutional voice. For the purposes of this policy, social media is defined as media designed to be disseminated through social interaction, created using highly accessible and scalable publishing techniques on the internet.
Examples of popular social media sites include, but are not limited to:
• LinkedIn
• Twitter
• Facebook
• YouTube
• MySpace
• Flickr
• Yammer
• Yahoo/MSN Messenger
• Wikis/Blogs
All members of the University using social media tools, including via personal accounts, must be aware that the same laws, professional expectations, and guidelines apply at all times. Any social media announcements not issued by the Corporate Communications team, which may include (but are not limited to) content, comments and opinions, must not give the impression that they are in any way the explicit positioning of the University of Liverpool. Any official University position
and comment must be approved by the University Corporate Communications Director or his or her representative. The Social Media Compliance Policy helps members of the University to use social media sites without compromising their personal security or the security of University information assets. The University has adopted the following principles, which underpin this policy:
• All information assets must be appropriately handled and managed in accordance with their classification.
• University information assets should be made available to all who have a legitimate need for them.
• The integrity of information must be maintained; information must also be accurate, complete, timely and consistent with other related information and events.
• All members of the University who have access to information assets have a responsibility to handle them appropriately and in accordance with their classification.
• Information asset owners are responsible for ensuring that the University classification scheme (which is described in the Information Security Policy) is used appropriately [31].
Below are examples of threats and risks associated with the use of social media.
Threats:
• Introduction of viruses and malware to the University network.
• Exposure of the University and its members through a fraudulent or hijacked presence, e.g. unofficial social media accounts.
• Unclear or undefined ownership of content rights to information posted on social media sites (copyright infringement).
• Use of personal accounts to communicate University-owned information assets.
• Excessive use of social media within the University.
Risks:
• Data leakage/theft.
• System downtime.
• Resources required to clean systems.
• Customer backlash/adverse legal actions.
• Exposure of University information assets.
• Reputational damage.
• Targeted phishing attacks on customers or employees.
• Exposure of customer information.
• Privacy violations.
• Network utilisation issues.
• Productivity loss.
• Increased risk of exposure to viruses and malware due to longer duration of sessions.
The University's Information Security Policy defines the categories which are assigned to University information assets. Those assets which are classified as Confidential, Strictly Confidential or Secret must not be posted on social media sites. Postings must not contain personal information concerning students, employees, or alumni. In addition, members of the University must be aware of copyright and the intellectual property rights of others and of the University. Members of the University should be aware of each social media site's terms and conditions and of the ownership of content before submitting. Members should familiarise themselves with key University policies, including:
• University IT Regulations
• Information Asset Classification Policy
• Copyright Policy
• Data Protection Policy
• Acceptable Use of Electronic Resources [31]
Communication via social media sites and tools must protect the University's institutional voice by remaining professional in tone and in good taste. Members of the University who use personal social media accounts must not give the impression that their social media site represents the explicit positioning of the University of Liverpool. This should be considered when:
• Naming pages or accounts
• Selecting a profile picture or icon
• Selecting content to post [31]
Names, profile images, and posts should all be clearly linked to the particular Faculty, School, Institute or Professional Service Department. In addition, all University pages must have an associated member who is identified as the information asset owner and who is responsible for its official affiliation with the University. If you are responsible for representing official social media accounts on behalf of the University of Liverpool when posting on a social media platform, you should clearly acknowledge this. Members of the University must be aware that the University has the right to request the removal of content from an official social media account, and from a personal account, if it is deemed that the account or its submissions pose a risk to the reputation of the University or to that of one of its members. All members of the University are directly responsible and liable for the information they handle. Members of staff are bound to abide by the University IT Regulations by the terms of their employment. Students are bound to abide by the University IT Regulations when registering as a student member of the University. Authorised members of the University may monitor the use and management of information assets to ensure effective use and to detect unauthorised use of information assets.

2.7 Threats and Anti-threat Strategies for Social Network Sites
Online Social Networks (OSNs) such as Facebook, Twitter, MySpace, etc. play an important role in our society, which is heavily influenced by information technology (IT) [32]. In spite of their user-friendly and free services, many people still remain reluctant to use such networking sites because of privacy concerns. Many people claim that these sites have made the world more public and less private, and that consequently a less moral world is evolving. Some consider this social change positive because people are more connected to each other. Nowadays almost everyone uses social networking websites to share ideas, photos and videos with friends and relatives, and even to discuss many aspects of daily life, not only social issues but also politics, which recently helped to change regimes in countries such as Egypt, Libya and Tunisia. The first communication in the form of a social network took place in 1971, when the first email was sent from one computer to another connected machine. Data exchange over phone lines followed in 1987, when bulletin board systems were set up and browsers were used for the first time to establish the principle of online communication. Although social networking websites have many advantages for users to communicate and exchange information, as mentioned above, they unfortunately also have negative impacts.
Many people spend a great deal of time on such websites and forget about their duties, and some users have even lost their lives through such websites because of illegal content related to pornography, terrorism, religious extremism and other abuses. Social networking websites must address many security issues in order to work successfully and to be usable by people, who unfortunately tend to trust most websites. Social networking websites should provide safe storage for personal data, tools for managing and securing that data, and access control over the stored data subject to defined restrictions. They must add features that restrict other users' access to an individual's data. Unfortunately,
most social networks have not actually addressed these issues. There are many attacks on social networks, such as worms, viruses, Trojan horses and phishing websites. Malicious programs, software vulnerabilities and other leading attacks such as SQL injection mean that users should be concerned about all types of attack and should have the right firewalls and software to protect them from being hacked by cyber criminals.

Social networking features:
• Global reach: geographical and spatial barriers are removed.
• Interaction: sites give space for the active participation of viewers and readers.
• Ease of use: most social networks can be used easily, and they contain images and symbols that make user interaction easier [34].
Social networks can be divided, depending on the service provided or the aims behind their creation, into types such as personal networks and cultural networks. Social networks can also be divided, according to the way we communicate and the services offered, into three types:
• networks that allow written communication;
• networks that allow voice communication;
• networks that allow visual communication [32].
Social networks attract thousands of users who represent potential victims for attackers of the following types (see Figure 1) [34].
FIGURE 1. Threat percentages on social networks (Sophos 2010 Security Threat Report).
There are many procedures which help us to protect our privacy as much as possible when using social networks:
• Be careful: do not write any sensitive information on your profile page, bulletin boards, instant messaging or any other form of participation or electronic publishing on the internet, so that your identity is protected against theft and other security threats.
• Be sceptical, because social networking websites are full of illegitimate users, hackers and cyber criminals.
• Be wise and think twice before you post anything when using social networking websites.
• Be polite: do not publish illegal pictures or videos, do not write abusive messages, remember that your posts reflect on you personally, and act as an ambassador to everyone else on the internet.
• Read the privacy policy of every social network before using it [35].

Cyber threats that users might face can be divided into two categories. Privacy concerns demand that user profiles never publish and distribute information over the web: the variety of information on personal home pages may include very sensitive data such as birth dates, home addresses and personal mobile numbers. This information can be exploited by hackers who use social engineering techniques to take advantage of such sensitive information and steal money. Generally, there are two types of security issue: one is the security of people; the other is the security of the computers people use and the data they store in their systems. Since social networks have enormous numbers of users and store enormous amounts of data, they are natural targets for spammers, phishing and malicious attacks. Moreover, online social attacks include identity theft, defamation, stalking, injuries to personal dignity and cyberbullying. Hackers create false profiles to mimic personalities or brands, or to slander a known individual within a network of friends [36].

Anti-threat strategies. Recent work in programming language techniques demonstrates that it is possible to build online services that guarantee conformance with strict privacy policies. On the other hand, since service providers need private data to generate revenue, they have a motivation to do the opposite. Therefore, the research question to be addressed is to what extent a user can ensure his or her privacy while benefiting from existing online services. The following strategies are recommended for circumventing threats associated with social websites [33]:
a) Building awareness of information disclosure: users must take care and be very conscious about revealing personal information in their profiles on social websites.
b) Encouraging awareness-raising and educational campaigns: governments should provide and offer educational classes on awareness-raising and security issues.
c) Modifying existing legislation: existing legislation needs to be updated to reflect new technology and new types of fraud and attack.
d) Strengthening authentication: access control and authentication must be very strong so that cybercrimes committed by hackers, spammers and other cybercriminals are reduced as much as possible.
e) Using the most powerful antivirus tools: users must use powerful antivirus tools with regular updates and keep the appropriate default settings, so that the antivirus tools work more effectively.
f) Providing suitable security tools: our recommendation to security software providers is that they should offer special tools that enable users to remove their accounts and to manage and control the different privacy and security settings.

2.8 Facebook's Privacy Policy in the Law
Facebook has apparently decided to delay a proposed new privacy policy after a coalition of privacy groups asked the Federal Trade Commission on Wednesday to block the changes on the grounds that they violated a 2011 settlement with the regulatory agency. A spokeswoman for the F.T.C.
confirmed Thursday that the agency had received the letter but had no further comment. In a statement published by The Los Angeles Times and Politico on Thursday afternoon, Facebook said, “We are taking the time to ensure that user comments are reviewed and taken into consideration to determine whether further updates are necessary and we expect to finalize the process in the coming week.”
Asked about the delay, a Facebook spokesman said he was unaware of the latest developments. When it first announced the changes on Aug. 28, Facebook told its 1.2 billion users that the updates were "to take effect on September 5." [37] The changes, while clarifying how Facebook uses some information about its users, also contained a shift in legal language that appeared to put the burden on users to ask Facebook not to use their personal data in advertisements. Previously, the company's terms of use, and its settlement with the F.T.C., had indicated that it would not use such personal data without explicit consent. Facebook's new terms would also allow it to use the names and photos of teenagers in advertising, an area of particular concern to privacy advocates because children may be unaware of the consequences of actions such as liking a brand page. The original proposal has drawn tens of thousands of comments from Facebook users, most of them opposed to the changes [37].

Individual differences have been found to greatly impact information disclosure and perceptions of privacy. The traditional division of users according to their privacy attitudes was proposed by Hofstede (Hofstede 1996): users were classified into two extreme groups, reminiscent of the classic sociological concepts of individualist and collectivist societies. More recently, ethnographic studies have explored the privacy attitudes of social-networking users, especially Facebook users. After surveying the same group of students twice, once in 2009 and later in 2010, Boyd et al. revealed that users' confidence in changing their privacy settings strictly depended on frequency of site use and Internet literacy (Boyd and Hargittai 2010). Gender and cultural differences also seem to affect privacy attitudes. Lewis et al. found that women's profiles are more likely than men's to be private and that there is a relationship between music taste and the degree of disclosure of one's profile (Lewis, Kaufman, and Christakis 2008). Finally, ethnicity plays a role. Chang et al. studied the privacy attitudes of U.S. Facebook users of different ethnicities and found that Hispanics tend to share more pictures, Asians more videos, and Afro-Americans more status updates (Chang et al. 2010). The extent to which one discloses personal information also partly depends on one's personality traits. In particular, it has been found to depend on the big five personality traits and the self-monitoring trait, all of which are discussed next.

The five-factor model of personality, or the big five, consists of a comprehensive and reliable set of personality concepts (Costa and McCrae 2005; Goldberg et al. 2006). The idea is that an individual can be associated with five scores that correspond to five main personality traits. Personality traits predict a number of real-world behaviors. They are, for example, strong predictors of how marriages turn out: if one of the partners is high in Neuroticism, then divorce is more likely (Nettle 2007). Research has consistently shown that people's scores are stable over time (they do not depend on quirks and accidents of mood) and correlate well with how others close to them (e.g., friends) see them (Nettle 2007). The relationship between use of Facebook and the big five personality traits has been widely studied, yet contrasting results have emerged.
To recap, the trait of Openness has been found to be associated with higher information disclosure, while Conscientiousness has been associated with cautious disclosure. Contrasting findings have emerged for the traits of Extraversion, Agreeableness, and Neuroticism. Schrammel et al. wrote that "personality traits do not seem to have any predictive power on the disclosure of information in online communities" (Schrammel, Köffel, and Tscheligi 2009). In the same vein, Ross et al. concluded that "personality traits were not as influential as expected, and most of the predictions made from previous findings did not materialise" (Ross et al. 2009). These researchers, however, conceded that they only considered the big five personality traits (while other traits might better explain the use of Facebook) and that their samples consisted only of undergraduate students. Current studies might tackle these two limitations by, for example: 1) considering personality traits other than the big five; and 2) having participants who are not only undergraduate students, who might well be more Internet savvy than older people. Next, we describe how we partly tackle the first problem by considering the personality trait of "self-monitoring". We then describe how we tackle the second problem by collecting a representative sample of Facebook users in the United States. Having identified the personality
traits that have been associated with information disclosure in the literature, we now need to model the process of disclosure itself. By doing so, we will be able to quantify a user's disposition to disclose his/her personal information. One measure of information disclosure is the count of fields (e.g., birthday, hometown, religion) a user shares on his/her profile. The problem with this approach is that some information fields are more revealing than others, and it is questionable to assign arbitrary (relevance) weights to those fields. An alternative is to resort to a psychometric technique called Item Response Theory (IRT), which has already been used for modeling information disclosure in social media (Liu and Terzi 2009). Next, we detail what IRT is, why we chose it, and when and how it works.

What IRT is. IRT is a psychometric technique used to design tests and build evaluation scales for those tests. It extracts patterns from participants' responses, then builds a mathematical model upon the extracted patterns, and this model can then be used to obtain an estimate of a user's attitude (de Bruin and Buchner 2010; MacIntosh 1998; Vishwanath 2006).

Limitations of our data. Critics might rightly put forward one important issue with our data: some users might have responded untruthfully to personality tests. However, validity tests on the personality data suggest that users responded to the personality questions accurately and honestly, likely because they installed the application motivated primarily by the prospect of taking, and receiving feedback from, a high-quality personality questionnaire. Critics might then add that, since our sample consists of self-selected users who are interested in their personality, these users might be more active than the general Facebook population. We have looked into this matter and found that our users do not seem to be more active than average: the number of contacts for the average user in our sample is 124, whereas Facebook reports an average of 130.

Limitations of our study. Our study has four main limitations. First, we have computed exposure scores only upon user-specified fields, while we should ideally have considered additional elements (e.g., photos, wall comments, fan page memberships, segmentation of one's social contacts into friend groups). We were not able to do so as we did not have full access to user profiles; we could access only the elements that we have been studying here. So we looked at sharing of profile fields, which is the simplest instance of sharing, making it an ideal starting point. Yet, even by considering only profile fields, we have learned that our users are far from behaving in the same way: very different privacy attitudes emerge (Figure 1). Second, we have considered Facebook users who live in the United States. Since cultural guidelines clearly exist about what is more and less private and public, our results should be taken as likely to hold only for Facebook users in the United States. We would refrain from generalizing our results to other social-networking platforms (e.g., Twitter) or to any other society, or even to certain subcultures within the United States or within Facebook.
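To make the IRT idea above concrete, the sketch below fits a simple two-parameter logistic (2PL) IRT-style model to a synthetic binary matrix of "field disclosed / not disclosed" indicators. This is a hypothetical illustration only: the field names, the synthetic data and the plain gradient-ascent fit are assumptions made for the example, not the estimation procedures of (Liu and Terzi 2009) or (Baker 2001).

```python
# Minimal, illustrative 2PL IRT-style model of profile-field disclosure (synthetic data).
import numpy as np

rng = np.random.default_rng(42)
fields = ["birthday", "hometown", "religion", "phone", "partner", "political_views"]
n_users, n_fields = 500, len(fields)

def p(theta, a, b):
    # P(user i discloses field j) = logistic(a_j * (theta_i - b_j))
    return 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))

# Synthetic "ground truth" and a binary disclosure matrix X (1 = field shared).
true_theta = rng.normal(0, 1, n_users)      # user exposure (disposition to disclose)
true_a = rng.uniform(0.5, 2.0, n_fields)    # field discrimination
true_b = rng.normal(0, 1, n_fields)         # field "sensitivity"
X = (rng.uniform(size=(n_users, n_fields)) < p(true_theta, true_a, true_b)).astype(float)

# Joint maximum-likelihood fit by plain gradient ascent (kept deliberately simple).
theta, a, b, lr = np.zeros(n_users), np.ones(n_fields), np.zeros(n_fields), 0.05
for _ in range(2000):
    resid = X - p(theta, a, b)                     # gradient of the Bernoulli log-likelihood
    theta = theta + lr * (resid * a).sum(axis=1) / n_fields
    a = a + lr * (resid * (theta[:, None] - b)).sum(axis=0) / n_users
    b = b + lr * (-resid * a).sum(axis=0) / n_users
    theta = theta - theta.mean()                   # fix the location for identifiability

# Fields with higher fitted b are disclosed only by users with high exposure scores.
for name, sensitivity in sorted(zip(fields, b), key=lambda t: -t[1]):
    print(f"{name:16s} estimated sensitivity b = {sensitivity:+.2f}")
```

In this formulation a user's fitted θ plays the role of an exposure score, while a field's fitted b captures how "sensitive" the field is, i.e. how high a user's exposure must be before that field is typically disclosed.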
To partly tackle this second limitation, we are currently studying users in countries other than the United States and platforms other than Facebook (i.e., Twitter). Third, information disclosure and concealment might well be confounded by Facebook activity, and one thus needs to control for it. We did not do so because activity information is not readily available from Facebook's API, and future studies should propose reasonable proxies for Facebook activity. Fourth, information disclosure might be confounded by the general desire to complete one's profile: fields like "interested in" are quick to fill out, whereas "work information" simply takes more effort and is not generally relevant to Facebook interactions. Yet this reflects what the "relationship currency" of Facebook is (Nippert-Eng 2010); that is, it reflects which fields are used to maintain relationships and which are instead concealed without causing any harm.

Novelty of Information Disclosure Model. IRT has already been used to model the privacy attitudes of social-networking users (Liu and Terzi 2009), so our methodology closely followed what has been proposed before. However, since IRT had never been applied to large-scale social-networking data, we ran into a number of problems. First, the algorithms used for estimating IRT's parameters turned out to have a greater impact on result accuracy than one might expect. We found that the best results are achieved if scores are computed in ways different from those proposed in (Liu and Terzi 2009) and similar to those proposed in psychometrics research (Baker 2001). Second, for a large number of users, the errors associated with their exposure scores are unacceptably high. We had to filter those users out, yet we
worked with a sample of 1,323 users. However, for studies starting off with a smaller initial sample, this problem would need to be addressed.

Theoretical Implications. Studies that have attempted to understand how individuals conceptualize privacy offline and online have often resorted to in-depth interviews. This methodology has enabled scholars to ground their work on exactly what people say about privacy in their daily experiences. We have proposed the use of a complementary methodology: we have studied what people do on Facebook by modeling the work of disclosure and concealment which their profiles reflect. As a result, we have quantified the extent to which personality traits affect information disclosure and concealment, and we have done so upon large-scale Facebook data collected unobtrusively as the users were on the site. We have found that Openness is correlated, albeit weakly, with the amount of personal information one discloses. By contrast, after controlling for Extraversion, the self-monitoring trait's contribution disappears, suggesting that high self-monitors might present themselves in likable ways, as the literature would suggest, but do not tend to share more private information or to make that information more visible. In addition to differentiating users based on their personality traits, we have also found important differences among profile fields and have quantified the extent to which not all private fields are equally private: for example, work-related (job status) information is perceived to be more private than information on whether one is looking for a partner.

Practical Implications. There are two areas in which our findings could be practically applied in the short term. The first is social media marketing. Marketing research has previously found that individuals high in Openness tend to be innovators and are more likely to influence others (Amichai-Hamburger and Vinitzky 2010). Since we have found that these individuals also tend to have less restrictive privacy settings, social media marketing campaigns would be able to identify influentials by determining which users have less restrictive privacy settings. The second area is privacy protection. Our results suggest that, by simply using age and gender, one could offer a preliminary way of personalizing default privacy settings, which users could then change. One could also imagine building privacy-protecting tools that inform users about the extent to which they are exposing information that is generally considered sensitive by the community. Such a tool could also exploit the independence assumption behind IRT to parallelize the estimation of the model's parameters and thus monitor who is exposing privacy-sensitive information in real time.

2.9 Social Media Report and the Law
Social media is an increasingly important means of communicating and connecting with customers. Even if you do not directly engage with social media, you should consider monitoring activity on social media that relates to your organisation. Not all interaction on social media will be positive. You should be prepared for active and sometimes critical debate with social media users. Attempting to suppress it may backfire. Garner support from the social media community and argue your case on the merits. Care must be taken over social media postings.
They could lead to serious embarrassment or even be actionable. Social media is subject to the same rules as information published by other means. If you have paid or otherwise arranged for others to post on your behalf, you should make this clear; do not try to pass these posts off as genuine user-generated content. In some cases, you may become liable for user-generated content. You should consider monitoring user-generated content or, at the very least, putting an effective notice-and-take-down procedure in place. If your organisation is listed, you should ensure you comply with the relevant disclosure rules for inside information. In most cases, information should be disclosed via a regulatory information service and not just via social media. Those using social media to communicate on behalf of your organisation should be given very clear guidelines and training [39].

For most companies, some engagement with social media is unavoidable. Regardless of whether or not a company has an "official" social media page or account, it is very likely that it is the subject of a number of unofficial pages, that its employees are interacting with social media and that it is the subject of lively discussion online. For example, our review of the FTSE 100 companies' use of social media also looked at unofficial Facebook pages, i.e. pages that do not appear to have been authorised by that company. These have typically been set up by employees, ex-
employees, customers or pressure groups and are not always complimentary. A key reason to become involved in social media is engagement. Social media provides a means for an organisation to interact directly with its customers and to obtain feedback from them. Handled properly, this can help to build a brand and create a buzz and awareness that help generate new marketing leads and attract new customers. Perhaps the most beneficial outcome of this interaction is social validation: users who start to "organically" advocate your organisation or its aims and ideals. This "electronic word of mouth" can be a very powerful marketing and influencing tool. Engagement with users might include commenting on topical issues and the organisation using social media to put its side of the story across. One example comes from energy companies, which have used their social media presence to comment on rising fuel costs and discuss the link between costs and wholesale energy prices, as well as providing information on related topics such as energy efficiency.

One exercise most organisations will want to do, even if they do not directly engage with social media, is to listen to social media conversations to find out what users are saying about them. There are a number of off-the-shelf and cloud-based tools that can be used for this purpose. They not only allow monitoring of individual conversations but also monitoring at a macro level, for example through sentiment analysis. These tools automatically analyse the 500 million tweets sent every day and report on how often a topic is mentioned and whether it is mentioned in a positive or negative context [39]; a deliberately simplified sketch of this kind of mention and sentiment counting is given at the end of this passage. Finally, social media is an increasingly important tool for advertising. Some social media platforms are able to build detailed profiles about their users, including their sex, age, location, marital status and connections. This allows advertising to be targeted precisely.

One risk of social media is too much engagement. Some users are more than happy to share their views in very uncompromising terms, so a thick skin is vital. For example, energy companies using social media accounts to comment on rising fuel costs have received direct and frank feedback from their customers. The same is true in financial services and many other industry sectors. Attempting to suppress critical comment may backfire. Instead you need to engage with customers and argue your case on the merits. This means your social media team needs to be properly resourced. They will also need to update your social media presence to ensure it remains fresh and manages to respond to user comments. This engagement with users is a vital part of any social media strategy, either to build a distinctive brand (as is the case with Tesco Mobile) or to mollify users' concerns (as is the case with energy companies).

Another problem is embarrassing or ill-judged posts, of which there are numerous topical examples. For example, Ian Katz, the editor of Newsnight, sent a Tweet in September referring to "boring snoring rachel reeves" (Shadow Chief Secretary to the Treasury) following her appearance on the programme. He had intended to send a private message to a friend. The circumstances in which the Tweet was sent are not clear, but there have certainly been other cases in which late-night rants on Twitter have caused significant embarrassment [39].
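The macro-level monitoring described above essentially amounts to counting mentions of a brand and classifying each mention as positive or negative. The sketch below is a deliberately simplified, hypothetical illustration over a list of already-collected posts using a tiny keyword lexicon; the brand name and word lists are invented, and real monitoring tools rely on platform APIs and far more sophisticated sentiment models.

```python
# Hypothetical, simplified mention/sentiment monitor over already-collected posts.
import re
from collections import Counter

POSITIVE = {"great", "love", "helpful", "thanks", "good"}
NEGATIVE = {"awful", "hate", "rip-off", "slow", "bad"}

def classify(post: str) -> str:
    # Naive lexicon-based sentiment: count positive vs. negative keywords.
    words = set(re.findall(r"[a-z'-]+", post.lower()))
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def monitor(posts: list[str], brand: str) -> Counter:
    # Keep only posts mentioning the brand, then tally their sentiment labels.
    mentions = [p for p in posts if brand.lower() in p.lower()]
    return Counter(classify(p) for p in mentions)

posts = [
    "Love the new ExampleEnergy tariff, really helpful support",
    "ExampleEnergy prices are a rip-off, switching away",
    "Nothing to report about ExampleEnergy today",
]
print(monitor(posts, "ExampleEnergy"))  # counts of positive/negative/neutral mentions
```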
Some postings may not just be stupid but also actionable. One example is the notorious Tweet by Sally Bercow: "Why is Lord McAlpine trending? *innocent face*". That Tweet was made when there was significant speculation about the identity of a senior unnamed politician alleged to have been engaged in child abuse. The Tweet was false and was found to be seriously defamatory. It highlights a number of risks with social media. Firstly, the repetition rule applies, such that those repeating a defamatory allegation made by someone else are treated as if they had made it themselves. This is relevant when retweeting or reposting content. Secondly, while the courts provide some leeway for the casual nature of social media and the fact that those who participate "expect a certain amount of repartee or give and take", that protection only extends so far. Civil and criminal liability for social media postings can also arise in a number of other ways. Paul Chambers was initially convicted of sending a "menacing" message after Tweeting: "Crap! Robin Hood airport is closed. You've got a week and a bit to get your shit together, otherwise I'm blowing the airport sky high!!". He was fined £385 and ordered to pay £600 costs. However, this prosecution was overturned on appeal. As with more traditional formats, sales and marketing use of social media must be decent, honest and truthful. Most of the specific social media issues arise out of social validation, i.e. users who "organically" advocate that organisation or its aims and ideals. As this is such a powerful tool, there is a real risk of it being misused. The Office of Fair Trading has already taken enforcement action against an operator of a commercial blogging network, Handpicked Media, requiring them to clearly identify when promotional comments have been paid for. This included publication on website blogs and