Assistive Technology Considerations Template

Columns: Subject Area | Sample Task | Assistive Technology Tools, Accommodations or Modifications | Links to Resource Vendors

Example: Math
Sample Task: Use coins and bills to make change and solve math word problems involving money. (Arizona Department of Education, 2015)
Tools, Accommodations or Modifications: Talking Calculator and Coin-U-Lator. Talking calculators and money calculators provide tactile, auditory, and visual feedback and can help students with disabilities perform math calculation assignments. The Coin-U-Lator has physical buttons for coins and bills. Math apps are electronic games that can provide additional independent practice on an iPad, phone, or computer.
Links to Resource Vendors:
https://www.enablemart.com/coin-u-lator-accessories
http://www.attainmentcompany.com/talking-calculator
https://www.commonsensemedia.org/learning-ratings/reviews?sort=field_reference_review_lr%3Afield_total_learning_rating&order=desc&gclid=COXVoKWB68gCFROSfgodpqsKFA

Rows to be completed:
Reading
Writing
Listening
Oral Communication Development
© 2015. Grand Canyon University. All Rights Reserved.
Future Scenarios and Challenges for
Security and Privacy
Meredydd Williams, Louise Axon, Jason R. C. Nurse and Sadie
Creese
Department of Computer Science,
University of Oxford, Oxford, UK
{firstname.lastname}@cs.ox.ac.uk
Abstract—Over the past half-century, technology has evolved beyond our wildest dreams. However, while the benefits of technological growth are undeniable, the nascent Internet did not anticipate the online threats we routinely encounter and the harms which can result. As our world becomes increasingly connected, it is critical we consider what implications current and future technologies have for security and privacy. We approach this challenge by surveying 30 predictions across industry, academia and international organisations to extract a number of common themes. Through this, we distil 10 emerging scenarios and reflect on the impact these might have on a range of stakeholders. Considering gaps in best practice and requirements for further research, we explore how security and privacy might evolve over the next decade. We find that existing guidelines both fail to consider the relationships between stakeholders and do not address the novel risks from wearable devices and insider threats. Our approach rigorously analyses emerging scenarios and suggests future improvements, of crucial importance as we look to pre-empt new technological threats.
Index Terms—Emerging scenarios, cybersecurity, privacy, future technologies
I. INTRODUCTION
Over the past half-century, technology has evolved beyond our wildest dreams. While we once completed our time-shared tasks on chunky mainframes, we now collaborate on documents using our mobile phones. Though these advances have created innumerable benefits for society, they have also brought many risks to security and privacy. Since the Internet was not designed in anticipation of malicious intent, its architecture is now victim to a host of phishing, spam and malware attacks. We currently stand at the dawn of a further technological revolution, with the growth of the Internet-of-Things (IoT) and sophisticated machine learning techniques. As we begin to live our lives 'online', it is crucial we consider what future changes might mean for security and privacy.
Due to this criticality, several predictions have been made in industry, academia and international organisations. For instance, the Microsoft Cyberspace 2025 report [1] considers three technological possibilities: a stable but stalled 'plateau', a cooperative and innovative 'peak', and a 'canyon' of fragmented solutions. Choo [2] approached the topic from a criminological angle, with his scenarios ranging from mobile device malware to highly-sophisticated phishing attacks. The World Economic Forum [3] collated interviews from global executives and discussed hackers stunting economic growth, threats damaging online services, and resilience triggering further innovation. Through surveying many such scenarios, we found that while reports often differ in scope and audience, there are several common themes.
In this paper we consolidate 30 scenarios from a number of studies to explore those technological futures most frequently considered. Rather than fielding remote predictions, we use a rigorous methodology to identify common existing themes and distil 10 emerging scenarios. We reflect on these possibilities, which include the proliferation of attack tools, before considering their potential impact on a range of stakeholders. We then analyse existing security frameworks and best practices, exploring their suitability in these new environments. Finally, we characterise what research is required to address outstanding security and privacy risks, and consider implications for policy makers and academia.
II. METHODOLOGY
Our methodology consists of four stages, using an iterative approach to identify those scenarios most frequently considered. Rather than merely fielding our own predictions, we distil emerging scenarios from a wide range of surveyed literature. These stages are as follows:
1) We surveyed a large number of existing security and
privacy predictions, many of which are described in
Section III. These works originated from a range of
fields, including academia, industry and international
organisations. This was critical to ensure we considered
both predictions grounded in technical expertise and
those written at a policy level. Regardless of background,
these articles were selected based on three key criteria:
relevance, citation count and rigour of methodology.
2) The works were then indexed and their predictions
individually extracted. Whereas several articles clearly
delineated between different situations, others required
careful inspection to isolate scenarios. We coded these
predictions based on their general themes, such as ‘big
data’, enabling exploration of which points academia,
industry and organisations were in agreement.
3) Next, we grouped similar predictions, repeating this process in an iterative fashion until we achieved convergence. Where assigned codes concerned related trends, such as 'big data' and 'machine learning', these were merged to construct rich categories. From the initial 30 predictions, we distilled the 10 scenarios we found most representative (a toy sketch of this grouping step follows the list).
4) Finally, we expanded these scenarios into brief narratives
for further analysis. Through considering the themes
identified in each instance, we went on to explore
how these changes could affect societies commercially,
technologically and politically.
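To make the coding and merging of steps 2 and 3 concrete, below is a minimal sketch of the grouping process. The predictions, codes and merge rules shown are illustrative placeholders rather than the authors' actual coding data.

    from collections import defaultdict

    # Hypothetical coded predictions extracted in step 2: (source, theme code).
    # These entries are invented for illustration only.
    predictions = [
        ("Microsoft [1]", "big data"),
        ("Berkeley [7]", "machine learning"),
        ("ICSPA [5]", "iot"),
        ("WEF [3]", "iot"),
        ("Choo [2]", "attack tools"),
    ]

    # Step 3: codes judged to concern related trends are merged into one category.
    merge_rules = {"machine learning": "big data"}

    def merge_codes(items, rules):
        """Apply merge rules repeatedly until the coding converges (no rule fires)."""
        changed = True
        while changed:
            changed = False
            merged = []
            for source, code in items:
                if code in rules:
                    code, changed = rules[code], True
                merged.append((source, code))
            items = merged
        return items

    # Group the converged codes into candidate scenario categories.
    scenarios = defaultdict(list)
    for source, code in merge_codes(predictions, merge_rules):
        scenarios[code].append(source)

    for code, sources in scenarios.items():
        print(f"{code}: supported by {sources}")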
III. EMERGING SCENARIOS
We now present our surveyed predictions before consolidating common themes into core emerging scenarios. These works originate from industry, international organisations and academia, providing a comprehensive base for our analyses.
A. Surveyed predictions
Among industrial predictions, Burt et al. [1] developed the Microsoft Cyberspace 2025 report, considering scenarios grounded in an econometric model. In addition to the quantitative analysis of 100 socio-economic indicators, qualitative insights were sourced from researchers and experts. They constructed three future scenarios: 'peak' comprising the embrace of technology, 'plateau' representing piecemeal cyber adoption, and 'canyon' theorising protectionist strategies. Hunter [4], performing industrial research through Gartner, plotted an x-axis of targets against a y-axis of authorities. His scenario quadrants ranged from regulated risk (the establishment of software liabilities) to coalition rule (the growth of underground hackers). In a less formalised approach, ICSPA [5] validated their concepts through an expert panel. Through collaboration with the Trend Micro security firm, they discussed a second-generation digital native, a business in a highly-technological market, and a future nation state.
Organisational predictions included one from the World Economic Forum [3], which collated interviews and workshops with global executives. Through this, they considered expert feedback before constructing a 14-point roadmap for external collaboration. These findings were drawn into several scenarios, including governments erecting technological barriers and the adoption of new trade innovations. Schwartz [6], writing for the Center for a New American Security, considered optimistic and pessimistic predictions. Through his analyses of policy and technology he identified both scenarios and indicators of cybersecurity progress. His considerations included the construction of national networks and the proliferation of simple attack tools.
We finally surveyed academic predictions, including scenarios from the Berkeley Center for Long-Term Cybersecurity [7]. They convened an interdisciplinary conference with experts from computer science and politics. From nascent ideas formed during the sessions, Berkeley graduate researchers developed the concepts into five detailed narratives. These included the Internet becoming an anarchy and the monitoring of human emotions. Benzel [8] took an alternative approach, describing discussions from a US Department of Homeland Security session concerning cybersecurity. The event saw a number of working groups deliberating those objectives which should be prioritised over the next five years. Recommendations included increasing the role of the private sector and developing metrics to evaluate cyber investments. In contrast, Choo [2] conducted an exploration of cybercrime and how this might evolve in the future. Through considering affairs in the US, UK and Australia, he constructed several predictions and suggested criminological prevention strategies. His scenarios ranged from social media propaganda to the escalation of international tensions.
B. Constructed scenarios
Our 10 emerging scenarios are now discussed in detail, considering the technological, commercial and political possibilities of the next decade. These scenarios are not necessarily complementary, but describe a range of potential eventualities. They are as follows:
1) Growth of the Internet-of-Things. The Internet-of-Things (IoT) pervades daily life, blurring the physical and virtual worlds. This entanglement leads to online risks becoming increasingly intangible, contributing to further cyber-threats [9]. Social norms evolve sufficiently that the IoT is considered 'normal' and those who shun these novel technologies are viewed as antiquated ([1–3, 5, 7]).
2) Proliferation of offensive tools. Offensive government
cyber capabilities and the proliferation of simple attack
tools both contribute to frequent incidents. Nation-state
weaponry leaks onto the black market and is reused by
cybercriminals to steal from organisations. Intelligence
agencies stockpile zero-day vulnerabilities rather than
informing affected vendors, resulting in a deluge of data
breaches ([1–4, 7]).
3) Privacy becomes reinterpreted. The concept of privacy is reinterpreted by 'digital natives' who have been raised in an age of social networking and ubiquitous Internet access. Individuals become accustomed to the development of invasive technologies which offer great benefits to convenience and productivity. Lives are lived 'online' and reputational damage is frequent as citizens display intimate histories through digital portals ([5, 7]).
4) Repressive enforcement of online order. While many
states take liberal risk-based approaches, several favour
repressive means to enforce order online. This leads to
blanket censorship and surveillance, an order stronger
than in current regimes, damaging cross-border trade
and placing global commerce under pressure. Although
these measures successfully reduce the scale of domestic
cyber-attacks, the injury to free enterprise results in
economic degradation ([1, 2, 4, 5]).
5) Heterogeneity of state postures. The heterogeneity of state technology postures stifles international agreement and cooperation over cyber norms. This raises global tensions as attacks increasingly originate from 'safe havens' which refuse to prosecute their cyber criminals. The challenge of digital attribution leads to a fragmentation of shared understandings, with nations treating their allies with cool suspicion ([1–5]).
6) Traditional business models under pressure. The
concept of intellectual property evolves as traditional
business models are placed under increasing pressure
from both pirates and new competitors. Established market leaders fall and sell their data assets to innovative firms better equipped to operate in these novel environments. Cloud platforms become the norm, with individuals granting increased agency to those companies which store their data ([1, 4, 5, 7]).
7) Big data enables greater control. Big data and machine learning support the manipulation of individuals' behaviour by corporations and governments. Sophisticated statistical analysis enables companies to intricately tailor their advertisements, while political parties customise their media campaigns to target individual citizens. While these developments benefit commerce and law enforcement, deviations from the norm are increasingly viewed with suspicion ([5, 7, 8]).
8) Growth of public-private partnerships. Organisations continue to know more about their customers than national governments, contributing to the growth of public-private data-sharing partnerships. These arrangements are indeed beneficial to national security, but large parts of critical infrastructure remain owned by foreign corporations. This reliance on private industry shifts considerable power from elected state officials to unaccountable executives, resulting in damage to the democratic process. With power production frequently under the influence of foreign investors, national security becomes critically undermined ([1–3, 5, 8]).
9) Citizens demand greater control. Some citizens demand greater transparency and agency over their online data. Technologically-literate individuals store information remotely, occasionally selling details for a range of benefits. Corporations offer paid alternatives to data-hungry social networks, creating new markets for online communities and Privacy-Enhancing Technologies (PETs) ([2, 4, 5, 7]).
10) Organisations value cyber-resilience. Cyber-resilience is increasingly important for informing business decisions, with issues such as insider threats high on the agenda. The infeasibility of absolute security drives a market for cyber-insurance, as corporations adopt prescribed measures to reduce their premiums. Boardrooms quickly learn the reputational costs of cyber-attacks, resulting in greater private investment in the actuarial and mathematical sciences ([2–5, 8]).
C. Stakeholder impact
We continue by considering our emerging scenarios and the impact they could have at three stakeholder levels: individual, organisational and national. Investigation of stakeholder impact is crucial to explore how ordinary citizens, corporations and governments may operate in the future. Considerations are summarised below in Fig. 1, with yellow, orange and red representing low, medium and high impact, respectively. These ratings were determined through identifying which scenarios could critically affect a stakeholder, which would require changes in practice, and which would have a limited impact. In detail, the stakeholder impacts are as follows:
Figure 1: Emerging scenarios and stakeholder impact
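Fig. 1 itself is not reproduced in this text; only its colour scheme and two cell values survive in the prose. Purely to illustrate the structure of the figure, the sketch below models it as a (scenario, stakeholder) -> impact mapping, filling in only the two cells the surrounding text states explicitly.

    from enum import Enum

    class Impact(Enum):
        LOW = "yellow"
        MEDIUM = "orange"
        HIGH = "red"

    # Partial reconstruction of Fig. 1. Only entries stated explicitly in the
    # prose are filled in; the full figure is not recoverable from the text.
    impact_matrix = {
        (2, "organisational"): Impact.HIGH,  # "data exfiltration (of high impact)"
        (4, "individual"): Impact.HIGH,      # "highlighted as high impact in Fig. 1"
    }

    def describe(scenario: int, stakeholder: str) -> str:
        impact = impact_matrix.get((scenario, stakeholder))
        if impact is None:
            return f"Scenario {scenario} / {stakeholder}: not stated in the text"
        return f"Scenario {scenario} / {stakeholder}: {impact.name} ({impact.value})"

    print(describe(4, "individual"))
    print(describe(1, "national"))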
1) The pervasion of the Internet-of-Things would present dangerous opportunities for the surveillance of individuals, particularly if biological data is accessible through implanted devices. While organisations would benefit from enhanced home-working efficiencies, they must defend against Bring-Your-Own-Device (BYOD) risks. Although national governments would benefit economically from the IoT, the dependence of critical infrastructure on modern technologies would greatly increase attack surfaces.
2) The proliferation of attack tools would place individuals under the ubiquitous risk of identity theft. Organisations similarly must defend against data exfiltration (of high impact) and therefore might be less willing to store customer data. International tensions would be fraught as high-profile attacks become commonplace, especially with attribution appearing largely intractable.
3) Reinterpretations of privacy would see PII (Personally Identifiable Information) lose its importance as more sensitive details, such as sexual orientation, are available through social media. Although organisations encourage the exchange of data for convenience, they are at risk of their own employees sharing confidential information. Nations benefit from large collections of open source intelligence, but find their citizenry increasingly dependent on services hosted in foreign countries.
4) The repressive enforcement of online order would involve widespread filtering, censorship and surveillance, highlighted as high impact in Fig. 1. Organisational opportunities would be stifled by both technological restrictions and costly regulatory compliance. Foreign opposition to repressive policies would damage international trade, while prosecuting dissidents would be both challenging and expensive.
5) The heterogeneity of state postures would result in minimal cross-border personal data protection, especially when nations differ in their definitions of PII. Unpredictable international relations would contribute to market instability, while high-impact cyber attacks continue from 'safe havens'. Reductions in information sharing would damage law enforcement, with global tensions increasing through frequent disagreements.
6) The evolution of new business models would see individuals' personal data become the most valuable commodity, frequently traded on global exchanges. Increased market competitiveness would depress profit margins while organisations reel from ubiquitous copyright infringement. National economies are similarly weakened by intellectual property violations and lose stability as
established firms are replaced by start-ups. With citizen
data increasingly residing in foreign data centres, both
individuals and governments would be vulnerable to the
decisions of other states.
7) Advances in big data and machine learning would result in consumer manipulation and reductions in their individual agency. Organisations would benefit from sophisticated financial forecasting and seamless targeted advertising. Nations better control their citizenry through advanced media management, yet increasingly rely on a technology bubble for their economic growth.
8) Public-private data-sharing partnerships would infringe individuals' rights as their personal data is shared with foreign governments. The wider distribution of this information would also increase the risk of confidentiality being breached. Although organisations might learn valuable threat intelligence, they could have their reputations damaged by governmental arrangements. While national agencies would acquire valuable data, foreign states might exert a troubling corporate influence over critical infrastructure.
9) Although citizen demands might give certain individuals greater freedom, technical literacy is required and commercial costs would be inherited through price increases. Organisations struggle under the pressure to reveal corporate information while national governments find themselves increasingly accountable to their citizens. Despite both initiatives being beneficial to democracy, they weaken domestic economies against unaccountable rival states.
10) The importance of cyber-resilience emphasises the ubiquitous personal risk from data breaches. Organisations would invest in costly technological protection, but still fall victim to high-impact Advanced Persistent Threats (APTs). Insider threats would challenge the sensitive world of government, particularly since the capabilities of malicious employees will increase with ubiquitous technology. As a result, frequent breaches in both the public and private sectors could undermine trust in state institutions.
IV. CHALLENGES FOR SECURITY AND PRIVACY
As a next stage, we reflect on the future applicability of current 'best practice' guidelines for individuals, organisations and nations. This is of great importance, as these documents inform stakeholder groups and are expected to guard against technological threats. We summarise the gaps in current frameworks, selecting several key examples due to space restrictions, and point to required research and development. At the individual level, we consider the main UK and US government initiatives for improving cybersecurity practices: for the UK, the Cyber Streetwise campaign [10], and for the US, Stop.Think.Connect [11]. The UK Get Safe Online initiative was also considered, but this is a private-sector effort and less authoritative than Cyber Streetwise. For organisations, we examine the US National Institute of Standards and Technology (NIST) cybersecurity framework [12] and the Center for Internet Security's Critical Security Controls (CIS CSC) [13]. The International Telecommunication Union (ITU) National Cybersecurity Strategy Guide [14] and NATO National Cybersecurity Framework Manual [15] are considered as internationally-recognised guidelines for nations. Based on our findings, we conclude this section with a call to action for researchers, stakeholders and policy makers.
A. Gaps in existing guidelines
While advantageous for the present day, awareness campaigns for individuals do not translate to a future in which the IoT pervades daily life. In this future, individuals' understanding of, and agency over, their online presence is key to the protection of their privacy. Current campaigns go some way to promoting good practice in this area (e.g. recommendations for protecting privacy when using social media [10]; privacy-protecting recommendations given for the use of web services, devices, purchase history, and location [11]). However, these initiatives do not extend to educating citizens on the privacy implications of increasingly-used wearable, biometric and sensor-based technologies. Nor is clear usable advice given on the specific types of data collected by different applications and devices. For guidelines to be suitable for future scenarios, campaigns must focus on emerging technologies and give specific, practical advice. Furthermore, these public initiatives should be periodically evaluated to ascertain their effectiveness in improving individuals' behaviour. Whether undertaken through large-scale surveys or targeted focus groups, feedback mechanisms would enable campaigns to be iteratively refined.
Existing frameworks for organisations do not adequately address those human aspects which will be key to future corporate cybersecurity. The NIST and CSC documents [12, 13] are highly comprehensive, with useful references to external guidelines. They do not, however, provide frameworks within which the human aspects of organisational cybersecurity can be addressed; namely, problems arising from insider threats and BYOD. For instance, NIST aims to mitigate insider threats by recommending the comparison of personnel actions against expected behaviours and data-flow baselines. CSC addresses the problem by advocating the controlled use of administrator privileges, access controls on a "need to know" basis, and the monitoring of employee and contractor accounts. While these activities are essential, employee tracking alone will not defend against insiders who are aware of the monitoring systems in place [16]. Unfortunately, both frameworks fail to give any consideration to the growing need to study personnel motives or to monitor behaviour from a psychological perspective.
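As an illustration of the baseline-comparison approach NIST recommends, the following is a minimal sketch of flagging a user's data transfers against their own history. The threshold, window and data are invented for this example and are not values prescribed by NIST or the CIS CSC; as noted above, such monitoring alone would not stop an insider who knows the system is in place.

    import statistics

    def flag_anomalous_transfer(history_mb, today_mb, z_threshold=3.0):
        """Flag a daily data-transfer volume that deviates strongly from the
        user's own historical baseline. Threshold and windowing are
        illustrative assumptions only."""
        mean = statistics.mean(history_mb)
        stdev = statistics.stdev(history_mb)
        if stdev == 0:
            return today_mb != mean
        z = (today_mb - mean) / stdev
        return z > z_threshold

    # Example: a user who normally moves ~100 MB/day suddenly moves 5 GB.
    baseline = [95, 110, 102, 98, 105, 99, 101]
    print(flag_anomalous_transfer(baseline, 5000))  # True
    print(flag_anomalous_transfer(baseline, 108))   # False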
Both the NIST and CSC recommendations protect organisations against BYOD risks insofar as they specify that physical devices should be inventoried and monitored. We foresee a huge expansion in the ownership of IoT products, resulting in an increase in the number of personally-owned devices present in the workplace; in particular, wearables and implants. In this future, registering all connected products will not be feasible. Sophisticated means of mitigating IoT and BYOD risk are required [17], such as protection against potentially-invisible devices (e.g. implanted medical chips) which are challenging
to catalogue. This is crucial to recognising the increasing
corporate blur between internal and external spaces.
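A minimal sketch of the inventory-based monitoring the frameworks prescribe, using invented MAC addresses, makes the limitation concrete: devices that never appear in network logs simply cannot be flagged.

    # Hypothetical device-inventory check, as implied by the NIST/CSC
    # recommendation to inventory and monitor physical devices. A real
    # deployment would draw observed devices from network monitoring
    # (e.g. DHCP or 802.1X logs); these identifiers are placeholders.
    inventory = {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"}
    observed_on_network = {"aa:bb:cc:00:00:01", "de:ad:be:ef:00:99"}

    unregistered = observed_on_network - inventory
    for mac in unregistered:
        print(f"Unregistered device on corporate network: {mac}")
    # Implanted or passively-sensing devices never appear in such logs,
    # which is exactly the gap highlighted above.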
Cybersecurity frameworks for nations [14, 15] recommend international cooperation to unite national strategies. Set against our future scenarios, this is extremely optimistic, especially between non-allied states. Indeed, it is predicted that heterogeneous state postures and differing cybersecurity laws will stifle international cooperation. This, combined with the proliferation of cyber weaponry and problems of attribution, suggests that the alignment of national strategies is unrealistic. Of greater importance is the construction of policies and legislation which allow states to suit their own, and international, interests in a non-unified cyber landscape.
Based on our scenarios, we foresee both that state cyber weaponry will grow and that international tensions will escalate as a result of high-profile incidents. A recent example of such events is the unattributed Ukrainian power outage of December 2015, apparently caused by a cyber-attack [18]. Currently, agreement on the terms key to building legislation is lacking. For example, there are no widely-agreed definitions for either "cyber warfare" or "cyber attack" [15]. Legislation in this area is crucial to maintaining order in the coming decades.
B. Requirements for research and development
Based on the gaps identified, much work is required to
extend cybersecurity frameworks for the future. We discuss
areas that must be addressed through academic research, the
development of policy, and action by stakeholders.
For the protection of individuals' privacy, policies and laws on collection, storage and processing must be reconsidered in light of the increased sensitivity and abundance of data. Education will be central to afford citizens the understanding to retain agency over their information, with progress monitored through national surveys. The development of disclosure metrics for individuals and organisations will help technology users perceive the risks of their decisions, and assist regulators in detecting abuse. Similarly, methods to predict the lifetime and impact of sensitive data would enable individuals to understand their risk exposure. Usable privacy solutions should also be developed, such as applications which highlight the data collected by wearable devices.
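As one illustration of what such a disclosure metric might look like, the toy scoring function below combines the sensitivity of shared fields with the size of the audience. The categories and weights are entirely invented; a usable metric would need empirical grounding.

    # Toy disclosure-risk metric of the kind called for above. Weights and
    # field names are invented for illustration only.
    SENSITIVITY_WEIGHTS = {
        "location": 3,
        "heart_rate": 4,   # biometric data from a wearable
        "step_count": 1,
        "contacts": 5,
    }

    def disclosure_score(fields_shared, audience_size):
        """Combine what is shared with how widely it is shared."""
        sensitivity = sum(SENSITIVITY_WEIGHTS.get(f, 2) for f in fields_shared)
        return sensitivity * audience_size

    # Sharing heart rate and location with one vendor vs. a wide audience.
    print(disclosure_score(["heart_rate", "location"], audience_size=1))    # 7
    print(disclosure_score(["heart_rate", "location"], audience_size=100))  # 700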
For organisations, research into the human aspects of insider threat, such as behavioural psychology and criminology, is important. Both the corporate adoption of education schemes and the consideration of employees' actions are critical to proactively mitigating these risks. Whilst existing studies have begun to explore the psychological, behavioural, cultural and technical characteristics of insider threats [19], further research is required to develop comprehensive solutions that are ultimately useful for organisations. The evolution of current practice is crucial if companies wish to maintain resistance against increasingly-sophisticated insider threats.
BYOD currently poses a number of organisational challenges [17, 20] and our emerging scenarios suggest this situation will only become worse. To protect companies against the increased number and variety of personal devices brought to the office, two developments are required. Firstly, applications should be built which secure company networks against personal technologies. Secondly, effective organisational policies should be implemented to manage risk in a future pervaded by wearable and implanted devices.
At the national level, more specificity is required in calls
for international cooperation. Whilst some states will negotiate
mutual pledges, as the US and China did in their 2015 Cyber
Agreement [21], the global unification of security postures
is highly unlikely. As such, both international cybersecurity
bodies and frameworks for addressing key matters, such as
piracy and cross-border law enforcement, must function in a
heterogeneous environment. We predict the ownership of large
parts of critical national infrastructure by foreign corporations,
granting power to foreign investors and underlying states. In
a cyber landscape in which nations are not concordant, legal
agreements and technological solutions are required to address
external control over critical infrastructure.
As offensive cyber weapons proliferate and global tensions increase, legislation on 'acts of war' and digitally-proportionate responses must be defined. It is inevitable that nation-state cyber weaponry will reach the black market, and therefore it is crucial the consequences of its use are understood [22]. Furthermore, strong controls should be enforced to avoid the escalation of these most harmful risks.
Interactions between individuals, organisations and nations must also be considered. While we have explored the future scenario impact on our three stakeholders individually, in reality, there are several relationships and dependencies between these parties. For example, the effect that advances in machine learning will have on individuals' privacy will depend largely on both the regulations governing data collection and the behaviour of organisations. There is, however, no single document that details cybersecurity guidelines across these three differing levels. This lack of cohesion introduces complex subtleties, and key dependencies can be missed. Broad frameworks would enable a clear vision of both current cybersecurity practices and their future weaknesses. However, it might be challenging to locate a party capable of constructing such a comprehensive document. This is especially true considering the international scope of the field, with recommendations required to be equally applicable to a Chinese power station and a US bank. Specificity is naturally in tension with generality, and new guidelines would also require frequent updates to ensure they remain suitable. These factors, and the difficulty of selecting an authorship trusted by all international parties, ensure that the construction of a comprehensive document would be challenging. However, such efforts are essential if we wish to protect security and privacy in an uncertain global future.
Perhaps a more achievable proposal would be to increase collaboration between guideline authors at the individual, organisational and national levels. Such unification of approaches and ideas would enable the production of frameworks which are more strongly contextualised against cybersecurity at all levels. We illustrate this point by drawing on our above example, which is based on advances in machine learning and consumer prediction capabilities. This scenario highlights a discrepancy between the recommendations made at the organisational and individual level. Given that guidelines in the
former [12, 13] provide only limited advice on the use of sensitive consumer information, the latter's [10, 11] recommendations on data protection are inadequate to maintain privacy in a world of sensor-based devices. In the space between these two frameworks, there are real risks of organisational data use against individuals' wishes. These gaps could be bridged by collaboration between the authors of such guidelines.
V. CONCLUSIONS
Emerging scenarios provide useful insights into the future of security and privacy. While previous predictions are diverse in their scopes and audiences, through constructing representative futures, we explored those developments most frequently anticipated. Our study of the impact on individuals, organisations and nations highlights key considerations for these important stakeholders. We also analysed existing frameworks and best practices, identifying guidelines which do not adequately translate to future predictions. Our findings suggest avenues for research and development, including privacy education for the general public, approaches to mitigate organisational insider threat, greater investigation of BYOD risk, and international legislation for cyber warfare. It is crucial that emerging scenarios are explored in anticipation of the security and privacy threats of the coming decade.
We have considered a number of possibilities for future work. Firstly, we could conduct a technical analysis of why reputable guidelines fail to address the threats from emerging scenarios. In this work, we could rigorously examine individual sections in 'best practice' documents and identify which areas urgently require updating. Secondly, these emerging scenarios could be explored within the context of specific domains, such as healthcare. The pervasion of the Internet-of-Things might be beneficial with the development of wireless sensors, but also threaten welfare when critical devices are compromised. Similarly, although advances in big data might revolutionise the detection of health conditions, these measures could unfairly increase insurance premiums for victims of profiling. Finally, we could expand our scope and explore future possibilities for technology as a whole. The coming decades promise to radically change how we perceive digital devices, and it is crucial we are prepared for these novel developments.
REFERENCES
[1] D. Burt, A. Kleiner, J. P. Nicholas, and K. Sullivan, Cyberspace 2025, 2014.
[2] K.-K. R. Choo, "The cyber threat landscape: Challenges and future research directions," Computers & Security, vol. 30, no. 8, 2011.
[3] World Economic Forum, Risk and responsibility in a hyperconnected world, 2014.
[4] R. Hunter, "The future of global information security," 2013. Accessed 8 June 2016. [Online]. Available: www.gartner.com/it/content/2479400/2479415/june_13_future_of_global_info_sec_rhunter.pdf
[5] International Cyber Security Protection Alliance, "Project 2020," 2013. Accessed 8 June 2016. [Online]. Available: www.europol.europa.eu/sites/default/files/publications/2020_white_paper.pdf
[6] P. Schwartz, "Scenarios for the future of cyber security," America's Cyber Future, vol. 2, pp. 217–228, 2011.
[7] Berkeley Center for Long-Term Cybersecurity, "CLTC scenarios," 2015. Accessed 8 June 2016. [Online]. Available: cltc.berkeley.edu/scenarios/
[8] T. Benzel, "A strategic plan for cybersecurity research and development," IEEE Security & Privacy, vol. 4, pp. 3–5, 2015.
[9] M. Williams, J. R. C. Nurse, and S. Creese, "The perfect storm: The privacy paradox and the Internet-of-Things," in Workshop on Challenges in Information Security and Privacy Management (ISPM) at 11th International Conference on Availability, Reliability and Security (ARES), 2016.
[10] HM Government, Cyber Streetwise. Accessed 8 June 2016. [Online]. Available: www.cyberstreetwise.com/
[11] National Cyber Security Alliance, Stop.Think.Connect. Accessed 8 June 2016. [Online]. Available: www.stopthinkconnect.org/
[12] National Institute of Standards and Technology, Framework for Improving Critical Infrastructure Cybersecurity, 2014.
[13] Center for Internet Security, "CIS critical security controls - Version 6.0," 2016. Accessed 8 June 2016. [Online]. Available: https://www.sans.org/critical-security-controls/controls
[14] International Telecommunication Union, National Cybersecurity Strategy Guide, 2011.
[15] NATO Cooperative Cyber Defence Centre of Excellence, National Cybersecurity Framework Manual, 2012.
[16] C. Colwill, "Human factors in information security: The insider threat – who can you trust these days?" Information Security Technical Report, vol. 14, no. 4, pp. 186–196, 2009.
[17] J. R. C. Nurse, A. Erola, I. Agrafiotis, M. Goldsmith, and S. Creese, "Smart insiders: Exploring the threat from insiders using the Internet-of-Things," in ESORICS Workshops, 2015, pp. 5–14.
[18] BBC News, "Hackers caused power cut in western Ukraine - US," 2016. Accessed 8 June 2016. [Online]. Available: www.bbc.co.uk/news/technology-35297464
[19] J. R. C. Nurse, O. Buckley, P. A. Legg, M. Goldsmith, S. Creese, G. R. T. Wright, and M. Whitty, "Understanding insider threat: A framework for characterising attacks," in IEEE Security and Privacy Workshops (SPW), 2014, pp. 214–228.
[20] A. B. Garba, J. Armarego, D. Murray, and W. Kenworthy, "Review of the information security and privacy challenges in Bring Your Own Device (BYOD) environments," Journal of Information Privacy and Security, vol. 11, no. 1, pp. 38–54, 2015.
[21] E. Rosenfeld, "US-China agree to not conduct cybertheft of intellectual property," 2015. Accessed 8 June 2016. [Online]. Available: www.cnbc.com/2015/09/25/us-china-agree-to-not-conduct-cybertheft-of-intellectual-property-white-house.html
[22] D. Hodges and S. Creese, "Understanding cyber-attacks," Cyber warfare: A multidisciplinary analysis, pp. 33–60, 2015.
Legal Implications of Using Digital Technology in Public
Schools: Effects on Privacy
JOANNA TUDOR, J.D.*
Policymakers have been attempting to improve our education system and student achievement for decades. With encouragement from recent legislation—such as the America Competes Act, which places an emphasis on STEM education—school districts across the country are incorporating technology into academics in sweeping strides. Between 2012 and 2013, the United States Department of Education gave out over $500 million in Race to the Top funds, and almost every recipient school district implemented a one-to-one digital device initiative. As technology is integrated into academic instruction for use at school and at home, student interaction with digital technology has been propelled to exceptional levels. Meanwhile, the efficiency of digital technology has made it easier for school districts and other entities to collect, organize, and store large databases of information. School districts, state departments of education, and the U.S. Department of Education all collect and use data about students. At the local level, specific information on individual students is collected to help teachers improve their instruction or keep records of student behavior. For example, Apple's iTunes has a behavior-monitoring application (app) available to teachers that allows them to compile information about which students have behavior
* This article was prepared as part of the author's work at the University of San Diego's Mobile Technology Learning Center (MTLC), an Institute of Entrepreneurship in Education (IEE) knowledge lab in the School of Leadership and Education Sciences.
1. Ross Brenneman, Before Buying Technology, Asking 'Why?', Educ. Week, June 18, 2014, http://www.edweek.org/tm/articles/2014/06/18/gp-definitions.html.
problems and which students are helpful in the classroom.2 Data collection can also help administrators make better-informed decisions about allocating resources to district programs. Sometimes districts contract with outside service providers who deliver services to students, and often these service providers also collect and use personally identifiable student information to help them improve their services. For example, some school cafeterias are now using a biometric identification system to allow students to pay for lunch by scanning their fingerprint at the checkout line.3 The ease and convenience of collecting, organizing, and storing data through the use of technology means that most, if not all, of these data are in digital form. However, the convenience and insight allowed by unprecedented access to technology comes at a cost, which frequently manifests as invasions of student privacy. Furthermore, districts may not realize the extent to which student privacy is at risk, particularly because technology is constantly evolving, thus creating new risks. Understanding the risks involved is only half the battle. Districts must also navigate the multitude of federal and state laws currently in place that are designed to protect student privacy. In the last several decades, privacy laws in general have morphed as advances in technology have challenged well-established concepts in privacy law.4 Compounding the matter, student expectations of privacy have always been a vague concept, one only further complicated by the challenges digital technology has introduced to privacy law. This report discusses the potential risks to students when districts incorporate technology by considering who is collecting data and why it is problematic. The report focuses on privacy violations that are likely
2. TipTapTech, LLC, Easy Behavior Tracker for Teachers, iTunes Preview, https://itunes.apple.com/us/app/easy-behavior-tracker-for/id553367265?mt=8 (last visited March 16, 2015).
3. Adam Shanks, North Adams Schools to Implement Biometric ID at Cafeterias, The Berkshire Eagle (Sept. 13, 2014, 12:11 AM), http://www.berkshireeagle.com/.../north-adams-schools-implement-biometric-id-at-cafeterias.
4. For example, in U.S. v. Jones, 132 S.Ct. 945, 181 L.Ed.2d 911 (2012), the United States Supreme Court ruled that the attachment of a Global-Positioning-System (GPS) device to a car was a search within the meaning of the Fourth Amendment and could not be conducted by government officials without a warrant.
to occur due to a district's increased use of technology and explores questions such as, "What are students' expectations of privacy in digital technology?" and "What is a district's duty to protect students from privacy invasions?" Where answers are not yet available, the report provides relevant information that districts need to contemplate as they implement technology devices and programs in their schools. The report begins with an overview of federal privacy laws, California privacy laws, and other laws that regulate technology use. The report then takes a look at how student privacy is compromised, who are the biggest contributors to student privacy invasions, and why. Based on the law described in the first two sections, the report then discusses students' expectations of privacy in digital technology, and the implications for school districts.
I. OVERVIEW OF RELEVANT LAW AFFECTING STUDENT PRIVACY
Privacy rights can be separated into three main categories: (1) the right to access and to control access to personal information, (2) the right to personal autonomy, and (3) the right to control public use of an individual's persona. Of these three, the first two (the right to access personal information and the right to personal autonomy) are largely at issue with regard to the legal implications of students who use digital technology in an academic setting. Federal statutes such as the Family Educational Rights and Privacy Act (FERPA) and the Children's Online Privacy Protection Act (COPPA) address the right to access and to control access to personal information, while the United States Constitution offers some protection for personal autonomy through the First, Fourth and Fourteenth Amendments. Additionally, each state has its own set of legislation governing student privacy. Although the third category (the right to control persona) could arise within the educational context, it would be an unusual occurrence; therefore, this report will only discuss the first two privacy concepts.
Within the first two ideas, there are many federal and state laws that can and do apply to school districts, students, and other interested parties. In fact, in just the past few years, almost 150 bills affecting student privacy were introduced in 40 different state legislatures.5 Because of the diversity of state laws affecting student privacy, this report focuses mainly on federal law. California serves as the model when exploring the intersection of federal and state law. California lends itself to these examples because its Silicon Valley hosts many high-technology companies, such as Google, Facebook, Apple, Hewlett-Packard, and Intel, to name a few.6 Of course, these companies' influences reach far beyond the boundaries of California, so this is not to suggest that by being headquartered in California their impact is limited to California. Therefore, in order to fully understand the subtleties of some of the implications presented by digital technology programs within other states, a thorough exploration and analysis of the state's law is necessary.
A. Federal Privacy Laws
The United States Constitution does not guarantee privacy specifically, but there are constitutional limits as to how far the government can intrude into an individual's private life. The Fourth Amendment establishes some of those limits by ensuring the right to be free from unreasonable search and seizure of body, house, papers, and effects.7 In addition to the Fourth Amendment, the First and Fourteenth Amendments also protect personal autonomy. The First Amendment
5. Bill Wouldn't Stop Data Mining of Kids, Politico (Mar. 23, 2015), http://www.politico.com/story/2015/03/privacy-bill-wouldnt-stop-data-mining-of-kids-116299.html#ixzz3WYqkGlsp.
6. Google, http://www.google.com/about/company/facts/locations/ (last visited March 16, 2015); Facebook HQ, Facebook, https://www.facebook.com/pages/Facebook-HQ/166793820034304 (last visited March 16, 2015); Apple Corporate Office, eCorporateOffices, http://ecorporateoffices.com/Apple-1697 (last visited March 16, 2015); HP Headquarters, HP, http://www8.hp.com/us/en/hp-information/about-hp/headquarters.html (last visited March 16, 2015); Intel Corporate Office, eCorporateOffices, http://ecorporateoffices.com/Intel-2953 (last visited March 16, 2015).
7. U.S. Const. amend. IV.
safeguards personal autonomy by protecting the right to speak freely and peaceably assemble,8 while the Fourteenth Amendment ensures substantive due process.9 The Ninth Amendment protects privacy indirectly because it holds that even if a right is not explicitly mentioned in the Constitution, the government may not infringe upon that right if it has been granted through other means.10 Unlike the Constitution, there are many federal statutes that directly address privacy concerns, such as the Privacy Act of 1974.12 Some statutes specifically concern digital privacy, like the Electronic Communications Privacy Act,13 while others are even more specific and address the digital privacy of students or minors. The following descriptions of federal law focus on those sections that address students' privacy rights specifically.
1. Family Educational Rights and Privacy Act (FERPA)14
Student privacy rights begin with FERPA. FERPA governs schools that receive federal funding and maintain student education records. It requires that parental consent be given before any personally identifiable information or information held in the education record can be shared.15 Although FERPA's jurisdiction covers elementary, secondary and post-secondary school students, this report will focus solely on the portions of FERPA that apply to K-12 students. FERPA grants rights to parents of minors and to students themselves, once they have reached the age of majority. These rights include the ability to inspect the education record that the school maintains on their
8. Id. amend. I. 9. Id. amend. XIV. 10. Id. amend. IX. 11. This concept was [included to ensure] that the first eight amendments to the Constitution were not later deemed exhaustive. 12. 5 U.S.C. § 552a (2010). 13. See generally, 18 U.S.C. § 2510 (2002). 14. 20 U.S.C. § 1232g (2013). 15. Id. at § 1232g(b)(1).
child and the right to challenge and correct or delete any content in the record that is inaccurate or misleading.16 Additionally, the Code of Federal Regulations, which has expanded upon FERPA's rights, requires districts to annually notify parents of these rights.17 The statute does not grant a private right of action, however, meaning parents cannot sue districts that are in breach of FERPA. The only statutory incentive districts have to remain in compliance with this law is that federal funding can be revoked for any school in violation of the law.18 Today, all information held in the education record is protected under FERPA. However, as information conduits and data storage methods have evolved, a bit of a controversy has developed regarding what information is included in the education record and therefore protected from being accessed by others. The statute defines "education record" as records, files, documents or other materials which "contain information directly related to a student" and "are maintained by an educational agency or institution."19 At times, FERPA has been applied broadly when identifying content within the education record.20 However, the education record does not include teachers' personal notes that are not shared with others except substitutes, or records created and kept by law enforcement units.21
There are a few exceptions granted under the statute that allow districts to disclose information that would otherwise be a part of the education record without permission. The two most relevant exceptions include the Directory Information exception and the School Official exception.22 Directory Information refers to information that is generally not considered harmful if disclosed.23 This information may include but
16. Id. at § 1232g(a)(2). 17. 34 C.F.R. § 99.67 (2012). 18. 20 U.S.C. § 1232g(a)(1)(B)-(2). 19. 20 U.S.C. § 1232g(a)(4)(A). 20. For an example, see U.S. v. Miami University, 91 F. Supp. 2d 1132 (2000) (including student disciplinary records as part of the "education record"). 21. 20 U.S.C. § 1232g(a)(4)(B). Subsection (a)(4)(B) also identifies records of employees [who] are not also students of that agency, and records on a student who is eighteen years of age or older that have been made or maintained by a physician, [as not] included in the definition of "education record" for purposes of FERPA. 22. Id. at § 1232g(b). 23. 34 C.F.R. § 99.3 (2012).
is not limited to the student's: name; address; telephone listing; e-mail address; photograph; date and place of birth; grade level; major field of study; participation in officially recognized activities and sports; weight and height (of members of athletic teams); dates of attendance; degrees and awards received; and the most recent previous educational agency or institution the student attended.24 Directory information does not include social security numbers.25 Under the Directory Information exception, districts may make such information public, but must give notice to parents of the categories of information that it intends to disclose. Parents have the right to "opt out" if they do not wish for this information to be made public.26 The School Official exception allows schools to disclose to certain individuals or entities information that is considered "personally identifiable information" (PII)27 or information that would otherwise be a part of the education record. The Code of Federal Regulations defines "personally identifiable information" as including, but not limited to, a student's: name; names of family members; address; any personal identifier such as social security number or biometric record; other indirect identifiers such as date or place of birth, or mother's maiden name; and any other information that "alone or in combination, is linked or linkable to a specific student that would allow a reasonable person in the school community, who does not have personal knowledge of the relevant circumstances, to identify the student with reasonable certainty."28 Under this exception, the school may disclose information without prior consent to other school officials, including teachers,29 or to vendors who perform a function for the school that would otherwise be performed by a school employee.30 The vendor must have a legitimate educational interest in the data and the school must be in direct control of the vendor's use and maintenance of the data.31 Under these circumstances, the vendor can
24. 20 U.S.C. § 1232g(a)(5)(A); 34 C.F.R. § 99.3. 25. Id. at § 1232g(a)(5)(B). 27. The statute does not define what constitutes "personally identifiable information." 28. 34 C.F.R. § 99.3. 29. Id. at § 99.31(a)(1)(i)(A). 30. Id. at § 99.31(a)(1)(i)(B). 31. Id. at § 99.31(a)(1).
use PII for authorized purposes only, and cannot redisclose the data without permission from the school and written consent from the parent.32 In light of the current trend of heightened violence in schools, two other exceptions under FERPA are worth noting. The first includes situations where an emergency has arisen and PII or information from the education record must be disclosed to protect the health or safety of the student or others.33 The second includes disclosing disciplinary information in the education record that concerns student conduct that has posed a significant risk to the safety or well-being of that student or others.34 Such information may be disclosed to teachers, school officials or others who have a legitimate educational interest in that student's behavior. Whenever an exception applies, information may only be disclosed on the conditions that the recipient of the information not share it with any other party without prior consent of the parent, and the information will only be used for the purpose for which the disclosure was made.35 Whenever a disclosure has been made, the district must make a reasonable attempt to notify the parents and provide them with a copy of the record that was disclosed.36 This has big implications for school districts looking to contract with third-party providers if the third party will or may have access to student records. FERPA also requires that districts keep a record in each student's file of any individual or entity that has accessed or maintained the education record.37 Along with that information, the district must indicate what that individual or entity's legitimate interest was in obtaining the information. Where a third party maintains education records under the School Official exception, the district must be able to provide a requesting parent access to those records.
32. 20 U.S.C. § 1232g(b)(4)(B). 33. Id. at § 1232g(b)(1)(F). 34. Id. at § 1232g(h)(2). 35. 34 C.F.R. § 99.33(a). 36. Id. at § 99.34(a)(1). 37. 20 U.S.C. § 1232g(b)(4)(A).
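Viewed procedurally, the exceptions described above form a decision rule. The sketch below compresses them into a few booleans purely for illustration; it is not a faithful encoding of 20 U.S.C. § 1232g or the accompanying regulations, and all parameter names are invented.

    # Simplified sketch of the FERPA disclosure exceptions discussed above
    # (directory information, school official, health/safety emergency).
    # Illustration only; not legal advice or a complete statement of the law.

    def may_disclose_without_consent(
        is_directory_info: bool,
        parent_opted_out: bool,
        recipient_is_school_official: bool,
        has_legitimate_educational_interest: bool,
        health_safety_emergency: bool,
    ) -> bool:
        if health_safety_emergency:
            return True  # emergency disclosure to protect health or safety
        if is_directory_info and not parent_opted_out:
            return True  # directory information exception, after notice
        if recipient_is_school_official and has_legitimate_educational_interest:
            return True  # school official exception (includes qualifying vendors)
        return False

    # A vendor running a learning app under district control:
    print(may_disclose_without_consent(False, False, True, True, False))  # True
    # Directory information where the parent has opted out:
    print(may_disclose_without_consent(True, True, False, False, False))  # False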
FERPA also requires that regulations governing surveys or data-collecting activities be implemented to protect student privacy.38 This requirement was codified in what is sometimes referred to as the "Hatch Amendment," but is also known as the Protection of Pupil Rights Amendment. There are some significant problems that are not addressed by the FERPA legislation. For these reasons, there is a strong likelihood that a proposal to amend FERPA will arise in Congress in the near future. In fact, during a congressional hearing held in early 2015, the House of Representatives heard testimony from several experts regarding the ways in which FERPA is falling short of protecting student privacy.39 At this time, it appears the House is considering updating FERPA.
2. Protection of Pupil Rights Amendment of 1978 (PPRA)40
PPRA governs the administration of surveys soliciting specific
categories of information, and imposes certain requirements
regarding the collection and use of student information for
marketing purposes. It requires that schools give notice to,
obtain written consent from, and provide an opt-out opportunity
to parents before students can participate in commercial
activities that involve the collection, disclosure or use of
personal information for marketing purposes.41 The exception
to this is if a vendor is using student data solely for the purpose
of developing, evaluating, or providing educational products or
services to students or schools.42 Under the statute, “personal
information” is defined as individually identifiable information
including a student’s or parent’s first and last name, a physical address, a telephone number, or a social security number.43 In addition to the difference in the way PPRA and FERPA
define “personally identifiable information,” some of the other
distinctions
38. Id. at § 1232g(c). 39. Technology and Student Privacy: Hearing Before the H. Comm. on Education and the Workforce, 114th Cong. (2015) (statement of Todd Rokita, Chairman, Subcomm. on Early Childhood, Elementary, and Secondary Education). 40. 20 U.S.C. § 1232h (2002). 41. Id. 42. Id. at § 1232h(c)(4)(A). 43. Id. at § 1232h(c)(6)(E).
296 Journal of Law & Education
[Vol. 44, No. 3
between the two statutes are that FERPA is triggered where a
school simply maintains education records containing
personally identifiable information, while PPRA is only invoked
when personal information is collected from the student. An
example illustrating the difference between the two comes from
the Privacy Technical Assistance Center, a subset of the U.S.
Department of Education. If a district contracts with a provider
under FERPA’s school official exception for basic applications
designed to help educate students, and sets up user accounts
using basic enrollment information, such as name and grade,
then the provider may not use data about individual student
preferences to target ads to individual students. This would be
an unauthorized use of the data, and would not constitute a
legitimate educational interest as specified under FERPA. PPRA
would also prohibit targeted ads unless the district had a policy
addressing this and parents were notified. Parents must also have been given an opportunity to opt out of the marketing.
However, the provider could use data conveying personally
identifiable information to improve their product and its
delivery. Additionally, the provider could use non-PII data to
develop new products and services that the provider could then
market to schools and districts.44 PPRA applies to K-12
students only, while FERPA protections extend to post-
secondary students. Like FERPA, there is no private right of
action under PPRA, so students and parents cannot enforce
compliance with the statute. However, schools that do not
comply jeopardize federal funding.45 Also noteworthy, PPRA
applies to all students, regardless of age, as opposed to the
Children’s Online Privacy Protection Act, discussed in the next
section, which only governs children age thirteen and younger.
44. Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, Privacy Technical Assistance Center (February 2014), http://ptac.ed.gov/. 45. 20 U.S.C. § 1232h(e); see also C.N. v. Ridgewood Bd. of Educ., 146 F. Supp. 2d 528
3. Children’s Online Privacy Protection Act (COPPA)46
COPPA is designed to protect the privacy of children under the age of thirteen. COPPA prohibits commercial website operators from collecting personally identifiable information from children under thirteen without first obtaining “verifiable parental consent.”47 The Code of Federal Regulations defines PII as:
[I]ndividually identifiable information about an individual
collected online, including: a first and last name, a home or
other physical address including street name and name of a city
or town, online contact information [such as email address or
other substantially similar identifier], a screen or user name, a
telephone number, a social security number, a persistent
identifier that can be used to recognize a user over time and
across different Web sites or online services (includes but is not
limited to a customer number held in a cookie, an Internet
Protocol address, a processor or device serial number, or unique
device identifier), a photograph, video or audio file where such file contains a child’s image or voice, geolocation information
sufficient to identify street name and the name of a city or town,
or information concerning the child or the parents of that child
that the operator collects online from the child and combines
with an identifier described above.
Consent must be obtained even when such information is
offered voluntarily. COPPA applies to any website operator or
online service that is directed to children. The Code of Federal
Regulations expounds on some of the factors that are considered
when evaluating whether a website is directed towards children.
These factors include subject matter of the site, visual content, use of animated characters or child-oriented activities, music and other audio content, age of models, presence of child celebrities or celebrities who appeal to children, and whether
advertising promoting or appearing on the website is directed to
children.49 When a website has “actual knowledge that it
collects
46. 15 U.S.C. §§ 6501-6506. 47. Id. at § 6502. 48. 16 C.F.R. § 312.2 (2013). 49. Id.
298 Journal of Law & Education
[Vol. 44, No. 3
personal information directly from users of another website”
that is directed towards children, the first website is also
considered directed towards children for purposes of applying
COPPA.50 COPPA also covers operators who direct their content towards a general audience but have actual knowledge that they are collecting personal information from a child.51 Websites that do not target children as their primary audience are exempt under COPPA if they do not collect personal information from any visitor prior to collecting age information and prevent the use, collection or disclosure of personal information from visitors who identify themselves as under the age of thirteen.52 Website operators who fall under COPPA
regulations must provide notice on their website of their data
collection policies as well as allow parents to review the
personal information collected from a child and allow parents to
refuse further permission of its use.53 Once personal
information is collected, the website operator must protect the
confidentiality, security and integrity of that information and
take reasonable steps to release the information only to third
parties who are capable of maintaining the confidentiality,
security and integrity of the information.54 Any personal
information collected may be retained only for as long as is necessary to fulfill the purpose for which the information was collected. Once that purpose has been fulfilled, the operator
must delete the information using reasonable measures to
protect against unauthorized access to, or use of, the
information.55 Like FERPA and PPRA, there is no private right
of action for children whose information has been collected in
violation of COPPA. However, when a website operator has
violated the Act, a state’s attorney general may bring a civil
action to enjoin the illegal practice and enforce compliance as
well as obtain damages, restitution or other
50. Id. 51. Id. 52. 15 U.S.C. § 6502. 53. Id. at § 312.3. 54. Id. at § 312.8. 55. Id. at § 312.10.
compensation on behalf of residents of the State.56 The statute also allows for “other relief as the court may consider to be appropriate.”57
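The age-screening flow contemplated by the general-audience exemption is straightforward to express in code. The short Python sketch below is an illustration of that flow under assumptions of my own (the function and field names are hypothetical), not legal guidance: it gates all collection behind a neutral age question.

    # Minimal age-gate sketch: ask for age before collecting anything
    # else, and refuse to collect, use, or disclose personal information
    # from visitors who identify themselves as under thirteen.
    def handle_signup(age: int, form_data: dict) -> dict:
        if age < 13:
            # Under the exemption logic described above, nothing may be
            # collected here absent verifiable parental consent.
            return {"status": "blocked", "stored": {}}
        return {"status": "ok", "stored": form_data}

    print(handle_signup(12, {"name": "Sam", "email": "sam@example.com"}))
    print(handle_signup(15, {"name": "Alex", "email": "alex@example.com"}))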
4. Children’s Internet Protection Act (CIPA) & Neighborhood Children’s Internet Protection Act (NCIPA)58
Congress enacted CIPA in response to public library patrons
who were using the library’s Internet to access pornographic
websites. Under CIPA, a public library, or elementary or
secondary school may not receive federal funds under the E-rate
program or the Library Services and Technology Act (LSTA)
unless it has “a policy of Internet safety for minors that includes the operation of a technology protection measure ... that protects against access ... to visual depictions that are obscene or pornographic or harmful to minors.”60 These entities must submit certification that they are complying with the protection measures required by CIPA.61 Failure to do so may result in loss of federal funding.62 One important thing to
note about CIPA is that while it protects minors from exposure
to pornography, it is not designed to protect against other non-visual communications on the Internet. Thus, CIPA leaves minors unprotected from other forms of inappropriate or harmful interactions on the Internet, including data mining and exposure to cyberbullying.63 NCIPA tackles this problem in part by requiring schools and public libraries to adopt and implement an Internet safety policy that addresses the accessibility of inappropriate material on the Internet, the safety and security of
minors while using email and other forms of Internet
communication, and the unauthorized disclosure of personally
56. 15 U.S.C. § 6504 (2012). 57. Id. 58. 20 U.S.C. § 9134(f) (2012); 47 U.S.C. § 254(h) (2012). 59. United States v. Am. Library Ass’n, Inc., 539 U.S. 194, 199 (2003). 60. 20 U.S.C. § 9134(f)(1)(A)(i) (2012). See 47 U.S.C. § 254(h)(5)(B)(i) (2012). 61. 47 U.S.C. § 254(h)(5)(B) (2012). 62. Id. § 254(h)(5)(F) (2012). See 20 U.S.C. § 9134(f)(5) (2012). 63. Frank Kemerer & Peter Sansom, California School Law 81 (Stanford University Press, 3d ed. 2013).
300 Journal of Law & Education
[Vol. 44, No. 3
identifiable information.64 NCIPA also addresses hacking and other unlawful activities by minors online.65
B. Possible Legislation: Student Digital Privacy and Parental Rights Act66
At the time of this writing, U.S. Representatives Jared Polis and
Luke Messer are sponsoring a bill known on Capitol Hill as the
Student Digital Privacy and Parental Rights Act. Although it has not yet been formally introduced in Congress, a draft of the bill has circulated. Due to the criticism the bill has received, the introduction has been postponed.67 Among the bill’s critics are
privacy advocates who feel the bill is not rigorous enough to
adequately protect student privacy, as well as technology groups
who oppose the bill because they favor self- regulation.68 The
current draft of the Student Digital Privacy and Parental Rights
Act appears to allow education technology companies to
continue to collect and compile student information and then
mine that data for commercial gain. Apparently, such companies
may sell students’ personal information to colleges, employers,
and military recruiters.69 The current draft also appears to allow schools to authorize even wider disclosure of student data without parental notification or permission.70 However, the current bill does ban the use of targeted advertising, and it requires that ed-tech companies delete data within 45 days of a school’s request. Making it unique among other federal
student privacy
64. 47 U.S.C. § 254(l) (2012). 65. Id. 66. At this time, a bill has not been formally introduced and there is no bill number. 67. Simon, supra note 5; Benjamin Herold, Federal Student-Data-Privacy Bill Delayed, Education Week (Mar. 23, 2015, 9:33 PM), http://blogs.edweek.org/edweek/DigitalEducation/2015/03/federal_student_data_privacy_bill_delayed.html. 68. [Citation illegible in source] The New York Times. 69. Simon, supra note 5. 70. Id.
laws, the current version of the bill appoints the Federal Trade
Commission to handle enforcement of the law.71 Of course,
there is no guarantee that the Student Digital Privacy and Parental Rights Act will become law, and if it does, there is no
guarantee that any of the elements mentioned above will be
present in the final version of the bill. It is clear, however, that
Congress is alert to the growing concerns of parents and
students regarding student privacy being at risk. Regardless of
whether the Student Digital Privacy and Parental Rights
Act is approved by Congress, the reader should be aware that
there will likely be more legislation to follow.
C. Relevant California Law
The following list is not comprehensive, but it does highlight the breadth of, as well as give substantial insight into, current California laws that affect student privacy:
1. California Constitution
Article 1, section 1 of the California Constitution explicitly
grants privacy as an inalienable right: “All people are by nature
free and independent and have inalienable rights. Among these
are enjoying and defending life and liberty, acquiring,
possessing, and protecting property, and pursuing and obtaining
safety, happiness, and privacy.”72
2. California Civil Code section 1798.2973
Any agency in California that owns or licenses or maintains
computerized data that includes personally identifiable
information (PII) must disclose any breach of the data to every
California resident whose unencrypted personal information is
reasonably believed to have been acquired by an unauthorized
person. Included under the definition of PII is an individual’s first name (or initial) and last name, in combination with one or more designated data elements relating to
71. Id. 72. Cal. Const. art. 1, § 1. 73. Cal. Civ. Code § 1798.29 (West 2014).
302 Journal of Law & Education
[Vol. 44, No. 3
social security numbers, driver’s license numbers, financial
accounts and medical information. As well, a user name or email address, in combination with a password or security question and answer that would permit access to an online account, also qualifies as personal information.
3. California Business and Professions Code section 2258074
This is the California version of COPPA, which went into effect
January 1, 2015. It prohibits the operator of a website, online service, online application, or mobile application from marketing or advertising certain products to a minor. Such products include alcohol, firearms, aerosol containers capable of defacing property, tobacco products and electronic cigarettes, salvia divinorum, fireworks, tanning devices, dietary supplements, lottery tickets, tattoos, drug paraphernalia and obscene matter. It prohibits an operator from knowingly using, disclosing, compiling, or allowing a third party to use, disclose or compile, the personal information of a minor for the purpose of marketing or advertising products. The prohibition is also applicable to an advertising service that has been notified by the website operator that the site is directed to minors. One thing that distinguishes this piece of legislation from COPPA is that operators of mobile applications are definitively obligated under this law, whereas COPPA is less clear as to whether operators of mobile applications fall within the confines of the law. This
law also authorizes a minor to have content or information
removed.
4. California Online Privacy Protection Act (sections 22575-22579)75
Section 22575 of the Business and Professions Code states that
commercial website operators and online services that collect
PII about customers must post a conspicuous privacy policy on
their website. The policy must identify the types of data being
collected and the types of third
74. Cal. Bus. & Prof. Code § 22580 (West 2015) 75. Id. §§
22575-22579 (West 2014).
parties with whom the information may be shared. Section
22576 states that website operators are in violation of this
section if the operator knowingly and willfully or negligently
and materially violates the statute. PII includes first and last
name, physical address, email address, telephone number, social
security number, and any other identifier that permits the
physical or online contact of a specific individual.
5. New Legislation: Student Online Personal Information Protection Act (SOPIPA) (Cal. Bus. & Prof. Code sections 22584-22585)76
Senate Bill 1177 brought this newly enacted piece of legislation
into existence, effective on January 1, 2016. It will prohibit an
operator of a website, online service, online application, or
mobile application from knowingly engaging in targeted
advertising to students or their parents or legal guardians. Such
operators will be prohibited from using covered information to
collect student profiles, selling student information, and
disclosing covered information. It requires operators to
implement and maintain reasonable security procedures and
practices to protect the information from unauthorized access,
and it requires operators to delete student data if requested to
do so by a school or district.
6. California Education Code:
Section 4906277 gives authority to the State Board of Education
to establish the standards under which school districts will
establish, maintain, and destroy student records. Section
35182.578 states that a school may not enter into a contract for
electronic products or services that requires the dissemination
of advertising to students unless the electronic product or
service would be an integral educational component and the
district cannot afford to provide the product or service without
contracting to permit dissemination of advertising. In such a
case, the school must provide written notice to parents that the
advertising will be used and parents
76. S.B. 1177 (Cal. 2014), http://www.leginfo.ca.gov/pub/13-14/bill/sen/sb_1151-1200/sb_1177_bill_20140929_chaptered.pdf. 77. Cal. Educ. Code § 49062 (West 2014). 78. Id. § 35182.5 (West 2014).
304 Journal of Law & Education
[Vol. 44, No. 3
must be given the opportunity to request in writing that the
student not be exposed to the program containing the
advertising.
7. Privacy of Pupil Records (sections 49073-49079.7)79
Similar to requirements under FERPA, California law says
schools must adopt a policy and give annual notice identifying
which information will be defined as “directory information”
and thus will be subject to release. Parents have the right to
prohibit directory information from being released if they notify
the school. Additionally, no information, including directory information, regarding a student identified as homeless may be released without consent. As well, no directory information
about any student may be released to a private profit-making
entity.80 Parents may give written consent to allow access to
student records to anyone of their choosing. Those who have
permission to access records must be given notice that the
transmission of the information to others without written
consent from the parent is prohibited.81 Districts cannot permit
access to student records to anyone without parental consent
except as established in the regulations under FERPA. This
section includes a few exceptions to who may obtain records
without parental consent, including school officials and
employees, and students sixteen years of age or older or who have completed 10th grade.82 Students who are fourteen years of
age or older who are either homeless or unaccompanied (42
USC section 11434a) are also entitled to their own records.
Schools may release records when there is an emergency and the
information is necessary to protect the health or safety of the
student or others. The statute also allows information to be
released to organizations that conduct studies for educational
agencies for the purpose of developing programs or tests or improving instruction.
79. Id. §§ 49073-49079.9 (West 2014). 80. Id. § 49073(a) (West 2014). 81. Id. § 49075(a) (West 2002). 82. Id. § 49076(a)(2)(E) (West 2014).
Districts must provide teachers with information when a student
has engaged in, or is reasonably suspected to have engaged in,
acts that warrant suspension or expulsion under the education code.83 Information provided to teachers in these instances will
come from records that the district maintains in its ordinary
course of business.
8. New Legislation: Cal. Educ. Code section 49073.6
Assembly Bill 1442 went into effect on January 1, 2015, and
requires schools to contact parents when considering the
implementation of a program to gather or maintain, as part of the education record, information about a student obtained from social media. Parents must be informed of the proposed program and provided with an opportunity to offer public comment before any such program may be adopted. This section also
requires the destruction of such information one year after a
student reaches the age of eighteen, or within one year after the
student is no longer enrolled in the district.
9. New Legislation: Cal. Educ. Code section 49073.1
Assembly Bill 1584 also went into effect on January 1, 2015,
and allows schools to enter into contracts with third parties to
provide the digital storage, management, and retrieval of
student records. Included in such contracts, there must be a
statement declaring that student records continue to be the
property of and under the control of the district, along with a
description of the actions the third party will take to ensure the
security and confidentiality of student records and a description
of how the district will ensure compliance with FERPA.
Contracts failing to comply with these requirements are void.
The law does not apply to any contracts existing prior to
January 1, 2015, unless or until they expire, are amended or are
renewed.
83. Id. § 49079(a) (West 2001). 84. Id. § 49073.6 (West 2015). 85. Id. § 49073.1 (West 2015).
306 Journal of Law & Education
[Vol. 44, No. 3
II. WHO IS COLLECTING DATA, HOW IS IT BEING
COLLECTED, AND WHY DOES IT MATTER?
As districts begin to incorporate technology into their schools,
especially where one-to-one programs are in place,
understanding the actions that expose students to risk is
imperative. There are three main groups that are interested in data collection: advertisers, criminals, and the government.86 Each group wants different information for varying reasons, but they are all using technology to access the information they are seeking, and are jeopardizing students’ privacy in the process. Of course, as individual consumers, we are all at risk of privacy invasion when using the Internet, but students in particular are a vulnerable group because of their exposure to online dealings both at home and at school, as well as their maturity level and lack of inhibition with regard to sharing personal information online.87 The following sections will discuss how each of the three main data collectors (advertisers, criminals and government) uses technology to infiltrate technology users’ lives and gather private data.
A. Advertisers
Advertisers want to reach large audiences efficiently and effectively. They do this using a process called “targeted advertising,” which allows marketers to place advertisements in front of groups of consumers based on various traits such as demographics and other behavioral variables. This kind of behavioral targeting uses information gathered about individual consumers’ habits so as to gain insights about the types of products and services that an individual is likely to purchase. Then marketers can strategically place advertisements in front
of an individual that are specifically tailored to that individual.
One of the most efficient ways to collect data about individual
consumers is by
86. Fact Sheet 2b: Privacy in the Age of the Smartphone, Privacy Rights Clearinghouse, https://www.privacyrights.org/smartphone-cell-phone-privacy (last updated Feb.). 87. Reasonable Expectations of Privacy for Youth in a Digital Age, 80 Miss. L.J. 1035 (2011).
tracking their activity online. Thus, a quick overview of how
data miners are able to track activity is necessary to better
understand how advertisers work and why.
1. IP Addresses
In the past, every digital device was designed to have its own
Internet Protocol (IP) address, which could be used to identify
the web activities of each device. When signing up with an
Internet service provider (ISP), a record connecting a
computer’s IP address to an individual or an entity was created.
By simply browsing the Internet, a record was created for every
website the device visited.88 More devices are entering society, and the average American now owns four devices.89 This means
there are more connected devices than there are unique public
IP addresses.90 To resolve the problem, a public IP address is
assigned to a router, which then shares the address among all
other computers and devices that access Internet via that
particular router.91 This process is sometimes referred to as “IP
tethering” because it ties all devices using the same router
together. Local IP addresses are then issued to devices that use
the router to make them distinguishable from each other.
Generally, a device needs to access the Internet more than once
for the device to be tied to the public IP address. Regardless of
whether the individual device or the router is given the public
IP address, information can be tracked back to the individual or
entity that is registered as owning that IP address. In the case of
a
88. Fact Sheet 18: Online Privacy: Using the Internet Safely, Privacy Rights Clearinghouse, https://www.privacyrights.org/online-privacy-using-internet-safely (last updated Oct. 2014). 89. The U.S. Digital Consumer Report, Nielsen, http://www.nielsen.com/content/corporate/us/en/insights/reports/2014/the-us-digital-consumer-report.html (last updated Feb. 2014). 90. Chris Hoffman, How and Why All Devices in Your Home Share One IP Address, How-To Geek, http://www.howtogeek.com/148664/how-and-why-all-devices-in-your-home-share-one-ip-address/ (last updated Apr. 15, 2013). 92. Gavin Dunaway, ID Is Key: Unlocking Mobile Tracking & Cross-Device Measurement, Part 2, AdMonsters, https://www.admonsters.com/blog/id-key-unlocking-mobile-tracking-cross-device-measurement-part-2 (last updated Aug. 3, 2013).
308 Journal of Law & Education
[Vol. 44, No. 3
district-owned device, the IP address would be registered to the district, and thus the connection between IP address and owner would not immediately implicate a student who has been assigned to that device. However, as is brought to light in the subsequent sections, due to advances in technology, this is a specious conviction.
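The shared-address arrangement just described is easy to observe directly. The short Python sketch below is an illustration under stated assumptions (outbound Internet access and the public IP-echo service at https://ifconfig.me): it prints the local, router-assigned address a device uses and the single public address that remote websites actually see; every device behind the same router reports a different local address but the same public one.

    # A device's local address versus the public address websites see.
    import socket
    import urllib.request

    def local_ip() -> str:
        # Connecting a UDP socket sets a destination without sending
        # packets; getsockname() then reveals the router-assigned local
        # address this device uses.
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.connect(("8.8.8.8", 80))
            return s.getsockname()[0]

    def public_ip() -> str:
        # A remote server only ever sees the router's public address,
        # shared by every device in the household.
        with urllib.request.urlopen("https://ifconfig.me/ip") as resp:
            return resp.read().decode().strip()

    print("local (per-device) address:", local_ip())
    print("public (shared) address:", public_ip())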
2. Search Engines
Data collection does not end with IP addresses, however. Search
engines, such as Google, Yahoo and Bing, are programs that are
designed to search documents on the Internet for specific
keywords and then generate a list of documents where the
keywords were found.93 Search engines have the ability to not
just track an IP address, but also the terms that were used in the
search, the time that the search occurred as well as other data.94
This is where data collection can begin to implicate individual
students, regardless of to whom the device is registered. If
search terms reveal personal information, for example a name
and a social security number, then the search engine will have a
record of that information connecting it to the user of the
device. When an individual uses the same entity for a search
engine and email accounts, for example, Google and Gmail, the
search engine can then tie the email login to the searches as
well.95 As this occurs, data collection becomes no longer tied
to the registered owner of the device, but rather is tied to the
individual user.
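What a search engine can retain from a single query is worth seeing concretely. The Python sketch below is illustrative only; the field names are assumptions of mine, not any search engine's actual schema. It shows how one log record ties together the query terms, the time, the device's IP address and, once an email login is present, the individual user.

    # One hypothetical query-log record; field names are illustrative.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class QueryLogEntry:
        ip_address: str         # identifies the device or its router
        logged_in_account: str  # ties the search to a person, not a device
        query: str              # may itself contain personal information
        timestamp: datetime

    entry = QueryLogEntry(
        ip_address="203.0.113.7",            # documentation-range address
        logged_in_account="student@example.com",
        query="jane doe 123-45-6789",        # PII typed into a search box
        timestamp=datetime.now(timezone.utc),
    )
    print(entry)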
3. Cookies
“Cookies” are another tracking tool that websites deposit on a
device’s hard drive to record and save information. The cookie
enables the web browser to retain information such as login,
user preferences,
93. Vangie Beal, search engine, Webopedia,
http://www.webopedia.eom/TERM/S/ search_engme.html (last
visited Sept. 30, 2014). 94. Fact Sheet 18, supra note 88. 95. Id.
rpre9' lS <he difference between a web browser and a search
engine?, The Computer SiTawi ^^//^r/W COmputer-geek
net/what-is-the-difference-be-va-47.html (last visited Oct. 1 -
14) (A web browser is different from a search engine in that the
browser is the tool used to
shopping cart information and other personalized information
that bolster the ease and speed of digital interactions. Although
this can be convenient for the user, some cookies also track and
collect information about the user, which is then communicated to advertisers and online marketers.97 Websites such as
www.Ghostery.com allow users to identify and block companies
that are tracking through the use of cookies. Moreover,
smartphones, which are being increasingly used to access
webpages, do not use cookies. This combination has caused
cookies to lose appeal to data miners who are trying to stay
ahead of consumers. Instead, data miners are now turning to
what is known as fingerprinting technology.
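The deposit-and-return mechanics described above can be sketched in a few lines. The example below uses Python's standard http.cookies module; the domain and identifier values are invented for illustration. A site sets a persistent identifier on the first visit, and the browser returns it on every later request, which is what lets a network stitch separate page views into one browsing history.

    # A site deposits a persistent identifier; the browser returns it
    # with every later request to the same domain.
    from http import cookies

    jar = cookies.SimpleCookie()
    jar["visitor_id"] = "a1b2c3d4"
    jar["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # persist for a year
    jar["visitor_id"]["domain"] = ".example-adnetwork.com"

    # Sent by the server on the first visit:
    print(jar.output())  # Set-Cookie: visitor_id=a1b2c3d4; ...

    # Sent back by the browser on every subsequent visit:
    print("Cookie: visitor_id=a1b2c3d4")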
4. Fingerprinting
Similar to cookies, but far more surreptitious, are “fingerprints,” which summarize the software and hardware settings (like clock settings, fonts, installed software, etc.) that are unique to each computer.98 Each time the device connects to the Internet, these details are collected and pieced together to form a “fingerprint.” The more changes to a device’s software and settings, the more identifiable the device becomes.99 The fingerprint can be assigned an identifying number that is used to track the device in the same way cookies are used. Fingerprints, unlike cookies—which can be detected and removed from a hard drive—are impossible to detect and much tougher to block. This makes them a preferable tracking method for data miners, and thus fingerprinting is being used with increasing frequency in place of antiquated cookies. To identify the details a computer or other
device is transmitting, visit www.panopticlick.eff.org.
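The core of the technique is nothing more than hashing a bundle of ordinary settings into a stable identifier. The Python sketch below is illustrative; the attribute list is an assumption of mine, and real trackers use many more signals. It shows why nothing needs to be stored on the device for the identifier to survive cookie deletion.

    # Hash a bundle of ordinary settings into a stable identifier.
    import hashlib
    import json

    device_attributes = {
        "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "screen_resolution": "1920x1080",
        "timezone_offset_minutes": -480,
        "installed_fonts": ["Arial", "Calibri", "Comic Sans MS"],
        "plugins": ["PDF Viewer"],
    }

    # Serialize deterministically, then hash. The digest plays the role
    # of the "identifying number" mentioned above, and it survives
    # cookie deletion because nothing is stored on the device at all.
    canonical = json.dumps(device_attributes, sort_keys=True)
    fingerprint = hashlib.sha256(canonical.encode()).hexdigest()
    print("device fingerprint:", fingerprint[:16])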
97. Fact Sheet 18, supra note 88. 98. Adam Tanner, The Web Cookie Is Dying. Here’s The Creepier Technology That Comes Next, Forbes, http://www.forbes.com/sites/adamtanner/2013/06/17/the-web-cookie-is-dying-heres-the-creepier-technology-that-comes-next/ (last updated Jun. 17, 2013).
99. Id.
310 Journal of Law & Education
[Vol. 44, No. 3
5. Householding
Technology100 has now advanced to the point where the combination of IP tethering and fingerprinting can be used to track across multiple devices, so that they are all associated with one user or one household, and layer all the information collected into one profile.101 This process, called “householding,” is also sometimes referred to as “multi-screen” or cross-device identification. In the past, unique identifiers
were linked to individual computers, but now identifiers are
more closely tied to the individual using the device.102 More
than any other tracking method, householding has major
implications for school districts with one-to-one technology programs, regardless of whether the program offers district-owned devices or operates through a Bring-Your-Own-Device (BYOD)
program. Householding allows data collection to occur across
devices regardless of whether or not they are owned by a
district or owned privately by a student, and thus any activity
that a student engages in at home on a personal device can be
linked to activity conducted by that student at school.
Householding is able to identify every device the student uses,
regardless of who owns the device, where it is used, and for
what purpose. Therefore the line between data collected from
the student on the student’s own device on his or her own time,
and data collected from the student while using a district device
at school is virtually eliminated. Data miners will not distinguish data collected at school from data collected at home. Nor will they distinguish data collected on a district device from data collected on a personal device. All
devices are capable of being linked to one individual and all
data collected across devices can be compounded and cross-
referenced to create a broader image of who an individual user
is.
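A toy version of this cross-device linking makes the mechanics plain. In the Python sketch below, every name, address, and URL is invented for illustration: events from a school laptop, a home tablet, and a never-logged-in home phone fold into a single profile once they share either a login or a household IP address.

    # Fold per-device browsing events into one household/user profile.
    from collections import defaultdict

    events = [
        {"device": "school-laptop-42", "public_ip": "198.51.100.9",
         "account": "kid@example.com", "url": "mathgame.example"},
        {"device": "home-tablet", "public_ip": "203.0.113.5",
         "account": "kid@example.com", "url": "toystore.example"},
        {"device": "home-phone", "public_ip": "203.0.113.5",
         "account": None, "url": "sneakers.example"},
    ]

    # Pass 1: learn which household IPs have ever been seen with a login.
    ip_to_account = {e["public_ip"]: e["account"] for e in events if e["account"]}

    # Pass 2: attribute every event to a person, even from devices that
    # never logged in, by falling back to the shared household IP.
    profile = defaultdict(list)
    for e in events:
        owner = e["account"] or ip_to_account.get(e["public_ip"], e["public_ip"])
        profile[owner].append((e["device"], e["url"]))

    for owner, activity in profile.items():
        print(owner, "->", activity)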
100. See Cross-Screen Starts Here, BlueCava, http://bluecava.com/about (last visited Oct. 1, 2014). See also Specific Media Launches Householding, Specific Media, http://specificmedia.com/specific-media-launches-householding-2/ (last updated Sept. 24, 2012). 101. Dunaway, supra note 92. 102. Id.
6. Smartphones and Tablets
Smartphones, like computers, have their own IP address, and
thus can be tracked in the same manner that a computer can be
tracked. Service providers for smartphones are also able to
collect information about incoming and outgoing calls
(including phone numbers and duration), how frequently users
check email or access the Internet from the phone, and the
user’s location.103 Most individuals who use smartphones (and
tablets) use mobile applications (apps), rather than an Internet browser, to access the Internet and its features. Apps can
provide a wide variety of functions ranging from mapping
services, to games, to flashlights—all of which make them
attractive to users. The apps often have a dual function, however. Naturally, they deliver the content or service they are designed to provide, but they are also advertising gold mines.
Particularly when an app is free to download, developers make
money either by selling space within the app to advertisers who
can market to a captive audience, or by collecting data from the
unsuspecting user and then selling it to marketing agencies.
Generally, the type of data collected includes phone and email
contacts, call logs, Internet data, calendar data, data about the device’s location, the device’s unique IDs and information
about how the user engages with the app itself.105 Some apps
have a user agreement alerting users to their data collection
policies, but the privacy policy (if it exists at all) does not
always identify what data are being collected, how long the data
are being stored, where the data are being stored, and who may
ultimately end up receiving the data.106 Thus, these apps are often unleashed on users who are unaware their data are being harvested. For example, Angry Birds, one of the first games made for smartphones,
103. Fact Sheet 2B, supra note 86. 104. Understanding Mobile Apps, OnGuard Online, http://www.onguardonline.gov/articles/0018-understanding-mobile-apps (last visited Oct. 1, 2014). 105. Id. 106. FPF Finds Nearly Three-Quarters of Most Downloaded Mobile Apps Lack a Privacy Policy, Future of Privacy Forum, http://www.futureofprivacy.org/2011/05/12/fpf-finds-nearly-three-quarters-of-most-downloaded-mobile-apps-lack-a-privacy-policy (last updated May 12, 2011) (The Future of Privacy Forum found that nearly 75% of downloadable apps lack a privacy policy).
312 Journal of Law & Education [Vol. 44, No. 3
collects personal information such as location data from users
that is unnecessary to run the game.107 In fact, in a 2013
Carnegie Mellon study, over half of the top 100 Android
applications accessed the device ID, contacts lists or location
data. The study found that this was often contrary to user
expectation.108 In a similar investigation, the Wall Street
Journal found that almost half of 101 smartphone apps were
tracking the phone’s location.109 Fifty-six shared the phone’s unique ID number, forty-seven transmitted the phone’s location, and five shared the user’s age, gender and other personal details such as contacts list.110 Victims of this kind of privacy
invasion extend beyond just the users of mobile apps. Any of
the users’ contacts are also at risk of having information
collected. Of course, contact information such as phone number
and email address will be collected if the users’ address book is
accessed, but any information offered in an email or text
message may be intercepted and collected as well. Districts with
one-to-one technology programs need to be careful when
students download and use apps for classroom or homework
assignments. FERPA does not appear to cover any information
collected by a third-party app operator, and COPPA only
protects students up to age thirteen, so student information
accessed in this manner can be easily collected without federal
violation. Thus, the question arises, whose responsibility is it to
protect students from exposure to data collection in this
situation? Do parents or students have the right to opt out of an
assignment that requires the use of a mobile app? How can
students who wish to maintain anonymity protect themselves?
What rights do they have? The answers to these questions
remain unresolved. However, as more legislation is
107. Cheryl Conner, Your Privacy Has Gone to the [Angry] Birds, Forbes, http://www.forbes.com/sites/cherylsnappconner/2012/12/05/your-privacy-has-gone-to-the-angry-birds/ (last updated Dec. 5, 2012). 108. Press Release: Did Your Smartphone Flashlight Rat You Out? Crowdsourcing Privacy Concerns of Mobile Apps, Carnegie Mellon University, http://www.cmu.edu/news/stories/archives/2015/january/jan15_appprivacyconcerns.html (last updated Jan. 15, 2015). 109. Fact Sheet 18, supra note 88. 110. What They Know - Mobile, Wall Street Journal (Oct. 1, 2014, 7:56 AM), http://blogs.wsj.com/wtk-mobile/.
introduced and as case law on the subject develops, answers
will surely begin to emerge.
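The dual function described above amounts, on the wire, to an app bundling device data alongside its advertised feature. The Python sketch below is purely illustrative; every field name and value is invented, not any real app's traffic. It shows the kind of payload a free app could assemble for an ad network.

    # The kind of payload a "free" app could assemble for an ad network.
    import json

    def build_analytics_payload(device: dict) -> str:
        return json.dumps({
            "device_id": device["device_id"],           # unique hardware ID
            "location": device["last_known_location"],  # where the user is
            "contacts_sample": device["contacts"][:3],  # other people's data
            "app_events": device["session_events"],     # in-app behavior
        })

    device = {
        "device_id": "IMEI-356938035643809",
        "last_known_location": {"lat": 34.05, "lon": -118.24},
        "contacts": ["mom", "coach", "lab partner"],
        "session_events": ["level_1_complete", "ad_clicked"],
    }

    # A real app would POST this to an ad-network endpoint alongside
    # delivering its advertised function.
    print(build_analytics_payload(device))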
7. The Cloud

As technology advances, the trend is to move information storage and hosting from tangible dedicated servers to cloud computing. The term “cloud” is simply another name for the Internet, although this term is constantly evolving.111 In
general, cloud-computing platforms allow users to access
software applications through web browsers that enable them to
store files and email on the Internet.112 Cloud computing is
attractive to school districts because it requires less
maintenance, and is cheaper and more efficient. Cloud storage
allows users to access files from any device that has Internet
access, thus the cloud can be used to increase student access to
learning activities and enhance the ability of students to
collaborate with each other or receive feedback from teachers.
However, the cloud has also opened the door to advertisers
looking to access personal information.113 One of the problems
with cloud computing is that most cloud providers who don’t
charge for their services are funding their services through
advertising to their clients or through data mining. For example,
Google, Yahoo, and Hotmail are all email providers that use the
cloud, but they are able to provide free services because of the
money they make by selling user information to data collectors
and by selling targeted advertisements created from the
collected data. The major problem with this is that even though
most companies promise to remove identifying information
from data when they mine it, once the data has been transferred
to a marketer, data aggregators can recombine information to
re-identify an individual.116 Most people are aware that
marketers have the ability to track online interactions. However,
the
111. Andrea Cascia, Don’t Lose Your Head in the Cloud: Cloud Computing and Directed Marketing Raise Student Privacy Issues in K-12 Schools, 261 Ed. L. Rep. 883, 883 (2011). 112. Id. (Technically, files and emails are stored in offsite servers). 113. Id. at 885. 114. Id. 115. Id. at 885. 116. Id.
314 Journal of Law & Education
[Vol. 44, No. 3
aggregation of data compiled from pieces collected in various
places creates a substantial issue: it can turn non-personally
identifiable information (non-PII) into personally identifiable
information (PII), virtually eliminating the distinction. There is
a wide misperception that if specific identifiable information,
such as name, address, or social security number, is not
voluntarily given, an individual can operate online anonymously.117 Unfortunately, the problem is not unique to cloud computing and exists any time data collection is occurring. For example, the combination of three seemingly innocuous, and relatively available, pieces of information—gender, zip code, and date of birth—is sufficient to accurately identify 87% of individuals in the United States.118 When this
concept is applied to data mining through the collection of a
student’s web search terms and other text from messages and
online documents, the aggregation of data can combine to form
a well-developed profile of the individual.119 By the time the information is sold to an advertiser, there is no sense of anonymity.
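The re-identification step is, mechanically, a simple join. The Python sketch below uses toy records with field names of my own choosing: it matches an "anonymized" usage dataset against a public record on just gender, zip code, and date of birth, the same three fields the text cites, and recovers a name.

    # Join "anonymized" usage data to a public record on three
    # quasi-identifiers. All records are invented for illustration.
    anonymized_usage = [
        {"gender": "F", "zip": "90210", "dob": "2001-04-02",
         "searches": ["asthma inhaler", "summer camp"]},
        {"gender": "M", "zip": "10001", "dob": "1999-11-30",
         "searches": ["sat prep"]},
    ]

    public_records = [  # e.g., a voter roll or a school directory
        {"name": "Jane Doe", "gender": "F", "zip": "90210", "dob": "2001-04-02"},
    ]

    quasi_identifiers = ("gender", "zip", "dob")
    for usage in anonymized_usage:
        for person in public_records:
            if all(usage[k] == person[k] for k in quasi_identifiers):
                print(person["name"], "re-identified; searches:", usage["searches"])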
8. The Problems With PII
The ability to aggregate data to compile a portrait of an
identifiable individual is particularly problematic for schools
because of the various restrictions on PII in federal legislation.
As seen in the first section, many laws, including FERPA, PPRA, and COPPA, center on what constitutes “personally identifiable information.” Under FERPA, PII cannot be shared without parental consent; under PPRA, proper notification must be given before PII may be collected for commercial purposes; COPPA prevents the collection of PII, without consent, from children under the age of thirteen. Therefore, whenever a student’s personal information is at issue, some law is likely to apply. Identifying which laws apply can be difficult, especially when the
117. Paul M. Schwartz and Daniel J. Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 N.Y.U. L. Rev. 1814, 1836 (2011). 118. Id. at 1842 (citing Latanya Sweeney, Simple Demographics Often Identify People Uniquely (Carnegie Mellon Univ., Sch. of Computer Sci., Data Privacy Lab., Working Paper No. 3, 2000)). 119. Cascia, supra note 111, at 885.
collected information might not become PII until after the
transaction at issue has occurred. Some of the laws have
broadened their definitions of PII to include data that are not
traditionally linkable to a student, such as geolocation.
However, each law has its own definition of PII, making it
harder to keep track of when a transaction might trigger federal
or state law.
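The difficulty of tracking which statute a given data element triggers can be seen even in a toy model. The Python sketch below uses deliberately simplified, non-authoritative summaries of each statute's PII definition; it illustrates the patchwork problem and is not a statement of the law.

    # Simplified, non-authoritative sketches of each statute's PII
    # definition, showing how the same element can be covered by one
    # law and not another.
    PII_ELEMENTS = {
        "FERPA": {"name", "address", "ssn", "indirect_identifiers"},
        "PPRA": {"name", "address", "phone", "ssn"},
        "COPPA": {"name", "address", "phone", "ssn", "geolocation",
                  "persistent_identifier", "photo"},
    }

    def statutes_implicated(element: str) -> list:
        return [law for law, elements in PII_ELEMENTS.items()
                if element in elements]

    print(statutes_implicated("geolocation"))  # only COPPA in this toy model
    print(statutes_implicated("ssn"))          # all three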
A startling example of how aggregation can convert non-PII
into PII was exposed by two computer scientists who
demonstrated that some people could be identified simply based
Astronomy HWIn 250-300 words,What was Aristarchus idea of the.docx
cockekeshia
 
Astronomy ASTA01The Sun and PlanetsDepartment of Physic.docx
Astronomy ASTA01The Sun and PlanetsDepartment of Physic.docxAstronomy ASTA01The Sun and PlanetsDepartment of Physic.docx
Astronomy ASTA01The Sun and PlanetsDepartment of Physic.docx
cockekeshia
 
Astronomers have been reflecting laser beams off the Moon since refl.docx
Astronomers have been reflecting laser beams off the Moon since refl.docxAstronomers have been reflecting laser beams off the Moon since refl.docx
Astronomers have been reflecting laser beams off the Moon since refl.docx
cockekeshia
 
Astrategicplantoinformemergingfashionretailers.docx
Astrategicplantoinformemergingfashionretailers.docxAstrategicplantoinformemergingfashionretailers.docx
Astrategicplantoinformemergingfashionretailers.docx
cockekeshia
 
Asthma, Sleep, and Sun-SafetyPercentage of High School S.docx
Asthma, Sleep, and Sun-SafetyPercentage of High School S.docxAsthma, Sleep, and Sun-SafetyPercentage of High School S.docx
Asthma, Sleep, and Sun-SafetyPercentage of High School S.docx
cockekeshia
 
Asthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docx
Asthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docxAsthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docx
Asthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docx
cockekeshia
 
Assumption-Busting1. What assumption do you have that is in s.docx
Assumption-Busting1.  What assumption do you have that is in s.docxAssumption-Busting1.  What assumption do you have that is in s.docx
Assumption-Busting1. What assumption do you have that is in s.docx
cockekeshia
 
Assuming you have the results of the Business Impact Analysis and ri.docx
Assuming you have the results of the Business Impact Analysis and ri.docxAssuming you have the results of the Business Impact Analysis and ri.docx
Assuming you have the results of the Business Impact Analysis and ri.docx
cockekeshia
 
Assuming you are hired by a corporation to assess the market potenti.docx
Assuming you are hired by a corporation to assess the market potenti.docxAssuming you are hired by a corporation to assess the market potenti.docx
Assuming you are hired by a corporation to assess the market potenti.docx
cockekeshia
 
Assuming that you are in your chosen criminal justice professi.docx
Assuming that you are in your chosen criminal justice professi.docxAssuming that you are in your chosen criminal justice professi.docx
Assuming that you are in your chosen criminal justice professi.docx
cockekeshia
 
assuming that Nietzsche is correct that conventional morality is aga.docx
assuming that Nietzsche is correct that conventional morality is aga.docxassuming that Nietzsche is correct that conventional morality is aga.docx
assuming that Nietzsche is correct that conventional morality is aga.docx
cockekeshia
 

More from cockekeshia (20)

at least 2 references in each peer responses! I noticed .docx
at least 2 references in each peer responses! I noticed .docxat least 2 references in each peer responses! I noticed .docx
at least 2 references in each peer responses! I noticed .docx
 
At least 2 pages longMarilyn Lysohir, an internationally celebra.docx
At least 2 pages longMarilyn Lysohir, an internationally celebra.docxAt least 2 pages longMarilyn Lysohir, an internationally celebra.docx
At least 2 pages longMarilyn Lysohir, an internationally celebra.docx
 
At least 2 citations. APA 7TH EditionResponse 1. TITop.docx
At least 2 citations. APA 7TH EditionResponse 1. TITop.docxAt least 2 citations. APA 7TH EditionResponse 1. TITop.docx
At least 2 citations. APA 7TH EditionResponse 1. TITop.docx
 
At each decision point, you should evaluate all options before selec.docx
At each decision point, you should evaluate all options before selec.docxAt each decision point, you should evaluate all options before selec.docx
At each decision point, you should evaluate all options before selec.docx
 
At an elevation of nearly four thousand metres above sea.docx
At an elevation of nearly four thousand metres above sea.docxAt an elevation of nearly four thousand metres above sea.docx
At an elevation of nearly four thousand metres above sea.docx
 
At a minimum, your outline should include the followingIntroducti.docx
At a minimum, your outline should include the followingIntroducti.docxAt a minimum, your outline should include the followingIntroducti.docx
At a minimum, your outline should include the followingIntroducti.docx
 
At least 500 wordsPay attention to the required length of these.docx
At  least 500 wordsPay attention to the required length of these.docxAt  least 500 wordsPay attention to the required length of these.docx
At least 500 wordsPay attention to the required length of these.docx
 
At a generic level, innovation is a core business process concerned .docx
At a generic level, innovation is a core business process concerned .docxAt a generic level, innovation is a core business process concerned .docx
At a generic level, innovation is a core business process concerned .docx
 
Asymmetric Cryptography•Description of each algorithm•Types•Encrypt.docx
Asymmetric Cryptography•Description of each algorithm•Types•Encrypt.docxAsymmetric Cryptography•Description of each algorithm•Types•Encrypt.docx
Asymmetric Cryptography•Description of each algorithm•Types•Encrypt.docx
 
Astronomy HWIn 250-300 words,What was Aristarchus idea of the.docx
Astronomy HWIn 250-300 words,What was Aristarchus idea of the.docxAstronomy HWIn 250-300 words,What was Aristarchus idea of the.docx
Astronomy HWIn 250-300 words,What was Aristarchus idea of the.docx
 
Astronomy ASTA01The Sun and PlanetsDepartment of Physic.docx
Astronomy ASTA01The Sun and PlanetsDepartment of Physic.docxAstronomy ASTA01The Sun and PlanetsDepartment of Physic.docx
Astronomy ASTA01The Sun and PlanetsDepartment of Physic.docx
 
Astronomers have been reflecting laser beams off the Moon since refl.docx
Astronomers have been reflecting laser beams off the Moon since refl.docxAstronomers have been reflecting laser beams off the Moon since refl.docx
Astronomers have been reflecting laser beams off the Moon since refl.docx
 
Astrategicplantoinformemergingfashionretailers.docx
Astrategicplantoinformemergingfashionretailers.docxAstrategicplantoinformemergingfashionretailers.docx
Astrategicplantoinformemergingfashionretailers.docx
 
Asthma, Sleep, and Sun-SafetyPercentage of High School S.docx
Asthma, Sleep, and Sun-SafetyPercentage of High School S.docxAsthma, Sleep, and Sun-SafetyPercentage of High School S.docx
Asthma, Sleep, and Sun-SafetyPercentage of High School S.docx
 
Asthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docx
Asthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docxAsthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docx
Asthma DataSchoolNumStudentIDGenderZipDOBAsthmaRADBronchitisWheezi.docx
 
Assumption-Busting1. What assumption do you have that is in s.docx
Assumption-Busting1.  What assumption do you have that is in s.docxAssumption-Busting1.  What assumption do you have that is in s.docx
Assumption-Busting1. What assumption do you have that is in s.docx
 
Assuming you have the results of the Business Impact Analysis and ri.docx
Assuming you have the results of the Business Impact Analysis and ri.docxAssuming you have the results of the Business Impact Analysis and ri.docx
Assuming you have the results of the Business Impact Analysis and ri.docx
 
Assuming you are hired by a corporation to assess the market potenti.docx
Assuming you are hired by a corporation to assess the market potenti.docxAssuming you are hired by a corporation to assess the market potenti.docx
Assuming you are hired by a corporation to assess the market potenti.docx
 
Assuming that you are in your chosen criminal justice professi.docx
Assuming that you are in your chosen criminal justice professi.docxAssuming that you are in your chosen criminal justice professi.docx
Assuming that you are in your chosen criminal justice professi.docx
 
assuming that Nietzsche is correct that conventional morality is aga.docx
assuming that Nietzsche is correct that conventional morality is aga.docxassuming that Nietzsche is correct that conventional morality is aga.docx
assuming that Nietzsche is correct that conventional morality is aga.docx
 

Recently uploaded

Unit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdfUnit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Thiyagu K
 
Unit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdfUnit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdf
Thiyagu K
 
How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...
Jisc
 
Instructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptxInstructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptx
Jheel Barad
 
2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...
Sandy Millin
 
Synthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptxSynthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptx
Pavel ( NSTU)
 
Thesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.pptThesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.ppt
EverAndrsGuerraGuerr
 
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCECLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
BhavyaRajput3
 
678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf
CarlosHernanMontoyab2
 
Honest Reviews of Tim Han LMA Course Program.pptx
Honest Reviews of Tim Han LMA Course Program.pptxHonest Reviews of Tim Han LMA Course Program.pptx
Honest Reviews of Tim Han LMA Course Program.pptx
timhan337
 
Introduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp NetworkIntroduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp Network
TechSoup
 
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup   New Member Orientation and Q&A (May 2024).pdfWelcome to TechSoup   New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
TechSoup
 
Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
beazzy04
 
Lapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdfLapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdf
Jean Carlos Nunes Paixão
 
Palestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptxPalestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptx
RaedMohamed3
 
"Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe..."Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe...
SACHIN R KONDAGURI
 
1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx
JosvitaDsouza2
 
TESDA TM1 REVIEWER FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...
TESDA TM1 REVIEWER  FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...TESDA TM1 REVIEWER  FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...
TESDA TM1 REVIEWER FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...
EugeneSaldivar
 
Francesca Gottschalk - How can education support child empowerment.pptx
Francesca Gottschalk - How can education support child empowerment.pptxFrancesca Gottschalk - How can education support child empowerment.pptx
Francesca Gottschalk - How can education support child empowerment.pptx
EduSkills OECD
 
Polish students' mobility in the Czech Republic
Polish students' mobility in the Czech RepublicPolish students' mobility in the Czech Republic
Polish students' mobility in the Czech Republic
Anna Sz.
 

Recently uploaded (20)

Unit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdfUnit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdf
 
Unit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdfUnit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdf
 
How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...
 
Instructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptxInstructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptx
 
2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...
 
Synthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptxSynthetic Fiber Construction in lab .pptx
Synthetic Fiber Construction in lab .pptx
 
Thesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.pptThesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.ppt
 
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCECLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
CLASS 11 CBSE B.St Project AIDS TO TRADE - INSURANCE
 
678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf
 
Honest Reviews of Tim Han LMA Course Program.pptx
Honest Reviews of Tim Han LMA Course Program.pptxHonest Reviews of Tim Han LMA Course Program.pptx
Honest Reviews of Tim Han LMA Course Program.pptx
 
Introduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp NetworkIntroduction to AI for Nonprofits with Tapp Network
Introduction to AI for Nonprofits with Tapp Network
 
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup   New Member Orientation and Q&A (May 2024).pdfWelcome to TechSoup   New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
 
Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
 
Lapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdfLapbook sobre os Regimes Totalitários.pdf
Lapbook sobre os Regimes Totalitários.pdf
 
Palestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptxPalestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptx
 
"Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe..."Protectable subject matters, Protection in biotechnology, Protection of othe...
"Protectable subject matters, Protection in biotechnology, Protection of othe...
 
1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx
 
TESDA TM1 REVIEWER FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...
TESDA TM1 REVIEWER  FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...TESDA TM1 REVIEWER  FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...
TESDA TM1 REVIEWER FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...
 
Francesca Gottschalk - How can education support child empowerment.pptx
Francesca Gottschalk - How can education support child empowerment.pptxFrancesca Gottschalk - How can education support child empowerment.pptx
Francesca Gottschalk - How can education support child empowerment.pptx
 
Polish students' mobility in the Czech Republic
Polish students' mobility in the Czech RepublicPolish students' mobility in the Czech Republic
Polish students' mobility in the Czech Republic
 

Assistive Technology Considerations TemplateSubject AreaSample.docx

  • 1. Assistive Technology Considerations Template Subject Area Sample Task Assistive Technology Tools, Accommodations or Modifications Links to Resource Vendors Example: Math Use coins and bills to make change and solve math word problems involving money. (Arizona Department of Education, 2015) Talking Calculator and Coin-U-lator Talking calculators and money calculators provide tactile, auditory, and visual feedback and can help students with disabilities perform math calculation assignments. The Coin-U- Lator actually has buttons for coins and bills. Math APPs are electronic games that can provide additional independent practice on an iPad, phone, or computer. https://www.enablemart.com/coin-u-lator-accessories http://www.attainmentcompany.com/talking-calculator https://www.commonsensemedia.org/learning- ratings/reviews?sort=field_reference_review_lr%3Afield_total_l earning_rating&order=desc&gclid=COXVoKWB68gCFROSfgod pqsKFA Reading Writing Listening
  • 2. Oral Communication Development © 2015. Grand Canyon University. All Rights Reserved. Future Scenarios and Challenges for Security and Privacy Meredydd Williams, Louise Axon, Jason R. C. Nurse and Sadie Creese Department of Computer Science, University of Oxford, Oxford, UK {firstname.lastname}@cs.ox.ac.uk Abstract—Over the past half-century, technology has evolved beyond our wildest dreams. However, while the benefits of technological growth are undeniable, the nascent Internet did not anticipate the online threats we routinely encounter and the harms which can result. As our world becomes increas- ingly connected, it is critical we consider what implications current and future technologies have for security and privacy. We approach this challenge by surveying 30 predictions across industry, academia and international organisations to extract a number of common themes. Through this, we distill 10 emerging scenarios and reflect on the impact these might have on a range of
eau’, a cooperative and innovative ‘peak’, and a ‘canyon’ of fragmented solutions. Choo [2] approached the topic from a
criminological angle, with his scenarios ranging from mobile device malware to highly-sophisticated phishing attacks. The World Economic Forum [3] collated interviews from global executives and discussed hackers stunting economic growth, threats damaging online services, and resilience triggering further innovation. Through surveying many such scenarios, we found that while reports often differ in scope and audience, there are several common themes.

In this paper we consolidate 30 scenarios from a number of studies to explore those technological futures most frequently considered. Rather than fielding remote predictions, we use a rigorous methodology to identify common existing themes and distil 10 emerging scenarios. We reflect on these possibilities, which include the proliferation of attack tools, before considering their potential impact on a range of stakeholders. We then analyse existing security frameworks and best practices, exploring their suitability in these new environments. Finally, we characterise what research is required to address outstanding security and privacy risks, and consider implications for policy makers and academia.

II. METHODOLOGY

Our methodology consists of four stages, using an iterative approach to identify those scenarios most frequently considered. Rather than merely fielding our own predictions, we distil emerging scenarios from a wide range of surveyed literature. These stages are as follows:

1) We surveyed a large number of existing security and privacy predictions, many of which are described in Section III. These works originated from a range of fields, including academia, industry and international organisations. This was critical to ensure we considered both predictions grounded in technical expertise and those written at a policy level. Regardless of background, these articles were selected based on three key criteria: relevance, citation count and rigour of methodology.
2) The works were then indexed and their predictions individually extracted. Whereas several articles clearly delineated between different situations, others required careful inspection to isolate scenarios. We coded these predictions based on their general themes, such as ‘big data’, enabling exploration of the points on which academia, industry and organisations were in agreement.
3) Next, we grouped similar predictions, repeating this process in an iterative fashion until we achieved convergence. Where assigned codes concerned related trends, such as ‘big data’ and ‘machine learning’, these were merged to construct rich categories. From our initial 30 scenarios, we distilled 10 predictions which we found most representative (a sketch of this merging step follows the list below).
4) Finally, we expanded these scenarios into brief narratives for further analysis. Through considering the themes identified in each instance, we went on to explore how these changes could affect societies commercially, technologically and politically.
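To make stages 2) and 3) concrete, the following minimal sketch mimics the coding-and-merging process; the example predictions, thematic codes and relatedness judgements are hypothetical illustrations, not the study's actual data or tooling.

# Illustrative sketch of stages 2-3 (not the authors' tooling): extracted
# predictions are tagged with thematic codes, and related codes are merged
# iteratively until convergence. All example data below is hypothetical.

coded_predictions = [
    ("Sensors monitor human emotions", "big data"),
    ("Campaigns are tailored to individual citizens", "machine learning"),
    ("Nation-state exploits leak onto the black market", "attack tools"),
    ("Simple phishing kits are sold to novices", "attack tools"),
]

# Pairs of codes judged to concern related trends (a human judgement).
RELATED = {("big data", "machine learning")}

def merge_codes(predictions, related):
    """Repeatedly merge related codes until no further change occurs."""
    themes = {code: {code} for _, code in predictions}
    changed = True
    while changed:  # iterate until convergence (stage 3)
        changed = False
        for a, b in related:
            if a in themes and b in themes and themes[a] is not themes[b]:
                themes[a] |= themes[b]          # merge the two categories
                for code in list(themes[b]):
                    themes[code] = themes[a]    # point members at the union
                changed = True
    grouped = {}  # category -> predictions filed under it
    for text, code in predictions:
        grouped.setdefault(frozenset(themes[code]), []).append(text)
    return grouped

for category, members in merge_codes(coded_predictions, RELATED).items():
    print(sorted(category), "->", members)

In the study itself this grouping was of course performed by the researchers rather than by software; the sketch simply makes the convergence criterion of stage 3) explicit.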
III. EMERGING SCENARIOS

We now present our surveyed predictions before consolidating common themes into core emerging scenarios. These works originate from industry, international organisations and academia, providing a comprehensive base for our analyses.

A. Surveyed predictions

Among industrial predictions, Burt et al. [1] developed the Microsoft Cyberspace2025 report, considering scenarios grounded in an econometric model. In addition to the quantitative analysis of 100 socio-economic indicators, qualitative insights were sourced from researchers and experts. They constructed three future scenarios: ‘peak’ comprising the embracement of technology, ‘plateau’ representing piecemeal cyber adoption, and ‘canyon’ theorising protectionist strategies. Hunter [4], performing industrial research through Gartner, plotted an x-axis of targets against a y-axis of authorities. His scenario quadrants ranged from regulated risk (the establishment of software liabilities) to coalition rule (the growth of underground hackers). In a less formalised approach, ICSPA [5] validated their concepts through an expert panel. Through collaboration with the TrendMicro security firm, they discussed a second-generation digital native, a business in a highly-technological market, and a future nation state.

Organisational predictions included that from the World Economic Forum [3], who collated interviews and workshops with global executives. Through this, they considered expert feedback before constructing a 14-point roadmap for external collaboration. These findings were drawn into several scenarios, including governments erecting technological barriers and the adoption of new trade innovations. Schwartz [6], writing for the Center for a New American Security, considered optimistic and pessimistic predictions. Through his analyses of policy and technology he identified both scenarios and
indicators of cybersecurity progress. His considerations included the construction of national networks and the proliferation of simple attack tools.

We finally surveyed academic predictions, including scenarios from the Berkeley Center for Long-Term Cybersecurity [7]. They convened an interdisciplinary conference with experts from computer science and politics. From nascent ideas formed during the sessions, Berkeley graduate researchers developed the concepts into five detailed narratives. These included the Internet becoming an anarchy and the monitoring of human emotions. Benzel [8] took an alternative approach, describing discussions from a US Department of Homeland Security session concerning cybersecurity. The event saw a number of working groups deliberating those objectives which should be prioritised over the next five years. Recommendations included increasing the role of the private sector and developing metrics to evaluate cyber investments. In contrast, Choo [2] conducted an exploration of cybercrime and how this might evolve in the future. Through considering affairs in the US, UK and Australia, he constructed several predictions and suggested criminological prevention strategies. His scenarios ranged from social media propaganda to the escalation of international tensions.

B. Constructed scenarios

Our 10 emerging scenarios are now discussed in detail, considering the technological, commercial and political possibilities of the next decade. These scenarios are not necessarily complementary, but describe a range of potential eventualities. They are as follows:

1) Growth of the Internet-of-Things. The Internet-of-Things (IoT) pervades daily life, blurring the physical
and virtual worlds. This entanglement leads to online risks becoming increasingly intangible, contributing to further cyber-threats [9]. Social norms evolve sufficiently that the IoT is considered ‘normal’ and those who shun these novel technologies are viewed as antiquated ([1–3, 5, 7]).
2) Proliferation of offensive tools. Offensive government cyber capabilities and the proliferation of simple attack tools both contribute to frequent incidents. Nation-state weaponry leaks onto the black market and is reused by cybercriminals to steal from organisations. Intelligence agencies stockpile zero-day vulnerabilities rather than informing affected vendors, resulting in a deluge of data breaches ([1–4, 7]).
3) Privacy becomes reinterpreted. The concept of privacy is reinterpreted by ‘digital natives’ who have been raised in an age of social networking and ubiquitous Internet access. Individuals become accustomed to the development of invasive technologies which offer great benefits to convenience and productivity. Lives are lived ‘online’ and reputational damage is frequent as citizens display intimate histories through digital portals ([5, 7]).
4) Repressive enforcement of online order. While many states take liberal risk-based approaches, several favour repressive means to enforce order online. This leads to blanket censorship and surveillance, an order stronger than in current regimes, damaging cross-border trade and placing global commerce under pressure. Although these measures successfully reduce the scale of domestic cyber-attacks, the injury to free enterprise results in economic degradation ([1, 2, 4, 5]).
5) Heterogeneity of state postures. The heterogeneity of
state technology postures stifles international agreement and cooperation over cyber norms. This raises global tensions as attacks increasingly originate from ‘safe havens’ which refuse to prosecute their cyber criminals. The challenge of digital attribution leads to a fragmentation of shared understandings, with nations treating their allies with cool suspicion ([1–5]).
6) Traditional business models under pressure. The concept of intellectual property evolves as traditional business models are placed under increasing pressure from both pirates and new competitors. Established market leaders fall and sell their data assets to innovative firms better equipped to operate in these novel environments. Cloud platforms become the norm, with individuals granting increased agency to those companies which store their data ([1, 4, 5, 7]).
7) Big data enables greater control. Big data and machine learning support the manipulation of individuals’ behaviour by corporations and governments. Sophisticated statistical analysis enables companies to intricately tailor their advertisements, while political parties customise their media campaigns to target individual citizens. While these developments benefit commerce and law enforcement, deviations from the norm are increasingly viewed with suspicion ([5, 7, 8]).
8) Growth of public-private partnerships. Organisations continue to know more about their customers than national governments do, contributing to the growth of public-private data-sharing partnerships. These arrangements are indeed beneficial to national security, but large parts of critical infrastructure remain owned by foreign corporations. This reliance on private industry shifts considerable power from elected state officials to unaccountable executives, resulting in damage to the democratic process. With power production frequently under the influence of foreign investors, national security becomes critically undermined ([1–3, 5, 8]).
9) Citizens demand greater control. Some citizens demand greater transparency and agency over their online data. Technologically-literate individuals store information remotely, occasionally selling details for a range of benefits. Corporations offer paid alternatives to data-hungry social networks, creating new markets for online communities and Privacy-Enhancing Technologies (PETs) ([2, 4, 5, 7]).
10) Organisations value cyber-resilience. Cyber-resilience is increasingly important for informing business decisions, with issues such as insider threats high on the agenda. The infeasibility of absolute security drives a market for cyber-insurance, as corporations adopt prescribed measures to reduce their premiums. Board rooms quickly learn the reputational costs of cyber-attacks, resulting in greater private investment in the actuarial and mathematical sciences ([2–5, 8]).

C. Stakeholder impact

We continue by considering our emerging scenarios and the
impact they could have at three stakeholder levels: individual, organisational and national. Investigation of stakeholder impact is crucial to explore how ordinary citizens, corporations and governments may operate in the future. Considerations are summarised below in Fig. 1, with yellow, orange and red representing low, medium and high impact, respectively; a toy sketch of this rating scheme follows the list below. These ratings were determined through identifying which scenarios could critically affect a stakeholder, which would require changes in practice, and which would have a limited impact. In detail, the stakeholder impacts are as follows:

Figure 1: Emerging scenarios and stakeholder impact

1) The pervasion of the Internet-of-Things would present dangerous opportunities for the surveillance of individuals, particularly if biological data is accessible through implanted devices. While organisations would benefit from enhanced home-working efficiencies, they must defend against Bring-Your-Own-Device (BYOD) risks. Although national governments would benefit economically from the IoT, the dependence of critical infrastructure on modern technologies would greatly increase attack surfaces.
2) The proliferation of attack tools would place individuals under the ubiquitous risk of identity theft. Organisations similarly must defend against data exfiltration (of high impact) and therefore might be less willing to store customer data. International tensions would be fraught as high-profile attacks become commonplace, especially with attribution appearing largely intractable.
3) Reinterpretations of privacy would see PII (Personally Identifiable Information) lose its importance as more sensitive details, such as sexual orientation, are available
through social media. Although organisations encourage the exchange of data for convenience, they are at risk of their own employees sharing confidential information. Nations benefit from large collections of open source intelligence, but find their citizenry increasingly dependent on services hosted in foreign countries.
4) The repressive enforcement of online order would involve widespread filtering, censorship and surveillance, highlighted as high impact in Fig. 1. Organisational opportunities would be stifled by both technological restrictions and costly regulatory compliance. Foreign opposition to repressive policies would damage international trade, while prosecuting dissidents would be both challenging and expensive.
5) The heterogeneity of state postures would result in minimal cross-border personal data protection, especially when nations differ in their definitions of PII. Unpredictable international relations would contribute to market instability, while high-impact cyber attacks continue from ‘safe havens’. Reductions in information sharing would damage law enforcement, with global tensions increasing through frequent disagreements.
6) The evolution of new business models would see individuals’ personal data become the most valuable commodity, frequently traded on global exchanges. Increased market competitiveness would depress profit margins while organisations reel from ubiquitous copyright infringement. National economies are similarly weakened by intellectual property violations and lose stability as
established firms are replaced by start-ups. With citizen data increasingly residing in foreign data centres, both individuals and governments would be vulnerable to the decisions of other states.
7) Advances in big data and machine learning would result in consumer manipulation and reductions to individual agency. Organisations would benefit from sophisticated financial forecasting and seamless targeted advertising. Nations would better control their citizenry through advanced media management, yet increasingly rely on a technology bubble for their economic growth.
8) Public-private data-sharing partnerships would infringe individuals’ rights as their personal data is shared with foreign governments. The wider distribution of this information would also increase the risk of confidentiality being breached. Although organisations might learn valuable threat intelligence, they could have their reputations damaged by governmental arrangements. While national agencies would acquire valuable data, foreign states might exert a troubling corporate influence over critical infrastructure.
9) Although citizen demands might give certain individuals greater freedom, technical literacy is required and commercial costs would be inherited through price increases. Organisations would struggle under the pressure to reveal corporate information while national governments find themselves increasingly accountable to their citizens. Despite both initiatives being beneficial to democracy,
they weaken domestic economies against unaccountable rival states.
10) The importance of cyber-resilience emphasises the ubiquitous personal risk from data breaches. Organisations would invest in costly technological protection, but still fall victim to high-impact Advanced Persistent Threats (APTs). Insider threats would challenge the sensitive world of government, particularly since the capabilities of malicious employees will increase with ubiquitous technology. As a result, frequent breaches in both the public and private sectors could undermine trust in state institutions.
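As the toy sketch promised above, the snippet below encodes a scenario-by-stakeholder impact matrix in the style of Fig. 1. Apart from the two ratings stated in the text (the high individual impact of repressive enforcement and the high organisational impact of data exfiltration), the cell values are placeholders, not a reproduction of the published figure.

# Minimal sketch of the Fig. 1 rating scheme: each scenario maps the three
# stakeholder levels to a low/medium/high impact, rendered as a colour.
# Most cell values below are illustrative placeholders.

from enum import Enum

class Impact(Enum):
    LOW = "yellow"
    MEDIUM = "orange"
    HIGH = "red"

impact_matrix = {
    "Repressive enforcement of online order": {
        "individual": Impact.HIGH,        # stated as high impact in the text
        "organisational": Impact.MEDIUM,  # placeholder
        "national": Impact.MEDIUM,        # placeholder
    },
    "Proliferation of offensive tools": {
        "individual": Impact.MEDIUM,      # placeholder
        "organisational": Impact.HIGH,    # data exfiltration noted as high
        "national": Impact.MEDIUM,        # placeholder
    },
}

for scenario, levels in impact_matrix.items():
    cells = ", ".join(f"{s}={i.value}" for s, i in levels.items())
    print(f"{scenario}: {cells}")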
IV. CHALLENGES FOR SECURITY AND PRIVACY

As a next stage, we reflect on the future applicability of current ‘best practice’ guidelines for individuals, organisations and nations. This is of great importance, as these documents inform stakeholder groups and are expected to guard against technological threats. We summarise the gaps in current frameworks, selecting several key examples due to space restrictions, and point to required research and development.

At the individual level, we consider the main UK and US government initiatives for improving cybersecurity practices: for the UK, the Cyber Streetwise campaign [10], and for the US, Stop.Think.Connect [11]. The UK Get Safe Online initiative was also considered, but this effort is private-sector and less authoritative than Cyber Streetwise. For organisations, we examine the US National Institute of Standards and Technology’s (NIST) cybersecurity framework [12], and the Center for Internet Security’s Critical Security Controls (CIS CSC) [13]. The International Telecommunication Union (ITU) National Cybersecurity Strategy Guide [14] and NATO National Cybersecurity Framework Manual [15] are considered as internationally-recognised guidelines for nations. Based on our findings, we conclude this section with a call to action for researchers, stakeholders and policy makers.

A. Gaps in existing guidelines

While advantageous for the present day, awareness campaigns for individuals do not translate to a future in which the IoT pervades daily life. In this future, individuals’ understanding of, and agency over, their online presence is key to the protection of their privacy. Current campaigns go some way to promoting good practice in this area (e.g. recommendations for protecting privacy when using social media [10]; privacy-protecting recommendations for the use of web services, devices, purchase history, and location [11]). However, these initiatives do not extend to educating citizens on the privacy implications of increasingly-used wearable, biometric and sensor-based technologies. Nor is clear, usable advice given on the specific types of data collected by different applications and devices. For guidelines to be suitable for future scenarios, campaigns must focus on emerging technologies and give specific, practical advice. Furthermore, these public initiatives should be periodically evaluated to ascertain their effectiveness in improving individuals’ behaviour. Whether undertaken through large-scale surveys or targeted focus groups, feedback mechanisms would enable campaigns to be iteratively refined.

Existing frameworks for organisations do not adequately address those human aspects which will be key to future corporate cybersecurity. The NIST and CSC documents [12, 13] are highly comprehensive, with useful references to external guidelines. They do not, however, provide frameworks within which the human aspects of organisational cybersecurity can be addressed; namely, problems arising from insider threats and BYOD. For instance, NIST aims to mitigate insider threats by recommending the comparison of personnel actions against expected behaviours and data-flow baselines. CSC addresses the problem by advocating the controlled use of administrator privileges, access controls on a “need to know” basis, and the monitoring of employee and contractor accounts. While these activities are essential, employee tracking alone will not defend against insiders who are aware of the monitoring systems in place [16]. Unfortunately, both frameworks fail to give any consideration to the growing need to study personnel motives or to monitor behaviour from a psychological perspective.
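As a minimal sketch of the baseline-comparison idea that NIST recommends (this is not the framework's own tooling; the event fields, baseline figures and threshold are assumptions):

# Illustrative sketch: flag personnel actions that deviate from a per-user
# behavioural and data-flow baseline. All field names, baseline values and
# the volume threshold are hypothetical.

from dataclasses import dataclass

@dataclass
class Baseline:
    usual_hours: range        # hours of day the user normally works
    avg_bytes_out: float      # typical daily outbound data volume

@dataclass
class Event:
    user: str
    hour: int                 # hour of day the action occurred
    bytes_out: int            # data transferred out by the action

BASELINES = {
    "alice": Baseline(usual_hours=range(8, 19), avg_bytes_out=5e6),
}

def anomalies(events, baselines, volume_factor=10):
    """Yield events outside usual hours or far above the usual data flow."""
    for e in events:
        base = baselines.get(e.user)
        if base is None:
            yield e, "no baseline recorded"
        elif e.hour not in base.usual_hours:
            yield e, "activity outside usual hours"
        elif e.bytes_out > volume_factor * base.avg_bytes_out:
            yield e, "outbound data flow far above baseline"

events = [Event("alice", hour=3, bytes_out=2_000_000),
          Event("alice", hour=14, bytes_out=900_000_000)]
for event, reason in anomalies(events, BASELINES):
    print(event.user, "->", reason)

As noted above, such tracking alone cannot stop insiders who know which monitors are in place, which is why the text calls for complementary psychological and motivational research.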
  • 17. heterogeneous state postures and differing cybersecurity laws will stifle international cooperation. This combined with the proliferation of cyber weaponry and problems of attribution suggests that the alignment of national strategies is unrealistic. Of greater importance is the construction of policies and legis- lation which allows states to suit their own, and international, interests in a non-unified cyber landscape. Based on our scenarios, we foresee that both state cyber weaponry will grow and international tensions will escalate as a result of high-profile incidents. A recent example of expected events is the unattributed Ukrainian power outage of December 2015, apparently caused by a cyber-attack [18]. Currently, agreement of those terms key to building legislation is lacking. For example, there are no widely-agreed definitions for either “cyber warfare” or “cyber attack” [15]. Legislation in this area is crucial to maintaining order in the coming decades. B. Requirements for research and development Based on the gaps identified, much work is required to extend cybersecurity frameworks for the future. We discuss areas that must be addressed through academic research, the development of policy, and action by stakeholders. For the protection of individuals’ privacy, policies and laws on collection, storage and processing must be recon- sidered in light of the increased sensitivity and abundance of data. Education will be central to afford citizens the understanding to retain agency over their information, with progress monitored through national surveys. The development of disclosure metrics for individuals and organisations will help technology users perceive the risks of their decisions, and assist regulators in detecting abuse. Similarly, methods to predict the lifetime and impact of sensitive data would enable individuals to understand their risk exposure. Usable privacy
For organisations, research into the human aspects of insider threat, such as behavioural psychology and criminology, is important. Both the corporate adoption of education schemes and the consideration of employees’ actions are critical to proactively mitigating these risks. Whilst existing studies have begun to explore the psychological, behavioural, cultural and technical characteristics of insider threats [19], further research is required to develop comprehensive solutions that are ultimately useful for organisations. The evolution of current practice is crucial if companies wish to maintain resistance against increasingly-sophisticated insider threats.

BYOD currently poses a number of organisational challenges [17, 20] and our emerging scenarios suggest this situation will only become worse. To protect companies against the increased number and variety of personal devices brought to the office, two developments are required. Firstly, applications should be built which secure company networks against personal technologies; a minimal sketch of one such control follows. Secondly, effective organisational policies should be implemented to manage risk in a future pervaded by wearable and implanted devices.
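One hypothetical shape such an application could take is network admission control keyed to a device registry; the registry contents, field names and quarantine action below are assumptions for the sketch, not a product the paper describes.

# Hypothetical network admission control for BYOD: devices seen on the
# corporate network are checked against a registry, and unregistered
# ones are quarantined. Registry contents and actions are invented.

REGISTERED = {               # MAC address -> owner, for managed devices
    "aa:bb:cc:dd:ee:01": "alice (corporate laptop)",
    "aa:bb:cc:dd:ee:02": "bob (registered wearable)",
}

def admit(mac: str, quarantine_vlan: int = 99) -> str:
    """Decide network placement for a device observed on the LAN."""
    if mac in REGISTERED:
        return f"{mac}: admitted ({REGISTERED[mac]})"
    # Unknown personal/IoT device: isolate rather than block outright,
    # since exhaustive registration is argued above to be infeasible.
    return f"{mac}: unknown device, moved to quarantine VLAN {quarantine_vlan}"

for seen in ["aa:bb:cc:dd:ee:01", "de:ad:be:ef:00:42"]:
    print(admit(seen))

Quarantining rather than blocking reflects the argument above that registering every personal device will not be feasible in a workplace full of wearables and implants.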
  • 19. a cyber landscape in which nations are not concordant, legal agreements and technological solutions are required to address external control over critical infrastructure. As offensive cyber weapons proliferate and global ten- sions increase, legislation on ‘acts of war’ and digitally- proportionate responses must be defined. It is inevitable that nation-state cyber weaponry will reach the black market, and therefore it is crucial the consequences of its use are under- stood [22]. Furthermore, strong controls should be enforced to avoid the escalation of these most harmful risks. Interactions between individuals, organisations and na- tions must also be considered. While we have explored the future scenario impact on our three stakeholders individually, in reality, there are several relationships and dependencies between these parties. For example, the effect that advances in machine learning will have on individuals’ privacy will depend largely on both the regulations governing data col- lection and the behaviour of organisations. There is, how- ever, no single document that details cybersecurity guidelines across these three differing levels. This lack of cohesion introduces complex subtleties and key dependencies can be missed. Broad frameworks would enable a clear vision of both current cybersecurity practices and their future weak- nesses. However, it might be challenging to locate a party capable of constructing such a comprehensive document. This is especially true considering the international scope of the field, with recommendations required to be equally applicable to a Chinese power station and a US bank. Specificity is naturally in tension with generality, and new guidelines would also require frequent update to ensure they are suitable. These factors, and the difficulty of selecting an authorship trusted by all international parties, ensure that the construction of a comprehensive document would be challenging. However, such efforts are essential if we wish to protect security and
  • 20. privacy in an uncertain global future. Perhaps a more achievable proposal would be to increase collaboration between guideline authors at the individual, organisational and national levels. Such unification of ap- proaches and ideas would enable the production of frameworks which are more strongly contextualised against cybersecurity at all levels. We illustrate this point by drawing on our above example, which is based on advances in machine learning and consumer prediction capabilities. This scenario highlights a discrepancy between the recommendations made at the organisational and individual level. Given that guidelines in the 2016 IEEE 2nd International Forum on Research and Technologies for Society and Industry Leveraging a better tomorrow (RTSI) 978-1-5090-1131-5/16/$31.00 ©2016 IEEE former [12,13] provide only limited advice on the use of sens- itive consumer information, the latter’s [10, 11] recommenda- tions on data protection is inadequate to maintain privacy in a world of sensor-based devices. In the space between these two frameworks, there are real risks of organisational data use against individuals’ wishes. These gaps could be bridged by collaboration between the authors of such guidelines. V. CONCLUSIONS Emerging scenarios provide useful insights into the fu- ture of security and privacy. While previous predictions are diverse in their scopes and audiences, through constructing representative futures, we explored those developments most frequently-anticipated. Our study of the impact on individu-
  • 21. als, organisations and nations highlights key considerations for these important stakeholders. We also analysed existing frameworks and best practices, identifying guidelines which do not adequately translate to future predictions. Our findings suggest avenues for research and development, including pri- vacy education for the general public, approaches to mitigate organisational insider threat, greater investigation of BYOD risk, and international legislation for cyber warfare. It is crucial that emerging scenarios are explored in anticipation of the security and privacy threats of the coming decade. We have considered a number of possibilities for future work. Firstly, we could conduct a technical analysis of why reputable guidelines fail to address the threats from emerging scenarios. In this work, we could rigorously examine indi- vidual sections in ‘best practice’ documents and identify which areas urgently require updating. Secondly, these emerging scenarios could be explored within the context of specific domains, such as healthcare. The pervasion of the Internet-of- Things might be beneficial with the development of wireless sensors, but also threaten welfare when critical devices are compromised. Similarly, although advances in big data might revolutionise the detection of health conditions, these measures could unfairly increase insurance premiums for victims of profiling. Finally, we could expand our scope and explore future possibilities for technology as a whole. The coming decades promise to radically change how we perceive digital devices, and it is crucial we are prepared for these novel developments. REFERENCES [1] D. Burt, A. Kleiner, J. P. Nicholas, and K. Sullivan, Cyberspace2025, 2014. [2] K.-K. R. Choo, “The cyber threat landscape: Challenges and
  • 22. future research directions,” Computers & Security, vol. 30, no. 8, 2011. [3] World Economic Forum, Risk and responsibility in a hyperconnected world, 2014. [4] R. Hunter, “The future of global information security,” 2013, Accessed 8 June 2016. [Online]. Available: www.gartner.com/it/content/2479400/ 2479415/june 13 future of global info sec rhunter.pdf [5] International Cyber Security Protection Alliance, “Project 2020,” 2013, Accessed 8 June 2016. [Online]. Available: www.europol.europa.eu/ sites/default/files/publications/2020 white paper.pdf [6] P. Schwartz, “Scenarios for the future of cyber security,” America’s Cyber Future, vol. 2, pp. 217–228, 2011. [7] Berkeley Center for Long-Term Cybersecurity, “CLTC scenarios,” 2015, Accessed 8 June 2016. [Online]. Available: cltc.berkeley.edu/scenarios/ [8] T. Benzel, “A strategic plan for cybersecurity research and development,” IEEE Security & Privacy, vol. 4, pp. 3–5, 2015. [9] M. Williams, J. R. C. Nurse, and S. Creese, “The perfect storm: The
  • 23. privacy paradox and the Internet-of-Things,” in Workshop on Challenges in Information Security and Privacy Management (ISPM) at 11th Inter- national Conference on Availability, Reliability and Security (ARES), 2016. [10] HM Government, Cyber Streetwise, Accessed 8 June 2016. [Online]. Available: www.cyberstreetwise.com/ [11] National Cyber Security Alliance, Stop.Think.Connect, Accessed 8 June 2016. [Online]. Available: www.stopthinkconnect.org/ [12] National Institute of Standards and Technology, Framework for Improv- ing Critical Infrastructure Cybersecurity, 2014. [13] Center for Internet Security, “CIS critical security controls - Version 6.0,” Accessed 8 June 2016, 2016. [Online]. Available: https://www.sans.org/critical-security-controls/controls [14] International Telecommunication Union, National Cybersecurity Strategy Guide, 2011. [15] NATO Cooperative Cyber Defence Centre of Excellence, National Cybersecurity Framework Manual, 2012. [16] C. Colwill, “Human factors in information security: The insider threat– who can you trust these days?” Information Security Technical
  • 24. Report, vol. 14, no. 4, pp. 186–196, 2009. [17] J. R. C. Nurse, A. Erola, I. Agrafiotis, M. Goldsmith, and S. Creese, “Smart insiders: Exploring the threat from insiders using the Internet- of-Things,” in ESORICS Workshops, 2015, pp. 5–14. [18] BBC News, “Hackers caused power cut in western Ukraine - US,” Accessed 8 June 2016, 2016. [Online]. Available: www.bbc.co.uk/news/ technology-35297464 [19] J. R. C. Nurse, O. Buckley, P. A. Legg, M. Goldsmith, S. Creese, G. R. T. Wright, and M. Whitty, “Understanding insider threat: A framework for characterising attacks,” in IEEE Security and Privacy Workshops (SPW), 2014, pp. 214–228. [20] A. B. Garba, J. Armarego, D. Murray, and W. Kenworthy, “Review of the information security and privacy challenges in Bring Your Own Device (BYOD) environments,” Journal of Information Privacy and Security, vol. 11, no. 1, pp. 38–54, 2015. [21] E. Rosenfeld, “US-China agree to not conduct cybertheft of intellectual property,” 2015, Accessed 8 June 2016. [Online]. Available: www.cnbc.com/2015/09/25/us-china-agree-to-not- conduct- cybertheft-of-intellectual-property-white-house.html
[22] D. Hodges and S. Creese, "Understanding cyber-attacks," Cyber Warfare: A Multidisciplinary Analysis, pp. 33–60, 2015.

Legal Implications of Using Digital Technology in Public Schools: Effects on Privacy

JOANNA TUDOR, J.D.*

Policymakers have been attempting to improve our education system and student achievement for decades. With encouragement from recent legislation, such as the America Competes Act, which places an emphasis on STEM education, school districts across the country are incorporating technology into academics in sweeping strides. Between 2012 and 2013, the United States Department of Education gave out over $500 million in Race to the Top funds and almost every recipient school district implemented a one-to-one digital device initiative. As technology is integrated into academic instruction for use at school and at home, student interaction with digital technology has been propelled to exceptional levels.

Meanwhile, the efficiency of digital technology has made it easier for school districts and other entities to collect, organize, and store large databases of information. School districts, state departments of education, and the U.S. Department of Education all collect and use data about students. At the local level, specific information on individual students is collected to help teachers improve their instruction or keep records of student behavior. For example, Apple's iTunes has a behavior-monitoring application (app) available to teachers that allows
them to compile information about which students have behavior problems and which students are helpful in the classroom.2 Data collection can also help administrators make better-informed decisions about allocating resources to district programs. Sometimes districts contract with outside service providers who deliver services to students, and often these service providers also collect and use personally identifiable student information to help them improve their services. For example, some school cafeterias are now using a biometric identification system to allow students to pay for lunch by scanning their fingerprint at the checkout line.3 The ease and convenience of collecting, organizing, and storing data through the use of technology means that most, if not all, of these data are in digital form.

However, the convenience and insight allowed by unprecedented access to technology comes at a cost, which frequently manifests as invasions of student privacy. Furthermore, districts may not realize the extent to which student privacy is at risk, particularly because technology is constantly evolving, thus creating new risks. Understanding the risks involved is only half the battle. Districts must also navigate the multitude of federal and state laws currently in place that are designed to protect student privacy. In the last several decades, privacy laws in general have morphed as advances in technology have challenged well-established

*This article was prepared as part of the author's work at the University of San Diego's Mobile Technology Learning Center (MTLC), an Institute of Entrepreneurship in Education (IEE) knowledge lab in the School of Leadership and Education Sciences.
1. Ross Brenneman, Before Buying Technology, Asking 'Why?', Educ. Week, June 18, 2014, http://www.edweek.org/tm/articles/2014/06/18/gp-definitions.html.
concepts in privacy law.4 Compounding the matter, student expectations of privacy have always been a vague concept, one only further complicated by the challenges digital technology has introduced to privacy law.

This report discusses the potential risks to students when districts incorporate technology by considering who is collecting data and why it is problematic. The report focuses on privacy violations that are likely to occur due to a district's increased use of technology and explores questions such as, "What are students' expectations of privacy in digital technology?" and "What is a district's duty to protect students from privacy invasions?" Where answers are not yet available, the report provides relevant information that districts need to contemplate as they implement technology devices and programs in their schools. The report begins with an overview of federal privacy laws, California privacy laws, and other laws that regulate technology use. The report then takes a look at how student privacy is compromised, who the biggest contributors to student privacy invasions are, and why. Based on the law described in the first two sections, the report then discusses students' expectations in digital technology, and

2. TipTapTech, LLC, Easy Behavior Tracker for Teachers, iTunes Preview, https://itunes.apple.com/us/app/easy-behavior-tracker-for/id553367265?mt=8 (last visited March 16, 2015).
3. Adam Shanks, North Adams Schools to Implement Biometric ID at Cafeterias, The Berkshire Eagle (Sept. 13, 2014, 12:11 AM), http://www.berkshireeagle.com/.../north-adams-schools-implement-biometric-id-at-cafeterias.
4. For example, in U.S. v. Jones, 132 S.Ct. 945, 181 L.Ed.2d 911 (2012), the United States Supreme Court ruled that the attachment of a Global-Positioning-System (GPS) device to a car was a "search" within the meaning of the Fourth Amendment and may not be conducted by government officials without a warrant.
the implications for school districts.

I. OVERVIEW OF RELEVANT LAW AFFECTING STUDENT PRIVACY

Privacy rights can be separated into three main categories: (1) the right to access and to control access to personal information, (2) the right to personal autonomy, and (3) the right to control public use of an individual's persona. Of these three, the first two (the right to access personal information and the right to personal autonomy) are largely at issue with regard to the legal implications of students who use digital technology in an academic setting. Federal statutes such as the Family Educational Rights and Privacy Act (FERPA) and the Children's Online Privacy Protection Act (COPPA) address the right to access and to control access to personal information, while the United States Constitution offers some protection for personal autonomy through the First, Fourth and Fourteenth Amendments. Additionally, each state has its own set of legislation governing student privacy. Although the third category (the right to control persona) could arise within the educational context, it would be an unusual occurrence; therefore, this report will only discuss the first two privacy concepts.

Within the first two ideas, there are many federal and state laws that can and do apply to school districts, students, and other interested parties. In fact, in just the past few years, almost 150 bills affecting student privacy were introduced in 40 different state legislatures.5 Because of the diversity of state laws affecting student privacy, this report focuses mainly on federal law. California serves as the model when exploring the intersection of federal and state law. California lends itself to these examples because its Silicon Valley hosts many high-technology companies, such as Google, Facebook, Apple, Hewlett-Packard, and Intel, to name a few.6 Of course, these companies' influences reach far beyond the boundaries of
California, so this is not to suggest that by being headquartered in California their impact is limited to California. Therefore, in order to fully understand the subtleties of some of the implications presented by digital technology programs within other states, a thorough exploration and analysis of the state's law is necessary.

A. Federal Privacy Laws

The United States Constitution does not guarantee privacy specifically, but there are constitutional limits as to how far the government can intrude into an individual's private life. The Fourth Amendment establishes some of those limits by ensuring the right to be free from unreasonable search and seizure of body, house, papers, and effects.7 In addition to the Fourth Amendment, the First and Fourteenth Amendments also protect personal autonomy. The First Amendment

5. Stephanie Simon, Privacy Bill Wouldn't Stop Data Mining of Kids, Politico (Mar. 23, 2015), http://www.politico.com/story/2015/03/privacy-bill-wouldnt-stop-data-mining-of-kids-116299.html.
6. Google, Company Locations, http://www.google.com/about/company/facts/locations/ (last visited March 16, 2015); Facebook HQ, Facebook, https://www.facebook.com/pages/Facebook-HQ/166793820034304 (last visited March 16, 2015); Apple Corporate Office, eCorporateOffices, http://ecorporateoffices.com/Apple-1697 (last visited March 16, 2015); HP Headquarters, HP, http://www8.hp.com/us/en/hp-information/about-hp/headquarters.html (last visited March 16, 2015); Intel Corporate Office, eCorporateOffices, http://ecorporateoffices.com/Intel-2953 (last visited March 16, 2015).
7. U.S. Const. amend. IV.
safeguards personal autonomy by protecting the right to speak freely and peaceably assemble,8 while the Fourteenth Amendment ensures substantive due process.9 The Ninth Amendment10 protects privacy indirectly because it holds that even if a right is not explicitly mentioned in the Constitution, the government may not infringe upon that right if it has been granted through other means.11

Unlike the Constitution, there are many federal statutes that directly address privacy concerns, such as the Privacy Act of 1974.12 Some statutes specifically concern digital privacy, like the Electronic Communications Privacy Act,13 while others are even more specific and address the digital privacy of students or minors. The following descriptions of federal law focus on those sections that address students' privacy rights specifically.

1. Family Educational Rights and Privacy Act (FERPA)14

Student privacy rights begin with FERPA. FERPA governs schools that receive federal funding and maintain student education records. It requires that parental consent be given before any personally identifiable information or information held in the education record can be shared.15 Although FERPA's jurisdiction covers elementary, secondary and post-secondary school students, this report will focus solely on the portions of FERPA that apply to K-12 students.

FERPA grants rights to parents of minors and to students themselves, once they have reached the age of majority. These rights include the ability to inspect the education record that the school maintains on their

8. Id. amend. I.
9. Id. amend. XIV.
10. Id. amend. IX.
11. This concept was included to ensure that the first eight amendments to the Constitution were not later deemed exhaustive.
12. 5 U.S.C. § 552a (2010).
13. See generally, 18 U.S.C. § 2510 (2002).
14. 20 U.S.C. § 1232g (2013).
15. Id. at § 1232g(b)(1).
child and the right to challenge and correct or delete any content in the record that is inaccurate or misleading.16 Additionally, the Code of Federal Regulations, which has expanded upon FERPA's rights, requires districts to annually notify parents of these rights.17 The statute does not grant a private right of action, however, meaning parents cannot sue districts that are in breach of FERPA. The only statutory incentive districts have to remain in compliance with this law is that federal funding can be revoked for any school in violation of the law.18

Today, all information held in the education record is protected under FERPA. However, as information conduits and data storage methods have evolved, a bit of a controversy has developed regarding what information is included in the education record and therefore protected from being accessed by others. The statute defines "education record" as records, files, documents or other materials which "contain information directly related to a student" and "are maintained by an educational agency or institution."19 At times, FERPA has been applied broadly when identifying content within the education record.20 However, the education record does not include teachers' personal notes that are not shared with others except substitutes, or records created and kept by law enforcement units.21

There are a few exceptions granted under the statute that allow districts to disclose information that would otherwise be a part of the education record without permission. The two most relevant exceptions include the Directory Information exception and the School Official exception.22 Directory Information refers to information that is generally not considered harmful if disclosed.23 This information may include but

16. Id. at § 1232g(a)(2).
17. 34 C.F.R. § 99.67 (2012).
18. 20 U.S.C. § 1232g(a)(1)(B)-(2).
19. 20 U.S.C. § 1232g(a)(4)(A).
20. For an example, see U.S. v. Miami University, 91 F. Supp. 2d 1132 (2000) (including student disciplinary records as part of the "education record").
21. 20 U.S.C. § 1232g(a)(4)(B). Subsection (a)(4)(B) also identifies records of employees of an educational agency who are not also students of that agency, and records on a student who is eighteen years of age or older that have been made or maintained by a physician or psychologist, as not included in the definition of "education record" for purposes of FERPA.
22. 20 U.S.C. § 1232g(b).
23. 34 C.F.R. § 99.3 (2012).
is not limited to the student's: name; address; telephone listing; e-mail address; photograph; date and place of birth; grade level; major field of study; participation in officially recognized activities and sports; weight and height (of members of athletic teams); dates of attendance; degrees and awards received; and the most recent previous educational agency or institution the student attended.24 Directory information does not include social security numbers.25 Under the Directory Information exception, districts may make such information public, but must give notice to parents of the categories of information that it intends to disclose. Parents have the right to "opt out" if they do not wish for this information to be made public.26

The School Official exception allows schools to disclose to certain individuals or entities information that is considered "personally identifiable information" (PII)27 or information that would otherwise be a part of the education record. The Code of Federal Regulations defines "personally identifiable information" as including, but not limited to, a student's: name; names of family members; address; any personal identifier such as social security number or biometric record; other indirect identifiers such as date or place of birth, or mother's maiden name; and any other information that "alone or in combination, is linked or linkable to a specific student that would allow a reasonable person in the school community, who does not have personal knowledge of the relevant circumstances, to identify the student with reasonable certainty."28 Under this exception, the school may disclose information without prior consent to other school officials, including teachers,29 or to vendors who perform a function for the school that would otherwise be performed by a school employee.30 The vendor must have a legitimate educational interest in the data and the school must be in direct control of the vendor's use and maintenance of the data.31 Under these circumstances, the vendor can

24. 20 U.S.C. § 1232g(a)(5)(A); 34 C.F.R. § 99.3.
25. Id. at § 1232g(a)(5)(B).
27. The statute does not define what constitutes "personally identifiable information."
28. 34 C.F.R. § 99.3.
29. Id. at § 99.31(a)(1)(i)(A).
30. Id. at § 99.31(a)(1)(i)(B).
31. Id. at § 99.31(a)(1).
use PII for authorized purposes only, and cannot redisclose the data without permission from the school and written consent from the parent.32

In light of the current trend of heightened violence in schools, two other exceptions under FERPA are worth noting. The first includes situations where an emergency has arisen and PII or information from the education record must be disclosed to protect the health or safety of the student or others.33 The second includes disclosing disciplinary information in the education record that concerns student conduct that has posed a significant risk to the safety or well-being of that student or others. Such information may be disclosed to teachers, school officials or others who have a legitimate educational interest in that student's behavior.34

Whenever an exception applies, information may only be disclosed on the conditions that the recipient of the information not share it with any other party without prior consent of the parent, and the information will only be used for the purpose for which the disclosure was made.35 Whenever a disclosure has been made, the district must make a reasonable attempt to notify the parents and provide them with a copy of the record that was disclosed.36 This has big implications for school districts looking to contract with third-party providers if the third party will or may have access to student records.

FERPA also requires that districts keep a record in each student's file of any individual or entity that has accessed or maintained the education record. Along with that information, the district must indicate what that individual or entity's legitimate interest was
in obtaining the information.37 Where a third party maintains education records under the School Official exception, the district must be able to provide a requesting parent access to those records.

FERPA also requires that regulations governing surveys or data-collecting activities be implemented to protect student privacy.38 This requirement was codified in what is sometimes referred to as the "Hatch Amendment," but is also known as the Protection of Pupil Rights Amendment.

There are some significant problems that are not addressed by the FERPA legislation. For these reasons, there is a strong likelihood that a proposal to amend FERPA will arise in Congress in the near future. In fact, during a congressional hearing held in early 2015, the House of Representatives heard testimony of several experts regarding the ways in which FERPA is falling short of protecting student privacy. At this time, it appears the House is considering updating FERPA.39

2. Protection of Pupil Rights Amendment of 1978 (PPRA)40

PPRA governs the administration of surveys soliciting specific categories of information, and imposes certain requirements regarding the collection and use of student information for marketing purposes. It requires that schools give notice to, obtain written consent from, and provide an opt-out opportunity to parents before students can participate in commercial activities that involve the collection, disclosure or use of personal information for marketing purposes.41 The exception to this is if a vendor is using student data solely for the purpose of developing, evaluating, or providing educational products or services to students or schools.42 Under the statute, "personal information" is defined as individually identifiable information including a student or parent's first and last name, a physical address, a telephone number, or a social security number.43

In addition to the difference in the way PPRA and FERPA define "personally identifiable information," some of the other distinctions

32. 20 U.S.C. § 1232g(b)(4)(B).
33. Id. at § 1232g(b)(1)(F).
34. Id. at § 1232g(h)(2).
35. 34 C.F.R. § 99.33(a).
36. Id. at § 99.34(a)(1).
37. 20 U.S.C. § 1232g(b)(4)(A).
38. Id. at § 1232g(c).
39. Technology and Student Privacy: Before the H. Comm. on Education and the Workforce, 114th Cong. (2015) (statement of Todd Rokita, Chairman, Subcomm. on Early Childhood, Elementary, and Secondary Education).
40. 20 U.S.C. § 1232h (2002).
41. Id.
42. Id. at § 1232h(c)(4)(A).
43. Id. at § 1232h(c)(6)(E).
between the two statutes are that FERPA is triggered where a school simply maintains education records containing personally identifiable information, while PPRA is only invoked when personal information is collected from the student. An example illustrating the difference between the two comes from the Privacy Technical Assistance Center, a subset of the U.S. Department of Education. If a district contracts with a provider under FERPA's school official exception for basic applications designed to help educate students, and sets up user accounts using basic enrollment information, such as name and grade, then the provider may not use data about individual student preferences to target ads to individual students. This would be an unauthorized use of the data, and would not constitute a legitimate educational interest as specified under FERPA. PPRA would also prohibit targeted ads unless the district had a policy addressing this and parents were notified. Parents must also have been given an opportunity to opt out of the marketing. However, the provider could use data conveying personally identifiable information to improve their product and its delivery. Additionally, the provider could use non-PII data to develop new products and services that the provider could then market to schools and districts.44

PPRA applies to K-12 students only, while FERPA protections extend to post-secondary students. Like FERPA, there is no private right of action under PPRA, so students and parents cannot enforce compliance with the statute. However, schools that do not comply jeopardize federal funding.45 Also noteworthy, PPRA applies to all students, regardless of age, as opposed to the Children's Online Privacy Protection Act, discussed in the next section, which only governs children age thirteen and younger.

44. Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, Privacy Technical Assistance Center (February 2014), http://ptac.ed.gov/sites/default/files/Student%20Privacy%20and%20Online%20Educational%20Services%20(February%202014).pdf.
45. 20 U.S.C. § 1232h(e); see also C.N. v. Ridgewood Bd. of Educ., 146 F. Supp. 2d 528 (D.N.J. 2001).
3. Children's Online Privacy Protection Act (COPPA)46

COPPA is designed to protect the privacy of children under the age of thirteen. COPPA prohibits commercial website operators from collecting personally identifiable information from children under thirteen without first obtaining "verifiable parental consent."47 The Code of Federal Regulations defines PII as:

[I]ndividually identifiable information about an individual collected online, including: a first and last name, a home or other physical address including street name and name of a city or town, online contact information [such as email address or other substantially similar identifier], a screen or user name, a telephone number, a social security number, a persistent identifier that can be used to recognize a user over time and across different Web sites or online services (includes but is not limited to a customer number held in a cookie, an Internet Protocol address, a processor or device serial number, or unique device identifier), a photograph, video or audio file where such files contain a child's image or voice, geolocation information sufficient to identify street name and the name of a city or town, or information concerning the child or the parents of that child that the operator collects online from the child and combines with an identifier described above.48

Consent must be obtained even when such information is offered voluntarily.

COPPA applies to any website operator or online service that is directed to children. The Code of Federal Regulations expounds on some of the factors that are considered when evaluating whether a website is directed towards children. These factors include subject matter of the site, visual content, use of animated characters or child-oriented activities, music and other audio content, age of models, presence of child celebrities or celebrities who appeal to children, and whether advertising promoting or appearing on the website is directed to children.49 When a website has "actual knowledge that it collects

46. 15 U.S.C. §§ 6501-6506.
47. Id. at § 6502.
48. 16 C.F.R. § 312.2 (2013).
49. Id.
personal information directly from users of another website" that is directed towards children, the first website is also considered directed towards children for purposes of applying COPPA.50 COPPA also covers operators who direct their content towards a general audience but have actual knowledge that they are collecting personal information from a child.51 Websites that do not target children as their primary audience are exempt under COPPA if they do not collect personal information from any visitor prior to collecting age information and prevent the use, collection or disclosure of personal information from visitors who identify themselves as under the age of thirteen.52

Website operators who fall under COPPA regulations must provide notice on their website of their data collection policies, as well as allow parents to review the personal information collected from a child and allow parents to refuse further permission of its use.53 Once personal information is collected, the website operator must protect the confidentiality, security and integrity of that information and take reasonable steps to release the information only to third parties who are capable of maintaining the confidentiality, security and integrity of the information.54 Any personal information collected may be retained only for as long as is necessary to fulfill the purpose for which the information was collected. Once that purpose has been fulfilled, the operator must delete the information using reasonable measures to protect against unauthorized access to, or use of, the information.55

Like FERPA and PPRA, there is no private right of action for children whose information has been collected in violation of COPPA. However, when a website operator has violated the Act, a state's attorney general may bring a civil action to enjoin the illegal practice and enforce compliance, as well as obtain damages, restitution or other

50. Id.
51. Id.
52. 15 U.S.C. § 6502.
53. Id. at § 312.3.
54. Id. at § 312.8.
55. Id. § 312.10.
compensation on behalf of residents of the State.56 The statute also allows for "other relief as the court may consider to be appropriate."57

4. Children's Internet Protection Act (CIPA) & Neighborhood Children's Internet Protection Act (NCIPA)58

Congress enacted CIPA in response to public library patrons who were using the library's Internet to access pornographic websites.59 Under CIPA, a public library, or elementary or secondary school, may not receive federal funds under the E-rate program or the Library Services and Technology Act (LSTA) unless it has "a policy of Internet safety for minors that includes the operation of a technology protection measure ... that protects against access ... to visual depictions that are obscene or pornographic or harmful to minors."60 These entities must submit certification that they are complying with the protection measures required by CIPA.61 Failure to do so may result in loss of federal funding.62

One important thing to note about CIPA is that while it protects minors from exposure to pornography, it is not designed to protect against other non-visual communications on the Internet. Thus, CIPA leaves minors unprotected from other forms of inappropriate or harmful interactions on the Internet, including data mining and exposure to cyberbullying.63 NCIPA tackles this problem in part by requiring schools and public libraries to adopt and implement an Internet safety policy that addresses the accessibility of inappropriate material on the Internet, the safety and security of minors while using email and other forms of Internet communication, and the unauthorized disclosure of personally

56. 15 U.S.C. § 6504 (2012).
57. Id.
58. 20 U.S.C. § 9134(f) (2012); 47 U.S.C. § 254(h) (2012).
59. United States v. Am. Library Ass'n, Inc., 539 U.S. 194, 199 (2003).
60. 20 U.S.C. § 9134(f)(1)(A)(i) (2012). See 47 U.S.C. § 254(h)(5)(B)(i) (2012).
61. 47 U.S.C. § 254(h)(5)(B) (2012).
62. Id. § 254(h)(5)(F) (2012). See 20 U.S.C. § 9134(f)(5) (2012).
63. Frank Kemerer & Peter Sansom, California School Law 81 (Stanford University Press, 3d ed. 2013).
identifiable information.64 NCIPA also addresses hacking and other unlawful activities by minors online.65

B. Possible Legislation: Student Digital Privacy and Parental Rights Act66

At the time of this writing, U.S. Representatives Jared Polis and Luke Messer are sponsoring a bill known on Capitol Hill as the Student Digital Privacy and Parental Rights Act. Although it has not yet been formally introduced in Congress, a draft of the bill has circulated. Due to the criticism the bill has received, the introduction has been postponed.67 Among the bill's critics are privacy advocates who feel the bill is not rigorous enough to adequately protect student privacy, as well as technology groups who oppose the bill because they favor self-regulation.68

The current draft of the Student Digital Privacy and Parental Rights Act appears to allow education technology companies to continue to collect and compile student information and then mine that data for commercial gain. Apparently, such companies may sell students' personal information to colleges, employers, and military recruiters.69 The current draft also appears to allow schools to authorize even wider disclosure of student data without parental notification or permission.70 However, the current bill does ban the use of targeted advertising, and it requires that ed-tech companies delete data within 45 days of a school's request. Making it unique among other federal student privacy

64. 47 U.S.C. § 254(l) (2012).
65. Id.
66. At this time, a bill has not been formally introduced and there is no bill number.
67. Simon, supra note 5; Benjamin Herold, Federal Student-Data Privacy Bill Delayed, Education Week (March 23, 2015, 9:33 PM), http://blogs.edweek.org/edweek/DigitalEducation/2015/03/federal_student_data_privacy_bill_delayed.html.
68. Bill Would Limit Use of Student Data, The New York Times.
69. Simon, supra note 5.
70. Id.
laws, the current version of the bill appoints the Federal Trade Commission to handle enforcement of the law.71

Of course, there is no guarantee that the Student Digital Privacy and Parental Rights Act will become law, and if it does, there is no guarantee that any of the elements mentioned above will be present in the final version of the bill. It is clear, however, that Congress is alert to the growing concerns of parents and students regarding student privacy being at risk. Regardless of whether the Student Digital Privacy and Parental Rights Act is approved by Congress, the reader should be aware that there will likely be more legislation to follow.

C. Relevant California Law

The following list is not comprehensive, but it does highlight the breadth of, as well as give substantial insight into, current California laws that affect student privacy:

1. California Constitution

Article 1, section 1 of the California Constitution explicitly grants privacy as an inalienable right: "All people are by nature free and independent and have inalienable rights. Among these are enjoying and defending life and liberty, acquiring, possessing, and protecting property, and pursuing and obtaining safety, happiness, and privacy."72

2. California Civil Code section 1798.2973

Any agency in California that owns or licenses or maintains computerized data that includes personally identifiable information (PII) must disclose any breach of the data to every California resident whose unencrypted personal information is reasonably believed to have been acquired by an unauthorized person. Included under the definition of PII is an individual's first name (or initial) and last name, in combination with one or more designated data elements relating to:

71. Id.
72. Cal. Const. art. 1, § 1.
73. Cal. Civ. Code § 1798.29 (West 2014).
social security numbers, driver's license numbers, financial accounts and medical information. As well, a user name or email address, in combination with a password or security question and answer that would permit access to an online account, also qualifies as personal information.

3. California Business and Professions Code section 2258074

This is the California version of COPPA, which went into effect January 1, 2015. It prohibits the operator of a website, online service, online application or mobile application from marketing or advertising certain products to a minor. Such products include alcohol, firearms, aerosol containers capable of defacing property, tobacco products and electronic cigarettes, salvia divinorum, fireworks, tanning devices, dietary supplements, lottery tickets, tattoos, drug paraphernalia and obscene matter. It prohibits an operator from knowingly using, disclosing, compiling, or allowing a third party to use, disclose or compile, the personal information of a minor for the purpose of marketing or advertising products. The prohibition is also applicable to an advertising service that has been notified by the website operator that the site is directed to minors. One thing that distinguishes this piece of legislation from COPPA is that operators of mobile applications are definitively obligated under this law, whereas COPPA is less clear as to whether operators of mobile applications fall within the confines of the law. This law also authorizes a minor to have content or information removed.

4. California Online Privacy Act (sections 22575-22579)75

Section 22575 of the Business and Professions Code states that commercial website operators and online services that collect PII about customers must post a conspicuous privacy policy on their website. The policy must identify the types of data being collected and the types of third

74. Cal. Bus. & Prof. Code § 22580 (West 2015).
75. Id. §§ 22575-22579 (West 2014).
parties with whom the information may be shared. Section 22576 states that website operators are in violation of this section if the operator knowingly and willfully, or negligently and materially, violates the statute. PII includes first and last name, physical address, email address, telephone number, social security number, and any other identifier that permits the physical or online contact of a specific individual.

5. New Legislation: Student Online Personal Information Protection Act (SOPIPA) (Cal. Bus. & Prof. Code sections 22584-22585)76

Senate Bill 1177 brought this newly enacted piece of legislation into existence, effective on January 1, 2016. It will prohibit an operator of a website, online service, online application, or mobile application from knowingly engaging in targeted advertising to students or their parents or legal guardians. Such operators will be prohibited from using covered information to compile student profiles, selling student information, and disclosing covered information. It requires operators to implement and maintain reasonable security procedures and practices to protect the information from unauthorized access, and it requires operators to delete student data if requested to do so by a school or district.

6. California Education Code

Section 4906277 gives authority to the State Board of Education to establish the standards under which school districts will establish, maintain, and destroy student records.

Section 35182.578 states that a school may not enter into a contract for electronic products or services that requires the dissemination of advertising to students unless the electronic product or service would be an integral educational component and the district cannot afford to provide the product or service without contracting to permit dissemination of advertising. In such a case, the school must provide written notice to parents that the advertising will be used, and parents

76. S.B. 1177 (Cal. 2014), http://www.leginfo.ca.gov/pub/13-14/bill/sen/sb_1151-1200/sb_1177_bill_20140929_chaptered.pdf.
77. Cal. Educ. Code § 49062 (West 2014).
78. Id. § 35182.5 (West 2014).
must be given the opportunity to request in writing that the student not be exposed to the program containing the advertising.

7. Privacy of Pupil Records (sections 49073-49079.7)79

Similar to requirements under FERPA, California law says schools must adopt a policy and give annual notice identifying which information will be defined as "directory information" and thus will be subject to release. Parents have the right to prohibit directory information from being released if they notify the school. Additionally, no information, including directory information, regarding a student identified as homeless may be released without consent. As well, no directory information about any student may be released to a private profit-making entity.80

Parents may give written consent to allow access to student records to anyone of their choosing. Those who have permission to access records must be given notice that the transmission of the information to others without written consent from the parent is prohibited.81 Districts cannot permit access to student records to anyone without parental consent except as established in the regulations under FERPA. This section includes a few exceptions to who may obtain records without parental consent, including school officials and employees, and students sixteen years of age or older or who have completed 10th grade.82 Students who are fourteen years of age or older who are either homeless or unaccompanied (42 U.S.C. section 11434a) are also entitled to their own records. Schools may release records when there is an emergency and the information is necessary to protect the health or safety of the student or others. The statute also allows information to be released to organizations that conduct studies for educational agencies for the purpose of developing programs or tests or improving instruction.

79. Id. §§ 49073-49079.9 (West 2014).
80. Id. § 49073(a) (West 2014).
81. Id. § 49075(a) (West 2002).
82. Id. § 49076(a)(2)(E) (West 2014).
Districts must provide teachers with information when a student has engaged in, or is reasonably suspected to have engaged in, acts that warrant suspension or expulsion under the education code.83 Information provided to teachers in these instances will come from records that the district maintains in its ordinary course of business.

8. New Legislation: Cal. Educ. Code section 49073.684

Assembly Bill 1442 went into effect on January 1, 2015, and requires schools to contact parents when considering the implementation of a program to gather or maintain, as part of the education record, information about a student obtained from social media. Parents must be informed of the proposed program and provided with an opportunity to offer public comment before any such program may be adopted. This section also requires the destruction of such information one year after a
student reaches the age of eighteen, or within one year after the student is no longer enrolled in the district.

9. New Legislation: Cal. Educ. Code section 49073.185

Assembly Bill 1584 also went into effect on January 1, 2015, and allows schools to enter into contracts with third parties to provide the digital storage, management, and retrieval of student records. Included in such contracts, there must be a statement declaring that student records continue to be the property of and under the control of the district, along with a description of the actions the third party will take to ensure the security and confidentiality of student records and a description of how the district will ensure compliance with FERPA. Contracts failing to comply with these requirements are void. The law does not apply to any contracts existing prior to January 1, 2015, unless or until they expire, are amended or are renewed.

83. Id. § 49079(a) (West 2001).
84. Id. § 49073.6 (West 2015).
85. Id. § 49073.1 (West 2015).

II. WHO IS COLLECTING DATA, HOW IS IT BEING COLLECTED, AND WHY DOES IT MATTER?

As districts begin to incorporate technology into their schools, especially where one-to-one programs are in place, understanding the actions that expose students to risk is imperative. There are three main groups that are interested in data collection. These include advertisers, criminals, and the government.86 Each group wants different information for varying reasons, but they are all using technology to access the information they are seeking, and are jeopardizing students' privacy in the process. Of course, as individual consumers, we are all at risk of privacy invasion when using the Internet, but students in particular are a vulnerable group because of their exposure to online dealings both at home and at school, as well as their maturity level and lack of inhibition with regard to sharing personal information online.87 The following sections will
discuss how each of the three main data collectors (advertisers, criminals and government) uses technology to infiltrate technology users' lives and gather private data.

A. Advertisers

Advertisers want to reach large audiences efficiently and effectively. They do this using a process called "targeted advertising," which allows marketers to place advertisements in front of groups of consumers based on various traits such as demographics and other behavioral variables. This kind of behavioral targeting uses information gathered about individual consumers' habits so as to gain insights about the types of products and services that an individual is likely to purchase. Then marketers can strategically place advertisements in front of an individual that are specifically tailored to that individual. One of the most efficient ways to collect data about individual consumers is by

86. Fact Sheet 2B: Privacy in the Age of the Smartphone, Privacy Rights Clearinghouse, https://www.privacyrights.org/smartphone-cell-phone-privacy (last updated Feb. ...).
87. Reasonable Expectations of Privacy for Youth in a Digital Age, 80 Miss. L.J. 1035 (2011).
tracking their activity online. Thus, a quick overview of how data miners are able to track activity is necessary to better understand how advertisers work and why.

1. IP Addresses

In the past, every digital device was designed to have its own Internet Protocol (IP) address, which could be used to identify the web activities of each device. When signing up with an Internet service provider (ISP), a record connecting a computer's IP address to an individual or an entity was created. By simply browsing the Internet, a record was created for every website the device visited.88

More devices are entering society, and the average American now owns four devices.89 This means there are more connected devices than there are unique public IP addresses.90 To resolve the problem, a public IP address is assigned to a router, which then shares the address among all other computers and devices that access the Internet via that particular router.91 This process is sometimes referred to as "IP tethering" because it ties all devices using the same router together. Local IP addresses are then issued to devices that use the router to make them distinguishable from each other.92 Generally, a device needs to access the Internet more than once for the device to be tied to the public IP address.

Regardless of whether the individual device or the router is given the public IP address, information can be tracked back to the individual or entity that is registered as owning that IP address. In the case of a district-owned device, the IP address would be registered to the district, and thus the connection between IP address and owner would not immediately implicate a student who has been assigned to that device. However, as is brought to light in the subsequent sections, due to advances in technology, this is a specious conviction.

88. Fact Sheet 18: Online Privacy: Using the Internet Safely, Privacy Rights Clearinghouse, https://www.privacyrights.org/online-privacy-using-internet-safely (last updated Oct. 2014).
89. The U.S. Digital Consumer Report, Nielsen, http://www.nielsen.com/content/corporate/us/en/insights/reports/2014/the-us-digital-consumer-report.html (last updated Feb. 2014).
90. Chris Hoffman, How and Why All Devices in Your Home Share One IP Address, How-To Geek, http://www.howtogeek.com/148664/how-and-why-all-devices-in-your-home-share-one-ip-address/ (last updated Apr. 15, 2013).
92. Gavin Dunaway, ID Is Key: Unlocking Mobile Tracking & Cross-Device Measurement, Part 2, AdMonsters, https://www.admonsters.com/blog/id-key-unlocking-mobile-tracking-cross-device-measurement-part-2 (last updated Aug. 3, 2013).
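To make the sharing arrangement described above concrete, the short Python sketch below (standard library only) models several devices behind one router. The device names and all addresses are invented for illustration; 192.168.x.x is one of the reserved private ranges a router typically hands out, while the single public address is what outside websites actually record.

```python
# A minimal sketch of "IP tethering": one public address registered to the
# subscriber, shared by every device behind the router. All addresses and
# device names here are hypothetical.
import ipaddress

PUBLIC_IP = "203.0.113.7"  # hypothetical address registered to the district or household

# Hypothetical router-issued local addresses, one per device
devices = {
    "student-laptop": "192.168.1.10",
    "student-phone":  "192.168.1.11",
    "parent-tablet":  "192.168.1.12",
}

for name, addr in devices.items():
    ip = ipaddress.ip_address(addr)
    # is_private is True for reserved ranges such as 192.168.0.0/16,
    # so these addresses are only meaningful inside the local network.
    print(f"{name}: local {addr} (private={ip.is_private}) -> seen externally as {PUBLIC_IP}")
```

A website's log would therefore show three browsing sessions from the same public address, one that traces back only to whoever registered the connection; the sections that follow show how trackers erode exactly this limitation.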
2. Search Engines

Data collection does not end with IP addresses, however. Search engines, such as Google, Yahoo and Bing, are programs that are designed to search documents on the Internet for specific keywords and then generate a list of documents where the keywords were found.93 Search engines have the ability to track not just an IP address, but also the terms that were used in the search, the time that the search occurred, as well as other data.94 This is where data collection can begin to implicate individual students, regardless of to whom the device is registered. If search terms reveal personal information, for example a name and a social security number, then the search engine will have a record of that information connecting it to the user of the device. When an individual uses the same entity for a search engine and email accounts, for example, Google and Gmail, the search engine can then tie the email login to the searches as well.95 As this occurs, data collection becomes no longer tied to the registered owner of the device, but rather is tied to the individual user.

3. Cookies

"Cookies" are another tracking tool that websites deposit on a device's hard drive to record and save information.96 The cookie enables the web browser to retain information such as login, user preferences, shopping cart information and other personalized information that bolsters the ease and speed of digital interactions. Although this can be convenient for the user, some cookies also track and collect information about the user, which is then communicated to advertisers and online marketers.97 Websites such as www.Ghostery.com allow users to identify and block companies that are tracking through the use of cookies. Moreover, smartphones, which are being increasingly used to access webpages, do not use cookies. This combination has caused cookies to lose appeal to data miners who are trying to stay ahead of consumers. Instead, data miners are now turning to what is known as fingerprinting technology.

93. Vangie Beal, search engine, Webopedia, http://www.webopedia.com/TERM/S/search_engine.html (last visited Sept. 30, 2014).
94. Fact Sheet 18, supra note 88.
95. Id.
96. What is the difference between a web browser and a search engine?, The Computer Geek, http://www.computer-geek.net/what-is-the-difference-be-va-47.html (last visited Oct. 1, 2014) (A web browser is different from a search engine in that the browser is the tool used to access the Internet. The search engine is the tool used to search within the Internet once it has been accessed by the browser).
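The mechanics of depositing a cookie can be sketched briefly. The fragment below uses Python's standard http.cookies module to build the Set-Cookie header a server might send; the cookie name, the random identifier, and the one-year lifetime are illustrative assumptions rather than the practice of any particular site.

```python
# A minimal sketch of a tracking cookie: the server assigns a random,
# persistent identifier that the browser will return on every later visit.
from http.cookies import SimpleCookie
import uuid

cookie = SimpleCookie()
cookie["visitor_id"] = uuid.uuid4().hex               # hypothetical identifier name and value
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # persist for roughly one year
cookie["visitor_id"]["path"] = "/"

# The header below is what the server sends; because the browser echoes the
# same visitor_id back with each request, separate visits become linkable.
print(cookie.output())  # Set-Cookie: visitor_id=...; Max-Age=31536000; Path=/
```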
4. Fingerprinting

Similar to cookies, but far more surreptitious, are "fingerprints," which summarize the software and hardware settings (like clock settings, fonts, installed software, etc.) that are unique to each computer.98 Each time the device connects to the Internet, these details are collected and pieced together to form a "fingerprint." The more changes to a device's software and settings, the more identifiable the device becomes.99 The fingerprint can be assigned an identifying number that is used to track the device in the same way cookies are used. Fingerprints, unlike cookies, which can be detected and removed from a hard drive, are impossible to detect and much tougher to block. This makes them a preferable tracking method for data miners, and thus fingerprinting is being used more and more frequently than antiquated cookies. To identify the details a computer or other device is transmitting, visit www.panopticlick.eff.org.

97. Fact Sheet 18, supra note 88.
98. Adam Tanner, The Web Cookie Is Dying. Here's The Creepier Technology That Comes Next, Forbes, http://www.forbes.com/sites/adamtanner/2013/06/17/the-web-cookie-is-dying-heres-the-creepier-technology-that-comes-next/ (last updated Jun. 17, 2013).
99. Id.
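Because a fingerprint is derived from the device rather than stored on it, it can be sketched as nothing more than a hash over collected settings. The attribute names and values below are invented; real fingerprinting scripts gather many more signals, but the principle is the same.

```python
# A minimal sketch of device fingerprinting: stable software and hardware
# details are canonicalised and hashed into an identifying number.
import hashlib

device_attributes = {  # hypothetical values of the kind described above
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen":     "1366x768x24",
    "timezone":   "UTC-08:00",
    "fonts":      "Arial,Calibri,Comic Sans MS,Tahoma",
    "plugins":    "pdf-viewer,media-player",
}

# The same device yields the same digest on every connection, with nothing
# deposited on the hard drive for the user to find and delete.
canonical = "|".join(f"{k}={v}" for k, v in sorted(device_attributes.items()))
fingerprint = hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]
print(f"device fingerprint: {fingerprint}")
```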
5. Householding

Technology100 has now advanced to the point where the combination of IP tethering and fingerprinting can be used to track across multiple devices, so that they are all associated with one user or one household, and layer all the information collected into one profile.101 This process, called "householding," is also sometimes referred to as "multi-screen" or cross-device identification. In the past, unique identifiers were linked to individual computers, but now identifiers are more closely tied to the individual using the device.102

More than any other tracking method, householding has major implications for school districts with one-to-one technology programs, regardless of whether the program offers district-owned devices or operates through a Bring-Your-Own-Device (BYOD) program. Householding allows data collection to occur across devices regardless of whether or not they are owned by a district or owned privately by a student, and thus any activity that a student engages in at home on a personal device can be linked to activity conducted by that student at school. Householding is able to identify every device the student uses, regardless of who owns the device, where it is used, and for what purpose. Therefore, the line between data collected from the student on the student's own device on his or her own time, and data collected from the student while using a district device at school, is virtually eliminated. Data miners will not distinguish between data collected at school and data collected at home. Nor will they distinguish between data collected on a district device and data collected on a personal device. All devices are capable of being linked to one individual, and all data collected across devices can be compounded and cross-referenced to create a broader image of who an individual user is.

100. See Cross-Screen Starts Here, BlueCava, bluecava.com/about (last visited Oct. 1, 2014). See also Specific Media Launches Householding, Specific Media, http://specificmedia.com/specific-media-launches-householding-2/ (last updated Sept. 24, 2012).
101. Dunaway, supra note 92.
102. Id.
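The linking step at the heart of householding can be sketched as a small union-find computation. In the hypothetical Python fragment below, observations from three devices are merged whenever they share a login or a public IP address; the identifiers and the two linking keys are invented simplifications of the richer signals commercial trackers use.

```python
# A minimal sketch of householding: device observations that share any
# linking key (login or public IP) collapse into one profile.
from collections import defaultdict

observations = [  # hypothetical sightings of the same student
    {"device": "school-laptop", "login": "student@example.com", "public_ip": "203.0.113.7"},
    {"device": "home-phone",    "login": "student@example.com", "public_ip": "198.51.100.2"},
    {"device": "home-desktop",  "login": None,                  "public_ip": "198.51.100.2"},
]

parent = {}  # union-find structure over device names

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Link every pair of devices that were seen with the same key.
seen_with = defaultdict(list)
for obs in observations:
    for key in (obs["login"], obs["public_ip"]):
        if key is not None:
            seen_with[key].append(obs["device"])
for linked in seen_with.values():
    for device in linked[1:]:
        union(linked[0], device)

profiles = defaultdict(list)
for obs in observations:
    profiles[find(obs["device"])].append(obs["device"])
# All three devices end up in a single household profile.
print(dict(profiles))
```

The school laptop and the home phone share a login, the home phone and the home desktop share a home IP address, so all three collapse into one profile, which is exactly why district ownership of a device offers so little insulation.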
6. Smartphones and Tablets

Smartphones, like computers, have their own IP address, and thus can be tracked in the same manner that a computer can be tracked. Service providers for smartphones are also able to collect information about incoming and outgoing calls (including phone numbers and duration), how frequently users check email or access the Internet from the phone, and the user's location.103

Most individuals who use smartphones (and tablets) use mobile applications (apps), rather than an Internet browser, to access the Internet and its features.104 Apps can provide a wide variety of functions ranging from mapping services, to games, to flashlights, all of which make them attractive to users. The apps often have a dual function, however. Naturally, they deliver the content or service they are designed to provide, but they are also advertising gold mines. Particularly when an app is free to download, developers make money either by selling space within the app to advertisers who can market to a captive audience, or by collecting data from the unsuspecting user and then selling it to marketing agencies.

Generally, the type of data collected includes phone and email contacts, call logs, Internet data, calendar data, data about the device's location, the device's unique IDs, and information about how the user engages with the app itself.105 Some apps have a user agreement alerting users to their data collection policies, but the privacy policy (if it exists at all) does not always identify what data are being collected, how long the data are being stored, where the data are being stored, and who may ultimately end up receiving the data.106 Thus, these policies are often unleashed on users who are unaware their data are being harvested. For example, Angry Birds, one of the first games made for smart phones,

103. Fact Sheet 2B, supra note 86.
104. Understanding Mobile Apps, On Guard Online, http://www.onguardonline.gov/articles/0018-understanding-mobile-apps (last visited Oct. 1, 2014).
105. Id.
106. FPF Finds Nearly Three-Quarters of Most Downloaded Mobile Apps Lack a Privacy Policy, Future of Privacy Forum, http://www.futureofprivacy.org/2011/05/12/fpf-finds-nearly-three-quarters-of-most-downloaded-mobile-apps-lack-a-privacy-policy (last updated May 12, 2011) (The Future of Privacy Forum found that nearly 75% of downloadable apps lack a privacy policy).
collects personal information such as location data from users that is unnecessary to run the game.107 In fact, in a 2013 Carnegie Mellon study, over half of the top 100 Android applications accessed the device ID, contacts lists or location data. The study found that this was often contrary to user expectation.108 In a similar investigation, the Wall Street Journal found that almost half of 101 smartphone apps were tracking the phone's location.109 Fifty-six shared the phone's unique ID number, forty-seven transmitted the phone's location, and five shared the user's age, gender and other personal details such as contacts list.110

Victims of this kind of privacy invasion extend beyond just the users of mobile apps. Any of the users' contacts are also at risk of having information collected. Of course, contact information such as phone number and email address will be collected if the users' address book is accessed, but any information offered in an email or text message may be intercepted and collected as well.

Districts with one-to-one technology programs need to be careful when students download and use apps for classroom or homework assignments. FERPA does not appear to cover any information collected by a third-party app operator, and COPPA only protects students up to age thirteen, so student information accessed in this manner can be easily collected without federal violation. Thus, the question arises, whose responsibility is it to protect students from exposure to data collection in this situation? Do parents or students have the right to opt out of an assignment that requires the use of a mobile app? How can students who wish to maintain anonymity protect themselves? What rights do they have? The answers to these questions remain unresolved. However, as more legislation is introduced and as case law on the subject develops, answers will surely begin to emerge.

107. Cheryl Conner, Your Privacy Has Gone to the [Angry] Birds, Forbes, http://www.forbes.com/sites/cherylsnappconner/2012/12/05/your-privacy-has-gone-to-the-angry-birds/ (last updated Dec. 05, 2012).
108. Press Release: Did Your Smartphone Flashlight Rat You Out? Crowdsourcing Privacy Concerns of Mobile Apps, Carnegie Mellon University, http://www.cmu.edu/news/stories/archives/2015/january/jan15_appprivacyconcerns.html (last updated Jan. 15, 2015).
109. Fact Sheet 18, supra note 88.
110. What They Know - Mobile, Wall Street Journal (Oct. 1, 2014, 7:56 AM), http://blogs.wsj.com/wtk-mobile/.
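To illustrate the kinds of transmissions described in this section, the sketch below assembles a hypothetical analytics event from the data categories listed above. Every field name and value is invented, and no particular app or analytics service is being described; the point is only how much identifying material one routine event can carry.

```python
# A hypothetical analytics payload of the kind a free app might send to a
# third-party collector. All fields and values are invented for illustration.
import json
import time

analytics_event = {
    "device_id": "a1b2c3d4-0000-1111-2222-333344445555",  # persistent unique ID
    "timestamp": int(time.time()),
    "location": {"lat": 32.7157, "lon": -117.1611},       # device location
    "contacts_count": 184,                                 # size of harvested address book
    "event": "level_completed",                            # in-app engagement data
    "session_seconds": 412,
}

# Serialized and sent off-device, typically without the user ever seeing it.
print(json.dumps(analytics_event, indent=2))
```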
103. Fact Sheet 2B, supra note 86.
104. Understanding Mobile Apps, ON GUARD ONLINE, http://www.onguardonline.gov/articles/0018-understanding-mobile-apps (last visited Oct. 1, 2014).
105. Id.
106. FPF Finds Nearly Three-Quarters of Most Downloaded Mobile Apps Lack a Privacy Policy, FUTURE OF PRIVACY FORUM, http://www.futureofprivacy.org/2011/05/12/fpf-finds-nearly-three-quarters-of-most-downloaded-mobile-apps-lack-a-privacy-policy (last updated May 12, 2011) (finding that nearly 75% of the most downloaded apps lack a privacy policy).
107. Cheryl Conner, Your Privacy Has Gone to the [Angry] Birds, FORBES, http://www.forbes.com/sites/cherylsnappconner/2012/12/05/your-privacy-has-gone-to-the-angry-birds/ (last updated Dec. 5, 2012).
108. Press Release: Did Your Smartphone Flashlight Rat You Out? Crowdsourcing Privacy Concerns of Mobile Apps, CARNEGIE MELLON UNIVERSITY, http://www.cmu.edu/news/stories/archives/2015/january/jan15_appprivacyconcerns.html (last updated Jan. 15, 2015).
109. Fact Sheet 18, supra note 88.
110. What They Know - Mobile, WALL STREET JOURNAL (Oct. 1, 2014, 7:56 AM), http://blogs.wsj.com/wtk-mobile/.

7. The Cloud

As technology advances, the trend is to move information storage and hosting from tangible dedicated servers to cloud computing. The term "cloud" is simply another name for the Internet, although the term is constantly evolving.111 In general, cloud-computing platforms allow users to access software applications through web browsers and to store files and email on the Internet.112 Cloud computing is attractive to school districts because it requires less maintenance and is cheaper and more efficient. Cloud storage allows users to access files from any device that has Internet access; thus, the cloud can be used to increase student access to learning activities and to enhance students' ability to collaborate with one another or receive feedback from teachers. However, the cloud has also opened the door to advertisers looking to access personal information.113

One of the problems with cloud computing is that most cloud providers who do not charge for their services fund those services through advertising to their clients or through data mining.114 For example, Google, Yahoo, and Hotmail are all email providers that use the cloud, but they are able to provide free services because of the money they make by selling user information to data collectors and by selling targeted advertisements created from the collected data.115
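The business model just described can be sketched concretely. The Python fragment below is a toy illustration, not any provider's actual system: the keyword table, the messages, and the matching rule are all invented. It shows the basic mechanic of contextual advertising over mined message content: scan the text, match commercially useful terms, and emit ad categories.

    # Toy sketch of contextual ad targeting over mined message text. The
    # keyword-to-category table and the messages are invented examples; real
    # providers' systems are proprietary and far more sophisticated.
    AD_CATEGORIES = {
        "sneakers": "athletic footwear",
        "tryouts": "youth sports gear",
        "calculus": "tutoring services",
        "prom": "formalwear rental",
    }

    def target_ads(message_text):
        """Return the ad categories suggested by the words in one message."""
        words = {w.strip(".,!?").lower() for w in message_text.split()}
        return sorted({cat for kw, cat in AD_CATEGORIES.items() if kw in words})

    inbox = [
        "Soccer tryouts are Tuesday, don't forget your sneakers!",
        "Can you help me with the calculus homework tonight?",
    ]
    for message in inbox:
        print(target_ads(message))
    # ['athletic footwear', 'youth sports gear']
    # ['tutoring services']

Even this toy version makes the essential point: the advertising value comes from reading the content itself, which is precisely the data-mining step that free-service privacy policies authorize.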
The major problem with this is that even though most companies promise to remove identifying information from the data they mine, once the data have been transferred to a marketer, data aggregators can recombine information to re-identify an individual.116 Most people are aware that marketers have the ability to track online interactions. However, the aggregation of data compiled from pieces collected in various places creates a substantial issue: it can turn non-personally identifiable information (non-PII) into personally identifiable information (PII), virtually eliminating the distinction.

There is a wide misperception that if specific identifiable information, such as a name, address, or social security number, is not voluntarily given, an individual can operate online anonymously.117 Unfortunately, the problem is not unique to cloud computing; it exists any time data collection is occurring. For example, the combination of three seemingly innocuous, and relatively available, pieces of information (gender, zip code, and date of birth) is sufficient to accurately identify 87% of individuals in the United States.118 When this concept is applied to data mining through the collection of a student's web search terms and other text from messages and online documents, the aggregated data can combine to form a well-developed profile of the individual.119 By the time the information is sold to an advertiser, there is no sense of anonymity.
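The 87% figure rests on a simple mechanism that can be reproduced in miniature. In the Python sketch below, all records are invented: a "de-identified" dataset is joined with a public, name-bearing dataset on the quasi-identifiers gender, zip code, and date of birth, and because that combination is close to unique, the supposedly anonymous row is re-identified.

    # Minimal illustration of re-identification by joining on quasi-identifiers.
    # Both datasets are invented; the technique mirrors Sweeney's observation
    # that (gender, zip code, date of birth) is unique for most Americans.

    # A "de-identified" record released without names.
    anonymized = [
        {"gender": "F", "zip": "85701", "dob": "2001-03-14",
         "search_terms": ["scholarship essay help", "asthma inhaler"]},
    ]

    # A public, name-bearing dataset, e.g. a voter roll or school directory.
    public = [
        {"name": "Jane Doe", "gender": "F", "zip": "85701", "dob": "2001-03-14"},
        {"name": "John Roe", "gender": "M", "zip": "85701", "dob": "1999-07-02"},
    ]

    QUASI_IDENTIFIERS = ("gender", "zip", "dob")

    def reidentify(anon_rows, named_rows):
        """Attach names to 'anonymous' rows that match on all quasi-identifiers."""
        index = {tuple(r[k] for k in QUASI_IDENTIFIERS): r["name"]
                 for r in named_rows}
        matches = []
        for row in anon_rows:
            key = tuple(row[k] for k in QUASI_IDENTIFIERS)
            if key in index:
                matches.append((index[key], row["search_terms"]))
        return matches

    print(reidentify(anonymized, public))
    # [('Jane Doe', ['scholarship essay help', 'asthma inhaler'])]

The only added ingredient is a second dataset that pairs the same quasi-identifiers with names, and such datasets (voter rolls, directories, social media profiles) are widely available, which is why removing names alone does not anonymize data.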
111. Andrea Cascia, Don't Lose Your Head in the Cloud: Cloud Computing and Directed Marketing Raise Student Privacy Issues in K-12 Schools, 261 ED. L. REP. 883, 883 (2011).
112. Id. (technically, files and emails are stored on offsite servers).
113. Id. at 885.
114. Id.
115. Id. at 885.
116. Id.
117. Paul M. Schwartz & Daniel J. Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 N.Y.U. L. REV. 1814, 1836 (2011).
118. Id. at 1842 (citing Latanya Sweeney, Simple Demographics Often Identify People Uniquely (Carnegie Mellon Univ., Sch. of Computer Sci., Data Privacy Lab., Working Paper No. 3, 2000)).
119. Cascia, supra note 111, at 885.

8. The Problems With PII

The ability to aggregate data to compile a portrait of an identifiable individual is particularly problematic for schools because of the various restrictions on PII in federal legislation. As seen in the first section, many laws, including FERPA, PPRA, and COPPA, center on what constitutes personally identifiable information. Under FERPA, PII cannot be shared without parental consent; under PPRA, proper notification must be given before PII may be collected for commercial purposes; and COPPA prevents the collection of PII from children under the age of thirteen without consent. Therefore, whenever a student's personal information is at issue, some law is likely to apply. Identifying which laws apply can be difficult, especially when the collected information might not become PII until after the transaction at issue has occurred. Some of the laws have broadened their definitions of PII to include data that are not traditionally linkable to a student, such as geolocation. However, each law has its own definition of PII, making it harder to keep track of when a transaction might trigger federal or state law. A startling example of how aggregation can convert non-PII into PII was exposed by two computer scientists who demonstrated that some people could be identified simply based