The document discusses how technology is rapidly changing the human brain and society. It notes that research shows technology use can impair attention and cause brains to adapt to multitasking and quick decision making. However, brains are highly plastic and can learn new skills. While some experts warn that these changes could undermine deep thinking, others argue that different skills can coexist. The document concludes that by working together, digital natives and immigrants can help each other develop balanced skills to thrive in a fast-paced world without losing important abilities.
Running head: CHANGING BRAIN 1
Technology and the Ever-Changing Brain
Michelle Dyer
Western Oregon University
Author Note
This essay has been submitted to fulfill the portfolio essay graduation requirement for Western Oregon University's Master of Education: Information Technology program.
Technology and the Changing Brain
We live in a time like no other in history. With the creation of the internet, mobile technologies, social media, and other emerging technologies, we can access much of the world's combined knowledge at a moment's notice, from almost anywhere on the planet. As these technologies are rapidly developed and improved, our brains, and the ways we think, are adapting just as quickly to work efficiently in this fast-paced world. There is a great deal of ongoing discussion about the changes technology is driving: changes in our brains and in the way we think, communicate, and interact with the world around us. While most experts discussing these issues agree that the changes are real, substantial, and happening very quickly, what they cannot agree on is what these changes mean for us as individuals and as a society.
There has been a lot of research on technology's effects on the brain. In Gary Small's (2009) study Your Brain on Google, the researchers noted that their findings pointed to "sensitivity of brain neural circuits to common computer tasks such as searching online and constant use of such technologies have the potential for negative brain and behavior effects, including impaired attention and addiction" (p. 125). Observation in the workplace alone can show the ways heavy technology users exhibit impaired attention: they think fast, react fast, skim, make quick decisions, multitask heavily, talk in shorthand, and can lack interpersonal skills. Frustratingly, this can be seen in most social situations. It seems to be getting increasingly difficult for people to speak in complete sentences; they speak as if they are emailing, or worse, texting: in shorthand.
While, given all the current research, it can hardly be refuted that technology is rewiring our brains, it does not necessarily follow that this is the end of all thought, attention, and reason. Small's (2009) research also showed that there are "significant associations between engaging in mentally stimulating activities and better cognitive performance" (p. 124). And Nicholas Carr (2011), in his book The Shallows: What the Internet Is Doing to Our Brains, made the point that the internet was designed in a way that makes it the most effective medium ever at rewiring our brains. He said, "the Net delivers precisely the kind of sensory and cognitive stimuli – repetitive, intensive, interactive, addictive – that have been shown to result in strong and rapid alterations in brain circuits and functions" (p. 116). These changes aren't new, though; they have been going on for a long time through other forms of technology. The internet has just taken them to a new level.
Media and the Changing Brain
Jane Healy (1990) was concerned about technology, primarily television, and
pointed out in her book Endangered Minds that the things children do every day, from
what they think about to what they pay attention to, not only change the way their brains
are used but physically change the structure, or neural wiring, of their brains (p. 51).
Whereas Healy claimed this was a danger to society, and especially to our children,
Steven Johnson (2006), in his book Everything Bad Is Good for You, made the claim
that media is actually making us smarter. Johnson claims there is empirical
evidence that media, and the cognitive workout it gives us, is responsible for a 13.8-point
average increase in IQ scores over the past forty-six years (p. 140). Neil Postman (1985)
also discussed how the phenomenon of television, media and Sesame Street was
changing the way people thought and interacted in his book Amusing Ourselves to
Death, long before smartphones, video games, and the internet. Certainly, those
changes were just as real as the changes brought on by computers, the internet,
smartphones and other emerging technologies. One difference is that we now have the
means to use science and technology to back up such claims.
There have been many scientific studies, such as Gary Small's 2009
study, in which MRIs and other techniques have been used to show clearly
what technology does to our brains. We can see that the use of
technology, whether the internet or video games, does in fact stimulate different
parts of our brains, and does cause our brains to make different connections than
those made by reading and deep thought. Some connections diminish; others
grow. Neurons that fire together wire together, creating stronger connections, and this is
a lifelong process in our amazing, ever-adapting brains. Small's (2009) study shows that
significant changes can occur in as little as two weeks (p. 122). However, none of this
has to mean the end of civilization.
Our brains are designed with a high degree of plasticity. They are built to
adapt very quickly to ever-changing environments. Most experts studying the effects of
technology on the brain, and on the way people communicate and interact, report that
technology leads people to skim information, pick out the important bits very quickly,
make quick decisions and move on. People are becoming remarkably efficient at
multitasking; the average digital native is operating on at least three screens or tasks at
any given time. It should not be surprising, then, that diagnoses of ADHD in children
are on the rise and have
been for the last decade. In her TED Talk, Technology & the Human Mind, scientist
Susan Greenfield (2014) points out something that should be obvious. Greenfield
said, "if you have a young brain with the evolutionary mandate, as the human brain has,
to adapt to the environment and that brain is placed in an environment that is very fast-
paced requiring a little attention span where you move on to the next thing, you interact
very fast, the brain will obligingly adapt to that." The problem lies in assuming
that this is a one-way change; that our brains, having wired themselves for multitasking
and quick decision making, cannot also learn to think deeply and spend time in focused,
deep concentration.
Steven Johnson (2006) disputes that point of view, sharing some very
interesting recent research showing that playing video games actually "sharpens the
brain's ability to shift from an 'idle' state of inactivity to a focused, task-driven state, and
to separate out signal from noise in a complex situation" (p. 208). The researchers
recommend that elderly people play video games to sharpen their minds. Adaptation
is something our brains were designed to do. They do it well, they do it quickly, and,
most importantly, they do it our entire lives.
Nicholas Carr (2011) claims that "given our brain's plasticity… we can assume
that the neural circuits devoted to scanning, skimming and multitasking are expanding
and strengthening, while those used for reading and thinking deeply, with sustained
concentration, are weakening or eroding" (p. 140). However, Gary Small (2009), in his
book iBrain: Surviving the Technological Alteration of the Modern Mind, talks about the
opportunity we have to overcome that.
Bridging the Gap
Our brains are capable of developing to be good at both functions; it does not
have to be one or the other. Small counters Carr's rationale that we are losing the ability
to think deeply and concentrate. To accomplish this, however, Small (2009) proposes
that we work together generationally: he believes that digital natives and
digital immigrants need to come together and help each other develop the skills that
each may lack, instead of conflicting and working against each other over the skills
they perceive the other to be missing. And he believes that as we learn to
work together and learn from each other, "as our society bridges the brain gap the
future brain will emerge" (p. 186). He goes on to describe his view of the future brain,
saying, "Not only will this future brain be tech-savvy and ready to try new things, it will
have mastered multitasking and paying attention and fine-tuned its verbal and nonverbal
skills. It will know how to assert itself as well as express empathy, have excellent people
skills and be able to nurture its own creativity" (p. 186). There is no reason to think that
the human brain cannot have it all. The human brain is highly malleable, and when you
look at how far we have come even in just the last 200 years, it is not logical to assume,
as critics such as Carr and Healy have, that our brains could not adapt to be adept both
at thinking deeply and at working very quickly. What we are seeing and experiencing
right now are just the growing pains of evolutionary change.
Technology and Changes to Society
What changes do the effects of technology bring to our society? On the one
hand, some argue that technology is ushering in a dark age, from which no new
knowledge can come and in which no attention can be held. From this perspective, no good
can come of technology; it will simply ruin us. Maggie Jackson (2009), in her book
Distracted: The Erosion of Attention and the Coming Dark Age, points out the dangers
of habitual multitasking: "Without the powers of focus, awareness, and
judgement that fuel self-control, we cannot fend off distractions, set goals, manage a
complex, changing environment, and ultimately shape the trajectory of our lives" (p.
233). Yes, of course we must be able to focus, exercise good judgment and set goals.
On the other hand, however, being able to quickly skim information, make good
decisions and move on is a critical skill in a world that is evolving and changing so
quickly. The world today is moving exponentially faster than it was forty years ago.
Technology, business, resources, and society change practically overnight; the leaders
and citizens of tomorrow, who are growing up in this fast-paced world, have to be able
to keep up.
Some would argue that these changes in society are just a natural evolution,
not much different from the changes brought by the clock, the printing
press, or the television. When clocks were invented, societies stopped running their
lives by the natural rhythms of the sun. We became keepers of time, savers of time,
and now we are slaves to time. Yet who living today could imagine not being able to
show up at work at precisely 8 am, or not being able to meet a client for lunch at exactly
12 pm? We adjusted to this new technology, and now clocks have moved out of the
church tower and onto our wrists, phones, computers and walls.
The printing press moved us from oral societies to written societies. It changed
politics and rhetoric. It moved the word of God out of the hands of the
church and into the hands of the people, where they could read, reflect on and interpret it
in their own way, instead of having the interpretation controlled by the church. It
changed the way we spread news, learned, spoke, and viewed the world. Yes, it
changed us, but who now could imagine a world without printed words? Society adjusts
to these changes. In fact, where would we be without them? Change is not necessarily
bad.
Emerging technologies are changing our society again. One main
difference, however, is the time frame in which these changes are taking place.
Technology is emerging at an ever-increasing rate, changing communication and
interaction in years rather than decades. Primarily because of computers and the
internet, the world has shrunk exponentially in just a couple of decades, rather than
over generations. In sum, the issue is whether we as a society can master the
fast-paced, instantaneous, multitasking ways of the new generation without losing the
verbal, mental and interpersonal skills mastered by the generations that came before.
Many believe we can; we just need to be more informed and intentional about the
changes we make to society.
Working Together for a Better Tomorrow
Since these changes are taking place so quickly, we are in the unique position of
having multiple generations together at different places in the process. We are in the
inimitable position of being able to work together as digital natives (the children born
into the technology) and digital immigrants (the adults who watched the technology
being born). In the words of Gary Small, one of this view's main proponents,
"as digital natives and digital immigrants learn to come together rather than collide, their
brain neural circuitry will adapt for the better." According to this view, we are in a time
where baby boomers and Gen Xers (digital immigrants) can, and should, work together
with millennials and the next generation, Gen Z (digital natives), to bridge the skills
gaps so that the previously mentioned "future brain" can emerge. There is no turning
back the clock. We cannot undo the progress, nor would we want to. What we need to
do is lean into it, and leaders, educators, and academics are in the best possible
position: one where we have the foresight and scientific evidence to see what is
happening, and the opportunity to set the stage for the best possible outcome. Jackson
(2009) asks, "Can a society without deep focus preserve and learn from its past? Does
a culture of distraction evolve to meet the needs of its future?" (p. 215). Many believe
that, yes, we can. We can be wired for distraction and for deep focus. We can
switch between these skill sets to meet the needs of our evolving, changing, shrinking
world. What we are experiencing now are just the growing pains of the unknown; we
cannot quite see yet where this will all lead us, we just know it is changing us.
References
Carr, N. G. (2011). The shallows: How the internet is changing the way we think, read
and remember. London: Atlantic Books.
Greenfield, S. (2014, July 3). Technology & the human mind | Susan Greenfield |
TEDxOxford. Retrieved March 8, 2018, from
https://www.youtube.com/watch?v=oc7ZYj4CCdM
Healy, J. M. (1990). Endangered minds: Why our children don't think. New York: Simon
and Schuster.
Huxley, A. (n.d.). Brave new world; and, Brave new world revisited. New York: Harper
Perennial Modern Classics.
Jackson, M. (2009). Distracted: The erosion of attention and the coming dark age.
Amherst, NY: Prometheus Books.
Johnson, S. (2006). Everything bad is good for you: How today's popular culture is
actually making us smarter. New York: Riverhead Books.
Postman, N. (2005). Amusing ourselves to death: Public discourse in the age of show
business. Penguin. (Original work published 1985)
Small, G. W., & Vorgan, G. (2009). iBrain: Surviving the technological alteration of the
modern mind. New York: William Morrow.
Small, G. W., Moody, T. D., Siddarth, P., & Bookheimer, S. Y. (2009). Your brain on
Google: Patterns of cerebral activation during internet searching. The American
Journal of Geriatric Psychiatry, 17(2), 116-126.
Thomas, M. (2011). Deconstructing digital natives: Young people, technology, and the
new literacies. New York: Routledge.